Training with class weights

Hi everyone,

I am new to the Hugging Face transformers library, and I am fine-tuning a BERT model on an imbalanced dataset using a custom Trainer class with class weights:

import torch
from torch import nn
from transformers import Trainer


class CustomTrainer(Trainer):
    def compute_loss(self, model, inputs, return_outputs=False):
        labels = inputs.pop("labels")  # remove labels so the model does not compute its own loss
        # forward pass
        outputs = model(**inputs)
        logits = outputs.get("logits")
        # compute custom loss (suppose one has 2 labels with different weights;
        # weight[i] is the weight applied to class i)
        loss_fct = nn.CrossEntropyLoss(weight=torch.tensor([8.0, 1.0], device=model.device))
        loss = loss_fct(logits.view(-1, self.model.config.num_labels), labels.view(-1))
        return (loss, outputs) if return_outputs else loss

However, I get the same results when training with class weights [10.0, 1.0] as with [9.0, 1.0]. Is that normal, or is there something wrong with my initialization? (I am initializing the model with the model_init option.)
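
For context, this is roughly how I am constructing the trainer with model_init; the model name, dataset variables, and training arguments below are placeholders rather than my exact values:

from transformers import AutoModelForSequenceClassification, TrainingArguments

def model_init():
    # return a fresh model each time the Trainer needs one, so every run
    # starts from the same pretrained weights
    return AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )

training_args = TrainingArguments(output_dir="output", num_train_epochs=3)

trainer = CustomTrainer(
    model_init=model_init,        # passed instead of a ready model instance
    args=training_args,
    train_dataset=train_dataset,  # placeholder: my tokenized training split
    eval_dataset=eval_dataset,    # placeholder: my tokenized validation split
)
trainer.train()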

Thanks for reading!