Compute_loss in Trainer - Evaluate / class_weights

Hello guys,

I have religiously followed the “Token Classification” task guide from Hugging Face (here).

The guide is great, and my model was working well: evaluation gave me all the metrics, including the F1 score.

Unfortunately, my dataset is heavily imbalanced, so I decided to add a custom weighted loss, following the documentation for Trainer.

Override the loss function

import torch
from torch import nn
from transformers import Trainer

class CustomTrainer(Trainer):
    # Recent transformers versions pass extra arguments (e.g. num_items_in_batch)
    # to compute_loss, so **kwargs keeps the override compatible.
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.get("labels")
        # forward pass
        outputs = model(**inputs)
        logits = outputs.get("logits")
        # compute custom loss; class_weights must be a non-negative float tensor
        # on the same device as the logits
        loss_fct = nn.CrossEntropyLoss(weight=class_weights.to(logits.device))
        loss = loss_fct(logits.view(-1, self.model.config.num_labels), labels.view(-1))
        return (loss, outputs) if return_outputs else loss
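For completeness, `class_weights` is not defined in the snippet above. One common way to build it is from inverse label frequencies over the training set, skipping the `-100` padding index that the tokenizer's label alignment inserts. A sketch (the helper name and the toy labels here are purely illustrative):

```python
import torch
from collections import Counter

def build_class_weights(all_labels, num_labels):
    # Count each label id across all sequences, ignoring the -100 padding index.
    counts = Counter(l for seq in all_labels for l in seq if l != -100)
    totals = torch.tensor([counts.get(i, 0) for i in range(num_labels)],
                          dtype=torch.float)
    # Inverse-frequency weighting: rare labels get larger weights.
    # clamp(min=1.0) avoids division by zero for labels absent from the data.
    return totals.sum() / (num_labels * totals.clamp(min=1.0))

# Tiny toy label set with a dominant "O" class (id 0) and -100 padding:
toy = [[0, 0, 0, 1, -100], [0, 0, 2, -100, -100]]
class_weights = build_class_weights(toy, num_labels=3)
```

All weights produced this way are strictly positive, which matters because CrossEntropyLoss only makes sense with non-negative weights.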

After the first epoch, evaluation returned a negative validation loss and zero precision, recall, and F1, along with the following warnings:

Epoch	Training Loss	Validation Loss	Precision	Recall	F1	Accuracy
1	No log	-0.188098	0.000000	0.000000	0.000000	0.758095
2	No log	-0.195945	0.000000	0.000000	0.000000	0.758095

/usr/local/lib/python3.10/dist-packages/seqeval/metrics/v1.py:57: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
/usr/local/lib/python3.10/dist-packages/seqeval/metrics/v1.py:57: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 due to no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
TrainOutput(global_step=198, training_loss=0.12444540466925111, metrics={'train_runtime': 3245.6095, 'train_samples_per_second': 0.975, 'train_steps_per_second': 0.061, 'total_flos': 56929757663484.0, 'train_loss': 0.12444540466925111, 'epoch': 2.0})
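What puzzles me most is the negative validation loss: a quick sanity check (illustrative, with random tensors) confirms that CrossEntropyLoss with non-negative class weights can never go below zero, so something else must be happening during evaluation.

```python
import torch
from torch import nn

torch.manual_seed(0)

# 8 tokens, 5 labels, arbitrary non-negative class weights.
logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
weights = torch.rand(5)

loss = nn.CrossEntropyLoss(weight=weights)(logits, labels)
# Cross-entropy is a weighted average of -log(softmax) terms,
# each of which is >= 0, so the result is always >= 0.
assert loss.item() >= 0
```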

I also tried adding label_names=["labels"] to my TrainingArguments, but that did not change anything. Any idea what I am doing wrong?

Thank you in advance!