Class weights for BertForSequenceClassification

I have unbalanced data, with a couple of classes that have relatively smaller sample sizes. I am wondering if there is a way to assign class weights to the BertForSequenceClassification class, maybe in BertConfig, as we can do in nn.CrossEntropyLoss.

Thank you, in advance!

No, you need to compute the loss outside of the model for this. If you’re using Trainer, see here on how to change the loss from the default computed by the model.
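For reference, a minimal sketch of that Trainer approach: subclass Trainer and override compute_loss so it applies nn.CrossEntropyLoss with class weights. The number of labels and the weight values below are placeholders, and the exact compute_loss signature can vary between transformers versions (hence the **kwargs):

```python
import torch
from torch import nn
from transformers import Trainer

class WeightedLossTrainer(Trainer):
    """Trainer that replaces the model's default loss with a class-weighted one."""

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        logits = outputs.logits
        # Example weights for an imbalanced 3-class problem: rarer classes get larger weights.
        weights = torch.tensor([1.0, 2.0, 5.0], device=logits.device)
        loss_fct = nn.CrossEntropyLoss(weight=weights)
        loss = loss_fct(logits.view(-1, logits.size(-1)), labels.view(-1))
        return (loss, outputs) if return_outputs else loss
```

You would then use WeightedLossTrainer exactly like the regular Trainer.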

Hi Sylvain,

Glad to hear from you outside of FastAI :slight_smile: Well, I am here as a “Beginner” and will have to study more about Trainer. In the meantime, I tried the following (a rough code sketch follows the list):

  1. Run BertForSequenceClassification as usual
  2. Take the logits from the output (discard the loss from the BERT run)
  3. Calculate a new loss with nn.CrossEntropyLoss (with class weights)
  4. Call loss.backward()
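A rough sketch of those four steps (the model name, number of labels, weight values, and toy batch are placeholders):

```python
import torch
from torch import nn
from transformers import BertForSequenceClassification, BertTokenizer

# Hypothetical setup: 3 labels, larger weights for the under-represented classes.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=3)
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
loss_fct = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.0, 5.0]))
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

texts = ["example sentence one", "example sentence two"]
labels = torch.tensor([0, 2])

# 1. Run BertForSequenceClassification as usual.
encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**encodings, labels=labels)

# 2. Take the logits; outputs.loss (unweighted) is discarded.
logits = outputs.logits

# 3. Calculate a new, class-weighted loss.
loss = loss_fct(logits.view(-1, model.config.num_labels), labels.view(-1))

# 4. Backpropagate the weighted loss and step the optimizer.
optimizer.zero_grad()
loss.backward()
optimizer.step()
```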

The model runs okay, but I am not sure if this is a legitimate approach…

Thanks.

That is the correct approach!

Fantastic!!! This means a lot that I got support from somebody like you, Sylvain! Thanks!! :laughing: