Multilabel text classification Trainer API

Hi all,
Can someone help me do multi-label classification with the Trainer API?

Sure, all you need to do is make sure the problem_type of the model’s configuration is set to multi_label_classification, e.g.:

from transformers import BertForSequenceClassification

model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=10, problem_type="multi_label_classification")

This will make sure the appropriate loss function is used (namely, binary cross-entropy with logits). Note that the current version of Transformers does not yet support this problem_type for every model, but the next version will (as per PR #14180).
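To make the loss concrete, here is a minimal sketch in plain Python (no Transformers or PyTorch required) of the binary cross-entropy-with-logits computation that problem_type="multi_label_classification" selects: each of the num_labels outputs is treated as an independent binary decision against a multi-hot target vector. The function name is illustrative, not part of the library.

```python
import math

def bce_with_logits(logits, targets):
    """Mean binary cross-entropy over all label positions;
    the same quantity torch.nn.BCEWithLogitsLoss computes."""
    total = 0.0
    for x, y in zip(logits, targets):
        p = 1.0 / (1.0 + math.exp(-x))  # sigmoid: one probability per label
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(logits)

# A 4-label example: the target is a multi-hot float vector,
# not a single class id as in single-label classification.
logits = [2.0, -1.5, 0.0, 3.0]
targets = [1.0, 0.0, 1.0, 0.0]
loss = bce_with_logits(logits, targets)
```

Note that the targets are floats (a multi-hot vector); this is also what the model expects in the labels column when problem_type is set to multi_label_classification.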

I suggest taking a look at the example notebook to do multi-label classification using the Trainer.

Update: I made a notebook myself to illustrate how to fine-tune any encoder-only Transformer model for multi-label text classification: Transformers-Tutorials/Fine_tuning_BERT_(and_friends)_for_multi_label_text_classification.ipynb at master · NielsRogge/Transformers-Tutorials · GitHub
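Since the model outputs one independent logit per label, evaluation needs a sigmoid plus a threshold rather than an argmax. Below is a hedged sketch of that step in plain Python, in the spirit of a compute_metrics function for the Trainer; the 0.5 threshold and the helper names are my own choices, not fixed by the API.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def logits_to_predictions(batch_logits, threshold=0.5):
    """Turn a batch of per-label logits into 0/1 multi-hot predictions.
    Unlike single-label classification, several labels can be active at once."""
    return [[1 if sigmoid(x) >= threshold else 0 for x in row]
            for row in batch_logits]

# Two labels clear the threshold here, one does not.
preds = logits_to_predictions([[2.0, -2.0, 0.3]])
```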


Hi, may I ask where we can find which models are supported for multi-label classification?
Thank you in advance.

Hello @nielsr!
Thanks a lot for your example! I've tried it on my data, but accuracy stays at 0 and ROC AUC at 0.5. I'm clearly having an issue, but I can't find why.
I have 420 labels; could that be the reason I'm having this issue?
I'm a beginner and I don't know where to start to fix this, so any help would be greatly appreciated 🙂

Thanks!


I'm a beginner too, but 420 labels is a lot. If there's a lot of training data then it may be fine, but if everything's right with the code, too little data may be the problem.
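A complementary thing worth checking, whatever the data size: with 420 labels, "accuracy" computed as an exact match of the whole label vector (subset accuracy, which is what scikit-learn's accuracy_score computes for multi-label targets) is expected to sit near 0 even for a reasonable model, since a single wrong label out of 420 makes the whole example count as wrong. A quick sketch of the difference, with illustrative helper names:

```python
def subset_accuracy(preds, targets):
    """Fraction of examples whose entire predicted label vector matches exactly
    (what sklearn's accuracy_score computes for multi-label targets)."""
    hits = sum(1 for p, t in zip(preds, targets) if p == t)
    return hits / len(preds)

def per_label_accuracy(preds, targets):
    """Fraction of individual label decisions that are correct (1 - Hamming loss)."""
    correct = sum(1 for p, t in zip(preds, targets)
                  for pi, ti in zip(p, t) if pi == ti)
    total = sum(len(p) for p in preds)
    return correct / total

# One example with 420 labels where the model gets 419 of them right:
target = [0] * 420
target[7] = 1
pred = list(target)
pred[13] = 1  # a single false positive

subset = subset_accuracy([pred], [target])        # 0.0: the vectors differ
per_label = per_label_accuracy([pred], [target])  # 419/420, almost perfect
```

Separately, if ROC AUC also stays at exactly 0.5, it is worth double-checking that the labels column really contains multi-hot float vectors, since otherwise the loss may not be computed as intended.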