Using FinBERT for 7-class sequence classification

Hi,

I am a beginner with Hugging Face/Transformers, and a more conceptual question has come up.

My end goal is to use ProsusAI/finbert (ProsusAI/finbert · Hugging Face) for a sequence classification problem with 7 classes.

These two statements yield a BERT model with and without a head:

from transformers import AutoModel, AutoModelForSequenceClassification

checkpoint = "ProsusAI/finbert"
model_pretrained = AutoModel.from_pretrained(checkpoint)
model_finetuned = AutoModelForSequenceClassification.from_pretrained(checkpoint)

The pretrained model is loaded without a head (meaning the forward method does not accept labels) - this is a BertModel. The fine-tuned model is loaded with a head that has a pre-defined architecture predicting only 3 classes - this is a BertForSequenceClassification.
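For example, inspecting the loaded model confirms the size of the head (just a sketch, assuming the usual config attributes):

from transformers import AutoModelForSequenceClassification

checkpoint = "ProsusAI/finbert"
model_finetuned = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# The config reports the size of the classification head
print(model_finetuned.config.num_labels)   # 3
print(model_finetuned.config.id2label)     # the three sentiment classes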

My question is: how do I create a BertForSequenceClassification for a 7-class prediction problem using the pre-trained model and weights? Unfortunately, the following does not work, since the fine-tuned model comes with a pre-specified architecture:

from transformers import AutoModel, AutoModelForSequenceClassification

checkpoint = "ProsusAI/finbert"
model_finetuned = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=7)

The only solution I was able to find is to modify the output layer myself using PyTorch. But is this the most straightforward solution?
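To make it concrete, this is roughly what I mean by modifying the output layer in PyTorch (just a sketch, assuming the head is exposed as model.classifier, as in BertForSequenceClassification):

import torch.nn as nn
from transformers import AutoModelForSequenceClassification

checkpoint = "ProsusAI/finbert"

# Load the fine-tuned model with its original 3-label head
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Swap the head for a freshly initialized 7-class linear layer
# (assumes the head is `model.classifier`, which holds for BERT-based models)
model.classifier = nn.Linear(model.config.hidden_size, 7)
model.config.num_labels = 7
model.num_labels = 7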

I hope it makes sense.

/Mathias


I have the same problem. I set num_labels=2, but the model still tries to use 3 labels. I don't know how to fix it.