Predicting On New Text With Fine-Tuned Multi-Label Model

This is very much a beginner question. I am trying to load a fine-tuned model for multi-label text classification: bert-base-uncased, fine-tuned on 11 labels.

I just want to feed new text to the model and get the labels predicted to be associated with the text.

I have looked everywhere and cannot find an example of how to actually load and use a fine-tuned model on new data after fine-tuning is complete.

I have the model saved via:

model.save_pretrained("fine_tuned_model")

I can load the model back with:

model = AutoModelForSequenceClassification.from_pretrained("fine_tuned_model", from_tf=False, config=config)

From here I am stuck. I have been told to use model.predict("text") but when I do I get the following error:

'BertForSequenceClassification' object has no attribute 'predict'

I hope this makes sense and any help would be greatly appreciated.
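For reference, BertForSequenceClassification has no predict method; inference is a plain forward pass. Below is a minimal manual-inference sketch, assuming the tokenizer was also saved to "fine_tuned_model" and using a sigmoid threshold of 0.5 for the multi-label decision (both are assumptions, not stated in the thread):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def labels_above_threshold(logits, threshold=0.5):
    # Multi-label: each of the 11 labels gets an independent sigmoid score;
    # softmax would wrongly force the label scores to compete.
    return [i for i, z in enumerate(logits) if sigmoid(z) >= threshold]

def classify(text, model_dir="fine_tuned_model"):
    # Imported lazily so the helpers above run without torch/transformers installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForSequenceClassification.from_pretrained(model_dir)
    model.eval()

    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()  # one raw score per label
    return labels_above_threshold(logits)
```

The 0.5 threshold is a common default for multi-label tasks, but the right value depends on how the model was trained and validated.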


Hello and welcome to our forum!
I'd suggest you use the text classification pipeline instead.

from transformers import pipeline
clf = pipeline("text-classification", model="fine_tuned_model")
answer = clf("text")

Hey @merve, thank you so much for the suggestion and assistance! The code you provided works but only returns one of my 11 labels per text entered:

#Output: [{'label': 'LABEL_7', 'score': 0.9935219883918762}]

Is there a way to get the scores for all 11 labels, and to have the generic label IDs mapped to their actual names, so the output includes all 11 labels with their scores?

For instance, instead of ‘LABEL_7’ it would say ‘Technology’ and so on and so forth.


To get all scores, the pipeline has a parameter for that:

clf("text", return_all_scores=True)
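For illustration: with return_all_scores=True the pipeline returns, for each input, a list with one {'label': ..., 'score': ...} dict per label. (In recent transformers versions, top_k=None achieves the same and return_all_scores is deprecated.) A small helper, hypothetical and not part of transformers, to reshape one such list into a name-to-score dict:

```python
def scores_by_label(all_scores, name_map=None):
    """Turn the pipeline's per-label output for one input into a {name: score} dict.

    all_scores: list like [{'label': 'LABEL_0', 'score': 0.01}, ...]
    name_map:   optional mapping such as {'LABEL_7': 'Technology'}
    """
    out = {}
    for entry in all_scores:
        name = entry["label"]
        if name_map:
            name = name_map.get(name, name)  # fall back to the raw label ID
        out[name] = entry["score"]
    return out
```

Usage: `scores_by_label(clf("text", return_all_scores=True)[0])` — note the `[0]`, since the pipeline returns one list per input.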

As for the label showing up as LABEL_7: you need to check the config.json in your repo. See, for example, the id2label and label2id entries in the config.json of distilbert-base-uncased-finetuned-sst-2-english.
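The mapping can also be baked into the config before fine-tuning, so the pipeline reports real names instead of LABEL_7. A sketch with hypothetical category names (replace with your own 11 labels); the transformers import is deferred into the function so the pure helper runs standalone:

```python
def make_label_maps(label_names):
    """Build the id2label / label2id dicts that get stored in config.json."""
    id2label = {i: name for i, name in enumerate(label_names)}
    label2id = {name: i for i, name in enumerate(label_names)}
    return id2label, label2id

def build_model(label_names, base="bert-base-uncased"):
    # Imported lazily so make_label_maps works without transformers installed.
    from transformers import AutoModelForSequenceClassification

    id2label, label2id = make_label_maps(label_names)
    return AutoModelForSequenceClassification.from_pretrained(
        base,
        num_labels=len(label_names),
        problem_type="multi_label_classification",
        id2label=id2label,
        label2id=label2id,
    )

# Hypothetical names for illustration only:
# model = build_model(["Technology", "Sports", "Politics"])
# model.save_pretrained("fine_tuned_model")  # config.json now stores the names
```

After fine-tuning and save_pretrained, a pipeline loaded from that directory reports the human-readable names directly.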


Hey @osanseviero , thank you so much! I really appreciate the kind response!