This is very much a beginner question. I am trying to load a fine-tuned model for multi-label text classification: bert-base-uncased, fine-tuned on 11 labels.
I just want to feed new text to the model and get back the labels it predicts for that text.
I have looked everywhere and cannot find an example of how to actually load and use a fine-tuned model on new data after fine-tuning is complete.
I have the model saved via:
model.save_pretrained("fine_tuned_model")
I can load the model back with:
model = AutoModelForSequenceClassification.from_pretrained("fine_tuned_model", from_tf=False, config=config)
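I assume I also need the matching tokenizer to preprocess new text. Since I did not save one alongside the model, my guess is that the original bert-base-uncased tokenizer is the right one to use, something like:

from transformers import AutoTokenizer

# Assumption on my part: the original bert-base-uncased tokenizer still matches
# the fine-tuned model, because I never saved a tokenizer of my own
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")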
From here I am stuck. I have been told to use model.predict("text"), but when I do I get the following error:
'BertForSequenceClassification' object has no attribute 'predict'
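From other examples I have pieced together, my best guess is that the forward pass should look roughly like the sketch below (with a sigmoid rather than a softmax, since this is multi-label), but I am not at all sure this is correct:

import torch

model.eval()

text = "some new text to classify"
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# outputs.logits should have shape (1, 11); my understanding is that for
# multi-label classification each logit goes through a sigmoid independently
probs = torch.sigmoid(outputs.logits)[0]

# keep every label whose probability clears a threshold (0.5 is just my guess)
predicted = (probs > 0.5).nonzero(as_tuple=True)[0].tolist()
print(probs)
print(predicted)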
I hope this makes sense and any help would be greatly appreciated.
Is there also a way to get the scores for all 11 labels and have the label indices mapped to their actual label names, so the output would show all 11 labels with their scores?
For instance, instead of 'LABEL_7' it would say 'Technology', and so on.
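My guess is that this involves the id2label mapping on the model config, which I think I can set myself after loading, but I am not sure whether this is the intended way:

# Hypothetical placeholder: my 11 real category names, in the same order the
# labels were encoded for fine-tuning ("Technology" would sit at index 7)
my_label_names = ["...", "...", "...", "...", "...", "...", "...", "Technology", "...", "...", "..."]

model.config.id2label = {i: name for i, name in enumerate(my_label_names)}

# after the forward pass above, print every label name with its score
for i, score in enumerate(probs.tolist()):
    print(model.config.id2label[i], round(score, 4))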