Fine-tuning a pre-trained RoBERTa model on a different number of labels

Hi,

I was wondering if it's possible to save the parameters of a RoBERTa model fine-tuned on a 2-class sentiment-analysis task, and then load them into another model to solve a text-classification task with a different number of classes, e.g. 4.

Example:
If I trained RobertaForSequenceClassification.from_pretrained('roberta-base', num_labels=2), I would like to save only the base model (without the classification head), then create a new class that takes the saved model and adds a classification head with an output size of 4, freezing the base model and fine-tuning only the head on the new task.

Would that work? And how would I go about saving and loading only the base model?
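Something like the sketch below is what I'm picturing (untested; `./sentiment-base` is just a placeholder path, and I'm assuming the usual `save_pretrained` / `from_pretrained` API):

```python
from transformers import RobertaForSequenceClassification

# 1) Fine-tune on the 2-class sentiment task
model_2 = RobertaForSequenceClassification.from_pretrained("roberta-base", num_labels=2)
# ... training loop for sentiment analysis ...

# 2) Save only the base encoder (model_2.roberta), without the classification head
model_2.roberta.save_pretrained("./sentiment-base")

# 3) Load the saved encoder into a fresh 4-class model;
#    since no head weights were saved, the new head is randomly initialized
model_4 = RobertaForSequenceClassification.from_pretrained("./sentiment-base", num_labels=4)

# 4) Freeze the encoder so only the new head is trained
for param in model_4.roberta.parameters():
    param.requires_grad = False
```

Or would it be better to save the full 2-class model and reload it with `ignore_mismatched_sizes=True` so the mismatched head is re-initialized?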
