Error while training a custom pretrained model

Hi,

I trained a model as follows:
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=5)

After 3 epochs I wanted to fine-tune this model again with the following code (I tried all three versions):
#1
model = BertForSequenceClassification.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/")
#2
model = BertModel.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/")
#3
config = BertConfig.from_json_file("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/config.json")
model = BertModel.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/", config=config)

But in each case I got the error:
TypeError: forward() got an unexpected keyword argument 'labels'

The input format is the same as before.
The model was saved as:
output_dir = "C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_diff_bert_base_uncased"
model.save_pretrained(output_dir)

Could you tell me what code I should use to load a custom pretrained bert-base-uncased model?
Thank you.

It's logical that you got that error for the last two versions, since BertModel does not accept a labels argument. I don't think you got that exact error for the first one, as BertForSequenceClassification does accept labels.
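A quick way to see the difference is to compare the two forward signatures; this is a minimal sketch, assuming transformers is installed:

```python
import inspect

from transformers import BertForSequenceClassification, BertModel

# BertForSequenceClassification.forward accepts a `labels` kwarg (and computes a
# classification loss from it); the bare BertModel.forward does not, which is
# exactly what the TypeError is complaining about.
cls_params = inspect.signature(BertForSequenceClassification.forward).parameters
base_params = inspect.signature(BertModel.forward).parameters

print("labels" in cls_params)   # True
print("labels" in base_params)  # False
```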

After your comment I tried to replicate the earlier error on another computer, and I could not. The additional fine-tuning of the custom BertForSequenceClassification model saved earlier completed without any error. So you were right. Thank you.
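For anyone who lands here later, the save/load round-trip that works can be sketched like this. It uses a tiny randomly initialised config so nothing needs to be downloaded; the sizes are illustrative, not real training settings:

```python
import tempfile

from transformers import BertConfig, BertForSequenceClassification

# Tiny illustrative config (hypothetical sizes, chosen only to keep this fast)
config = BertConfig(
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=5,
)
model = BertForSequenceClassification(config)

with tempfile.TemporaryDirectory() as output_dir:
    # save_pretrained writes config.json plus the weights into output_dir
    model.save_pretrained(output_dir)
    # Reloading with the same head class restores num_labels from config.json,
    # so the loaded model accepts `labels` in forward() just like before saving
    reloaded = BertForSequenceClassification.from_pretrained(output_dir)

print(reloaded.num_labels)  # 5
```

The key point is to load with the same class (or AutoModelForSequenceClassification) that you trained and saved with, not with the bare BertModel.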
