Hi,
I trained a model as follows:
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=5)
After 3 epochs I wanted to fine-tune this model again with the following code (I tried all three versions):
#1
model = BertForSequenceClassification.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/")
#2
model = BertModel.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/")
#3
config = BertConfig.from_json_file("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/config.json")
model = BertModel.from_pretrained("C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_eval_bert_base_uncased/", config=config)
But in each case I got the error:
TypeError: forward() got an unexpected keyword argument 'labels'
The input format is the same as before.
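For context on the error itself: it appears whenever the loaded model's forward() signature has no labels parameter (plain BertModel does not accept labels, while the ...ForSequenceClassification classes do). A minimal sketch with hypothetical toy classes (no transformers dependency, not actual library code) reproducing the same TypeError shape:

```python
class BaseModel:
    """Stands in for a bare encoder like BertModel: forward() knows nothing about labels."""
    def forward(self, input_ids, attention_mask=None):
        return {"hidden_states": input_ids}

class ModelWithClassificationHead(BaseModel):
    """Stands in for a task model like BertForSequenceClassification: forward() accepts labels."""
    def forward(self, input_ids, attention_mask=None, labels=None):
        outputs = super().forward(input_ids, attention_mask)
        # A real head would compute a loss from the labels here.
        outputs["loss"] = 0.0 if labels is not None else None
        return outputs

try:
    # Passing labels to the bare model fails, just like calling BertModel with labels=...
    BaseModel().forward(input_ids=[1, 2, 3], labels=[0])
except TypeError as e:
    print(e)  # message mentions the unexpected keyword argument 'labels'

# The model with a classification head accepts labels without error.
ModelWithClassificationHead().forward(input_ids=[1, 2, 3], labels=[0])
```

This is only an illustration of why the keyword is rejected; the paths and model classes in the question are unchanged above.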
The model was saved as:
output_dir = 'C:/Users/THINK/Dysk Google/_Priv/_Courses/Huggingface/Fine_tuned_models/rmp_diff_bert_base_uncased'
model.save_pretrained(output_dir)
Can you help me figure out what code I should use to load a custom pretrained bert-base-uncased model?
Thank you.