Hi, I am using the Trainer with BERT for a classification task. I saved the model with the following code:
import transformers

model = transformers.AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
...
trainer.train()
trainer.save_model("history/")  # save_model returns None, so there is nothing to assign
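For reference, this is how I would check what save_model actually wrote (a quick sketch; the exact file names can vary with the transformers version):

import os

# List the files Trainer.save_model wrote; I would expect at least
# config.json plus a weights file (pytorch_model.bin or model.safetensors,
# depending on the transformers version).
print(sorted(os.listdir("history")))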
Since bert-base-uncased does not ship with a classification head, AutoModelForSequenceClassification.from_pretrained adds a randomly initialized one, if I remember right. However, it appears that when I load the model after training with
config = transformers.AutoConfig.from_pretrained("history")
model = transformers.AutoModelForSequenceClassification.from_config(config)
tokenizer = transformers.AutoTokenizer.from_pretrained("learning/checkpoint-31500")
pipe = transformers.TextClassificationPipeline(
    model=model,
    tokenizer=tokenizer,
    return_all_scores=True,
)
It seems to create a new classification head again rather than use the one that was trained. This makes it impossible to load and use the trained model.
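For what it's worth, this is the kind of check that makes me think the head is freshly initialized each time (a minimal sketch, assuming the history/ directory above): building the model twice from the same config and comparing the head weights.

import torch
import transformers

# Build the model twice from the same saved config and compare the
# classification head weights. If from_config restored the trained head,
# the two heads would be identical; they differ, which suggests
# from_config creates the head with fresh random weights each time.
config = transformers.AutoConfig.from_pretrained("history")
m1 = transformers.AutoModelForSequenceClassification.from_config(config)
m2 = transformers.AutoModelForSequenceClassification.from_config(config)
print(torch.equal(m1.classifier.weight, m2.classifier.weight))  # prints False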
How can I save the head that the Trainer trained for BERT within the model, so that I can load and use it again?
Thanks in advance!