I have a pretrained BERT model for a classification task, trained with the pytorch_pretrained_bert library. I would like to use the weights from this model as the starting point for further training with the transformers library.
When I try to load this model, I get the following runtime error. Does anyone know how to resolve this?
model = BertClassification(weight_path=pretrained_weights_path, num_labels=num_labels)
state_dict = torch.load(fine_tuned_weight_path, map_location="cuda:0")
model.load_state_dict(state_dict)
RuntimeError: Error(s) in loading state_dict for BertClassification:
Missing key(s) in state_dict: "bert.embeddings.position_ids".
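For what it's worth, I came across a suggestion that `position_ids` is a non-trainable buffer that newer transformers versions register on `BertEmbeddings`, so older checkpoints simply don't contain it, and that passing `strict=False` to `load_state_dict` lets PyTorch skip keys missing from the checkpoint. A minimal sketch of that behavior (the stand-in modules below are hypothetical, not my actual `BertClassification`):

```python
import torch

class OldStyle(torch.nn.Module):
    """Stands in for the module that produced the old checkpoint."""
    def __init__(self):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.zeros(2, 2))

class NewStyle(torch.nn.Module):
    """Stands in for the new model, which registers an extra buffer."""
    def __init__(self):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.ones(2, 2))
        # Analogous to bert.embeddings.position_ids in newer transformers
        self.register_buffer("position_ids", torch.arange(2))

old_state = OldStyle().state_dict()   # has no "position_ids" key
model = NewStyle()

# strict=False skips keys absent from the checkpoint instead of raising;
# the return value reports which keys were missing or unexpected
result = model.load_state_dict(old_state, strict=False)
print(result.missing_keys)            # the skipped buffer key
```

Would loading this way be safe here, given that the missing key is only a buffer and not a learned weight?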
Thanks very much.