Loading model from pytorch_pretrained_bert into transformers library

I have a pretrained BERT model for a classification task, trained with the pytorch_pretrained_bert library. I would like to use this model's weights as the starting point for further training with the transformers library.

When I try to load this model, I get the following runtime error. Does anyone know how to resolve this?

import torch

# BertClassification and the paths are defined elsewhere in my training code
model = BertClassification(weight_path=pretrained_weights_path, num_labels=num_labels)
state_dict = torch.load(fine_tuned_weight_path, map_location='cuda:0')
model.load_state_dict(state_dict)

RuntimeError: Error(s) in loading state_dict for BertClassification:
    Missing key(s) in state_dict: "bert.embeddings.position_ids".

Thanks very much.

Hi, please help with this. I am facing the same issue.


Hi. This is probably caused by the transformers version. You could downgrade from 4.4 to 2.8 with pip install transformers==2.8.0.
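
If you would rather stay on a recent transformers release, another workaround that may help is to skip the missing bert.embeddings.position_ids entry when loading: in newer versions it is a buffer created in the model's constructor, so it does not need to come from the old checkpoint. Below is a minimal sketch, reusing the names from the original post (BertClassification, pretrained_weights_path, fine_tuned_weight_path, and num_labels are assumed to be defined as above); it is not an official fix, just the standard strict=False option of PyTorch's load_state_dict.

import torch

model = BertClassification(weight_path=pretrained_weights_path, num_labels=num_labels)
state_dict = torch.load(fine_tuned_weight_path, map_location='cuda:0')

# strict=False tells load_state_dict to ignore keys that are absent from the
# old checkpoint; it returns the lists of missing and unexpected keys so you
# can verify that only the position_ids buffer was skipped.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)        # expected: ['bert.embeddings.position_ids']
print('unexpected keys:', unexpected)  # expected: []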