Missing keys "model.embeddings.position_ids" when loading model using state_dict

I have saved the model like

import os.path as osp
import torch

model_state_dict = model.module.state_dict()
torch.save({'model_state_dict': model_state_dict}, osp.join(save_dir, 'best.ckpt'))

Now I try to load the model like

model_path = "./models/best.ckpt"
ckpt = torch.load(model_path)
model.load_state_dict(ckpt['model_state_dict'])

and it raises the missing-keys error above: the model's state_dict expects "position_ids", but the saved ckpt file does not contain it.
Is there any way to skip loading position_ids?
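
One possible workaround (a sketch, not confirmed for this exact model) is to load non-strictly, so the missing position_ids buffer is skipped and the incompatible keys can be inspected; model and model_path are the same as in the snippets above:

import torch

ckpt = torch.load(model_path, map_location="cpu")
# strict=False ignores missing/unexpected keys and returns them for inspection
missing, unexpected = model.load_state_dict(ckpt['model_state_dict'], strict=False)
print("missing keys:", missing)        # should only list *.position_ids
print("unexpected keys:", unexpected)  # should be empty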

Hi @jyliu, is there any specific reason for not using .save_pretrained and .from_pretrained?

Thanks for your reply. The BERT model is only one component of my whole model, so I saved the whole model directly. What is the best practice in this situation?

Would it be possible for you to create a Colab for this?

Also, to take advantage of .from_pretrained and .save_pretrained, you can subclass BertPreTrainedModel and add the additional layers in it. See the task-specific BERT models (e.g. BertForSequenceClassification): they use BERT with additional layers on top and subclass BertPreTrainedModel.
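
A minimal sketch of that pattern (MyModel, the classification head, and num_labels are illustrative assumptions, not from the thread):

import torch.nn as nn
from transformers import BertModel, BertPreTrainedModel

class MyModel(BertPreTrainedModel):
    def __init__(self, config):
        super().__init__(config)
        self.bert = BertModel(config)  # the pretrained encoder
        self.classifier = nn.Linear(config.hidden_size, config.num_labels)
        self.init_weights()            # initialize the extra layers

    def forward(self, input_ids, attention_mask=None):
        outputs = self.bert(input_ids, attention_mask=attention_mask)
        return self.classifier(outputs[1])  # classify the pooled [CLS] output

# Serialization then goes through the HF helpers, which handle
# non-parameter buffers such as position_ids for you:
model = MyModel.from_pretrained("bert-base-uncased", num_labels=2)
model.save_pretrained("./models/my_model")
model = MyModel.from_pretrained("./models/my_model")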

Let me know if this solves your problem.

Thanks. This will work for me.