Hello,
I want to use HateBERT from the paper's repository, which is a BERT model whose pre-training was extended on abusive language.
In order to do that, I created a BERT model (`bert-base-uncased` in PyTorch) and tried to load HateBERT's weights with `load_state_dict()` (after making minor changes to the parameter names so that they match BERT's). `load_state_dict()` throws this error:
```
RuntimeError: Error(s) in loading state_dict for BertModel:
	Missing key(s) in state_dict: "embeddings.position_ids".
```
which means that the BERT model requires an `embeddings.position_ids` entry. I checked this tensor, and it is not a PyTorch `Parameter`, just a plain tensor. From the error message it is also evident that all the other parameters match; otherwise their names would be listed as well. Can someone explain?
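For context, here is roughly how I checked this, sketched with a hypothetical toy module (not the actual `BertEmbeddings` class): a tensor registered with `register_buffer()` does not appear in `named_parameters()`, yet it is still part of the `state_dict`, so `load_state_dict()` can complain that it is missing.

```python
import torch
import torch.nn as nn


class ToyEmbeddings(nn.Module):
    """Toy stand-in for an embeddings module, for illustration only."""

    def __init__(self, max_len=512, dim=4):
        super().__init__()
        # A learnable weight: appears in named_parameters() AND in the state_dict.
        self.word_embeddings = nn.Embedding(10, dim)
        # A registered buffer: appears in the state_dict but NOT in named_parameters().
        self.register_buffer("position_ids", torch.arange(max_len).unsqueeze(0))


m = ToyEmbeddings()
param_names = {name for name, _ in m.named_parameters()}
state_keys = set(m.state_dict().keys())

print("position_ids" in param_names)  # False: not a Parameter
print("position_ids" in state_keys)   # True: load_state_dict() still expects it
```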