Some weights of BertModel were not initialized from the model checkpoint

I was able to train at the word level, but when I test with the fill-mask pipeline I get the following warning:

Some weights of BertModel were not initialized from the model checkpoint at ./output_model and are newly initialized: ['bert.pooler.dense.weight', 'bert.pooler.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
from transformers import BertConfig, BertForMaskedLM, PreTrainedTokenizerFast, pipeline

tokenizer = PreTrainedTokenizerFast(tokenizer_file="./my-tokenizer.json")
model = BertForMaskedLM(config=BertConfig(vocab_size=1000000))

# < after training >

fill_mask = pipeline(
    "fill-mask",
    model="./output_model",
    tokenizer=tokenizer
)  # prints the warning above
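If it helps, the pooler weights named in the warning are probably never in the checkpoint to begin with: as far as I can tell, `BertForMaskedLM` builds its encoder without a pooling layer, so a checkpoint saved from it has no pooler weights, and loading it into a class that does have one (such as plain `BertModel`) leaves the pooler randomly initialized. The fill-mask task doesn't use the pooler, so the warning should be harmless. A minimal sketch comparing the state dicts (the tiny config is hypothetical, just to build the models quickly offline):

```python
from transformers import BertConfig, BertForMaskedLM, BertModel

# Tiny hypothetical config, only to instantiate the models without a download.
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=1,
                    num_attention_heads=2, intermediate_size=64)

# BertForMaskedLM is constructed without a pooling layer, so a checkpoint
# saved from it contains no 'bert.pooler.*' weights ...
mlm_keys = BertForMaskedLM(config).state_dict().keys()
print(any(k.startswith("bert.pooler") for k in mlm_keys))   # False

# ... while a plain BertModel does have a pooler, which then has to be
# newly (randomly) initialized when that checkpoint is loaded into it.
base_keys = BertModel(config).state_dict().keys()
print(any(k.startswith("pooler") for k in base_keys))       # True
```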


Hi,

I also get the same warning when using a fill-mask pipeline, even though I fine-tuned the model with AutoModelForMaskedLM…

Randomly initialized layers shouldn't be a good sign for using the model. Is there a way to solve this?

This is interesting, thanks for reporting.

I’m opening an issue on GitHub, as I’m running into the same problem.