Uninitialized weights despite a supposedly correct architecture

Why, when running the code below (with any SQuAD model),

from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("mrm8488/spanbert-finetuned-squadv2")
model = AutoModelForQuestionAnswering.from_pretrained("mrm8488/spanbert-finetuned-squadv2")

do I receive this warning?

Some weights of the model checkpoint at mrm8488/spanbert-finetuned-squadv2 were not used when initializing BertForQuestionAnswering: ['bert.pooler.dense.bias', 'bert.pooler.dense.weight']
- This IS expected if you are initializing BertForQuestionAnswering from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForQuestionAnswering from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

This is just a warning and is normal. You’ll find this warning with other models as well.

What it indicates is this: if you load a checkpoint into the exact same architecture it was saved from, for further fine-tuning on the same task, the warning should not occur, because every weight in the checkpoint is used. Here the checkpoint contains bert.pooler weights, but BertForQuestionAnswering has no pooler, so those two tensors are discarded.
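You can see this for yourself (a minimal sketch; the local directory ./spanbert-qa is a hypothetical path): once the model above has been loaded, the unused pooler weights are gone, so saving it and reloading it into the same class produces no warning.

from transformers import AutoModelForQuestionAnswering

# first load: the checkpoint's bert.pooler.* weights are dropped, with the warning
model = AutoModelForQuestionAnswering.from_pretrained("mrm8488/spanbert-finetuned-squadv2")

# save the loaded model (the pooler weights are no longer present) and reload it
# into the exact same architecture; this time every weight matches and no warning appears
model.save_pretrained("./spanbert-qa")  # hypothetical local path
model = AutoModelForQuestionAnswering.from_pretrained("./spanbert-qa")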

But if you load a checkpoint into a different architecture, say a pre-trained BERT checkpoint into a sequence-classification model, the warning is normal: the two architectures do not share every weight, so some checkpoint weights go unused and the task-specific head is freshly initialized.
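To see that case in action (a minimal sketch using the public bert-base-uncased checkpoint as an example), load a pre-training checkpoint into a classification class. You get both halves of the message, since the pre-training heads stored in the checkpoint go unused and the new classification head is randomly initialized.

from transformers import AutoModelForSequenceClassification

# bert-base-uncased was saved from a pre-training architecture, so its
# cls.predictions.* weights are not used here, and the new classifier.* weights
# are randomly initialized; the warning reports both
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

Because the classifier is freshly initialized, the model must be fine-tuned before its predictions mean anything. If the messages are too noisy, transformers.logging.set_verbosity_error() lowers the library's log level, at the cost of hiding other warnings as well.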