Missing keys when loading a model checkpoint (transformer)

I downloaded the BERT transformer model locally, and a missing-keys error is reported before any training has run.

Torch 1.8.0, CUDA 10.1, transformers 4.6.1
The BERT model was saved locally using the git command:

git clone https://huggingface.co/bert-base-uncased
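Just as a sanity check on the clone itself, listing the files and their sizes confirms the weight binary actually came down (the path below is the same placeholder as in the snippet further on; if the clone happened without git-lfs, pytorch_model.bin would only be a tiny pointer stub):

import os

# hypothetical placeholder path -- the local clone of bert-base-uncased
repo_dir = ".../bert-base-uncased"

# list every file with its size; the real pytorch_model.bin is ~400 MB,
# whereas a git-lfs pointer stub would be only a few hundred bytes
for name in sorted(os.listdir(repo_dir)):
    print(name, os.path.getsize(os.path.join(repo_dir, name)), "bytes")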

In the code below I load the model and the state_dict from the exact same folder into which the model was freshly downloaded from the Hugging Face repo.

from transformers import AutoModel
import torch

# load the raw checkpoint and the model from the same local clone
state_dict = torch.load(".../bert-base-uncased/pytorch_model.bin")
model = AutoModel.from_pretrained("..../bert-base-uncased/")
model.load_state_dict(state_dict, strict=False)

Output of model.load_state_dict

IncompatibleKeys(missing_keys=['embeddings.position_ids', 'embeddings.word_embeddings.weight' …
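In case it helps, here is a small diagnostic sketch (same placeholder paths as above) that would print the key names stored in the checkpoint next to the key names the model object expects, which is where the entries in missing_keys come from:

from transformers import AutoModel
import torch

# hypothetical placeholder paths -- the same local clone as above
state_dict = torch.load(".../bert-base-uncased/pytorch_model.bin")
model = AutoModel.from_pretrained("..../bert-base-uncased/")

# compare the first few parameter names on each side
print("checkpoint keys:", sorted(state_dict.keys())[:5])
print("model keys:     ", sorted(model.state_dict().keys())[:5])

# with strict=False, load_state_dict returns a mismatch report instead of raising
result = model.load_state_dict(state_dict, strict=False)
print("missing:   ", result.missing_keys[:5])
print("unexpected:", result.unexpected_keys[:5])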

Any help is appreciated.