I am using TFGPT2LMHeadModel and GPT2LMHeadModel. When I use the GPT2LMHeadModel weights to initialize TFGPT2LMHeadModel, some weights are not used. I have confirmed that the config file is the same one, so why does this happen?
The output is as follows:

Some weights of the PyTorch model were not used when initializing the TF 2.0 model TFGPT2LMHeadModel: ['transformer.h.3.attn.masked_bias', 'transformer.h.0.attn.masked_bias', 'transformer.h.2.attn.masked_bias', 'transformer.h.4.attn.masked_bias', 'lm_head.weight', 'transformer.h.1.attn.masked_bias', 'transformer.h.5.attn.masked_bias']
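For context, here is a minimal sketch of the cross-framework load that produces a warning like the one above. This is an assumption about how the weights were loaded (the checkpoint path `./gpt2-pt` is hypothetical); `from_pt=True` tells `from_pretrained` to convert a PyTorch checkpoint into the TF 2.0 model:

```python
from transformers import TFGPT2LMHeadModel

# Hypothetical path to a directory containing a PyTorch checkpoint
# saved with GPT2LMHeadModel.save_pretrained("./gpt2-pt").
# from_pt=True triggers the PyTorch-to-TF 2.0 weight conversion,
# which logs the "Some weights ... were not used" notice for any
# PyTorch tensors that have no counterpart in the TF model.
tf_model = TFGPT2LMHeadModel.from_pretrained("./gpt2-pt", from_pt=True)
```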