How do I save a tokenizer without the padding token ID changing?

Hi,

I add a padding token to the GPT tokenizer like this:

pad_token = '[pad]'
tokenizer = AutoTokenizer.from_pretrained("openai-gpt", use_fast=False)
tokenizer.pad_token = pad_token

I then save this tokenizer with the following code:

print(tokenizer.pad_token_id)  # returns 0
tokenizer.save_pretrained("~/gpt_finetuned-2")

Then, when I load this tokenizer again, tokenizer.pad_token_id returns 40478. Why did the token ID change?

tokenizer = OpenAIGPTTokenizer.from_pretrained('~/gpt_finetuned-2', use_fast=False)
print(tokenizer.pad_token_id)  # returns 40478
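For what it's worth, my current mental model (which may be wrong, hence the question) is that `[pad]` was never actually added to the vocabulary, so before saving, the ID lookup silently falls back to the unk ID (0 for this vocab), and on reload the saved special token gets appended to the vocabulary with a fresh ID equal to the vocabulary size (40478 here). A self-contained toy sketch of that behavior (not the real transformers code, toy vocab and IDs are made up):

```python
# Toy vocab standing in for the GPT vocab (hypothetical tokens/ids).
vocab = {"<unk>": 0, "hello": 1, "world": 2}

def convert_token_to_id(token):
    # Unknown tokens fall back to the unk id, as the slow tokenizer does.
    return vocab.get(token, vocab["<unk>"])

# Before saving: '[pad]' was only assigned to tokenizer.pad_token,
# never added to the vocab, so its id resolves to the unk id.
before_id = convert_token_to_id("[pad]")  # 0, i.e. the unk id

# What reload appears to do: a special token found in
# special_tokens_map.json but missing from the vocab is appended,
# receiving id == current vocab size (40478 for the real GPT vocab).
vocab["[pad]"] = len(vocab)
after_id = convert_token_to_id("[pad]")  # 3 in this toy vocab

print(before_id, after_id)
```

If that's right, the fix would presumably be to add the token explicitly up front (e.g. via `tokenizer.add_special_tokens({'pad_token': '[pad]'})`) so the ID is stable before and after saving, but I'd like confirmation.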