Adding a new mask_token for BERT-like models/tokenizers

Hello!

Quick question about adding special tokens. I want to add a new mask token and train its embedding from scratch. Will the code below reinitialize the new special token's embedding, or will it reuse the embedding of the original mask token? If it doesn't reinitialize it, how could I go about doing so?

from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Register [NEW] as the mask token and grow the embedding matrix to match.
special_tokens_dict = {"mask_token": "[NEW]"}
num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)
model.resize_token_embeddings(len(tokenizer))
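
For reference, this is roughly what I had in mind for explicitly reinitializing the new token's row after the resize. It's just a sketch and I'm not sure it's the right approach; it assumes the new row should follow BERT's usual init (normal with std = config.initializer_range):

import torch

# Sketch: explicitly (re)initialize the embedding row for the new token.
# Assumes tokenizer.add_special_tokens and model.resize_token_embeddings
# (from the snippet above) have already run.
new_token_id = tokenizer.convert_tokens_to_ids("[NEW]")
embeddings = model.get_input_embeddings()
with torch.no_grad():
    embeddings.weight[new_token_id].normal_(
        mean=0.0, std=model.config.initializer_range
    )

# Alternative idea: start from the original [MASK] embedding instead.
# orig_mask_id = tokenizer.convert_tokens_to_ids("[MASK]")
# with torch.no_grad():
#     embeddings.weight[new_token_id] = embeddings.weight[orig_mask_id].clone()

Is something like this needed, or does resize_token_embeddings already handle the initialization of new rows on its own?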