How can I pretrain a new model from scratch, re-initializing it with my own vocab?

I want to pretrain with the BERT model defined in HuggingFace Transformers, keeping all the architectural details and config, but re-initializing the vocabulary (removing the pre-defined one) and also re-initializing the weights.

To be specific, I want to change the tokenization method and vocabulary but still use the BERT architecture. Is it possible? I couldn't find any useful resources on this. Thanks so much!!
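For reference, here is a minimal sketch of what I have in mind, assuming the `tokenizers` library for training a new WordPiece vocab and a randomly initialized `BertForMaskedLM` built from a config rather than from pretrained weights (the toy corpus and file names are just placeholders):

```python
from tokenizers import BertWordPieceTokenizer
from transformers import BertConfig, BertForMaskedLM, BertTokenizerFast

# Write a tiny toy corpus so this sketch is self-contained;
# replace with your real pretraining data.
with open("corpus.txt", "w") as f:
    f.write("hello world\nthis is a toy corpus for the new vocab\n")

# Train a brand-new WordPiece vocabulary from scratch.
wp_trainer = BertWordPieceTokenizer(lowercase=True)
wp_trainer.train(files=["corpus.txt"], vocab_size=30000)
wp_trainer.save_model(".")  # writes vocab.txt

# Load the new vocab into a fast BERT tokenizer.
tokenizer = BertTokenizerFast(vocab_file="vocab.txt")

# Building the model from a config (instead of from_pretrained)
# gives randomly initialized weights with the BERT architecture intact.
config = BertConfig(vocab_size=tokenizer.vocab_size)
model = BertForMaskedLM(config)
```

Would something along these lines be the right approach, or is there a more standard way to do it?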