Fine-tuning pretrained BERT with new vocabulary

Hello,
I want to fine-tune a pretrained BERT model for masked word prediction, but my data uses a different vocabulary, with tokens like "ACADMBLD123".
Is there any way to add my vocabulary to the model and then train it to predict the masked token in a sequence of these special tokens?
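
What I had in mind is roughly the sketch below (untested; the checkpoint name, toy data, and hyperparameters are just placeholders): add the new tokens to the tokenizer, resize the embedding matrix, and then run standard masked-LM fine-tuning.

```python
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import Dataset

# Load a pretrained checkpoint ("bert-base-uncased" is a placeholder)
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Add my special vocabulary so it isn't split into word pieces
new_tokens = ["ACADMBLD123"]  # plus the rest of my vocab
num_added = tokenizer.add_tokens(new_tokens)

# Grow the embedding matrix to make room for the new token ids
# (the new rows are randomly initialized and learned during fine-tuning)
model.resize_token_embeddings(len(tokenizer))

# Toy corpus of my token sequences (placeholder data)
texts = ["ACADMBLD123 ACADMBLD123 ACADMBLD123"]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
    remove_columns=["text"],
)

# Collator randomly masks 15% of tokens for the MLM objective
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-mlm-custom", num_train_epochs=3),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()
```

Would this be the right approach, or is there a better way to handle a vocabulary like mine?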
Thanks