Using custom embeddings to pre-train a model on a new vocabulary

Hi,
I have my own custom embeddings that I want to use to pre-train the BertForMaskedLM model on an entirely new vocabulary.

How do I pass these custom embeddings?
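To make the question concrete, here is a minimal sketch of what I'm aiming for, assuming the custom embeddings are a `(vocab_size, hidden_size)` tensor (all sizes below are placeholders, and the random tensor stands in for my actual matrix). Is something like this the right approach?

```python
import torch
from transformers import BertConfig, BertForMaskedLM

# Placeholder sizes for the new vocabulary; substitute your own.
vocab_size, hidden_size = 1000, 64
custom_embeddings = torch.randn(vocab_size, hidden_size)  # stand-in for my matrix

# Build a fresh (untrained) model sized to the new vocabulary.
config = BertConfig(
    vocab_size=vocab_size,
    hidden_size=hidden_size,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=256,
)
model = BertForMaskedLM(config)

# Copy the custom matrix into the model's input embedding layer.
with torch.no_grad():
    model.get_input_embeddings().weight.copy_(custom_embeddings)

# Optionally freeze the embeddings so pre-training doesn't update them.
model.get_input_embeddings().weight.requires_grad = False
```

In particular, I'm unsure whether copying into `get_input_embeddings()` is the intended way, or whether there is a dedicated argument for this.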