Pre-train a model with inputs_embeds

I am using BertForPreTraining to pre-train a model from scratch. Is there a way to feed the model vectors via inputs_embeds instead of token IDs (input_ids), while still performing the masking task? In other words, how can I train on MLM when I cannot insert the [MASK] token into the input directly?
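For context, here is a minimal sketch of the kind of thing I have in mind: look up the embeddings myself, overwrite the masked positions with the embedding vector of the [MASK] token, and pass the result as inputs_embeds. The tiny config sizes, the mask token id, and the 15% masking rate are just illustrative assumptions, not my real setup.

```python
import torch
from transformers import BertConfig, BertForPreTraining

torch.manual_seed(0)

# Toy config so the model can be built from scratch (sizes are placeholders)
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertForPreTraining(config)

mask_token_id = 4  # assumed id of [MASK] in this toy vocabulary
batch, seq_len = 2, 8
input_ids = torch.randint(5, 100, (batch, seq_len))
labels = input_ids.clone()

# Pick ~15% of positions to mask; force at least one so the MLM loss is defined
masked_indices = torch.bernoulli(torch.full(input_ids.shape, 0.15)).bool()
masked_indices[:, 0] = True
labels[~masked_indices] = -100  # loss is computed only on masked positions

# Build inputs_embeds: normal lookup, then swap in the [MASK] embedding
word_embeddings = model.bert.embeddings.word_embeddings
inputs_embeds = word_embeddings(input_ids)
mask_embed = word_embeddings(torch.tensor(mask_token_id))
inputs_embeds = torch.where(masked_indices.unsqueeze(-1), mask_embed, inputs_embeds)

outputs = model(inputs_embeds=inputs_embeds, labels=labels,
                next_sentence_label=torch.zeros(batch, dtype=torch.long))
print(outputs.loss)  # combined MLM + NSP loss
```

This only works because I still have a [MASK] row in the embedding matrix; my actual inputs are precomputed vectors, so I'm unsure what to substitute at masked positions there.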