Training of new ELECTRA or ConvBERT language model possible?

Hi HF team,

I would like to know whether it is possible to pre-train ELECTRA (or ConvBERT) language models from scratch (self-supervised) with the HF Transformers code.

AFAIK this functionality is not available in Transformers, and you have to use the original ELECTRA or ConvBERT code to train those language models from scratch, which also means using TPUs.
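For context on what such a pretraining loop would need to do: ELECTRA's objective is replaced token detection, where a small generator fills in masked positions and the discriminator predicts, per token, whether it was replaced. A toy pure-Python sketch of the label construction (illustration only, no real model or tokenizer; `rtd_labels` and the toy generator are my own hypothetical names):

```python
def rtd_labels(original, masked_positions, generator_sample):
    """Build ELECTRA replaced-token-detection labels.

    For each masked position the generator proposes a token; the
    discriminator must predict 1 where the corrupted token differs
    from the original and 0 elsewhere. Note that if the generator
    happens to reproduce the original token, the label stays 0.
    """
    corrupted = list(original)
    for pos in masked_positions:
        corrupted[pos] = generator_sample(original[pos])
    labels = [int(c != o) for c, o in zip(corrupted, original)]
    return corrupted, labels

tokens = ["the", "cat", "sat", "on", "the", "mat"]
# Toy "generator": replaces "cat" but reproduces everything else,
# so position 4 ("the") gets label 0 despite being masked.
corrupted, labels = rtd_labels(tokens, [1, 4], lambda t: "dog" if t == "cat" else t)
# corrupted -> ["the", "dog", "sat", "on", "the", "mat"]
# labels    -> [0, 1, 0, 0, 0, 0]
```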

Is that right?
Maybe @sgugger or @stefan-it could help?