Train a transformer from scratch

Hello, everyone!
Huggingface/transformers is such a convenient library when it comes to all sorts of pretrained models. But I am wondering: is there a convenient way to train a model from scratch?
If I want to rebuild the model from Attention Is All You Need, my first thought is to modify modeling_bart.py to match the Attention Is All You Need settings and skip from_pretrained. Is there a better way to do it?
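For context, here is a minimal sketch of the direction I was considering instead of editing modeling_bart.py: build a BartConfig with hyperparameters roughly matching the base setting from the paper (d_model 512, 6 layers, 8 heads, FFN 2048; the vocab size is just a placeholder for whatever tokenizer is used) and instantiate the model directly from it, which should give randomly initialized weights:

```python
from transformers import BartConfig, BartForConditionalGeneration

# Hyperparameters approximating the base model in "Attention Is All You Need".
# vocab_size is an assumption -- it should match your tokenizer.
config = BartConfig(
    vocab_size=32000,
    d_model=512,
    encoder_layers=6,
    decoder_layers=6,
    encoder_attention_heads=8,
    decoder_attention_heads=8,
    encoder_ffn_dim=2048,
    decoder_ffn_dim=2048,
)

# Constructing the model from a config (instead of from_pretrained)
# initializes all weights randomly, so training starts from scratch.
model = BartForConditionalGeneration(config)
```

Does this config-based approach make sense, or is there a more idiomatic way?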
I am looking forward to your replies!