Open Source untrained transformer language model?

I’d like to start playing around with transformer language models — tweaking parameters and making changes to see how it affects the results — and I’d rather not go through the process of building one from scratch. Ideally I’d have a baseline model architecture plus training data that are known to produce a decent text-generation model once trained, but all I can find online are pre-trained weights.
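To be concrete, here's a rough sketch of the kind of thing I'm imagining: a small, randomly initialized decoder-only model I could train myself and then poke at. (This is just my own illustration using plain PyTorch — the sizes and names are made up, not from any known-good recipe, which is exactly what I'm hoping someone can point me to.)

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """A minimal decoder-only (causal) transformer LM, randomly initialized."""

    def __init__(self, vocab_size=1000, d_model=128, n_head=4, n_layer=2, max_len=64):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)      # token embeddings
        self.pos = nn.Embedding(max_len, d_model)         # learned positions
        layer = nn.TransformerEncoderLayer(
            d_model, n_head, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, n_layer)
        self.head = nn.Linear(d_model, vocab_size)        # next-token logits

    def forward(self, idx):
        T = idx.size(1)
        x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
        # Causal mask so each position only attends to earlier positions
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        x = self.blocks(x, mask=mask)
        return self.head(x)

model = TinyLM()
batch = torch.randint(0, 1000, (2, 16))   # fake token IDs: batch of 2, length 16
logits = model(batch)
print(logits.shape)                        # one logit vector per position
```

What I'm missing is the part around this: a tokenizer, a corpus, and hyperparameters that are known to actually converge to something coherent.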

So far the closest thing to what I want is this tutorial: “Transformer model for language understanding” (TensorFlow Text), which contains source code and a dataset for training a transformer from scratch — although it builds a translation model, while I’d prefer plain text generation. Does anything like this exist?

I don’t really care what language/libraries it uses; I’m familiar with most of them. I also don’t mind if the model is moderately big, as long as training finishes within a few days on an RTX 3080.