Is there any example of training BART for text-to-text generation?

Hey guys, according to the Hugging Face docs, BART is one of the models that's a good fit for text-to-text generation.

I'm looking for a code example that builds a custom tokenizer and trains this model from scratch on a custom dataset. Are there any code examples for this?

Here's the dataset I'm trying to use: MihaiIonascu/NL-to-IaC-train on the Hugging Face Hub.