Notebooks are now automatically created from the tutorials in the transformers documentation. You can find them all here, or click on the brand new “Open in Colab” button on the doc pages that have one.
The plan is to add more tutorials and the corresponding notebooks very soon. I’ll focus on integration with nlp datasets and the Trainer in the next few weeks, but if you have ideas for tutorials you’d like to see, please let me know here.
@mrm8488, this may not be exactly what you’re looking for, but I have been fine-tuning GPT2 with a language model head using my own loss function. I tried to detail as much of the implementation as possible in this post. Maybe parts of it will be useful for you as well, since I am also using Trainer.
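For anyone landing here later: the usual way to plug a custom loss into Trainer is to subclass it and override `compute_loss`. Below is a minimal sketch for a causal LM like GPT2; the class name `CustomLossTrainer` is my own, and the loss shown is just standard shifted cross-entropy as a placeholder for whatever loss you actually want.

```python
import torch
from transformers import Trainer

class CustomLossTrainer(Trainer):
    """Sketch of a Trainer subclass with a custom loss (hypothetical name).

    Override compute_loss and compute the loss yourself from the model
    outputs; everything else (optimizer, scheduler, logging) is inherited.
    """

    def compute_loss(self, model, inputs, return_outputs=False):
        # Pop the labels so the model doesn't compute its own loss.
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        logits = outputs.logits

        # Shift so that tokens < n predict token n (standard causal-LM setup).
        shift_logits = logits[..., :-1, :].contiguous()
        shift_labels = labels[..., 1:].contiguous()

        # Replace this with your own loss function as needed.
        loss = torch.nn.functional.cross_entropy(
            shift_logits.view(-1, shift_logits.size(-1)),
            shift_labels.view(-1),
        )
        return (loss, outputs) if return_outputs else loss
```

You would then use `CustomLossTrainer` exactly like `Trainer`, passing the same `model`, `args`, and datasets.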