Tutorial notebooks

Notebooks are now automatically created from the tutorials in the Transformers documentation. You can find them all here, or click the brand-new “Open in Colab” button on the doc pages that have one.

The plan is to add more tutorials and the corresponding notebooks very soon. I’ll focus on integration with nlp datasets and the Trainer over the next few weeks, but if you have ideas for tutorials you’d like to see, please let me know here.

Awesome! Thanks @sgugger

I would like to see a tutorial on integrating nlp with training a LM from scratch.

Very excited for nlp datasets and the Trainer, as I’ve had difficulty implementing them after the recent changes to load_dataset().

If I’m correct, the examples are planned to be converted to use the nlp datasets package. @mrm8488 @rbin

Yes, I am waiting for it!

@mrm8488, this may not be exactly what you’re looking for, but I have been fine-tuning GPT2 with a language model head using my own loss function. I tried to detail as much of the implementation as possible in this post. Maybe parts of it will be useful for you as well, since I am also using Trainer.
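To give a flavor of what “my own loss function” can mean here, below is a minimal, self-contained sketch of one such custom loss: a label-smoothed negative log-likelihood over next-token probabilities. This is only an illustration of the general idea, not the loss from the linked post; the function name and signature are made up for this example, and in a real setup you would compute it from model logits inside your training loop.

```python
import math

# Illustrative sketch (not the loss from the post above): negative
# log-likelihood with uniform label smoothing, the kind of custom loss
# one might swap in when fine-tuning a language model.

def smoothed_nll(probs, target_idx, smoothing=0.1):
    """probs: predicted probabilities over the vocabulary (sums to 1).
    target_idx: index of the gold next token.
    smoothing: mass spread uniformly over the non-gold tokens."""
    n = len(probs)
    loss = 0.0
    for i, p in enumerate(probs):
        # Smoothed target distribution: most mass on the gold token,
        # the remaining `smoothing` mass shared by the other tokens.
        q = (1.0 - smoothing) if i == target_idx else smoothing / (n - 1)
        loss -= q * math.log(p)
    return loss
```

With `smoothing=0.0` this reduces to ordinary cross-entropy on the gold token, which is a quick sanity check when plugging a custom loss into a trainer.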

Thank you so much! I will give it a try!

Hi, I think an example of multi-task learning would be great. Especially one with different loss functions for each task.
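The core of that suggestion, routing each example to a task-specific loss and combining the results, can be sketched in a few lines of plain Python. All names here (`LOSS_FNS`, `multi_task_loss`, the task labels and weights) are hypothetical, chosen just to show the dispatch pattern; a real multi-task setup would work on tensors and shared model outputs.

```python
# Hypothetical sketch of per-task losses in a multi-task setup:
# each task name maps to its own loss function, and a weighted sum
# combines them into a single training objective.

def mse_loss(pred, target):
    return (pred - target) ** 2

def hinge_loss(pred, target):
    return max(0.0, 1.0 - pred * target)

LOSS_FNS = {"regression": mse_loss, "classification": hinge_loss}

def multi_task_loss(batch, weights):
    """batch: iterable of (task_name, prediction, target) triples.
    weights: per-task weighting of each loss term."""
    total = 0.0
    for task, pred, target in batch:
        total += weights[task] * LOSS_FNS[task](pred, target)
    return total
```

The per-task weights are the usual knob for balancing tasks whose losses live on different scales.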

Do you have any multi-task learning code for reference?