Using Huggingface Trainer for custom models

Say I want to train a simple LSTM or MLP (a plain PyTorch `nn.Module`) with Trainer. Do I just need to ensure the model adheres to the following?

Is there an example of using Trainer to train models that are not HF Transformers models? Best practices?


I think the HF Trainer API is meant specifically for Transformers models, not for other models.

We don’t have an example, but as long as you follow the recommendations in that list from the documentation, you should be fine.


And if you use it successfully and want to do a short write-up, publish it; we’ll make sure to share your write-up!


Confirmed that you can train a simple LSTM or MLP with Trainer. This is nice since I can stay within the HF ecosystem. I’m not sure I’ll have time for a write-up, but as long as you follow the list in the original post, it will work.
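For anyone landing here later, this is a minimal sketch of the idea under discussion: a plain PyTorch `nn.Module` whose `forward` follows Trainer's conventions, i.e. it accepts keyword arguments matching the dataset's column names, takes an optional `labels` argument, and returns a dict whose `"loss"` key holds the loss. The class name `SimpleMLP`, the `inputs` column name, and the dimensions are illustrative, not anything prescribed by the library.

```python
import torch
from torch import nn


class SimpleMLP(nn.Module):
    """A plain MLP classifier written to be compatible with HF Trainer.

    Trainer calls ``model(**batch)``, so the forward signature must accept
    the dataset's column names as keyword arguments. When ``labels`` is
    provided, Trainer expects the loss under the ``"loss"`` key (or as the
    first element of a tuple output).
    """

    def __init__(self, input_dim=20, hidden_dim=64, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, inputs=None, labels=None):
        logits = self.net(inputs)
        if labels is not None:
            # Loss computed inside the model, as Trainer expects.
            loss = nn.functional.cross_entropy(logits, labels)
            return {"loss": loss, "logits": logits}
        return {"logits": logits}
```

With a model like this, the usual `Trainer(model=..., args=TrainingArguments(...), train_dataset=...)` setup should work, as long as each dataset item is a dict whose keys match the forward signature (here, `"inputs"` and `"labels"`).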
