When to use AutoModelForSeq2SeqLM?

Can anyone tell me when to use AutoModelForSeq2SeqLM? Is it generally used for translation? Can I use AutoModelForSeq2SeqLM for fine-tuning a custom task with a T5 model? If not, when should I use AutoModelForSeq2SeqLM and when T5ForConditionalGeneration?

When to use

  1. AutoModelForSeq2SeqLM.from_pretrained('t5-base')
  2. T5ForConditionalGeneration.from_pretrained('t5-base')


AutoModelForSeq2SeqLM can be used to load any sequence-to-sequence (encoder-decoder) model that has a language modeling (LM) head on top. These include BART, PEGASUS, T5, and others. You can check the full list of supported models in the docs: Auto Classes
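To see the dispatch in action, here is a minimal sketch (assuming transformers is installed; it uses tiny, randomly initialized configs so nothing is downloaded, and the layer sizes are arbitrary):

```python
from transformers import AutoModelForSeq2SeqLM, BartConfig, T5Config

# Tiny configs just to demonstrate class resolution -- not usable models.
configs = [
    T5Config(vocab_size=100, d_model=32, d_ff=64, num_layers=1, num_heads=2),
    BartConfig(vocab_size=100, d_model=32,
               encoder_layers=1, decoder_layers=1,
               encoder_attention_heads=2, decoder_attention_heads=2,
               encoder_ffn_dim=64, decoder_ffn_dim=64),
]

for config in configs:
    model = AutoModelForSeq2SeqLM.from_config(config)
    # The Auto class dispatches on the config type to a concrete class:
    print(type(config).__name__, "->", type(model).__name__)
```

The same dispatch happens with from_pretrained, where the config is read from the checkpoint instead of built by hand.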

So when you do AutoModelForSeq2SeqLM.from_pretrained('t5-base'), it actually loads a T5ForConditionalGeneration for you behind the scenes. The two calls are equivalent for T5 checkpoints, so either works for fine-tuning; AutoModelForSeq2SeqLM just saves you from hard-coding the architecture-specific class.
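You can check the equivalence yourself. A small sketch (a tiny local config stands in for the 't5-base' checkpoint's config, to avoid a download; from_pretrained('t5-base') resolves the class the same way):

```python
from transformers import AutoModelForSeq2SeqLM, T5Config, T5ForConditionalGeneration

# Tiny stand-in config; sizes are arbitrary assumptions.
config = T5Config(vocab_size=100, d_model=32, d_ff=64, num_layers=1, num_heads=2)

auto_model = AutoModelForSeq2SeqLM.from_config(config)
t5_model = T5ForConditionalGeneration(config)

# The Auto class hands back the exact same concrete class:
assert type(auto_model) is type(t5_model)
print(type(auto_model).__name__)
```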