Max Seq Lengths

Hi,
I was wondering what the default max sequence length is when using the Trainer API?

I am fine-tuning a RoBERTa-based model.


Are you wondering what the values are, or whether you need to set it yourself in the tokenizer and the Trainer?
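
If it's the values: the limit comes from the tokenizer/model config rather than the Trainer itself, which doesn't truncate anything for you. Here's a minimal sketch of how to check and apply it, assuming a `roberta-base` checkpoint (swap in your own model name):

```python
from transformers import AutoTokenizer, AutoConfig

checkpoint = "roberta-base"  # placeholder; use your own RoBERTa-based checkpoint

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
config = AutoConfig.from_pretrained(checkpoint)

# Length the tokenizer truncates/pads to when asked (512 for roberta-base)
print(tokenizer.model_max_length)

# Hard limit from the model's position embeddings (514 for roberta-base,
# which includes the offset reserved for padding/special positions)
print(config.max_position_embeddings)

# The Trainer relies on the dataset already being tokenized, so truncation
# happens here, not inside the Trainer
encoded = tokenizer(
    "some example text",
    truncation=True,
    max_length=tokenizer.model_max_length,
)
print(len(encoded["input_ids"]))
```

So in practice you set `max_length`/`truncation` in your tokenization step (e.g. in the function you pass to `dataset.map`), and the Trainer just consumes those fixed-length inputs.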
