Hi,
I was wondering what the default max sequence lengths are when using the Trainer API?
I am fine-tuning a RoBERTa-based model.
Are you asking what the default values are, or whether you need to set the length in the tokenizer and the Trainer?
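In the meantime, a quick way to check is to inspect the tokenizer itself. Here is a minimal sketch, assuming a standard RoBERTa checkpoint (substitute your own model name); note that the Trainer does not truncate for you, so the length is controlled at tokenization time:

```python
from transformers import AutoTokenizer

# "roberta-base" is used here as an example checkpoint;
# replace it with your own model.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# The tokenizer reports the model's maximum sequence length
# (512 tokens for RoBERTa-base and RoBERTa-large).
print(tokenizer.model_max_length)  # -> 512

# The Trainer itself applies no truncation; set it yourself
# when preprocessing your dataset.
encoded = tokenizer(
    "An example sentence.",
    truncation=True,
    max_length=tokenizer.model_max_length,
)
```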