Seq2Seq Predictions for Longer Sequences, and a Question About the compute_metrics Function

Hello all, I am new to the Hugging Face forums and pretty new to the Transformers library in general. I have two questions I'm specifically looking for answers to; pardon me if they're basic.

TL;DR

- Transformers noob
- Fine-tuning Seq2Seq models on an internal dataset
- Unable to figure out how to change the model config for longer output sequence lengths
- Unable to figure out how to write my own compute_metrics function for the eval step of the Trainer

I am working on a Seq2Seq problem: the model takes user prompts as input and generates text that is later consumed by certain rule-based systems downstream. The data is internal and can't be shared publicly. So far I have fine-tuned google/t5-base and google/mt5-base on my dataset, and the outputs are pretty good, but the one issue I see is that the output sequence is incomplete (there is clearly more text that should be generated), which I suspect is because the default output length is not long enough.
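To make the question concrete, here is a minimal sketch of where I would expect to control this (the checkpoint name, prompt, and the 256 value are placeholders, not my actual setup):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint; my real model is fine-tuned on internal data
model_name = "google/mt5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

inputs = tokenizer("example user prompt", return_tensors="pt")

# My understanding is that generation length is capped at generation time
# (max_new_tokens / max_length), not by anything baked into the fine-tuned
# weights, so raising this should allow longer outputs. 256 is a guess.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If that's the right lever, I assume the Trainer-side equivalent is setting generation_max_length (together with predict_with_generate=True) in Seq2SeqTrainingArguments, but I'd appreciate confirmation that this is how it's supposed to work.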

Also, I am using the Trainer API for training my model. I don't really understand the compute_metrics function and how it's supposed to work; the default ones I find on the forums throw compute errors. Can anyone please help me figure out these two things? Thank you :slight_smile:
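To show where I'm stuck, here is the kind of compute_metrics I've been trying to adapt from forum posts (a sketch that assumes predict_with_generate=True, the evaluate library, and ROUGE as the metric; it reuses the tokenizer from the snippet above):

```python
import numpy as np
import evaluate

rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    # With predict_with_generate=True, preds should be generated token IDs.
    # Labels pad with -100, which the tokenizer can't decode, so swap in
    # the real pad token ID before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    return {k: round(v, 4) for k, v in result.items()}
```

Is my understanding correct that eval_pred is a (predictions, labels) pair, and could the missing -100 replacement be what's causing the errors in the snippets I found? Any pointers would be appreciated.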