Hi, I'm new to transformers. I'm trying to train mT5, but I couldn't find a "max_position_embeddings" option in its configuration. How do I define the maximum sequence length for mT5?
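For context on what I've tried: mT5 (like T5) uses relative position embeddings, so its config has no `max_position_embeddings` field, and the practical limit seems to be set at tokenization time via `max_length`/`truncation`. A minimal sketch of what I mean (model name and `max_length=64` are just placeholders):

```python
from transformers import AutoTokenizer

# mT5 uses relative position buckets rather than absolute position
# embeddings, so the sequence length is capped when tokenizing, not
# in the model config.
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

batch = tokenizer(
    ["An example sentence to encode."],
    max_length=64,        # assumed cap; pick one that fits your data/memory
    truncation=True,      # cut off anything past max_length
    padding="max_length", # pad shorter inputs up to max_length
)

# Every encoded sequence now has exactly 64 token ids.
print(len(batch["input_ids"][0]))
```

Is this the right way to control it, or is there a model-side setting I'm missing?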