Hi, I am new to transformers. I am trying to train mT5, but I couldn't find a `max_position_embeddings` option in its configuration. How do I define the maximum sequence length for mT5?
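For reference, here is what I am trying so far. This is only a minimal sketch based on my guess that mT5 (being T5-based) uses relative position embeddings, so the sequence length would be controlled on the tokenizer side via `max_length` rather than in the model config. The checkpoint name `google/mt5-small` and the length values (512 and 64) are just placeholders from my experiments — please correct me if this is not the right approach.

```python
# My guess: mT5 has no max_position_embeddings because T5-style models use
# relative position biases, so I try to cap length when tokenizing instead.
from transformers import MT5Tokenizer, MT5ForConditionalGeneration

tokenizer = MT5Tokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

text = "Translate English to German: The house is wonderful."

# Truncate/pad the input to my chosen maximum length (512 is my placeholder).
inputs = tokenizer(
    text,
    max_length=512,
    truncation=True,
    padding="max_length",
    return_tensors="pt",
)

# Cap the generated output length separately (64 is also a placeholder).
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Is setting `max_length` at tokenization and generation time the intended way to bound sequence length for mT5, or is there a config field I am missing?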