mT5 maximum sequence length

Hi, I am new to transformers. I am trying to train mT5, but I couldn't find a "max_position_embeddings" option in its configuration. How do I define the maximum sequence length for mT5?
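
For context, here is roughly what I checked, as a sketch (I expected a "max_position_embeddings" field like in BERT-style configs, but the mT5 config only seems to expose relative-attention settings):

```python
# Sketch of what I tried (assumes the transformers library is installed).
from transformers import MT5Config

config = MT5Config()  # default mT5 configuration

# There is no max_position_embeddings entry in the config dict:
print("max_position_embeddings" in config.to_dict())

# Instead I only see relative-attention settings, e.g.:
print(config.relative_attention_num_buckets)
```

Is the maximum sequence length supposed to be controlled somewhere else, e.g. on the tokenizer side?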