Hello, I'm just curious about the value of model_max_length in some tokenizer configs.
Some models have a value set, e.g. all T5-based models have a model_max_length of 512.
In other models (e.g. MBZUAI/bactrian-x-llama-13b-merged) no value is set, so it falls back to the default VERY_LARGE_INTEGER.
Do some tokenizers really have no limit? Or did the authors "forget" to set the value?
How can someone find out what the maximum length is?
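For what it's worth, here is a small sketch of how I've been checking this: read `tokenizer.model_max_length`, compare it against the library's `VERY_LARGE_INTEGER` sentinel, and cross-check the model config's position-embedding size. The checkpoint names are just examples; this assumes `transformers` is installed.

```python
# Sketch: inspect model_max_length and cross-check it against the model
# config's position-embedding limit. "t5-small" and "gpt2" are example
# checkpoints, not the model from the question.
from transformers import AutoConfig, AutoTokenizer
from transformers.tokenization_utils_base import VERY_LARGE_INTEGER

for name in ["t5-small", "gpt2"]:
    tok = AutoTokenizer.from_pretrained(name)
    cfg = AutoConfig.from_pretrained(name)

    if tok.model_max_length >= VERY_LARGE_INTEGER:
        # tokenizer_config.json did not record a model_max_length,
        # so the tokenizer itself imposes no limit
        print(f"{name}: no limit recorded in the tokenizer config")
    else:
        print(f"{name}: tokenizer says {tok.model_max_length}")

    # The model config is often the more reliable source: look for
    # max_position_embeddings (GPT-2 style configs alias it to n_positions).
    # Note: T5 uses relative position embeddings, so its config has no such
    # field at all and this prints None -- its 512 is only a tokenizer hint.
    limit = getattr(cfg, "max_position_embeddings", None) or getattr(
        cfg, "n_positions", None
    )
    print(f"{name}: config position limit = {limit}")
```

So my current understanding: when the field is missing, the tokenizer reports the `VERY_LARGE_INTEGER` sentinel rather than a real limit, and the actual context window (if the architecture has one) lives in the model config instead. Please correct me if that's wrong.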