Wrong model_max_length for some models

Hi there,
I've noticed that for some models, in

tokenizer = AutoTokenizer.from_pretrained(model_name)

such as mistralai/Mistral-7B-Instruct-v0.2 and the Llama-2 series, tokenizer.model_max_length comes back as 1000000000000000019884624838656, which is clearly not correct.
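For what it's worth, the odd value looks like a sentinel rather than a corrupted number: if I'm reading the transformers source correctly, tokenizers fall back to VERY_LARGE_INTEGER = int(1e30) when tokenizer_config.json doesn't define model_max_length, and int(1e30) prints as exactly these digits because 1e30 is not exactly representable as a binary float. A minimal check in plain Python (no model download needed):

```python
# int(1e30) is (as far as I can tell) the "no limit configured" sentinel
# that transformers assigns to model_max_length when the tokenizer config
# omits it; the strange trailing digits come from 1e30 being rounded to
# the nearest representable double before conversion to int.
sentinel = int(1e30)
print(sentinel)  # 1000000000000000019884624838656
```

If that's the cause, a workaround is to set tokenizer.model_max_length yourself, e.g. from the model config's max_position_embeddings, after loading.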

For other models, such as lmsys/vicuna-7b-v1.5-16k, model_max_length is 16384, which is correct.

May I know the reason?