[Tokenizers] What is this max_length number?

When I load a fast tokenizer, I see a strange value for “model_max_length”: “1000000000000000019884624838656”. What does this number mean?

from transformers import AutoTokenizer

model_name = 'microsoft/mdeberta-v3-base'

# Load the tokenizer and inspect its attributes
tokenizer = AutoTokenizer.from_pretrained(model_name)
vars(tokenizer)  # shows model_max_length = 1000000000000000019884624838656


It’s just a placeholder: the tokenizer config doesn’t define a max length for this model, so transformers falls back to a very large default (int(1e30); the odd trailing digits come from floating-point rounding).
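
If you want to sanity-check this or set a real limit yourself, here is a minimal sketch (the 512 used below is just an example value, not the model’s official limit):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('microsoft/mdeberta-v3-base')

# The placeholder equals int(1e30); the strange digits are a float-rounding artifact
print(tokenizer.model_max_length == int(1e30))  # True

# Override it with the context size you actually want, or pass
# truncation/max_length explicitly when encoding
tokenizer.model_max_length = 512
enc = tokenizer("some long input text", truncation=True, max_length=512)
print(len(enc["input_ids"]))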


FYI, this can also happen with Llama 2 7B.