When I load a fast tokenizer, I see the strange number "1000000000000000019884624838656" as the value of "model_max_length". What does this strange model max length mean?
from transformers import AutoTokenizer

model_name = 'microsoft/mdeberta-v3-base'
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Inspect the tokenizer's attributes; 'model_max_length' shows the odd value
vars(tokenizer)
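For reference, the number itself is not mysterious arithmetic: it is simply `int(1e30)`. The float `1e30` cannot be represented exactly as a 64-bit double, so Python stores the nearest representable value, which converts to exactly that integer. transformers uses a very large integer like this as a sentinel meaning "no model_max_length was set in the tokenizer config" (the exact sentinel name is an implementation detail and may vary by version). A minimal check, with no transformers dependency:

```python
# The "strange" number is just the float 1e30 cast to an int.
# 1e30 is not exactly representable as a 64-bit float, so the
# nearest double is used, which converts to this exact integer.
sentinel = int(1e30)
print(sentinel)  # -> 1000000000000000019884624838656
```

So the value effectively means "unbounded": the tokenizer config for this checkpoint does not declare a maximum input length, and you may need to set `tokenizer.model_max_length` (or pass `max_length=` when encoding) yourself.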