Hugging Face Forums
Inference input token number set as the max length always?
Beginners
vivek9840
April 21, 2024, 8:28pm
These are the sources I learned from; they might help you.
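For the thread's question — whether inference inputs are always as long as the model's max length — the answer depends on the tokenizer's padding mode. Below is a minimal toy sketch (not the real Hugging Face API; the tokenizer here is a made-up word splitter for illustration) of how the padding setting determines the length the model actually sees:

```python
def tokenize(text):
    # Toy "tokenizer": one token per word (illustration only).
    return text.split()

def encode_batch(texts, padding=False, max_length=None, pad_token="[PAD]"):
    batch = [tokenize(t) for t in texts]
    if max_length is not None:
        # Truncate anything longer than max_length.
        batch = [toks[:max_length] for toks in batch]
    if padding == "max_length" and max_length is not None:
        target = max_length            # every input padded to max_length
    elif padding:
        target = max(len(t) for t in batch)  # dynamic: pad to longest in batch
    else:
        return batch                   # no padding: inputs keep their own length
    return [toks + [pad_token] * (target - len(toks)) for toks in batch]

texts = ["hello world", "a longer input sentence here"]

# Dynamic padding: lengths follow the longest sample, not max_length.
print([len(t) for t in encode_batch(texts, padding=True, max_length=512)])

# padding="max_length": every input is blown up to max_length,
# which wastes compute and slows inference on short inputs.
print([len(t) for t in encode_batch(texts, padding="max_length", max_length=8)])
```

So inputs are only forced to the max length when `padding="max_length"` is requested; with dynamic padding (or none), short inputs stay short.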