Hugging Face Forums
Inference input token number set as the max length always?
Beginners
vivek9840
April 21, 2024, 8:28pm
11
These are the sources I learned from; they might help you.
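For readers arriving via the thread title, here is a minimal sketch of the behavior being asked about, contrasting dynamic padding with `padding="max_length"`. It assumes the `bert-base-uncased` checkpoint purely for illustration (that model is not mentioned in this post):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
texts = ["a short sentence", "a slightly longer example sentence for comparison"]

# Dynamic padding: inputs are only as long as the longest sequence in the batch.
dynamic = tokenizer(texts, padding=True, return_tensors="pt")
print(dynamic["input_ids"].shape)  # e.g. torch.Size([2, 9])

# Fixed padding: every input is padded (and truncated) to max_length tokens,
# which is why the inference input token count always hits the maximum.
fixed = tokenizer(
    texts, padding="max_length", truncation=True, max_length=512, return_tensors="pt"
)
print(fixed["input_ids"].shape)  # torch.Size([2, 512])
```

With `padding="max_length"`, the model processes 512 tokens per example regardless of the actual text length, which also explains the slowdown discussed in the related padding threads.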