I used Hugging Face code to train BERT with masked language modeling (MLM). What should the maximum length of the input be, and is that measured in characters or tokens? How do I change this length?
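For context, BERT's limit is measured in tokens, not characters: its learned position embeddings cap inputs at 512 positions (`model.config.max_position_embeddings`), and two of those are taken by the `[CLS]` and `[SEP]` special tokens. In the Hugging Face stack you typically enforce this with `tokenizer(..., truncation=True, max_length=512)`, or via the `--max_seq_length` argument of the example `run_mlm.py` script. A minimal sketch of how the token budget works (the helper name `truncate_for_bert` is hypothetical, for illustration only):

```python
MAX_POSITION_EMBEDDINGS = 512  # BERT's hard positional limit, counted in tokens


def truncate_for_bert(token_ids, max_length=MAX_POSITION_EMBEDDINGS):
    """Keep at most max_length - 2 content tokens,
    leaving room for the [CLS] and [SEP] special tokens."""
    budget = max_length - 2
    return token_ids[:budget]


# A 1000-token sequence is cut to 510 content tokens (+2 specials = 512).
ids = list(range(1000))
print(len(truncate_for_bert(ids)))  # → 510
```

Note that raising `max_length` above 512 is not enough on its own: the position-embedding matrix of a pretrained BERT checkpoint only has 512 rows, so longer inputs require resizing (and retraining) those embeddings or switching to a long-context model.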