Hello, I want to fine-tune a DeBERTa model for sequence classification with limited compute, so I will need to freeze some layers. Through EDA I noticed that the presence of tokens unknown to the DeBERTa tokenizer is strongly correlated with the labels, so I was planning to add some of them to the tokenizer. My question is whether it is possible to freeze the embeddings of the known tokens while learning the representations of the new ones, or whether I just need to unfreeze the whole embedding module. Thank you so much!
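For concreteness, here is a rough sketch of one approach I was considering: keep the whole embedding matrix trainable, but register a tensor hook that zeroes the gradient for the original vocabulary rows, so only the newly added rows actually get updated. The checkpoint name, label count, and new tokens below are just placeholders, not taken from my real setup:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder checkpoint and tokens -- substitute your own.
checkpoint = "microsoft/deberta-v3-base"
new_tokens = ["<new_tok_1>", "<new_tok_2>"]

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=2
)

# Remember the vocabulary size before adding tokens, then resize
# the embedding matrix to make room for the new rows.
old_vocab_size = len(tokenizer)
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

embeddings = model.get_input_embeddings()

# The full embedding matrix stays trainable, but this hook masks
# gradients for the original rows, so only the new-token rows are
# updated during backprop.
def zero_grad_for_old_tokens(grad):
    mask = torch.zeros_like(grad)
    mask[old_vocab_size:] = 1.0
    return grad * mask

embeddings.weight.register_hook(zero_grad_for_old_tokens)
```

One caveat I am aware of: an optimizer with decoupled weight decay (e.g. AdamW) would still shrink the masked rows even when their gradients are zero, so the embedding parameter would presumably need its weight decay set to 0, or the frozen rows restored after each step. Is this the right direction, or is there a cleaner way?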