I am following the tutorial from chapter 7 of the NLP course to fine-tune a BERT model on my own dataset, for the same purpose as the tutorial: fine-tuning a model for domain adaptation. The code in the tutorial uses an AutoModelForMaskedLM object. I was wondering if it's reasonable to use the plain AutoModel class instead. My thinking is that I want to fine-tune the model to get better embeddings for my dataset, not for any specific downstream task. For clarification, the code I am following is here.
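For context, here is a minimal sketch (mine, not from the tutorial; `bert-base-uncased` is just an illustrative checkpoint) of the practical difference between the two classes as I understand it: the `ForMaskedLM` variant carries an LM head and returns a loss when given labels, while the bare `AutoModel` only returns hidden states, so there is no objective to train against out of the box.

```python
from transformers import AutoModel, AutoModelForMaskedLM, AutoTokenizer

checkpoint = "bert-base-uncased"  # illustrative checkpoint, not from the post
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# AutoModelForMaskedLM = BERT encoder + MLM head; its forward pass
# returns a loss when `labels` are provided, so Trainer can optimize it.
mlm_model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# AutoModel = bare encoder; it returns hidden states only, with no head
# and no loss, so it provides nothing to minimize during fine-tuning.
base_model = AutoModel.from_pretrained(checkpoint)

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")

mlm_out = mlm_model(**inputs, labels=inputs["input_ids"])
print(mlm_out.loss)  # scalar loss -> trainable

base_out = base_model(**inputs)
print(base_out.last_hidden_state.shape)  # embeddings only, no loss
```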