I am following the tutorial from chapter 7 of the NLP course in order to fine-tune a BERT model on my own dataset, for the same purpose as the tutorial: fine-tuning a model for domain adaptation. The code in the tutorial uses an AutoModelForMaskedLM object. I was wondering if it's reasonable to use just the AutoModel class instead. My thinking is that I want to fine-tune a model to get better embeddings for my dataset, not necessarily for any specific task. For clarification, the code I am following is here.
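For context, a minimal sketch of what the two classes load (using `bert-base-uncased` as a stand-in for whatever checkpoint the tutorial uses): `AutoModelForMaskedLM` attaches the masked-language-modeling head, so its forward pass can accept `labels` and return a training loss, which is what the Trainer needs for domain-adaptation pre-training. `AutoModel` loads only the base encoder, which outputs hidden states but has no head and therefore no loss to train against.

```python
from transformers import AutoModel, AutoModelForMaskedLM

checkpoint = "bert-base-uncased"  # stand-in checkpoint for illustration

# Base encoder + MLM head: forward(labels=...) returns a loss,
# so this is what you train with the masked-LM objective.
mlm_model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Base encoder only: outputs hidden states (embeddings), no head,
# hence nothing for the Trainer to compute a loss from.
base_model = AutoModel.from_pretrained(checkpoint)

# For BERT checkpoints, the MLM head lives in the `cls` submodule.
print(hasattr(mlm_model, "cls"))   # the MLM head is present
print(hasattr(base_model, "cls"))  # absent on the bare encoder
```

So the usual pattern is to fine-tune with `AutoModelForMaskedLM` (the head provides the training signal), then afterwards load the saved checkpoint with `AutoModel` to extract embeddings.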