Domain adaptation with MLM and NSP

Hi,

I’ve written a domain-adaptation script for further pretraining a BERT model (Portuguese) on a specific domain. I used the article Fine-tuning a masked language model from the Hugging Face Course as a guide, but it only covers masked language modeling, not next sentence prediction.

I’d like to know what I need to change in the code to add support for next sentence prediction. I know I should use BertForPreTraining instead of AutoModelForMaskedLM, but my question is how to build the data and labels for NSP.
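To make the question concrete, here is a minimal sketch of what I think the inputs to BertForPreTraining should look like. I’m using the BERTimbau checkpoint neuralmind/bert-base-portuguese-cased only as a placeholder, and make_nsp_examples is a toy helper I wrote for the 50/50 pairing described in the BERT paper, so I don’t know if it’s the idiomatic way to do it:

```python
import random

import torch
from transformers import BertForPreTraining, BertTokenizerFast

# Placeholder checkpoint; substitute the model you are adapting.
model_name = "neuralmind/bert-base-portuguese-cased"
tokenizer = BertTokenizerFast.from_pretrained(model_name)
model = BertForPreTraining.from_pretrained(model_name)


def make_nsp_examples(sentences):
    """Pair consecutive sentences: 50% true next sentence (label 0),
    50% random sentence from the corpus (label 1). Toy version: a real
    pipeline would avoid sampling the true next sentence as a negative."""
    examples = []
    for i in range(len(sentences) - 1):
        if random.random() < 0.5:
            examples.append((sentences[i], sentences[i + 1], 0))
        else:
            examples.append((sentences[i], random.choice(sentences), 1))
    return examples


sent_a, sent_b, nsp_label = make_nsp_examples([
    "Primeira frase do documento.",
    "Segunda frase do documento.",
    "Terceira frase do documento.",
])[0]

# Encoding a pair sets token_type_ids: 0 for sentence A, 1 for sentence B.
encoding = tokenizer(sent_a, sent_b, return_tensors="pt")

# MLM labels: -100 everywhere except masked positions, which keep the
# original token id. A single token is masked here just to show the shapes;
# real pretraining masks ~15% of tokens (DataCollatorForLanguageModeling
# can do that part).
mlm_labels = torch.full_like(encoding["input_ids"], -100)
pos = 2
mlm_labels[0, pos] = encoding["input_ids"][0, pos].item()
encoding["input_ids"][0, pos] = tokenizer.mask_token_id

outputs = model(
    **encoding,
    labels=mlm_labels,
    next_sentence_label=torch.tensor([nsp_label]),
)
print(outputs.loss)  # combined MLM + NSP loss
```

My understanding is that the model returns a single loss that already sums the MLM and NSP terms, so the Trainer loop itself shouldn’t need changes; it’s building the paired dataset that I’m unsure about.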

Thanks in advance.


Hi. Same question here. I followed the tutorial and was wondering how to go about domain adaptation for generation tasks, which would use CLM rather than MLM. Were you able to figure things out?
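In case it helps, here is the minimal change I’m experimenting with relative to the MLM tutorial, a sketch assuming gpt2 as a placeholder checkpoint:

```python
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
)

model_name = "gpt2"  # placeholder; use whichever causal LM you are adapting
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# GPT-2 has no padding token, so reuse EOS for padding.
tokenizer.pad_token = tokenizer.eos_token

# mlm=False turns off masking: the collator copies input_ids into labels
# (with padding set to -100) and the model shifts them internally, so no
# extra label columns are needed, unlike MLM/NSP.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
```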

I have a similar question. Were you able to figure it out?

I have the same question.