Further pre-training language models like BERT with transformers

Hi all,
Yesterday I learned about this forum through a workshop, so I have the following question: is it possible to further pre-train transformers (e.g. BERT, DistilBERT) on my own corpus? I mean not for a downstream task, but the language model itself (e.g. the BERT pre-training tasks MLM and NSP)? Is it possible in general, and with Hugging Face specifically?

Thank you. Best regards,
Liza

Hi @lizzzi111, nice to see you here :slight_smile:

Yes, it’s possible.

Examples and a README for doing so are here: https://github.com/huggingface/transformers/tree/master/examples/language-modeling
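In case that page moves, here is a minimal sketch of what continued masked-language-model pre-training looks like with the Trainer API. The corpus file `my_corpus.txt` (plain text, one document per line), the checkpoint choice, and the hyperparameters are all assumptions; adjust them to your setup.

```python
# Minimal sketch: continue MLM pre-training of a released checkpoint on your own corpus.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-uncased"  # or "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)  # start from the pre-trained weights

# "my_corpus.txt" is a hypothetical path: a plain-text file with one document per line.
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# The collator masks tokens on the fly, which is what the MLM objective needs.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="bert-further-pretrained",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    save_steps=10_000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("bert-further-pretrained")
```

Note this covers only the MLM objective; the NSP task is not part of the standard language-modeling examples, and DistilBERT does not use it at all.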

Thanks a lot, @thomwolf!

Hi @thomwolf ,

The link has expired; could you please share it again?

Thank you