How to use the Seq2seq Trainer with my own custom "[MASK]"

Hello,

My account name is yusukemori.

(I’m the one who asked about the Seq2seq Trainer at https://github.com/huggingface/transformers/issues/7740 .
Thank you for the quick, kind, and detailed reply there!)

This is my first time posting on this forum, so I apologize in advance for any mistakes in etiquette.

I’m now trying to figure out how to use my own custom “[MASK]” scheme for pre-training/fine-tuning a Seq2seq model.

Specifically, I’m wondering how to randomly apply [MASK] tokens at training time (re-sampled each epoch), and how to get the Seq2seq Trainer to load the masked data.

As far as BERT is concerned, there seems to be a BERT processor in huggingface/tokenizers, but it is not intended to be used with the Seq2seq Trainer, if I understand correctly.
Is there any processor, such as a “BARTProcessor”, for the Seq2seq Trainer?

Please allow me to ask one more question: is there code for pre-training BART from scratch with the Seq2seq Trainer? (For fine-tuning, thank you for the clear example!)

I’m afraid this is a beginner’s question.
Thank you for your help.

Sincerely,
yusukemori

Hi @yusukemori

The Seq2SeqTrainer examples don’t (yet!) support pre-training. However, you could use Seq2SeqTrainer with your own data processing logic (masking, etc.), since it’s a generic seq2seq trainer.
But you’ll need to implement the masking yourself.
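
For concreteness, here is a minimal, untested sketch of what such masking logic could look like: a data collator that re-samples [MASK] positions every time a batch is built (so the masking changes between epochs) and that you would pass to Seq2SeqTrainer via its `data_collator` argument. The class name `RandomMaskingCollator`, the 0.15 mask probability, the assumption that the dataset yields raw strings, and the `facebook/bart-base` checkpoint are all illustrative choices, not the official pre-training recipe.

```python
import torch
from transformers import BartTokenizer


class RandomMaskingCollator:
    """Sketch: re-sample [MASK] positions for every batch, so masking
    differs across epochs. Values here are illustrative only."""

    def __init__(self, tokenizer, mask_probability=0.15, max_length=128):
        self.tokenizer = tokenizer
        self.mask_probability = mask_probability
        self.max_length = max_length

    def __call__(self, batch_texts):
        # Assumes the dataset yields a list of raw strings.
        enc = self.tokenizer(
            batch_texts,
            padding=True,
            truncation=True,
            max_length=self.max_length,
            return_tensors="pt",
        )
        input_ids = enc["input_ids"]

        # Labels are the unmasked ids; ignore padding in the loss.
        labels = input_ids.clone()
        labels[labels == self.tokenizer.pad_token_id] = -100

        # Never mask special tokens (BOS/EOS/padding).
        special = torch.tensor(
            [
                self.tokenizer.get_special_tokens_mask(
                    ids, already_has_special_tokens=True
                )
                for ids in input_ids.tolist()
            ],
            dtype=torch.bool,
        )

        # Sample fresh mask positions for this batch.
        probs = torch.full(input_ids.shape, self.mask_probability)
        probs.masked_fill_(special, 0.0)
        masked = torch.bernoulli(probs).bool()
        input_ids[masked] = self.tokenizer.mask_token_id

        return {
            "input_ids": input_ids,
            "attention_mask": enc["attention_mask"],
            "labels": labels,
        }


tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
collator = RandomMaskingCollator(tokenizer)
# trainer = Seq2SeqTrainer(model=model, args=training_args,
#                          train_dataset=train_dataset,
#                          data_collator=collator)
```

Because the collator is called each time a batch is assembled, the positions of the [MASK] tokens are drawn anew every epoch without any extra bookkeeping in the dataset itself.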


Hi @valhalla

Thank you for your reply!

Thanks to your help, I’ve learned the following:

  1. The Seq2SeqTrainer examples don’t support pre-training yet.
  2. I can use Seq2SeqTrainer with my own custom data processing logic, but I have to implement the masking myself.

I will try to implement it!

I would like to thank you again for your quick advice.

yusukemori
