I tried to use ProphetNet with Seq2SeqTrainer, but it failed.
The error message says that prepare_seq2seq_batch() is not implemented
for ProphetNetTokenizer; the collator I implemented calls prepare_seq2seq_batch().
Is there any reason ProphetNet cannot have
prepare_seq2seq_batch() in its tokenizer?
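For context, here is a minimal sketch of the batch-building pattern I would like to reproduce without prepare_seq2seq_batch(): tokenize sources and targets separately and attach the target ids as labels. The function name and the stub tokenizer are mine, just to illustrate the idea; a real ProphetNetTokenizer would replace the stub.

```python
# Sketch only: the stub below stands in for a real tokenizer
# (e.g. ProphetNetTokenizer) so the pattern is self-contained.

class StubTokenizer:
    """Toy tokenizer: maps each word to its length, pads with 0."""
    def __call__(self, texts, padding=True):
        ids = [[len(w) for w in t.split()] for t in texts]
        if padding:
            width = max(len(seq) for seq in ids)
            ids = [seq + [0] * (width - len(seq)) for seq in ids]
        return {"input_ids": ids}

def make_seq2seq_batch(tokenizer, src_texts, tgt_texts):
    # Encode the sources normally, then encode the targets and
    # attach their input_ids as the labels of the batch.
    batch = tokenizer(src_texts)
    batch["labels"] = tokenizer(tgt_texts)["input_ids"]
    return batch

tok = StubTokenizer()
batch = make_seq2seq_batch(tok, ["a b c", "d"], ["x y", "z"])
```

Is this roughly what the other seq2seq tokenizers do internally, or does ProphetNet need special handling of its special tokens here?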
My understanding may be insufficient, but it seems the tokenizer implements a function that assigns special tokens in its own unique way. Is that the cause?
If prepare_seq2seq_batch() were implemented the same way as for other Seq2SeqLM tokenizers, would ProphetNet fail to reach its original performance?
Thank you in advance.