I’m trying to train a seq2seq model, using the bert2bert CNN/DailyMail example as a reference.
The problem is that the example doesn’t train correctly. I’m running the notebook as-is; the only difference is the transformers version (I’m using 4.21.3). I’m getting ROUGE-2 metrics of zero for every epoch, and also during the evaluation step.
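For context, a ROUGE-2 of exactly zero usually means the decoded predictions share no bigrams at all with the references (e.g. empty or degenerate generations), rather than just a weak model. Here’s a minimal sanity-check sketch of ROUGE-2 F1 in plain Python (my own simplification, not the notebook’s metric code) that shows how an empty generation yields 0.0:

```python
from collections import Counter

def rouge2_f1(prediction: str, reference: str) -> float:
    """Minimal ROUGE-2 F1: bigram overlap between prediction and reference."""
    def bigrams(text: str) -> Counter:
        tokens = text.lower().split()
        return Counter(zip(tokens, tokens[1:]))

    pred_bg, ref_bg = bigrams(prediction), bigrams(reference)
    overlap = sum((pred_bg & ref_bg).values())
    if overlap == 0:
        # Empty or completely non-matching generations score exactly 0.0,
        # which is the pattern I'm seeing for every epoch.
        return 0.0
    precision = overlap / sum(pred_bg.values())
    recall = overlap / sum(ref_bg.values())
    return 2 * precision * recall / (precision + recall)

print(rouge2_f1("", "the cat sat on the mat"))             # 0.0
print(rouge2_f1("the cat sat on the mat", "the cat sat"))  # ~0.571
```

So my suspicion is the model is generating empty or garbage sequences during evaluation, not that the metric itself is broken.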
Anyone seeing the same issue?