I got the following warning when evaluating a T5 model:
Token indices sequence length is longer than the specified maximum sequence length for this model (2348 > 512). Running this sequence through the model will result in indexing errors
However, there were no errors.
My question is:
What indexing exactly is this warning referring to?
My best guess is that it refers to indexing errors in fixed (absolute) positional embeddings, but in that case, shouldn't models that use relative positional embeddings, like T5, not raise this warning?
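For reference, here is a minimal sketch of how I understand the warning can be reproduced and silenced (assuming the `t5-small` checkpoint and the Hugging Face `transformers` tokenizer API):

```python
from transformers import AutoTokenizer

# Load the tokenizer; t5-small reports model_max_length == 512.
tok = AutoTokenizer.from_pretrained("t5-small")

long_text = "word " * 3000  # tokenizes to far more than 512 tokens

# This call emits the warning: the tokenizer never truncates by default,
# it only warns that the token count exceeds model_max_length.
ids = tok(long_text).input_ids
print(len(ids), tok.model_max_length)

# Truncating explicitly keeps the sequence within the limit
# and silences the warning.
ids_trunc = tok(long_text, truncation=True).input_ids
print(len(ids_trunc))
```

Note that the warning is emitted by the tokenizer, which compares the token count against `model_max_length` without knowing anything about the model's position-embedding scheme, which is presumably why T5 triggers it despite using relative positional embeddings.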