Hi,
I’m also facing the same issue while running trainer.train().
Using bert-base-uncased
Basically, I split the dataset into sliding windows because the texts exceed the tokenizer’s maximum sequence length.
The sliding windows are generated and each window is mapped to its respective label.
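For reference, the preprocessing looks roughly like this (a simplified sketch; the column names "text" and "label" and the max_length/stride values are placeholders, not my exact setup):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize_with_sliding_window(examples):
    # Split long texts into overlapping windows up to max_length tokens
    encodings = tokenizer(
        examples["text"],                  # placeholder column name
        truncation=True,
        max_length=512,
        stride=128,
        return_overflowing_tokens=True,
        padding="max_length",
    )
    # Each window inherits the label of the example it came from
    sample_map = encodings.pop("overflow_to_sample_mapping")
    encodings["labels"] = [examples["label"][i] for i in sample_map]
    return encodings

# tokenized = dataset.map(tokenize_with_sliding_window, batched=True,
#                         remove_columns=dataset.column_names)
```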
When I call trainer.train(), I get the error below:
ValueError: Expected input batch_size (2040) to match target batch_size (6392)
Can anyone please help with this error?
Thanks