RuntimeError: stack expects each tensor to be equal size, but got [12] at entry 0 and [35] at entry 1

I am following the text classification tutorial on the website, but I get this error when applying it to my own dataset, which is similar to the tutorial's. I checked whether I have empty values or empty tensors, but then I get another error about reshaping a tensor of 0 elements into shape [-1, 0], and I can't figure it out. Any help?
Here is my notebook link: arabic_tweets_classification | Kaggle

I think the tokenized texts are not of the same length, which is what this error message indicates: `torch.stack` cannot stack tensors of different sizes into one batch.

If you pad the inputs so that all sequences in a batch have the same length, the error should go away.
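To illustrate the idea, here is a minimal plain-Python sketch (using hypothetical token ids, not a real tokenizer): sequences of different lengths are padded to the longest length in the batch so they can be stacked into a single tensor. With Hugging Face tokenizers, passing `padding=True` to the tokenizer call does this for you.

```python
def pad_batch(batch, pad_id=0):
    """Pad every token-id sequence in a batch to the batch's max length."""
    max_len = max(len(seq) for seq in batch)
    return [seq + [pad_id] * (max_len - len(seq)) for seq in batch]

# Lengths 3 and 2: stacking these directly is what raises the RuntimeError.
batch = [[5, 8, 12], [3, 7]]
padded = pad_batch(batch)
print(padded)  # [[5, 8, 12], [3, 7, 0]]
```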


I had the same problem, and it went away after setting the `max_length` parameter on the tokenizer.
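For the record, setting a fixed `max_length` means every sequence is truncated or padded to exactly that length, so all batches are uniform regardless of their contents. A minimal plain-Python sketch of that behavior (the hypothetical `pad_id` stands in for the tokenizer's pad token; with Hugging Face tokenizers the equivalent call is `tokenizer(texts, padding="max_length", truncation=True, max_length=...)`):

```python
def to_fixed_length(seq, max_length, pad_id=0):
    """Truncate or pad a token-id sequence to exactly max_length."""
    seq = seq[:max_length]                         # truncate long sequences
    return seq + [pad_id] * (max_length - len(seq))  # pad short ones

print(to_fixed_length([5, 8, 12, 9, 4], max_length=4))  # [5, 8, 12, 9]
print(to_fixed_length([3, 7], max_length=4))            # [3, 7, 0, 0]
```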