An error with parallel (batched) tokenization

I encountered an error running this code, even though it ran without problems a few weeks ago. Apparently, when batch=true, the function requires all inputs to have the same length. But I don't want to pad my inputs at this step. Does anyone have an idea of how to solve this?
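The post doesn't name the library, so this is only a sketch of the usual workaround: tokenize each input on its own (keeping the sequences ragged and unpadded), and apply padding only later, at the moment a rectangular batch is actually needed (dynamic padding). The `tokenize` function below is a toy whitespace stand-in for the real tokenizer, and `pad_batch` is a hypothetical helper, not part of any specific API:

```python
def tokenize(text):
    # Toy stand-in for the real tokenizer (assumption: whitespace split).
    return text.split()

def pad_batch(batch, pad_token="<pad>"):
    # Pad only when forming a batch, not at tokenization time,
    # so each sequence keeps its natural length until then.
    max_len = max(len(seq) for seq in batch)
    return [seq + [pad_token] * (max_len - len(seq)) for seq in batch]

texts = ["hello world", "a longer example sentence here"]

# Tokenize one input at a time: results are ragged, no padding needed.
tokenized = [tokenize(t) for t in texts]

# Pad lazily, only when a same-length batch is required downstream.
batch = pad_batch(tokenized)
```

If the error comes from passing a whole list with batch=true in one call, tokenizing per-input as above sidesteps the same-length requirement; padding can then be deferred to whatever step genuinely needs rectangular tensors.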