ClientError: 400 when using batch transform for inference

Hi @marshmellow77, yes, truncation makes some of the long texts acceptable to the model, but not all of them, even when I set the length to 460 words. I tried values from 500 down to 460. I'm not sure whether I should keep reducing this or whether some other change would help.
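For reference, this is roughly how I'm capping the input length before sending records to the endpoint (a minimal sketch; the 460-word cap is the value mentioned above, and `truncate_words` is just an illustrative helper). One caveat I'm aware of: a word cap is only a rough proxy for the model's limit, since the tokenizer can split a single word into several tokens, which might explain why some texts still fail.

```python
def truncate_words(text: str, max_words: int = 460) -> str:
    """Keep at most max_words whitespace-separated words of the input text.

    Note: this counts words, not tokens. A tokenizer may produce more
    tokens than words, so texts truncated this way can still exceed the
    model's maximum sequence length.
    """
    words = text.split()
    return " ".join(words[:max_words])


# Example: a 600-word record gets cut down to 460 words.
sample = "word " * 600
truncated = truncate_words(sample)
print(len(truncated.split()))  # 460
```

If the real limit is in tokens rather than words, truncating with the model's own tokenizer (e.g. `tokenizer(text, truncation=True, max_length=...)`) instead of by word count would guarantee the input fits.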