ClientError: 400 when using batch transform for inference

Thanks for letting me know. Here is the response again:

Hey @miOmiO,

Happy to help you narrow down your issue! I think the first step would be to check whether the dataset creation in the sample (notebooks/sagemaker-notebook.ipynb at master · huggingface/notebooks · GitHub) works, or whether it errors out as well.

Additionally, could you bump the versions of the HuggingFaceModel to the latest ones? For transformers_version that’s 4.12.3 and for pytorch_version it’s 1.9.1; maybe this already solves your issue. You can find the list of available containers here: Reference

Also worth testing: replace your model with a different one, e.g. distilbert-base-uncased-finetuned-sst-2-english, to rule out a model-specific problem.
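To swap in a Hub model you don’t even need a new model.tar.gz — the Hugging Face inference container reads the HF_MODEL_ID and HF_TASK environment variables. A sketch of the config you would pass via env= to HuggingFaceModel (HF_TASK here assumes a text-classification model):

```python
# Env vars read by the Hugging Face inference DLC to pull a model
# straight from the Hub instead of an S3 artifact.
hub = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",
    "HF_TASK": "text-classification",
}

# Then pass it when creating the model, e.g.:
# HuggingFaceModel(env=hub, role=..., transformers_version="4.12.3",
#                  pytorch_version="1.9.1", py_version="py38")
```

If this model runs through your batch transform job cleanly, the 400 error most likely comes from your own model or dataset rather than from the container setup.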