Serverless Inference API error on new model

Thank you @John6666

I’ve tested it and it’s working normally now.