Serverless Inference API error on new model

I have been seeing the same errors with the Serverless Inference API. Has the issue been resolved for you?
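
In case it helps with comparing notes, here is a rough sketch of the kind of request I use to check whether the model responds. The model id and token are placeholders, not details from this thread:

```python
import requests

# Hypothetical model id and token -- substitute your own values.
MODEL_ID = "your-username/your-new-model"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
HEADERS = {"Authorization": "Bearer hf_xxx"}  # your Hugging Face access token

# Send a simple test request. A 503 usually just means the model is still
# loading on the serverless backend; other non-200 codes point to the kind
# of error being discussed here.
response = requests.post(API_URL, headers=HEADERS, json={"inputs": "Hello, world"})
print(response.status_code)
print(response.json())
```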