Serverless Inference API error on new model

When I tested the Serverless Inference API with my new model, I encountered the following error:

{'error': "We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like Saripudin/autotrain-model-datasaur-ZDAzZTc5NmI-NGJmYmVlOWU is not the path to a directory containing a file named config.json.\nCheckout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'."}
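For context, this is roughly how the model can be queried over the Serverless Inference API; a minimal sketch, assuming the standard `api-inference.huggingface.co` endpoint pattern (the token value is a placeholder):

```python
import json
import urllib.request

# Model id taken from the error message above
MODEL_ID = "Saripudin/autotrain-model-datasaur-ZDAzZTc5NmI-NGJmYmVlOWU"
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"

def query(payload: dict, token: str = "hf_xxx") -> dict:
    """POST a JSON payload to the Serverless Inference API.

    `token` is a placeholder; substitute a real access token.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example call (requires network access and a valid token):
# query({"inputs": "some sample text"})
```

Note that the error text itself comes from `transformers` trying to resolve `config.json` locally, not from the HTTP endpoint, which suggests the backend failed to load the model rather than a quota being hit.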

Does this mean that Hugging Face has now restricted Serverless Inference API usage for new models?