Hosted Inference API suddenly throwing error "is not a local folder and is not a valid model identifier"

All of my models, which have been working just fine, are now suddenly throwing this error when I load the API from the model page. They are all private models, but I've been running them without issues for several months. Any Spaces using them don't work either.

Example:
bleedchocolate/mabelle-v1 is not a local folder and is not a valid model identifier listed on ‘Models - Hugging Face’ If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login.
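In case it helps anyone hitting the same error, here is a minimal sketch of how the token the message asks for can be supplied. The helper name and the `HF_TOKEN` environment variable are just illustrative; the actual call to `from_pretrained` with `use_auth_token` is what the error suggests.

```python
import os

def resolve_hf_token(explicit_token=None):
    """Pick the access token used to authenticate private-repo downloads.

    Prefers an explicitly passed token, then falls back to an HF_TOKEN
    environment variable (e.g. set alongside `huggingface-cli login`).
    """
    return explicit_token or os.environ.get("HF_TOKEN")

# In a real script the token would be forwarded to the loader, e.g.:
#   from transformers import AutoModel
#   model = AutoModel.from_pretrained(
#       "bleedchocolate/mabelle-v1",  # the private repo from the error
#       use_auth_token=resolve_hf_token(),
#   )
```

This is only a sketch of the credential plumbing, not a fix for the server-side change described below.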

Thank you for your help :slight_smile:
Rick

Yes, it seems there was a change and private models don’t work like that anymore.

Though, for the last few hours no model seems to work on the Inference API at all; they all time out and can't be loaded. It seems like a big deal, and I find it weird that I'm the first user mentioning it. (EDIT: my comment was hidden until now, and all models are now working on the Inference API, yay!)