Private model inference widget suddenly stopped working

The inference widget for my model has stopped working on its model page on the Hub. When you click “Compute”, the loading progress bar spins for a bit and then it says:

Can’t load tokenizer using from_pretrained, please update its configuration: username/model is not a local folder and is not a valid model identifier listed on ‘Models - Hugging Face’ If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
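For what it’s worth, outside the widget that error usually just means from_pretrained received no credentials for a private or gated repo. A minimal sketch of the client-side pattern, assuming a hypothetical helper function and a placeholder token value (nothing here is the poster’s actual model or token):

```python
from typing import Optional

def auth_kwargs(token: Optional[str]) -> dict:
    """Hypothetical helper: build the extra kwargs that let
    from_pretrained resolve a private or gated repo."""
    kwargs = {}
    if token is not None:
        # `token` is the current transformers argument name;
        # older releases used `use_auth_token` for the same purpose.
        kwargs["token"] = token
    return kwargs

# Illustrative usage (needs transformers installed and a real token,
# so it is only sketched here):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("username/model",
#                                           **auth_kwargs("hf_..."))
```

If the tokenizer loads this way locally but the hosted widget still fails, the problem is on the Inference API side rather than in the repo.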

Obviously the model actually exists, since this error is showing up on its model page, and I’m logged in to the Hub because I’m able to see the page at all.

Is this the best place for bug reports like this?

Facing the same issue since July 11th.
Not sure how to report this either.

Do you have a public model example that we could examine?

@rockwalrus @valentinaprotti @radames

I have the same problem. The widget worked well until last week, and now I get the same error.
My model is gated and I gave myself access, but same problem… Maybe something changed in the Inference API?

Edit: As of July 18, 2023 at 8:30, my widget is back in business!
The Inference API just took a week off for the summer holidays. : )


Hello all, this was fixed yesterday. Thanks for your patience. Please try again.