Private model inference widget suddenly stopped working

The inference widget for my model has stopped working on its model page on the Hub. When you click "Compute", the loading progress bar spins for a bit and then it says:

Can't load tokenizer using from_pretrained, please update its configuration: username/model is not a local folder and is not a valid model identifier listed on 'Models - Hugging Face' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
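For reference, the pattern that error normally refers to in local code is something like this (a rough sketch; `username/model` is the placeholder from the error message, and the hosted widget should obviously be doing this step server-side):

```python
from transformers import AutoTokenizer

# Log in once with `huggingface-cli login`; use_auth_token=True then
# picks up the cached token automatically for private/gated repos.
tokenizer = AutoTokenizer.from_pretrained(
    "username/model",     # placeholder repo id from the error message
    use_auth_token=True,  # or pass a token string directly
)
```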

Obviously the model actually exists, since the error is showing up on its own model page, and I'm logged in to the Hub because I'm able to see that page.

Is this the best place for bug reports like this?

I've been facing the same issue since July 11th.
Not sure how to report this either.

Do you have a public model example that we could examine?

@rockwalrus @valentinaprotti @radames

I have the same problem: the widget worked well until last week, then started returning the same error.
My model is gated and I gave myself access, but the problem persists… Maybe something changed in the Inference API?
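For anyone debugging something similar, here's a quick way to confirm your token can actually see a gated or private repo (a rough sketch using huggingface_hub; `username/model` is a placeholder):

```python
from huggingface_hub import model_info

# Uses the token cached by `huggingface-cli login`; raises an HTTP
# 401/403 error if that token lacks access to the repo.
info = model_info("username/model", token=True)  # placeholder repo id
print(info.modelId)
```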

Edit: As of July 18, 2023 at 8:30, my widget is back in business!
The Inference API just took a week off for the summer holidays. : )

Hello all, this was fixed yesterday. Thanks for your patience. Please try again.
