Hi everyone,
I've run into a sudden issue with the Hugging Face Inference API. I was successfully using it for image-to-image tasks (specifically with models like `stabilityai/stable-diffusion-xl-refiner-1.0`), but it abruptly stopped working a short while ago.
Now, regardless of which model I try to use, I consistently receive the following error pattern:
`Model <model_name> is not supported HF inference api`
This is happening in two distinct ways:
- Via `InferenceClient`: My application code, which was previously working, now gets this error for every model inference request.
- Via the website Inference Widget: Even trying the Inference Widget directly on the model pages on the Hugging Face website produces the same error message where the widget interface should be.
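For reference, here's roughly what my (previously working) call looks like. The file name, prompt, and anonymous auth below are placeholders rather than my exact code:

```python
# Minimal repro sketch for the failing image-to-image call.
# Assumptions (placeholders, not my exact setup): the input file name,
# the prompt, and running anonymously -- pass token="hf_..." if you use one.
from huggingface_hub import InferenceClient

client = InferenceClient()  # token="hf_..." for authenticated requests

try:
    edited = client.image_to_image(
        "input.png",                       # hypothetical local source image
        prompt="make the sky purple",      # hypothetical edit instruction
        model="timbrooks/instruct-pix2pix",
    )
    edited.save("output.png")
except Exception as err:
    # Currently this fails for every model I try, with:
    # "Model timbrooks/instruct-pix2pix inference is not supported HF inference api"
    print(err)
```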
Here are a couple of specific examples I've tested:
- Model: `timbrooks/instruct-pix2pix`
  - Link: timbrooks/instruct-pix2pix · Hugging Face
  - Output: `Model timbrooks/instruct-pix2pix inference is not supported HF inference api`
- Model: `stabilityai/stable-diffusion-xl-refiner-1.0`
  - Link: stabilityai/stable-diffusion-xl-refiner-1.0 · Hugging Face
  - Output: `Model stabilityai/stable-diffusion-xl-refiner-1.0 inference is not supported HF inference api`
(Note: I've tried several others with the same result.)
Since this affects multiple models and occurs both through the API client and the website widget, it looks like a broader issue rather than something specific to my setup or a single model.
Is anyone else experiencing similar behavior? Any information or confirmation would be greatly appreciated!
Thanks!