I am new to HF and I’m trying to use the Inference API to call the model google/flan-t5-small from the Hugging Face Hub.
I have a PRO subscription and I’m using a valid access token with Read permission.
Here’s the curl command I’m using (token and input swapped for placeholders, otherwise this is the call):
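```bash
# Standard serverless Inference API call; hf_xxxxxxxxxxxxxxxx stands in for my
# Read-permission token and the "inputs" text is just an example prompt.
curl https://api-inference.huggingface.co/models/google/flan-t5-small \
  -X POST \
  -H "Authorization: Bearer hf_xxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "Translate English to German: How old are you?"}'
```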
The response is just a short plain-text string: no JSON body, no additional error message.
I’ve tried:
- Regenerating the token
- Testing other models like bloomz-560m and t5-base-finetuned-common_gen (same result)
- Using curl -i to inspect the response headers (shown below)
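For the header check, I just re-ran the same request with -i added (placeholder token again), which prints the HTTP status line and response headers before the body:

```bash
# Same call as above, with -i so curl shows the status line and headers first.
curl -i https://api-inference.huggingface.co/models/google/flan-t5-small \
  -H "Authorization: Bearer hf_xxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "test"}'
```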
I’m not sure if I’m doing something wrong or if these models just aren’t available via the Inference API, but it worked when I tried it a few days ago…
Any guidance would be appreciated!
Thanks a lot for the reply.
That clarifies things — I guess we just have to wait and hope it gets resolved soon.
I was planning to use it for a class tomorrow, so fingers crossed!
Good to know I’m not the only one experiencing this!
Hi @KR0ld, apologies. The model google/flan-t5-small is not deployed by any Inference Provider at the moment, but you can ask for provider support on the model page here: Ask for Provider Support.
Models that are available with Inference Providers can be found here: Models - Hugging Face.
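As a rough sketch, once you pick a model that does have provider support, a request through the provider router looks something like this (the model name and token below are placeholders, not specific recommendations):

```bash
# Chat request via the Inference Providers router (OpenAI-compatible endpoint).
# "meta-llama/Llama-3.1-8B-Instruct" is only an example of a provider-backed
# model; hf_xxx is a placeholder for your access token.
curl https://router.huggingface.co/v1/chat/completions \
  -H "Authorization: Bearer hf_xxx" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.1-8B-Instruct",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```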