Consistent 404 Not Found on Inference API for all models

Hello,

I am consistently receiving a 404 Not Found error when trying to use the Inference API, and I believe there is an issue with my account’s permissions.

I have confirmed the following through extensive testing:

  1. My API token is correct and authenticates properly. A diagnostic script confirms this.

  2. The error occurs for all models I try, including fully public models like HuggingFaceH4/zephyr-7b-beta and gated models I have been granted access to, like mistralai/Mixtral-8x7B-Instruct-v0.1.

  3. The issue is not network-related. I have tried from different IP addresses and regions using a VPN, and the result is the same 404 Not Found error.

  4. The issue is not the request format. I have also tried adding a standard browser User-Agent header to my requests, and the error still persists.

It seems there is a specific issue with my account’s fundamental ability to access the Inference API service. Could you please investigate?
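For reference, my requests follow the standard pattern for the classic Inference API. Here is a minimal sketch of the kind of call that fails (the token is a placeholder, and the request itself is commented out so nothing is sent):

```python
import json
import urllib.request

# Classic Inference API endpoint pattern for a public model.
API_URL = "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta"
HEADERS = {
    "Authorization": "Bearer hf_xxx",  # placeholder token, not my real one
    "Content-Type": "application/json",
}

payload = json.dumps({"inputs": "Hello!"}).encode("utf-8")
request = urllib.request.Request(API_URL, data=payload, headers=HEADERS, method="POST")

# This is the call that returns 404 Not Found for every model I try:
# with urllib.request.urlopen(request) as resp:
#     print(resp.read().decode("utf-8"))
```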

My Hugging Face username is: Shkila-coder

Thank you.


Inference API has been revamped into Inference Providers, which significantly changed its specifications and the set of models that can be used. The models currently deployed are these.
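If it helps, with Inference Providers the call typically goes through the OpenAI-compatible router rather than the old per-model URL. A sketch of the new request shape (placeholder token; the request itself is not sent here):

```python
import json
import urllib.request

# Inference Providers router (OpenAI-compatible chat completions endpoint).
ROUTER_URL = "https://router.huggingface.co/v1/chat/completions"

payload = json.dumps({
    "model": "HuggingFaceTB/SmolLM3-3B",
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode("utf-8")

request = urllib.request.Request(
    ROUTER_URL,
    data=payload,
    headers={
        "Authorization": "Bearer hf_xxx",  # placeholder token
        "Content-Type": "application/json",
    },
    method="POST",
)
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```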

Thank you for the clarification, John6666.

Following your advice, I switched my code to use a model from the official “Inference Providers” list that you linked: HuggingFaceTB/SmolLM3-3B.

Unfortunately, I am still receiving the exact same 404 Not Found error, even with this confirmed-available model.

This seems to prove that the issue is not with model selection, but a fundamental problem with my account’s permissions to access the Inference API for any model at all.


Hm. The endpoint URLs have also changed, so if you’re referencing old code, you may need to rewrite it. If you’re experiencing account issues, I recommend contacting Hugging Face via email first: website@huggingface.co

Hi @Shkila-coder Can you please check to make sure you’re authenticated? You’ll want to make sure you’re using a fine-grained token with the “Make calls to Inference Providers” permission. More info here: Authentication.
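One quick way to check the token itself, independently of any model, is the Hub’s `whoami-v2` endpoint, which returns your account details when the token is valid. A sketch (placeholder token; the request is not actually sent here):

```python
import urllib.request

# Token sanity check against the Hub API (placeholder token).
request = urllib.request.Request(
    "https://huggingface.co/api/whoami-v2",
    headers={"Authorization": "Bearer hf_xxx"},
)

# A valid token should return 200 with your username; an invalid one returns 401:
# with urllib.request.urlopen(request) as resp:
#     print(resp.read().decode("utf-8"))
```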


Hello, thank you for the specific instructions.

I have now done the following:

  1. Deleted all old tokens.

  2. Created a new fine-grained token.

  3. Gave it the specific permission “Make calls to Inference Providers” under User Permissions.

Unfortunately, I am still receiving the exact same 404 Not Found error when calling the API for a public model from the approved list (HuggingFaceTB/SmolLM3-3B).


I can send you the code and my settings if needed.
