404 on any API I tried

I am trying to work with a few APIs:
https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3
or
https://api-inference.huggingface.co/models/gpt2

Actually, it's any model I try to run.
It worked before.
After I got the error I deleted the old token and created a new one; it's the only one there.
I have an active PRO plan on HF. What is wrong here?
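
For reference, this is roughly the kind of call that now returns 404 (a minimal sketch; `hf_xxx` is a placeholder, not a real token):

```python
import requests

# Serverless Inference API endpoint for the model that used to work.
API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3"
headers = {"Authorization": "Bearer hf_xxx"}  # placeholder token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Hello, who are you?"},
)
print(response.status_code)  # currently 404, regardless of the model
print(response.text)
```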

This is because none of them are currently deployed on the serverless Inference API. I think some models can still be used via Inference Providers.

You mean they have been removed?

I was struggling with this for weeks and did not understand how token caching worked. I stumbled across a forum post here where someone said: “I seemed to have credentials for huggingface.co in ~/.netrc, which I think might have overridden my Authorization header. A simple rm ~/.netrc did the trick.” It took me even longer to find it because it's hidden and I'm not great with Linux, but I finally found it, deleted it, and was able to log in using the API and download the exact model you're talking about.
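
If anyone else wants to check for the same thing without hunting for hidden files by hand, here is a small sketch using Python's standard `netrc` module (assuming the default `~/.netrc` location that `requests` consults):

```python
import os
import netrc

# Default location where requests (and tools built on it) look for machine credentials.
path = os.path.expanduser("~/.netrc")

if os.path.exists(path):
    auth = netrc.netrc(path).authenticators("huggingface.co")
    if auth:
        login, _, _ = auth
        print(f"Found huggingface.co credentials for '{login}' in {path}")
        print("These can silently override the Authorization header you set yourself.")
    else:
        print(f"{path} exists but has no huggingface.co entry.")
else:
    print("No ~/.netrc file found.")
```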

Thanks, I tried it; there is no .netrc or anything like it. I'm thinking of moving on to other services.

Hi @GilMS, the model mistralai/Mistral-7B-Instruct-v0.3 is available through other Inference Providers such as Together AI and Novita. If you're PRO, you can use your free monthly inference credits towards these providers. More info on Inference Providers: Inference Providers.
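
For example, something like this should work with a recent `huggingface_hub` (a minimal sketch, assuming version 0.28+ for the `provider` argument; `hf_xxx` is a placeholder token and `together` is just one of the available providers):

```python
from huggingface_hub import InferenceClient

# Routes the request through an Inference Provider using your HF token;
# PRO monthly inference credits apply. Replace "hf_xxx" with your own token.
client = InferenceClient(provider="together", api_key="hf_xxx")

completion = client.chat_completion(
    model="mistralai/Mistral-7B-Instruct-v0.3",
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=128,
)
print(completion.choices[0].message.content)
```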

mistralai/Mistral-7B-Instruct-v0.3 and gpt2 can both be used with Inference Endpoints. You can also request inference provider support for gpt2 on the model page: openai-community/gpt2 · Hugging Face.

Let us know if you have other questions.
