HF Inference API has been returning the same 404 error for all models for the last few minutes

Facing the same issue: 404 Client Error: Not Found for url: https://router.huggingface.co/hf-inference/models/meta-llama/Meta-Llama-3-8B-Instruct/v1/chat/completions. Any update on this issue?


That model is deployed, so it's strange that this error is occurring…
But it does indeed return a 404.

Does it work now?


Has this been fixed yet? I'm still seeing the error.


Same error, HfHubHTTPError: 404 Client Error: Not Found for url, for the Mistral model 'mistralai/Mistral-7B-Instruct-v0.2' with HuggingFaceEndpoint(), although the model "meta-llama/Llama-3.1-8B-Instruct" works.


For mistralai/Mistral-7B-Instruct-v0.2, you may need to specify an inference provider explicitly.