Cannot use Inference Provider. 429 error. First time usage

@John6666 Are you part of the Hugging Face team?

I went to check Together AI’s list of models and realized that their version of Llama 3.1 8B is meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo.

Is this the reason my InferenceClient cannot pull the model info from Hugging Face via this link (https://huggingface.co/api/models/meta-llama/Llama-3.1-8B-Instruct?expand=inferenceProviderMapping)?
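
For context, this is roughly the kind of call that fails for me (a minimal sketch assuming the provider="together" routing in huggingface_hub; the token and prompt are placeholders):

```python
from huggingface_hub import InferenceClient

# Minimal sketch of the failing call (assumes a recent huggingface_hub with
# provider routing; "hf_xxx" is a placeholder token).
client = InferenceClient(provider="together", api_key="hf_xxx")

completion = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Hello"}],
)
print(completion.choices[0].message.content)
```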

I have also checked the JSON object returned from that link, and it turns out Together AI is not among the providers. Is this the root cause of the 429 error?
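
In case it helps, this is how I checked it (a small sketch using requests against the same endpoint as the link above; I’m assuming the response exposes an inferenceProviderMapping field when expanded):

```python
import requests

# Fetch the model metadata with the provider mapping expanded
# (same endpoint as the link above).
url = "https://huggingface.co/api/models/meta-llama/Llama-3.1-8B-Instruct"
resp = requests.get(url, params={"expand": "inferenceProviderMapping"})
resp.raise_for_status()

mapping = resp.json().get("inferenceProviderMapping", {})
print(mapping)  # check whether a "together" entry shows up among the providers
```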
