Hi, I just subscribed to HF Pro and want to migrate my Open WebUI setup from Groq.com to HF models, and contribute to the community, of course.
Now, a very basic question: I want to add HF models using an OpenAI connection. I created an API key and set the API URL to https://api-inference.huggingface.co/models/Qwen/Qwen3-235B-A22B, but no connection can be established. What is my error?
Thanks, Robert
It seems that Hugging Face models can also be used in a similar way to the Groq and OpenAI APIs. The Hugging Face Inference API was significantly revamped recently and has been integrated into Inference Providers, so I recommend looking into that.
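If I understand the new setup correctly, something like this minimal sketch should work against the Inference Providers router (this assumes the `openai` Python package, a Hugging Face token in the HF_TOKEN environment variable, and the model you mentioned; the exact base URL and model availability may differ for you):

```python
# Sketch: calling a Hugging Face Inference Provider through the
# OpenAI-compatible router endpoint instead of the old
# api-inference.huggingface.co URL.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://router.huggingface.co/v1",  # Inference Providers router
    api_key=os.environ["HF_TOKEN"],               # HF access token, not an OpenAI key
)

response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B",  # routed to an available provider
    messages=[{"role": "user", "content": "Say hello from the HF router."}],
)
print(response.choices[0].message.content)
```

In Open WebUI you would put the same base URL and token into an OpenAI-type connection.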
Thanks for your prompt reply, John!
What finally worked was putting the API URL
https://router.huggingface.co/novita/v3/openai into the Open WebUI connections.
This gives me access to ~50 models on Novita. I tried some other
inference providers without luck, so for now I'm good to go and can start
testing; however, the documentation on the API could still be improved.
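For anyone else trying this, here is a rough way to check the same endpoint outside Open WebUI (a sketch only; it assumes the `openai` Python package and a Hugging Face token in the HF_TOKEN environment variable):

```python
# Sketch: list the models exposed by the Novita route that Open WebUI uses,
# to verify the connection and token before configuring the UI.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://router.huggingface.co/novita/v3/openai",
    api_key=os.environ["HF_TOKEN"],
)

# Roughly the same model list Open WebUI shows (~50 entries for me).
for model in client.models.list():
    print(model.id)
```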
Best, Robert