Dumb question: my Inference API links are no longer working

Context: I'm setting up a basic chatbot using https://api-inference.huggingface.co/models/microsoft/DialoGPT-small and other popular LLMs.

Issue: My API tests worked before, but now I'm getting a 404 error ("Model endpoint not found, but token might be valid"). I've already tested my token and know it is working (and regenerated it just in case).
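For reference, here is a minimal sketch of the kind of call I've been making (the payload shape and the HF_TOKEN environment variable are just placeholders, not my exact code):

```python
import os

import requests

# The endpoint I've been using, now returning 404:
API_URL = "https://api-inference.huggingface.co/models/microsoft/DialoGPT-small"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # placeholder env var

def query(payload: dict) -> dict:
    """POST the payload to the old Inference API URL and return the JSON response."""
    response = requests.post(API_URL, headers=HEADERS, json=payload)
    response.raise_for_status()  # this is where the 404 surfaces
    return response.json()

print(query({"inputs": "Hello, how are you?"}))
```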

Ask: I’m fairly new and have just been using the Inference API links as before. Is this still the best practice for using models on this site, or has it changed? I’d appreciate your advice on this :smiley:


I believe this is an old endpoint that became unusable quite some time ago.

However, I don’t recall there being any clear or proactive communication about the change. It may have been communicated, but it didn’t come to my attention.

The old code may still be present in older libraries, and outdated online references or a search with ChatGPT might still surface that endpoint URL, for example…

In any case, the currently usable code has changed significantly, so please refer to the documentation. If you have any questions, feel free to ask.
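For example, something along these lines should work today via the huggingface_hub client. This is only a rough sketch: the model shown here is an arbitrary example (not the one from your post, which may no longer be served by any inference provider), and the exact task/method to use depends on the model, so please check the current documentation.

```python
from huggingface_hub import InferenceClient

# Sketch only: pass a valid token, or rely on a cached `huggingface-cli login`.
client = InferenceClient(token="hf_xxx")

# Chat-style call; the model below is just an illustrative placeholder.
response = client.chat_completion(
    model="HuggingFaceH4/zephyr-7b-beta",  # example model, not from the original post
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```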

julien-c commented on Jul 7, 2025

api-inference.huggingface.co is a very old URL, where did you find it documented?