Facing ConnectionError with Ollama LLM Model on Hugging Face Spaces

Hi everyone,

I’m currently working on a project where I’m integrating the Ollama LLM model within a Hugging Face Space. However, I’ve run into a runtime error that I haven’t been able to resolve. When I try to run the space, I get the following error:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fcd2d95c970>: Failed to establish a new connection: [Errno 111] Connection refused'))

Questions:

  1. Has anyone successfully integrated the Ollama LLM model on Hugging Face Spaces? Are there additional configurations required?
  2. Does this error indicate an issue with the local server configuration or Hugging Face Spaces settings?
  3. Any guidance on how to configure the environment properly to avoid this error?

I’d appreciate any insights or solutions from the community! Thanks in advance for your help.


You can configure the port in the Space’s README.md metadata (e.g., the app_port field for Docker Spaces), but it’s not clear whether that alone will get Ollama running.
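
The "Connection refused" on localhost:11434 usually just means nothing is listening on that port inside the Space container, i.e. the Ollama server itself was never started. Below is a rough sketch of what the app code could look like, assuming a Docker Space whose image has the ollama binary installed; the model name "llama3", the 30-second wait loop, and the timeouts are placeholders, not anything required by Ollama or Spaces.

```python
import subprocess
import time

import requests

OLLAMA_URL = "http://localhost:11434"

# Start the Ollama server inside the container. This assumes the Space's
# Docker image has the `ollama` binary installed; without a running server,
# every request to localhost:11434 is refused exactly as in the traceback.
server = subprocess.Popen(["ollama", "serve"])

# Wait until the server accepts connections before sending any requests.
for _ in range(30):
    try:
        if requests.get(OLLAMA_URL, timeout=1).ok:
            break
    except requests.exceptions.ConnectionError:
        time.sleep(1)
else:
    raise RuntimeError("Ollama server did not start on port 11434")

# Example call to /api/generate; "llama3" is a placeholder model name and
# must have been pulled beforehand (e.g. `ollama pull llama3`).
response = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "llama3", "prompt": "Hello", "stream": False},
    timeout=120,
)
print(response.json()["response"])
```

The key point is the startup order: launch ollama serve (in the Dockerfile entrypoint or from the app as above) and only then call /api/generate; changing the README port alone won’t start the server.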