Dependency Conflict with huggingface-hub Version in Gradio and Llama-Index Setup



Help Request: Dependency Conflict Between Gradio and Huggingface Hub Versions

Hi everyone,

I’ve been struggling with a dependency issue for an entire day and am hoping someone here can help. My application requires specific versions of libraries, especially huggingface-hub==0.23.5 and gradio==4.44.0, but a later step in my setup process unintentionally upgrades huggingface-hub, causing conflicts. Here’s a breakdown of what I’ve tried and where the problem occurs:

Setup and Context

  1. Installing dependencies:
    First, I install all my dependencies from a requirements.txt file:

    pip install -r requirements.txt
    

    In this file, I specify:

    huggingface-hub==0.23.5
    gradio==4.44.0
    
  2. Final setup step:
    After installing dependencies, my Dockerfile runs the following step:

    RUN pip install --no-cache-dir gradio[oauth]==5.5.0 "uvicorn>=0.14.0" spaces
    

    This command unexpectedly upgrades huggingface-hub to version 0.26.2, which is incompatible with some llama-index packages I’m using:

    • llama-index-llms-huggingface requires huggingface-hub<0.24.0
    • llama-index-llms-huggingface-api also requires huggingface-hub<0.24.0
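For what it's worth, the conflict is just a version-range check. A small pure-Python sketch (the function names are mine, not from any library) shows why 0.26.2 violates the `<0.24.0` bound:

```python
# Minimal version comparison for plain "X.Y.Z" strings (no pre-releases).
def parse(version: str) -> tuple[int, ...]:
    return tuple(int(part) for part in version.split("."))

def satisfies_llama_index(hub_version: str) -> bool:
    # llama-index-llms-huggingface pins huggingface-hub<0.24.0
    return parse(hub_version) < parse("0.24.0")

print(satisfies_llama_index("0.23.5"))  # True  - the pinned version is fine
print(satisfies_llama_index("0.26.2"))  # False - the version gradio 5.5.0 pulls in
```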

Problem

The version bump of huggingface-hub to 0.26.2 breaks compatibility with these llama-index packages, causing my application to fail.

Attempted Solutions and Issue Recurrence

I tried downgrading huggingface-hub back to 0.23.5 after the initial install, but each time I execute the final setup step above, gradio 5.5.0's dependency resolution reinstalls huggingface-hub 0.26.2.

Request for Help

I would greatly appreciate any guidance on:

  1. How to force pip to retain huggingface-hub==0.23.5 while still satisfying gradio requirements, OR
  2. Any alternative approach that ensures stable compatibility across these packages without further dependency conflicts.
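On point 1, I'm aware pip supports constraints files, which pin a version whenever the package comes up for resolution. A sketch of what I have in mind (not verified; if gradio 5.5.0 hard-requires a newer huggingface-hub, pip should error out rather than silently upgrade, which would at least surface the conflict):

```shell
# Write a constraints file that pins huggingface-hub during resolution.
cat > constraints.txt <<'EOF'
huggingface-hub==0.23.5
EOF

# The final setup step would then become (commented out here, needs network):
# pip install --no-cache-dir -c constraints.txt "gradio[oauth]==5.5.0" "uvicorn>=0.14.0" spaces

cat constraints.txt
```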

Thank you for any insights—this has been a full day of troubleshooting without resolution, and I’m feeling stuck.


For Gradio, huggingface_hub is a core part of the backend, so running a version other than the one Gradio expects is likely to break things.
If possible, the most reliable fix is to downgrade Gradio.
If you absolutely have to use 5.5.0, you will have to work around the LlamaIndex constraint instead.

RUN pip install --no-cache-dir gradio[oauth]==4.44.0 "uvicorn>=0.14.0" spaces

Thank you for the response!

I understand that huggingface_hub is crucial for Gradio’s backend compatibility. I can try downgrading Gradio if that’s likely to be more stable. However, I’m not sure how to apply these changes directly on the Hugging Face Spaces server. Could you guide me on how to run the downgrading commands (or any necessary adjustments) directly in Spaces?

To give some more context, I’m using huggingface_hub primarily for LLM models through llama-index, which currently doesn’t support the latest huggingface_hub versions, so whichever version I settle on still has to stay compatible with llama-index.
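As a stopgap, something like this could fail fast at startup instead of breaking later at an LLM call (stdlib only; `check_pin` is my own helper name, not a Spaces or pip API):

```python
# Fail fast at startup if an installed package has drifted from its pin.
from importlib import metadata

def check_pin(package: str, expected: str, get_version=metadata.version) -> None:
    """Raise if the installed version of `package` differs from `expected`."""
    installed = get_version(package)
    if installed != expected:
        raise RuntimeError(
            f"{package}=={installed} is installed, but {expected} is required"
        )

# e.g. near the top of app.py:
# check_pin("huggingface-hub", "0.23.5")
```

The `get_version` parameter just makes the helper easy to test without touching the real environment.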

Any advice on managing this setup would be much appreciated—thanks again!


I looked at your Space, and the cause is probably in README.md, which I think you can fix. In Hugging Face Spaces, README.md doubles as the Space’s configuration file.
There is a bigger problem, though: you have set your token as an environment variable, which is publicly visible to others. I recommend setting it as a secret instead.

Edit:
I opened a PR for README.md.

Thanks for your contribution!


This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.