Broken Space After Debian 13 Update and llama-cpp-python Update

Hi,

Some of my Gradio spaces that were working previously are no longer functioning. The first issue seems to be related to the Debian 13 update: my Gradio spaces were likely initially deployed with Debian 12.

After trying the workaround suggested by john6666, one of my older spaces restarted, but it now gets stuck with a different Python error.

For another space deployed with Docker, I modified the Dockerfile to specify the Debian and Python versions:

FROM python:3.11-slim-bookworm
# Instead of: FROM python:3.11-slim

This change was intended to use Python 3.11 with Debian 12 (Bookworm), as the default python:3.11-slim now uses Debian 13 (Trixie).
However, it initially returned an error:

E: Package 'libgl1-mesa-glx' has no installation candidate
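A likely fix, assuming the package was being installed for OpenGL support (e.g. for opencv): `libgl1-mesa-glx` was dropped in recent Debian releases, and `libgl1` provides the same libGL library. A minimal sketch of the Dockerfile change:

```dockerfile
# libgl1-mesa-glx no longer exists in newer Debian; libgl1 provides libGL
RUN apt-get update && apt-get install -y --no-install-recommends libgl1 \
    && rm -rf /var/lib/apt/lists/*
```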

After fixing the package error, the space no longer shows that issue, but it gets stuck during the build stage after:

Building wheel for llama-cpp-python (pyproject.toml): started

The build eventually times out.

The same issue occurs in a third space that was working today until I changed its name (which triggered a rebuild). Now, it also gets stuck at the same build stage.

For my older spaces deployed automatically with Gradio, it would be ideal if, during a rebuild, the versions of the OS, Python, Gradio, and other essential dependencies remained the same as those used during the initial deployment. This would help avoid failures during restarts or rebuilds.

Note: I know that versions can be specified in requirements.txt (though not the base OS container).
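For reference, a minimal pinning sketch for requirements.txt (version numbers are illustrative, not a recommendation):

```
# Pin exact versions so a rebuild reinstalls the same packages
gradio==5.0.0
llama-cpp-python==0.3.1
```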


My Questions:

  1. For my Hugging Face Spaces that were automatically deployed for Gradio, is there a way to find out the versions of the OS, Python, and the main packages/dependencies used? This would allow me to specify or lock those versions by simply editing the requirements.txt file.

  2. Is there a solution to stay on, for example, Debian 12 with Python 3.10 during a rebuild for spaces deployed without a Dockerfile?

  3. Regarding the current error:

    Building wheel for llama-cpp-python (pyproject.toml): started
    

    Is pinning a version of llama-cpp-python that can be downloaded as a prebuilt wheel, like other libraries (without needing to build from source), the only solution?

Thank you for your feedback!


You can specify the Python version and the additional packages to install. However, everything else must be done manually… Also, the OS is fixed in Gradio Spaces.
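For SDK (non-Docker) Spaces, the Python version is set in the README front matter. A minimal sketch, assuming the standard Spaces metadata fields (`title` value is illustrative):

```yaml
---
title: My Space
sdk: gradio
python_version: "3.10"
---
```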

import sys, platform
from importlib import metadata as md

print("Python:", platform.python_version(), sys.implementation.name)
print("OS:", platform.uname())
print("\n".join(sorted(f"{d.metadata['Name']}=={d.version}" for d in md.distributions())))


Installing the latest CPU build of llama_cpp_python in HF Spaces doesn’t work properly with requirements.txt for now…


Hello,

Thank you for your answer and the solutions, @John6666.
Two of my HF Spaces are up and running again.

For the record, here is the workaround:

  • in requirements.txt

# Comment out the llama-cpp-python line
# llama-cpp-python>=0.2.0

  • in app.py for the Docker Space

import subprocess
import sys, platform
from importlib import metadata as md


# Install a prebuilt wheel from its release URL (this wheel targets Python 3.11;
# pick the matching cp3XX wheel for other Python versions)
subprocess.run("pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.3.1/llama_cpp_python-0.3.1-cp311-cp311-linux_x86_64.whl", shell=True, check=True)

# Log Python, OS, and installed package versions
print("Python:", platform.python_version(), sys.implementation.name)
print("OS:", platform.uname())
print("\n".join(sorted(f"{d.metadata['Name']}=={d.version}" for d in md.distributions())))

  • in app.py for the Gradio Space

import subprocess
import sys, platform
from importlib import metadata as md


# Build and install the wheel from source (takes about 5 minutes);
# -v gives verbose output (-V would only print pip's version and exit)
subprocess.run("pip install -v llama_cpp_python==0.3.15", shell=True, check=True)

# Log Python, OS, and installed package versions
print("Python:", platform.python_version(), sys.implementation.name)
print("OS:", platform.uname())
print("\n".join(sorted(f"{d.metadata['Name']}=={d.version}" for d in md.distributions())))
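As a hedged sketch building on the Docker workaround above: the wheel's cp311 tag must match the Space's Python version, so a small helper (hypothetical, not part of the thread) can derive the matching release URL for the running interpreter before passing it to pip. It assumes CPython on linux x86_64 and the file-name pattern of the abetlen/llama-cpp-python GitHub releases shown earlier:

```python
import sys

# Hypothetical helper: build the release-wheel URL matching this interpreter.
# Assumes CPython on linux x86_64 and the GitHub release file-name pattern.
def wheel_url(version: str = "0.3.1") -> str:
    tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return (
        "https://github.com/abetlen/llama-cpp-python/releases/download/"
        f"v{version}/llama_cpp_python-{version}-{tag}-{tag}-linux_x86_64.whl"
    )

print(wheel_url())
```

The printed URL can then be used in the `pip install <url>` subprocess call above instead of hard-coding the cp311 wheel.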

Thank you.


This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.