I'm getting "There was a problem when trying to write in your cache folder (/.cache/huggingface/hub)", plus some other permission-denied messages when my app tries to write. Is there any configuration I missed?
# read the doc: https://huggingface.co/docs/hub/spaces-sdks-docker
# you will also find guides on how best to write your Dockerfile
FROM python:3.9
WORKDIR /code
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
RUN apt update && apt install -y ffmpeg
COPY . .
ENV H2O_WAVE_LISTEN=":7860"
ENV H2O_WAVE_ADDRESS='http://127.0.0.1:7860'
CMD ["wave", "run", "app", "--no-reload"]
Everything works fine locally. Any help would be highly appreciated!
I sent you a PR on the Hub. You only need to give write permission to the user running the script, following this Docker Spaces guide. I changed your Dockerfile to:
# you will also find guides on how best to write your Dockerfile
FROM python:3.9
WORKDIR /code
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
RUN apt update && apt install -y ffmpeg
# Create a non-root user with UID 1000 and switch to it so the app has a writable home
RUN useradd -m -u 1000 user
USER user
# Point HOME (and the cache paths derived from it) at the new user's directory
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH
WORKDIR $HOME/app
# Copy the app with the right ownership so it stays writable at runtime
COPY --chown=user . $HOME/app
ENV H2O_WAVE_LISTEN=":7860"
ENV H2O_WAVE_ADDRESS='http://127.0.0.1:7860'
CMD ["wave", "run", "app", "--no-reload"]
Hello @radames, I also tried your fix, but I'm running an Airflow container (for education purposes) and I still get the error. Here is the file:
# Use an official Airflow image as base
FROM apache/airflow:2.7.0
# Set environment variables
ENV AIRFLOW_HOME=/opt/airflow
ENV AIRFLOW__CORE__LOAD_EXAMPLES=False
ENV AIRFLOW__CORE__EXECUTOR=SequentialExecutor
ENV AIRFLOW__WEBSERVER__WEB_SERVER_MASTER_TIMEOUT=300
ENV AIRFLOW__WEBSERVER__WORKER_CLASS=gevent
ENV AIRFLOW__WEBSERVER__WEB_SERVER_PORT=7860
ENV AWS_DEFAULT_REGION=eu-west-3
# Switch user
USER root
# COPY DAGS & PEM Key
COPY ./dags /opt/airflow/dags
COPY secrets/DEMO_KEY_PAIR.pem /opt/airflow/
# Ensure correct permissions for the .pem file
RUN chmod 400 /opt/airflow/DEMO_KEY_PAIR.pem \
&& chown airflow /opt/airflow/DEMO_KEY_PAIR.pem
RUN useradd -m -u 1000 user
USER user
RUN chown -R user:user /opt/airflow
COPY --chown=user ./dags /opt/airflow/dags
# Switch back to airflow user
USER airflow
# Install any additional dependencies if needed
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
# Initialize the Airflow database (PostgreSQL in this case)
ENV AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=$POSTGRES_URL
RUN airflow db init
# Create default admin user for Airflow (username: admin, password: admin)
RUN airflow users create \
--username admin \
--firstname Admin \
--lastname User \
--role Admin \
--email admin@example.com \
--password admin
# Expose the necessary ports (optional if Hugging Face already handles port exposure)
EXPOSE 7860
# Start Airflow webserver and scheduler within the same container
CMD ["bash", "-c", "airflow scheduler & airflow webserver"]