I'm creating a chatbot and have used the BAAI/llm-embedder model via the HuggingFaceBgeEmbeddings class from LangChain. But I am facing the following issue:
```
There was a problem when trying to write in your cache folder (/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
```
As the message suggests, I have tried setting the TRANSFORMERS_CACHE environment variable, but the same error still occurs. I have also tried passing the cache path as a parameter to the function:
```python
def store_embeddings(self, docs):
    embeddings = HuggingFaceBgeEmbeddings(
        model_name=self.cfg.EMBEDDINGS,
        model_kwargs={"device": self.cfg.DEVICE},
        encode_kwargs={"normalize_embeddings": self.cfg.NORMALIZE_EMBEDDINGS},
        cache_folder=self.cfg.CACHE_FOLDER,
    )
```
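For context, here is roughly how I set the environment variables in Python, before anything from transformers or sentence-transformers is imported (the /tmp/huggingface path is just an example of a writable location, not necessarily the one my Space uses):

```python
import os

# Point the Hugging Face caches at a directory the process can write to.
# Example path; any directory writable by the container user works.
cache_dir = "/tmp/huggingface"
os.makedirs(cache_dir, exist_ok=True)

os.environ["TRANSFORMERS_CACHE"] = cache_dir          # transformers cache
os.environ["HF_HOME"] = cache_dir                     # hub downloads
os.environ["SENTENCE_TRANSFORMERS_HOME"] = cache_dir  # sentence-transformers models
```

These variables only take effect if they are set before the libraries that read them are imported, which is why I put this at the very top of the entrypoint.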
My Space: StudybotAPI - a Hugging Face Space by HemanthSai7
I deployed this FastAPI instance in Spaces via Docker, and the main backend code lives in StudybotAPI/backend.
My Dockerfile:
```dockerfile
FROM python:3.10.9
COPY ./StudybotAPI .
WORKDIR /
RUN pip install --no-cache-dir --upgrade -r /requirements.txt
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```
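One variant I am considering is setting the variables in the Dockerfile itself and pre-creating the cache directory so the container user has write access (again, /tmp/huggingface is just an example writable path):

```dockerfile
FROM python:3.10.9

# Example: point Hugging Face caches at a directory the
# (possibly non-root) container user can write to.
ENV HF_HOME=/tmp/huggingface \
    TRANSFORMERS_CACHE=/tmp/huggingface
RUN mkdir -p /tmp/huggingface && chmod 777 /tmp/huggingface

COPY ./StudybotAPI .
WORKDIR /
RUN pip install --no-cache-dir --upgrade -r /requirements.txt
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```

I'm not sure whether this is the recommended approach for Spaces, so any guidance here would also help.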
I need some help fixing this issue.
Thank you!