Failing to start a ChatUI with a model

I’m using AutoTrain to fine-tune a Llama 2 7B base model, and I’m trying to serve the trained model through the ChatUI module, but it fails. To narrow down the issue, I pointed the same ChatUI Space at the original base llama-2-7b model and got the same error, so the problem appears to be in the ChatUI configuration rather than the fine-tuning step. Here’s the error I get when the ChatUI Space is built:

```
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 261, in hf_raise_for_status
    response.raise_for_status()
  File "/opt/conda/lib/python3.9/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/meta-llama/Llama-2-7b-hf
```

Note that I have been granted access to the Llama 2 model on the Hub.
Any help would be much appreciated.
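For reference, here is a minimal sketch of how I understand the failing request: the traceback shows an unauthenticated `GET` to the Hub API, and gated models like this one return 401 unless an `Authorization: Bearer <token>` header is sent. The helper names below (`hub_headers`, `check_model_access`) and the use of an `HF_TOKEN` environment variable are my own assumptions for illustration, not part of ChatUI itself:

```python
import os
from typing import Optional

import requests


def hub_headers(token: Optional[str]) -> dict:
    # The Hub expects a bearer token for gated repos; with no token
    # the request is anonymous and gated models answer 401.
    return {"Authorization": f"Bearer {token}"} if token else {}


def check_model_access(model_id: str, token: Optional[str]) -> int:
    # Same endpoint as in the traceback, but with an auth header.
    url = f"https://huggingface.co/api/models/{model_id}"
    resp = requests.get(url, headers=hub_headers(token))
    return resp.status_code  # 200 = accessible, 401 = not authenticated
```

Running `check_model_access("meta-llama/Llama-2-7b-hf", os.environ.get("HF_TOKEN"))` with a valid token (for an account that has been granted access) should return 200, which suggests the Space simply isn’t passing a token when it queries the model.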