Error 401 Client Error: Unauthorized for url

This is a gated model, so you probably need a token to download it via the hub library, since your token is tied to your account and the gated access you agreed to.
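
For example, a minimal sketch (the repo id is the one from the traceback below; the token value is a placeholder) of downloading a single file from a gated repo while passing the token explicitly:

from huggingface_hub import hf_hub_download

# Download one file from the gated repo, authenticating with your access token.
path = hf_hub_download(
    repo_id="DeepFloyd/IF-I-XL-v1.0",
    filename="text_encoder/config.json",
    token="hf_xxx",  # replace with your personal access token
)
print(path)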

Log in with the token:

from huggingface_hub import notebook_login
notebook_login()

I still have the same issue.

How are you trying to load the model?

Traceback (most recent call last):
File "/Radiata/venv/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 259, in hf_raise_for_status
response.raise_for_status()
File "/Radiata/venv/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/DeepFloyd/IF-I-XL-v1.0/resolve/main/text_encoder/config.json

Repository Not Found for url: https://huggingface.co/DeepFloyd/IF-I-XL-v1.0/resolve/main/text_encoder/config.json.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

Would you please elaborate on this? How did you find out that the authentication key was not being used? What steps did you take to make it use your authentication key? I have an authentication key but I still get the same 401 error.
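
One hedged way to check whether a token is actually being picked up (just a sketch, not something described in this thread): whoami() fails if no valid token is found, and returns your account info otherwise.

from huggingface_hub import whoami

# Raises if no valid token is available; otherwise returns account details.
info = whoami()
print(info["name"])  # should print your Hugging Face username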

I am trying to access this model and running into a '401 Client Error: Repository Not Found for url'. I have completed the three steps outlined (two of which require accepting the user agreement after logging in, and the third requires creating an access token). I have tried accessing the model via the API on huggingface.co as well as using the Python code snippet for the Inference API in my local notebook. Can anyone help?
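
For reference, a hedged sketch of that kind of Inference API call with the token sent in the Authorization header (the model id is a placeholder, since the post does not name the model):

import requests

MODEL_ID = "your-org/your-gated-model"  # placeholder: replace with the gated repo id
API_URL = f"https://api-inference.huggingface.co/models/{MODEL_ID}"
headers = {"Authorization": "Bearer hf_xxx"}  # replace with your access token

response = requests.post(API_URL, headers=headers, json={"inputs": "Hello"})
print(response.status_code, response.json())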

Why don’t you answer the question?

If you are using a web UI and got this error, set the environment variables HF_USER, HF_PASS, and also HF_TOKEN (create a new token under the Settings menu in your Hugging Face account).
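
A minimal sketch of setting the token from Python before anything tries to download, assuming the web UI and huggingface_hub read HF_TOKEN from the environment (HF_USER/HF_PASS are only used by some UIs):

import os

# Must be set before the first download call; replace with your real token.
os.environ["HF_TOKEN"] = "hf_xxx"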

!huggingface-cli login

or, using an environment variable:

!huggingface-cli login --token $HF_TOKEN

I used this to get past the problem; hope it helps you.

This issue still exists. I have been suffering from it since I signed up for Hugging Face. It appears only on the web UI, not on every model page but on some. Meanwhile, I can successfully access the model files from Python code without any auth errors.

Currently facing the same issue. Trying to load "tiiuae/falcon-180b-chat" via a Jupyter notebook. I already accepted the license T&C and was given access to the gated model, but when I log in using huggingface_hub.notebook_login(new_session=True), I receive the 401 error while loading the tokenizer config file.

This is a new development for me. I used the same login method to download Llama-2 a few weeks back without a problem, but with this new model I am facing this issue and haven't been able to resolve it. Any help would be appreciated.
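
A hedged sketch of passing the token explicitly instead of relying on the notebook session (recent transformers releases accept token=; older ones used use_auth_token=):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "tiiuae/falcon-180b-chat",
    token="hf_xxx",  # the token of the account that was granted access
)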

All of a sudden I started getting the 401 error when loading a model from an HF Space.

Login is successful, HF_TOKEN env is set, I’m also passing the token into the model loading function as an arg… From all my envs (local, cloud, browser) the model is available.

Token will not been saved to git credential helper. Pass `add_to_git_credential=True` if you want to set the git credential as well.
Token is valid.
Your token has been saved to /home/user/.cache/huggingface/token
Login successful
...
Repository Not Found for url: https://huggingface.co/whoismikha/room_scene_reconstruction/resolve/atiss/model_index.json

But the URL works when opened from a browser. Loading the model also works from other environments.
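
For what it's worth, a hedged sketch of passing the token explicitly, assuming this is a diffusers pipeline (the repo id and revision are taken from the URL above; the token value comes from the environment or a placeholder string):

import os
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "whoismikha/room_scene_reconstruction",
    revision="atiss",
    token=os.environ.get("HF_TOKEN"),  # or pass the token string directly
)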

@aoliveira He doesn't care anymore because his problem got solved. Selfish.