Problem accessing a public model?

I have been using yahma/Llama-2-7b-chat-hf for the past month, but today it suddenly started failing. As far as I know, nothing has changed in my environment. The error message is below. The repo also says it was last updated in 2023, so I'm not sure what's going on.

Traceback (most recent call last):
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 261, in hf_raise_for_status
    response.raise_for_status()
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/yahma/llama-7b-hf/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/transformers/utils/hub.py", line 430, in cached_file
    resolved_file = hf_hub_download(
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1346, in hf_hub_download
    raise head_call_error
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1232, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1608, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 293, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=blah blah blah)

Repository Not Found for url: https://huggingface.co/yahma/llama-7b-hf/resolve/main/adapter_config.json. Please make sure you specified the correct repo_id and repo_type. If you are trying to access a private or gated repo, make sure you are authenticated. Invalid credentials in Authorization header

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/workspace/ToT/lammarepo/ICMV/ICV/task_style_vector.py", line 63, in <module>
    model = build_model(args.model_type, args.model_size, args.in_8bit)
  File "/workspace/ToT/lammarepo/ICMV/ICV/models/huggingface.py", line 40, in build_model
    model = AutoModelForCausalLM.from_pretrained(
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 505, in from_pretrained
    maybe_adapter_path = find_adapter_config_file(
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/transformers/utils/peft_utils.py", line 87, in find_adapter_config_file
    adapter_cached_filename = cached_file(
  File "/opt/conda/envs/icv_cliff/lib/python3.9/site-packages/transformers/utils/hub.py", line 451, in cached_file
    raise EnvironmentError(
OSError: yahma/llama-7b-hf is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>


Solution: even though this is a public model, I'm logged in according to "huggingface-cli whoami", and I never had to do this before, I now have to pass my authentication token explicitly with "token=" when loading the model. I don't understand why this only started happening now, but this is what fixed it for me.
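For anyone hitting the same thing, here is a minimal sketch of the fix. The `resolve_token` helper and the `HF_TOKEN` environment-variable fallback are my own additions for illustration, not part of the original post; the essential part is just passing `token=` to `from_pretrained`:

```python
import os


def resolve_token(explicit_token=None):
    """Prefer an explicitly passed token, else fall back to the HF_TOKEN env var."""
    return explicit_token or os.environ.get("HF_TOKEN")


def load_llama(token=None):
    # transformers is imported lazily so resolve_token stays usable
    # in environments where transformers isn't installed.
    from transformers import AutoModelForCausalLM

    # Passing the token explicitly resolved the 401, even though the repo
    # is public and `huggingface-cli whoami` showed a logged-in user.
    return AutoModelForCausalLM.from_pretrained(
        "yahma/llama-7b-hf",
        token=resolve_token(token),
    )
```

Usage: export your token first (`export HF_TOKEN=hf_...`, from https://huggingface.co/settings/tokens), then call `load_llama()`.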

