Error on page: codellama-13b-chat

Page:

Gives this error message:

runtime error
Space failed. Exit code: 1. Reason: /huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1377, in hf_hub_download
raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/app/app.py", line 6, in <module>
from model import get_input_token_length, run
File "/home/user/app/model.py", line 10, in <module>
config = AutoConfig.from_pretrained(model_id)
File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1067, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/configuration_utils.py", line 623, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/configuration_utils.py", line 678, in _get_config_dict
resolved_config_file = cached_file(
File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/utils/hub.py", line 429, in cached_file
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like codellama/CodeLlama-13b-Instruct-hf is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

Container logs:

===== Application Startup at 2024-03-06 12:43:35 =====


There was a site-wide server error on Hugging Face the other day (and another one a few months ago). Most likely the Space simply failed while trying to load the model during that outage. The model page itself looks fine.

So if you report it in the Space's Discussion section and the author notices, restarting the Space should be enough to fix it. The author receives a notification whenever something is posted in the Discussion section. Adding a mention (@username) is more reliable, but they will probably notice even without one.
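If the Space keeps failing on transient Hub outages at startup, the author could also wrap the failing download in a simple retry. This is only a sketch: the `retry` helper and the backoff values are my own, not part of transformers or huggingface_hub. (Note that the offline mode mentioned at the end of the error message only helps when the files are already in the local cache, which is not the case on a freshly rebuilt Space.)

```python
import time

def retry(fn, attempts=3, base_delay=5):
    """Call fn(), retrying on OSError (which the Hub connection errors
    in the traceback subclass) with a linearly increasing delay."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * (attempt + 1))

# In the Space's model.py one could then wrap the call from the traceback:
# config = retry(lambda: AutoConfig.from_pretrained(model_id))
```

This doesn't fix an outage, of course, but it lets the Space survive a brief network hiccup instead of exiting with code 1.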