Repository Not Found for url:

Hi all,

I'm trying to get familiar with the basics of the BLOOM model in Google Colab by following a YouTube video.

I chose a video so I could watch someone do it successfully, someone who knows how to do it, and I am just following what they show.

But when I get to the point to load the model by running:
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-1b3", use_cache=True)
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-1b3")
I get the "Repository Not Found for url" error, and if I open the URL manually I get "Repository not found".

I am not using any custom libraries and I have not made any variations to his code (at least not yet). I am just following what he did step by step.

Does anyone know what's wrong?

The code I am using is below:


! pip install transformers -q


from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

import torch

model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-1b3", use_cache=True)
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-1b3")

HTTPError                                 Traceback (most recent call last)

/usr/local/lib/python3.7/dist-packages/huggingface_hub/utils/ in hf_raise_for_status(response, endpoint_name)
    212     try:
--> 213         response.raise_for_status()
    214     except HTTPError as e:

10 frames

HTTPError: 401 Client Error: Unauthorized for url:

The above exception was the direct cause of the following exception:

RepositoryNotFoundError                   Traceback (most recent call last)

RepositoryNotFoundError: 401 Client Error. (Request ID: p67nY4HwMbBQD7D2jjdqs)

Repository Not Found for url:
Please make sure you specified the correct `repo_id` and `repo_type`.
If the repo is private, make sure you are authenticated.

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)

/usr/local/lib/python3.7/dist-packages/transformers/utils/ in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    422     except RepositoryNotFoundError:
    423         raise EnvironmentError(
--> 424             f"{path_or_repo_id} is not a local folder and is not a valid model identifier "
    425             "listed on ''\nIf this is a private repository, make sure to "
    426             "pass a token having permission to this repo with `use_auth_token` or log in with "

OSError: bigscience/bloom-1b3 is not a local folder and is not a valid model identifier listed on ''
If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.

I am experiencing the same problem; it was working normally before, but it has been giving me errors since yesterday.

It doesn't look like bloom-1b3 is available anymore, for whatever reason. I'd change it from "bigscience/bloom-1b3" to "bigscience/bloom-1b1".

You can see which models are available by going to the Models page on Hugging Face and filtering by name.
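If you want the notebook to keep working when a repo id gets renamed on the Hub, one option is to map old ids to current ones before calling `from_pretrained`. This is just a sketch: the `RENAMED` table and `resolve_repo_id` helper are hypothetical, not part of transformers, and the two entries are only the renames reported in this thread.

```python
# Hypothetical rename table: old Hub repo ids -> current ones
# (entries taken from this thread, not an official list).
RENAMED = {
    "bigscience/bloom-1b3": "bigscience/bloom-1b1",
    "shleifer/distilbart-cnn-12-6": "sshleifer/distilbart-cnn-12-6",
}

def resolve_repo_id(repo_id: str) -> str:
    """Return the current repo id if the old one is known to be renamed."""
    return RENAMED.get(repo_id, repo_id)

print(resolve_repo_id("bigscience/bloom-1b3"))  # bigscience/bloom-1b1
```

You would then load with something like `AutoModelForCausalLM.from_pretrained(resolve_repo_id("bigscience/bloom-1b3"), use_cache=True)`, so the fix lives in one place instead of being scattered through the notebook.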


Thank you, this worked for me. I was getting the same error for shleifer/distilbart-cnn-12-6; I searched for the model as suggested above and found the name had changed to sshleifer/distilbart-cnn-12-6.