Hugging Face Gated Community: Your request to access model meta-llama/Llama-3.2-3B-Instruct has been rejected by the repo's authors

I requested access to the Llama-3.2 repo via the website, but the request was denied and no reason was given.

I would like to understand why the request was denied, so that I can choose an alternative to Hugging Face if necessary.

Please help me resolve this issue or understand why it’s occurring:

huggingface_hub.errors.GatedRepoError: 403 Client Error. (Request ID: Root=1-675179b0-1abce7fd7ccd648b57404e8c;b7772877-458b-4b41-9910-9e1dffeffb87)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct/resolve/main/config.json.
Your request to access model meta-llama/Llama-3.2-3B-Instruct has been rejected by the repo's authors.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/launch_server.py", line 13, in <module>
    launch_server(server_args)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/srt/server.py", line 487, in launch_server
    launch_engine(server_args=server_args)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/srt/server.py", line 449, in launch_engine
    tokenizer_manager = TokenizerManager(server_args, port_args)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/srt/managers/tokenizer_manager.py", line 105, in __init__
    self.model_config = ModelConfig(
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/srt/configs/model_config.py", line 45, in __init__
    self.hf_config = get_config(
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/sglang/srt/hf_transformers_utils.py", line 64, in get_config
    config = AutoConfig.from_pretrained(
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1017, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/transformers/configuration_utils.py", line 574, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/transformers/configuration_utils.py", line 633, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/ubuntu/.traiano/bug24/venv/lib/python3.10/site-packages/transformers/utils/hub.py", line 421, in cached_file
    raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct.
403 Client Error. (Request ID: Root=1-675179b0-1abce7fd7ccd648b57404e8c;b7772877-458b-4b41-9910-9e1dffeffb87)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct/resolve/main/config.json.
Your request to access model meta-llama/Llama-3.2-3B-Instruct has been rejected by the repo's authors.
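
One way to confirm that the failure is on the Hub side rather than in sglang is to try fetching config.json directly with huggingface_hub. This is only a minimal sketch, assuming you are already logged in with `huggingface-cli login` or have HF_TOKEN set; the error handling around GatedRepoError is my own addition:

```python
# Minimal check: try to fetch the gated config.json directly with huggingface_hub.
# Assumes a valid token is configured (`huggingface-cli login` or HF_TOKEN).
from huggingface_hub import HfApi, hf_hub_download
from huggingface_hub.errors import GatedRepoError

api = HfApi()
print("Logged in as:", api.whoami()["name"])  # confirms which account the token belongs to

try:
    path = hf_hub_download(
        repo_id="meta-llama/Llama-3.2-3B-Instruct",
        filename="config.json",
    )
    print("Access OK, config cached at:", path)
except GatedRepoError as err:
    # Same 403 as in the sglang traceback: the access request itself was rejected,
    # so re-requesting access (or using an ungated mirror) is the only fix.
    print("Gated repo access denied:", err)
```

If this standalone check raises the same GatedRepoError, the token is fine and the rejection really is the access request itself.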




Try unsloth’s ungated models.
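
For example, something like the following loads a non-gated copy without any access request. The repo id is my guess at the unsloth mirror, so check https://huggingface.co/unsloth for the exact name:

```python
# Sketch: load an ungated unsloth mirror instead of the gated meta-llama repo.
# The repo id below is an assumption; verify it on the unsloth org page.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "unsloth/Llama-3.2-3B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)
```

The same repo id should also work as the model path passed to the sglang server (e.g. its `--model-path` argument), since it just forwards the name to transformers.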

I heard that requests from EU residents are refused. Maybe it is an issue with a law or treaty?

I’m in Singapore … Is there an embargo against Singapore too?


I don't know. I've never heard that it's specifically prohibited… :thinking:
I've used unsloth from the beginning because it's troublesome to get permission.