Hello All,
Please pardon my basic/trivial questions; I am still in the learning phase.
Context:
I used AutoTrain to fine-tune meta-llama/Llama-3.2-1B (I have access to this model).
AutoTrain completed successfully.
I want to test the fine-tuned model using the HF Inference Endpoints option, but I am getting the following error and am not sure how to resolve it.
I would appreciate your help with this.
Thank you!
==============error================
Exit code: 3. Reason: File "/app/huggingface_inference_toolkit/handler.py", line 22, in __init__
self.pipeline = get_pipeline(
^^^^^^^^^^^^^
File "/app/huggingface_inference_toolkit/utils.py", line 252, in get_pipeline
hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/pipelines/__init__.py", line 849, in pipeline
config = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/configuration_auto.py", line 1054, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/configuration_utils.py", line 591, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/configuration_utils.py", line 650, in _get_config_dict
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/dist-packages/transformers/utils/hub.py", line 421, in cached_file
raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B.
401 Client Error. (Request ID: Root=1-67a3db7b-5d79316f2004d07c068598e2;57232df2-b5bf-4ca0-b882-4b0a4cfc1100)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B is restricted. You must have access to it and be authenticated to access it. Please log in.
Application startup failed. Exiting.
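
For reference, the traceback shows the endpoint container failing to download config.json from the gated base repo (meta-llama/Llama-3.2-1B). Below is a small local check that, as far as I understand, reproduces that same request using a personal access token; the token value is just a placeholder, and I am not sure whether the Inference Endpoint passes such a token along, so this is only a sketch of what I think should work.

==============local check (sketch)================
# Python: try to fetch the same config.json the endpoint fails to resolve,
# using a personal access token from an account that has accepted the Llama 3.2 license.
from huggingface_hub import hf_hub_download

TOKEN = "hf_xxx"  # placeholder, not a real token

path = hf_hub_download(
    repo_id="meta-llama/Llama-3.2-1B",
    filename="config.json",
    token=TOKEN,
)
# This print is only reached if the token can actually read the gated repo.
print("config.json downloaded to:", path)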