I have been granted access to the Llama 70b model (the model page says so), and I have a token. I tried the code below, with "my token" replaced by my actual token in quotes. However, I get this error:

"OSError: You are trying to access a gated repo.
Make sure to request access at meta-llama/Llama-2-70b-chat-hf · Hugging Face and pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`."
MY CODE:
from transformers import AutoModelForCausalLM, AutoTokenizer
import os

hf_access_token = "my token"
os.environ["HF_ACCESS_TOKEN"] = hf_access_token

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-70b-chat-hf")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-70b-chat-hf")
I also tried this code, again with my actual token in place of <your_token_here>:
from transformers import LlamaForCausalLM, LlamaTokenizer
import sentencepiece

token = "<your_token_here>"  # Replace <your_token_here> with the token you obtained

tokenizer = LlamaTokenizer.from_pretrained("meta-llama/Llama-2-70b-chat-hf", use_auth_token=token)
model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-70b-chat-hf", use_auth_token=token)
Again, I got a message about it being a gated model. Can anyone tell me what I am doing wrong? Thank you.
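If it helps, here is a minimal sketch of the environment-variable route I was attempting in the first snippet. I should note the name `HF_ACCESS_TOKEN` was my own guess; as far as I can tell, the variable the Hugging Face libraries actually read is `HF_TOKEN` (older releases: `HUGGING_FACE_HUB_TOKEN`):

```python
import os

# Hugging Face libraries look for HF_TOKEN (or the legacy HUGGING_FACE_HUB_TOKEN);
# an arbitrary name like HF_ACCESS_TOKEN is never consulted.
os.environ["HF_TOKEN"] = "hf_xxx"  # placeholder: substitute a real token

# Once HF_TOKEN is set, from_pretrained should be able to authenticate
# without an explicit token argument, e.g.:
# tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-70b-chat-hf")
```

I have not confirmed whether this alone is enough for a gated repo, so corrections welcome.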