Cannot load tokenizer for llama2

Hi,
I am trying to load the tokenizer for Llama 2 using AutoTokenizer, but I am running into this error:

OSError: Can't load tokenizer for 'meta-llama/Llama-2-7b-hf'. If you were trying to load it from 'Models - Hugging Face', make sure you don't have a local directory with the same name. Otherwise, make sure 'meta-llama/Llama-2-7b-hf' is the correct path to a directory containing all relevant files for a LlamaTokenizer tokenizer.

Any help is appreciated!

Thanks

Hello, have you solved this problem? I'm having the same issue too.

Yes, we need to pass the access_token (and a proxy, if applicable) to the tokenizer as well.


Hi, how did you do that?

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(model_id, token=access_token)

The access_token can be generated from the Hugging Face UI (under your account's Access Tokens settings).


Hi, what did you put in the variable model_id? I am trying it on Colab and facing this same issue.

model_id should be the repository name as it appears on the Hugging Face model hub,
for example: meta-llama/Meta-Llama-3-8B
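Putting the answers above together, here is a minimal sketch (assuming transformers is installed, you have been granted access to the gated repo, and the token is stored in an environment variable — the name HF_TOKEN is just a convention used here, not required by the library):

```python
import os

# Repository name exactly as it appears on the Hugging Face Hub.
model_id = "meta-llama/Llama-2-7b-hf"

# Access token generated in the Hugging Face UI; read from an
# environment variable so it is not hard-coded in the script.
access_token = os.environ.get("HF_TOKEN")

if access_token:
    from transformers import AutoTokenizer

    # `token=` authenticates the download of the gated tokenizer files.
    # If you are behind a proxy, a `proxies={"https": "..."}` keyword
    # argument can be passed here as well.
    tokenizer = AutoTokenizer.from_pretrained(model_id, token=access_token)
    print(tokenizer("Hello world")["input_ids"])
else:
    print("Set HF_TOKEN to your Hugging Face access token first.")
```

Note that for Llama repos you also have to accept the license on the model page before the token will be authorized to download the files.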