Cannot load tokenizer for llama2

I am trying to load the tokenizer for Llama 2 using AutoTokenizer, but I am running into this error:

OSError: Can't load tokenizer for 'meta-llama/Llama-2-7b-hf'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'meta-llama/Llama-2-7b-hf' is the correct path to a directory containing all relevant files for a LlamaTokenizer tokenizer.

any help is appreciated!


Hello, have you solved this problem? I'm having the same issue.

Yes, we need to pass the access token (and a proxy, if applicable) when loading the tokenizer as well.


Hi, how did you do that?

tokenizer = AutoTokenizer.from_pretrained(model_id, token=access_token)

The access token can be generated from the Hugging Face UI (Settings → Access Tokens).
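Putting the answers together, here is a minimal sketch of how the token (and an optional proxy) can be passed. The `HF_TOKEN` environment variable name and the `tokenizer_kwargs` helper are just illustrative conventions, not anything required by the library; `token` and `proxies` are real keyword arguments accepted by `from_pretrained`.

```python
import os


def tokenizer_kwargs(proxy_url=None):
    """Build keyword arguments for AutoTokenizer.from_pretrained.

    Assumes the access token (created under Settings -> Access Tokens
    on the Hugging Face site) is exported as HF_TOKEN; the variable
    name is a common convention, not required by transformers.
    """
    token = os.environ.get("HF_TOKEN")
    if token is None:
        raise RuntimeError("Set HF_TOKEN to your Hugging Face access token")
    kwargs = {"token": token}
    if proxy_url is not None:
        # from_pretrained forwards `proxies` to the underlying HTTP requests
        kwargs["proxies"] = {"http": proxy_url, "https": proxy_url}
    return kwargs


# Usage (requires `transformers` installed and access granted to the
# gated meta-llama/Llama-2-7b-hf repo):
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "meta-llama/Llama-2-7b-hf", **tokenizer_kwargs())
```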