from transformers import AutoModelForCausalLM, AutoTokenizer

# bnb_config, token, and device are defined earlier in my script
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3B",
    quantization_config=bnb_config,
    trust_remote_code=True,
    token=token,
).to(device)
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")
I'm trying to load the Llama-3.2-3B tokenizer, but I'm getting an error.
I don't know if this is the reason, but you forgot to pass your token:
#tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B", token=token)
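Since `meta-llama/Llama-3.2-3B` is a gated repo, every `from_pretrained` call (model *and* tokenizer) needs an access token with permission for it. A minimal sketch of one way to make sure `token` is actually set before loading — assuming you export your token in a hypothetical `HF_TOKEN` environment variable (the variable name and helper function here are illustrative, not part of the transformers API):

```python
import os

def get_hf_token() -> str:
    """Read the Hugging Face access token from the environment.

    Raises a clear error instead of letting from_pretrained fail later
    with a less obvious gated-repo / authentication message.
    """
    token = os.environ.get("HF_TOKEN", "")
    if not token:
        raise RuntimeError(
            "Set HF_TOKEN before loading gated models like meta-llama/Llama-3.2-3B"
        )
    return token
```

You would then pass `token=get_hf_token()` to both `AutoModelForCausalLM.from_pretrained` and `AutoTokenizer.from_pretrained`. Alternatively, running `huggingface-cli login` once caches the token so you don't need the `token=` argument at all.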
This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.