When I try to execute the following lines of code:
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Qwen/Qwen2.5-7B-Instruct"

# Load the tokenizer and the model in 8-bit via bitsandbytes
quantization_config = BitsAndBytesConfig(load_in_8bit=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=quantization_config,
)
The tokenizer raises a 404 Client Error: Not Found, specifically:
"Entry Not Found for URL: https://huggingface.co/api/models/Qwen/Qwen2.5-7B-Instruct/tree/main/additional_chat_templates?recursive=false&expand=false. additional_chat_templates does not exist on 'main'."
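In case it helps narrow things down, this is the check I'm using to see which files actually exist on main for that repo, independent of the transformers loading path (a rough sketch using huggingface_hub, which I assume is already installed as a transformers dependency):

# Sketch: list the repo tree on "main" directly via huggingface_hub,
# to see whether additional_chat_templates (or any chat template file) exists there.
from huggingface_hub import HfApi

api = HfApi()
files = api.list_repo_files("Qwen/Qwen2.5-7B-Instruct", revision="main")
print([f for f in files if "chat_template" in f])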
The libraries I am using are:

- tokenizers == 0.21.2
- transformers == 4.53.3
- bitsandbytes == 0.48.1
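To rule out a stale environment, I'm also verifying the installed versions at runtime with something like this (a small sketch using importlib.metadata):

# Sketch: print the versions actually installed in the active environment,
# to confirm they match the pins listed above.
from importlib.metadata import version

for pkg in ("tokenizers", "transformers", "bitsandbytes"):
    print(pkg, version(pkg))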
Is there anything I can do to fix this issue? Could it be related to a version mismatch? Any advice would be appreciated.