chat_template is not set and apply_chat_template() throws an error

When I run the sample code from here:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/blenderbot-400M-distill")

chat = [
   {"role": "user", "content": "Hello, how are you?"},
   {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
   {"role": "user", "content": "I'd like to show off how chat templating works!"},
]

tokenizer.apply_chat_template(chat, tokenize=False)

I get the error message:

ValueError: Cannot use apply_chat_template() because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating

I am using tokenizers 0.19.1 and transformers 4.44.2 on Ubuntu on Windows (WSL).

Any clue what the problem could be? I have already uninstalled and reinstalled both packages.
Thank you for your help!!

I have the same issue. Can somebody help?


Thanks for reporting this. It happens because Transformers initially shipped default chat templates for models that did not have one set. Since that led to various issues, a chat template now has to be set explicitly (the PR for that is here).

I’ll open an issue on the Transformers library to replace the code snippet with a model that does support chat templates (such as Mistral-7B, Llama 3.1, or Qwen2). To check whether a given model supports them, look for a chat_template attribute in its tokenizer_config.json file.
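As a workaround, you can also set a template yourself: chat templates are just Jinja2 strings, and you can assign one to tokenizer.chat_template (or pass it via the chat_template argument of apply_chat_template) before formatting. The sketch below renders a ChatML-style template with plain jinja2 to show what apply_chat_template(chat, tokenize=False) would produce; the template string is an illustrative example, not the official template for BlenderBot or any other specific model.

```python
# Chat templates are Jinja2 strings rendered over a list of message dicts.
# This renders one directly with jinja2 to show the kind of string that
# apply_chat_template(tokenize=False) returns. The ChatML-style markers
# (<|im_start|>/<|im_end|>) are an illustrative convention, not the
# template of any particular tokenizer.
from jinja2 import Template

# A minimal ChatML-style template: one block per message.
chatml_template = (
    "{% for message in messages %}"
    "<|im_start|>{{ message['role'] }}\n{{ message['content'] }}<|im_end|>\n"
    "{% endfor %}"
)

chat = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
]

rendered = Template(chatml_template).render(messages=chat)
print(rendered)
```

With a real tokenizer you would assign the same string to tokenizer.chat_template and then call apply_chat_template as in the original snippet.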

This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.