Chat_template is not set & throwing error

Thanks for flagging this. It happens because we initially shipped default chat templates for models that didn't have one set. As this led to various issues, one now needs to explicitly add a chat template (PR for that is here).

I’ll open an issue on the Transformers library to replace the code snippet with a model that supports chat templates (like Mistral-7B, Llama 3.1, Qwen2, etc.). In the meantime, check whether the chat_template attribute is present in the model's tokenizer_config.json file.
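As a quick local check, something like the sketch below inspects a downloaded tokenizer_config.json for the chat_template key and adds one if it is missing. The file contents and the Jinja template string here are purely illustrative; real models ship their own templates, so only use this pattern for models that genuinely lack one:

```python
import json
from pathlib import Path

def has_chat_template(config_path) -> bool:
    """Return True if the tokenizer config defines a chat_template."""
    config = json.loads(Path(config_path).read_text())
    return "chat_template" in config

# Illustrative: create a config without a template, detect that, then add one.
path = Path("tokenizer_config.json")
path.write_text(json.dumps({"model_max_length": 4096}))

if not has_chat_template(path):
    config = json.loads(path.read_text())
    # A hypothetical minimal Jinja template for demonstration only.
    config["chat_template"] = (
        "{% for message in messages %}"
        "{{ message['role'] }}: {{ message['content'] }}\n"
        "{% endfor %}"
    )
    path.write_text(json.dumps(config))

print(has_chat_template(path))  # True once the template has been written
```

Equivalently, with a loaded Transformers tokenizer you can read or set the tokenizer.chat_template attribute directly; apply_chat_template will raise an error when no template is set.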
