As of transformers v4.44, default chat template is no longer allowed

I have downloaded the Llama 2 model programmatically and am hosting it with vLLM on a server, which I access through the OpenAI-compatible API. When I send a chat request I get this error:

As of transformers v4.44, default chat template is no longer allowed

Code:

# self.client is an OpenAI client pointed at the vLLM server, e.g.:
# self.client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
completion = self.client.chat.completions.create(
    model=self.model,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_input},
    ],
    stream=self.stream,
)

Perhaps this?
https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html#chat-template
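To expand on that link: this error usually means the model's tokenizer config has no built-in `chat_template`, so vLLM needs one passed explicitly when the server starts. A minimal sketch, assuming vLLM's OpenAI-compatible server; the model name and the `template_llama2.jinja` path are placeholders for your own setup:

```shell
# Launch the OpenAI-compatible server with an explicit chat template.
# --chat-template points at a Jinja template file matching your model's
# expected prompt format (placeholder path shown here).
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-2-7b-chat-hf \
    --chat-template ./template_llama2.jinja
```

With the template supplied server-side, the client code above should work unchanged.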