As of transformers v4.44, default chat template is no longer allowed
I have downloaded the Llama 2 model programmatically and hosted it on a server with vLLM. I am accessing it through the OpenAI-compatible API, but I am getting the error above. Code:
completion = self.client.chat.completions.create(
    model=self.model,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_input},
    ],
    stream=self.stream,
)
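For context: the error means the tokenizer shipped with the downloaded model does not define a chat template, and since transformers v4.44 vLLM no longer falls back to a built-in default, so the `/chat/completions` endpoint cannot turn the messages list into a prompt. A chat template is just a rule for rendering the messages into one prompt string; for Llama 2 chat models the expected shape is roughly the following (a minimal pure-Python sketch of the single-turn case, not vLLM's actual Jinja template):

```python
def llama2_prompt(messages):
    """Render an OpenAI-style messages list into the Llama 2 chat
    prompt shape: <s>[INST] <<SYS>> ... <</SYS>> user text [/INST]"""
    system = ""
    user_turns = []
    for m in messages:
        if m["role"] == "system":
            system = m["content"]
        elif m["role"] == "user":
            user_turns.append(m["content"])
    # In the Llama 2 format the system prompt is folded into the
    # first [INST] block inside <<SYS>> markers.
    sys_block = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if system else ""
    return f"<s>[INST] {sys_block}{user_turns[0]} [/INST]"

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
print(llama2_prompt(messages))
```

A common fix along these lines is to supply such a template explicitly when starting the server (e.g. vLLM's `--chat-template` flag pointing at a Jinja file), so the endpoint knows how to render the messages.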