LLAMA-2 conversation generated responses always empty

I have already found posts describing the problem of getting no response when the prompt is long, but as you can see, my example is minimal and I still get an empty response:

I’m using the default system prompt:

Conversation id: 70ca71b2-ac9d-42c2-9991-3f153f035a86 
user >> <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.

If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>

Going to the movies tonight - any suggestions? 
bot >> The Big lebowski. 
user >> Is it good? 
bot >>

The solution is to set max_length when creating the pipeline:

from transformers import pipeline

# A larger max_length leaves room for the long default system prompt,
# so generation is not cut off before the reply starts.
llama = pipeline(
    "conversational",
    model="meta-llama/Llama-2-7b-chat-hf",
    max_length=2000
)
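For completeness, here is a minimal sketch of how the pipeline can then be driven with the Conversation helper; the prompts and variable names are only illustrative:

from transformers import pipeline, Conversation

# Build the pipeline with a generous max_length as above.
llama = pipeline(
    "conversational",
    model="meta-llama/Llama-2-7b-chat-hf",
    max_length=2000
)

# Start a conversation and read back the generated reply.
conversation = Conversation("Going to the movies tonight - any suggestions?")
conversation = llama(conversation)
print(conversation.generated_responses[-1])

# Follow-up turns reuse the same Conversation object.
conversation.add_user_input("Is it good?")
conversation = llama(conversation)
print(conversation.generated_responses[-1])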