Greedy sampling with the new branch

Starting with version 4.39, performing greedy search produces a warning:
`You should set do_sample=True or unset temperature.`
I am loading the pretrained llama2-7b-chat-hf model. I understand that its temperature defaults to 0.6, so I explicitly set it to 0 when calling the generate function.
Something like this: `model.generate(do_sample=False, temperature=0)`
But I still get the warning telling me to either set do_sample to True or unset the temperature.
Is the warning interfering with the greedy decoding process, or can I safely ignore it?
(FYI, I have also tried do_sample=True with top_k=1, which should ideally be the same as greedy decoding, but I wanted to confirm whether do_sample=True really reproduces greedy results.)
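The equivalence I am assuming between top_k=1 sampling and greedy decoding can be sketched with plain NumPy (the logits below are made up for illustration, not taken from the actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logits for a 6-token vocabulary (made-up numbers).
logits = np.array([1.0, 3.5, 0.2, 2.9, -1.0, 0.5])

# Greedy decoding: take the argmax directly.
greedy_token = int(np.argmax(logits))

# Sampling with top_k=1: mask everything except the single best logit
# to -inf, softmax, then sample. The resulting distribution is one-hot,
# so the "sample" is deterministic and identical to the argmax.
# (Temperature scaling rescales all logits by the same positive factor,
# so it does not change which token has the largest logit.)
masked = np.full_like(logits, -np.inf)
best = int(np.argmax(logits))
masked[best] = logits[best]
probs = np.exp(masked - masked.max())
probs /= probs.sum()
sampled_token = int(rng.choice(len(logits), p=probs))

print(greedy_token, sampled_token)  # both print index 1
```

This only demonstrates the per-step token choice; it does not cover anything the model-specific generation config might change between the two code paths.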
If we cannot ignore it, then how should I unset the temperature?