meta-llama/Llama-2-7b-chat-hf not performing well

I was using meta-llama/Llama-2-7b-chat-hf for my project, asking it to generate questions based on a given context. It generates questions for some contexts but not for others. I tried the same thing in the Hugging Face Llama space, and it performed much better there.

Is there any difference between meta-llama/Llama-2-7b-chat-hf and the model used in that space? If so, how can I access the model the space is using?
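One thing worth checking: the hosted space wraps your input in the Llama-2 chat prompt template (`[INST]` / `<<SYS>>` markers) before generation, while calling the raw checkpoint locally with plain text skips that formatting, which can noticeably degrade chat-model output. A minimal sketch of building that template yourself (the function name and example context here are my own, not from the original post):

```python
def build_llama2_chat_prompt(user_message: str,
                             system_prompt: str = "You are a helpful assistant.") -> str:
    """Wrap a single-turn message in the Llama-2 chat format
    the -chat-hf checkpoints were fine-tuned on."""
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_prompt}\n"
        "<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

# Hypothetical usage: format the question-generation request before
# passing it to tokenizer/model.generate.
context = "The Amazon rainforest spans nine countries in South America."
prompt = build_llama2_chat_prompt(
    f"Generate three questions based on the following context:\n{context}"
)
print(prompt)
```

Newer versions of `transformers` can also apply this for you via `tokenizer.apply_chat_template`, so you don't have to hand-build the string. Sampling settings (temperature, `max_new_tokens`, repetition penalty) used by the space are another possible source of the difference.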