Hi, I would like to specify the output format of a JSON object in the call to meta-llama/Llama-3.2-3B-Instruct. I know this can be done in the prompt itself, but that approach returns a lot of unwanted text and extra JSON objects in addition to the one I requested.
So I would like to call Llama 3.2 3B and, in the call itself, request the exact JSON structure I want back. Can this be done? If so, how do you do it with Hugging Face?
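To make the question concrete, here is a minimal sketch of the kind of call I'm imagining, assuming the server supports schema-constrained decoding (the `response_format` shape below is my guess at the API, not something I've verified; `PERSON_SCHEMA` and `build_request` are just names I made up for illustration):

```python
# JSON schema describing the single object I want back -- nothing else.
PERSON_SCHEMA = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}


def build_request(prompt: str) -> dict:
    """Assemble chat-completion arguments with a schema constraint attached."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 200,
        # This is the part I'm asking about: telling the server, in the call
        # itself, to constrain generation to PERSON_SCHEMA. I'm assuming a
        # {"type": "json", "value": <schema>} shape here.
        "response_format": {"type": "json", "value": PERSON_SCHEMA},
    }


# Intended usage (commented out since it needs huggingface_hub and a token):
# from huggingface_hub import InferenceClient
# client = InferenceClient("meta-llama/Llama-3.2-3B-Instruct")
# out = client.chat_completion(**build_request("Extract: Alice is 30 years old."))
# print(out.choices[0].message.content)  # ideally exactly one JSON object
```

Is something like this actually supported, and what is the correct parameter name and payload shape?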
Thanks all!