Ask for a structured JSON object in the call to meta-llama/Llama-3.2-3B-Instruct

Hi, I would like to be able to specify the output format of the JSON object in the call to meta-llama/Llama-3.2-3B-Instruct. I know this can be done in the prompt itself, but then the model returns a lot of unwanted text and extra JSON objects in addition to the one requested.

So I would like to be able to call Llama 3.2 3B and, in the call itself, ask for the exact JSON I want. Can this be done? If so, how do I do it using Hugging Face?

Thanks all!


I think the features called Function Calling and Tools are what you are looking for. However, Function Calling in the free Inference API seems to be buggy at the moment…:sweat_smile:
Also, this feature is supported only by some models, so I don’t know whether it works with standard Llama 3.2.
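Besides tools, another option worth trying is the `response_format` grammar parameter of `huggingface_hub`’s `InferenceClient.chat_completion`, which constrains the output to a JSON schema on TGI-based backends. Here is a minimal sketch; the schema fields (`name`, `age`) are just placeholders, and whether the free serverless API honors this for Llama 3.2 3B specifically is an assumption you would need to verify:

```python
from huggingface_hub import InferenceClient

# Placeholder JSON schema describing the exact object we want back.
schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

# Grammar constraint passed to chat_completion; "json" mode takes a schema.
response_format = {"type": "json", "value": schema}


def ask_for_json(token: str) -> str:
    """Request a schema-constrained JSON answer (requires a valid HF token)."""
    client = InferenceClient("meta-llama/Llama-3.2-3B-Instruct", token=token)
    completion = client.chat_completion(
        messages=[{"role": "user", "content": "Describe a fictional person."}],
        response_format=response_format,
        max_tokens=200,
    )
    # The message content should now be only the JSON object, no extra text.
    return completion.choices[0].message.content
```

If the serverless endpoint rejects `response_format` for this model, the same code should work against a dedicated Inference Endpoint or a local TGI server running the model.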