Hugging Face docs tool-calling question

I’m trying to do tool calling with the Gemma2 model. I just followed the Hugging Face developer guide on chat templating.

Using the function below, I got a wrong answer. How can I fix it?

def get_current_temperature(location: str, unit: str) -> float:
    """
    Get the current temperature at a location.
    
    Args:
        location: The location to get the temperature for, in the format "City, Country"
        unit: The unit to return the temperature in. (choices: ["celsius", "fahrenheit"])
    Returns:
        The current temperature at the specified location in the specified units, as a float.
    """
    return 22.  # A real function should probably actually get the temperature!

I’d really appreciate your help.
Thanks!

1 Like

I think the problem is the hardcoded 22; it isn’t a real temperature.
How about using this code instead?
You can get an API key from OpenWeather (openweathermap.org), but keep the free-tier limit in mind (1,000 calls per day).

import requests

def get_current_temperature(location: str, unit: str) -> float:
    """
    Get the current temperature at a location.

    Args:
        location: The location to get the temperature for, in the format "City, Country".
        unit: The unit to return the temperature in. (choices: ["celsius", "fahrenheit"])
    Returns:
        The current temperature at the specified location in the specified units, as a float.
    """
    api_key = "your_openweather_api_key"  # Replace with your actual API key.
    base_url = "http://api.openweathermap.org/data/2.5/weather"
    
    params = {
        "q": location,
        "appid": api_key,
        "units": "metric" if unit == "celsius" else "imperial"
    }

    response = requests.get(base_url, params=params)
    if response.status_code == 200:
        data = response.json()
        return data["main"]["temp"]
    else:
        raise ValueError(f"Failed to get data: {response.status_code} - {response.text}")
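
If it helps, here is a minimal sketch of how you might then register the function as a tool via the chat template. It assumes you already have a tokenizer and a messages list set up as in the chat templating guide; those names are not from your code.

# Sketch: pass the function to the chat template as a tool (assumes `tokenizer`
# and `messages` already exist, as in the Hugging Face chat templating guide).
inputs = tokenizer.apply_chat_template(
    messages,
    tools=[get_current_temperature],  # the docstring and type hints become the tool schema
    add_generation_prompt=True,
    return_dict=True,
    return_tensors="pt",
)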

2 Likes

Thanks for your reply.
I just switched the model to Llama; Gemma2 can’t do tool calling unless I fine-tune it.
I got the result below with your code (slightly changed; I don’t use the unit variable).

{"name": "get_current_temperature", "parameters": {"location": "Seoul, South Korea"}}

I have one more question.
To actually run the function, can I look it up by its key in a dictionary?
Or, if I append it to the buffer, can the LLM access the function?

1 Like

Thanks for your reply! Switching to Llama makes sense if Gemma2 lacks tool-calling capabilities out of the box.

Regarding your question:
Yes, you can look a function up by its key in a dictionary and call it. For instance, if you store your functions in a dictionary, you can dispatch to them dynamically like this:

functions = {
    "get_current_temperature": get_current_temperature  # map the tool name to your implementation
}

# Dynamically look up and call the function by name
result = functions["get_current_temperature"](location="Seoul, South Korea", unit="celsius")

If you’re asking about appending the function to the buffer and letting the LLM “access” it:

  • The LLM itself doesn’t directly execute or call functions; it simply generates text outputs or JSON-like instructions, like the one you shared:
    {"name": "get_current_temperature", "parameters": {"location": "Seoul, South Korea"}}
    
  • You’d still need to parse the LLM output programmatically, extract the function name ("name") and parameters, and then map it to your actual implementation (using a dictionary or similar structure).

If you append function information to the buffer, the LLM may reference it contextually (if trained or prompted properly), but it won’t “execute” the function for you. You’d need to handle that part in your code.
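
Putting it together, here is a minimal sketch of that parse-and-dispatch step. It assumes the tool call arrives as the JSON string you posted, that you kept your modified function (location only, no unit), and that you hold the chat history in a messages list as in the chat templating guide; the exact tool-message format can vary by model.

import json

# The raw tool call generated by the model (format as in your example).
tool_call = '{"name": "get_current_temperature", "parameters": {"location": "Seoul, South Korea"}}'

# Map tool names to the Python functions that implement them.
functions = {"get_current_temperature": get_current_temperature}

call = json.loads(tool_call)
# Works with your modified function that takes only `location`;
# with the original signature you'd also need to supply `unit`.
result = functions[call["name"]](**call["parameters"])

# Append the result to the chat history so the model can use it on its next turn.
# (Assumes a `messages` list as in the chat templating guide; the exact role/format
# depends on the model's chat template.)
messages.append({"role": "tool", "name": call["name"], "content": str(result)})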

Hope this helps!

1 Like
