How can I connect the Tools with HuggingFace Models?

Hello, I want to connect tools to Hugging Face models running locally, rather than using a hosted API LLM such as OpenAI or Claude.

I wrote the following code based on the tutorial.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain_huggingface import HuggingFacePipeline, ChatHuggingFace
from langchain.agents import tool

model_name = "NousResearch/Hermes-2-Pro-Llama-3-8B"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two numbers"""
    return a * b * 10

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=256,
)

hf = HuggingFacePipeline(pipeline=pipe)
chat_model = ChatHuggingFace(llm=hf, tokenizer=tokenizer)
model_with_tool = chat_model.bind_tools([multiply])

query = "a: 5, b: 3, then what is a * b? "
response = model_with_tool.invoke([{"role": "user", "content": query}])

print(f"Message content: {response.content}\n")
print(f"Tool calls: {response.tool_calls}")
```

But the tool call doesn't work: `response.tool_calls` comes back empty.

Does anyone know what's causing this?
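Note that even when `tool_calls` is populated, the model only *requests* the call; your code still has to execute it. Below is a minimal pure-Python sketch of the dispatch step an agent loop performs, assuming `tool_calls` is a list of dicts with `"name"`, `"args"`, and `"id"` keys (the shape LangChain's `AIMessage.tool_calls` uses); the `fake_calls` input simulates what a tool-capable model would emit.

```python
# Hedged sketch: manually dispatching tool calls the way an agent loop would.
# Assumes each call is shaped like {"name": ..., "args": {...}, "id": ...},
# which is the shape of LangChain's AIMessage.tool_calls entries.

def multiply(a: int, b: int) -> int:
    """Multiply two numbers (same toy tool as above)."""
    return a * b * 10

# Registry mapping tool names to callables.
TOOLS = {"multiply": multiply}

def run_tool_calls(tool_calls):
    """Execute each requested tool and collect its result."""
    results = []
    for call in tool_calls:
        fn = TOOLS.get(call["name"])
        if fn is None:
            results.append({"id": call.get("id"),
                            "error": f"unknown tool {call['name']}"})
            continue
        results.append({"id": call.get("id"), "output": fn(**call["args"])})
    return results

# Simulated model output for the query "a: 5, b: 3, then what is a * b?"
fake_calls = [{"name": "multiply", "args": {"a": 5, "b": 3}, "id": "call_0"}]
print(run_tool_calls(fake_calls))  # [{'id': 'call_0', 'output': 150}]
```

In a real loop you would feed each result back to the model as a tool message so it can produce the final answer.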


There's a lot of confusion around how LLM tool calling is implemented… For purely local use, Ollama is another option.
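For context, Ollama's `/api/chat` endpoint accepts an OpenAI-style `tools` field. A minimal sketch of the request payload is below; the schema follows the function-tool format documented by Ollama, and the model name `llama3.1` is just an example stand-in for any tool-capable local model.

```python
import json

# Hedged sketch: the JSON body for Ollama's /api/chat endpoint with tools.
# The tools schema is the OpenAI-style function format that Ollama accepts;
# "llama3.1" is an example model name -- substitute any tool-capable model.
payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "a: 5, b: 3, then what is a * b?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "multiply",
                "description": "Multiply two numbers",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"type": "integer"},
                        "b": {"type": "integer"},
                    },
                    "required": ["a", "b"],
                },
            },
        }
    ],
    "stream": False,
}

body = json.dumps(payload)
# POST this to http://localhost:11434/api/chat; a tool-capable model replies
# with message.tool_calls instead of free-text content.
print(body[:60])
```

The key difference from the HF pipeline route is that Ollama parses the model's output into structured `tool_calls` for you.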