Function/tool calling using Transformer models

How do I add tool/function calling to a model, or how can I interact with a Hugging Face model using the OpenAI API format?

I loaded the model with AutoModel and AutoProcessor (for the chat template), but I am unable to use tools.
When I use tokenizer.apply_chat_template, the model calls the function on the first message without waiting for the condition to be met.
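One way to handle the "calls the function too early" problem is to only execute a tool when the model's output actually contains a tool call, and otherwise treat the output as plain assistant text. Below is a minimal sketch of that dispatch loop; the exact tool-call format is model-specific, and `parse_tool_call` / the `get_time` tool here are hypothetical stand-ins, not part of any particular model's template:

```python
# Sketch: execute a tool only when the model actually emits a call.
# The JSON tool-call shape and helper names are illustrative assumptions.
import json
import re

def parse_tool_call(text: str):
    """Extract a JSON tool call like {"name": ..., "arguments": {...}}, if present."""
    m = re.search(r"\{.*\}", text, re.DOTALL)
    if not m:
        return None
    try:
        call = json.loads(m.group())
    except json.JSONDecodeError:
        return None
    return call if "name" in call else None

def run_turn(model_output: str, tools: dict) -> str:
    """Dispatch a single assistant turn: run a tool only if one was called."""
    call = parse_tool_call(model_output)
    if call and call["name"] in tools:
        return tools[call["name"]](**call.get("arguments", {}))
    return model_output  # plain assistant text, no tool call

tools = {"get_time": lambda: "12:00"}
print(run_turn("Hello! How can I help?", tools))                 # no tool call, text passes through
print(run_turn('{"name": "get_time", "arguments": {}}', tools))  # tool call, executes get_time
```

The key point is that the condition for running the function lives in your dispatch code, not in the chat template, so the model can chat freely until it emits an explicit call.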

1 Like

There are several ways to do this. Personally, I recommend building an agent around the model. That approach doesn’t depend much on the model’s built-in function-calling support.

If you want to train the model itself to support function calling, that’s a completely different story… or rather, not something I really know.

2 Likes

I have loaded a model with AutoModel and AutoTokenizer.
I want to use the model to collect user information like name, address, age, etc.
When all the information has been collected, I want the model to send it to an endpoint.
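For the "submit only when complete" behavior, one option is to define the submission as a tool and gate it behind a completeness check in your own code. A rough sketch, where the field names and the endpoint URL are examples rather than anything from this thread:

```python
# Sketch: a "submit" tool that should only fire once every field is collected.
# REQUIRED_FIELDS and the endpoint URL are illustrative assumptions.
import json
import urllib.request

REQUIRED_FIELDS = ("name", "address", "age")

def submit_user_info(name: str, address: str, age: int) -> str:
    """Send collected user information to an endpoint.

    Args:
        name: The user's full name.
        address: The user's address.
        age: The user's age.
    """
    payload = json.dumps({"name": name, "address": address, "age": age}).encode()
    req = urllib.request.Request(
        "http://localhost:8000/users",  # example endpoint, replace with yours
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

def ready_to_submit(collected: dict) -> bool:
    """Only execute (or even expose) the submit tool once every field is present."""
    return all(collected.get(f) not in (None, "") for f in REQUIRED_FIELDS)
```

Keeping `ready_to_submit` in your application code (rather than hoping the model waits) is also the simplest fix for the model calling the function too early.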

Also, can I use InferenceClient with a model I am running locally?

1 Like

I think it would be difficult to use the Hugging Face InferenceClient library as-is against a model loaded directly in your process…

It would probably be easier to load the Transformers model locally through another library.
If you use smolagents, its TransformersModel wrapper runs a Transformers model directly. If you want to set up a local server instead, you could use Ollama, llama.cpp, or TGI.
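If you do go the local-server route, InferenceClient can be pointed at the server's URL instead of a Hub model id. A sketch, assuming a TGI-style server listening on `http://localhost:8080` (the URL is an example):

```python
# Sketch: using InferenceClient against a locally hosted server (e.g. TGI).
# The URL is an example; replace it with wherever your server listens.
from huggingface_hub import InferenceClient

client = InferenceClient(model="http://localhost:8080")  # local endpoint URL

try:
    resp = client.chat_completion(
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=64,
    )
    print(resp.choices[0].message.content)
except Exception:
    # No server running at that address; the client itself is still usable.
    print("No local server reachable at http://localhost:8080")
```

Ollama and llama.cpp's server also expose OpenAI-compatible endpoints, so the `openai` client with a custom `base_url` is another option there.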

1 Like