Hello,
I have a question about downloading a model to run locally with Ollama. It seems that the new DeepSeek-R1 model doesn't currently support function calling, but the Unsloth team changed something in their tokenizer chat template to allow it.
I just want to test whether the model really works with tools in Ollama. I downloaded it with:
ollama run hf.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF:Q2_K_L
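For reference, this is how I plan to test tool calling once the model is up: a minimal call against Ollama's /api/chat endpoint. The get_weather tool is just a made-up example for the test, not something from the model card:

curl http://localhost:11434/api/chat -d '{
  "model": "hf.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF:Q2_K_L",
  "stream": false,
  "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"]
      }
    }
  }]
}'

If tool calling works, the assistant message in the response should contain a tool_calls field instead of plain content.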
This is the chat template published on the Hugging Face Hub:
{% if not add_generation_prompt is defined %}
{% set add_generation_prompt = false %}
{% endif %}
{% set ns = namespace(is_first=false, is_tool=false, is_output_first=true, system_prompt='') %}
{%- for message in messages %}
{%- if message['role'] == 'system' %}
{% set ns.system_prompt = message['content'] %}
{%- endif %}
{%- endfor %}
{{bos_token}}{{ns.system_prompt}}{%- for message in messages %}
{%- if message['role'] == 'user' %}
{%- set ns.is_tool = false -%}
{{'<|User|>' + message['content']}}{%- endif %}
{%- if message['role'] == 'assistant' and message['content'] is none %}
{%- set ns.is_tool = false -%}
{%- for tool in message['tool_calls']%}
{%- if not ns.is_first %}
{{'<|Assistant|><|tool▁calls▁begin|><|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\n' + '```json' + '\n' + tool['function']['arguments'] + '\n' + '```' + '<|tool▁call▁end|>'}}{%- set ns.is_first = true -%}
{%- else %}
{{'\n' + '<|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\n' + '```json' + '\n' + tool['function']['arguments'] + '\n' + '```' + '<|tool▁call▁end|>'}}{{'<|tool▁calls▁end|><|end▁of▁sentence|>'}}{%- endif %}
{%- endfor %}
{%- endif %}
{%- if message['role'] == 'assistant' and message['content'] is not none %}
{%- if ns.is_tool %}
{{'<|tool▁outputs▁end|>' + message['content'] + '<|end▁of▁sentence|>'}}{%- set ns.is_tool = false -%}
{%- else %}
{% set content = message['content'] %}
{% if '</think>' in content %}
{% set content = content.split('</think>')[-1] %}
{% endif %}
{{'<|Assistant|>' + content + '<|end▁of▁sentence|>'}}{%- endif %}
{%- endif %}
{%- if message['role'] == 'tool' %}
{%- set ns.is_tool = true -%}
{%- if ns.is_output_first %}
{{'<|tool▁outputs▁begin|><|tool▁output▁begin|>' + message['content'] + '<|tool▁output▁end|>'}}{%- set ns.is_output_first = false %}
{%- else %}
{{'\n<|tool▁output▁begin|>' + message['content'] + '<|tool▁output▁end|>'}}{%- endif %}
{%- endif %}
{%- endfor -%}
{% if ns.is_tool %}
{{'<|tool▁outputs▁end|>'}}{% endif %}
{% if add_generation_prompt and not ns.is_tool %}
{{'<|Assistant|>'}}{% endif %}
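To double-check that this is really the template shipped in the repo and not something I mis-copied, it can be pulled straight from the Hub. I'm assuming here that it lives under the standard chat_template key of tokenizer_config.json and that jq is installed:

curl -s https://huggingface.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF/raw/main/tokenizer_config.json | jq -r '.chat_template'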
But this is what I get from: ollama show hf.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF:Q2_K_L --modelfile
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM hf.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF:Q2_K_L
FROM /usr/share/ollama/.ollama/models/blobs/sha256-d7b38b112a9b76e7e926fd2826b6e2aa325c9bfe41afa65742fc4f3a3a751c38
TEMPLATE """{{- if .Suffix }}<|fim▁begin|>{{ .Prompt }}<|fim▁hole|>{{ .Suffix }}<|fim▁end|>
{{- else if .Messages }}
{{- range $i, $_ := .Messages }}
{{- if eq .Role "user" }}<|User|>
{{- else if eq .Role "assistant" }}<|Assistant|>
{{- end }}{{ .Content }}
{{- if eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|Assistant|>
{{- end }}
{{- else if eq .Role "assistant" }}<|end▁of▁sentence|><|begin▁of▁sentence|>
{{- end }}
{{- end }}
{{- end }}"""
PARAMETER stop <|begin▁of▁sentence|>
PARAMETER stop <|end▁of▁sentence|>
PARAMETER stop <|User|>
PARAMETER stop <|Assistant|>
PARAMETER stop <|fim▁begin|>
PARAMETER stop <|fim▁hole|>
PARAMETER stop <|fim▁end|>
As you can see, the two templates are not the same. I know I can edit the Modelfile and add the template by hand, but shouldn't this be handled automatically when pulling from Hugging Face?
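In the meantime, my workaround would be something like this (deepseek-r1-tools is just a name I picked; note that the TEMPLATE block has to stay in Ollama's Go-template syntax, so the Jinja template above can't be pasted in unchanged):

ollama show hf.co/unsloth/DeepSeek-R1-Distill-Qwen-14B-GGUF:Q2_K_L --modelfile > Modelfile
# edit Modelfile by hand: rewrite the TEMPLATE block so it emits the tool tokens
ollama create deepseek-r1-tools -f Modelfile
ollama run deepseek-r1-tools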
Have a nice day!