How to use Hugging Face to fine-tune Ollama's local model

I am a newbie. I have downloaded Ollama and can run gemma:2b on my laptop. I want to fine-tune this model, but I did not find a GGUF file under C:/Users/lihuacat/.ollama/models. I found a file under C:/Users/lihuacat/.ollama/models/manifests/registry.ollama.ai/library/gemma/2b, but when I pass that path to the loader, it reports that the file cannot be found. I also tried the blob file C:/Users/lihuacat/.ollama/models/blobs/sha256-c1864a5eb19305c40519da12cc543519e48a0697ecd30e15d5ac228644957d12, but with that file name it reports that the config file cannot be found. Is it true that Ollama can only be used as a proxy for the model? Do I have to download the model file from Hugging Face instead?

OSError: Unable to load weights from pytorch checkpoint file for 'C:/Users/lihuacat/.ollama/models/manifests/registry.ollama.ai/library/gemma/2b' at 'C:/Users/lihuacat/.ollama/models/manifests/registry.ollama.ai/library/gemma/2b'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
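For reference, the error above came from a call roughly like this (simplified; I pointed `AutoModelForCausalLM` at the Ollama manifest path, which contains neither a config.json nor a PyTorch checkpoint):

```python
# Simplified reproduction of the failing call: transformers expects a Hub
# repo ID or a directory containing config.json plus model weights, not an
# Ollama manifest/blob path.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "C:/Users/lihuacat/.ollama/models/manifests/registry.ollama.ai/library/gemma/2b"
)
```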

Hi,

File formats like GGUF are typically meant for inference on local hardware; see ggml/docs/gguf.md at master · ggerganov/ggml · GitHub.

For fine-tuning models, one typically uses a library from the Hugging Face ecosystem such as Transformers, PEFT, or TRL (in combination with GPU hardware); a minimal sketch follows.
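Here's a minimal LoRA fine-tuning sketch with TRL and PEFT. Note that it loads Gemma from the Hugging Face Hub rather than from Ollama's blob store; the model ID, dataset, and hyperparameters are illustrative assumptions, not settings from this thread:

```python
# Minimal LoRA fine-tuning sketch using TRL + PEFT (an assumed setup, not the
# poster's exact one). "google/gemma-2b" is gated: accept the license on the
# Hub and run `huggingface-cli login` first.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")

trainer = SFTTrainer(
    model="google/gemma-2b",    # HF Hub equivalent of Ollama's gemma:2b
    train_dataset=dataset,      # expects a "text" column
    args=SFTConfig(output_dir="gemma-2b-lora", max_steps=100),
    peft_config=LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"),
)
trainer.train()
trainer.save_model("gemma-2b-lora")  # saves the LoRA adapter
```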

After fine-tuning, the weights can be converted to the GGUF format, which allows local inference with Ollama and llama.cpp. See How to convert any HuggingFace Model to gguf file format? - GeeksforGeeks for a tutorial; the steps look roughly like the sketch below.
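A hedged sketch of that post-training step, assuming a LoRA adapter like the one above (all paths and names are placeholders): the adapter is first merged back into the base model so that llama.cpp's converter can read it.

```python
# Merge the LoRA adapter into the base model so the result is a plain
# Transformers checkpoint (placeholder paths; adjust to your setup).
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

model = AutoPeftModelForCausalLM.from_pretrained("gemma-2b-lora")
merged = model.merge_and_unload()  # fold adapter weights into the base model
merged.save_pretrained("gemma-2b-merged")
AutoTokenizer.from_pretrained("google/gemma-2b").save_pretrained("gemma-2b-merged")

# Then, from a llama.cpp checkout, convert to GGUF (run in a shell):
#   python convert_hf_to_gguf.py gemma-2b-merged --outfile gemma-2b.gguf
# and register it with Ollama via a Modelfile containing:
#   FROM ./gemma-2b.gguf
# followed by: ollama create my-gemma -f Modelfile
```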

Additionally, there's Apple's MLX library, which allows you to fine-tune LLMs on a MacBook. See mlx-examples/llms at main · ml-explore/mlx-examples · GitHub for fine-tuning LLMs; a rough sketch follows.
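A rough sketch of that route, assuming the mlx-lm package (`pip install mlx-lm`, Apple Silicon only); the model name and data folder are placeholders, and the data folder is expected to contain train.jsonl and valid.jsonl files:

```python
# Invoke mlx-lm's LoRA training entry point (placeholder model/data values).
import subprocess

subprocess.run([
    "python", "-m", "mlx_lm.lora",
    "--model", "google/gemma-2b",  # any HF model supported by mlx-lm
    "--train",
    "--data", "path/to/data",      # folder with train.jsonl and valid.jsonl
])
```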


Thank you, now I know how to get my job done.

So what exactly causes this? I also ran into a related problem when calling llama3 from VS Code. Do I need to first download the tokenizer files and the rest from Hugging Face?

@lihuacat Did you find any way to fine-tune the model that is available in Ollama?

I am facing a lot of issues fine-tuning Ollama models. I tried with HF models instead and then loaded those models into Ollama, but I got nonsense answers.