I am new to Ollama. I am using the llama3 model and would like to fine-tune it with Hugging Face to get better responses. How can I do that, and will my data be stored on Hugging Face's servers if I use it?