Ollama + Llama-3.2-11b-vision-uncensored

Request: Support for pulling this model directly into my ollama environment.

Note: I want to start by saying that I don’t fully know all the details and complexities of this request.

Context:

  • Hugging Face has native ollama integration [1]
  • Llama3.2-vision (the censored version) is already on ollama [2]
  • My understanding is that this model is llama.cpp-compatible, and ollama is built on top of llama.cpp.
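For reference, the Hugging Face integration in [1] lets ollama pull any GGUF repository directly by its Hub path. A rough sketch of what that looks like (the repo path below is a placeholder, not a confirmed repository name for this model):

```shell
# Pull and run a GGUF model straight from the Hugging Face Hub (per [1]).
# {username}/{repository} is a placeholder -- substitute the actual repo
# if/when a GGUF conversion of the uncensored vision model exists.
ollama run hf.co/{username}/{repository}

# A specific quantization can be requested with a tag:
ollama run hf.co/{username}/{repository}:Q4_K_M
```

This only works if someone has published a GGUF conversion of the model on the Hub.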

Question: How easy is it to do this?
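If no direct Hub pull is available, the usual fallback is downloading the GGUF weights manually and importing them with a Modelfile. A minimal sketch, assuming the weights are saved locally as `model.gguf` (filename and model name are illustrative):

```
# Modelfile -- minimal import of a local GGUF file into ollama.
# "./model.gguf" is a placeholder for the downloaded weights.
FROM ./model.gguf
```

Then `ollama create llama3.2-vision-uncensored -f Modelfile` followed by `ollama run llama3.2-vision-uncensored`. One caveat worth verifying: vision models typically ship a separate multimodal projector file alongside the main GGUF, and whether a manual import like this picks it up is not something I can confirm.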

[1] Use Ollama with any GGUF Model on Hugging Face Hub
[2] The llama3.2-vision library


It seems easy.