How to download a model and run it with Ollama locally?

This guy has the exact video you’re looking for. The first half is about quantized/GGUF models, but around the 4:23 mark he gets into regular models, like what you’re asking about.
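
In case a non-video reference helps, here’s a minimal sketch of the same download-and-run workflow in Python using the official `ollama` client (`pip install ollama`). It assumes the Ollama app/server is already installed and running locally; the model name `llama3.2` is just an example tag, swap in whichever model you want to pull:

```python
import ollama

# Download the model from the Ollama registry
# (equivalent to `ollama pull llama3.2` on the CLI).
ollama.pull("llama3.2")

# Run a single chat turn against the locally served model.
response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
```

You can do the same thing entirely from the terminal with `ollama pull <model>` and then `ollama run <model>` for an interactive session.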
