Running LLMs on Android

Newbie hobby project: running Ollama and other LLMs on an Android phone, while learning AI along the way.

I used Termux to install a proot Ubuntu environment and Ollama, following AI-generated instructions.
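For anyone wanting to try the same setup, the rough sequence looks something like this (a sketch, not a tested recipe: `proot-distro` and the Ollama install script are the standard tools, but package names and behaviour may vary by device and Termux version):

```shell
# Inside Termux: install proot-distro and an Ubuntu rootfs
pkg update && pkg install -y proot-distro
proot-distro install ubuntu

# Log in to the Ubuntu environment
proot-distro login ubuntu

# Inside Ubuntu: install Ollama via its official install script
apt update && apt install -y curl
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server in the background, then pull and run a small model
ollama serve &
ollama run llama3.2:3b
```

Note that without root, Ollama runs entirely on the CPU inside proot, so expect modest token speeds.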

I stopped at the model-loading stage while I figure out, on public platforms, which models fit my use cases.

If anyone has done something similar: which models, sizes, and use cases run well on a phone with 12 GB of RAM and 512 GB of storage?

Which mini models could handle basic rewriting, grammar correction, chatting with a dataset, image tagging, digital garden and static site workflows, etc.?
I’ve tried the pocketai app, but it’s not great.

Suggestions appreciated.

I’ve also got a couple of spare 8 GB phones that I imagine would be better than a Raspberry Pi for self-hosting.


When it comes to running on an edge device, I think it’s safest to use a model that is at most 7B parameters after quantization, and preferably 3B or less.
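As a rough sanity check on why that limit makes sense, you can estimate a model's weight footprint from its parameter count and quantization bit width (a back-of-the-envelope sketch; real runtime usage adds KV cache, activations, and OS overhead on top):

```python
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights in decimal GB."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B model at 4-bit quantization is roughly 3.5 GB of weights...
print(round(model_size_gb(7, 4), 1))  # ~3.5
# ...while a 3B model at 4 bits is about 1.5 GB.
print(round(model_size_gb(3, 4), 1))  # ~1.5
```

On a 12 GB phone that leaves reasonable headroom for Android itself, the context cache, and other apps; on the spare 8 GB phones the 3B-and-under range is the safer bet.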