Chat with PDF locally using Llama 3

Hi everyone,

Recently, we added a chat with PDF feature, local RAG, and Llama 3 support to RecurseChat, a local AI chat app for macOS.

I wrote about why we built it and the technical details here: Local Docs, Local AI: Chat with PDF locally using Llama 3.

You can chat with PDFs locally and offline using built-in models such as Meta Llama 3 and Mistral, your own GGUF models, or online providers like Together AI and Groq.

Looking forward to your feedback!