Open source · macOS · Free

Chat with your
documents.

rag is a local AI tool that lets you ask questions about any PDF, text, or markdown file. Indexing and retrieval run entirely on your machine, so your documents never leave it.

$ brew tap Eshap09/rag
$ brew install rag
$ rag start
Requires macOS · Homebrew · Python 3.12

See it in action
zsh — rag
~ rag start
 
chromadb ready ~/.rag/chromadb
model ready all-MiniLM-L6-v2
server starting http://localhost:8000
→ opening browser...
 
~ # upload a PDF and ask questions in the browser

How it works
01
Install
One brew command. rag installs Python dependencies, the embedding model, and everything it needs automatically.
02
Upload
Drop any PDF, text, or markdown file into the browser UI. rag chunks it, embeds it locally, and indexes it in seconds.
03
Ask
Type any question. rag retrieves the most relevant passages, sends them to Groq, and returns a grounded answer with source citations.
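The three steps above boil down to embed, retrieve, and prompt. Here is a minimal, self-contained sketch of that loop in pure Python. Everything in it is a stand-in: the bag-of-words `embed` replaces the real sentence-transformers model, the plain list replaces ChromaDB, and the sample chunks are invented for illustration; the Groq call is only indicated in a comment.

```python
import math
import re
from collections import Counter

def embed(text):
    # Stand-in embedding: a bag-of-words Counter, NOT the
    # all-MiniLM-L6-v2 model rag uses via sentence-transformers.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(n * b.get(w, 0) for w, n in a.items())
    na = math.sqrt(sum(n * n for n in a.values()))
    nb = math.sqrt(sum(n * n for n in b.values()))
    return dot / (na * nb) if na and nb else 0.0

index = []  # (chunk_text, metadata, embedding); this list plays ChromaDB's role

def add_chunk(text, source, page):
    index.append((text, {"source": source, "page": page}, embed(text)))

def retrieve(question, k=3):
    qv = embed(question)
    ranked = sorted(index, key=lambda item: cosine(qv, item[2]), reverse=True)
    return ranked[:k]

def build_prompt(question, hits):
    # Each chunk carries its source file and page, which is what
    # powers the citation chips in the answer.
    context = "\n\n".join(f"[{m['source']} p.{m['page']}] {t}" for t, m, _ in hits)
    return f"Answer using only this context:\n\n{context}\n\nQuestion: {question}"

add_chunk("The warranty lasts two years from the date of purchase.", "manual.pdf", 4)
add_chunk("Clean the filter monthly with warm water.", "manual.pdf", 9)
prompt = build_prompt("How long is the warranty?", retrieve("how long is the warranty", k=1))
# rag would now send `prompt` to Groq's chat API and render the answer
# next to its [file, page] citations.
```

The key design point survives the simplification: the model only ever sees the retrieved passages plus the question, which is why answers stay grounded in the uploaded documents.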

Features
🔒
Fully local
Embeddings run on your machine using sentence-transformers. Your documents never leave your computer. Only the final LLM call goes to Groq.
📄
PDF + OCR support
Works on text-based PDFs and image-based PDFs alike. Automatically falls back to OCR via tesseract when the text layer is sparse.
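The fallback described above hinges on one check: does the page's embedded text layer contain enough characters to trust? A sketch of that decision follows, with plain strings standing in for pypdf's `extract_text()` output and pytesseract's OCR result; the 50-character threshold is an assumption for illustration, not rag's actual cutoff.

```python
def extract_page_text(pdf_text, ocr_text, min_chars=50):
    """Prefer the PDF's embedded text layer; fall back to OCR if sparse.

    `pdf_text` stands in for pypdf's page.extract_text() result and
    `ocr_text` for pytesseract.image_to_string() on the rendered page;
    both are plain strings here so the sketch stays self-contained.
    """
    text = (pdf_text or "").strip()
    if len(text) >= min_chars:
        return text, "text-layer"
    return ocr_text.strip(), "ocr"

# A scanned page typically has an empty or near-empty text layer:
scanned = extract_page_text("", "Invoice #1042\nTotal: $99.00")
# A digitally produced page has plenty of extractable text:
digital = extract_page_text("Chapter 1\n" + "Lorem ipsum dolor sit amet. " * 4, "")
```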
🎯
Reranking
Retrieves 20 candidate chunks, then uses a cross-encoder to rerank them by relevance. Far more accurate than vector similarity alone.
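Retrieve-then-rerank is a general pattern: a cheap similarity pass narrows the whole corpus to a shortlist, and an expensive scorer reorders only that shortlist. Here is a sketch with injected scoring functions; the two lambdas are toy stand-ins, not rag's real embedding similarity or cross-encoder scores.

```python
def two_stage_search(query, chunks, fast_score, precise_score,
                     n_candidates=20, n_final=3):
    # Stage 1: cheap score over everything (vector similarity in rag).
    shortlist = sorted(chunks, key=lambda c: fast_score(query, c),
                       reverse=True)[:n_candidates]
    # Stage 2: expensive score over the shortlist only (in rag, a
    # cross-encoder that reads query and chunk together).
    return sorted(shortlist, key=lambda c: precise_score(query, c),
                  reverse=True)[:n_final]

chunks = [
    "return policy lasts 30 days",
    "shipping takes 30 days",
    "refund and return policy details",
]
fast = lambda q, c: len(set(q.split()) & set(c.split()))  # toy word overlap
precise = lambda q, c: 2.0 if "return policy" in c else 0.0  # toy "cross-encoder"
top = two_stage_search("return policy", chunks, fast, precise,
                       n_candidates=3, n_final=2)
```

The accuracy win comes from stage 2 seeing query and chunk jointly, which a precomputed embedding cannot do; the two-stage split keeps that cost bounded to the candidate set.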
📍
Source citations
Every answer shows which file and page it came from. Hover any source chip to preview the exact passage the model used.
🔄
Smart re-indexing
Upload the same file again and rag automatically replaces the old chunks — no duplicates, no manual clearing required.
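Replace-on-re-upload is simple to model: key every chunk by its source file, and delete that file's old chunks before inserting the new ones. A dict-backed sketch of the idea (rag would do the equivalent against its ChromaDB collection; the class and names below are invented for illustration):

```python
class ChunkIndex:
    """Upsert-by-source sketch of smart re-indexing."""

    def __init__(self):
        self._chunks = {}  # (source_file, chunk_number) -> chunk text

    def index_file(self, source, chunks):
        # Drop every chunk from any previous upload of this file first,
        # so re-uploading never leaves stale or duplicate chunks behind.
        self._chunks = {k: v for k, v in self._chunks.items() if k[0] != source}
        for i, text in enumerate(chunks):
            self._chunks[(source, i)] = text

    def chunks_for(self, source):
        return [v for (src, _), v in sorted(self._chunks.items()) if src == source]

idx = ChunkIndex()
idx.index_file("notes.md", ["v1 chunk a", "v1 chunk b", "v1 chunk c"])
idx.index_file("notes.md", ["v2 chunk a"])  # re-upload replaces, not appends
```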
🌓
Auto dark mode
The UI automatically matches your system theme. Light during the day, dark at night — no toggle needed.
FastAPI
LangChain
ChromaDB
sentence-transformers
Groq
cross-encoder
pytesseract
pypdf
Homebrew

Ready to try it?

Three commands. No account. No cloud. No cost.

$ brew tap Eshap09/rag
$ brew install rag
$ rag start