dmayboroda/minima

On-premises conversational RAG with configurable containers

Quality score: 55 / 100 (Established)

Supports four deployment modes: fully local with Ollama, custom OpenAI-compatible LLM servers (vLLM, TGI, LocalAI), ChatGPT via a custom GPT integration, and Anthropic Claude via MCP, all running on a containerized architecture managed with Docker Compose. Implements semantic search with Sentence Transformer embeddings and Qdrant vector storage, optionally adding HuggingFace CrossEncoder reranking in Ollama mode, while the custom LLM mode uses function calling for intelligent retrieval. Provides a web UI at localhost:3000 and an Electron desktop app, and indexes PDF, Excel, DOCX, TXT, Markdown, and CSV documents from configurable local or cloud directories.
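The semantic-search step described above boils down to nearest-neighbor lookup over embedding vectors. A minimal, dependency-free sketch of that operation follows; the three-dimensional "embeddings" and document names are toy illustrations, not real Sentence Transformer output or anything from the minima codebase, and a production setup would delegate this ranking to Qdrant.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, corpus, k=2):
    # Rank document vectors by similarity to the query vector:
    # the core operation a vector store performs at query time.
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 3-dimensional "embeddings"; real Sentence Transformer
# vectors have hundreds of dimensions.
corpus = {
    "invoice.pdf": [0.9, 0.1, 0.0],
    "notes.md":    [0.1, 0.8, 0.2],
    "report.docx": [0.7, 0.3, 0.1],
}
print(top_k([1.0, 0.0, 0.0], corpus))  # → ['invoice.pdf', 'report.docx']
```

The optional CrossEncoder reranking mentioned above would then re-score just these top-k candidates with a heavier model, trading latency for precision.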


No package published · no dependents

Score breakdown:
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 19 / 25


Stars: 1,039
Forks: 104
Language: Python
License: MPL-2.0
Last pushed: Jan 22, 2026
Commits (30d): 0

Get this data via the API:

curl "https://pt-edge.onrender.com/api/v1/quality/rag/dmayboroda/minima"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
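The same endpoint can be called from Python with only the standard library. The sketch below splits URL construction from the network call; the `quality_url` and `fetch_quality` names are illustrative, and the JSON response shape is not documented here, so the fetch simply returns the parsed body.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    # Build the endpoint URL for a repository's quality data.
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    # Anonymous access is rate-limited to 100 requests/day.
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("rag", "dmayboroda", "minima"))
```

Running the module prints the same URL used in the curl example above; call `fetch_quality("rag", "dmayboroda", "minima")` to retrieve the live data.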