anything-llm and llmcore
About anything-llm
Mintplex-Labs/anything-llm
The all-in-one AI productivity accelerator. On-device and privacy-first, with no annoying setup or configuration.
Supports document-based RAG through pluggable vector databases (Pinecone, Weaviate, Qdrant, Chroma, etc.) and works with 20+ LLM providers, from local runtimes (llama.cpp, Ollama) to cloud-based services, plus built-in no-code agent flows and MCP compatibility for extensible tool integrations. Runs as a self-contained desktop app or Docker instance with native multi-user access control and zero-configuration document pipelines.
About llmcore
araray/llmcore
A unified, async Python framework for LLM applications: chat, autonomous agents, RAG, and sandboxed code execution. Supports OpenAI, Anthropic, Gemini, Ollama, and more through a single API. Features an 8-phase cognitive reasoning cycle, Docker-based isolation, human-in-the-loop approvals, session management, and comprehensive observability.