orneryd/Mimir
Mimir - Fully open and customizable memory bank with semantic vector search over locally indexed files (Code Intelligence) and stored memories shared across sessions and chat contexts, allowing worker agents to learn from errors in past runs. Includes drag-and-drop multi-agent orchestration.
Leverages a Neo4j graph database paired with NornicDB for local embeddings and semantic search, and implements the Model Context Protocol (MCP) standard for AI assistant integration. Bundles a visual orchestration studio and a VSCode extension for workflow automation, with PCTX Code Mode providing 98% token reduction for codebase indexing. Supports multiple LLM backends (OpenAI, Copilot, Ollama, llama.cpp) and maintains relationship-aware memory through automatic knowledge-graph construction across agent conversations.
Stars: 249
Forks: 25
Language: Go
License: —
Category:
Last pushed: Dec 25, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/vector-db/orneryd/Mimir"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
topoteretes/cognee
Knowledge Engine for AI Agent Memory in 6 lines of code
divagr18/memlayer
Plug-and-play memory for LLMs in 3 lines of code. Add persistent, intelligent, human-like memory...
verygoodplugins/automem
AutoMem is a graph-vector memory service that gives AI assistants durable, relational memory:
CortexReach/memory-lancedb-pro
Enhanced LanceDB memory plugin for OpenClaw — Hybrid Retrieval (Vector + BM25), Cross-Encoder...
CaviraOSS/OpenMemory
Local persistent memory store for LLM applications including claude desktop, github copilot,...