automem and mcp-automem
The first is the core AutoMem service, while the second is its Model Context Protocol (MCP) wrapper; they are ecosystem siblings, with the MCP server exposing AutoMem to Claude and other MCP-compatible AI agents.
About automem
verygoodplugins/automem
AutoMem is a graph-vector memory service that gives AI assistants durable, relational memory:
**Combines FalkorDB graph storage with Qdrant vectors to enable hybrid semantic and relational search**, allowing AI to retrieve not just relevant memories but the relationships and reasoning between them. Implements research-backed techniques including multi-hop bridge discovery, automatic entity extraction with 11+ relationship types, and consolidation pipelines for pattern detection. Deploys as a standalone Flask service with sub-100ms recall performance and includes benchmarked baselines (87-89% on LoCoMo), making it suitable for long-term AI assistant memory in production environments.
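The hybrid recall described above can be sketched as a two-stage pipeline: a vector stage ranks memories by embedding similarity, then a graph stage expands the top hits along stored relationships. The following is a minimal, self-contained illustration of that idea only; the store, relation names, and scoring are invented for the example and are not AutoMem's actual code or API.

```python
# Toy sketch of hybrid vector + graph recall (illustrative, not AutoMem's
# implementation). Embeddings are tiny 2-d vectors for clarity.
import math

# Toy memory store: id -> (embedding, text)
memories = {
    "m1": ([1.0, 0.0], "Chose Flask for the API layer"),
    "m2": ([0.9, 0.1], "Flask chosen over FastAPI for simplicity"),
    "m3": ([0.0, 1.0], "Team standup notes for March"),
}

# Toy relation graph: (source, relation_type, target)
relations = [("m1", "LED_TO", "m2")]

def cosine(a, b):
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recall(query_embedding, k=1):
    # Stage 1 (vector): rank all memories by similarity, keep the top k.
    ranked = sorted(
        memories,
        key=lambda m: cosine(query_embedding, memories[m][0]),
        reverse=True,
    )
    seeds = set(ranked[:k])
    # Stage 2 (graph): expand one hop so related memories ride along,
    # returning not just relevant hits but what they connect to.
    expanded = set(seeds)
    for src, _rel, dst in relations:
        if src in expanded:
            expanded.add(dst)
        elif dst in expanded:
            expanded.add(src)
    return expanded

print(recall([1.0, 0.0]))  # the best vector match plus its related memory
```

A production system would replace the dictionaries with Qdrant (vectors) and FalkorDB (graph) queries, but the seed-then-expand shape of the retrieval is the same.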
About mcp-automem
verygoodplugins/mcp-automem
mcp-automem is the MCP server that connects AI assistants to an AutoMem memory backend:
Implements a graph-vector memory backend with 11 relationship types between stored memories, using research-validated HippoRAG 2 architecture for sub-second retrieval across millions of records. Runs as an MCP (Model Context Protocol) server compatible with Claude Desktop, Cursor, GitHub Copilot, and other AI platforms, with optional HTTP/SSE transport for web-based clients like ChatGPT and ElevenLabs. Stores persistent memories locally or on Railway with cross-device sync, automatically capturing coding patterns, decisions, and context across conversations.
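As an MCP server, mcp-automem is registered in the client's MCP configuration. A Claude Desktop entry would look roughly like the sketch below; the `mcpServers` structure is the standard Claude Desktop format, but the package name and the `AUTOMEM_ENDPOINT` variable are illustrative assumptions, not taken from the project's documentation.

```json
{
  "mcpServers": {
    "automem": {
      "command": "npx",
      "args": ["-y", "@verygoodplugins/mcp-automem"],
      "env": {
        "AUTOMEM_ENDPOINT": "http://localhost:8001"
      }
    }
  }
}
```

Once registered, the client launches the server over stdio; the optional HTTP/SSE transport mentioned above is what web-based clients use instead.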