verygoodplugins/mcp-automem
AutoMem is a graph-vector memory service that gives AI assistants durable, relational memory:
- Implements a graph-vector memory backend with 11 relationship types between stored memories, using the research-validated HippoRAG 2 architecture for sub-second retrieval across millions of records.
- Runs as an MCP (Model Context Protocol) server compatible with Claude Desktop, Cursor, GitHub Copilot, and other AI platforms, with optional HTTP/SSE transport for web-based clients like ChatGPT and ElevenLabs.
- Stores persistent memories locally or on Railway with cross-device sync, automatically capturing coding patterns, decisions, and context across conversations.
Available on npm.
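Since the server is distributed on npm, an MCP client such as Claude Desktop would typically register it in its config file. The sketch below is an assumption for illustration only: the package name `mcp-automem`, the server key `automem`, and the use of `npx` are not confirmed by this listing, so check the repository README for the actual install instructions.

```json
{
  "mcpServers": {
    "automem": {
      "command": "npx",
      "args": ["-y", "mcp-automem"]
    }
  }
}
```

With an entry like this, the client launches the server over stdio on startup; web-based clients would instead use the optional HTTP/SSE transport mentioned above.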
Stars: 41
Forks: 10
Language: TypeScript
License: MIT
Category:
Last pushed: Mar 10, 2026
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/vector-db/verygoodplugins/mcp-automem"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Related tools
topoteretes/cognee
Knowledge Engine for AI Agent Memory in 6 lines of code
divagr18/memlayer
Plug-and-play memory for LLMs in 3 lines of code. Add persistent, intelligent, human-like memory...
verygoodplugins/automem
AutoMem is a graph-vector memory service that gives AI assistants durable, relational memory:
CortexReach/memory-lancedb-pro
Enhanced LanceDB memory plugin for OpenClaw — Hybrid Retrieval (Vector + BM25), Cross-Encoder...
CaviraOSS/OpenMemory
Local persistent memory store for LLM applications including Claude Desktop, GitHub Copilot,...