MARM-Systems and purmemo-mcp
These two tools are complementary within the MCP ecosystem: MARM-Systems provides a universal MCP server enabling cross-platform AI memory and multi-agent coordination, while purmemo-mcp offers a focused AI conversation memory product built on an MCP server. The two could interoperate to enable broader memory-powered AI collaboration.
About MARM-Systems
Lyellr88/MARM-Systems
Turn AI into a persistent, memory-powered collaborator. Universal MCP Server (supports HTTP, STDIO, and WebSocket) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built with MARM protocol for structured reasoning that evolves with your work.
Technical Summary
Implements semantic vector-based memory indexing with auto-classification of conversation content (code, decisions, configs) and enables cross-session recall via FastAPI-backed HTTP/STDIO transports that integrate natively with Claude, Gemini, and other MCP-compatible agents. The architecture uses SQLite with WAL mode for persistent storage and connection pooling, exposing 18 MCP tools for granular memory control, including structured session logs, reusable notebooks, and smart context fallbacks when vector similarity alone is insufficient. Designed for production workflows requiring reliable long-term context across multiple AI agents and deployment cycles, with Docker containerization and rate-limiting built in.
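The storage and recall pattern described above (SQLite in WAL mode, vector-similarity recall, and a keyword fallback when similarity is too low) can be sketched with the Python standard library alone. This is a minimal illustration under stated assumptions: the schema, the toy character-frequency embedding, and the substring fallback are stand-ins invented for this sketch, not MARM-Systems' actual classifier, schema, or code.

```python
import math
import os
import sqlite3
import tempfile

# Illustrative file-backed store; a real deployment would also configure
# connection pooling and a proper embedding model.
db_path = os.path.join(tempfile.mkdtemp(), "memory.db")
conn = sqlite3.connect(db_path)
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]  # enables write-ahead logging
conn.execute("CREATE TABLE memory (id INTEGER PRIMARY KEY, text TEXT, embedding TEXT)")

def embed(text):
    """Toy bag-of-letters embedding standing in for a real semantic model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def save(text):
    conn.execute(
        "INSERT INTO memory (text, embedding) VALUES (?, ?)",
        (text, ",".join(map(str, embed(text)))),
    )

def recall(query, threshold=0.75):
    """Vector recall, with a plain keyword fallback when similarity is too low."""
    qv = embed(query)
    rows = conn.execute("SELECT text, embedding FROM memory").fetchall()
    scored = [
        (cosine(qv, [float(x) for x in emb.split(",")]), text)
        for text, emb in rows
    ]
    best_score, best_text = max(scored, default=(0.0, None))
    if best_score >= threshold:
        return best_text
    # "Smart context" fallback: degrade to substring matching.
    for text, _ in rows:
        if query.lower() in text.lower():
            return text
    return None

save("Decided to use SQLite with WAL mode for persistence")
save("Config: rate limit set to 100 requests per minute")
print(recall("WAL mode persistence decision"))
```

The design point illustrated here is the fallback path: vector similarity alone can miss short or jargon-heavy queries, so the recall routine only trusts the nearest neighbor above a similarity threshold and otherwise falls back to literal matching, returning nothing rather than a low-confidence hit.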
About purmemo-mcp
purmemo-ai/purmemo-mcp
MCP server for pūrmemo — AI conversation memory that works everywhere. Save and recall conversations across Claude Desktop, Cursor, and other MCP-compatible platforms.
Scores updated daily from GitHub, PyPI, and npm data.