nocturne_memory and MARM-Systems

These are **competitors**: both provide MCP servers for persistent AI memory management. nocturne_memory emphasizes graph-structured, rollbackable memory with visual inspection, while MARM-Systems emphasizes multi-transport protocol support and cross-platform agent coordination. Choose based on whether you prioritize memory structure and debuggability or deployment flexibility.

| Metric | nocturne_memory | MARM-Systems |
| --- | --- | --- |
| Overall score | 63 (Established) | 48 (Emerging) |
| Maintenance | 25/25 | 10/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 9/25 | 9/25 |
| Community | 19/25 | 19/25 |
| Stars | 615 | 251 |
| Forks | 79 | 42 |
| Downloads | n/a | n/a |
| Commits (30d) | 101 | 0 |
| Language | Python | Python |
| License | MIT | MIT |
| Package | none published | none published |
| Dependents | none | none |

About nocturne_memory

Dataojitori/nocturne_memory

A lightweight, rollbackable, and visual Long-Term Memory Server for MCP Agents. Say goodbye to Vector RAG and amnesia. Empower your AI with persistent, graph-like structured memory across any model, session, or tool. Drop-in replacement for OpenClaw.

Implements a graph-based memory architecture with SQLite/PostgreSQL backends in which AI agents create, update, and roll back their own structured memories through MCP. This sidesteps vector RAG's lossy semantic compression and enables condition-triggered disclosure of hierarchical knowledge graphs with human-auditable versioning. Includes a visual dashboard for memory exploration, diff review, and governance, and integrates natively with Claude Desktop, Cursor, and other MCP-compatible frameworks as a direct OpenClaw replacement.
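To make the "rollbackable, graph-like structured memory" idea concrete, here is a minimal sketch of versioned memory nodes backed by SQLite. The schema, class name, and methods are illustrative assumptions for this page, not nocturne_memory's actual API: the point is that every write appends a new version (so history stays human-auditable) and a rollback simply republishes an older version as the newest one.

```python
import sqlite3

class MemoryStore:
    """Toy versioned memory graph: nodes carry append-only version history."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("""
            CREATE TABLE IF NOT EXISTS node_versions (
                node_id TEXT,
                version INTEGER,
                content TEXT,
                PRIMARY KEY (node_id, version)
            )""")
        # Edges make the store graph-like (hierarchical knowledge).
        self.db.execute("""
            CREATE TABLE IF NOT EXISTS edges (
                src TEXT, dst TEXT, relation TEXT
            )""")

    def write(self, node_id, content):
        # Append a new version instead of overwriting, so every change is auditable.
        cur = self.db.execute(
            "SELECT COALESCE(MAX(version), 0) FROM node_versions WHERE node_id=?",
            (node_id,))
        next_version = cur.fetchone()[0] + 1
        self.db.execute("INSERT INTO node_versions VALUES (?, ?, ?)",
                        (node_id, next_version, content))
        self.db.commit()
        return next_version

    def read(self, node_id, version=None):
        # Latest version by default; any historical version on request.
        if version is None:
            cur = self.db.execute(
                "SELECT content FROM node_versions WHERE node_id=? "
                "ORDER BY version DESC LIMIT 1", (node_id,))
        else:
            cur = self.db.execute(
                "SELECT content FROM node_versions WHERE node_id=? AND version=?",
                (node_id, version))
        row = cur.fetchone()
        return row[0] if row else None

    def rollback(self, node_id, version):
        # Rollback = re-publish an old version as the newest one; history is kept.
        return self.write(node_id, self.read(node_id, version))

store = MemoryStore()
store.write("user/preferences", "dark mode")   # version 1
store.write("user/preferences", "light mode")  # version 2
store.rollback("user/preferences", 1)          # version 3 = copy of version 1
print(store.read("user/preferences"))          # → dark mode
```

An MCP server built this way exposes `write`/`read`/`rollback` as tools, and the version table is exactly what a dashboard would render for diff review.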

About MARM-Systems

Lyellr88/MARM-Systems

Turn AI into a persistent, memory-powered collaborator. Universal MCP Server (supports HTTP, STDIO, and WebSocket) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built with MARM protocol for structured reasoning that evolves with your work.

Implements semantic vector-based memory indexing with auto-classification of conversation content (code, decisions, configs) and enables cross-session recall via FastAPI-backed HTTP/STDIO transports that integrate natively with Claude, Gemini, and other MCP-compatible agents. The architecture uses SQLite with WAL mode for persistent storage and connection pooling, exposing 18 MCP tools for granular memory control, including structured session logs, reusable notebooks, and smart context fallbacks when vector similarity alone is insufficient. Designed for production workflows that require reliable long-term context across multiple AI agents and deployment cycles, with Docker containerization and rate limiting built in.
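The two load-bearing details above, SQLite in WAL mode and a "smart context fallback" when vector similarity is inconclusive, can be sketched in a few lines. This is an illustrative assumption of how such a lookup might work, not MARM's actual code; the toy bag-of-letters embedding stands in for a real embedding model.

```python
import sqlite3
import math

def open_store(path=":memory:"):
    db = sqlite3.connect(path)
    # WAL mode lets concurrent readers proceed while one writer commits.
    db.execute("PRAGMA journal_mode=WAL")
    db.execute("CREATE TABLE IF NOT EXISTS memories (text TEXT)")
    return db

def embed(text):
    # Toy bag-of-letters embedding; a real server would call an embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - 97] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recall(db, query, threshold=0.8):
    rows = [r[0] for r in db.execute("SELECT text FROM memories")]
    qv = embed(query)
    scored = sorted(rows, key=lambda t: cosine(embed(t), qv), reverse=True)
    # Primary path: return the best match if similarity is convincing.
    if scored and cosine(embed(scored[0]), qv) >= threshold:
        return scored[0]
    # Fallback: plain keyword containment when similarity is inconclusive.
    for text in rows:
        if any(word in text.lower() for word in query.lower().split()):
            return text
    return None

db = open_store()
db.execute("INSERT INTO memories VALUES (?)", ("deploy config uses docker compose",))
db.commit()
print(recall(db, "how do we deploy with docker"))  # → deploy config uses docker compose
```

The fallback matters in practice: embedding similarity degrades on short or jargon-heavy queries, and a cheap lexical check keeps recall from silently returning nothing.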

Scores updated daily from GitHub, PyPI, and npm data.