memora and Memory-Plus

                 memora              Memory-Plus
Score            54 (Established)    52 (Established)
Maintenance      13/25               2/25
Adoption         10/25               13/25
Maturity         15/25               24/25
Community        16/25               13/25
Stars            322                 52
Forks            34                  7
Downloads        N/A                 149
Commits (30d)    0                   0
Language         Python              Python
License          MIT                 Apache-2.0

Flags: No package · No dependents · Stale 6m

About memora

agentic-box/memora

Give your AI agents persistent memory: an MCP server for semantic storage, knowledge graphs, and cross-session context

Implements a Model Context Protocol (MCP) server with pluggable embedding backends (OpenAI, sentence-transformers, TF-IDF) and multi-tiered storage (local SQLite, Cloudflare D1, or S3/R2) with optional encryption and compression. Features include interactive knowledge graph visualization, RAG-powered chat with streaming LLM tool calling, event notifications for inter-agent communication, and automated memory deduplication via LLM comparison. Integrates with Claude Code and Codex CLI through stdio or HTTP transports.
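The core idea behind such a memory server — persist text in SQLite, retrieve it by semantic similarity — can be sketched in a few lines. This is a minimal illustration, not memora's actual code: the class name, schema, and the naive TF-IDF scoring are all assumptions standing in for the project's pluggable embedding backends.

```python
# Sketch of a persistent memory store: SQLite for storage, naive TF-IDF
# cosine similarity for retrieval. Illustrative only; memora's real
# implementation and schema differ.
import math
import re
import sqlite3
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

class MemoryStore:
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memories (id INTEGER PRIMARY KEY, text TEXT)"
        )

    def record(self, text):
        self.db.execute("INSERT INTO memories (text) VALUES (?)", (text,))
        self.db.commit()

    def retrieve(self, query, k=3):
        rows = self.db.execute("SELECT text FROM memories").fetchall()
        docs = [tokenize(t) for (t,) in rows]
        n = len(docs)
        # Document frequency of each term, for the IDF weight.
        df = Counter(w for d in docs for w in set(d))

        def vec(tokens):
            tf = Counter(tokens)
            return {w: (c / len(tokens)) * math.log((1 + n) / (1 + df[w]))
                    for w, c in tf.items()}

        def cos(a, b):
            num = sum(a[w] * b.get(w, 0.0) for w in a)
            na = math.sqrt(sum(v * v for v in a.values()))
            nb = math.sqrt(sum(v * v for v in b.values()))
            return num / (na * nb) if na and nb else 0.0

        q = vec(tokenize(query))
        scored = sorted(((cos(q, vec(d)), t) for d, (t,) in zip(docs, rows)),
                        reverse=True)
        return [t for _, t in scored[:k]]
```

In the real server these operations would be exposed as MCP tools over stdio or HTTP, and the TF-IDF vectors would be swapped for embedding-model vectors.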

About Memory-Plus

Yuchen20/Memory-Plus

๐Ÿง  ๐‘ด๐’†๐’Ž๐’๐’“๐’š-๐‘ท๐’๐’–๐’” is a lightweight, local RAG memory store for MCP agents. Easily record, retrieve, update, delete, and visualize persistent "memories" across sessionsโ€”perfect for developers working with multiple AI coders (like Windsurf, Cursor, or Copilot) or anyone who wants their AI to actually remember them.

Built on Google's Embedding API for semantic search, Memory-Plus stores encoded memories locally and supports versioning to track changes over time. It integrates as an MCP server via stdio transport, compatible with VS Code, Cursor, Cline, and other MCP-enabled IDEs, with optional resource-based prompting to control when agents access past context.
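The versioning described above — tracking how a memory changes over time while always serving the latest value — can be sketched as an append-only history per key. This is a conceptual sketch only; the class and method names are hypothetical and do not reflect Memory-Plus's actual API.

```python
# Hypothetical sketch of versioned memories: every update appends a new
# timestamped version, so past values remain inspectable.
import time

class VersionedMemory:
    def __init__(self):
        self._history = {}  # key -> list of (timestamp, value)

    def update(self, key, value):
        """Record a new version of the memory under `key`."""
        self._history.setdefault(key, []).append((time.time(), value))

    def current(self, key):
        """Return the latest version, or None if the key is unknown."""
        versions = self._history.get(key)
        return versions[-1][1] if versions else None

    def history(self, key):
        """Return all versions in chronological order."""
        return [v for _, v in self._history.get(key, [])]

    def delete(self, key):
        """Remove the memory and its entire version history."""
        self._history.pop(key, None)
```

Keeping the full history rather than overwriting in place is what lets an agent answer "what did I previously believe about X?" as well as "what is X now?".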

Scores updated daily from GitHub, PyPI, and npm data.