OpenMemory and persistent-ai-memory
Both projects provide local persistent memory stores for LLM applications and AI assistants, making them **competitors** with closely overlapping functionality.
About OpenMemory
CaviraOSS/OpenMemory
Local persistent memory store for LLM applications, including Claude Desktop, GitHub Copilot, Codex, Antigravity, and others.
Provides multi-sector memory (episodic, semantic, procedural) with temporal reasoning and composite scoring—not just vector retrieval—via self-hosted SQLite/Postgres backends. Offers both embedded SDKs (Python/Node) and a centralized server exposing HTTP API, MCP protocol, and dashboard, with source connectors for GitHub, Notion, Google Drive, and web crawling to populate long-term agent context.
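The combination of temporal reasoning and composite scoring described above can be illustrated with a minimal sketch. This is not OpenMemory's actual scoring formula or API; the weights, the half-life decay, and the `Memory` record are all hypothetical, chosen only to show how a retrieval score can blend vector similarity with recency rather than relying on similarity alone.

```python
import math
from dataclasses import dataclass

@dataclass
class Memory:
    text: str
    similarity: float  # cosine similarity to the query, precomputed elsewhere
    age_hours: float   # time elapsed since the memory was stored

def composite_score(m: Memory, w_sim: float = 0.7, w_rec: float = 0.3,
                    half_life_hours: float = 72.0) -> float:
    """Blend semantic similarity with an exponential recency decay."""
    recency = 0.5 ** (m.age_hours / half_life_hours)  # 1.0 when fresh, halves every 72h
    return w_sim * m.similarity + w_rec * recency

memories = [
    Memory("user prefers dark mode", similarity=0.82, age_hours=500.0),
    Memory("user asked about SQLite backends", similarity=0.78, age_hours=2.0),
]
# The slightly less similar but much fresher memory wins the composite ranking.
ranked = sorted(memories, key=composite_score, reverse=True)
print(ranked[0].text)  # → user asked about SQLite backends
```

Under this kind of scoring, a marginally weaker semantic match can outrank a stale one, which is the practical difference from plain vector retrieval.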
About persistent-ai-memory
savantskie/persistent-ai-memory
A persistent local memory for AI, LLMs, or Copilot in VS Code.
Integrates with OpenWebUI as a native plugin for sophisticated memory extraction and injection during conversations, uses SQLite databases with vector embeddings for semantic search, and supports multiple embedding providers (Ollama, LM Studio, OpenAI) with strict multi-tenant isolation via user_id and model_id parameters. The system implements specialized databases for conversations, memories, schedules, tool calls, and project context, plus an MCP server for cross-platform compatibility with assistants and development environments.
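The SQLite-plus-embeddings design with `user_id`/`model_id` isolation can be sketched as follows. The schema, function names, and JSON-encoded vectors here are hypothetical illustrations of the general pattern, not persistent-ai-memory's actual tables or API; the key point is that tenant filtering happens in the SQL `WHERE` clause before any similarity scoring.

```python
import json
import math
import sqlite3

# Hypothetical schema: one row per memory, scoped by user_id and model_id,
# with the embedding stored alongside the text.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE memories (
        id INTEGER PRIMARY KEY,
        user_id TEXT NOT NULL,
        model_id TEXT NOT NULL,
        content TEXT NOT NULL,
        embedding TEXT NOT NULL  -- JSON-encoded vector from any embedding provider
    )
""")

def save_memory(user_id: str, model_id: str, content: str, embedding: list[float]) -> None:
    db.execute(
        "INSERT INTO memories (user_id, model_id, content, embedding) VALUES (?, ?, ?, ?)",
        (user_id, model_id, content, json.dumps(embedding)),
    )

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def search(user_id: str, model_id: str, query_embedding: list[float], top_k: int = 3) -> list[str]:
    # Tenant isolation: only rows matching both IDs are ever fetched or scored.
    rows = db.execute(
        "SELECT content, embedding FROM memories WHERE user_id = ? AND model_id = ?",
        (user_id, model_id),
    ).fetchall()
    scored = [(cosine(query_embedding, json.loads(e)), c) for c, e in rows]
    return [c for _, c in sorted(scored, reverse=True)[:top_k]]

save_memory("alice", "gpt-4", "likes terse answers", [1.0, 0.0, 0.0])
save_memory("bob", "gpt-4", "prefers long answers", [1.0, 0.0, 0.1])
print(search("alice", "gpt-4", [1.0, 0.0, 0.0]))  # → ['likes terse answers']
```

Filtering by tenant in SQL before computing similarity keeps one user's memories from ever leaking into another's results, regardless of how close the vectors are.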