OpenMemory and persistent-ai-memory

Both projects offer local persistent memory stores for LLM applications and AI assistants, which makes them **competitors** with largely overlapping functionality.

| | OpenMemory | persistent-ai-memory |
|---|---|---|
| Overall score | 61 (Established) | 51 (Established) |
| Maintenance | 17/25 | 10/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 13/25 | 15/25 |
| Community | 21/25 | 16/25 |
| Stars | 3,604 | 207 |
| Forks | 412 | 23 |
| Downloads | — | — |
| Commits (30d) | 19 | 0 |
| Language | TypeScript | Python |
| License | Apache-2.0 | MIT |
| Package | No package, no dependents | No package, no dependents |

About OpenMemory

CaviraOSS/OpenMemory

Local persistent memory store for LLM applications, including Claude Desktop, GitHub Copilot, Codex, Antigravity, etc.

Provides multi-sector memory (episodic, semantic, procedural) with temporal reasoning and composite scoring—not just vector retrieval—via self-hosted SQLite/Postgres backends. Offers both embedded SDKs (Python/Node) and a centralized server exposing HTTP API, MCP protocol, and dashboard, with source connectors for GitHub, Notion, Google Drive, and web crawling to populate long-term agent context.

About persistent-ai-memory

savantskie/persistent-ai-memory

A persistent local memory for AI, LLMs, or Copilot in VS Code.

Integrates with OpenWebUI as a native plugin for sophisticated memory extraction and injection during conversations, uses SQLite databases with vector embeddings for semantic search, and supports multiple embedding providers (Ollama, LM Studio, OpenAI) with strict multi-tenant isolation via user_id and model_id parameters. The system implements specialized databases for conversations, memories, schedules, tool calls, and project context, plus an MCP server for cross-platform compatibility with assistants and development environments.
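The storage pattern described above (SQLite rows holding embeddings, tenant isolation via a `user_id` filter, similarity ranking done in application code) can be sketched minimally as below. The schema, column names, and helper functions are assumptions for illustration, not the project's actual code:

```python
import sqlite3
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memories (user_id TEXT, content TEXT, embedding TEXT)")

def store(user_id, content, embedding):
    # Embeddings serialized as comma-separated text for simplicity.
    db.execute("INSERT INTO memories VALUES (?, ?, ?)",
               (user_id, content, ",".join(map(str, embedding))))

def search(user_id, query_embedding, top_k=1):
    # The user_id filter enforces per-tenant isolation at query time.
    rows = db.execute(
        "SELECT content, embedding FROM memories WHERE user_id = ?",
        (user_id,)).fetchall()
    scored = [(cosine(query_embedding, [float(v) for v in emb.split(",")]), text)
              for text, emb in rows]
    scored.sort(reverse=True)
    return [text for _, text in scored[:top_k]]

# Toy 3-dimensional embeddings stand in for a real provider (Ollama, OpenAI, ...).
store("alice", "prefers dark mode", [1.0, 0.0, 0.0])
store("alice", "works in Python", [0.0, 1.0, 0.0])
store("bob", "prefers light mode", [1.0, 0.0, 0.0])

results = search("alice", [0.9, 0.1, 0.0])
```

Searching as `alice` never returns `bob`'s rows, mirroring the multi-tenant isolation the project describes; a real deployment would add the `model_id` dimension and fetch embeddings from a configured provider rather than hard-coding them.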

Scores updated daily from GitHub, PyPI, and npm data.