OpenMemory and context-vault
About OpenMemory
CaviraOSS/OpenMemory
Local persistent memory store for LLM applications, including Claude Desktop, GitHub Copilot, Codex, Antigravity, and others.
This project gives AI agents and large language models (LLMs) persistent, long-term memory. You can feed in information from sources such as GitHub, Notion, or web pages, and the AI can then recall and use those memories contextually over time. It is aimed at developers building AI applications (e.g., chatbots, automated assistants, or intelligent UIs) whose creations should remember past interactions and information instead of starting fresh every time.
About context-vault
fellanH/context-vault
Persistent memory for AI agents — save and search knowledge across sessions via MCP. Local-first, markdown + SQLite + embeddings.
Implements hybrid full-text and semantic search via embeddings, with MCP tools for saving structured entry types (insights, decisions, patterns) and ingesting external content from URLs or projects. Runs as an auto-configured shared daemon that detects Claude, Cursor, and other AI tools, storing all data as plain markdown in `~/vault/` with SQLite indexing for search and optional web dashboard access.
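The hybrid full-text plus semantic search mentioned above can be sketched in a few lines. This is an illustrative toy, not context-vault's actual implementation: the `embed` function here is a hypothetical bag-of-words stand-in for a real embedding model, and the blending weight `alpha` is an assumed parameter.

```python
import math

def embed(text):
    # Toy bag-of-words "embedding"; a real system would call an embedding model.
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    # Cosine similarity between two sparse vectors.
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    # Full-text side: fraction of query terms that appear in the document.
    terms = set(query.lower().split())
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_search(query, docs, alpha=0.5):
    # Blend exact-keyword and embedding-similarity scores, then rank.
    qv = embed(query)
    scored = sorted(
        docs,
        key=lambda d: alpha * keyword_score(query, d)
                      + (1 - alpha) * cosine(qv, embed(d)),
        reverse=True,
    )
    return scored

docs = [
    "decision: use SQLite for the search index",
    "pattern: store entries as plain markdown files",
    "insight: embeddings improve recall on paraphrased queries",
]
print(hybrid_search("sqlite index", docs)[0])
```

In a real deployment the keyword side would typically be SQLite's FTS ranking and the semantic side a learned embedding, but the blending step works the same way.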
Scores updated daily from GitHub, PyPI, and npm data.