cognee and OpenMemory
These are **complements**: cognee provides the knowledge-graph and semantic-memory engine, while OpenMemory provides the persistent storage layer and integration interfaces that agents built on cognee would need to retrieve and maintain memories across sessions.
About cognee
topoteretes/cognee
Knowledge Engine for AI Agent Memory in 6 lines of code
Combines vector search with graph databases to index documents by semantic meaning and learned entity relationships, enabling hybrid retrieval that improves context relevance for agents. Supports multimodal ingestion across arbitrary data formats and structures while maintaining local execution, ontology grounding, and audit trails for trustworthy agent isolation. Integrates with multiple LLM providers and includes CLI tooling and web UI for pipeline management alongside programmatic Python APIs.
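The hybrid retrieval described above can be sketched as a blend of vector similarity and entity-relationship overlap. This is an illustrative toy, not cognee's actual implementation: the corpus, embeddings, entity sets, and the `alpha` blending weight are all assumptions for the example.

```python
# Toy sketch of hybrid retrieval: blend vector similarity with an
# entity-overlap boost from a knowledge graph. Illustrative only;
# cognee's internals differ.
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Tiny corpus: each document carries an embedding and its linked entities.
docs = {
    "d1": {"vec": [1.0, 0.0], "entities": {"cognee", "graph"}},
    "d2": {"vec": [0.7, 0.7], "entities": {"vector", "search"}},
    "d3": {"vec": [0.0, 1.0], "entities": {"graph", "ontology"}},
}

def hybrid_search(query_vec, query_entities, alpha=0.7):
    """Score = alpha * cosine similarity + (1 - alpha) * entity overlap."""
    scored = []
    for doc_id, doc in docs.items():
        sim = cosine(query_vec, doc["vec"])
        overlap = len(query_entities & doc["entities"]) / max(len(query_entities), 1)
        scored.append((alpha * sim + (1 - alpha) * overlap, doc_id))
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

print(hybrid_search([0.9, 0.1], {"graph"}))
```

The point of the blend is that a document can rank highly either by raw semantic similarity or by sharing learned entities with the query, which is what lets graph structure rescue results that pure vector search would miss.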
About OpenMemory
CaviraOSS/OpenMemory
Local persistent memory store for LLM applications including Claude Desktop, GitHub Copilot, Codex, Antigravity, and others.
Provides multi-sector memory (episodic, semantic, procedural) with temporal reasoning and composite scoring—not just vector retrieval—via self-hosted SQLite/Postgres backends. Offers both embedded SDKs (Python/Node) and a centralized server exposing HTTP API, MCP protocol, and dashboard, with source connectors for GitHub, Notion, Google Drive, and web crawling to populate long-term agent context.
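Composite scoring across memory sectors can be sketched as relevance weighted by sector type and recency. The sector weights and half-life below are illustrative assumptions, not OpenMemory's actual parameters or API.

```python
# Toy sketch of composite scoring: similarity x sector weight x temporal decay.
# All constants are assumed for illustration, not taken from OpenMemory.
import math

SECTOR_WEIGHTS = {"episodic": 1.0, "semantic": 0.8, "procedural": 0.6}  # assumed

def composite_score(similarity, sector, age_hours, half_life_hours=72.0):
    """Blend retrieval relevance with sector weight and exponential recency decay."""
    decay = math.exp(-math.log(2) * age_hours / half_life_hours)
    return similarity * SECTOR_WEIGHTS[sector] * decay

memories = [
    {"id": "m1", "sector": "episodic", "similarity": 0.90, "age_hours": 1},
    {"id": "m2", "sector": "semantic", "similarity": 0.95, "age_hours": 240},
    {"id": "m3", "sector": "procedural", "similarity": 0.80, "age_hours": 12},
]

ranked = sorted(
    memories,
    key=lambda m: composite_score(m["similarity"], m["sector"], m["age_hours"]),
    reverse=True,
)
print([m["id"] for m in ranked])
```

This shows why composite scoring differs from pure vector retrieval: the highest-similarity memory (`m2`) ranks last here because it is ten days old, while a fresh episodic memory wins despite lower similarity.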