openclaw-mem0 and bamdra-openclaw-memory
These two tools are competitors: both offer long-term memory for OpenClaw agents. openclaw-mem0 leverages Mem0 and self-hosts with any OpenAI-compatible provider, while bamdra-openclaw-memory focuses on topic-aware continuity and bounded token growth.
About openclaw-mem0
tensakulabs/openclaw-mem0
Long-term memory plugin for OpenClaw agents, powered by Mem0. Self-hosted with any OpenAI-compatible provider.
Provides five memory management tools (`memory_search`, `memory_store`, `memory_list`, `memory_get`, `memory_forget`) with automatic recall injection before agent turns and auto-capture after turns, supporting both session and long-term memory scopes. Implements lazy provider loading to avoid bloated SDK imports, and vendors a patched Mem0 build that fixes critical bugs in embeddings routing and memory recall in self-hosted deployments using Qdrant vector storage. Integrates seamlessly with OpenClaw's plugin system and supports any OpenAI-compatible LLM provider (OpenRouter, DashScope, LocalAI) alongside Mem0's cloud platform.
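To make the five tools concrete, here is a minimal sketch of the memory interface they describe. The tool names come from the list above; the `MemoryStore` class, its method signatures, and the in-memory backend are illustrative assumptions, not the plugin's actual API (the real plugin routes through Mem0 with vector search, e.g. Qdrant).

```python
# Hypothetical sketch of the five memory tools exposed by openclaw-mem0.
# Only the tool names are from the plugin; everything else is assumed.
import uuid


class MemoryStore:
    """Toy in-memory stand-in for the Mem0-backed store."""

    def __init__(self):
        self._items = {}  # id -> {"text": ..., "scope": ...}

    def memory_store(self, text, scope="long_term"):
        # Scopes mirror the plugin's session vs. long-term distinction.
        mem_id = str(uuid.uuid4())
        self._items[mem_id] = {"text": text, "scope": scope}
        return mem_id

    def memory_search(self, query, scope=None):
        # The real plugin uses vector similarity; substring match stands in here.
        return [
            (mid, item)
            for mid, item in self._items.items()
            if query.lower() in item["text"].lower()
            and (scope is None or item["scope"] == scope)
        ]

    def memory_list(self, scope=None):
        return [
            (mid, item)
            for mid, item in self._items.items()
            if scope is None or item["scope"] == scope
        ]

    def memory_get(self, mem_id):
        return self._items.get(mem_id)

    def memory_forget(self, mem_id):
        return self._items.pop(mem_id, None) is not None


store = MemoryStore()
mid = store.memory_store("User prefers dark mode", scope="session")
assert store.memory_get(mid)["scope"] == "session"
assert store.memory_search("dark")
assert store.memory_forget(mid)
assert store.memory_list() == []
```

In the plugin's flow, `memory_search` results would be injected into the prompt before each agent turn, and `memory_store` would run as auto-capture afterward.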
About bamdra-openclaw-memory
bamdra/bamdra-openclaw-memory
Give one OpenClaw session durable memory, topic-aware continuity, and bounded token growth.
Implements a three-plugin stack (memory runtime, user-bind identity layer, and vector knowledge indexing) that maintains durable conversation state across OpenClaw sessions while indexing local Markdown files for semantic recall. Uses incremental profile updates with frontmatter-sourced truth and tracks topic continuity to prevent token bloat and context drift. Integrates directly with OpenClaw's plugin system via `clawdhub` and supports private/shared Markdown roots for knowledge organization.
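The phrase "frontmatter-sourced truth" can be sketched as a merge policy: profile fields parsed from a Markdown file's frontmatter override whatever the session has accumulated incrementally. The parsing and merge logic below are illustrative assumptions about that idea, not the plugin's actual implementation.

```python
# Hypothetical sketch of frontmatter-sourced truth for profile updates.
# parse_frontmatter and update_profile are assumed names for illustration.
def parse_frontmatter(markdown: str) -> dict:
    """Parse a simple key: value frontmatter block delimited by --- lines."""
    lines = markdown.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    fields = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields


def update_profile(profile: dict, observed: dict, frontmatter: dict) -> dict:
    """Apply incremental session observations, then let frontmatter win."""
    merged = dict(profile)
    merged.update(observed)      # incremental updates from the conversation
    merged.update(frontmatter)   # frontmatter is the source of truth
    return merged


doc = "---\nname: Ada\ntimezone: UTC+1\n---\n# Notes\n"
profile = update_profile({"name": "A."}, {"editor": "vim"}, parse_frontmatter(doc))
assert profile == {"name": "Ada", "timezone": "UTC+1", "editor": "vim"}
```

Keeping truth in versioned Markdown files means the profile stays bounded and auditable, which is consistent with the bounded-token-growth goal stated above.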
Scores are updated daily from GitHub, PyPI, and npm data.