Memobase and LightMem
These projects are competitors: both offer distinct, self-contained solutions for managing long-term memory in AI applications. Memobase focuses on user-profile-based storage for chatbots, while LightMem targets efficient memory-augmented generation more broadly.
About memobase
memodb-io/memobase
User Profile-Based Long-Term Memory for AI Chatbot Applications.
Structures user data into dynamically evolving profiles and timestamped event timelines, enabling sub-100 ms memory retrieval through SQL queries rather than vector search. Provides Python, Node.js, and Go SDKs with batch-processing buffers that cut LLM token costs by 40-50%, and ships a Model Context Protocol (MCP) server for integration with AI frameworks. Achieves state-of-the-art results on the LoCoMo benchmark while keeping memory schemas configurable, so developers can define precisely which user attributes their applications capture.
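The profile-plus-timeline model described above can be sketched in plain Python. This is a conceptual illustration only: the class and method names are hypothetical and are not Memobase's actual SDK, and a dictionary lookup stands in for the SQL-backed retrieval the project describes.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of profile-based long-term memory; NOT Memobase's API.
@dataclass
class UserMemory:
    profile: dict = field(default_factory=dict)  # dynamically evolving attributes
    events: list = field(default_factory=list)   # timestamped event timeline

    def update_profile(self, attribute: str, value: str) -> None:
        """Overwrite an attribute so the profile evolves as new facts arrive."""
        self.profile[attribute] = value

    def log_event(self, description: str) -> None:
        """Append a timestamped entry to the event timeline."""
        self.events.append((datetime.now(timezone.utc), description))

    def recall(self, attributes: list[str]) -> dict:
        """Key lookup stands in for the SQL-style (non-vector) retrieval."""
        return {a: self.profile.get(a) for a in attributes}

memory = UserMemory()
memory.update_profile("name", "Ada")
memory.update_profile("language", "Python")
memory.log_event("asked about long-term memory design")
print(memory.recall(["name", "language"]))  # {'name': 'Ada', 'language': 'Python'}
```

Because attributes live in a fixed schema rather than embeddings, retrieval is a direct lookup, which is where the sub-100 ms latency claim comes from.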
About LightMem
zjunlp/LightMem
[ICLR 2026] LightMem: Lightweight and Efficient Memory-Augmented Generation
Employs a modular architecture with pluggable storage engines and retrieval strategies to manage long-term memory for LLMs and AI agents. Supports cloud APIs (OpenAI, DeepSeek) as well as local deployment via Ollama, vLLM, and Transformers, with integrated memory-update mechanisms. Includes benchmark-evaluation frameworks for the LoCoMo and LongMemEval datasets, plus a hierarchical memory structure (StructMem) that preserves event-level bindings and cross-event connections.
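The pluggable-backend idea can be sketched with a small protocol: any storage engine that implements `add` and `search` can be swapped into the generation step. This is a minimal sketch in the spirit of that modular design; the names (`MemoryStore`, `KeywordStore`, `answer`) are illustrative assumptions, not LightMem's actual API, and a naive keyword-overlap scorer stands in for a real retrieval strategy.

```python
from typing import Protocol

class MemoryStore(Protocol):
    """Interface any pluggable storage engine must satisfy (hypothetical)."""
    def add(self, text: str) -> None: ...
    def search(self, query: str, k: int = 3) -> list[str]: ...

class KeywordStore:
    """Naive keyword-overlap retrieval standing in for a real strategy."""
    def __init__(self) -> None:
        self._items: list[str] = []

    def add(self, text: str) -> None:
        self._items.append(text)

    def search(self, query: str, k: int = 3) -> list[str]:
        terms = set(query.lower().split())
        scored = [(len(terms & set(t.lower().split())), t) for t in self._items]
        # Keep the top-k items that share at least one term with the query.
        return [t for score, t in sorted(scored, reverse=True)[:k] if score > 0]

def answer(store: MemoryStore, query: str) -> str:
    """Memory-augmented generation: prepend retrieved memories to the prompt."""
    context = " | ".join(store.search(query))
    return f"[context: {context}] {query}"

store = KeywordStore()
store.add("user prefers concise answers")
store.add("project deadline is Friday")
print(answer(store, "what is the project deadline"))
```

Swapping `KeywordStore` for a vector-backed or hierarchical store changes retrieval behavior without touching the generation code, which is the point of keeping storage and retrieval behind a common interface.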