mcp-memory-service and local_faiss_mcp

|                | mcp-memory-service | local_faiss_mcp    |
| -------------- | ------------------ | ------------------ |
| Score          | 73 (Verified)      | 60 (Established)   |
| Maintenance    | 25/25              | 13/25              |
| Adoption       | 10/25              | 12/25              |
| Maturity       | 16/25              | 18/25              |
| Community      | 22/25              | 17/25              |
| Stars          | 1,504              | 23                 |
| Forks          | 215                | 9                  |
| Downloads      |                    | 376                |
| Commits (30d)  | 153                | 0                  |
| Language       | Python             | Python             |
| License        | Apache-2.0         | MIT                |

Flags: No Package, No Dependents; No risk flags

About mcp-memory-service

doobidoo/mcp-memory-service

Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.

Consolidates multi-agent memory using a knowledge graph with typed edges (causes, fixes, contradicts) and autonomous compression, accessible via REST API with ONNX-based embeddings that run locally. Implements Remote MCP support for browser-based claude.ai integration via Server-Sent Events, alongside traditional desktop MCP, with OAuth 2.0 authentication and self-hosted infrastructure (no cloud lock-in). Agent identity is tracked via `X-Agent-ID` headers for scoped retrieval, and conversation threading is preserved through `conversation_id` fields, enabling both shared memory across agent fleets and inter-agent messaging through semantic tag-based filtering.
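The agent-scoping scheme described above can be sketched as a request builder. Note this is a minimal illustration: the endpoint path (`/api/memories`) and payload field names are assumptions for the sketch, not the service's documented API; only the `X-Agent-ID` header and `conversation_id` field come from the description above.

```python
import json

def build_memory_write(agent_id, conversation_id, content, tags, token):
    """Sketch of an agent-scoped memory write against a hypothetical endpoint."""
    headers = {
        "Authorization": f"Bearer {token}",  # OAuth 2.0 bearer token
        "X-Agent-ID": agent_id,              # scopes storage/retrieval per agent
        "Content-Type": "application/json",
    }
    payload = {
        "content": content,
        "conversation_id": conversation_id,  # preserves conversation threading
        "tags": tags,                        # enables semantic tag-based filtering
    }
    # Path and field names are illustrative assumptions, not the documented API.
    return "POST", "/api/memories", headers, json.dumps(payload)

method, path, headers, body = build_memory_write(
    agent_id="planner-1",
    conversation_id="conv-42",
    content="Deploy failed: missing env var DATABASE_URL",
    tags=["deploy", "error"],
    token="example-token",
)
```

Because each write carries both an agent identity and a conversation thread, a fleet of agents can share one memory store while still filtering retrieval down to a single agent or conversation.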

About local_faiss_mcp

nonatofabio/local_faiss_mcp

Local FAISS vector store as an MCP server – Agent Memory, drop-in local semantic search for Claude / Copilot / Agents.

Implements a two-stage retrieve-and-rerank architecture using sentence-transformers for embeddings and configurable cross-encoders to improve relevance scoring. Exposes MCP tools for document ingestion (supporting PDF, TXT, MD natively plus pandoc-compatible formats) and semantic querying, alongside built-in prompts for answer extraction and summarization that integrate directly with Claude and other MCP-compatible agents.
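The two-stage pattern above can be sketched in a few lines. Both scorers below are deliberate stand-ins so the example is self-contained: local_faiss_mcp uses sentence-transformers embeddings with a FAISS index for stage 1 and a cross-encoder model for stage 2, whereas this sketch substitutes simple lexical scorers.

```python
import math
from collections import Counter

def embed(text):
    """Stage-1 stand-in: bag-of-words vector (real system: sentence-transformers + FAISS)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cross_encode(query, doc):
    """Stage-2 stand-in: token overlap (real system: a cross-encoder model)."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q | d)

def search(query, docs, k=3, rerank_top=2):
    # Stage 1: cheap vector retrieval over the whole corpus narrows to k candidates.
    qv = embed(query)
    candidates = sorted(docs, key=lambda d: cosine(qv, embed(d)), reverse=True)[:k]
    # Stage 2: the slower, more accurate scorer re-ranks only those candidates.
    return sorted(candidates, key=lambda d: cross_encode(query, d), reverse=True)[:rerank_top]

docs = [
    "FAISS builds approximate nearest neighbour indexes",
    "Cross-encoders rescore query document pairs jointly",
    "Pandoc converts between many document formats",
]
results = search("faiss nearest neighbour index", docs)
```

The design rationale is the usual retrieve-and-rerank trade-off: the vector index scales to the full corpus but scores query and document independently, while the cross-encoder reads them jointly for better relevance at a cost that is only affordable on a small candidate set.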

Scores updated daily from GitHub, PyPI, and npm data.