nakurian/hce
Middleware memory engine for LLMs — replaces linear chat history with three parallel structures (Entity Graph, Semantic Tree, Focus Buffer) to retrieve only the most relevant context within a token budget. Works as a Python library or MCP server for Claude Code, Copilot CLI, and other AI tools.
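The core idea in the description (retrieve only the most relevant context within a token budget) can be sketched as a greedy budgeted selection over snippets drawn from the three structures. This is a minimal illustration, not hce's actual API: the function name, the `(relevance, token_cost, text)` tuple shape, and the sample data are all assumptions.

```python
# Hypothetical sketch of token-budget context assembly.
# hce's real data structures and API are not documented here;
# this only illustrates the "best snippets within a budget" idea.
def assemble_context(candidates, budget):
    """Greedily pick the most relevant snippets that fit the token budget.

    candidates: list of (relevance, token_cost, text) tuples.
    budget: maximum total token cost allowed.
    """
    picked, used = [], 0
    for rel, cost, text in sorted(candidates, key=lambda c: -c[0]):
        if used + cost <= budget:
            picked.append(text)
            used += cost
    return picked

# Illustrative snippets, one per structure named in the description:
snippets = [
    (0.90, 40, "entity: Alice -> works_on -> hce"),    # Entity Graph
    (0.70, 120, "topic summary: memory engines"),       # Semantic Tree
    (0.95, 30, "recent turn: 'fix the retriever'"),     # Focus Buffer
    (0.40, 200, "old digression about logging"),        # dropped: over budget
]
print(assemble_context(snippets, budget=200))
```

With a 200-token budget, the low-relevance 200-token snippet no longer fits once the three higher-relevance ones are taken.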
Stars: 1
Forks: —
Language: Python
License: —
Category: —
Last pushed: Feb 20, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/nakurian/hce"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
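The curl command above can also be issued from Python. The endpoint URL comes from the page; the response shape is not documented here, so the sketch below only builds the URL and parses whatever JSON comes back. The helper names are assumptions.

```python
import json
import urllib.request

# Base endpoint taken from the page's curl example.
BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for an owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (shape is an assumption)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("nakurian", "hce"))
```

`fetch_quality` is subject to the rate limits noted above (100 requests/day without a key).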
Higher-rated alternatives
doobidoo/mcp-memory-service
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude....
mnemox-ai/tradememory-protocol
MCP server for AI trading memory — outcome-weighted cognitive memory with 10 tools, 399 tests.
Dataojitori/nocturne_memory
A lightweight, rollbackable, and visual Long-Term Memory Server for MCP Agents. Say goodbye to...
mordechaipotash/brain-mcp
Your AI has amnesia. Persistent memory and cognitive context for AI. 25 MCP tools. 12ms recall.
nonatofabio/local_faiss_mcp
Local FAISS vector store as an MCP server – Agent Memory, drop-in local semantic search for...