parallax-labs/context-harness
Local-first context ingestion and retrieval for AI tools. SQLite + embeddings + MCP server for Cursor & Claude.
Supports multiple embedding backends (local fastembed/tract, Ollama, OpenAI) with automatic batching and staleness detection, enabling offline-first retrieval or cloud-scaled inference. Ingests diverse sources—filesystem, Git repos, S3, Lua scripts—with multi-format extraction (PDF, DOCX, XLSX) into SQLite's FTS5, then exposes hybrid keyword+semantic search via a CLI tool and MCP HTTP server compatible with Cursor and Claude.
Stars: 29
Forks: 3
Language: Rust
License: AGPL-3.0
Category:
Last pushed: Mar 02, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/parallax-labs/context-harness"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
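The endpoint above follows an owner/repo path pattern. A minimal Python sketch of building and fetching that URL, assuming only the path pattern shown in the curl example (no key, within the free 100 requests/day tier):

```python
import json
import urllib.request


def quality_endpoint(owner: str, repo: str) -> str:
    """Build the pt-edge quality API URL for an MCP repo.

    Path pattern inferred from the example curl command above;
    anything beyond owner/repo substitution is an assumption.
    """
    return f"https://pt-edge.onrender.com/api/v1/quality/mcp/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record as parsed JSON (unauthenticated tier)."""
    with urllib.request.urlopen(quality_endpoint(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(quality_endpoint("parallax-labs", "context-harness"))
```

The response schema is not documented here, so `fetch_quality` returns the raw parsed JSON rather than assuming any fields.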
Higher-rated alternatives
patravishek/memex
Retains Claude's session
sgroy10/speclock
AI Constraint Engine by Sandeep Roy — stops AI from breaking what you locked. 100/100 on...
Intina47/context-sync
Local persistent memory store for LLM applications including continue.dev, cursor, claude...
amanhij/Zikkaron
Biologically-inspired persistent memory engine for Claude Code. 26 cognitive subsystems,...
roampal-ai/roampal-core
Outcome-based persistent memory MCP server for Claude Code and OpenCode. Good advice promoted,...