onfabric/context-use
Turn your data exports into portable AI memory.
Implements an OpenAI-compatible proxy that intercepts chat completions to automatically extract and store memories in SQLite, enriching future requests with semantic context. Supports batch importing from ChatGPT, Claude, Instagram, Google, Netflix, and Airbnb exports via cost-efficient LLM batch APIs. Includes a multi-turn agent for querying memories, generating pattern synthesis, and compiling personal profiles across all ingested data sources.
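Because the proxy is OpenAI-compatible, a client talks to it exactly like a standard chat completions endpoint; the proxy then extracts memories into SQLite and can enrich future requests with stored context. A minimal sketch of the request shape, assuming the proxy listens at a local URL (the host, port, and model name here are illustrative, not taken from the project's docs):

```python
import json

# Hypothetical proxy address; the real host/port depend on how you run context-use.
PROXY_BASE_URL = "http://localhost:8000/v1"

def build_chat_request(user_message, model="gpt-4o-mini"):
    """Build a standard OpenAI-style chat completions payload.

    A proxy like context-use intercepts this payload, stores extracted
    memories, and may inject stored context into the message list
    before forwarding the request upstream.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("What restaurants did I like in Lisbon?")
print(json.dumps(payload, indent=2))
```

Any OpenAI SDK or plain HTTP client pointed at the proxy's base URL would send this same payload; the interception is transparent to the caller.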
13 stars and 8,457 monthly downloads. Available on PyPI.
Stars: 13
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Mar 12, 2026
Monthly downloads: 8,457
Commits (30d): 0
Dependencies: 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/agents/onfabric/context-use"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
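The same endpoint the curl command hits can be called from any HTTP client. A minimal sketch that builds (but does not send) the request in Python; passing the key via an `Authorization: Bearer` header is an assumption, not something the listing specifies:

```python
from urllib.request import Request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/agents"

def build_stats_request(owner, repo, api_key=None):
    """Construct a GET request for an agent's quality/stats data.

    The request is returned unsent so callers can inspect or dispatch
    it themselves. NOTE: the Bearer-token header is an assumption about
    how a free key would be supplied; check the provider's docs.
    """
    req = Request(f"{API_BASE}/{owner}/{repo}", method="GET")
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    return req

req = build_stats_request("onfabric", "context-use")
print(req.full_url)
```

Dispatching it is then a one-liner with `urllib.request.urlopen(req)`, subject to the 100 requests/day unauthenticated limit.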
Related agents
ultracontext/ultracontext
Open Source Context infrastructure for AI agents. Auto-capture and share your agents' context everywhere.
dunova/ContextGO
Local-first context & memory runtime for multi-agent AI coding teams. MCP-free. Rust/Go accelerated.
dgenio/contextweaver
Budget-aware context compilation and context firewall for tool-heavy AI agents.
EfficientContext/ContextPilot
Accelerating Long Context LLM Inference with Accuracy-Preserving Context Optimization in SGLang,...
astrio-ai/atlas
Coding agent for legacy code modernization