shcherbak-ai/contextgem
ContextGem: Effortless LLM extraction from documents
Provides automated dynamic prompting, granular reference mapping to source paragraphs and sentences, and justifications for extracted data, all validated with Pydantic v2 schemas. Integrates neural sentence segmentation (wtpsplit SaT models), supports multilingual extraction without explicit prompting, and offers a declarative, fully serializable pipeline architecture with built-in concurrent processing and cost tracking.
1,810 stars and 3,588 monthly downloads. Actively maintained with 2 commits in the last 30 days. Available on PyPI.
Stars: 1,810
Forks: 145
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 22, 2026
Monthly downloads: 3,588
Commits (30d): 2
Dependencies: 16
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/shcherbak-ai/contextgem"
Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000 requests/day.
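The same endpoint can be called programmatically; here is a minimal sketch using only the Python standard library, assuming the endpoint returns a JSON payload (the response fields are not documented here, so the example simply pretty-prints whatever comes back):

```python
import json
import urllib.request

# Base path taken from the curl example above; the owner/repo segments vary.
BASE_URL = "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering"


def fetch_repo_stats(owner: str, repo: str) -> dict:
    """Fetch the stats payload for a repository from the directory API."""
    url = f"{BASE_URL}/{owner}/{repo}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)


if __name__ == "__main__":
    stats = fetch_repo_stats("shcherbak-ai", "contextgem")
    print(json.dumps(stats, indent=2))
```

With the keyless tier limited to 100 requests/day, cache responses locally if you poll more than a handful of repositories.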
Related tools
mufeedvh/code2prompt
A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt...
ShahzaibAhmad05/gitree
An upgrade from "ls" for developers. An open-source tool to analyze folder structures and to...
nicepkg/ctxport
Copy AI conversations as clean Markdown Context Bundles — one click from ChatGPT, Claude,...
mkorpela/kopipasta
`cat project | LLM | patch`. Transparent context control and interactive patching for...
glue-tools-ai/repogrok
Pack your entire codebase into a single AI-friendly file. Feed your repo to Claude, ChatGPT,...