frmoretto/clarity-gate
Stop LLMs from presenting your guesses as facts. Clarity Gate is a verification protocol for documents destined for LLMs or RAG systems: it automatically inserts missing uncertainty markers to prevent confident hallucinations, with human-in-the-loop (HITL) review for claims that cannot be verified directly.
Implements a 9-point epistemic verification system combining detection (fact vs. projection labeling, assumption visibility, temporal coherence) with automated enforcement via HITL workflows to annotate raw documents before RAG ingestion. Ships as Claude skill files and framework-agnostic markdown methodology, targeting teams ingesting unvetted sources (meeting notes, drafts, user content) where pre-trained models would otherwise faithfully propagate unqualified assertions as confident fact.
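The annotation step described above can be sketched in a few lines. The marker text, hedge list, and detection heuristic below are illustrative assumptions, not clarity-gate's actual API or rules:

```python
import re

# Hypothetical sketch of pre-ingestion annotation: flag declarative
# sentences that carry no hedge and no citation, so a human (HITL) can
# review them before the document reaches a RAG index.
# HEDGES and the "[NEEDS-VERIFICATION]" marker are illustrative only.
HEDGES = ("may", "might", "reportedly", "allegedly", "we assume", "unverified")

def annotate(text: str) -> str:
    """Prefix unhedged, uncited sentences with an uncertainty marker."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    out = []
    for s in sentences:
        lowered = s.lower()
        if any(h in lowered for h in HEDGES) or "[source:" in lowered:
            out.append(s)  # already hedged or cited: pass through
        else:
            out.append("[NEEDS-VERIFICATION] " + s)  # queue for HITL review
    return " ".join(out)

print(annotate("Revenue doubled last quarter. Growth may slow, reportedly."))
# → [NEEDS-VERIFICATION] Revenue doubled last quarter. Growth may slow, reportedly.
```

A downstream LLM or retriever then sees the marker inline, so an unqualified assertion is no longer indistinguishable from a verified fact.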
Available on PyPI.
Stars: 23
Forks: 2
Language: Python
License: —
Category: —
Last pushed: Mar 02, 2026
Monthly downloads: 16
Commits (30d): 0
Dependencies: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/frmoretto/clarity-gate"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
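The same endpoint can be called from Python with only the standard library. This is a minimal sketch: the URL pattern is taken from the curl command above, but the response schema is undocumented here, so the example just prints whatever JSON comes back:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality card (no key needed up to 100 requests/day)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("frmoretto", "clarity-gate"))
# → https://pt-edge.onrender.com/api/v1/quality/rag/frmoretto/clarity-gate
```

With a free key (1,000 requests/day), you would presumably pass it as a header or query parameter; the exact mechanism is not specified on this page.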
Higher-rated alternatives
onestardao/WFGY
WFGY: open-source reasoning and debugging infrastructure for RAG and AI agents. Includes the...
KRLabsOrg/verbatim-rag
Hallucination-prevention RAG system with verbatim span extraction. Ensures all generated content...
iMoonLab/Hyper-RAG
"Hyper-RAG: Combating LLM Hallucinations using Hypergraph-Driven Retrieval-Augmented Generation"...
chensyCN/LogicRAG
Source code of LogicRAG at AAAI'26.
anulum/director-ai
Real-time LLM hallucination guardrail — NLI + RAG fact-checking with token-level streaming halt....