cathy841106/ai-hallucination-detect
A tool for detecting hallucinations in domain-specific LLM outputs. It supports importing domain data, training models, and deploying an API, and it provides reliability scores with supporting evidence for evaluating how well an LLM deployment performs.
Stars: —
Forks: —
Language: Python
License: —
Category: —
Last pushed: Feb 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/cathy841106/ai-hallucination-detect"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
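The curl command above can also be issued programmatically. Below is a minimal Python sketch using only the standard library; the endpoint URL is taken from the page, but the shape of the JSON response is an assumption, so the example simply prints whatever the API returns.

```python
# Sketch of calling the quality API shown above. The endpoint URL comes
# from this page; the JSON response schema is NOT documented here, so we
# just parse and pretty-print whatever comes back.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/generative-ai"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality data for one repository (no API key: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    data = fetch_quality("cathy841106", "ai-hallucination-detect")
    print(json.dumps(data, indent=2))
```

With a free API key (1,000 requests/day), the key would presumably be passed as a header or query parameter; the page does not specify which, so that detail is omitted here.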
Higher-rated alternatives
madroidmaq/mlx-omni-server
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically...
openvinotoolkit/model_server
A scalable inference server for models optimized with OpenVINO™
rhesis-ai/rhesis
Open-source platform & SDK for testing LLM and agentic apps. Define expected behavior, generate...
NVIDIA-NeMo/Guardrails
NeMo Guardrails is an open-source toolkit for easily adding programmable guardrails to LLM-based...
taco-group/OpenEMMA
OpenEMMA, a permissively licensed open source "reproduction" of Waymo’s EMMA model.