CogitatorTech/omni-nli
A multi-interface (REST and MCP) server for natural language inference
Supports multiple model backends (Ollama, HuggingFace, OpenRouter) with confidence scoring and optional reasoning traces for explainability. Built as a stateless microservice with caching and configurable scaling, it is accessible via a REST API for traditional applications or an MCP interface for AI-agent integration. Designed to mitigate LLM hallucinations by checking the consistency of generated content against premise-hypothesis relationships.
Available on PyPI.
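The premise-hypothesis check described above can be sketched as follows. This is a hypothetical illustration only: the payload fields, labels, and threshold are assumptions, not omni-nli's documented API.

```python
# Hypothetical sketch of a premise-hypothesis NLI check in the style
# described above. Field names and labels are assumptions.
import json


def build_nli_request(premise: str, hypothesis: str, backend: str = "ollama") -> str:
    """Serialize an NLI request: does the hypothesis follow from the premise?"""
    payload = {
        "premise": premise,        # source text treated as ground truth
        "hypothesis": hypothesis,  # generated claim to verify
        "backend": backend,        # assumed selector for Ollama/HuggingFace/OpenRouter
    }
    return json.dumps(payload)


def is_supported(response: dict, threshold: float = 0.8) -> bool:
    """Treat a claim as verified only on a confident 'entailment' label."""
    return (
        response.get("label") == "entailment"
        and response.get("confidence", 0.0) >= threshold
    )


# A hallucination check: the generated claim contradicts the premise,
# so it is rejected.
req = build_nli_request(
    premise="The invoice was paid on March 3rd.",
    hypothesis="The invoice is still outstanding.",
)
print(is_supported({"label": "contradiction", "confidence": 0.95}))  # False
```

In this scheme, a confident "entailment" verdict marks generated content as consistent with its source; anything else is flagged for review.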
Stars: 3
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Feb 23, 2026
Monthly downloads: 151
Commits (30d): 0
Dependencies: 14
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/CogitatorTech/omni-nli"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
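The same endpoint can be called from Python. Only the URL comes from the listing above; the shape of the JSON response is not specified here, so the sketch stops at fetching and decoding it.

```python
# Sketch of fetching a repository's quality data from the API above
# using only the standard library. The response schema is an assumption.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality record (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)


print(quality_url("CogitatorTech", "omni-nli"))
# → https://pt-edge.onrender.com/api/v1/quality/mcp/CogitatorTech/omni-nli
```

With an API key, the same request could presumably carry an auth header for the higher 1,000/day limit, though the header name is not documented in this listing.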
Higher-rated alternatives
jonigl/mcp-client-for-ollama
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features...
ArcadeAI/arcade-mcp
The best way to create, deploy, and share MCP Servers
hmldns/nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
possible055/relace-mcp
Unofficial Relace MCP client with AI features. Personal project; not affiliated with or endorsed...
Dicklesworthstone/ultimate_mcp_server
Comprehensive MCP server exposing dozens of capabilities to AI agents: multi-provider LLM...