scira-mcp-chat and chat-mcp

These are ecosystem siblings. scira-mcp-chat is a lightweight MCP client framework for building chat interfaces, while chat-mcp is a complete desktop application that implements MCP to connect multiple LLMs. They serve different use cases (library vs. standalone app) within the same MCP ecosystem.

| Metric | scira-mcp-chat | chat-mcp |
|---|---|---|
| Overall score | 56 (Established) | 46 (Emerging) |
| Maintenance | 6/25 | 2/25 |
| Adoption | 10/25 | 10/25 |
| Maturity | 15/25 | 16/25 |
| Community | 25/25 | 18/25 |
| Stars | 832 | 243 |
| Forks | 212 | 35 |
| Downloads | — | — |
| Commits (30d) | 0 | 0 |
| Language | TypeScript | HTML |
| License | Apache-2.0 | Apache-2.0 |
| Flags | No Package, No Dependents | Stale 6m, No Package, No Dependents |

About scira-mcp-chat

zaidmukaddam/scira-mcp-chat

A minimalistic MCP client with a good feature set.

Supports dynamic MCP server configuration via HTTP and SSE transports, enabling real-time tool discovery from external providers like Composio and Zapier. Built on Next.js with Vercel's AI SDK for multi-provider LLM compatibility and streaming responses, integrated with shadcn/ui components. Includes reasoning model support and tool-use capabilities through the Model Context Protocol ecosystem.
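A dynamically configured MCP server of this kind boils down to a small, validated config entry. The sketch below is illustrative only: the type and function names (`McpServerConfig`, `parseServerEntry`) are assumptions for this example, not scira-mcp-chat's actual code.

```typescript
// Transports the client might accept for a remote MCP server.
type McpTransport = "http" | "sse";

// Hypothetical shape of one user-supplied server entry.
interface McpServerConfig {
  name: string;
  url: string;
  transport: McpTransport;
}

// Validate a raw entry (e.g. from a settings form) before handing it
// to the MCP client for tool discovery.
function parseServerEntry(input: {
  name?: string;
  url?: string;
  transport?: string;
}): McpServerConfig {
  if (!input.name || !input.url) {
    throw new Error("name and url are required");
  }
  const t = input.transport;
  if (t === "http" || t === "sse") {
    // new URL(...) throws on malformed URLs, catching typos early.
    return { name: input.name, url: new URL(input.url).toString(), transport: t };
  }
  throw new Error(`unsupported transport: ${t}`);
}

console.log(parseServerEntry({
  name: "composio",
  url: "https://example.com/mcp",
  transport: "sse",
}));
```

Validating up front keeps a bad entry from surfacing later as an opaque connection failure during tool discovery.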

About chat-mcp

AI-QL/chat-mcp

A desktop chat app that leverages MCP (Model Context Protocol) to interface with other LLMs.

Supports dynamic multi-server MCP configuration and multi-LLM backend testing across OpenAI SDK-compatible providers (GPT, Qwen, Llama). Built on Electron with a modular architecture designed for educational clarity, the app manages MCP server processes via config-driven stdio transport and enables seamless switching between multiple LLM endpoints and MCP tool servers. UI components are framework-agnostic and extractable for web deployment, maintaining consistent interaction patterns across platforms.
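As a concrete illustration, a config-driven stdio server entry in the wider MCP ecosystem commonly looks like the fragment below. The exact keys chat-mcp expects may differ; this follows the widely used MCP client config shape, and the server package and path are example values.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

With this shape, the client spawns each listed command as a child process and speaks MCP over its stdin/stdout, which is what makes swapping tool servers a pure configuration change.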

Scores are updated daily from GitHub, PyPI, and npm data.