neurolink and mcp
About neurolink
juspay/neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.
Abstracts multi-provider LLM communication as composable token streams using a pipe-based architecture, unifying 13 AI providers (OpenAI, Anthropic, Google, AWS Bedrock, Azure, etc.) under a single TypeScript API. Built-in features include 64+ MCP server tools, Redis-backed persistent memory with LLM-powered condensation, context window auto-compaction with per-provider token estimation, RAG with hybrid search and reranking, and multi-provider failover for cost optimization. Deployable via professional CLI or as HTTP servers (Hono, Express, Fastify, Koa) with full observability hooks for existing OpenTelemetry instrumentation.
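The multi-provider failover described above can be sketched as a single interface with an ordered fallback loop. This is a minimal illustrative sketch, not neurolink's actual API: the `Provider` interface and `generateWithFailover` function are hypothetical names.

```typescript
// Hypothetical sketch of multi-provider failover behind one interface;
// names here are illustrative, not neurolink's real API surface.
interface Provider {
  name: string;
  generate(prompt: string): Promise<string>;
}

async function generateWithFailover(
  providers: Provider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      // First provider that succeeds wins; cheaper providers go first
      // when the goal is cost optimization.
      return await provider.generate(prompt);
    } catch (err) {
      lastError = err; // remember the failure, try the next provider
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}

// Demo: the first provider fails, the second answers.
const flaky: Provider = {
  name: "flaky",
  generate: async () => { throw new Error("rate limited"); },
};
const stable: Provider = {
  name: "stable",
  generate: async (p) => `echo: ${p}`,
};

generateWithFailover([flaky, stable], "hello").then((out) =>
  console.log(out) // "echo: hello"
);
```

Ordering the provider list by price is what turns plain failover into the cost-optimization strategy the description mentions.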
About mcp
taskade/mcp
Taskade MCP · Official MCP server and OpenAPI-to-MCP codegen. Build AI agent tools from any OpenAPI API and connect to Claude, Cursor, and more.
Exposes 50+ MCP tools covering workspaces, projects, tasks, and AI agent management, enabling Claude, Cursor, and other clients to directly interact with Taskade's workspace API. Includes a reusable OpenAPI-to-MCP codegen package that generates tool bindings from any OpenAPI 3.0+ schema, supporting both stdio transport for desktop clients and HTTP/SSE mode for integration platforms like n8n.
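The OpenAPI-to-MCP codegen idea boils down to mapping each OpenAPI operation onto an MCP tool descriptor (name, description, JSON-Schema input). The sketch below shows that mapping under simplified assumptions; the type names and the shape of the generated descriptor are illustrative, not the actual output of taskade/mcp's codegen package.

```typescript
// Hypothetical sketch: turn one OpenAPI operation into an MCP-style tool
// descriptor. Real codegen handles request bodies, refs, and auth; this
// only maps simple named parameters.
interface OpenApiOperation {
  operationId: string;
  summary?: string;
  parameters?: { name: string; schema: { type: string }; required?: boolean }[];
}

interface McpTool {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string }>;
    required: string[];
  };
}

function operationToTool(op: OpenApiOperation): McpTool {
  const properties: Record<string, { type: string }> = {};
  const required: string[] = [];
  for (const p of op.parameters ?? []) {
    properties[p.name] = { type: p.schema.type };
    if (p.required) required.push(p.name);
  }
  return {
    name: op.operationId,
    description: op.summary ?? op.operationId,
    inputSchema: { type: "object", properties, required },
  };
}

// Example: a "list tasks" operation becomes a tool with one required input.
const tool = operationToTool({
  operationId: "listTasks",
  summary: "List tasks in a project",
  parameters: [{ name: "projectId", schema: { type: "string" }, required: true }],
});
console.log(tool.name, tool.inputSchema.required);
```

Because the tool descriptor is plain data, the same generated bindings can be served over stdio for desktop clients or over HTTP/SSE for platforms like n8n.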