one-search-mcp and kindly-web-search-mcp-server

Both tools are independent, competing implementations of a web search and content retrieval server designed to provide robust search capabilities for AI coding tools and agents.

At a glance:

| | one-search-mcp | kindly-web-search-mcp-server |
| --- | --- | --- |
| Overall score | 63 (Established) | 47 |
| Maintenance | 10/25 | 13/25 |
| Adoption | 9/25 | 10/25 |
| Maturity | 25/25 | 13/25 |
| Community | 19/25 | 11/25 |
| Stars | 87 | 214 |
| Forks | 18 | 14 |
| Downloads | — | — |
| Commits (30d) | 0 | 0 |
| Language | TypeScript | Python |
| License | MIT | MIT |

No risk flags. No published package and no recorded dependents.

About one-search-mcp

yokingma/one-search-mcp

🚀 OneSearch MCP Server: web search, scraping, and extraction. Supports agent-browser, SearXNG, Tavily, DuckDuckGo, Bing, etc.

Implements a Model Context Protocol (MCP) server with pluggable search backends (SearXNG, Tavily, DuckDuckGo, Bing, Google, etc.) and exposes four tools (`one_search`, `one_scrape`, `one_map`, `one_extract`) for web search, scraping, and structured data extraction. It uses local browser automation via `agent-browser` for privacy-preserving search and scraping without external API dependencies, with automatic Chromium detection across Chrome, Edge, and Canary installations. It integrates with Claude Desktop, Cursor, and Windsurf through standard MCP configuration files.
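As a sketch of that integration, an MCP client such as Claude Desktop or Cursor registers the server in its JSON configuration file. The launch command, package name, and environment variable below are illustrative assumptions, not taken from the project's documentation:

```json
{
  "mcpServers": {
    "one-search": {
      "command": "npx",
      "args": ["-y", "one-search-mcp"],
      "env": {
        "SEARCH_PROVIDER": "duckduckgo"
      }
    }
  }
}
```

With an entry like this in place, the client launches the server over stdio and exposes `one_search`, `one_scrape`, `one_map`, and `one_extract` as callable tools.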

About kindly-web-search-mcp-server

Shelpuk-AI-Technology-Consulting/kindly-web-search-mcp-server

Kindly Web Search MCP Server: Web search + robust content retrieval for AI coding tools (Claude Code, Codex, Cursor, GitHub Copilot, Gemini, etc.) and AI agents (Claude Desktop, OpenClaw, etc.). Supports Serper, Tavily, and SearXNG.

Implements an MCP (Model Context Protocol) server with stdio transport for integration into AI coding assistants, combining multiple search backends (Serper, Tavily, SearXNG) with specialized parsers for StackExchange, GitHub Issues, arXiv, and Wikipedia that return structured, conversation-complete content. It uses headless Chromium via `nodriver` for real-time webpage extraction into Markdown, eliminating the need for separate web scraping or platform-specific MCP servers. It is part of a broader agentic suite designed to improve code quality through integrated tools for semantic navigation, design review, and TDD enforcement.
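Registration in an MCP client follows the same stdio pattern. The launch command and environment variable name below are guesses for illustration (the repository's README should be consulted for the actual invocation); the source only states that the server uses stdio transport and supports Serper, Tavily, and SearXNG:

```json
{
  "mcpServers": {
    "kindly-web-search": {
      "command": "uvx",
      "args": ["kindly-web-search-mcp-server"],
      "env": {
        "SERPER_API_KEY": "<your-key>"
      }
    }
  }
}
```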

Scores updated daily from GitHub, PyPI, and npm data.