pinecone-io/pinecone-mcp
Connect your Pinecone projects to Cursor, Claude, and other AI assistants
Implements the Model Context Protocol (MCP) standard to expose Pinecone operations as tools, including index management, document upsert/search, cascading multi-index queries, and documentation retrieval, enabling AI assistants to generate and test vector search code within development workflows. Supports only indexes with integrated inference, communicating via stdio transport and authenticating through environment variables. Works across Cursor, Claude Desktop, and Gemini CLI with a single Node.js-based server installation.
Available on npm.
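Since the server communicates over stdio and authenticates via environment variables, client setup reduces to a single JSON entry. A minimal sketch for a Cursor or Claude Desktop MCP config is below; the npm package name and the `PINECONE_API_KEY` variable follow common convention but should be verified against the repository's README:

```json
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": {
        "PINECONE_API_KEY": "<your-pinecone-api-key>"
      }
    }
  }
}
```

Using `npx` here means no global install is required; the client launches the server on demand and talks to it over stdin/stdout.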
Stars: 59
Forks: 21
Language: TypeScript
License: Apache-2.0
Category:
Last pushed: Mar 21, 2026
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/pinecone-io/pinecone-mcp"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Related servers
AI-QL/tuui
A desktop MCP client designed as a tool unitary utility integration, accelerating AI adoption...
SixHq/Overture
Overture is an open-source, locally running web interface delivered as an MCP (Model Context...
cap-js/mcp-server
MCP server for AI-assisted development of CAP applications
AnEntrypoint/gm-exec
wanna develop an app ❓
zcaceres/builtwith-api
TypeScript library, MCP, and agent-friendly CLI for the BuiltWith API.