ollama-mcp and pubnub-mcp-server
About ollama-mcp
rawveg/ollama-mcp
An MCP Server for Ollama
This project lets developers integrate their local Ollama-powered large language models (LLMs) with AI assistant clients such as Claude Desktop or Cline. It exposes your locally installed Ollama models and their capabilities as 'tools' that these assistants can call, so developers can use their self-hosted LLMs directly within compatible applications and keep AI workloads local.
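MCP servers like this one are typically registered in the client's configuration file (for Claude Desktop, `claude_desktop_config.json`). A minimal sketch of such an entry, assuming the server is published on npm under the hypothetical package name `@rawveg/ollama-mcp` (check the repository README for the actual install command):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"]
    }
  }
}
```

With an entry like this in place, the client launches the server on startup and the Ollama-backed tools appear alongside the assistant's built-in capabilities.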
About pubnub-mcp-server
pubnub/pubnub-mcp-server
PubNub Model Context Protocol (MCP) server for use in Cursor, Windsurf, Claude Desktop, Claude Code, OpenAI Codex, and more!
This tool helps developers build real-time applications with PubNub faster. It integrates with AI assistants such as Cursor or Claude Desktop, letting you use natural language to interact with PubNub's APIs and documentation. After you provide your API keys, it can manage applications, send messages, track users, and surface code examples on your behalf.
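The registration pattern is the same as for other MCP servers: add an entry to your client's MCP configuration and pass credentials through environment variables. A minimal sketch, assuming a hypothetical npm package name `@pubnub/mcp` and the environment variable names shown below (the repository README documents the actual package and variable names):

```json
{
  "mcpServers": {
    "pubnub": {
      "command": "npx",
      "args": ["-y", "@pubnub/mcp"],
      "env": {
        "PUBNUB_PUBLISH_KEY": "your-publish-key",
        "PUBNUB_SUBSCRIBE_KEY": "your-subscribe-key"
      }
    }
  }
}
```

PubNub uses separate publish and subscribe keys (issued per keyset in the PubNub Admin Portal), which is why two credentials appear here rather than a single API key.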
Scores updated daily from GitHub, PyPI, and npm data.