ollama-mcp and pubnub-mcp-server

| Metric        | ollama-mcp       | pubnub-mcp-server         |
| ------------- | ---------------- | ------------------------- |
| Overall score | 59 (Established) | 49 (Emerging)             |
| Maintenance   | 6/25             | 10/25                     |
| Adoption      | 10/25            | 7/25                      |
| Maturity      | 25/25            | 15/25                     |
| Community     | 18/25            | 17/25                     |
| Stars         | 143              | 30                        |
| Forks         | 24               | 9                         |
| Downloads     | n/a              | n/a                       |
| Commits (30d) | 0                | 0                         |
| Language      | TypeScript       | TypeScript                |
| License       | AGPL-3.0         | n/a                       |
| Risk flags    | None             | No package, no dependents |

About ollama-mcp

rawveg/ollama-mcp

An MCP Server for Ollama

This project lets developers connect their local Ollama-hosted large language models (LLMs) to MCP clients such as Claude Desktop or Cline. It exposes the models in your local Ollama installation, along with their capabilities, as 'tools' that these AI assistants can call, so you can use self-hosted LLMs directly from compatible applications.
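MCP clients such as Claude Desktop register servers like this one through a JSON configuration file (`claude_desktop_config.json` on desktop). A minimal sketch of wiring it in, assuming the server is published to npm as `@rawveg/ollama-mcp` and that Ollama is listening on its default port; both the package name and the `OLLAMA_HOST` variable are assumptions, not confirmed by this page:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "@rawveg/ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"
      }
    }
  }
}
```

After restarting the client, the locally hosted models should appear as callable tools inside the assistant.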

AI-development LLM-integration local-AI AI-assistant-tools developer-workflow

About pubnub-mcp-server

pubnub/pubnub-mcp-server

PubNub MCP (Model Context Protocol) server for use in Cursor, Windsurf, Claude Desktop, Claude Code, OpenAI Codex, and more.

This tool helps developers build real-time applications on PubNub faster. It integrates with AI assistants such as Cursor or Visual Studio Code, letting you use natural language to work with PubNub's APIs and documentation. After you supply a PubNub API key, it can manage applications, publish messages, track users, and surface code examples.
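Registration follows the same pattern as other MCP servers: an entry in the client's `mcpServers` config that launches the server and passes credentials via environment variables. A minimal sketch, assuming the npm package name `@pubnub/mcp` and the environment variable names shown; all of these are illustrative assumptions rather than details confirmed by this page:

```json
{
  "mcpServers": {
    "pubnub": {
      "command": "npx",
      "args": ["-y", "@pubnub/mcp"],
      "env": {
        "PUBNUB_PUBLISH_KEY": "your-publish-key",
        "PUBNUB_SUBSCRIBE_KEY": "your-subscribe-key"
      }
    }
  }
}
```

Keeping the keys in `env` rather than in the prompt keeps credentials out of conversation history while still letting the assistant publish messages and query PubNub on your behalf.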

real-time-communication application-development API-management software-engineering AI-assisted-coding

Scores updated daily from GitHub, PyPI, and npm data.