langfuse-mcp and mcp-zenml

|                | langfuse-mcp     | mcp-zenml                 |
|----------------|------------------|---------------------------|
| Overall score  | 74 (Verified)    | 55 (Established)          |
| Maintenance    | 13/25            | 13/25                     |
| Adoption       | 17/25            | 8/25                      |
| Maturity       | 25/25            | 16/25                     |
| Community      | 19/25            | 18/25                     |
| Stars          | 61               | 43                        |
| Forks          | 18               | 13                        |
| Downloads      | 6,228            |                           |
| Commits (30d)  | 0                | 0                         |
| Language       | Python           | Python                    |
| License        | MIT              | MIT                       |
| Risk flags     | None             | No Package, No Dependents |

About langfuse-mcp

avivsinai/langfuse-mcp

A Model Context Protocol (MCP) server for Langfuse, enabling AI agents to query Langfuse trace data for enhanced debugging and observability

Exposes 25 tools across traces, observations, sessions, exceptions, datasets, and prompt management—significantly broader than prompt-only alternatives. Implements selective tool loading to minimize token overhead and read-only mode for safer access, with stdio transport integration for Claude Code, Codex CLI, and Cursor. Includes bundled debugging playbooks via a skill system and supports both cloud and self-hosted Langfuse instances.
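As a rough illustration of the stdio transport integration mentioned above, an MCP client such as Claude Desktop or Cursor would register the server in its `mcpServers` config. This is a sketch, not taken from the project's docs: the launch command (`uvx langfuse-mcp`) is an assumption, while the `LANGFUSE_*` environment variables follow Langfuse's standard naming for cloud and self-hosted instances.

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-lf-...",
        "LANGFUSE_SECRET_KEY": "sk-lf-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}
```

For a self-hosted deployment, `LANGFUSE_HOST` would point at your own instance instead of the Langfuse cloud URL.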

About mcp-zenml

zenml-io/mcp-zenml

MCP server to connect an MCP client (Cursor, Claude Desktop, etc.) with your ZenML MLOps and LLMOps pipelines


Exposes 40+ tools for querying ZenML pipelines, runs, models, deployments, and infrastructure stacks via the Model Context Protocol. Built on MCP's client-server architecture, it authenticates with ZenML servers (self-hosted or cloud) to provide real-time access to pipeline execution history, artifact metadata, deployment status, and operational logs—enabling LLM-powered analysis and pipeline triggering within AI IDEs and chat clients. Includes experimental interactive dashboards and diagnostic tools for troubleshooting ZenML connectivity.
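The client-side wiring is analogous to other stdio MCP servers: the client launches the server process and passes ZenML credentials through the environment. A minimal sketch, assuming the server is run as a local script (the script path is hypothetical); `ZENML_STORE_URL` and `ZENML_STORE_API_KEY` are ZenML's standard variables for connecting to a server with a service-account API key.

```json
{
  "mcpServers": {
    "zenml": {
      "command": "uv",
      "args": ["run", "path/to/zenml_mcp_server.py"],
      "env": {
        "ZENML_STORE_URL": "https://your-zenml-server.example.com",
        "ZENML_STORE_API_KEY": "ZENKEY_..."
      }
    }
  }
}
```

This works for both self-hosted and cloud ZenML servers, since authentication goes through the same store URL and API key mechanism in either case.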

Scores updated daily from GitHub, PyPI, and npm data.