skillz and local-skills-mcp
These are complements: skillz provides a standardized MCP interface for skill loading that works across multiple LLM clients, while local-skills-mcp implements the same pattern optimized for filesystem-based skills with lazy loading. Used together, they cover different deployment scenarios (remote vs. local skills).
About skillz
intellectronica/skillz
An MCP server for loading skills (a shim for non-Claude clients).
Discovers and exposes Claude-style skills (defined via `SKILL.md` with YAML metadata) as MCP tools for non-Claude agents like Copilot and Cursor, supporting flexible skill packaging as directories, zip archives, or `.skill` files with bundled resources. Uses stdio, HTTP, or SSE transports and can execute helper scripts within each skill, with optional Docker containerization for sandboxed execution. Integrates with the Skills Supermarket directory and provides both Python CLI and Gemini CLI extension interfaces for agent configuration.
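The Claude-style skill format mentioned above pairs YAML metadata with markdown instructions in a single `SKILL.md` file. A minimal sketch of such a file (the `name` and `description` fields are the standard metadata; the specific skill shown is hypothetical):

```markdown
---
name: changelog-writer
description: Drafts a changelog entry from a list of merged pull requests.
---

# Changelog Writer

When given a list of merged PRs, group them by type (feature, fix, chore)
and produce a dated changelog section in Keep a Changelog style.
```

Servers like skillz read the frontmatter to advertise the skill as an MCP tool, while the markdown body supplies the instructions injected when the tool is invoked.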
About local-skills-mcp
kdpa-llc/local-skills-mcp
Universal MCP server enabling any LLM or AI agent to utilize expert skills from your local filesystem. Reduces context consumption through lazy loading. Works with Claude, Cline, and any MCP-compatible client.
Implements stdio transport with automatic skill discovery across multiple configurable directories, aggregating skills from built-in sources, global paths, project-specific locations, and custom environment variables. Features hot reload for instant skill updates without a server restart, and loads only each skill's YAML frontmatter metadata (~50 tokens per skill) up front, deferring retrieval of the full skill content until it is requested. Exposes a single `get_skill` tool to any MCP-compatible client, enabling skill reuse across Claude, Cline, Continue.dev, and custom AI agents regardless of the underlying LLM.
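The lazy-loading scheme described above can be sketched as follows: at discovery time the server reads only the YAML frontmatter of each `SKILL.md` (a few dozen tokens), and the full body is read from disk only when a client actually calls `get_skill`. This is a minimal illustration, not the project's actual implementation; the function names are hypothetical.

```python
from pathlib import Path


def read_frontmatter(skill_path: Path) -> dict:
    """Read only the YAML frontmatter block of a SKILL.md.

    Stops at the closing '---', so the (potentially large) skill body
    is never loaded into memory during discovery.
    """
    meta = {}
    with skill_path.open() as f:
        if f.readline().strip() != "---":
            return meta  # no frontmatter present
        for line in f:
            line = line.strip()
            if line == "---":
                break  # end of frontmatter; body left unread
            if ":" in line:
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
    return meta


def get_skill(skill_path: Path) -> str:
    """Load the full skill content on demand, mirroring a `get_skill` call."""
    return skill_path.read_text()
```

Discovery then costs roughly the size of the metadata per skill, while the context-heavy markdown body is paid for only by the skills the agent actually uses.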