ShipItAndPray/mcp-turboquant
MCP server for LLM quantization. Compress any model to GGUF/GPTQ/AWQ in one tool call. First MCP server for model compression.
Stars: —
Forks: —
Language: JavaScript
License: MIT
Category:
Last pushed: Mar 25, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/ShipItAndPray/mcp-turboquant"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
upstash/context7
Context7 Platform -- Up-to-date code documentation for LLMs and AI code editors
DMontgomery40/deepseek-mcp-server
Model Context Protocol server for DeepSeek's advanced language models
graphlit/graphlit-mcp-server
Model Context Protocol (MCP) Server for Graphlit Platform
dvcrn/mcp-server-siri-shortcuts
MCP for calling Siri Shortcuts from LLMs
hhszzzz/MingAI
🔮 High-accuracy AI fortune-telling tool covering BaZi (Four Pillars), Zi Wei Dou Shu, six-hexagram divination, Qi Men Dun Jia, Da Liu Ren, tarot, MBTI, face and palm reading, synastry matching, daily/monthly horoscopes, Zhou Gong dream interpretation, and more. Supports MCP service and Skills.