kimi-free-api and GLM-Free-API
The two projects are direct competitors: both expose reverse-engineered free APIs for different large language models (KIMI AI and ChatGLM-4.7, respectively), with similar features such as streaming output and multi-turn conversation.
About kimi-free-api
LLM-Red-Team/kimi-free-api
🚀 Reverse-engineered API for the KIMI AI long-text large model (specialty: long-text reading and summarization). Supports high-speed streaming output, agent conversations, web search, the Explore edition, the K1 reasoning model, long-document interpretation, image analysis, multi-turn conversation, zero-configuration deployment, multi-token support, and automatic cleanup of conversation traces. For testing only; for commercial use, please use the official open platform.
Implements reverse-engineered API compatibility with OpenAI's chat completions interface while preserving Kimi's native conversation context through an optional `conversation_id` parameter for stateful multi-turn interactions. Supports multi-account token rotation to circumvent usage rate limits, with automatic session cleanup and flexible deployment across Docker, Render, Vercel, and native Node.js environments using PM2 process management.
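A minimal sketch of what a client call against this OpenAI-compatible interface might look like. The base URL, model name, and token value are assumptions for illustration, not confirmed defaults of the project; only the general shape (an OpenAI-style `chat.completions` payload plus an optional `conversation_id`) follows the description above.

```python
import json
import urllib.request

# Assumed deployment address and credential -- substitute your own.
API_BASE = "http://127.0.0.1:8000"
REFRESH_TOKEN = "your_refresh_token_here"  # obtained from the Kimi web session

def build_payload(messages, conversation_id=None, stream=False):
    """Build an OpenAI-style chat payload. Passing conversation_id
    (optional) asks the service to reuse Kimi's native server-side
    context across turns instead of replaying the full history."""
    payload = {
        "model": "kimi",  # model name is an assumption
        "messages": messages,
        "stream": stream,
    }
    if conversation_id is not None:
        payload["conversation_id"] = conversation_id
    return payload

def chat(messages, conversation_id=None):
    """POST the payload to the (assumed) /v1/chat/completions route."""
    req = urllib.request.Request(
        f"{API_BASE}/v1/chat/completions",
        data=json.dumps(build_payload(messages, conversation_id)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The project rotates across multiple accounts; how tokens are
            # supplied for rotation is deployment-specific.
            "Authorization": f"Bearer {REFRESH_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because the payload mirrors OpenAI's schema, existing OpenAI client code can usually be pointed at such a deployment by changing only the base URL and token.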
About GLM-Free-API
xiaoY233/GLM-Free-API
🚀 Reverse-engineered API for Zhipu Qingyan's ChatGLM-4.7 large model (specialty: powerful agents). Supports high-speed streaming output, agent conversations, multi-turn conversation, the Deep Thinking model, and the Zero reasoning model. For testing only; for commercial use, please use the official open platform.
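Since the project advertises high-speed streaming output, a client typically consumes it as server-sent events (SSE) in the OpenAI style. The sketch below shows one way to parse such a stream; the base URL, model name, and token are assumptions for illustration, not documented values of this project.

```python
import json
import urllib.request

# Assumed deployment address and credential -- substitute your own.
API_BASE = "http://127.0.0.1:8000"
TOKEN = "your_refresh_token_here"

def parse_sse_line(raw_line):
    """Parse one server-sent-events line ('data: {...}') into a dict.
    Returns None for blank lines, non-data lines, and the final
    '[DONE]' sentinel used by OpenAI-style streams."""
    line = raw_line.strip()
    if not line.startswith("data:"):
        return None
    data = line[len("data:"):].strip()
    if data == "[DONE]":
        return None
    return json.loads(data)

def stream_chat(prompt):
    """Yield incremental text chunks from an (assumed) streaming endpoint."""
    req = urllib.request.Request(
        f"{API_BASE}/v1/chat/completions",
        data=json.dumps({
            "model": "glm",   # model name is an assumption
            "stream": True,   # request SSE chunks instead of one response
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:
            event = parse_sse_line(raw.decode("utf-8"))
            if event:
                # OpenAI-style deltas carry the new text fragment.
                yield event["choices"][0]["delta"].get("content", "")
```

The `[DONE]` sentinel and `delta`-based chunks follow the OpenAI streaming convention that these compatibility layers generally imitate.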