bold84/cot_proxy
Smart proxy for LLM APIs that enables model-specific parameter control, automatic mode switching (like Qwen3's /think and /no_think), and …
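As an illustrative sketch only: a proxy of this kind can implement Qwen3's soft mode switch by appending the /think or /no_think tag to the outgoing user message before forwarding the request. The function name and exact rewrite rule below are assumptions for illustration, not cot_proxy's actual API.

```python
# Hypothetical sketch of Qwen3 soft-switch injection, NOT cot_proxy's real code.
def apply_mode(messages, thinking: bool):
    """Append Qwen3's /think or /no_think tag to the last user message."""
    tag = "/think" if thinking else "/no_think"
    out = [dict(m) for m in messages]  # copy so the caller's list is untouched
    for m in reversed(out):
        if m.get("role") == "user":
            m["content"] = f"{m['content']} {tag}"
            break
    return out

msgs = [{"role": "user", "content": "Summarize this article."}]
print(apply_mode(msgs, thinking=False))
```

A real proxy would apply such a rewrite per model (only for models that honor the tags) before relaying the request upstream.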
No commits in the last 6 months.
Stars: 51
Forks: 5
Language: Python
License: MIT
Category: (none listed)
Last pushed: May 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/bold84/cot_proxy"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
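Only the bold84/cot_proxy endpoint above is confirmed; assuming the path follows an owner/repo pattern, the URL for other repos could be built like this (a sketch, not official client code):

```python
# Assumed URL pattern inferred from the single documented endpoint.
BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    """Return the (assumed) quality-data endpoint for a GitHub repo."""
    return f"{BASE}/{owner}/{repo}"

print(quality_url("bold84", "cot_proxy"))
```

The resulting URL can then be fetched with curl as shown above, or with urllib.request.urlopen in Python.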
Higher-rated alternatives
BerriAI/litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with...
vava-nessa/free-coding-models
Find, benchmark and install in CLI 158 FREE coding LLM models across 20 providers in real time
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with...
coaidev/coai
🚀 Next Generation Multi-tenant AI One-Stop Solution. Builtin Admin & Billing System....
theopenco/llmgateway
Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.