TensorOpsAI/LLMstudio

Framework to bring LLM applications to production

Score: 67 / 100 (Established)

Provides a unified proxy layer across OpenAI, Anthropic, and Google LLMs plus local models via Ollama, with smart routing and fallback mechanisms for reliability. Includes a web-based prompt playground UI, Python SDK, request monitoring/logging, and LangChain compatibility for seamless integration into existing projects. Supports batch calling and deploys as a server with separate proxy and tracker APIs.

371 stars and 563 monthly downloads. Available on PyPI.

Maintenance 10 / 25
Adoption 16 / 25
Maturity 25 / 25
Community 16 / 25


Stars: 371
Forks: 39
Language: Python
License: MPL-2.0
Last pushed: Feb 05, 2026
Monthly downloads: 563
Commits (30d): 0
Dependencies: 5

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/prompt-engineering/TensorOpsAI/LLMstudio"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
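For programmatic use, the curl call above can be replicated from Python. This is a minimal sketch using only the standard library; the endpoint URL is taken verbatim from the curl example, while the shape of the returned JSON (field names, nesting) is an assumption and should be checked against a real response.

```python
# Sketch: fetch a repository's quality record from the API shown above.
# Only the URL pattern is taken from the document; the JSON layout of
# the response is not documented here and is left to the caller.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/prompt-engineering/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """GET the quality record as parsed JSON.

    No API key is needed for up to 100 requests/day; the page mentions
    a free key for 1,000/day, but the auth mechanism (header vs. query
    parameter) is not documented here, so none is sent.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


# Usage (performs a network request):
#     data = fetch_quality("TensorOpsAI", "LLMstudio")
#     print(data)
```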