lmstudio-ai/lms
LM Studio CLI
Provides programmatic control over LM Studio's local inference server, enabling model loading, unloading, and status monitoring via a command-line interface. Built on the lmstudio.js SDK, it exposes server lifecycle management (start/stop), model discovery with JSON output for scripting, and log streaming. It integrates directly with LM Studio's local API server, allowing automation of GPU-accelerated inference workflows without GUI interaction.
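A minimal sketch of that headless workflow. The subcommands follow the `lms` README at the time of writing, and the model key is a hypothetical placeholder; verify both against `lms --help` and your own `lms ls` output before relying on them.

```shell
# Sketch of a GUI-free inference workflow with the lms CLI.
# Guarded so the script degrades gracefully where lms is not installed.
if command -v lms >/dev/null 2>&1; then
  lms server start               # start the local API server headlessly
  lms ls --json                  # machine-readable list of downloaded models
  lms load qwen2.5-7b-instruct   # hypothetical model key; pick one from 'lms ls'
  lms ps                         # show models currently loaded into memory
  lms unload --all               # release GPU/CPU memory when done
  lms server stop
  result="workflow complete"
else
  result="lms not installed; skipping"
fi
echo "$result"
```

The JSON output of `lms ls --json` is what makes this scriptable: a wrapper can parse it to choose a model programmatically instead of hard-coding a key.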
4,341 stars. Actively maintained with 7 commits in the last 30 days.
- Stars: 4,341
- Forks: 346
- Language: TypeScript
- License: MIT
- Category:
- Last pushed: Mar 12, 2026
- Commits (30d): 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/lmstudio-ai/lms"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
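The endpoint above can be wrapped in a small script. A sketch, with two stated assumptions: the `PT_EDGE_KEY` environment variable name and the `Authorization: Bearer` header are guesses at how the free key is supplied, so check the API docs before depending on them.

```shell
URL="https://pt-edge.onrender.com/api/v1/quality/llm-tools/lmstudio-ai/lms"
# Assumption: a free key is passed as a Bearer token; consult the API docs.
if [ -n "$PT_EDGE_KEY" ]; then
  body=$(curl -s --max-time 10 -H "Authorization: Bearer $PT_EDGE_KEY" "$URL")
else
  body=$(curl -s --max-time 10 "$URL")   # anonymous tier: 100 requests/day
fi
# Fall back gracefully when offline or rate-limited.
result=${body:-"request failed or empty response"}
echo "$result"
```

`--max-time` keeps the script from hanging in automation, and the fallback string means downstream steps always receive some output to inspect.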
Compare
Related tools
lmstudio-ai/lmstudio-js
LM Studio TypeScript SDK
token-js/token.js
Integrate 200+ LLMs with one TypeScript SDK using OpenAI's format.
nbonamy/multi-llm-ts
A TypeScript library to use LLM provider APIs in a unified way.
samestrin/llm-interface
A simple NPM interface for seamlessly interacting with 36 Large Language Model (LLM) providers,...
gregreindel/llm-exe
A package that provides simplified base components to make building and maintaining LLM-powered...