web-llm and llm-x

These are complements: web-llm provides the core in-browser inference engine that llm-x uses as its underlying LLM runtime to deliver a user-friendly web interface for local model execution.

                 web-llm           llm-x
Score            86 (Verified)     49 (Emerging)
Maintenance      20/25             6/25
Adoption         23/25             10/25
Maturity         25/25             16/25
Community        18/25             17/25
Stars            17,562            292
Forks            1,221             33
Downloads        188,037           —
Commits (30d)    9                 0
Language         TypeScript        TypeScript
License          Apache-2.0        MIT
Risk flags       None              No package; no dependents

About web-llm

mlc-ai/web-llm

High-performance In-browser LLM Inference Engine

WebLLM lets web developers embed large language models directly in their applications, with inference running entirely in the user's browser on WebGPU. Given model weights and a prompt, it produces generated text, chat responses, or structured JSON output. Because no data leaves the device, it suits developers building interactive, privacy-preserving AI experiences.

web-development in-browser-ai client-side-llm web-application-development interactive-ai
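As a sketch of how an application might drive WebLLM's OpenAI-compatible chat API: the helper below shapes a prompt into the messages array the engine expects, and the commented section shows the browser-side call. The model ID and system prompt are illustrative assumptions, not prescribed by the library; any ID from WebLLM's prebuilt model list works.

```typescript
// Message shape used by WebLLM's OpenAI-compatible chat API.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Pure helper: wrap a user prompt with a (hypothetical) system message.
function buildMessages(prompt: string): ChatMessage[] {
  return [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: prompt },
  ];
}

// In a WebGPU-capable browser, the engine would be used roughly like this
// (model ID is illustrative; the first call downloads and caches the weights):
//
//   import { CreateMLCEngine } from "@mlc-ai/web-llm";
//   const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");
//   const reply = await engine.chat.completions.create({
//     messages: buildMessages("Explain WebGPU in one sentence."),
//   });
//   console.log(reply.choices[0].message.content);
```

Keeping the message-building step pure makes it easy to reuse the same prompt construction whether the runtime is WebLLM in the browser or a server-side OpenAI-style endpoint.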

About llm-x

mrdjohnson/llm-x

LLMX; Easiest 3rd party Local LLM UI for the web!

Scores updated daily from GitHub, PyPI, and npm data.