web-llm and llm-x
These projects are complements: web-llm provides the in-browser inference engine that llm-x uses as its underlying LLM runtime, while llm-x adds a user-friendly web interface on top for running models locally.
About web-llm
mlc-ai/web-llm
High-performance In-browser LLM Inference Engine
WebLLM lets web developers integrate AI language models directly into their web applications, with inference running entirely in the user's browser. Given model files and a prompt, it produces AI-generated text, chat responses, or structured JSON output. It is aimed at web developers building interactive, privacy-focused AI experiences, since no data leaves the user's machine.
About llm-x
mrdjohnson/llm-x
LLMX; Easiest 3rd party Local LLM UI for the web!