huggingface/transformers.js
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
Transformers.js leverages ONNX Runtime with WASM (CPU) or WebGPU (GPU) execution to run quantized transformer models efficiently in the browser, with optional 4-bit and 8-bit quantization. It provides a JavaScript API designed to be functionally equivalent to Hugging Face's Python transformers library, and PyTorch, TensorFlow, and JAX models can be converted to ONNX via 🤗 Optimum. Supported tasks span NLP, computer vision, audio, and multimodal workloads, including text classification, object detection, speech recognition, and zero-shot learning.
15,538 stars. Actively maintained with 38 commits in the last 30 days.
Stars: 15,538
Forks: 1,104
Language: JavaScript
License: Apache-2.0
Category:
Last pushed: Mar 12, 2026
Commits (30d): 38
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/huggingface/transformers.js"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related repositories
huggingface/transformers.js-examples
A collection of 🤗 Transformers.js demos and example applications
daviddaytw/react-native-transformers
Run local LLMs from Hugging Face in React Native or Expo using onnxruntime.
salesforce/TransmogrifAI
TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable,...
jobergum/browser-ml-inference
Edge Inference in Browser with Transformer NLP model
otadk/nuxt-edge-ai
Nuxt module for local-first AI apps with server-side WASM inference via Transformers.js and ONNX Runtime.