sanchit-gandhi/whisper-jax
JAX implementation of OpenAI's Whisper model for up to 70x speed-up on TPU.
Leverages JAX's `pmap` for automatic data parallelism across accelerators, with JIT compilation and caching enabling efficient multi-device inference. Supports half-precision computation (float16/bfloat16), audio chunking with boundary stitching (batching chunks of long audio for up to a 10x speedup), and optional timestamp prediction alongside transcription and translation tasks. Integrates with Hugging Face Transformers and the Hub, so any Flax-compatible Whisper checkpoint can be loaded via the `FlaxWhisperPipeline` abstraction.
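The `pmap` data-parallel pattern the project relies on can be sketched in a few lines. This is an illustrative toy, not whisper-jax's actual code: `step` stands in for the Whisper forward pass, and the reshape/merge around it is the generic shard-and-gather idiom.

```python
import jax
import jax.numpy as jnp

# Number of local accelerator devices (1 on a CPU-only machine).
n_devices = jax.local_device_count()

# Toy "model step"; in whisper-jax this would be the Whisper forward pass.
def step(batch):
    return jnp.tanh(batch * 2.0)

# pmap JIT-compiles `step` once and runs one shard of the batch per device.
parallel_step = jax.pmap(step)

# Global batch of 8 examples; the batch size must divide evenly across devices.
batch = jnp.ones((8, 4))
sharded = batch.reshape(n_devices, 8 // n_devices, 4)

out = parallel_step(sharded)
# The result keeps a leading device axis; merge it back into the global batch.
merged = out.reshape(8, 4)
```

Subsequent calls with the same shapes reuse the cached compilation, which is where the listing's "JIT compilation and caching" speedup comes from.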
4,690 stars. No commits in the last 6 months.
Stars: 4,690
Forks: 414
Language: Jupyter Notebook
License: Apache-2.0
Category: voice-ai
Last pushed: Apr 03, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/voice-ai/sanchit-gandhi/whisper-jax"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
SYSTRAN/faster-whisper
Faster Whisper transcription with CTranslate2
oseiskar/autosubsync
Automatically synchronize subtitles with audio using machine learning
FL33TW00D/whisper-turbo
Cross-Platform, GPU Accelerated Whisper 🏎️
machinelearningZH/audio-transcription
Transcribe any audio or video file. Edit and view your transcripts in a standalone HTML editor.
saharmor/whisper-playground
Build real time speech2text web apps using OpenAI's Whisper https://openai.com/blog/whisper/