whisper-finetune and whisper-prep

These are complementary tools designed to be used sequentially: whisper-prep handles the upstream data preparation stage, while whisper-finetune consumes that prepared data to perform the actual model fine-tuning.

                   whisper-finetune      whisper-prep
Score              44 (Emerging)         31 (Emerging)
Maintenance        13/25                 10/25
Adoption           6/25                  5/25
Maturity           9/25                  9/25
Community          16/25                 7/25
Stars              22                    11
Forks              6                     1
Downloads:
Commits (30d)      0                     0
Language           Jupyter Notebook      Python
License            MIT                   MIT
Package            none published        none published
Dependents         none                  none

About whisper-finetune

i4Ds/whisper-finetune

This repository contains code for fine-tuning the Whisper speech-to-text model.

About whisper-prep

i4Ds/whisper-prep

Data preparation utility for the fine-tuning of OpenAI's Whisper model.
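whisper-prep's actual input and output formats are not documented here, so as a rough illustration only: a common shape for this kind of prep step is pairing audio clips with their reference transcripts and emitting a JSONL manifest that a fine-tuning script can consume. The function name `build_manifest`, the sidecar `.txt` transcript convention, and the JSONL layout below are all assumptions for the sketch, not whisper-prep's API.

```python
import json
from pathlib import Path

def build_manifest(audio_dir, transcript_dir, out_path):
    """Pair each .wav file with a same-stem .txt transcript and write a
    JSONL manifest (one {"audio": ..., "text": ...} object per line).

    Hypothetical sketch of a typical Whisper data-prep step; not the
    whisper-prep tool's actual interface.
    """
    entries = []
    for audio in sorted(Path(audio_dir).glob("*.wav")):
        transcript = Path(transcript_dir) / (audio.stem + ".txt")
        if not transcript.exists():
            continue  # skip clips that have no reference transcript
        entries.append({
            "audio": str(audio),
            "text": transcript.read_text(encoding="utf-8").strip(),
        })
    with Path(out_path).open("w", encoding="utf-8") as f:
        for entry in entries:
            f.write(json.dumps(entry, ensure_ascii=False) + "\n")
    return entries
```

A real pipeline would additionally check clip durations, since Whisper processes audio in 30-second windows, but that detail is omitted here.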

Scores updated daily from GitHub, PyPI, and npm data.