BatsResearch/bonito

A lightweight library for generating synthetic instruction tuning datasets for your data without GPT.

Quality score: 43 / 100 (Emerging)

Bonito performs conditional task generation: it converts raw, unannotated text into diverse instruction-tuning datasets across 16+ task types (NLI, QA, summarization, sentiment, etc.) using a specialized model rather than a general-purpose LLM. Built on Hugging Face Transformers and vLLM, it uses a model trained for conditional task generation to synthesize training data from unannotated corpora with no external API dependencies. The library supports multiple model variants, including a Llama 3.1-based version, and offers quantized options for resource-constrained environments.
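The core idea described above, conditioning generation on a task type plus a raw passage and splitting the model's output into an instruction/response pair, can be sketched in plain Python. This is an illustrative sketch only, not Bonito's actual API; the tag names, separator, and helper functions here are assumptions for explanation:

```python
# Illustrative sketch of conditional task generation prompt handling.
# NOT Bonito's real interface: the tags, separator, and function names
# below are hypothetical, chosen only to show the data flow.

PAIR_SEP = "<|pipe|>"  # assumed separator between generated input and output


def build_prompt(task_type: str, context: str) -> str:
    """Condition the generator on a task type and a raw text passage."""
    return f"<|tasktype|>\n{task_type}\n<|context|>\n{context}\n<|task|>\n"


def parse_generation(generated: str) -> dict:
    """Split generated text into an instruction-tuning (input, output) pair."""
    instruction, _, response = generated.partition(PAIR_SEP)
    return {"input": instruction.strip(), "output": response.strip()}
```

A specialized model would take `build_prompt("natural language inference", passage)` and emit both halves of a training example in one pass, e.g. a premise/hypothesis question followed by its label, which `parse_generation` then turns into a dataset row.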

823 stars. No commits in the last 6 months.

Flags: Stale (6 months) · No Package · No Dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 15 / 25


Stars: 823
Forks: 56
Language: Python
License: BSD-3-Clause
Last pushed: Jul 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BatsResearch/bonito"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.