ai4sd/multiscale-byte-lm
A hierarchical LM that scales to training on context windows of 5M+ tokens
24 / 100
Experimental
No Package
No Dependents
Maintenance
10 / 25
Adoption
5 / 25
Maturity
9 / 25
Community
0 / 25
Stars
9
Forks
—
Language
Python
License
MIT
Category
Last pushed
Feb 17, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ai4sd/multiscale-byte-lm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
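The same endpoint can be called from code. Below is a minimal sketch using only the Python standard library; the URL pattern is taken from the curl example above, while the helper names and the assumption that the response is JSON are illustrative, not part of the documented API.

```python
# Hypothetical helpers around the pt-edge quality endpoint shown above.
# The endpoint pattern comes from the page's curl example; the JSON
# response shape is an assumption.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"


def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """Fetch the quality record (keyless access: 100 requests/day)."""
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)


# Usage (performs a network request):
# record = fetch_quality("transformers", "ai4sd", "multiscale-byte-lm")
```

With a free API key (1,000 requests/day), the key would presumably be passed per the service's docs; how it is sent (header vs. query parameter) is not stated on this page.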
Higher-rated alternatives
Goekdeniz-Guelmez/mlx-lm-lora
Train Large Language Models on MLX.
72
uber-research/PPLM
Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models.
50
jarobyte91/pytorch_beam_search
A lightweight implementation of Beam Search for sequence models in PyTorch.
49
SmallDoges/small-doge
Doge Family of Small Language Models
48
VHellendoorn/Code-LMs
Guide to using pre-trained large language models of source code
48