AI-Hypercomputer/maxtext
A simple, performant and scalable Jax LLM!
2,169 stars and 1,029 monthly downloads. Actively maintained with 359 commits in the last 30 days. Available on PyPI.
Stars: 2,169
Forks: 485
Language: Python
License: Apache-2.0
Last pushed: Mar 13, 2026
Monthly downloads: 1,029
Commits (30d): 359
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/AI-Hypercomputer/maxtext"
Open to everyone: 100 requests/day with no key. A free key raises the limit to 1,000 requests/day.
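The curl call above can also be made from Python. This is a minimal sketch: only the endpoint URL comes from the listing; the response schema and the `Authorization: Bearer` header for keyed access are assumptions, not documented behavior.

```python
# Sketch of querying the quality endpoint shown above.
# Assumptions: the URL path pattern generalizes to other repos, and a
# free API key is sent as a Bearer token (neither is confirmed here).
import json
import urllib.request
from typing import Optional

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the per-repo quality endpoint URL."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem: str, owner: str, repo: str,
                  api_key: Optional[str] = None) -> dict:
    """Fetch quality data as a dict; pass api_key for the higher rate limit."""
    req = urllib.request.Request(quality_url(ecosystem, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")  # assumed auth scheme
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(quality_url("transformers", "AI-Hypercomputer", "maxtext"))
```

Without a key this stays within the 100 requests/day anonymous limit; the keyed path is only worth wiring up if you need the 1,000/day tier.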
Related models
mosaicml/llm-foundry
LLM training code for Databricks foundation models
rasbt/reasoning-from-scratch
Implement a reasoning LLM in PyTorch from scratch, step by step
mindspore-lab/mindnlp
MindSpore + 🤗Huggingface: Run any Transformers/Diffusers model on MindSpore with seamless...
ridgerchu/matmulfreellm
Implementation for MatMul-free LM.
CASE-Lab-UMD/LLM-Drop
The official implementation of the paper "Uncovering the Redundancy in Transformers via a...