keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
Supports multiple search algorithms, including Random Search, Bayesian Optimization, and Hyperband, and is extensible to custom algorithms. Uses a define-by-run syntax in which the hyperparameter search space is declared directly inside the model-building function, integrating with TensorFlow 2.x and the Keras Sequential and Functional APIs. Supports distributed tuning and records trial history and checkpoints for reproducible optimization workflows.
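To illustrate the define-by-run idea without depending on TensorFlow, here is a minimal, self-contained sketch: a hypothetical `HyperParameters` helper and a toy random-search loop stand in for KerasTuner's real classes, but the pattern is the same, hyperparameters are declared inline as the model-building function runs, and each declaration both registers a search dimension and returns a sampled value.

```python
import random

# Hypothetical stand-in for a tuner's hyperparameter object (NOT the
# real keras_tuner API): each method registers a search dimension and
# immediately returns a sampled value for this trial.
class HyperParameters:
    def __init__(self, rng):
        self.rng = rng
        self.values = {}

    def Int(self, name, min_value, max_value, step=1):
        value = self.rng.randrange(min_value, max_value + 1, step)
        self.values[name] = value
        return value

    def Choice(self, name, options):
        value = self.rng.choice(options)
        self.values[name] = value
        return value

# A model-building function in the define-by-run style: the search
# space (units, learning rate) exists only as calls inside the body.
def build_model(hp):
    units = hp.Int("units", 32, 512, step=32)
    lr = hp.Choice("learning_rate", [1e-2, 1e-3, 1e-4])
    return {"units": units, "learning_rate": lr}

# Toy random-search driver: rebuild the model several times with fresh
# samples, score each configuration, and keep the best trial.
def random_search(build_fn, score_fn, max_trials, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(max_trials):
        hp = HyperParameters(rng)
        config = build_fn(hp)
        score = score_fn(config)
        if best is None or score > best[0]:
            best = (score, dict(hp.values))
    return best

# Toy objective that just prefers more units; a real tuner would train
# and evaluate a Keras model here and use validation accuracy.
best_score, best_values = random_search(
    build_model, lambda cfg: cfg["units"], max_trials=5)
print(best_values)
```

In the real library the same shape appears as a `build_model(hp)` function handed to a tuner class, with the tuner, not the user loop, deciding how the next trial's values are sampled.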
2,917 stars and 209,372 monthly downloads. Used by 9 other packages. Available on PyPI.
Stars: 2,917
Forks: 404
Language: Python
License: Apache-2.0
Last pushed: Dec 01, 2025
Monthly downloads: 209,372
Commits (30d): 0
Dependencies: 6
Reverse dependents: 9
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/keras-team/keras-tuner"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
optuna/optuna
A hyperparameter optimization framework
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning
KernelTuner/kernel_tuner
Kernel Tuner
tensorflow/adanet
Fast and flexible AutoML with learning guarantees.