awslabs/adatune
Gradient-based hyperparameter tuning library in PyTorch
Score: 41 / 100 (Emerging)
291 stars. No commits in the last 6 months.
Flags: Stale (6m) · No Package · No Dependents
Score breakdown:
  Maintenance   0 / 25
  Adoption     10 / 25
  Maturity     16 / 25
  Community    15 / 25
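The overall rating appears to be the simple sum of the four 25-point pillars: 0 + 10 + 16 + 15 = 41, matching the 41 / 100 shown above. This additive model is an inference from the numbers on this page, not a documented formula.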
Stars          291
Forks          32
Language       Python
License        Apache-2.0
Category       ml-frameworks
Last pushed    Jul 17, 2020
Commits (30d)  0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/adatune"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
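For scripted access, a minimal Python sketch using the requests library is shown below. The URL is the one from the curl command above; the response is assumed to be JSON, and no field names are assumed because this page does not document the schema.

import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/adatune"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()  # surface 4xx/5xx errors, e.g. if the daily rate limit is hit
data = resp.json()       # assumed JSON; inspect the real schema before relying on field names
print(data)

Once you have inspected the actual response shape, replace print(data) with whatever downstream processing you need.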
Higher-rated alternatives
Repo                        Score  Description
optuna/optuna                  85  A hyperparameter optimization framework
keras-team/keras-tuner         78  A hyperparameter tuning library for Keras
syne-tune/syne-tune            77  Large-scale, asynchronous hyperparameter and architecture optimization
deephyper/deephyper            77  A Python package for massively parallel hyperparameter optimization in machine learning
KernelTuner/kernel_tuner       74  Kernel Tuner