tensorflow/adanet
Fast and flexible AutoML with learning guarantees.
Implements adaptive ensemble learning based on the AdaNet algorithm: it iteratively grows an ensemble of neural-network subnetworks, freezing the subnetworks from previous iterations, while optimizing an objective that comes with theoretical learning guarantees. It integrates with TensorFlow's Estimator API and supports multi-task learning (regression, classification), distributed training across CPU/GPU/TPU, and extensible subnetwork definitions via `tf.layers`. It also provides `AutoEnsembleEstimator` for learning optimal ensembles from user-defined heterogeneous models.
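The core idea, grow the ensemble one round at a time, keeping earlier members frozen and adding whichever candidate most improves the objective, can be illustrated with a toy sketch. This is plain Python, not the adanet API; the candidate "subnetworks" are stand-in functions, the mixture weights are uniform, and the objective is plain squared error rather than AdaNet's complexity-regularized objective.

```python
# Toy illustration (NOT the adanet API): grow an ensemble round by round,
# adding the candidate "subnetwork" that most reduces squared error.
# Previously added members stay frozen, as in AdaNet's iterative scheme.

def ensemble_predict(members, x):
    """Average the frozen members' predictions (uniform mixture weights)."""
    return sum(f(x) for f in members) / len(members)

def squared_error(members, data):
    """Squared error of the ensemble over (x, y) pairs."""
    return sum((ensemble_predict(members, x) - y) ** 2 for x, y in data)

def grow_ensemble(candidates, data, rounds):
    """At each round, append the candidate that best complements the frozen ensemble."""
    members = []
    for _ in range(rounds):
        best = min(candidates, key=lambda f: squared_error(members + [f], data))
        members.append(best)
    return members

# Target function y = 2x; candidates are fixed stand-in subnetworks of varying quality.
data = [(x, 2.0 * x) for x in range(1, 6)]
candidates = [lambda x: x, lambda x: 2.0 * x, lambda x: 3.0 * x]

ensemble = grow_ensemble(candidates, data, rounds=2)
print(squared_error(ensemble, data))  # → 0.0 (both rounds pick the exact candidate)
```

In the real library the candidates are trained networks of growing capacity, the mixture weights are learned, and the selection criterion trades training loss against subnetwork complexity; the loop structure, however, is the same.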
3,457 stars and 291 monthly downloads. No commits in the last 6 months. Available on PyPI.
Stars: 3,457
Forks: 527
Language: Jupyter Notebook
License: Apache-2.0
Category:
Last pushed: Nov 30, 2023
Monthly downloads: 291
Commits (30d): 0
Dependencies: 8
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/tensorflow/adanet"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
KernelTuner/kernel_tuner
Kernel Tuner