keyhankamyar/SpaX
Pythonic, type-safe search space configuration for HPO (hyperparameter optimization), NAS (neural architecture search), and ML experiment tracking. Define complex search spaces with conditional parameters, automatic validation, and zero boilerplate. Pydantic-based and Optuna-ready for hyperparameter tuning.
Available on PyPI.
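For context, here is a minimal sketch of the pattern the description points at: a type-safe, Pydantic-validated config sampled inside an Optuna objective, with one conditional parameter. It uses only Optuna's and Pydantic's public APIs; the names `TrainConfig` and `objective` are illustrative, and SpaX's own interface may differ from this.

```python
# Sketch of a typed, conditionally-parameterized search space.
# Uses plain Optuna + Pydantic; SpaX's actual API may differ.
import optuna
from pydantic import BaseModel, Field


class TrainConfig(BaseModel):
    # Type-safe hyperparameters with validated bounds.
    lr: float = Field(gt=0, lt=1)
    optimizer: str  # "sgd" or "adam"
    momentum: float = 0.0  # conditional: only meaningful for "sgd"


def objective(trial: optuna.Trial) -> float:
    optimizer = trial.suggest_categorical("optimizer", ["sgd", "adam"])
    cfg = TrainConfig(
        lr=trial.suggest_float("lr", 1e-5, 1e-1, log=True),
        optimizer=optimizer,
        # Conditional parameter: sampled only when its branch is active.
        momentum=trial.suggest_float("momentum", 0.0, 0.99)
        if optimizer == "sgd"
        else 0.0,
    )
    # Stand-in for a real training run; returns a score derived from cfg.
    return cfg.lr * (1 + cfg.momentum)


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```

Pydantic raises a validation error if a sampled value falls outside the declared bounds, which is the kind of automatic validation the description advertises.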
Stars: 9
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Oct 29, 2025
Monthly downloads: 17
Commits (30d): 0
Dependencies: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/keyhankamyar/SpaX"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
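The same endpoint can be queried from Python; a minimal sketch assuming the API returns JSON (the response schema is not documented here):

```python
# Fetch the quality data for this repo from the public endpoint.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/keyhankamyar/SpaX"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # fail loudly on rate limiting or server errors
print(resp.json())       # assumes a JSON body; schema not documented here
```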
Higher-rated alternatives
- optuna/optuna: A hyperparameter optimization framework
- keras-team/keras-tuner: A Hyperparameter Tuning Library for Keras
- deephyper/deephyper: DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning
- syne-tune/syne-tune: Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
- KernelTuner/kernel_tuner: Kernel Tuner