jmrichardson/tuneta
Intelligently optimizes technical indicators and optionally selects the least intercorrelated for use in machine learning models
457 stars and 431 monthly downloads. No commits in the last 6 months. Available on PyPI.
Stars: 457
Forks: 81
Language: Python
License: MIT
Category:
Last pushed: Oct 13, 2023
Monthly downloads: 431
Commits (30d): 0
Dependencies: 14
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/jmrichardson/tuneta"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
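The curl command above can also be issued from Python. A minimal sketch, assuming only the endpoint URL shown above; the JSON shape of the response and the mechanism for passing an API key are not documented here, so this performs the anonymous request and prints whatever comes back:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def api_url(category: str, owner: str, repo: str) -> str:
    """Build the endpoint URL shown above for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = api_url("ml-frameworks", "jmrichardson", "tuneta")

# Anonymous access is limited to 100 requests/day. The response schema
# is undocumented here, so we just pretty-print the returned JSON.
# Network errors are caught so the sketch degrades gracefully offline.
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(json.dumps(json.load(resp), indent=2))
except OSError as exc:
    print(f"request failed: {exc}")
```

Using the standard-library `urllib` keeps the example dependency-free; with `requests` installed, the fetch collapses to `requests.get(url).json()`.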
Related frameworks
optuna/optuna
A hyperparameter optimization framework
keras-team/keras-tuner
A Hyperparameter Tuning Library for Keras
syne-tune/syne-tune
Large scale and asynchronous Hyperparameter and Architecture Optimization at your fingertips.
deephyper/deephyper
DeepHyper: A Python Package for Massively Parallel Hyperparameter Optimization in Machine Learning
KernelTuner/kernel_tuner
Kernel Tuner