damn8daniel/gradient-boosting-from-scratch
XGBoost-like gradient boosting from scratch in NumPy. Newton boosting, histogram splits, L1/L2 regularization, custom losses. Beats sklearn on benchmarks.
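The description mentions Newton boosting with L2 regularization. As a hedged illustration of the idea (this is standard XGBoost-style math, not this repo's actual API), each tree is fit to per-sample gradients g and Hessians h of the loss, and with an L2 penalty lam the optimal leaf weight is w* = -sum(g) / (sum(h) + lam):

```python
import numpy as np

def newton_leaf_weight(g, h, lam=1.0):
    """Second-order (Newton) optimal leaf value with L2 regularization.

    Hypothetical helper for illustration; names are not from the repo.
    """
    return -g.sum() / (h.sum() + lam)

y = np.array([1.0, 2.0, 3.0])
pred = np.zeros(3)           # current ensemble prediction
g = pred - y                 # gradient of 0.5 * (pred - y)**2
h = np.ones_like(y)          # Hessian is 1 for squared error
w = newton_leaf_weight(g, h, lam=0.0)
# With lam=0 this reduces to the mean residual: w == 2.0
```

With lam > 0 the weight shrinks toward zero, which is how the L2 term regularizes leaf values.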
Stars: —
Forks: —
Language: Python
License: —
Category: —
Last pushed: Mar 21, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/damn8daniel/gradient-boosting-from-scratch"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
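The curl call above can also be made from Python with only the standard library. The response schema is not documented here, so this sketch just assumes a JSON body and prints it raw:

```python
import json
import urllib.request

# Same endpoint as the curl example above.
url = ("https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/"
       "damn8daniel/gradient-boosting-from-scratch")

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)          # assumes a JSON response body
        print(json.dumps(data, indent=2))
except (OSError, ValueError) as exc:    # network/HTTP error or non-JSON body
    print(f"request failed: {exc}")
```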
Higher-rated alternatives
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models
lightgbm-org/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...