xgboost and chefboost

XGBoost is a production-grade distributed gradient boosting library and would typically be chosen over Chefboost for serious machine learning work. The two compete directly where their tree-ensemble features overlap (gradient boosting, random forests, AdaBoost), even though Chefboost covers a broader range of classical decision tree algorithms.

                 xgboost               chefboost
Score            98 (Verified)         60 (Established)
Maintenance      23/25                 2/25
Adoption         25/25                 16/25
Maturity         25/25                 18/25
Community        25/25                 24/25
Stars            28,121                486
Forks            8,847                 101
Downloads        41,912,233            623
Commits (30d)    45                    0
Language         C++                   Python
License          Apache-2.0            MIT
Risk flags       None                  Stale 6m

About xgboost

dmlc/xgboost

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

Implements parallel tree boosting with built-in support for categorical features, missing value handling, and monotonic constraints without preprocessing. Uses a novel column-block structure for cache-aware tree construction and supports GPU acceleration via CUDA for faster training on large datasets. Integrates with ML platforms including scikit-learn, MLflow, and Optuna for hyperparameter optimization, with native support for feature importance analysis and SHAP explainability.

About chefboost

serengil/chefboost

A Lightweight Decision Tree Framework supporting regular algorithms: ID3, C4.5, CART, CHAID and Regression Trees; some advanced techniques: Gradient Boosting, Random Forest and Adaboost w/categorical features support for Python

Scores updated daily from GitHub, PyPI, and npm data.