LightGBM and chefboost

The two libraries compete in the same space: LightGBM is the more established and widely adopted gradient boosting framework, while ChefBoost offers a lightweight alternative supporting several decision tree algorithms and boosting techniques.

                  LightGBM          chefboost
Score             71 (Verified)     67 (Established)
Maintenance       20/25             2/25
Adoption          10/25             16/25
Maturity          16/25             25/25
Community         25/25             24/25
Stars             18,157            486
Forks             3,988             101
Downloads                           623
Commits (30d)     15                0
Language          C++               Python
License           MIT               MIT

Badges: No Package, No Dependents, Stale 6m

About LightGBM

lightgbm-org/LightGBM

A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.

Implements leaf-wise tree growth with histogram-based learning to reduce memory footprint and accelerate training on CPU and GPU hardware. Provides native bindings for Python, R, and C++, with ecosystem integrations including FLAML for AutoML, Optuna for hyperparameter tuning, and model compilers like Treelite and Hummingbird for production deployment.

About chefboost

serengil/chefboost

A lightweight decision tree framework for Python supporting regular algorithms (ID3, C4.5, CART, CHAID, and regression trees) and some advanced techniques (gradient boosting, random forest, and AdaBoost), with categorical feature support.

Scores updated daily from GitHub, PyPI, and npm data.