LightGBM and GPBoost

GPBoost extends LightGBM by combining its tree-boosting engine with Gaussian processes and mixed-effects models, so the two are complementary rather than competing tools.

                 LightGBM    GPBoost
Score            71          77
Status           Verified    Verified
Maintenance      20/25       16/25
Adoption         10/25       20/25
Maturity         16/25       25/25
Community        25/25       16/25
Stars            18,157      665
Forks            3,988       53
Downloads                    5,433
Commits (30d)    15          5
Language         C++         C++
License          MIT

About LightGBM

microsoft/LightGBM

A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.

Implements leaf-wise tree growth with histogram-based learning to reduce memory footprint and accelerate training on CPU and GPU hardware. Provides native bindings for Python, R, and C++, with ecosystem integrations including FLAML for AutoML, Optuna for hyperparameter tuning, and model compilers like Treelite and Hummingbird for production deployment.

About GPBoost

fabsig/GPBoost

Tree-Boosting, Gaussian Processes, and Mixed-Effects Models

Scores updated daily from GitHub, PyPI, and npm data.