xgboost and GPBoost

XGBoost is a mature, widely adopted general-purpose gradient boosting framework, while GPBoost extends the gradient boosting paradigm with Gaussian processes and mixed-effects models for specialized statistical use cases. The two are complementary rather than directly competitive.

|                | xgboost        | GPBoost    |
|----------------|----------------|------------|
| Overall score  | 98 (Verified)  | 77 (Verified) |
| Maintenance    | 23/25          | 16/25      |
| Adoption       | 25/25          | 20/25      |
| Maturity       | 25/25          | 25/25      |
| Community      | 25/25          | 16/25      |
| Stars          | 28,121         | 665        |
| Forks          | 8,847          | 53         |
| Downloads      | 41,912,233     | 5,433      |
| Commits (30d)  | 45             | 5          |
| Language       | C++            | C++        |
| License        | Apache-2.0     |            |
| Risk flags     | None           | None       |

About xgboost

dmlc/xgboost

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

Implements parallel tree boosting with built-in support for categorical features, missing value handling, and monotonic constraints without preprocessing. Uses a novel column-block structure for cache-aware tree construction and supports GPU acceleration via CUDA for faster training on large datasets. Integrates with ML platforms including scikit-learn, MLflow, and Optuna for hyperparameter optimization, with native support for feature importance analysis and SHAP explainability.

About GPBoost

fabsig/GPBoost

Tree-Boosting, Gaussian Processes, and Mixed-Effects Models

Scores updated daily from GitHub, PyPI, and npm data.