xgboost and GBM-perf
XGBoost is a gradient boosting library; GBM-perf is a benchmark suite that measures its training speed and accuracy against other open-source GBM implementations.
About xgboost
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
Implements parallel tree boosting with built-in support for categorical features, missing value handling, and monotonic constraints without preprocessing. Uses a novel column-block structure for cache-aware tree construction and supports GPU acceleration via CUDA for faster training on large datasets. Integrates with ML platforms including scikit-learn, MLflow, and Optuna for hyperparameter optimization, with native support for feature importance analysis and SHAP explainability.
About GBM-perf
szilard/GBM-perf
Performance of various open source GBM implementations