xgboost and XGBoostLSS
XGBoostLSS extends XGBoost's core gradient boosting engine to distributional predictions: instead of a single point estimate, it models the location, scale, and shape parameters of a response distribution. The two are therefore complementary tools used together rather than alternatives.
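To make the idea concrete, here is a toy sketch of distributional estimation (illustration only, not the XGBoostLSS API): it fits the location mu and log-scale of a Gaussian by gradient descent on the negative log-likelihood. XGBoostLSS minimizes the same kind of objective, but with one boosted tree ensemble per distribution parameter.

```python
import math

# Toy data; the "model" here is just two scalars, mu and s = log(sigma),
# standing in for the per-parameter tree ensembles XGBoostLSS would fit.
y = [1.8, 2.1, 2.4, 1.9, 2.6, 2.2]

mu, s = 0.0, 0.0   # initial location and log-scale
lr = 0.05
for _ in range(5000):
    sigma2 = math.exp(2 * s)
    # Gradients of the Gaussian negative log-likelihood:
    # dNLL/dmu = -(y - mu) / sigma^2
    # dNLL/ds  = 1 - (y - mu)^2 / sigma^2
    g_mu = sum(-(yi - mu) / sigma2 for yi in y) / len(y)
    g_s = sum(1 - (yi - mu) ** 2 / sigma2 for yi in y) / len(y)
    mu -= lr * g_mu
    s -= lr * g_s

sigma = math.exp(s)
# mu and sigma converge to the maximum-likelihood estimates
# (the sample mean and the population standard deviation).
```

The payoff of estimating sigma alongside mu is a full predictive distribution, so the model can report prediction intervals rather than a single number.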
About xgboost
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
Implements parallel tree boosting with built-in support for categorical features, missing value handling, and monotonic constraints without preprocessing. Uses a novel column-block structure for cache-aware tree construction and supports GPU acceleration via CUDA for faster training on large datasets. Integrates with ML platforms including scikit-learn, MLflow, and Optuna for hyperparameter optimization, with native support for feature importance analysis and SHAP explainability.
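The boosting loop underneath all of this can be sketched in a few lines of pure Python (illustration only, not XGBoost internals): least-squares gradient boosting with depth-1 regression stumps, where each round fits a new stump to the current residuals. XGBoost extends this loop with second-order gradients, regularization, and the parallel, cache-aware tree construction described above.

```python
def fit_stump(x, residuals):
    """Best single-split stump minimizing squared error on the residuals.
    Assumes x is sorted ascending."""
    best = None
    for i in range(1, len(x)):
        thr = (x[i - 1] + x[i]) / 2
        left = [r for xi, r in zip(x, residuals) if xi <= thr]
        right = [r for xi, r in zip(x, residuals) if xi > thr]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda xi: lm if xi <= thr else rm

def boost(x, y, n_rounds=50, lr=0.3):
    """Additive model: each round fits a stump to the current residuals."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

# A noisy step function: the ensemble should learn the jump near x = 2.5.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.1, 0.3, 0.2, 1.9, 2.1, 2.0]
model = boost(x, y)
```

For squared error, the negative gradient of the loss is exactly the residual, which is why "fit the residuals" and "gradient boosting" coincide in this special case.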
About XGBoostLSS
StatMixedML/XGBoostLSS
An extension of XGBoost to probabilistic modelling