shap and shap-analysis-guide

The first is the core library implementing SHAP explainability methods; the second is a non-technical guide to interpreting SHAP outputs. They are complementary: the guide helps users make sense of the results the library produces.

                 shap                 shap-analysis-guide
Score            92 (Verified)        40 (Emerging)
Maintenance      20/25                0/25
Adoption         25/25                8/25
Maturity         25/25                16/25
Community        22/25                16/25
Stars            25,115               58
Forks            3,481                11
Downloads        14,461,405
Commits (30d)    17                   0
Language         Jupyter Notebook     Jupyter Notebook
License          MIT                  MIT
Risk flags       None                 Stale 6m, No Package, No Dependents

About shap

shap/shap

A game theoretic approach to explain the output of any machine learning model.

Based on the README, here's a technical summary: Implements fast exact algorithms for tree ensemble models (XGBoost, LightGBM, CatBoost, scikit-learn, PySpark) via optimized C++ backends, alongside approximation methods for deep learning (DeepExplainer leveraging DeepLIFT) and NLP transformers using coalitional game rules. Provides multiple visualization outputs—waterfall plots, force plots, dependence scatter plots, and beeswarm distributions—to show feature contributions at instance and global levels. Integrates directly with popular ML frameworks and Hugging Face transformers, supporting both tabular and text-based model explanations.
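The "game theoretic approach" mentioned above refers to Shapley values: each feature is credited with its average marginal contribution across all orderings in which features could be added. As a minimal, library-free sketch of that idea (a toy O(n!) enumeration, not the optimized exact or approximate algorithms the shap library actually ships), with a hypothetical two-feature coalition payoff function:

```python
from itertools import permutations

def shapley_values(value_fn, features):
    """Exact Shapley values by averaging each feature's marginal
    contribution over all orderings (O(n!) -- toy sizes only)."""
    phi = {f: 0.0 for f in features}
    orderings = list(permutations(features))
    for order in orderings:
        coalition = set()
        for f in order:
            before = value_fn(coalition)       # payoff without feature f
            coalition = coalition | {f}
            phi[f] += value_fn(coalition) - before  # marginal contribution
    return {f: phi[f] / len(orderings) for f in features}

# Hypothetical "model": payoff for each coalition of features.
payoffs = {frozenset(): 0, frozenset({"a"}): 10,
           frozenset({"b"}): 20, frozenset({"a", "b"}): 40}

vals = shapley_values(lambda s: payoffs[frozenset(s)], ["a", "b"])
# vals == {"a": 15.0, "b": 25.0}
```

Note the efficiency property that makes SHAP plots additive: the attributions sum to the full coalition's payoff minus the empty coalition's (here 15 + 25 = 40), which is why waterfall and force plots can decompose a single prediction exactly into per-feature contributions.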

About shap-analysis-guide

AidanCooper/shap-analysis-guide

How to Interpret SHAP Analyses: A Non-Technical Guide

Scores updated daily from GitHub, PyPI, and npm data.