aerdem4/lofo-importance
Leave One Feature Out Importance
Iteratively removes each feature and retrains the model across validation folds to measure performance impact, providing model-agnostic importance scores that account for feature interactions and generalize to unseen data. Supports custom validation schemes, custom models (defaults to LightGBM), and feature grouping for high-dimensional features; also includes FLOFO, a faster permutation-based variant that groups samples during shuffling to avoid unrealistic feature value replacements.
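The core idea is simple enough to sketch without the library: compute a cross-validated baseline score, then drop one feature at a time, retrain, and record the score change. A minimal sketch using scikit-learn on synthetic data (the feature names and model choice here are illustrative, not lofo-importance's API):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic binary-classification data: 5 features, only the first 2 informative
X, y = make_classification(n_samples=500, n_features=5, n_informative=2,
                           n_redundant=0, shuffle=False, random_state=0)
cv = KFold(n_splits=4, shuffle=True, random_state=0)
model = RandomForestClassifier(n_estimators=50, random_state=0)

# Baseline: cross-validated score with all features present
baseline = cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()

# LOFO loop: drop each feature in turn and measure the score drop
importances = {}
for i in range(X.shape[1]):
    X_drop = np.delete(X, i, axis=1)
    score = cross_val_score(model, X_drop, y, cv=cv, scoring="roc_auc").mean()
    importances[f"f{i}"] = baseline - score  # positive => removing it hurts

for name, imp in sorted(importances.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {imp:+.4f}")
```

Because each feature is actually removed and the model retrained, redundant features correctly receive low (or negative) importance, unlike single-model impurity or permutation scores computed on a fixed fit.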
863 stars and 1,583 monthly downloads. No commits in the last 6 months. Available on PyPI.
Stars: 863
Forks: 83
Language: Python
License: MIT
Category:
Last pushed: Feb 14, 2025
Monthly downloads: 1,583
Commits (30d): 0
Dependencies: 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/aerdem4/lofo-importance"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
shap/shap
A game theoretic approach to explain the output of any machine learning model.
mmschlk/shapiq
Shapley Interactions and Shapley Values for Machine Learning
predict-idlab/powershap
A power-full Shapley feature selection method.
linkedin/FastTreeSHAP
Fast SHAP value computation for interpreting tree-based models
iancovert/sage
For calculating global feature importance using Shapley values.