pytorch-grad-cam and Explore-Deep-Network-Explainability-Using-an-App

|                | pytorch-grad-cam | Explore-Deep-Network-Explainability-Using-an-App |
|----------------|------------------|--------------------------------------------------|
| Maintenance    | 2/25             | 0/25                                             |
| Adoption       | 23/25            | 7/25                                             |
| Maturity       | 25/25            | 9/25                                             |
| Community      | 22/25            | 15/25                                            |
| Stars          | 12,682           | 37                                               |
| Forks          | 1,694            | 7                                                |
| Downloads      | 58,294           |                                                  |
| Commits (30d)  | 0                | 0                                                |
| Language       | Python           | MATLAB                                           |
| License        | MIT              |                                                  |
| Status         | Stale (6 months) | Stale (6 months); no package, no dependents      |

About pytorch-grad-cam

jacobgil/pytorch-grad-cam

Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, Image similarity and more.

Implements 16+ attribution methods, ranging from gradient-based approaches (GradCAM, GradCAM++) to perturbation-based techniques (AblationCAM, ScoreCAM), with batched inference for high performance. Built on PyTorch, it supports explainability across diverse architectures, including CNNs, Vision Transformers, and multimodal models like CLIP, and includes built-in metrics and smoothing algorithms to validate and refine explanation quality. It also handles medical imaging and embedding-similarity tasks, and provides Deep Feature Factorization for interpretable representation analysis.
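To make the gradient-based family concrete, here is a minimal NumPy sketch of the core Grad-CAM computation (not the library's API): channel weights are the spatially averaged gradients of the target score with respect to a convolutional layer's activations, and the heatmap is a ReLU'd weighted sum of those activation maps.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Core Grad-CAM step for one image.

    activations, gradients: (K, H, W) arrays taken from a chosen
    conv layer (activations on the forward pass, gradients of the
    target class score on the backward pass).
    """
    # Channel importance = gradient averaged over spatial positions
    weights = gradients.mean(axis=(1, 2))                      # (K,)
    # Weighted sum over channels, then ReLU to keep positive evidence
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0)
    if cam.max() > 0:
        cam = cam / cam.max()                                  # scale to [0, 1]
    return cam

# Toy example: two channels on a 4x4 feature map
acts = np.ones((2, 4, 4))
grads = np.stack([2 * np.ones((4, 4)), np.ones((4, 4))])
heatmap = grad_cam(acts, grads)
```

In the library itself the activation/gradient capture is done with hooks and the result is upsampled onto the input image; perturbation-based methods like AblationCAM replace the gradient step with measured score changes.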

About Explore-Deep-Network-Explainability-Using-an-App

matlab-deep-learning/Explore-Deep-Network-Explainability-Using-an-App

This repository provides an app for exploring the predictions of an image classification network using several deep learning visualization techniques. Using the app, you can explore network predictions with occlusion sensitivity, Grad-CAM, and gradient attribution methods; investigate misclassifications using confusion matrices and t-SNE plots; visualize layer activations; and apply many more techniques to help you understand and explain your deep network's predictions.
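Occlusion sensitivity, the first technique the app offers, is simple enough to sketch framework-free. The idea: mask one patch of the input at a time and record how much the model's class score drops; large drops mark regions the prediction depends on. This is a hedged NumPy illustration with a stand-in scoring function, not the app's MATLAB implementation.

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4, baseline=0.0):
    """Slide a patch-sized occluder over a 2-D image and return a
    coarse heatmap of score drops (one cell per patch)."""
    h, w = image.shape
    base_score = score_fn(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = baseline  # mask one patch
            heat[i // patch, j // patch] = base_score - score_fn(occluded)
    return heat

# Toy "model": the score is the mean intensity of the top-left quadrant,
# so only occluding that quadrant should register a drop.
score = lambda img: img[:4, :4].mean()
img = np.ones((8, 8))
heat = occlusion_map(img, score)
```

A real run would call the network once per occluded image, which is why coarse patch sizes are used in practice.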

Scores are updated daily from GitHub, PyPI, and npm data.