pytorch-grad-cam and Explore-Deep-Network-Explainability-Using-an-App
About pytorch-grad-cam
jacobgil/pytorch-grad-cam
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, Image similarity and more.
It implements 16+ attribution methods, ranging from gradient-based approaches (GradCAM, GradCAM++) to perturbation-based techniques (AblationCAM, ScoreCAM), with batched inference for high performance. Built on PyTorch, it supports explainability across diverse architectures, including CNNs, Vision Transformers, and multimodal models such as CLIP, and includes built-in metrics and smoothing algorithms to validate and refine explanation quality. It also covers medical imaging and embedding-similarity use cases, and provides Deep Feature Factorization for interpretable representation analysis.
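To make the API concrete, here is a minimal sketch of running GradCAM on a ResNet-50, based on the project's documented usage (exact signatures can differ between releases, and the random image below is a stand-in for a real preprocessed input):

```python
import numpy as np
import torch
from torchvision.models import resnet50
from pytorch_grad_cam import GradCAM
from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget
from pytorch_grad_cam.utils.image import show_cam_on_image

model = resnet50(pretrained=True).eval()
# The last convolutional block is the usual target layer for ResNet-50.
target_layers = [model.layer4[-1]]

# Stand-in input; in practice, load a real image and normalize it.
rgb_img = np.random.rand(224, 224, 3).astype(np.float32)  # floats in [0, 1]
input_tensor = torch.from_numpy(rgb_img).permute(2, 0, 1).unsqueeze(0)

cam = GradCAM(model=model, target_layers=target_layers)
# Explain ImageNet class 281 ("tabby cat"); pass targets=None to explain
# the model's top prediction instead.
grayscale_cam = cam(input_tensor=input_tensor,
                    targets=[ClassifierOutputTarget(281)])  # (N, H, W) heatmaps
overlay = show_cam_on_image(rgb_img, grayscale_cam[0], use_rgb=True)
```

Because the CAM classes share a common interface, swapping GradCAM for AblationCAM or ScoreCAM in the import is typically all it takes to compare gradient-based and perturbation-based explanations.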
About Explore-Deep-Network-Explainability-Using-an-App
matlab-deep-learning/Explore-Deep-Network-Explainability-Using-an-App
This repository provides an app for exploring the predictions of an image classification network using several deep learning visualization techniques. Using the app, you can explore network predictions with occlusion sensitivity, Grad-CAM, and gradient attribution methods; investigate misclassifications using confusion matrices and t-SNE plots; visualize layer activations; and apply several other techniques that help you understand and explain your deep network's predictions (the occlusion idea is sketched below).
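The app itself is MATLAB-based, but the occlusion-sensitivity idea it exposes is simple enough to sketch framework-agnostically; the Python below is a hypothetical illustration of the technique, not code from this repository:

```python
import torch
import torch.nn.functional as F

def occlusion_sensitivity(model, image, target_class, patch=16, stride=8):
    """Slide a gray patch over the image and record how much the target
    class score drops at each position; a large drop means the hidden
    region mattered. `image` is a (1, C, H, W) tensor."""
    model.eval()
    with torch.no_grad():
        base = F.softmax(model(image), dim=1)[0, target_class].item()
        _, _, h, w = image.shape
        rows = (h - patch) // stride + 1
        cols = (w - patch) // stride + 1
        heatmap = torch.zeros(rows, cols)
        for i, y in enumerate(range(0, h - patch + 1, stride)):
            for j, x in enumerate(range(0, w - patch + 1, stride)):
                occluded = image.clone()
                occluded[:, :, y:y + patch, x:x + patch] = 0.5  # gray patch
                score = F.softmax(model(occluded), dim=1)[0, target_class].item()
                heatmap[i, j] = base - score  # importance of hidden region
    return heatmap
```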