Understanding NN

This repository is intended as a tutorial on various DNN interpretation and explanation techniques. The Jupyter notebooks cover both the theoretical background and step-by-step TensorFlow implementations for practical use. I did not include explanations for techniques whose algorithms and original papers I found sufficiently clear on their own.

UPDATE

It seems that GitHub is unable to render some of the equations in the notebooks. I strongly recommend using nbviewer until I find out what the problem is (you can also download the repository and view the notebooks locally). Links are listed below.

Nbviewer Links

1.1 Activation Maximization

1.3 Performing AM in Code Space

2.1 Sensitivity Analysis

2.2 Simple Taylor Decomposition

2.3 Layer-wise Relevance Propagation Part 1

2.3 Layer-wise Relevance Propagation Part 2

2.4 Deep Taylor Decomposition Part 1

2.4 Deep Taylor Decomposition Part 2

2.5 DeepLIFT

3.1 Deconvolution

3.2 Backpropagation

3.3 Guided Backpropagation

3.4 Integrated Gradients

3.5 SmoothGrad

4.1 Class Activation Map

4.2 Grad-CAM

4.3 Grad-CAM++

5.1 Explanation Continuity

5.2 Explanation Selectivity

1 Activation Maximization

This section focuses on interpreting a concept learned by a deep neural network (DNN) through activation maximization.
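
As a rough illustration of the idea, the sketch below performs plain gradient ascent on an input image to maximize the logit of a chosen class, with a small L2 penalty to keep the optimized image well behaved. It uses TensorFlow 2's eager API rather than the TF1 graph code in the notebooks, and model, class_idx, and the 28x28 input shape are hypothetical placeholders.

    import tensorflow as tf

    def activation_maximization(model, class_idx, steps=200, lr=0.1, l2=1e-4):
        """Gradient ascent on the input to maximize a single class logit."""
        x = tf.Variable(tf.random.normal([1, 28, 28, 1]))  # start from random noise
        for _ in range(steps):
            with tf.GradientTape() as tape:
                logit = model(x)[0, class_idx]
                # the L2 penalty acts as a simple regularizer on the optimized image
                objective = logit - l2 * tf.reduce_sum(tf.square(x))
            grad = tape.gradient(objective, x)
            x.assign_add(lr * grad)          # ascend the objective
        return x.numpy()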

1.1 Activation Maximization (AM)

1.3 Performing AM in Code Space

2 Layer-wise Relevance Propagation

In this section, we first introduce the concept of a relevance score with Sensitivity Analysis, explore basic relevance decomposition with Simple Taylor Decomposition, and then build up to various Layer-wise Relevance Propagation methods such as Deep Taylor Decomposition and DeepLIFT.
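
To make the propagation idea concrete, here is a minimal NumPy sketch of a single LRP backward step through a fully connected ReLU layer using the z+ rule (the rule that also arises in Deep Taylor Decomposition). The variable names are hypothetical and the sketch ignores biases; it is meant only to show how relevance from a layer's output is redistributed to its inputs in proportion to their positive contributions.

    import numpy as np

    def lrp_zplus_dense(a, w, relevance, eps=1e-9):
        """One LRP backward step for a dense ReLU layer (z+ rule)."""
        w_pos = np.maximum(w, 0.0)       # keep only excitatory weights
        z = a @ w_pos + eps              # positive pre-activations, shape (batch, out)
        s = relevance / z                # relevance per unit of pre-activation
        c = s @ w_pos.T                  # redistribute back through the weights
        return a * c                     # relevance assigned to the layer's inputs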

2.1 Sensitivity Analysis

2.2 Simple Taylor Decomposition

2.3 Layer-wise Relevance Propagation

2.4 Deep Taylor Decomposition

2.5 DeepLIFT

3 Gradient Based Methods

Implementations of various gradient-based visualization methods, such as Deconvolution, Backpropagation, Guided Backpropagation, Integrated Gradients, and SmoothGrad.
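
These methods differ mainly in how the gradient of the class score with respect to the input is computed or post-processed. As one illustrative example, the sketch below implements SmoothGrad, which simply averages the input gradient over several noisy copies of the image; it assumes a TensorFlow 2 / Keras-style model and is not the exact code used in the notebooks.

    import tensorflow as tf

    def smoothgrad(model, x, class_idx, n_samples=25, noise_std=0.15):
        """Average the input gradient over noisy copies of x to reduce noise."""
        x = tf.convert_to_tensor(x, dtype=tf.float32)
        grads = []
        for _ in range(n_samples):
            noisy = x + tf.random.normal(tf.shape(x), stddev=noise_std)
            with tf.GradientTape() as tape:
                tape.watch(noisy)                      # noisy input is not a Variable
                logit = model(noisy)[0, class_idx]
            grads.append(tape.gradient(logit, noisy))
        return tf.reduce_mean(tf.stack(grads), axis=0).numpy()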

3.1 Deconvolution

3.2 Backpropagation

3.3 Guided Backpropagation

3.4 Integrated Gradients

3.5 SmoothGrad

4 Class Activation Map

Implementations of Class Activation Map (CAM) and its generalized versions, Grad-CAM and Grad-CAM++, on the cluttered MNIST dataset.
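
For reference, the essence of Grad-CAM fits in a few lines: take the feature maps of the last convolutional layer, weight each map by the spatially averaged gradient of the class score with respect to that map, sum, and pass the result through a ReLU. The sketch below assumes a TensorFlow 2 / Keras functional model; conv_layer_name and the other names are hypothetical placeholders, not the ones used in the notebooks.

    import tensorflow as tf

    def grad_cam(model, conv_layer_name, x, class_idx):
        """Class activation map from pooled gradients of the class score."""
        grad_model = tf.keras.Model(
            model.inputs,
            [model.get_layer(conv_layer_name).output, model.output])
        with tf.GradientTape() as tape:
            conv_out, preds = grad_model(x)
            score = preds[:, class_idx]
        grads = tape.gradient(score, conv_out)         # shape (1, H, W, K)
        weights = tf.reduce_mean(grads, axis=(1, 2))   # global average pooling
        cam = tf.nn.relu(tf.einsum('bhwk,bk->bhw', conv_out, weights))
        return (cam / (tf.reduce_max(cam) + 1e-9)).numpy()[0]   # normalized heatmap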

4.1 Class Activation Map

4.2 Grad-CAM

4.3 Grad-CAM++

5 Quantifying Explanation Quality

While each explanation technique is based on its own intuition or mathematical principle, it is also important to define, at a more abstract level, what characterizes a good explanation, and to be able to test for these characteristics quantitatively. Sections 5.1 and 5.2 present two important properties of an explanation, along with possible evaluation metrics.
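
Selectivity, for example, is often evaluated with a pixel-flipping style experiment: remove (or perturb) the input features the explanation marks as most relevant first, and watch how quickly the class score drops. The helper below is an illustrative NumPy sketch of that procedure, where predict is a hypothetical function returning the class score for a flattened input and removal is simulated by setting features to zero; a steeper drop in the returned curve indicates a more selective explanation.

    import numpy as np

    def selectivity_curve(predict, x, relevance, n_steps=20):
        """Remove the most relevant features first and record the class score."""
        x = x.copy().ravel()
        order = np.argsort(relevance.ravel())[::-1]    # most relevant first
        chunk = max(1, order.size // n_steps)
        scores = [predict(x)]
        for start in range(0, order.size, chunk):
            x[order[start:start + chunk]] = 0.0        # "flip" the next batch of features
            scores.append(predict(x))
        return np.array(scores)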

5.1 Explanation Continuity

5.2 Explanation Selectivity

Explanation Technique Comparison Graph

Prerequisites

This tutorial requires TensorFlow, NumPy, Matplotlib, and OpenCV.

References

Sections 1.1 ~ 2.2 and 5.1 ~ 5.2

[1] Montavon, G., Samek, W., Müller, K.-R., 2017. Methods for Interpreting and Understanding Deep Neural Networks. arXiv preprint arXiv:1706.07979.

Section 1.3

[2] Nguyen, A., Dosovitskiy, A., Yosinski, J., Brox, T., Clune, J., 2016. Synthesizing the preferred inputs for neurons in neural networks via deep generator networks. In: Advances in Neural Information Processing Systems 29: Annual Conference on Neural Information Processing Systems 2016, December 5-10, 2016, Barcelona, Spain. pp. 3387-3395.

[3] A. Dosovitskiy and T. Brox. Generating images with perceptual similarity metrics based on deep networks. In NIPS, 2016.

Section 2.3

[4] Bach, S., Binder, A., Montavon, G., Klauschen, F., Müller, K.R., 2015. On pixel-wise explanations for non-linear classifier decisions by layer-wise relevance propagation. PLOS ONE 10 (7), 1-46.

Section 2.4

[5] Montavon, G., Lapuschkin, S., Binder, A., Samek, W., Müller, K.R., 2017. Explaining nonlinear classification decisions with deep Taylor decomposition. Pattern Recognition 65, 211-222.

Section 2.5

[6] Avanti Shrikumar, Peyton Greenside, and Anshul Kundaje. Learning Important Features Through Propagating Activation Differences. arXiv preprint arXiv:1704.02685, 2017.

Section 3.1

[7] Zeiler, M. D., Fergus, R., 2014. Visualizing and understanding convolutional networks. In: Computer Vision - ECCV 2014 - 13th European Conference, Zurich, Switzerland, September 6-12, 2014, Proceedings, Part I. pp. 818-833.

Section 3.2

[8] K. Simonyan, A. Vedaldi, and A. Zisserman. Deep inside convolutional networks: Visualising image classification models and saliency maps. In Workshop at International Conference on Learning Representations, 2014.

Section 3.3

[9] Jost Tobias Springenberg, Alexey Dosovitskiy, Thomas Brox, and Martin Riedmiller. Striving for simplicity: The all convolutional net. arXiv preprint arXiv:1412.6806, 2014.

Section 3.4

[10] Mukund Sundararajan, Ankur Taly, and Qiqi Yan. Axiomatic attribution for deep networks. arXiv preprint arXiv:1703.01365, 2017.

Section 3.5

[11] Daniel Smilkov, Nikhil Thorat, Been Kim, Fernanda Viégas, and Martin Wattenberg. SmoothGrad: removing noise by adding noise. arXiv preprint arXiv:1706.03825, 2017.

Section 4.1

[12] Bolei Zhou, Aditya Khosla, Agata Lapedriza, Aude Oliva, and Antonio Torralba. Learning deep features for discriminative localization. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2921–2929, 2016.

Section 4.2

[13] R. R. Selvaraju, A. Das, R. Vedantam, M. Cogswell, D. Parikh, and D. Batra. Grad-CAM: Why did you say that? Visual explanations from deep networks via gradient-based localization. arXiv preprint arXiv:1611.01646, 2016.

Section 4.3

[14] A. Chattopadhyay, A. Sarkar, P. Howlader, and V. N. Balasubramanian. Grad-CAM++: Generalized gradient-based visual explanations for deep convolutional networks. CoRR, abs/1710.11063, 2017.
