hiranumn / Integratedgradients

License: MIT
Python/Keras implementation of integrated gradients, presented in "Axiomatic Attribution for Deep Networks", for explaining any model defined in the Keras framework.

Projects that are alternatives of or similar to Integratedgradients

Spyketorch
High-speed simulator of convolutional spiking neural networks with at most one spike per neuron.
Stars: ✭ 194 (-1.52%)
Mutual labels:  jupyter-notebook
Intrinsic Dimension
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Ml
Codes related to various ML Hackathons
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Veridefteri
Short notes from Veri Defteri (www.veridefteri.com)
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Explainx
Explainable AI framework for data scientists. Explain & debug any blackbox machine learning model with a single line of code.
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Svhnclassifier
A TensorFlow implementation of Multi-digit Number Recognition from Street View Imagery using Deep Convolutional Neural Networks (http://arxiv.org/pdf/1312.6082.pdf)
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Lernapparat
Various Notebooks for Lernapparat.de
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Atari Model Zoo
A binary release of deep reinforcement learning models trained on the Atari machine learning benchmark, together with software for easy visualization and analysis of the models and comparison across training algorithms.
Stars: ✭ 198 (+0.51%)
Mutual labels:  jupyter-notebook
Mnist tutorial
A tutorial for MNIST handwritten digit classification using sklearn, PyTorch and Keras.
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Analytics Zoo
Distributed TensorFlow, Keras and PyTorch on Apache Spark/Flink & Ray
Stars: ✭ 2,448 (+1142.64%)
Mutual labels:  jupyter-notebook
Introml
Python tutorials for introduction to machine learning
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Keras Acgan
Auxiliary Classifier Generative Adversarial Networks in Keras
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Ddpg
Implementation of Deep Deterministic Policy Gradients using TensorFlow and OpenAI Gym
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Pynq workshop
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Text detector
Text detection model that combines Retinanet with textboxes++ for OCR
Stars: ✭ 198 (+0.51%)
Mutual labels:  jupyter-notebook
Biological learning
Example of "biological" learning for MNIST
Stars: ✭ 196 (-0.51%)
Mutual labels:  jupyter-notebook
Coursera Applied Data Science With Python
Repository for the Coursera specialization Applied Data Science with Python by the University of Michigan
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook
Up Down Captioner
Automatic image captioning model based on Caffe, using features from bottom-up attention.
Stars: ✭ 195 (-1.02%)
Mutual labels:  jupyter-notebook
Data Science Projects With Python
A Case Study Approach to Successful Data Science Projects Using Python, Pandas, and Scikit-Learn
Stars: ✭ 198 (+0.51%)
Mutual labels:  jupyter-notebook
Miscellaneous
Scripts and code examples. Includes Spark notes, Jupyter notebook examples for Spark, Impala and Oracle.
Stars: ✭ 197 (+0%)
Mutual labels:  jupyter-notebook

Integrated Gradients

Python implementation of integrated gradients [1]. The algorithm "explains" a prediction of a Keras-based deep learning model by approximating Aumann–Shapley values for the input features. These values allocate the difference between the model's prediction for a reference input (all zeros by default) and its prediction for the current sample across the input features. A TensorFlow version is also implemented.
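
For reference, the attribution that the paper [1] assigns to feature i is the path integral of the model's gradient along the straight line from the reference x' to the sample x, approximated in practice with a Riemann sum:

\mathrm{IG}_i(x) = (x_i - x'_i) \int_0^1 \frac{\partial F\bigl(x' + \alpha (x - x')\bigr)}{\partial x_i}\, d\alpha \;\approx\; \frac{x_i - x'_i}{m} \sum_{k=1}^{m} \frac{\partial F\bigl(x' + \tfrac{k}{m}(x - x')\bigr)}{\partial x_i}

where F is the model and m is the number of interpolation steps.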

Usage

Using Integrated_Gradients is easy: there is no need to modify your Keras model.
Here is a minimal working example on the UCI Iris data.

  1. Build your own Keras model and train it. Make sure to compile it!

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation

from IntegratedGradients import *

# Load the UCI Iris features; the file ends with an empty line, hence the [:-1].
X = np.array([[float(j) for j in i.rstrip().split(",")[:-1]]
              for i in open("iris.data").readlines()][:-1])
# Binary labels: setosa/versicolor (first 100 rows) vs. virginica (last 50).
Y = np.array([0 for i in range(100)] + [1 for i in range(50)])

model = Sequential([
    Dense(1, input_dim=4),
    Activation('sigmoid'),
])
model.compile(optimizer='sgd', loss='binary_crossentropy')
model.fit(X, Y, epochs=300, batch_size=10, validation_split=0.2, verbose=0)
  2. Wrap the model with an integrated_gradients instance.

ig = integrated_gradients(model)

  3. Call explain() with a sample to explain (a library-independent sketch of the computation behind explain() follows below).

ig.explain(X[0])
==> array([-0.25757075, -0.24014562,  0.12732635,  0.00960122])
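
For intuition, here is a minimal, library-independent sketch of the Riemann-sum approximation that explain() performs. The logistic model f, its weights w, and the step count m below are illustrative assumptions, not part of this library; the closed-form gradient stands in for the backend's automatic differentiation.

import numpy as np

# Toy logistic model f(x) = sigmoid(w . x) with an analytic gradient.
w = np.array([0.5, -1.0, 2.0, 0.25])

def f(x):
    return 1.0 / (1.0 + np.exp(-np.dot(w, x)))

def grad_f(x):
    s = f(x)
    return s * (1.0 - s) * w  # d/dx of sigmoid(w . x)

def ig_attributions(x, reference=None, m=50):
    # Riemann-sum approximation of the path integral from the reference to x.
    if reference is None:
        reference = np.zeros_like(x)  # all-zero reference, as in this library's default
    diff = x - reference
    grads = np.array([grad_f(reference + (float(k) / m) * diff)
                      for k in range(1, m + 1)])
    return diff * grads.mean(axis=0)

x = np.array([5.1, 3.5, 1.4, 0.2])  # a point in the Iris feature space
attr = ig_attributions(x)
# Completeness check: attributions sum (approximately) to f(x) - f(reference).
print(attr, attr.sum(), f(x) - f(np.zeros_like(x)))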

Features

  • Supports both Sequential() and Model() instances.
  • Supports both TensorFlow and Theano backends.
  • Works on models with multiple outputs.
  • Works on models with multiple input branches (a hedged sketch follows below).
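
As a hypothetical illustration of the multi-input case, the sketch below builds a toy two-branch Model and wraps it as above. The assumption that explain() takes a list of arrays, one per input branch, is mine; consult the multi-input example notebook for the exact call.

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense, concatenate
from IntegratedGradients import *

# Toy two-branch model: two input sources merged before a single output.
in_a = Input(shape=(4,))
in_b = Input(shape=(3,))
merged = concatenate([Dense(8, activation='relu')(in_a),
                      Dense(8, activation='relu')(in_b)])
out = Dense(1, activation='sigmoid')(merged)
model = Model(inputs=[in_a, in_b], outputs=out)
model.compile(optimizer='sgd', loss='binary_crossentropy')

ig = integrated_gradients(model)
# Assumed call shape (hypothetical): one sample per branch, passed as a list.
explanations = ig.explain([np.random.rand(4), np.random.rand(3)])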

Example notebooks

  • A more thorough example can be found here.
  • There is also an example of running this on the VGG16 model.
  • If your network has multiple input sources (branches), you can take a look at this.

MNIST example

We trained a simple CNN model (one conv layer and one dense layer) on the MNIST images. Here are some results of running integrated_gradients on the trained model to explain individual samples.
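
Below is a minimal sketch of a comparable setup; the exact architecture, training settings, and visualization used for the figure are not given, so those details are assumptions.

import matplotlib.pyplot as plt
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.utils import to_categorical
from IntegratedGradients import *

# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0

# A simple CNN: one conv layer and one dense layer, as described above.
model = Sequential([
    Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    Flatten(),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.fit(x_train, to_categorical(y_train), epochs=1, batch_size=128, verbose=0)

# Explain one test digit and plot the attribution as a heatmap.
# Note: for a multi-output model, explain() may need an output index; see the repo.
ig = integrated_gradients(model)
attr = ig.explain(x_test[0])
plt.imshow(attr.reshape(28, 28), cmap='seismic')
plt.title('Integrated gradients for a test digit')
plt.show()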

[Figure: integrated-gradients attribution heatmaps for several MNIST test digits]

References

  1. Sundararajan, Mukund, Ankur Taly, and Qiqi Yan. "Axiomatic Attribution for Deep Networks." arXiv preprint arXiv:1703.01365 (2017).

Email me at hiranumn at cs dot washington dot edu for questions.
