Integrated Gradients

(a.k.a. Path-Integrated Gradients, a.k.a. Axiomatic Attribution for Deep Networks)

Contact: integrated-gradients AT gmail.com

Contributors (alphabetical by last name):

  • Kedar Dhamdhere (Google)
  • Pramod Kaushik Mudrakarta (U. Chicago)
  • Mukund Sundararajan (Google)
  • Ankur Taly (Google Brain)
  • Jinhua (Shawn) Xu (Verily)

We study the problem of attributing the prediction of a deep network to its input features, as an attempt towards explaining individual predictions. For instance, in an object recognition network, an attribution method could tell us which pixels of the image were responsible for a certain label being picked, or which words from a sentence were indicative of strong sentiment.

Applications range from helping developers debug a network, to letting analysts explore its logic, to giving end users some transparency into the reason for a network's prediction.

Integrated Gradients is a variation on computing the gradient of the prediction output w.r.t. features of the input. It requires no modification to the original network, is simple to implement, and is applicable to a variety of deep models (sparse and dense, text and vision).
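Concretely, the method interpolates between a baseline input and the actual input, averages the gradients along that path, and scales by the input difference. The sketch below illustrates this on a toy differentiable function (a hypothetical stand-in for a network's output; the function and its gradient are invented for illustration), including the completeness property that attributions sum to the difference in output between the input and the baseline:

```python
import numpy as np

# Toy "model": F(x) = x0^2 + 3*x1 stands in for a deep network's output score.
def F(x):
    return x[0] ** 2 + 3 * x[1]

def grad_F(x):
    # Analytic gradient; for a real network this is one backprop call.
    return np.array([2 * x[0], 3.0])

def integrated_gradients(x, baseline, grad_fn, steps=100):
    # Approximate the path integral with a Riemann sum (midpoint rule)
    # over inputs interpolated between the baseline and x.
    alphas = (np.arange(steps) + 0.5) / steps
    grads = np.array([grad_fn(baseline + a * (x - baseline)) for a in alphas])
    return (x - baseline) * grads.mean(axis=0)

x = np.array([2.0, 1.0])
baseline = np.zeros_like(x)
ig = integrated_gradients(x, baseline, grad_F)
# Completeness: the attributions sum to F(x) - F(baseline).
print(ig, ig.sum(), F(x) - F(baseline))
```

With more interpolation steps the Riemann sum converges to the exact path integral; in practice a few dozen to a few hundred steps are typically enough.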

Relevant papers and slide decks

  • Axiomatic Attribution for Deep Networks -- Mukund Sundararajan, Ankur Taly, Qiqi Yan, Proceedings of International Conference on Machine Learning (ICML), 2017

    This paper introduced the Integrated Gradients method. It presents an axiomatic justification of the method along with applications to various deep networks. Slide deck

  • Did the model understand the questions? -- Pramod Mudrakarta, Ankur Taly, Mukund Sundararajan, Kedar Dhamdhere, Proceedings of Association of Computational Linguistics (ACL), 2018

    This paper discusses an application of integrated gradients for evaluating the robustness of question-answering networks. Slide deck

Implementing Integrated Gradients

This How-To document describes the steps involved in implementing integrated gradients for an arbitrary deep network.

This repository provides code for implementing integrated gradients for networks with image inputs.
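For image inputs, the same recipe applies: take a black image as the baseline, batch all the interpolated images through the network's gradient in one pass, then average and scale. The sketch below uses a hypothetical linear scoring function in place of a real network (the weights and shapes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a vision network: a linear score w . flatten(img),
# whose gradient w.r.t. the image is simply w.
H, W, C = 8, 8, 3
w = rng.normal(size=(H, W, C))

def batch_gradients(images):
    # For a real network: one backprop over a batch of interpolated images.
    return np.broadcast_to(w, images.shape)

def integrated_gradients_image(image, steps=50):
    baseline = np.zeros_like(image)          # black-image baseline
    alphas = ((np.arange(steps) + 0.5) / steps).reshape(-1, 1, 1, 1)
    interpolated = baseline + alphas * (image - baseline)
    grads = batch_gradients(interpolated)    # shape: (steps, H, W, C)
    return (image - baseline) * grads.mean(axis=0)

img = rng.uniform(size=(H, W, C))
attr = integrated_gradients_image(img)      # per-pixel attributions
```

For a linear model the gradients are constant along the path, so the attributions reduce exactly to input times weight; for a real network the batched interpolation is where the method's cost (steps × one forward/backward pass) comes from.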

We recommend starting with the notebook. To run the notebook, follow the instructions below.

  • Clone this repository

    git clone https://github.com/ankurtaly/Attributions
    
  • From the cloned directory, start the Jupyter notebook server.

    jupyter notebook
    

    Instructions for installing Jupyter are available here. Please make sure that you have TensorFlow, NumPy, and PIL.Image installed for Python 2.7.

  • Open attributions.ipynb and run all cells.
