
alokwhitewolf / Guided Attention Inference Network

License: MIT

Contains an implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository applies GAIN to the FCN8 architecture used for segmentation.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Guided Attention Inference Network

Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-89.71%)
Mutual labels:  attention-mechanism, attention
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-8.82%)
Mutual labels:  attention-mechanism, attention
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-68.63%)
Mutual labels:  attention-mechanism, attention
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (-5.88%)
Mutual labels:  attention-mechanism, attention
Absa keras
Keras Implementation of Aspect based Sentiment Analysis
Stars: ✭ 126 (-38.24%)
Mutual labels:  attention-mechanism, attention
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+167.65%)
Mutual labels:  attention-mechanism, attention
Attentive Gan Derainnet
Unofficial TensorFlow implementation of the "Attentive Generative Adversarial Network for Raindrop Removal from A Single Image" (CVPR 2018) model: https://maybeshewill-cv.github.io/attentive-gan-derainnet/
Stars: ✭ 184 (-9.8%)
Mutual labels:  attention-mechanism, cvpr2018
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+50%)
Mutual labels:  attention-mechanism, attention
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-9.8%)
Mutual labels:  attention-mechanism, attention
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+633.82%)
Mutual labels:  attention-mechanism, attention
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+125%)
Mutual labels:  attention-mechanism, attention
Prediction Flow
Deep-Learning based CTR models implemented by PyTorch
Stars: ✭ 138 (-32.35%)
Mutual labels:  attention-mechanism, attention
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+101.47%)
Mutual labels:  attention-mechanism, attention
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+345.1%)
Mutual labels:  attention-mechanism, attention
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+100%)
Mutual labels:  attention-mechanism, attention
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-59.8%)
Mutual labels:  attention-mechanism, attention
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-90.69%)
Mutual labels:  attention, attention-mechanism
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-91.67%)
Mutual labels:  attention, attention-mechanism
Dhf1k
Revisiting Video Saliency: A Large-scale Benchmark and a New Model (CVPR18, PAMI19)
Stars: ✭ 96 (-52.94%)
Mutual labels:  attention-mechanism, cvpr2018
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-38.24%)
Mutual labels:  attention-mechanism, attention

Guided Attention for FCN

About

Chainer implementation of "Tell Me Where To Look". This is an experiment applying the Guided Attention Inference Network (GAIN) presented in the paper to Fully Convolutional Networks (FCN) used for segmentation. The trained FCN8s model is fine-tuned using guided attention.

GAIN

GAIN is based on supervising the attention maps that are produced when the network is trained for the task of interest. A sketch of the attention-mining loss follows the figure below.

[Image: GAIN overview]
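The core of GAIN is an attention-mining loss: erase the regions the current attention map covers, re-classify the erased image, and penalise any class score that survives. Below is a minimal, hedged sketch in Chainer; the model returning per-class scores, the Grad-CAM map already upsampled to image size, and the omega/sigma soft-mask constants follow the paper, not this repository's exact API.

    import chainer.functions as F

    def attention_mining_loss(model, image, attention_map, class_id,
                              omega=10.0, sigma=0.5):
        # Soft mask T(A) = sigmoid(omega * (A - sigma)), as in the paper.
        mask = F.sigmoid(omega * (attention_map - sigma))
        # Erase the currently attended regions from the input image.
        masked = image - image * F.broadcast_to(mask, image.shape)
        # Any class score left on the erased image is the mining loss;
        # minimizing it forces attention to cover the whole object.
        scores = model(masked)
        return F.average(scores[:, class_id])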

FCN

A Fully Convolutional Network is an architecture consisting of convolutional layers followed by deconvolution (upsampling) layers that produce the segmentation output; a minimal sketch follows the figure below.

[Image: FCN architecture]
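To illustrate the idea (this is a minimal sketch, not this repository's FCN8s), two stride-2 convolutions downsample the input and a deconvolution upsamples the class scores back to the input resolution:

    import chainer
    import chainer.functions as F
    import chainer.links as L

    class TinyFCN(chainer.Chain):
        def __init__(self, n_classes=21):
            super().__init__()
            with self.init_scope():
                # Two stride-2 convolutions downsample the input 4x.
                self.conv1 = L.Convolution2D(3, 64, ksize=3, stride=2, pad=1)
                self.conv2 = L.Convolution2D(64, 128, ksize=3, stride=2, pad=1)
                self.score = L.Convolution2D(128, n_classes, ksize=1)
                # The deconvolution upsamples the coarse scores 4x.
                self.up = L.Deconvolution2D(n_classes, n_classes,
                                            ksize=8, stride=4, pad=2)

        def __call__(self, x):
            h = F.relu(self.conv1(x))
            h = F.relu(self.conv2(h))
            return self.up(self.score(h))  # (B, n_classes, H, W)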

Approach

  • We take the fully trained FCN8 network and add an average pooling layer and fully connected layers after its convolutional layers. We freeze the convolutional layers and train the fully connected layers to classify the objects. We do this in order to get GradCAMs for a particular class, to be used later during GAIN (see the sketch after the figure below).

[Image: classifier head added to FCN8s]
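A minimal sketch of this step; the extractor wrapper and the feature/class counts are assumptions for illustration, not this repository's exact code:

    import chainer
    import chainer.functions as F
    import chainer.links as L

    class ClassifierHead(chainer.Chain):
        def __init__(self, extractor, n_features=4096, n_classes=20):
            super().__init__()
            with self.init_scope():
                self.extractor = extractor  # FCN8s convolutional stack
                self.fc = L.Linear(n_features, n_classes)
            # Freeze the convolutional layers; only the head is trained.
            self.extractor.disable_update()

        def __call__(self, x):
            h = self.extractor(x)                     # (B, C, H', W')
            h = F.average_pooling_2d(h, h.shape[2:])  # global average pool
            return self.fc(h)                         # per-class scores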

  • Next, we train the network as per the GAIN update rule, sketched below. However, in this implementation I have also included the segmentation loss along with the GAIN loss: using only the GAIN updates did lead to convergence of the losses, but it also caused a significant dip in segmentation accuracy. In this step, the fully connected layers are frozen and are not updated.
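A hedged sketch of that combined update, reusing the attention-mining loss sketched earlier; model.segment and model.grad_cam are hypothetical helper names, and the equal weighting of the two losses is an assumption:

    import chainer.functions as F

    def gain_step_loss(model, image, seg_label, class_id):
        # Segmentation loss keeps per-pixel accuracy from degrading.
        loss_seg = F.softmax_cross_entropy(model.segment(image), seg_label)
        # Attention-mining loss on the Grad-CAM for the target class.
        cam = model.grad_cam(image, class_id)  # hypothetical helper
        loss_am = attention_mining_loss(model, image, cam, class_id)
        # Equal weighting is an assumption, not the repository's setting.
        return loss_seg + loss_am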

Loss Curves

For classification training

[Image: classification training loss curve]

Segmentation Loss during GAIN updates

[Image: segmentation loss curve during GAIN updates]

Qualitative Results

| Original Image | PreTrained GCAMs | Post GAIN GCAMs |

[Images: four example rows comparing the original image, the pre-trained GradCAM, and the post-GAIN GradCAM]

Quantitative Results

For FCN8s

| Implementation | Accuracy | Class Accuracy | Mean IU | FWAVACC | Model File |
| Original | 91.2212 | 77.6146 | 65.5126 | 84.5445 | fcn8s_from_caffe.npz |
| Experimental | 90.5962 | 80.4099 | 64.6869 | 83.9952 | To be made public soon |

How to use

pip install chainer
pip install chainercv
pip install cupy
pip install fcn

Training

For training the classifier with the pretrained FCN8s Chainer model:

python3 train_classifier.py --device 0

This will automatically download the pretrained file and train the classifier on it. You might run into an "xxx.txt file not found" error while running this script. To solve this, go to the location where your fcn library is installed, get the missing file from the fcn repository on GitHub, and place it under the same directory structure indicated in the error message. For more details, refer to this issue.

For GAIN updates,

python3 train_GAIN.py --modelfile <path to the trained model with trained classifier> --device 0

The accuracy of the original implementation is computed with evaluate.py <path to the trained FCN8 model>, which has been borrowed from wkentaro's implementation.

Visualization

python3 visualize.py <arguments>

Required arguments:

 --pretrained <path to the model file with trained classifier but not trained through GAIN method>
 --trained <path to the model trained with GAIN>

Optional arguments:

  --device=-1 <device to assign to the model; the default uses the CPU>
  --whole=False <whether to test on the whole validation dataset>
  --shuffle=False <shuffles the dataset loader>
  --no=10 <if whole is False, the number of images to visualize>
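For example, to visualize 10 images on GPU 0 (the model paths are placeholders):

python3 visualize.py --pretrained <classifier model> --trained <GAIN model> --device=0 --no=10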

To Do

  • [x] Push Visualization Code

Using GAIN for other models

I have attempted to make GAIN as modular as possible so that it can be used with other models as well. All you need to do is make the GAIN class (which itself inherits from chainer.Chain) the parent class of your model. Each GAIN model needs to have a few particular instance variables in order to function, and the GAIN module has methods to instantiate every one of them. I would advise you to look at models/fcn8.py as well as GAIN.py to get an idea of how they are used; a sketch follows the list below.

  • GAIN_functions - An ordered dict mapping the names of steps to their associated functions.
  • final_conv_layer - Name of the step after which no convolutions happen
  • grad_target_layer - Name of the step from where gradients are to be collected for computing GradCAM
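As a hedged illustration of those variables, here is a toy subclass; the direct assignments stand in for GAIN's own helper methods, every layer name here is made up, and the import path is assumed from this repository's layout:

    from collections import OrderedDict
    import chainer.functions as F
    import chainer.links as L
    from GAIN import GAIN  # assumed import path; see GAIN.py in this repo

    class MyGAINModel(GAIN):
        def __init__(self, n_classes=20):
            super().__init__()
            with self.init_scope():
                self.conv = L.Convolution2D(3, 64, ksize=3, pad=1)
                self.fc = L.Linear(None, n_classes)
            # Ordered mapping of step names to the callables that run them.
            self.GAIN_functions = OrderedDict([
                ('conv', lambda x: F.relu(self.conv(x))),
                ('pool', lambda x: F.average_pooling_2d(x, x.shape[2:])),
                ('fc', self.fc),
            ])
            self.final_conv_layer = 'conv'   # last step that convolves
            self.grad_target_layer = 'conv'  # gradients here feed GradCAM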

Credits

The original FCN module and the fcn package are courtesy of wkentaro.

Citation

If you find this code useful in your research, please consider citing:

@misc{Alok2018,
  author = {Bishoyi, Alok Kumar},
  title = {Guided Attention Inference Network},
  year = {2018},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/alokwhitewolf/Guided-Attention-Inference-Network}},
}