wenguanwang / Deepattention

Deep Visual Attention Prediction (TIP18)

Programming Languages

matlab
3953 projects

Projects that are alternatives of or similar to Deepattention

Keras Attention Mechanism
Attention mechanism Implementation for Keras.
Stars: ✭ 2,504 (+3752.31%)
Mutual labels:  attention-mechanism, attention-model
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+93.85%)
Mutual labels:  attention-mechanism, attention-model
Pytorch Attention Guided Cyclegan
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Stars: ✭ 67 (+3.08%)
Mutual labels:  attention-mechanism, attention-model
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+670.77%)
Mutual labels:  attention-mechanism, attention-model
attention-mechanism-keras
attention mechanism in keras, like Dense and RNN...
Stars: ✭ 19 (-70.77%)
Mutual labels:  attention-mechanism, attention-model
Attentionalpoolingaction
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+281.54%)
Mutual labels:  attention-mechanism, attention-model
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (+83.08%)
Mutual labels:  attention-mechanism, attention-model
Compact-Global-Descriptor
Pytorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-66.15%)
Mutual labels:  attention-mechanism, attention-model
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+606.15%)
Mutual labels:  attention-mechanism, attention-model
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+1423.08%)
Mutual labels:  attention-mechanism, attention-model
Keras Attention
Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (+972.31%)
Mutual labels:  attention-mechanism
Chatbot cn
A chatbot for the finance and legal domains (with some chit-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front end is integrated via Django, and RESTful interfaces for the NLP and KG components are already wrapped.
Stars: ✭ 791 (+1116.92%)
Mutual labels:  attention-mechanism
Textclassifier
Text classifier for Hierarchical Attention Networks for Document Classification
Stars: ✭ 985 (+1415.38%)
Mutual labels:  attention-mechanism
Awesome Attention Mechanism In Cv
A PyTorch implementation collection of attention modules and other plug-and-play modules used in computer vision.
Stars: ✭ 54 (-16.92%)
Mutual labels:  attention-model
Text Classification Pytorch
Text classification using deep learning models in Pytorch
Stars: ✭ 683 (+950.77%)
Mutual labels:  attention-model
Reading comprehension tf
Machine Reading Comprehension in Tensorflow
Stars: ✭ 37 (-43.08%)
Mutual labels:  attention-model
Pointer summarizer
pytorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Stars: ✭ 629 (+867.69%)
Mutual labels:  attention-mechanism
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+772.31%)
Mutual labels:  attention-mechanism
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+740%)
Mutual labels:  attention-mechanism
Attentional Interfaces
🔍 Attentional interfaces in TensorFlow.
Stars: ✭ 58 (-10.77%)
Mutual labels:  attention-mechanism

This repo contains a Caffe re-implementation of

"Deep Visual Attention Prediction", TIP18

By Wenguan Wang, Jianbing Shen

========================================================================

Our current implementation is based on the Caffe version of Holistically-Nested Edge Detection (included in this repository; see external/caffe). We also use some functions from the Faster R-CNN Caffe fork, but the model can easily be ported to the standard Caffe library.

We plan to re-implement it in Keras, but that depends on my schedule.

a. Please first compile Caffe with its MATLAB interface.

b. Please download our MODEL and RESULTS on the DUT, MIT300, MIT1003, PASCAL, and Toronto datasets from

google drive: https://drive.google.com/open?id=1cl3k3POMqS5obv0ffz84qOgGfAbEmtB3

or Baidu Wangpan: https://pan.baidu.com/s/1o8kQcAY password: wdwa

and put the model 'attention_final' into 'models' folder.

c. Run 'deep_attention_test'; the results can be found in 'datasets/results/'.
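Steps a–c can be sketched as a shell session. This is a hypothetical walkthrough, not a tested command sequence: the build targets, the download location, and the `matlab -batch` invocation are assumptions that may need adapting to your environment.

```shell
# Sketch of the setup steps above; paths and commands are assumptions.
set -e

# a. Build the bundled Caffe together with its MATLAB interface (matcaffe).
cd external/caffe
make all -j4 && make matcaffe
cd ../..

# b. Put the downloaded model into the 'models' folder
#    (assumes it was saved to ~/Downloads/attention_final).
mkdir -p models
mv ~/Downloads/attention_final models/

# c. Run the test script; saliency maps are written to datasets/results/.
matlab -batch "deep_attention_test"
ls datasets/results/
```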

If you find our method useful in your research, please consider citing:

@article{wang2018deep,
    Author = {Wang, Wenguan and Shen, Jianbing},
    Title = {Deep Visual Attention Prediction},
    Journal = {IEEE Transactions on Image Processing},
    Year = {2018}
}

========================================================================

Any comments, please email: [email protected]
