
327 open-source projects that are alternatives to, or similar to, Attention-Visualization

free-lunch-saliency
Code for "Free-Lunch Saliency via Attention in Atari Agents"
Stars: ✭ 15 (-72.22%)
Mutual labels:  attention
interpretable-han-for-document-classification-with-keras
Keras implementation of hierarchical attention network for document classification with options to predict and present attention weights on both word and sentence level.
Stars: ✭ 18 (-66.67%)
Mutual labels:  attention
ilmulti
Tooling to experiment with multilingual machine translation for Indian languages.
Stars: ✭ 19 (-64.81%)
Mutual labels:  machine-translation
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-55.56%)
Mutual labels:  attention
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (-37.04%)
Mutual labels:  attention
BSD
The Business Scene Dialogue corpus
Stars: ✭ 51 (-5.56%)
Mutual labels:  machine-translation
LFattNet
Attention-based View Selection Networks for Light-field Disparity Estimation
Stars: ✭ 41 (-24.07%)
Mutual labels:  attention
urbans
A tool for translating text from a source grammar to a target (context-free) grammar with a corresponding dictionary.
Stars: ✭ 19 (-64.81%)
Mutual labels:  machine-translation
attention-mechanism-keras
Attention mechanism layers for Keras, used like Dense and RNN layers.
Stars: ✭ 19 (-64.81%)
Mutual labels:  attention-visualization
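Several of the projects below render attention weights over tokens. As a minimal, dependency-free sketch of what such a visualization boils down to, here is a hypothetical text-heatmap helper (the function name and shade scale are illustrative, not taken from any listed project):

```python
def attention_heatmap(tokens, weights):
    """Render an attention matrix as a text heatmap (rows attend to columns)."""
    shades = " .:-=+*#%@"  # later (darker) character = higher attention weight
    width = max(len(t) for t in tokens)
    header = " " * (width + 1) + " ".join(t[0] for t in tokens)  # column initials
    lines = [header]
    for tok, row in zip(tokens, weights):
        cells = " ".join(
            shades[min(int(w * len(shades)), len(shades) - 1)] for w in row
        )
        lines.append(f"{tok:>{width}} {cells}")
    return "\n".join(lines)


print(attention_heatmap(["the", "cat"], [[0.9, 0.1], [0.3, 0.7]]))
```

Real tools (dodrio, bert_attn_viz, and similar projects below) plot the same matrix as a color heatmap or bipartite token graph; the underlying data is always a row-stochastic attention matrix like the one shown here.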
inmt
Interactive Neural Machine Translation tool
Stars: ✭ 44 (-18.52%)
Mutual labels:  machine-translation
image-recognition
Cutting-tool recognition using deep learning methods.
Stars: ✭ 19 (-64.81%)
Mutual labels:  attention
rtg
Reader Translator Generator - NMT toolkit based on pytorch
Stars: ✭ 26 (-51.85%)
Mutual labels:  machine-translation
datagrand bert
5th-place code for the 2019 Datagrand Cup information extraction competition
Stars: ✭ 20 (-62.96%)
Mutual labels:  multi-head-attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-9.26%)
Mutual labels:  attention
LNSwipeCell
A friendly, easy-to-integrate left-swipe editing feature for cells!
Stars: ✭ 16 (-70.37%)
Mutual labels:  attention
AttnSleep
[IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76 (+40.74%)
Mutual labels:  attention
reasoning attention
Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-37.04%)
Mutual labels:  attention
natural-language-joint-query-search
Search photos on Unsplash based on OpenAI's CLIP model, support search with joint image+text queries and attention visualization.
Stars: ✭ 143 (+164.81%)
Mutual labels:  attention
Neural-Machine-Translation
Several basic neural machine translation models implemented by PyTorch & TensorFlow
Stars: ✭ 29 (-46.3%)
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-48.15%)
Mutual labels:  attention
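The transformer entry above implements "Attention Is All You Need", whose core operation is scaled dot-product attention. A small NumPy sketch of that formula (not code from the listed repository) makes the shapes concrete:

```python
import numpy as np


def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                     # output and attention map


rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, 4)), rng.standard_normal((5, 4)), rng.standard_normal((5, 2))
out, attn = scaled_dot_product_attention(Q, K, V)
```

The returned `attn` matrix is exactly what attention-visualization tools draw: each row sums to 1 and tells you how much each query position attends to each key position.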
Recurrent-Independent-Mechanisms
Implementation of the paper Recurrent Independent Mechanisms (https://arxiv.org/pdf/1909.10893.pdf)
Stars: ✭ 90 (+66.67%)
Mutual labels:  attention
attention-target-detection
[CVPR2020] "Detecting Attended Visual Targets in Video"
Stars: ✭ 105 (+94.44%)
Mutual labels:  attention
tai5-uan5 gian5-gi2 kang1-ku7
Taiwanese language processing tools (臺灣言語工具)
Stars: ✭ 79 (+46.3%)
Mutual labels:  machine-translation
minimal-nmt
A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (-33.33%)
extreme-adaptation-for-personalized-translation
Code for the paper "Extreme Adaptation for Personalized Neural Machine Translation"
Stars: ✭ 42 (-22.22%)
Mutual labels:  machine-translation
AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-38.89%)
Mutual labels:  attention
DeepLearningReading
Deep Learning and Machine Learning mini-projects. Current Project: Deepmind Attentive Reader (rc-data)
Stars: ✭ 78 (+44.44%)
Mutual labels:  attention
Deep-NLP-Resources
Curated list of all NLP Resources
Stars: ✭ 65 (+20.37%)
Mutual labels:  machine-translation
flow1d
[ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91 (+68.52%)
Mutual labels:  attention
DataAugmentationNMT
Data Augmentation for Neural Machine Translation
Stars: ✭ 26 (-51.85%)
DCGCN
Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning (authors' MXNet implementation for the TACL19 paper)
Stars: ✭ 73 (+35.19%)
gqa-node-properties
Recalling node properties from a knowledge graph
Stars: ✭ 19 (-64.81%)
Mutual labels:  attention
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-20.37%)
Mutual labels:  attention
bert attn viz
Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-24.07%)
Mutual labels:  attention
zero
Zero -- A neural machine translation system
Stars: ✭ 121 (+124.07%)
SSAN
How Does Selective Mechanism Improve Self-attention Networks?
Stars: ✭ 18 (-66.67%)
bytenet translation
A TensorFlow implementation of machine translation based on the paper "Neural Machine Translation in Linear Time"
Stars: ✭ 60 (+11.11%)
Image-Captioning
Image Captioning with Keras
Stars: ✭ 60 (+11.11%)
Mutual labels:  attention
deepl-rb
A simple Ruby gem for the DeepL API
Stars: ✭ 38 (-29.63%)
Mutual labels:  machine-translation
AiR
Official Repository for ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (-24.07%)
Mutual labels:  attention
osdg-tool
OSDG is an open-source tool that maps and connects activities to the UN Sustainable Development Goals (SDGs) by identifying SDG-relevant content in any text. The tool is available online at www.osdg.ai. API access available for research purposes.
Stars: ✭ 22 (-59.26%)
Mutual labels:  machine-translation
torch-multi-head-attention
Multi-head attention in PyTorch
Stars: ✭ 93 (+72.22%)
Mutual labels:  attention
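torch-multi-head-attention implements the multi-head variant, where the model dimension is split across parallel attention heads. A shape-level NumPy sketch, assuming random matrices in place of learned projection weights (this is not the API of the listed project):

```python
import numpy as np


def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)


def multi_head_attention(x, n_heads, seed=0):
    """Self-attention over x (n, d), with d split across n_heads heads."""
    n, d = x.shape
    assert d % n_heads == 0
    d_head = d // n_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for the learned W_q, W_k, W_v, W_o.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4))

    def split(t):  # (n, d) -> (n_heads, n, d_head)
        return t.reshape(n, n_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ Wq), split(x @ Wk), split(x @ Wv)
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))  # (h, n, n)
    out = (attn @ v).transpose(1, 0, 2).reshape(n, d)           # concat heads
    return out @ Wo
```

Each head produces its own (n, n) attention map, which is why visualization tools such as dodrio show a grid of heatmaps per layer rather than a single one.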
keras-utility-layer-collection
Collection of custom layers and utility functions for Keras which are missing in the main framework.
Stars: ✭ 63 (+16.67%)
Mutual labels:  attention
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-7.41%)
Mutual labels:  attention
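The chatbot entry above uses Bahdanau (additive) attention, where alignment scores come from a small feed-forward network rather than a dot product. A NumPy sketch of the scoring rule, with the alignment parameters `W`, `U`, `v` passed in explicitly instead of learned (names are illustrative):

```python
import numpy as np


def bahdanau_attention(query, keys, W, U, v):
    """Additive attention: e_i = v^T tanh(W q + U k_i), then softmax over i.

    query: (d_q,) decoder state; keys: (n, d_k) encoder states.
    Returns the context vector and the attention weights.
    """
    scores = np.tanh(query @ W + keys @ U) @ v  # (n,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                    # softmax over encoder steps
    context = weights @ keys                    # weighted sum of encoder states
    return context, weights
```

Unlike scaled dot-product attention, the query and key dimensions need not match here; `W` and `U` project both into a shared alignment space before the `tanh`.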
transformer-slt
Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (+70.37%)
jeelizGlanceTracker
JavaScript/WebGL lib: detects whether the user is looking at the screen from the webcam video feed. Lightweight and robust to all lighting conditions. Useful for playing/pausing videos depending on whether the user is looking, or for person detection. Link to live demo.
Stars: ✭ 68 (+25.93%)
Mutual labels:  attention
pytorch-attention-augmented-convolution
A pytorch implementation of https://arxiv.org/abs/1904.09925
Stars: ✭ 20 (-62.96%)
Mutual labels:  attention
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-70.37%)
Mutual labels:  attention
Machine-Translation-v2
English-to-Chinese machine text translation
Stars: ✭ 48 (-11.11%)
Mutual labels:  machine-translation
chinese ancient poetry
seq2seq attention tensorflow textrank context
Stars: ✭ 30 (-44.44%)
Mutual labels:  attention
attention-ocr
A pytorch implementation of the attention based ocr
Stars: ✭ 44 (-18.52%)
Mutual labels:  attention
Diverse-Structure-Inpainting
CVPR 2021: "Generating Diverse Structure for Image Inpainting With Hierarchical VQ-VAE"
Stars: ✭ 131 (+142.59%)
Mutual labels:  attention
CoVA-Web-Object-Detection
A Context-aware Visual Attention-based training pipeline for Object Detection from a Webpage screenshot!
Stars: ✭ 18 (-66.67%)
Mutual labels:  attention
NLP Toolkit
Library of state-of-the-art models (PyTorch) for NLP tasks
Stars: ✭ 92 (+70.37%)
Mutual labels:  machine-translation
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+331.48%)
Mutual labels:  multi-head-attention
how attentive are gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
Stars: ✭ 200 (+270.37%)
Mutual labels:  attention
dreyeve
[TPAMI 2018] Predicting the Driver's Focus of Attention: the DR(eye)VE Project. A deep neural network trained to reproduce the human driver's focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (+62.96%)
Mutual labels:  attention
sb-nmt
Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (+22.22%)
Mutual labels:  machine-translation
apertium-apy
📦 Apertium HTTP Server in Python
Stars: ✭ 29 (-46.3%)
Mutual labels:  machine-translation
visualization
a collection of visualization function
Stars: ✭ 189 (+250%)
Mutual labels:  attention
61-120 of 327 similar projects