free-lunch-saliency: Code for "Free-Lunch Saliency via Attention in Atari Agents"
Stars: ✭ 15 (-72.22%)
ilmulti: Tooling to play around with multilingual machine translation for Indian languages.
Stars: ✭ 19 (-64.81%)
EBIM-NLI: Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-55.56%)
BSD: The Business Scene Dialogue corpus
Stars: ✭ 51 (-5.56%)
LFattNet: Attention-based View Selection Networks for Light-field Disparity Estimation
Stars: ✭ 41 (-24.07%)
urbans: A tool for translating text from a source grammar to a target (context-free) grammar with a corresponding dictionary.
Stars: ✭ 19 (-64.81%)
inmt: Interactive Neural Machine Translation tool
Stars: ✭ 44 (-18.52%)
rtg: Reader Translator Generator - an NMT toolkit based on PyTorch
Stars: ✭ 26 (-51.85%)
TRAR-VQA: [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-9.26%)
LNSwipeCell: A friendly, easy-to-integrate left-swipe editing feature for cells!
Stars: ✭ 16 (-70.37%)
AttnSleep: [IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76 (+40.74%)
reasoning attention: Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-37.04%)
natural-language-joint-query-search: Search photos on Unsplash with OpenAI's CLIP model; supports joint image+text queries and attention visualization.
Stars: ✭ 143 (+164.81%)
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch & TensorFlow
Stars: ✭ 29 (-46.3%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-48.15%)
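Several of the entries above (transformer, minimal-nmt, NLP Toolkit) build on the scaled dot-product attention introduced in "Attention Is All You Need". A minimal dependency-free Python sketch of that operation follows; the function names are illustrative and do not come from any repository listed here:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

    q, k, v are lists of vectors (lists of floats); k and v must have
    the same number of rows.
    """
    d_k = len(k[0])
    out = []
    for qi in q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [dot(qi, kj) / math.sqrt(d_k) for kj in k]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out
```

With one-hot values, the output row is exactly the attention distribution, which makes the weighting easy to inspect.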
minimal-nmt: A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (-33.33%)
AoA-pytorch: A PyTorch implementation of the Attention on Attention module (both self and guided variants) for Visual Question Answering
Stars: ✭ 33 (-38.89%)
DeepLearningReading: Deep Learning and Machine Learning mini-projects. Current project: DeepMind Attentive Reader (rc-data)
Stars: ✭ 78 (+44.44%)
flow1d: [ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91 (+68.52%)
DCGCN: Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning (authors' MXNet implementation for the TACL19 paper)
Stars: ✭ 73 (+35.19%)
bert attn viz: Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-24.07%)
zero: A neural machine translation system
Stars: ✭ 121 (+124.07%)
SSAN: How Does Selective Mechanism Improve Self-attention Networks?
Stars: ✭ 18 (-66.67%)
bytenet translation: A TensorFlow implementation of the machine translation model from "Neural Machine Translation in Linear Time"
Stars: ✭ 60 (+11.11%)
deepl-rb: A simple Ruby gem for the DeepL API
Stars: ✭ 38 (-29.63%)
AiR: Official repository for the ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (-24.07%)
osdg-tool: OSDG is an open-source tool that maps and connects activities to the UN Sustainable Development Goals (SDGs) by identifying SDG-relevant content in any text. The tool is available online at www.osdg.ai. API access available for research purposes.
Stars: ✭ 22 (-59.26%)
keras-utility-layer-collection: Collection of custom layers and utility functions for Keras that are missing from the main framework.
Stars: ✭ 63 (+16.67%)
tensorflow-chatbot-chinese: Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-7.41%)
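The Bahdanau (additive) attention used by the chatbot above scores each encoder state against the decoder state with a small feed-forward network, then forms a context vector as the softmax-weighted sum of encoder states. A minimal dependency-free Python sketch, assuming toy weight matrices W_s, W_h and vector v supplied by the caller (all names here are illustrative, not taken from the repository):

```python
import math

def bahdanau_score(s, h, W_s, W_h, v):
    # Additive attention: score(s, h) = v^T tanh(W_s s + W_h h)
    hidden = [math.tanh(
        sum(W_s[i][j] * s[j] for j in range(len(s))) +
        sum(W_h[i][j] * h[j] for j in range(len(h))))
        for i in range(len(v))]
    return sum(v[i] * hidden[i] for i in range(len(v)))

def attend(s, encoder_states, W_s, W_h, v):
    """Return (attention weights, context vector) for decoder state s."""
    scores = [bahdanau_score(s, h, W_s, W_h, v) for h in encoder_states]
    # Softmax over the scores (numerically stable).
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Context vector: attention-weighted sum of encoder states.
    context = [sum(w * h[t] for w, h in zip(weights, encoder_states))
               for t in range(len(encoder_states[0]))]
    return weights, context
```

Unlike dot-product attention, the additive form learns the compatibility function itself, which is why it was the default in early RNN-based seq2seq models.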
transformer-slt: Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (+70.37%)
jeelizGlanceTracker: JavaScript/WebGL library: detects from the webcam video feed whether the user is looking at the screen. Lightweight and robust to all lighting conditions. Great for playing/pausing videos depending on whether the user is looking, or for person detection. Link to live demo.
Stars: ✭ 68 (+25.93%)
Im2LaTeX: An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-70.37%)
attention-ocr: A PyTorch implementation of attention-based OCR
Stars: ✭ 44 (-18.52%)
CoVA-Web-Object-Detection: A context-aware, visual-attention-based training pipeline for object detection from a webpage screenshot!
Stars: ✭ 18 (-66.67%)
NLP Toolkit: Library of state-of-the-art models (PyTorch) for NLP tasks
Stars: ✭ 92 (+70.37%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+331.48%)
how attentive are gats: Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022)
Stars: ✭ 200 (+270.37%)
dreyeve: [TPAMI 2018] Predicting the Driver's Focus of Attention: the DR(eye)VE Project. A deep neural network trained to reproduce the human driver's focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (+62.96%)
sb-nmt: Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (+22.22%)
apertium-apy: 📦 Apertium HTTP Server in Python
Stars: ✭ 29 (-46.3%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+250%)