
soobinseo / Attentive-Neural-Process

License: Apache-2.0
A PyTorch Implementation of Attentive Neural Process

Programming Languages

Jupyter Notebook
Python

Projects that are alternatives to or similar to Attentive-Neural-Process

tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-16.67%)
Mutual labels:  attention
DeepLearningReading
Deep Learning and Machine Learning mini-projects. Current Project: Deepmind Attentive Reader (rc-data)
Stars: ✭ 78 (+30%)
Mutual labels:  attention
External-Attention-pytorch
🍀 Pytorch implementation of various Attention Mechanisms, MLP, Re-parameter, Convolution, which is helpful to further understand papers.⭐⭐⭐
Stars: ✭ 7,344 (+12140%)
Mutual labels:  attention
AiR
Official Repository for ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (-31.67%)
Mutual labels:  attention
flow1d
[ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91 (+51.67%)
Mutual labels:  attention
Recurrent-Independent-Mechanisms
Implementation of the paper Recurrent Independent Mechanisms (https://arxiv.org/pdf/1909.10893.pdf)
Stars: ✭ 90 (+50%)
Mutual labels:  attention
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-73.33%)
Mutual labels:  attention
learningspoons
nlp lecture-notes and source code
Stars: ✭ 29 (-51.67%)
Mutual labels:  attention
NeuralProcesses.jl
A framework for composing Neural Processes in Julia
Stars: ✭ 69 (+15%)
Mutual labels:  neural-processes
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-18.33%)
Mutual labels:  attention
bert attn viz
Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-31.67%)
Mutual labels:  attention
reasoning attention
Unofficial implementation algorithms of attention models on SNLI dataset
Stars: ✭ 34 (-43.33%)
Mutual labels:  attention
RecycleNet
Attentional Learning of Trash Classification
Stars: ✭ 23 (-61.67%)
Mutual labels:  attention
keras-utility-layer-collection
Collection of custom layers and utility functions for Keras which are missing in the main framework.
Stars: ✭ 63 (+5%)
Mutual labels:  attention
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+101.67%)
Mutual labels:  attention
pytorch-attention-augmented-convolution
A pytorch implementation of https://arxiv.org/abs/1904.09925
Stars: ✭ 20 (-66.67%)
Mutual labels:  attention
DeepMove
Codes for WWW'18 Paper-DeepMove: Predicting Human Mobility with Attentional Recurrent Network
Stars: ✭ 120 (+100%)
Mutual labels:  attention
free-lunch-saliency
Code for "Free-Lunch Saliency via Attention in Atari Agents"
Stars: ✭ 15 (-75%)
Mutual labels:  attention
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-60%)
Mutual labels:  attention
AttnSleep
[IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76 (+26.67%)
Mutual labels:  attention

Attentive-Neural-Process

Description

  • A PyTorch implementation of the Attentive Neural Process (ANP); a minimal sketch of the model structure is given after this list.
  • Simple code for generating samples with ANP.
  • I will update the super-resolution experiments soon.
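
The model follows the two-path ANP architecture from the paper: a latent path that pools the context into a global latent variable z, a deterministic path that cross-attends from the target inputs to the context representations, and a decoder that maps (target x, attended representation, z) to a predictive distribution over y. The following is only a minimal sketch of that structure, not the repository's actual network.py; it assumes a recent PyTorch (nn.MultiheadAttention does not exist in the pinned 0.4.0) and all names are hypothetical.

```python
import torch
import torch.nn as nn

class ANPSketch(nn.Module):
    """Minimal ANP: latent path + cross-attentive deterministic path + decoder."""

    def __init__(self, x_dim=1, y_dim=1, h_dim=128):
        super().__init__()
        # Latent path: (x, y) pairs -> per-point stats, mean-pooled into mu/log-sigma of z.
        self.latent_enc = nn.Sequential(
            nn.Linear(x_dim + y_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, 2 * h_dim))
        # Deterministic path: per-context representations attended to by target queries.
        self.det_enc = nn.Sequential(
            nn.Linear(x_dim + y_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, h_dim))
        self.q_proj = nn.Linear(x_dim, h_dim)   # target x -> attention queries
        self.k_proj = nn.Linear(x_dim, h_dim)   # context x -> attention keys
        self.cross_attn = nn.MultiheadAttention(h_dim, num_heads=4, batch_first=True)
        # Decoder: (x_target, r, z) -> mu/log-sigma of y.
        self.decoder = nn.Sequential(
            nn.Linear(x_dim + 2 * h_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, 2 * y_dim))

    def forward(self, cx, cy, tx):
        # cx, cy: (B, C, x_dim / y_dim) context pairs; tx: (B, T, x_dim) target inputs.
        stats = self.latent_enc(torch.cat([cx, cy], dim=-1)).mean(dim=1)
        mu, log_sigma = stats.chunk(2, dim=-1)
        z = mu + torch.exp(log_sigma) * torch.randn_like(mu)   # reparameterised sample
        z = z.unsqueeze(1).expand(-1, tx.size(1), -1)          # broadcast z over targets
        r, _ = self.cross_attn(self.q_proj(tx), self.k_proj(cx),
                               self.det_enc(torch.cat([cx, cy], dim=-1)))
        y_mu, y_log_sigma = self.decoder(torch.cat([tx, r, z], dim=-1)).chunk(2, dim=-1)
        return y_mu, y_log_sigma
```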

Requirements

  • Install Python 3
  • Install PyTorch == 0.4.0

File description

  • preprocess.py contains the preprocessing code used when loading data.
  • module.py contains the building-block modules, including the attention and linear layers.
  • network.py defines the overall network structure.
  • train.py trains the ANP model (a sketch of the training objective follows this list).
  • generate.ipynb generates samples from a trained model.
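
train.py presumably minimises the standard neural-process objective from the paper: the negative predictive log-likelihood of the target outputs plus the KL divergence between the latent posterior q(z | context, target) and the conditional prior q(z | context). A hedged sketch of that loss; the argument names are hypothetical, and the sigmas are assumed already positive (e.g. via exp or softplus):

```python
import torch
from torch.distributions import Normal, kl_divergence

def anp_loss(y_mu, y_sigma, target_y, post_mu, post_sigma, prior_mu, prior_sigma):
    """Negative ANP ELBO: -log p(y_T | x_T, r, z) + KL(q(z | C, T) || q(z | C))."""
    # Reconstruction term: sum over output dims, average over batch and target points.
    log_lik = Normal(y_mu, y_sigma).log_prob(target_y).sum(dim=-1).mean()
    # Regularisation term: sum over latent dims, average over batch.
    kl = kl_divergence(Normal(post_mu, post_sigma),
                       Normal(prior_mu, prior_sigma)).sum(dim=-1).mean()
    return -log_lik + kl
```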

Results

Test samples after 50 epochs of training with random context selection; the sample plots are omitted here, and a sketch of the context-selection step follows this list.

  • original
  • 10 contexts
  • 50 contexts
  • 100 contexts
  • half contexts
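
The random context selection used above can be as simple as drawing a fixed-size random subset of each curve's points to serve as the context, with the full curve as the target set (the usual neural-process convention). A hypothetical helper, not the repository's actual preprocess.py:

```python
import torch

def split_context_target(x, y, num_context):
    """Pick `num_context` random points per curve as context; all points are targets."""
    idx = torch.randperm(x.size(1))[:num_context]   # random subset, no replacement
    return x[:, idx], y[:, idx], x, y

# e.g. curves of 100 points each, 10 of them used as context
cx, cy, tx, ty = split_context_target(torch.rand(16, 100, 1), torch.rand(16, 100, 1), 10)
```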

Reference

  • Kim, Mnih, Schwarz, Garnelo, Eslami, Rosenbaum, Vinyals, Teh. "Attentive Neural Processes." ICLR 2019. https://arxiv.org/abs/1901.05761

Comments

  • Any comments on the code are always welcome.