
1,475 open source projects that are alternatives to, or similar to, Attend_infer_repeat

Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+13436.59%)
Mutual labels:  neural-networks, attention, vae
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-47.56%)
Mutual labels:  rnn, attention, attention-mechanism
char-VAE
Inspired by the neural style algorithm in the computer vision field, we propose a high-level language model with the aim of adapting the linguistic style.
Stars: ✭ 18 (-78.05%)
Mutual labels:  generative-model, rnn, vae
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-70.73%)
Mutual labels:  rnn, attention
Deepjazz
Deep learning driven jazz generation using Keras & Theano!
Stars: ✭ 2,766 (+3273.17%)
Mutual labels:  neural-networks, rnn
InpaintNet
Code accompanying ISMIR'19 paper titled "Learning to Traverse Latent Spaces for Musical Score Inpainting"
Stars: ✭ 48 (-41.46%)
Mutual labels:  generative-model, vae
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-3.66%)
Mutual labels:  attention, attention-mechanism
visualization
a collection of visualization function
Stars: ✭ 189 (+130.49%)
Mutual labels:  attention, attention-mechanism
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-70.73%)
Mutual labels:  generative-model, vae
DiffuseVAE
A combination of VAEs and diffusion models for efficient, controllable and high-fidelity generation from low-dimensional latents
Stars: ✭ 81 (-1.22%)
Mutual labels:  generative-model, vae
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently included IWSLT pretrained models.
Stars: ✭ 411 (+401.22%)
Mutual labels:  attention-mechanism, attention
Attentionn
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
Stars: ✭ 175 (+113.41%)
Mutual labels:  neural-networks, attention
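Many of the entries above are tagged attention/attention-mechanism. As a rough orientation only (not code from any listed repo), here is a minimal numpy sketch of scaled dot-product "soft" attention, the building block most of these projects extend:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the max for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(q, k, v):
    # weights = softmax(q k^T / sqrt(d)); output = weights @ v
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))   # 2 queries of dimension 4
k = rng.normal(size=(5, 4))   # 5 keys
v = rng.normal(size=(5, 3))   # 5 values of dimension 3
out, w = soft_attention(q, k, v)
print(out.shape, w.shape)     # (2, 3) (2, 5)
```

Each row of `w` is a distribution over the 5 keys, so rows sum to 1; multi-head and local/global variants in the repos above differ mainly in how the scores are computed and masked.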
Deep Learning With Python
Deep learning codes and projects using Python
Stars: ✭ 195 (+137.8%)
Mutual labels:  neural-networks, rnn
keras-utility-layer-collection
Collection of custom layers and utility functions for Keras which are missing in the main framework.
Stars: ✭ 63 (-23.17%)
Mutual labels:  rnn, attention
Anime4k
A High-Quality Real Time Upscaler for Anime Video
Stars: ✭ 14,083 (+17074.39%)
style-vae
Implementation of VAE and Style-GAN Architecture Achieving State of the Art Reconstruction
Stars: ✭ 25 (-69.51%)
Mutual labels:  generative-model, vae
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-75.61%)
Mutual labels:  attention, attention-mechanism
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-76.83%)
Mutual labels:  attention, attention-mechanism
vqvae-2
PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2"
Stars: ✭ 65 (-20.73%)
Mutual labels:  generative-model, vae
Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+4515.85%)
Mutual labels:  generative-model, vae
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+397.56%)
Mutual labels:  attention-mechanism, attention
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+409.76%)
Mutual labels:  generative-model, vae
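A large share of the listed projects are VAE-based. For orientation (again, not taken from any listed repo), a minimal numpy sketch of the two ingredients every VAE shares: the reparameterization trick and the Gaussian KL term of the ELBO:

```python
import numpy as np

def gaussian_kl(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def reparameterize(mu, logvar, rng):
    # z = mu + sigma * eps with eps ~ N(0, I); in an autodiff framework
    # this keeps sampling differentiable w.r.t. the encoder outputs
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

rng = np.random.default_rng(0)
mu = np.zeros((3, 2))      # encoder means for 3 samples, 2 latent dims
logvar = np.zeros((3, 2))  # encoder log-variances
z = reparameterize(mu, logvar, rng)
print(gaussian_kl(mu, logvar))  # all zero: the posterior already equals the prior
```

The full training loss adds a reconstruction term (e.g. binary cross-entropy for MNIST) to this KL term; the listed repos differ mainly in encoder/decoder architecture and prior choice.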
Deeplearning.ai Natural Language Processing Specialization
This repository contains my full work and notes for Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai
Stars: ✭ 473 (+476.83%)
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+565.85%)
Mutual labels:  attention-mechanism, attention
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-4.88%)
Mutual labels:  rnn, attention
Pytorch Mnist Vae
Stars: ✭ 32 (-60.98%)
Mutual labels:  generative-model, vae
Gat
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Stars: ✭ 2,229 (+2618.29%)
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (+63.41%)
Mutual labels:  neural-networks, vae
Gam
A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018).
Stars: ✭ 227 (+176.83%)
Mutual labels:  neural-networks, attention
Pygat
Pytorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Stars: ✭ 1,853 (+2159.76%)
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-80.49%)
Mutual labels:  attention, attention-mechanism
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+6.1%)
Mutual labels:  attention, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+47.56%)
Mutual labels:  attention, attention-mechanism
Attentive Neural Processes
implementing "recurrent attentive neural processes" to forecast power usage (w. LSTM baseline, MCDropout)
Stars: ✭ 33 (-59.76%)
Mutual labels:  rnn, attention
Dfc Vae
Variational Autoencoder trained with Feature Perceptual Loss
Stars: ✭ 74 (-9.76%)
Mutual labels:  generative-model, vae
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (-67.07%)
Mutual labels:  attention, attention-mechanism
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-59.76%)
Mutual labels:  attention, attention-mechanism
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-30.49%)
Mutual labels:  attention, attention-mechanism
AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-59.76%)
Mutual labels:  attention, attention-mechanism
Attention Over Attention Tf Qa
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension"
Stars: ✭ 58 (-29.27%)
Mutual labels:  rnn, attention
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model based on relational news extraction, for stock prediction
Stars: ✭ 33 (-59.76%)
Mutual labels:  rnn, attention
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (+24.39%)
Mutual labels:  neural-networks, vae
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (+347.56%)
Mutual labels:  neural-networks, rnn
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+273.17%)
Mutual labels:  attention-mechanism, attention
Torchgan
Research Framework for easy and efficient training of GANs based on Pytorch
Stars: ✭ 1,156 (+1309.76%)
Deeplearning.ai Assignments
Stars: ✭ 268 (+226.83%)
Mutual labels:  neural-networks, rnn
Sentence Vae
PyTorch Re-Implementation of "Generating Sentences from a Continuous Space" by Bowman et al 2015 https://arxiv.org/abs/1511.06349
Stars: ✭ 462 (+463.41%)
Mutual labels:  generative-model, vae
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+459.76%)
Mutual labels:  attention-mechanism, attention
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+591.46%)
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-31.71%)
Mutual labels:  generative-model, vae
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-74.39%)
Mutual labels:  attention-mechanism, attention
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+1007.32%)
Mutual labels:  attention-mechanism, attention
Time Attention
Implementation of RNN for Time Series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-36.59%)
Mutual labels:  rnn, attention
Generative Models
Collection of generative models, e.g. GAN, VAE in Pytorch and Tensorflow.
Stars: ✭ 6,701 (+8071.95%)
Mutual labels:  generative-model, vae
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+4068.29%)
Mutual labels:  rnn, attention
Codesearchnet
Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+1580.49%)
Mutual labels:  neural-networks, rnn
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-79.27%)
Mutual labels:  attention, attention-mechanism
Tf Rnn Attention
Tensorflow implementation of attention mechanism for text classification tasks.
Stars: ✭ 735 (+796.34%)
Mutual labels:  rnn, attention
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (-30.49%)
Mutual labels:  generative-model, vae
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-21.95%)
Mutual labels:  attention-mechanism, attention