svae cf: [WSDM '19] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (+2.7%)
Subword Nmt: Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
Stars: ✭ 1,819 (+4816.22%)
lego-face-VAE: Variational autoencoder for Lego minifig faces
Stars: ✭ 15 (-59.46%)
vae-concrete: Keras implementation of a Variational Auto Encoder with a Concrete Latent Distribution
Stars: ✭ 51 (+37.84%)
Nematus: Open-Source Neural Machine Translation in Tensorflow
Stars: ✭ 730 (+1872.97%)
vat nmt: Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019
Stars: ✭ 22 (-40.54%)
RNNSearch: An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (+16.22%)
VNMT: Code for "Variational Neural Machine Translation" (EMNLP 2016)
Stars: ✭ 54 (+45.95%)
deep-blueberry: If you've always wanted to learn about deep learning but don't know where to start, you might have stumbled upon the right place!
Stars: ✭ 17 (-54.05%)
Joeynmt: Minimalist NMT for educational purposes
Stars: ✭ 420 (+1035.14%)
continuous Bernoulli: C programs implementing the simulator, transformation, and test statistic of the continuous Bernoulli distribution; the accompanying book also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-40.54%)
normalizing-flows: PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+632.43%)
multimodal-vae-public: A PyTorch implementation of "Multimodal Generative Models for Scalable Weakly-Supervised Learning" (https://arxiv.org/abs/1802.05335)
Stars: ✭ 98 (+164.86%)
benchmark VAE: Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+3172.97%)
parallel-corpora-tools: Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (-5.41%)
Transformer Clinic: Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (+383.78%)
haskell-vae: Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-51.35%)
Nmt gan: Generative adversarial nets for neural machine translation
Stars: ✭ 110 (+197.3%)
Rnn Nmt: An encoder-decoder neural machine translation model based on a bidirectional RNN with an attention mechanism
Stars: ✭ 46 (+24.32%)
adVAE: Implementation of 'Self-Adversarial Variational Autoencoder with Gaussian Anomaly Prior Distribution for Anomaly Detection'
Stars: ✭ 17 (-54.05%)
Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (+1254.05%)
eccv16 attr2img: Torch implementation of the ECCV'16 paper Attribute2Image
Stars: ✭ 93 (+151.35%)
AC-VRNN: PyTorch code for the CVIU paper "AC-VRNN: Attentive Conditional-VRNN for Multi-Future Trajectory Prediction"
Stars: ✭ 21 (-43.24%)
Deep learning nlp: Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+1000%)
linguistic-style-transfer-pytorch: Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer" (ACL 2019) in Pytorch
Stars: ✭ 55 (+48.65%)
vaegan: An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+137.84%)
nmt: Network mapping tool
Stars: ✭ 16 (-56.76%)
CVAE Dial: CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-56.76%)
SIVI: Uses neural networks to build expressive hierarchical distributions; a variational method to accurately estimate posterior uncertainty; a fast and general method for Bayesian inference (ICML 2018)
Stars: ✭ 49 (+32.43%)
VAE-Latent-Space-Explorer: Interactive exploration of MNIST variational autoencoder latent space with React and tensorflow.js.
Stars: ✭ 30 (-18.92%)
playing with vae: Comparing FC VAE / FCN VAE / PCA / UMAP on MNIST / FMNIST
Stars: ✭ 53 (+43.24%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+359.46%)
vae-torch: Variational autoencoder for anomaly detection (in PyTorch); a minimal sketch of the underlying reparameterization and anomaly-scoring idea appears after this list.
Stars: ✭ 38 (+2.7%)
MIDI-VAE: No description or website provided.
Stars: ✭ 56 (+51.35%)
Nlp pytorch project: Embedding, NMT, Text_Classification, Text_Generation, NER, etc.
Stars: ✭ 153 (+313.51%)
Nmtpy: nmtpy is a Python framework based on dl4mt-tutorial to experiment with Neural Machine Translation pipelines.
Stars: ✭ 127 (+243.24%)
OCDVAEContinualLearning: Open-source code for the paper "Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition"
Stars: ✭ 56 (+51.35%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (+162.16%)
Xmunmt: An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+86.49%)
pyroVED: Invariant representation learning from imaging and spectral data
Stars: ✭ 23 (-37.84%)
Nlp tensorflow project: Uses TensorFlow to implement several NLP projects, e.g., classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-27.03%)
vae-pytorch: AE and VAE Playground in PyTorch
Stars: ✭ 53 (+43.24%)
Chatlearner: A chatbot implemented in TensorFlow based on the seq2seq model, with certain rules integrated.
Stars: ✭ 528 (+1327.03%)
Bagel: IPCCC 2018: Robust and Unsupervised KPI Anomaly Detection Based on Conditional Variational Autoencoder
Stars: ✭ 45 (+21.62%)
Attn2d: Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction
Stars: ✭ 475 (+1183.78%)
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Stars: ✭ 39 (+5.41%)
CIKM18-LCVA: Code for the CIKM'18 paper "Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects".
Stars: ✭ 13 (-64.86%)
VAE-Gumbel-Softmax: An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU); see the Gumbel-Softmax sampling sketch after this list.
Stars: ✭ 66 (+78.38%)
extractive rc by runtime mt: Code and datasets of "Multilingual Extractive Reading Comprehension by Runtime Machine Translation"
Stars: ✭ 36 (-2.7%)
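Several of the repositories above (for example vae-torch and vae-pytorch) center on the same core idea: a variational autoencoder trained with the reparameterization trick, where per-sample reconstruction error can serve as an anomaly score. The following is a minimal, self-contained PyTorch sketch of that idea only; the layer sizes, loss form, and anomaly-score definition are illustrative assumptions, not code from any of the listed projects.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyVAE(nn.Module):
        # Minimal VAE: encoder outputs mean and log-variance, decoder reconstructs the input.
        def __init__(self, x_dim=784, z_dim=16, h_dim=128):
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
            self.mu = nn.Linear(h_dim, z_dim)
            self.logvar = nn.Linear(h_dim, z_dim)
            self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                     nn.Linear(h_dim, x_dim))

        def forward(self, x):
            h = self.enc(x)
            mu, logvar = self.mu(h), self.logvar(h)
            # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I).
            eps = torch.randn_like(mu)
            z = mu + torch.exp(0.5 * logvar) * eps
            return self.dec(z), mu, logvar

    def elbo_loss(x, x_hat_logits, mu, logvar):
        # Negative ELBO: reconstruction term plus KL(q(z|x) || N(0, I)).
        rec = F.binary_cross_entropy_with_logits(x_hat_logits, x, reduction="sum")
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return rec + kl

    def anomaly_score(model, x):
        # Illustrative anomaly score: per-sample reconstruction error at test time.
        with torch.no_grad():
            x_hat_logits, _, _ = model(x)
            return F.binary_cross_entropy_with_logits(
                x_hat_logits, x, reduction="none").sum(dim=1)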
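The vae-concrete and VAE-Gumbel-Softmax entries both rely on the Concrete (Gumbel-Softmax) relaxation to backpropagate through discrete latent variables. Below is a short sketch of that sampling step, written in PyTorch for consistency with the block above (the original projects use Keras and TensorFlow); the temperature value and tensor shapes are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def gumbel_softmax_sample(logits, temperature=0.5):
        # Draw Gumbel(0, 1) noise: g = -log(-log(u)), u ~ Uniform(0, 1).
        u = torch.rand_like(logits)
        gumbel = -torch.log(-torch.log(u + 1e-20) + 1e-20)
        # Relaxed one-hot sample; lower temperature pushes it closer to a discrete sample.
        return F.softmax((logits + gumbel) / temperature, dim=-1)

    # Example: relax a 10-way categorical latent for a batch of 4.
    logits = torch.randn(4, 10)
    soft_one_hot = gumbel_softmax_sample(logits)   # shape (4, 10), rows sum to 1

PyTorch also ships a built-in torch.nn.functional.gumbel_softmax that covers the same operation, including a hard (straight-through) variant.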