char-VAE: Inspired by the neural style algorithm from computer vision, a high-level language model aimed at adapting linguistic style.
Stars: ✭ 18 (-87.59%)
Word Rnn Tensorflow: Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow.
Stars: ✭ 1,297 (+794.48%)
Sleepeegnet: SleepEEGNet, automated sleep stage scoring with a sequence-to-sequence deep learning approach.
Stars: ✭ 89 (-38.62%)
Numpy Ml: Machine learning, in NumPy.
Stars: ✭ 11,100 (+7555.17%)
Dream: RNN-based model for recommendations.
Stars: ✭ 77 (-46.9%)
Captcharecognition: End-to-end variable-length captcha recognition using CNN+RNN+Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-33.1%)
Arnet: CVPR 2018 - Regularizing RNNs for Caption Generation by Reconstructing the Past with the Present.
Stars: ✭ 94 (-35.17%)
Srl Zoo: State Representation Learning (SRL) zoo with PyTorch - part of the S-RL Toolbox.
Stars: ✭ 125 (-13.79%)
Lstms.pth: PyTorch implementations of LSTM variants (Dropout + Layer Norm).
Stars: ✭ 111 (-23.45%)
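The entry above covers layer-normalized LSTM variants. As an illustration of the idea (not the repo's actual API), here is a minimal NumPy sketch of one LSTM step with layer normalization applied to the gate pre-activations and the cell output; all names and the single-vector normalization are simplifying assumptions:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize a vector to zero mean and unit variance.
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ln_lstm_step(x, h, c, Wx, Wh, b):
    # One LSTM step; layer norm is applied to the stacked gate
    # pre-activations and to the new cell state before the output.
    z = layer_norm(Wx @ x + Wh @ h + b)
    H = h.shape[0]
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2*H])       # forget gate
    o = sigmoid(z[2*H:3*H])     # output gate
    g = np.tanh(z[3*H:])        # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(layer_norm(c_new))
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 4, 3
x = rng.standard_normal(D)
h, c = np.zeros(H), np.zeros(H)
Wx = rng.standard_normal((4 * H, D)) * 0.1
Wh = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = ln_lstm_step(x, h, c, Wx, Wh, b)
print(h.shape, c.shape)  # (3,) (3,)
```

Real implementations typically normalize each gate's pre-activation separately and learn per-feature gain and bias; this sketch collapses that for brevity.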
Lstm chem: Implementation of the paper "Generative Recurrent Networks for De Novo Drug Design".
Stars: ✭ 87 (-40%)
Rnn recsys: Implementation of the paper "Embedding-based News Recommendation for Millions of Users".
Stars: ✭ 135 (-6.9%)
Code: ECG classification.
Stars: ✭ 78 (-46.21%)
Mojitalk: Code for "MojiTalk: Generating Emotional Responses at Scale" (https://arxiv.org/abs/1711.04090).
Stars: ✭ 107 (-26.21%)
Gru Svm: [ICMLC 2018] A neural network architecture combining Gated Recurrent Unit (GRU) and Support Vector Machine (SVM) for intrusion detection.
Stars: ✭ 76 (-47.59%)
Indrnn Pytorch: PyTorch implementation of Independently Recurrent Neural Networks (https://arxiv.org/abs/1803.04831).
Stars: ✭ 104 (-28.28%)
Codegan: [Deprecated] Source code generation using sequence generative adversarial networks.
Stars: ✭ 73 (-49.66%)
Hred Attention Tensorflow: An extension of the Hierarchical Recurrent Encoder-Decoder for generative context-aware query suggestion, implemented in TensorFlow with an attention mechanism.
Stars: ✭ 68 (-53.1%)
O Gan: O-GAN, an extremely concise approach to auto-encoding generative adversarial networks.
Stars: ✭ 117 (-19.31%)
Chinese Chatbot: A Chinese chatbot trained on 100,000 dialogue pairs, using an attention mechanism; it generates a meaningful reply to most general questions. The trained model is included and can be run directly ("if it doesn't run, I'll livestream eating my keyboard").
Stars: ✭ 124 (-14.48%)
Pytorch Pos Tagging: A tutorial on implementing part-of-speech tagging models with PyTorch and TorchText.
Stars: ✭ 96 (-33.79%)
Pytorch cpp: Deep learning sample programs using PyTorch in C++.
Stars: ✭ 114 (-21.38%)
Easyesn: Python library for reservoir computing using Echo State Networks.
Stars: ✭ 93 (-35.86%)
Qrn: Query-Reduction Networks (QRN).
Stars: ✭ 137 (-5.52%)
Rnn Handwriting Generation: Handwriting generation with an RNN in TensorFlow, based on "Generating Sequences With Recurrent Neural Networks" by Alex Graves.
Stars: ✭ 90 (-37.93%)
Midi Rnn: Generate monophonic melodies with machine learning using a basic LSTM RNN.
Stars: ✭ 124 (-14.48%)
Vae For Image Generation: A Variational Autoencoder generative model in Keras for image generation, with latent-space visualization on the MNIST and CIFAR-10 datasets.
Stars: ✭ 87 (-40%)
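Several entries here build on variational autoencoders. As a framework-free illustration of the reparameterization trick those models rely on (a sketch with illustrative names, not code from any listed repo), a latent sample z ~ N(mu, sigma^2) is drawn as mu + sigma * eps so the sampling step stays differentiable with respect to the encoder outputs:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Draw eps ~ N(0, I) and shift/scale it: z = mu + sigma * eps,
    # where sigma = exp(0.5 * log_var). Gradients can then flow
    # through mu and log_var, while eps carries the randomness.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

mu = np.array([0.0, 1.0])
log_var = np.array([0.0, -2.0])   # sigma = [1.0, ~0.37]
z = reparameterize(mu, log_var, np.random.default_rng(42))
print(z.shape)  # (2,)
```

In a full VAE, mu and log_var come from the encoder network and z feeds the decoder; the KL term of the loss is computed from the same mu and log_var.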
Mnist Classification: Multiple classification methods implemented with PyTorch and scikit-learn, including Logistic Regression, Multilayer Perceptron (MLP), Support Vector Machine (SVM), K-Nearest Neighbors (KNN), CNN, and RNN. Minimal code suitable for beginners, with an English lab report (ACM template) included.
Stars: ✭ 109 (-24.83%)
Smrt: Handle class imbalance intelligently by using variational autoencoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (-29.66%)
Plasma Python: PPPL deep learning package for disruption prediction.
Stars: ✭ 65 (-55.17%)
Vmf vae nlp: Code for the EMNLP 2018 paper "Spherical Latent Spaces for Stable Variational Autoencoders".
Stars: ✭ 140 (-3.45%)
Rnn From Scratch: Use TensorFlow's tf.scan to build vanilla, GRU, and LSTM RNNs.
Stars: ✭ 123 (-15.17%)
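The entry above builds RNNs with tf.scan. To show the underlying idea without TensorFlow, here is a NumPy sketch of a scan helper folding a vanilla RNN step over a sequence while keeping every intermediate hidden state (all names and shapes are illustrative, not the repo's code):

```python
import numpy as np

def rnn_step(h, x, Wx, Wh, b):
    # Vanilla RNN: the new hidden state is a nonlinear mix of the
    # current input and the previous hidden state.
    return np.tanh(Wx @ x + Wh @ h + b)

def scan(step, xs, init):
    # Minimal scan: fold `step` over `xs`, collecting every
    # intermediate state (what tf.scan does across the time axis).
    states, state = [], init
    for x in xs:
        state = step(state, x)
        states.append(state)
    return np.stack(states)

rng = np.random.default_rng(1)
T, D, H = 5, 3, 4
xs = rng.standard_normal((T, D))          # input sequence
Wx = rng.standard_normal((H, D)) * 0.1
Wh = rng.standard_normal((H, H)) * 0.1
b = np.zeros(H)
hs = scan(lambda h, x: rnn_step(h, x, Wx, Wh, b), xs, np.zeros(H))
print(hs.shape)  # (5, 4): one hidden state per time step
```

Swapping `rnn_step` for a GRU or LSTM step function is all it takes to change cell type, which is precisely why the scan formulation is convenient.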
Cross Lingual Voice Cloning: Tacotron 2 PyTorch implementation with faster-than-real-time inference, modified to enable cross-lingual voice cloning.
Stars: ✭ 106 (-26.9%)
Dfc Vae: Variational Autoencoder trained with a feature perceptual loss.
Stars: ✭ 74 (-48.97%)
Patter: Speech-to-text in PyTorch.
Stars: ✭ 71 (-51.03%)
See Rnn: Visualization of RNN (and general) weights, gradients, and activations in Keras and TensorFlow.
Stars: ✭ 102 (-29.66%)
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN.
Stars: ✭ 119 (-17.93%)
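The LARNN entry above describes a cell that attends over a window of its own past states. A minimal NumPy sketch of that mechanism, single-head dot-product attention over the last few states (names, window size, and scaling are illustrative assumptions, not the LARNN's exact formulas):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def windowed_attention(query, past_states, window=3):
    # Score the last `window` past states against the query with
    # scaled dot products, then return their softmax-weighted sum.
    keys = past_states[-window:]
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = softmax(scores)
    return weights @ keys  # context vector over recent history

rng = np.random.default_rng(0)
H = 4
past = rng.standard_normal((6, H))  # six stored past states
q = rng.standard_normal(H)          # current query (e.g. hidden state)
ctx = windowed_attention(q, past, window=3)
print(ctx.shape)  # (4,)
```

In a recurrent loop, the context vector would be concatenated with the usual LSTM inputs at each step, letting the cell revisit recent history beyond what the cell state alone retains.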
Collaborative Rnn: A TensorFlow implementation of the collaborative RNN (Ko et al., 2016).
Stars: ✭ 60 (-58.62%)
Codesearchnet: Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+850.34%)
Tensorflow Mnist Cvae: TensorFlow implementation of a conditional variational autoencoder for MNIST.
Stars: ✭ 139 (-4.14%)
Deep Learning With Python: Example projects for understanding deep learning techniques with TensorFlow. Note: this repository is no longer maintained.
Stars: ✭ 134 (-7.59%)