Deep Speeling: Deep Learning neural network for correcting spelling
Stars: ✭ 45 (-56.73%)
Ailearning: AiLearning, covering Machine Learning (ML), Deep Learning (DL), and Natural Language Processing (NLP)
Stars: ✭ 32,316 (+30973.08%)
Codegan: [Deprecated] Source Code Generation using Sequence Generative Adversarial Networks
Stars: ✭ 73 (-29.81%)
Tf Rnn Attention: TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+606.73%)
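These projects do not share a single API, but as a rough illustration of attention pooling for text classification (the technique the entry above implements), here is a minimal NumPy sketch. All names (`attention_pool`, the shapes, the random data) are hypothetical, not taken from the repository:

```python
import numpy as np

def attention_pool(H, w):
    """Attention pooling over RNN hidden states for classification.

    H: (T, d) matrix of hidden states, one row per timestep.
    w: (d,) learned query vector.
    Returns the attention-weighted sum of hidden states, shape (d,).
    """
    scores = H @ w                                  # (T,) alignment scores
    scores = scores - scores.max()                  # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax weights over timesteps
    return alpha @ H                                # weighted sum, shape (d,)

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # toy input: 5 timesteps, hidden size 8
w = rng.standard_normal(8)
context = attention_pool(H, w)
print(context.shape)  # (8,)
```

In a real classifier this context vector would be fed to a dense softmax layer; the repository above builds the equivalent graph in TensorFlow.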
Attentive Neural Processes: Implementing "recurrent attentive neural processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-68.27%)
Eda nlp: Data augmentation for NLP, presented at EMNLP 2019
Stars: ✭ 902 (+767.31%)
Plasma Python: PPPL deep learning disruption prediction package
Stars: ✭ 65 (-37.5%)
Seq2seq Pytorch: Sequence-to-sequence models with PyTorch
Stars: ✭ 678 (+551.92%)
Rnn Notebooks: RNN (SimpleRNN, LSTM, GRU) notebooks for TensorFlow 2.0 & Keras (workshop materials)
Stars: ✭ 48 (-53.85%)
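To make the SimpleRNN/LSTM/GRU distinction concrete, here is a minimal NumPy sketch of a single GRU timestep with its two gates; this is a generic textbook formulation under assumed shapes, not code from the notebooks above:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU timestep (biases omitted for brevity)."""
    z = sigmoid(Wz @ x + Uz @ h)               # update gate: how much to refresh
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # interpolate old and new state

rng = np.random.default_rng(1)
d_in, d_h = 4, 6
# Input-to-hidden matrices are (d_h, d_in); hidden-to-hidden are (d_h, d_h).
params = [rng.standard_normal((d_h, d_in)) if i % 2 == 0
          else rng.standard_normal((d_h, d_h)) for i in range(6)]
h = np.zeros(d_h)
for x in rng.standard_normal((3, d_in)):   # run 3 toy timesteps
    h = gru_step(x, h, *params)
print(h.shape)  # (6,)
```

A SimpleRNN cell drops both gates (`h = tanh(W @ x + U @ h)`), while an LSTM adds a separate cell state with input, forget, and output gates.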
Word Rnn Tensorflow: Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow.
Stars: ✭ 1,297 (+1147.12%)
Boilerplate Dynet Rnn Lm: Boilerplate code for quickly setting up language-modeling experiments
Stars: ✭ 37 (-64.42%)
Theano Kaldi Rnn: THEANO-KALDI-RNNs implements various Recurrent Neural Networks (RNNs) for RNN-HMM speech recognition; the Theano code is coupled with the Kaldi decoder.
Stars: ✭ 31 (-70.19%)
Hred Attention Tensorflow: An extension of the Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion; the implementation is in TensorFlow and uses an attention mechanism.
Stars: ✭ 68 (-34.62%)
Sleepeegnet: SleepEEGNet, automated sleep stage scoring with a sequence-to-sequence deep learning approach
Stars: ✭ 89 (-14.42%)
Nlp overview: Overview of Modern Deep Learning Techniques Applied to Natural Language Processing
Stars: ✭ 1,104 (+961.54%)
Time Attention: Implementation of an RNN for time-series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-50%)
Cs224n: CS224n (Natural Language Processing with Deep Learning) assignments, Winter 2017
Stars: ✭ 656 (+530.77%)
Mxnet Seq2seq: Sequence-to-sequence learning with MXNet
Stars: ✭ 51 (-50.96%)
Easyesn: Python library for Reservoir Computing using Echo State Networks
Stars: ✭ 93 (-10.58%)
Deepseqslam: The Official Deep Learning Framework for Route-based Place Recognition
Stars: ✭ 49 (-52.88%)
Code: ECG Classification
Stars: ✭ 78 (-25%)
Rnn Nmt: An encoder-decoder neural machine translation model based on bidirectional RNNs with an attention mechanism
Stars: ✭ 46 (-55.77%)
Captcharecognition: End-to-end variable-length captcha recognition using CNN+RNN+Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-6.73%)
Dream: RNN-based model for recommendations
Stars: ✭ 77 (-25.96%)
Rnn Handwriting Generation: Handwriting generation by RNN with TensorFlow, based on "Generating Sequences With Recurrent Neural Networks" by Alex Graves
Stars: ✭ 90 (-13.46%)
Rnn Theano: RNN code implemented in Theano, including basic RNNs, LSTMs, and some attention models such as the MLSTM from the literature
Stars: ✭ 31 (-70.19%)
Gru Svm: [ICMLC 2018] A neural network architecture combining a Gated Recurrent Unit (GRU) and a Support Vector Machine (SVM) for intrusion detection
Stars: ✭ 76 (-26.92%)
Lstm peptides: Long short-term memory recurrent neural networks that learn peptide and protein sequences in order to design new, similar examples.
Stars: ✭ 30 (-71.15%)
Codesearchnet: Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+1225%)
Patter: Speech-to-text in PyTorch
Stars: ✭ 71 (-31.73%)
Seq2seq Chatbot: Chatbot in 200 lines of code using TensorLayer
Stars: ✭ 777 (+647.12%)
Pytorch Pos Tagging: A tutorial on implementing models for part-of-speech tagging using PyTorch and TorchText.
Stars: ✭ 96 (-7.69%)
Collaborative Rnn: A TensorFlow implementation of the collaborative RNN (Ko et al., 2016).
Stars: ✭ 60 (-42.31%)
Stockpriceprediction: Stock Price Prediction using Machine Learning Techniques
Stars: ✭ 700 (+573.08%)
Lstm chem: Implementation of the paper "Generative Recurrent Networks for De Novo Drug Design"
Stars: ✭ 87 (-16.35%)
See Rnn: RNN and general weights, gradients, and activations visualization in Keras & TensorFlow
Stars: ✭ 102 (-1.92%)
Text predictor: Char-level RNN LSTM text generator 📄.
Stars: ✭ 99 (-4.81%)
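The core of any char-level generator like the one above is a sampling loop over the model's next-character distribution. As a hedged sketch (the logits below are toy values, not produced by a real model, and `sample_char` is a hypothetical name), temperature-scaled sampling looks like this in NumPy:

```python
import numpy as np

def sample_char(logits, temperature=1.0, rng=None):
    """Sample a character index from raw logits with temperature scaling.

    Lower temperature sharpens the distribution (more conservative text);
    higher temperature flattens it (more surprising text).
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled = scaled - scaled.max()                  # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()   # softmax
    return rng.choice(len(probs), p=probs)

vocab = list("helo ")
logits = np.array([2.0, 0.1, 0.1, 3.0, 0.0])  # toy logits, fixed for illustration
rng = np.random.default_rng(0)
text = "".join(vocab[sample_char(logits, temperature=0.5, rng=rng)]
               for _ in range(10))
print(len(text))  # 10
```

In a real generator the logits are recomputed each step by feeding the previously sampled character back into the RNN.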
Arnet: CVPR 2018 - Regularizing RNNs for Caption Generation by Reconstructing The Past with The Present
Stars: ✭ 94 (-9.62%)
Char rnn lm zh: A Chinese language model, implemented following the official PyTorch documentation
Stars: ✭ 57 (-45.19%)