Linear Attention Recurrent Neural Network (LARNN): A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell.
Stars: ✭ 119 (+52.56%)
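The LARNN idea above (an LSTM-style cell that attends over a rolling window of its own past cell states) can be sketched as follows. This is a minimal, single-head NumPy illustration of the windowed-attention step only, not the repository's actual code; the window size, dimensions, gate-free state update, and mixing coefficients are all assumptions for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def windowed_attention(query, past_cells):
    """Single-head dot-product attention of the current cell state (query)
    over a window of past cell states (one per row of `past_cells`)."""
    d = query.shape[-1]
    scores = past_cells @ query / np.sqrt(d)  # one score per past state
    weights = softmax(scores)
    return weights @ past_cells               # weighted sum of past states

rng = np.random.default_rng(0)
d, window = 8, 5
cell_history = []              # rolling window of past cell states
c = np.zeros(d)

for t in range(10):
    x = rng.normal(size=d)
    # stand-in for the LSTM update; a real cell would use input/forget/output gates
    c = np.tanh(0.5 * c + 0.5 * x)
    if cell_history:
        attended = windowed_attention(c, np.stack(cell_history))
        c = 0.5 * c + 0.5 * attended   # mix the attended context back in
    cell_history.append(c)
    cell_history = cell_history[-window:]  # keep only the last `window` states

print(c.shape)
```

The point of the windowing is that attention cost stays bounded per step, so the cell can run inside an ordinary RNN loop of arbitrary length.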
Rnn Nmt: An encoder-decoder neural machine translation model based on bidirectional RNNs with an attention mechanism
Stars: ✭ 46 (-41.03%)
Seq2seq Pytorch: Sequence to Sequence Models with PyTorch
Stars: ✭ 678 (+769.23%)
Mxnet Seq2seq: Sequence to sequence learning with MXNet
Stars: ✭ 51 (-34.62%)
Deepattention: Deep Visual Attention Prediction (TIP18)
Stars: ✭ 65 (-16.67%)
Boilerplate Dynet Rnn Lm: Boilerplate code for quickly getting set up to run language modeling experiments
Stars: ✭ 37 (-52.56%)
Telemanom: A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions.
Stars: ✭ 589 (+655.13%)
Deeplearning: An introductory deep learning tutorial with a collection of excellent articles (Deep Learning Tutorial)
Stars: ✭ 6,783 (+8596.15%)
Lstm peptides: Long short-term memory recurrent neural networks for learning peptide and protein sequences to later design new, similar examples.
Stars: ✭ 30 (-61.54%)
Time Attention: Implementation of an RNN for time series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-33.33%)
Seq2seq Chatbot: Chatbot in 200 lines of code using TensorLayer
Stars: ✭ 777 (+896.15%)
Deepseqslam: The official deep learning framework for route-based place recognition
Stars: ✭ 49 (-37.18%)
Stockpriceprediction: Stock price prediction using machine learning techniques
Stars: ✭ 700 (+797.44%)
Codegan: [Deprecated] Source code generation using sequence generative adversarial networks
Stars: ✭ 73 (-6.41%)
Wtte Rnn: WTTE-RNN, a framework for churn and time-to-event prediction
Stars: ✭ 654 (+738.46%)
Nlp overview: Overview of modern deep learning techniques applied to natural language processing
Stars: ✭ 1,104 (+1315.38%)
Indrnn: TensorFlow implementation of Independently Recurrent Neural Networks
Stars: ✭ 511 (+555.13%)
Nmt Keras: Neural machine translation with Keras
Stars: ✭ 501 (+542.31%)
Theano Kaldi Rnn: THEANO-KALDI-RNNs is a project implementing various recurrent neural networks (RNNs) for RNN-HMM speech recognition. The Theano code is coupled with the Kaldi decoder.
Stars: ✭ 31 (-60.26%)
Ailearning: AiLearning: machine learning (ML), deep learning (DL), and natural language processing (NLP)
Stars: ✭ 32,316 (+41330.77%)
Eda nlp: Data augmentation for NLP, presented at EMNLP 2019
Stars: ✭ 902 (+1056.41%)
Gru Svm: [ICMLC 2018] A neural network architecture combining a Gated Recurrent Unit (GRU) and a Support Vector Machine (SVM) for intrusion detection
Stars: ✭ 76 (-2.56%)
Tf Rnn Attention: TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+842.31%)
Plasma Python: PPPL deep learning disruption prediction package
Stars: ✭ 65 (-16.67%)
Rnn Notebooks: RNN (SimpleRNN, LSTM, GRU) notebooks for TensorFlow 2.0 and Keras (workshop materials)
Stars: ✭ 48 (-38.46%)
Dream: RNN-based model for recommendations
Stars: ✭ 77 (-1.28%)
Cs224n: CS224n: Natural Language Processing with Deep Learning assignments, Winter 2017
Stars: ✭ 656 (+741.03%)
Deep Speeling: Deep learning neural network for correcting spelling
Stars: ✭ 45 (-42.31%)
Ad examples: A collection of anomaly detection methods (i.i.d./point-based, graph, and time series), including active learning for anomaly detection/discovery, Bayesian rule mining, and descriptions for diversity/explanation/interpretability. Includes analysis of incorporating label feedback with ensemble and tree-based detectors, plus adversarial attacks with a Graph Convolutional Network.
Stars: ✭ 641 (+721.79%)
Collaborative Rnn: A TensorFlow implementation of the collaborative RNN (Ko et al., 2016).
Stars: ✭ 60 (-23.08%)
Multi Class Text Classification Cnn Rnn: Classify Kaggle San Francisco crime descriptions into 39 classes. Builds the model with CNNs, RNNs (GRU and LSTM), and word embeddings in TensorFlow.
Stars: ✭ 570 (+630.77%)
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+1169.23%)
Video Classification: Tutorial for video classification/action recognition using 3D CNN and CNN+RNN on UCF101
Stars: ✭ 543 (+596.15%)
Patter: Speech-to-text in PyTorch
Stars: ✭ 71 (-8.97%)
Headlines: Automatically generate headlines for short articles
Stars: ✭ 516 (+561.54%)
Mozi: This project aims to build a basic, minimal, and maintainable react-native boilerplate, supporting iOS and Android 🌹
Stars: ✭ 501 (+542.31%)
Rgan: Recurrent (conditional) generative adversarial networks for generating real-valued time series data.
Stars: ✭ 480 (+515.38%)
Attentive Neural Processes: Implementation of "Recurrent Attentive Neural Processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-57.69%)
Hred Attention Tensorflow: An extension of the Hierarchical Recurrent Encoder-Decoder for generative context-aware query suggestion; our implementation is in TensorFlow and uses an attention mechanism.
Stars: ✭ 68 (-12.82%)
Char rnn lm zh: A character-level language model for Chinese, implemented following the official PyTorch documentation
Stars: ✭ 57 (-26.92%)