Lstm Human Activity Recognition
Human Activity Recognition example using TensorFlow on a smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
Stars: ✭ 2,943 (+2080%)
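Many entries on this page revolve around the same LSTM recurrence, so a single-unit, pure-Python sketch of one time step may help fix ideas (all weights and inputs below are illustrative placeholders, not taken from any listed project):

```python
import math

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a single-unit LSTM.

    x: input vector; h_prev, c_prev: previous hidden/cell state (floats).
    W: dict of weight vectors over [h_prev] + x for gates i, f, g, o;
    b: dict of gate biases. Parameter values are assumptions for illustration.
    """
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    z = [h_prev] + list(x)                   # concat previous state and input
    pre = {k: sum(w * v for w, v in zip(W[k], z)) + b[k] for k in W}
    i = sigmoid(pre["i"])                    # input gate
    f = sigmoid(pre["f"])                    # forget gate
    g = math.tanh(pre["g"])                  # candidate cell value
    o = sigmoid(pre["o"])                    # output gate
    c = f * c_prev + i * g                   # new cell state
    h = o * math.tanh(c)                     # new hidden state
    return h, c
```

Because the hidden state passes through `tanh`, the output `h` always stays in (-1, 1); real libraries vectorise this over many units and batch the four gate products into one matrix multiply.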
presidential-rnn
Project 4 for the Metis bootcamp. The objective was generation from a character-level RNN trained on Donald Trump's statements using Keras. Also generated Markov chains, and a quick PyTorch RNN as a baseline. Attempted a semi-supervised GAN, but was unable to test it in time.
Stars: ✭ 26 (-80.74%)
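The Markov-chain baseline mentioned above can be sketched in a few lines of plain Python (the word-level granularity, `order` parameter, and corpus are illustrative assumptions, not details taken from the project):

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each `order`-word prefix to the list of words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, length=20, seed=0):
    """Random-walk the chain to emit up to `length` words."""
    rng = random.Random(seed)
    prefix = rng.choice(list(chain))
    out = list(prefix)
    for _ in range(length - len(prefix)):
        followers = chain.get(tuple(out[-len(prefix):]))
        if not followers:            # dead end: prefix never seen mid-corpus
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

A higher `order` copies longer verbatim runs from the corpus; order 1 or 2 gives the loosely coherent text typical of such baselines.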
Pytorch Dnc
Differentiable Neural Computers, Sparse Access Memory and Sparse Differentiable Neural Computers, for PyTorch
Stars: ✭ 264 (+95.56%)
Selected Stories
An experimental web text editor that runs an LSTM model while you write to suggest new lines
Stars: ✭ 39 (-71.11%)
slopeone
PHP implementation of the Weighted Slope One rating-based collaborative filtering scheme.
Stars: ✭ 85 (-37.04%)
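Weighted Slope One, as implemented above in PHP, predicts a user's rating for a target item from the average rating difference between the target and each item the user has rated, weighting each difference by how many users co-rated the pair. A minimal Python sketch (the rating-dictionary layout is an assumption for illustration, not the PHP project's API):

```python
from collections import defaultdict

def slope_one_predict(ratings, user, target):
    """Weighted Slope One prediction of `user`'s rating for `target`.

    ratings: {user: {item: rating}}. dev(target, i) is the mean of
    r(target) - r(i) over users who rated both items.
    """
    diffs = defaultdict(float)   # summed rating differences per item
    counts = defaultdict(int)    # number of co-rating users per item
    for r in ratings.values():
        if target in r:
            for item, value in r.items():
                if item != target:
                    diffs[item] += r[target] - value
                    counts[item] += 1
    num = den = 0.0
    for item, value in ratings[user].items():
        if counts[item]:  # weight dev(target, item) by co-rating count
            num += (value + diffs[item] / counts[item]) * counts[item]
            den += counts[item]
    return num / den if den else None
```

The scheme needs no model training: the deviation table can be precomputed once and updated incrementally as new ratings arrive, which is its main appeal.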
Rnn Theano
Some RNN code implemented with Theano, including the most basic RNN, LSTM, and some attention models such as the MLSTM paper's
Stars: ✭ 31 (-77.04%)
Movielens Recommender
A pure Python implementation of Collaborative Filtering based on the MovieLens dataset.
Stars: ✭ 131 (-2.96%)
Har Stacked Residual Bidir Lstms
Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we do Human Activity Recognition (HAR). Classifying the type of movement amongst 6 categories or 18 categories on 2 different datasets.
Stars: ✭ 250 (+85.19%)
captioning chainer
A fast implementation of Neural Image Caption in Chainer
Stars: ✭ 17 (-87.41%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+2431.85%)
Lstm peptides
Long short-term memory recurrent neural networks for learning peptide and protein sequences to later design new, similar examples.
Stars: ✭ 30 (-77.78%)
Lightnet
Efficient, transparent deep learning in hundreds of lines of code.
Stars: ✭ 243 (+80%)
sgrnn
TensorFlow implementation of Synthetic Gradients for RNN (LSTM)
Stars: ✭ 40 (-70.37%)
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+2277.04%)
Rnn ctc
Recurrent Neural Network and Long Short-Term Memory (LSTM) with Connectionist Temporal Classification, implemented in Theano. Includes a toy training example.
Stars: ✭ 220 (+62.96%)
Kprn
Reasoning Over Knowledge Graph Paths for Recommendation
Stars: ✭ 220 (+62.96%)
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-82.96%)
Haste
Haste: a fast, simple, and open RNN library
Stars: ✭ 214 (+58.52%)
Indrnn Pytorch
PyTorch implementation of Independently Recurrent Neural Networks, https://arxiv.org/abs/1803.04831
Stars: ✭ 104 (-22.96%)
multi channel bpr
Implementation of Bayesian Personalized Ranking (BPR) for Multiple Feedback Channels
Stars: ✭ 25 (-81.48%)
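The underlying BPR criterion maximises ln σ(x̂_ui − x̂_uj) over triples of a user u, an observed item i, and an unobserved (negatively sampled) item j. One stochastic gradient step on matrix-factorisation scores might look like this (the factor dimension, learning rate, and regularisation strength are illustrative assumptions):

```python
import math

def bpr_step(U, V, u, i, j, lr=0.05, reg=0.01):
    """One SGD ascent step on the BPR-Opt objective ln sigmoid(x_ui - x_uj).

    U: list of user factor vectors; V: list of item factor vectors.
    User u interacted with item i (positive) but not item j (negative).
    """
    # score difference x_uij = U[u] . (V[i] - V[j])
    x_uij = sum(a * (b - c) for a, b, c in zip(U[u], V[i], V[j]))
    g = 1.0 / (1.0 + math.exp(x_uij))  # = 1 - sigmoid(x_uij), the gradient scale
    for f in range(len(U[u])):
        du, di, dj = U[u][f], V[i][f], V[j][f]
        U[u][f] += lr * (g * (di - dj) - reg * du)
        V[i][f] += lr * (g * du - reg * di)
        V[j][f] += lr * (-g * du - reg * dj)
```

Repeating this over sampled triples pushes the score of observed items above unobserved ones, which is the pairwise ranking objective the multi-channel variant generalises across feedback types.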
Telemanom
A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions.
Stars: ✭ 589 (+336.3%)
MoHR
MoHR: Recommendation Through Mixtures of Heterogeneous Item Relationships
Stars: ✭ 51 (-62.22%)
Iseebetter
iSeeBetter: Spatio-Temporal Video Super Resolution using Recurrent-Generative Back-Projection Networks | Python3 | PyTorch | GANs | CNNs | ResNets | RNNs | Published in Springer Journal of Computational Visual Media, September 2020, Tsinghua University Press
Stars: ✭ 202 (+49.63%)
Recq
RecQ: A Python Framework for Recommender Systems (TensorFlow Based)
Stars: ✭ 883 (+554.07%)
ms-convSTAR
[RSE21] PyTorch code for hierarchical time series classification with multi-stage convolutional RNN
Stars: ✭ 17 (-87.41%)
Keraspp
Code for the Korean book "Coding Chef's 3-Minute Deep Learning, Keras Flavor" (코딩셰프의 3분 딥러닝, 케라스맛)
Stars: ✭ 178 (+31.85%)
altair
Assessing Source Code Semantic Similarity with Unsupervised Learning
Stars: ✭ 42 (-68.89%)
ConvLSTM-PyTorch
ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+49.63%)
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (+22.22%)
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-11.85%)
tiny-rnn
Lightweight C++11 library for building deep recurrent neural networks
Stars: ✭ 41 (-69.63%)
Load forecasting
Load forecasting on Delhi-area electric power load using ARIMA, RNN, LSTM and GRU models
Stars: ✭ 160 (+18.52%)
Fastfm
fastFM: A Library for Factorization Machines
Stars: ✭ 908 (+572.59%)
totally humans
An RNN trained on r/totallynotrobots 🤖
Stars: ✭ 23 (-82.96%)
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model has been uploaded and can be run directly (and if it won't run, the author offers to eat a keyboard on a livestream).
Stars: ✭ 124 (-8.15%)
Captcharecognition
End-to-end variable-length captcha recognition using CNN+RNN+Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-28.15%)
Multi Class Text Classification Cnn Rnn
Classify Kaggle San Francisco Crime Descriptions into 39 classes. Builds the model with CNN, RNN (GRU and LSTM) and word embeddings on TensorFlow.
Stars: ✭ 570 (+322.22%)
char-VAE
Inspired by the neural style algorithm from the computer vision field, we propose a high-level language model with the aim of adapting linguistic style.
Stars: ✭ 18 (-86.67%)