
jiangnanhugo / lmkit

Licence: other
A language-model toolkit with hierarchical softmax variants for recurrent language models

Programming Languages

Python
Shell
CUDA

Projects that are alternatives to or similar to lmkit

Haste
Haste: a fast, simple, and open RNN library
Stars: ✭ 214 (+1237.5%)
Mutual labels:  lstm, gru
tf-ran-cell
Recurrent Additive Networks for Tensorflow
Stars: ✭ 16 (+0%)
Mutual labels:  lstm, gru
Rnn ctc
Recurrent Neural Network and Long Short Term Memory (LSTM) with Connectionist Temporal Classification implemented in Theano. Includes a Toy training example.
Stars: ✭ 220 (+1275%)
Mutual labels:  lstm, gru
Load forecasting
Load forecasting of Delhi-area electric power load using ARIMA, RNN, LSTM, and GRU models
Stars: ✭ 160 (+900%)
Mutual labels:  lstm, gru
LSTM-GRU-from-scratch
LSTM, GRU cell implementation from scratch in tensorflow
Stars: ✭ 30 (+87.5%)
Mutual labels:  lstm, gru
Pytorch Kaldi
pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by pytorch, while feature extraction, label computation, and decoding are performed with the kaldi toolkit.
Stars: ✭ 2,097 (+13006.25%)
Mutual labels:  lstm, gru
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+21262.5%)
Mutual labels:  lstm, gru
Gdax Orderbook Ml
Application of machine learning to the Coinbase (GDAX) orderbook
Stars: ✭ 60 (+275%)
Mutual labels:  lstm, gru
ConvLSTM-PyTorch
ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+1162.5%)
Mutual labels:  lstm, gru
theano-recurrence
Recurrent Neural Networks (RNN, GRU, LSTM) and their Bidirectional versions (BiRNN, BiGRU, BiLSTM) for word & character level language modelling in Theano
Stars: ✭ 40 (+150%)
Mutual labels:  lstm, gru
Rnn Text Classification Tf
Tensorflow Implementation of Recurrent Neural Network (Vanilla, LSTM, GRU) for Text Classification
Stars: ✭ 114 (+612.5%)
Mutual labels:  lstm, gru
Manhattan-LSTM
Keras and PyTorch implementations of the MaLSTM model for computing Semantic Similarity.
Stars: ✭ 28 (+75%)
Mutual labels:  lstm, gru
Pytorch Rnn Text Classification
Word Embedding + LSTM + FC
Stars: ✭ 112 (+600%)
Mutual labels:  lstm, gru
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+931.25%)
Mutual labels:  lstm, gru
See Rnn
RNN and general weights, gradients, & activations visualization in Keras & TensorFlow
Stars: ✭ 102 (+537.5%)
Mutual labels:  lstm, gru
Trafficflowprediction
Traffic Flow Prediction with Neural Networks (SAEs, LSTM, GRU).
Stars: ✭ 242 (+1412.5%)
Mutual labels:  lstm, gru
Rnn Notebooks
RNN(SimpleRNN, LSTM, GRU) Tensorflow2.0 & Keras Notebooks (Workshop materials)
Stars: ✭ 48 (+200%)
Mutual labels:  lstm, gru
Tensorflow Lstm Sin
TensorFlow 1.3 experiment with LSTM (and GRU) RNNs for sine prediction
Stars: ✭ 52 (+225%)
Mutual labels:  lstm, gru
LearningMetersPoems
Official repo of the article: Yousef, W. A., Ibrahime, O. M., Madbouly, T. M., & Mahmoud, M. A. (2019), "Learning meters of arabic and english poems with recurrent neural networks: a step forward for language understanding and synthesis", arXiv preprint arXiv:1905.05700
Stars: ✭ 18 (+12.5%)
Mutual labels:  lstm, gru
myDL
Deep Learning
Stars: ✭ 18 (+12.5%)
Mutual labels:  lstm, gru

lmkit

  1. original: recurrent language model (RNNLM) with LSTM/GRU cells
  2. Sampling: noise-contrastive estimation / negative sampling / BlackOut for RNNLM
  3. cHSM: class-based hierarchical softmax for RNNLM
  4. masked-cHSM: cHSM with an unequally partitioned vocabulary for class decomposition
  5. p-tHSM: parallelized tree-based hierarchical softmax for RNNLM
  6. tHSM: traditional tree-based hierarchical softmax with Huffman coding for RNNLM
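The cHSM variants factor the output distribution into a class prediction followed by an in-class word prediction, so each step scores roughly |C| + |V|/|C| outputs instead of all |V| words. A minimal NumPy sketch of the two-level factorization (the sizes and weight names here are invented for illustration, not lmkit's actual API):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical toy sizes: hidden size 4, 2 classes, 3 words per class.
rng = np.random.default_rng(0)
n_hidden, n_class, n_in_class = 4, 2, 3
W_class = rng.standard_normal((n_hidden, n_class))             # class scores
W_word = rng.standard_normal((n_class, n_hidden, n_in_class))  # per-class word scores

def chsm_prob(h, c, j):
    """p(word) = p(class c | h) * p(word j | class c, h)."""
    p_class = softmax(h @ W_class)
    p_word = softmax(h @ W_word[c])
    return p_class[c] * p_word[j]

h = rng.standard_normal(n_hidden)
# The probabilities over all (class, word) pairs sum to 1.
total = sum(chsm_prob(h, c, j) for c in range(n_class) for j in range(n_in_class))
print(round(total, 6))  # → 1.0
```

Since both factors are proper softmax distributions, the product normalizes exactly over the flattened vocabulary.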
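For the tree-based variants, each word is a leaf of a Huffman tree and its probability is a product of sigmoid decisions along the root-to-leaf path, so only O(log |V|) nodes are scored per word. A hedged sketch of Huffman path construction and the path-product probability (the toy frequencies and parameter names are assumptions, not lmkit internals):

```python
import heapq
import itertools
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def huffman_paths(freqs):
    """Map each word to its root-to-leaf path: a list of (internal_node_id, bit)."""
    counter = itertools.count()
    heap = [(f, next(counter), w) for w, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(counter), (a, b)))
    paths, node_ids = {}, itertools.count()

    def walk(node, path):
        if isinstance(node, tuple):          # internal node: recurse left/right
            nid = next(node_ids)
            walk(node[0], path + [(nid, 0)])
            walk(node[1], path + [(nid, 1)])
        else:                                # leaf: record the word's path
            paths[node] = path
    walk(heap[0][2], [])
    return paths

rng = np.random.default_rng(0)
freqs = {"the": 50, "cat": 20, "sat": 15, "mat": 10, "on": 5}  # toy counts
paths = huffman_paths(freqs)
n_internal = len(freqs) - 1                  # a binary tree has V-1 internal nodes
n_hidden = 4
V_nodes = rng.standard_normal((n_internal, n_hidden))  # one vector per internal node
h = rng.standard_normal(n_hidden)

def thsm_prob(word):
    # p(word | h) = product over the path of sigmoid(+/- v_node . h)
    p = 1.0
    for nid, bit in paths[word]:
        s = sigmoid(V_nodes[nid] @ h)
        p *= s if bit == 1 else 1.0 - s
    return p

total = sum(thsm_prob(w) for w in freqs)
print(round(total, 6))  # → 1.0
```

Because every internal node splits its probability mass between its two children, the leaf probabilities sum to 1 by construction, and frequent words get shorter (cheaper) paths.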
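The sampling-based losses in item 2 avoid the full softmax by contrasting the target word against a handful of sampled noise words. Below is a rough negative-sampling loss in NumPy (the vocabulary size, the word2vec-style 3/4-power noise distribution, and all tensor names are assumptions for the sketch, not lmkit's implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy sizes: vocab 100, hidden size 8, k = 5 negative samples.
rng = np.random.default_rng(0)
V, H, k = 100, 8, 5
W_out = rng.standard_normal((V, H))   # output word embeddings
h = rng.standard_normal(H)            # RNN hidden state for this step
target = 7                            # index of the true next word

# Unigram counts raised to the 3/4 power, a common choice for the noise distribution.
counts = rng.integers(1, 100, size=V).astype(float)
noise = counts ** 0.75
noise /= noise.sum()
negatives = rng.choice(V, size=k, replace=False, p=noise)
negatives = negatives[negatives != target]   # drop accidental hits on the target

# Negative-sampling loss: -log sigma(v_t . h) - sum_n log sigma(-v_n . h)
loss = -np.log(sigmoid(W_out[target] @ h))
loss -= np.log(sigmoid(-(W_out[negatives] @ h))).sum()
print(float(loss))
```

Only k + 1 dot products are evaluated per step instead of V, which is the entire point of these estimators for large vocabularies.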

Accepted paper

@inproceedings{DBLP:conf/ijcai/JiangRGSX17,
  author    = {Nan Jiang and
               Wenge Rong and
               Min Gao and
               Yikang Shen and
               Zhang Xiong},
  title     = {Exploration of Tree-based Hierarchical Softmax for Recurrent Language
               Models},
  booktitle = {Proceedings of the Twenty-Sixth International Joint Conference on
               Artificial Intelligence, {IJCAI} 2017, Melbourne, Australia, August
               19-25, 2017},
  pages     = {1951--1957},
  year      = {2017},
  crossref  = {DBLP:conf/ijcai/2017},
  url       = {https://doi.org/10.24963/ijcai.2017/271},
  doi       = {10.24963/ijcai.2017/271},
  timestamp = {Tue, 15 Aug 2017 14:48:05 +0200},
  biburl    = {https://dblp.org/rec/bib/conf/ijcai/JiangRGSX17},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}