
HadoopIt / Rnn Nlu

A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Rnn Nlu

Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (-18.14%)
Mutual labels:  recurrent-neural-networks, attention
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-72.79%)
Mutual labels:  recurrent-neural-networks, attention
Punctuator2
A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text
Stars: ✭ 483 (+4.32%)
Mutual labels:  recurrent-neural-networks, attention
sequence labeling tf
Sequence Labeling in Tensorflow
Stars: ✭ 18 (-96.11%)
Mutual labels:  recurrent-neural-networks, sequence-labeling
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-90.71%)
Mutual labels:  recurrent-neural-networks, attention
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-95.68%)
Mutual labels:  recurrent-neural-networks, attention
Hey Jetson
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-65.23%)
Mutual labels:  recurrent-neural-networks, attention
Rnnsharp
RNNSharp is a toolkit for deep recurrent neural networks that is widely used for many different kinds of tasks, such as sequence labeling and sequence-to-sequence modeling. It is written in C# and targets .NET Framework 4.6 or above. RNNSharp supports many different types of networks, such as forward and bidirectional networks and sequence-to-sequence networks, and different types of layers, such as LSTM, softmax, and sampled softmax.
Stars: ✭ 277 (-40.17%)
Mutual labels:  recurrent-neural-networks, sequence-labeling
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (-12.1%)
Mutual labels:  recurrent-neural-networks, attention
Lstm Crf Pytorch
LSTM-CRF in PyTorch
Stars: ✭ 364 (-21.38%)
Mutual labels:  sequence-labeling
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-11.88%)
Mutual labels:  attention
Nlp Projects
word2vec, sentence2vec, machine reading comprehension, dialog system, text classification, pretrained language model (i.e., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, information extraction (i.e., entity, relation and event extraction), knowledge graph, text generation, network embedding
Stars: ✭ 360 (-22.25%)
Mutual labels:  sequence-labeling
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (-19.01%)
Mutual labels:  recurrent-neural-networks
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (-11.23%)
Mutual labels:  attention
Qtrader
Reinforcement Learning for Portfolio Management
Stars: ✭ 363 (-21.6%)
Mutual labels:  recurrent-neural-networks
Tensorflow Lstm Regression
Sequence prediction using recurrent neural networks (LSTM) with TensorFlow
Stars: ✭ 433 (-6.48%)
Mutual labels:  recurrent-neural-networks
Autoner
Learning Named Entity Tagger from Domain-Specific Dictionary
Stars: ✭ 357 (-22.89%)
Mutual labels:  sequence-labeling
Sltk
A sequence labeling toolkit implementing a BLSTM-CNN-CRF model in PyTorch, reaching an F1 score of 91.10% on the CoNLL 2003 English NER test set (word and char features).
Stars: ✭ 338 (-27%)
Mutual labels:  sequence-labeling
Ban Vqa
Bilinear attention networks for visual question answering
Stars: ✭ 449 (-3.02%)
Mutual labels:  attention
Gansformer
Generative Adversarial Transformers
Stars: ✭ 421 (-9.07%)
Mutual labels:  attention

Attention-based RNN model for Spoken Language Understanding (Intent Detection & Slot Filling)

TensorFlow implementation of attention-based LSTM models for sequence classification and sequence labeling.

Updates - 2017/07/29

  • Updated code to work with the latest TensorFlow API: r1.2
  • Code cleanup and formatting
  • Note that the published code does not model output label dependencies. One may add a loop function, as in the rnn_decoder function in TensorFlow's seq2seq.py example, to feed the emitted label embedding back into the RNN state. Alternatively, sequence-level optimization can be performed by adding a CRF layer on top of the RNN outputs (see the sketch after this list).
  • The dataset used in the paper can be found at: https://github.com/yvchen/JointSLU/tree/master/data. We used the training set in the original ATIS train/test split, which has 4978 training samples. There are 15 test samples that have multiple intent labels for an utterance. We used the more frequent label (most likely, "flight") as the true label during evaluation.
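
A minimal sketch of the CRF option mentioned in the last bullet, assuming a slot_logits tensor of per-step tag scores coming out of the RNN. None of this is in the published code; it only illustrates the TF 1.x tf.contrib.crf API with hypothetical sizes and names.

import tensorflow as tf  # TF 1.x, where the CRF ops live in tf.contrib.crf

num_tags = 128                                         # hypothetical tag-set size
slot_logits = tf.placeholder(tf.float32, [None, None, num_tags])  # per-step RNN tag scores
tags = tf.placeholder(tf.int32, [None, None])          # gold slot labels
seq_len = tf.placeholder(tf.int32, [None])             # true sequence lengths

# crf_log_likelihood learns a [num_tags, num_tags] transition matrix and scores
# whole label sequences, i.e. it models the output label dependencies discussed above
log_likelihood, transition_params = tf.contrib.crf.crf_log_likelihood(
    slot_logits, tags, seq_len)
crf_loss = tf.reduce_mean(-log_likelihood)

# At test time, decode each example with Viterbi; tf.contrib.crf.viterbi_decode
# runs on numpy arrays outside the graph, given the learned transition_params.

In a joint setup this crf_loss would replace the per-step cross-entropy on the slot logits; the intent loss is unchanged.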

Setup

Usage:

data_dir=data/ATIS_samples
model_dir=model_tmp
max_sequence_length=50  # max length for train/valid/test sequence
task=joint  # available options: intent; tagging; joint
bidirectional_rnn=True  # available options: True; False
use_attention=True  # available options: True; False

python run_multi-task_rnn.py --data_dir $data_dir \
      --train_dir $model_dir \
      --max_sequence_length $max_sequence_length \
      --task $task \
      --bidirectional_rnn $bidirectional_rnn \
      --use_attention $use_attention
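
For orientation, task=joint with bidirectional_rnn=True and use_attention=True roughly corresponds to the graph sketched below: a bidirectional LSTM encoder, a per-step softmax head for slot tagging, and an attention-pooled intent classifier. This is a minimal illustration written against the TF 1.x API the repo targets, not the repository's actual run_multi-task_rnn.py; all sizes and variable names are hypothetical.

import tensorflow as tf  # TF 1.x API, matching the r1.2 target noted above

# hypothetical sizes, for illustration only
vocab_size, num_tags, num_intents = 10000, 128, 26
embed_dim, hidden_dim = 128, 128

words = tf.placeholder(tf.int32, [None, None], name="words")    # [batch, time]
seq_len = tf.placeholder(tf.int32, [None], name="seq_len")      # true lengths

embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
inputs = tf.nn.embedding_lookup(embedding, words)                # [batch, time, embed]

# bidirectional LSTM encoder (bidirectional_rnn=True)
cell_fw = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
cell_bw = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
(out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, inputs, sequence_length=seq_len, dtype=tf.float32)
states = tf.concat([out_fw, out_bw], axis=-1)                    # [batch, time, 2*hidden]

# slot-filling head (task=tagging or joint): per-step tag scores
slot_logits = tf.layers.dense(states, num_tags)

# intent head (task=intent or joint) with attention pooling (use_attention=True):
# score each time step, mask out padding, take the weighted sum of encoder states
scores = tf.layers.dense(tf.tanh(tf.layers.dense(states, hidden_dim)), 1)
mask = tf.expand_dims(
    tf.sequence_mask(seq_len, tf.shape(words)[1], dtype=tf.float32), -1)
scores = scores + (1.0 - mask) * -1e9                            # ignore padded steps
alphas = tf.nn.softmax(scores, dim=1)                            # attention weights over time
context = tf.reduce_sum(alphas * states, axis=1)                 # [batch, 2*hidden]
intent_logits = tf.layers.dense(context, num_intents)

In the joint task the slot-tagging and intent cross-entropy losses can simply be summed and optimized together.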

Reference

  • Bing Liu, Ian Lane, "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling", Interspeech, 2016 (PDF)
@inproceedings{Liu+2016,
author={Bing Liu and Ian Lane},
title={Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling},
year=2016,
booktitle={Interspeech 2016},
doi={10.21437/Interspeech.2016-1352},
url={http://dx.doi.org/10.21437/Interspeech.2016-1352},
pages={685--689}
}

Contact

Feel free to email [email protected] for any pertinent questions/bugs regarding the code.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].