
sheng-z / cross-lingual-open-ie

Licence: other
MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models

Programming Languages

  • python: 139,335 projects (#7 most used programming language)
  • perl: 6,916 projects
  • shell: 77,523 projects

Projects that are alternatives to or similar to cross-lingual-open-ie

Sequence To Sequence 101
A series of tutorials on sequence-to-sequence learning, implemented with PyTorch.
Stars: ✭ 62 (+181.82%)
Mutual labels:  sequence-to-sequence
Video Captioning
This repository contains the code for a video captioning system inspired by Sequence to Sequence -- Video to Text. This system takes as input a video and generates a caption in English describing the video.
Stars: ✭ 131 (+495.45%)
Mutual labels:  sequence-to-sequence
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+86.36%)
Mutual labels:  sequence-to-sequence
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+213.64%)
Mutual labels:  sequence-to-sequence
Delta
DELTA is a deep learning based natural language and speech processing platform.
Stars: ✭ 1,479 (+6622.73%)
Mutual labels:  sequence-to-sequence
Npmt
Towards Neural Phrase-based Machine Translation
Stars: ✭ 175 (+695.45%)
Mutual labels:  sequence-to-sequence
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+4400%)
Mutual labels:  sequence-to-sequence
ismir2019-music-style-translation
The code for the ISMIR 2019 paper “Supervised symbolic music style translation using synthetic data”.
Stars: ✭ 27 (+22.73%)
Mutual labels:  sequence-to-sequence
Mlds2018spring
Machine Learning and having it Deep and Structured (MLDS) in 2018 spring
Stars: ✭ 124 (+463.64%)
Mutual labels:  sequence-to-sequence
Speech recognition with tensorflow
Implementation of a seq2seq model for Speech Recognition using the latest version of TensorFlow. Architecture similar to Listen, Attend and Spell.
Stars: ✭ 253 (+1050%)
Mutual labels:  sequence-to-sequence
Language Translation
Neural machine translator for English2German translation.
Stars: ✭ 82 (+272.73%)
Mutual labels:  sequence-to-sequence
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+6163.64%)
Mutual labels:  sequence-to-sequence
Text summarization with tensorflow
Implementation of a seq2seq model for summarization of textual data. Demonstrated on amazon reviews, github issues and news articles.
Stars: ✭ 226 (+927.27%)
Mutual labels:  sequence-to-sequence
Hred Attention Tensorflow
An extension of the Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion; our implementation is in TensorFlow and uses an attention mechanism.
Stars: ✭ 68 (+209.09%)
Mutual labels:  sequence-to-sequence
ForestCoverChange
Detecting and Predicting Forest Cover Change in Pakistani Areas Using Remote Sensing Imagery
Stars: ✭ 23 (+4.55%)
Mutual labels:  sequence-to-sequence
Machine Translation
Stars: ✭ 51 (+131.82%)
Mutual labels:  sequence-to-sequence
Seq2seq tutorial
Code For Medium Article "How To Create Data Products That Are Magical Using Sequence-to-Sequence Models"
Stars: ✭ 132 (+500%)
Mutual labels:  sequence-to-sequence
Keras-LSTM-Trajectory-Prediction
A Keras multi-input multi-output LSTM-based RNN for object trajectory forecasting
Stars: ✭ 88 (+300%)
Mutual labels:  sequence-to-sequence
Neural-Chatbot
A Neural Network based Chatbot
Stars: ✭ 68 (+209.09%)
Mutual labels:  sequence-to-sequence
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+15436.36%)
Mutual labels:  sequence-to-sequence

MT/IE: Cross-lingual Open IE

Attention-based sequence-to-sequence model for cross-lingual open IE.

Summary

A TensorFlow implementation of "MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models" (EACL 2017) by Sheng Zhang, Kevin Duh, and Benjamin Van Durme.
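
At a high level, the model is a standard attention-based encoder-decoder: the source sentence is encoded into hidden states, and at each decoding step the decoder attends over those states to decide which source positions to draw on while emitting the next token of the English extraction. The NumPy sketch below illustrates one such attention step (additive, Bahdanau-style scoring) only; it is not the repository's TensorFlow code, and all names, shapes, and values are placeholders.

# Illustrative additive (Bahdanau-style) attention for one decoder step.
# Not the repository's code; shapes and weights are made up for the example.
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    # decoder_state:  (hidden,)          current decoder hidden state
    # encoder_states: (src_len, hidden)  encoder outputs for the source sentence
    # W_dec, W_enc:   (hidden, attn)     learned projections (random here)
    # v:              (attn,)            learned scoring vector (random here)
    # score_i = v . tanh(W_dec h_dec + W_enc h_enc_i)
    scores = np.tanh(decoder_state @ W_dec + encoder_states @ W_enc) @ v
    weights = softmax(scores)            # (src_len,) attention over source tokens
    context = weights @ encoder_states   # (hidden,) weighted source summary
    return context, weights

# Toy usage with random vectors standing in for trained encoder/decoder states.
rng = np.random.default_rng(0)
hidden, attn, src_len = 8, 6, 5
context, weights = additive_attention(
    rng.standard_normal(hidden),
    rng.standard_normal((src_len, hidden)),
    rng.standard_normal((hidden, attn)),
    rng.standard_normal((hidden, attn)),
    rng.standard_normal(attn),
)
print(weights.round(3), context.shape)   # weights sum to 1; context is (hidden,)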

Dependencies

  • python 2.7
  • tensorflow r0.12 or later

Train

We provide a small toy dataset (10K) to play with. To start training on this dataset, simply run:

./run.sh
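
run.sh drives training end to end; conceptually, training a seq2seq model like this minimizes the per-token cross-entropy of the gold target sequence under teacher forcing. The short NumPy illustration below computes that loss on made-up predictions; it is not taken from this repository, and the vocabulary, probabilities, and padding id are placeholders.

# Illustration only: per-token cross-entropy, the quantity seq2seq training
# minimizes under teacher forcing. Numbers and ids below are placeholders.
import numpy as np

def sequence_cross_entropy(probs, gold_ids, pad_id=0):
    # probs:    (tgt_len, vocab) predicted next-token distributions
    # gold_ids: (tgt_len,)       gold target token ids
    # pad_id:   id of padding positions to ignore
    mask = gold_ids != pad_id
    picked = probs[np.arange(len(gold_ids)), gold_ids]   # prob of each gold token
    return -np.log(picked + 1e-12)[mask].mean()

# Toy target of three real tokens followed by one padding position (vocab of 4).
probs = np.array([[0.10, 0.70, 0.10, 0.10],
                  [0.20, 0.20, 0.50, 0.10],
                  [0.60, 0.10, 0.10, 0.20],
                  [0.25, 0.25, 0.25, 0.25]])
gold = np.array([1, 2, 3, 0])
print(sequence_cross_entropy(probs, gold))   # mean NLL over the 3 real tokens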

Evaluate

After training for a while, you can start evaluation by running:

python -m mt_ie --do_decode=True

Note: multi-bleu.perl from mosesdecoder is included for your convenience.
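
For reference, multi-bleu.perl follows the usual Moses convention: the reference file is passed as an argument and the hypotheses are piped on stdin (perl multi-bleu.perl reference.txt < decoded.txt). The small Python wrapper below shows one way to call it programmatically; the file names are hypothetical placeholders, not outputs this repository is guaranteed to produce.

# Hedged convenience wrapper around the bundled multi-bleu.perl.
# File names are placeholders; substitute your own decoded output and references.
import subprocess

def bleu_report(hypothesis_path, reference_path, script="multi-bleu.perl"):
    # multi-bleu.perl takes the reference file as an argument and reads
    # hypotheses from stdin, printing a single "BLEU = ..." report line.
    with open(hypothesis_path, "rb") as hyp:
        out = subprocess.check_output(["perl", script, reference_path], stdin=hyp)
    return out.decode("utf-8").strip()

if __name__ == "__main__":
    print(bleu_report("decoded.txt", "reference.txt"))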
