tobyyouup / Conv_seq2seq

Licence: apache-2.0
A TensorFlow implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al., 2017)

Programming Languages

python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Conv_seq2seq

ai-visual-storytelling-seq2seq
Implementation of seq2seq model for Visual Storytelling Challenge (VIST) http://visionandlanguage.net/VIST/index.html
Stars: ✭ 50 (-83.55%)
Mutual labels:  seq2seq
Keras Text Summarization
Text summarization using seq2seq in Keras
Stars: ✭ 260 (-14.47%)
Mutual labels:  seq2seq
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (-2.96%)
Mutual labels:  seq2seq
NeuralTextSimplification
Exploring Neural Text Simplification
Stars: ✭ 64 (-78.95%)
Mutual labels:  seq2seq
Seq2Seq-Models
Basic Seq2Seq, Attention, CopyNet
Stars: ✭ 19 (-93.75%)
Mutual labels:  seq2seq
Seq2seq chatbot links
Links to the implementations of neural conversational models for different frameworks
Stars: ✭ 270 (-11.18%)
Mutual labels:  seq2seq
neural-chat
An AI chatbot using seq2seq
Stars: ✭ 30 (-90.13%)
Mutual labels:  seq2seq
Dynamic Seq2seq
A seq2seq Chinese chatbot
Stars: ✭ 303 (-0.33%)
Mutual labels:  seq2seq
Deepqa
My tensorflow implementation of "A neural conversational model", a Deep learning based chatbot
Stars: ✭ 2,811 (+824.67%)
Mutual labels:  seq2seq
Trade Dst
Source code for transferable dialogue state generator (TRADE, Wu et al., 2019). https://arxiv.org/abs/1905.08743
Stars: ✭ 287 (-5.59%)
Mutual labels:  seq2seq
keras seq2seq word level
Implementation of a word-level seq2seq model using Keras
Stars: ✭ 12 (-96.05%)
Mutual labels:  seq2seq
2D-LSTM-Seq2Seq
PyTorch implementation of a 2D-LSTM Seq2Seq Model for NMT.
Stars: ✭ 25 (-91.78%)
Mutual labels:  seq2seq
Quick Nlp
Pytorch NLP library based on FastAI
Stars: ✭ 279 (-8.22%)
Mutual labels:  seq2seq
torch-asg
Auto Segmentation Criterion (ASG) implemented in pytorch
Stars: ✭ 42 (-86.18%)
Mutual labels:  seq2seq
Tf tutorial plus
Tutorials for TensorFlow APIs the official documentation doesn't cover
Stars: ✭ 293 (-3.62%)
Mutual labels:  seq2seq
chatbot
🤖️ A task-oriented chatbot based on PyTorch (a chatbot that supports private and Docker deployment)
Stars: ✭ 77 (-74.67%)
Mutual labels:  seq2seq
Encoder decoder
Four styles of encoder decoder model by Python, Theano, Keras and Seq2Seq
Stars: ✭ 269 (-11.51%)
Mutual labels:  seq2seq
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+0.66%)
Mutual labels:  seq2seq
Bert seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, and POS tagging, plus article continuation with GPT-2.
Stars: ✭ 298 (-1.97%)
Mutual labels:  seq2seq
Rnn For Joint Nlu
Tensorflow implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 281 (-7.57%)
Mutual labels:  seq2seq

Convolutional Seq2Seq

This is a TensorFlow implementation of the convolutional seq2seq model released by Facebook. The model was originally written in Torch/Lua in Fairseq. Since Lua is not as popular as Python in industry and the research community, I re-implemented the model in TensorFlow/Python after carefully reading the paper and the Torch/Lua codebase.

This implementation is based on the framework of the Google seq2seq project, which has detailed documentation on how to use the framework. In this conv seq2seq project, I implement the conv encoder, conv decoder, and attention mechanism, as well as the other modules needed by the conv seq2seq model that are not available in the original seq2seq project.
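
At the core of both the conv encoder and the conv decoder is a stack of 1-D convolutions, each gated by a gated linear unit (GLU) and wrapped in a scaled residual connection (Gehring et al., 2017). The following is a minimal NumPy sketch of one such block for illustration only; the function name and shapes are my own, not this repo's API:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu_conv_block(x, w, b):
    # One ConvS2S block: 1-D convolution -> GLU gate -> scaled residual.
    #   x: (seq_len, d)    input vectors
    #   w: (k, d, 2 * d)   kernel of odd width k producing 2*d channels
    #   b: (2 * d,)        bias
    seq_len, d = x.shape
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))              # 'same' padding
    y = np.stack([xp[i:i + k].reshape(-1) @ w.reshape(k * d, 2 * d)
                  for i in range(seq_len)]) + b       # the convolution
    a, g = y[:, :d], y[:, d:]                         # split channels
    return (x + a * sigmoid(g)) * np.sqrt(0.5)        # GLU + scaled residual

For example, glu_conv_block(np.random.randn(10, 8), np.random.randn(3, 8, 16), np.zeros(16)) returns a (10, 8) array, so blocks stack cleanly; the real model adds position embeddings, padding masks, and (in the decoder) causal padding, which this sketch omits.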

Requirement

  • Python 2.7.0+
  • TensorFlow 1.0+ (this version requirement is strict)
  • and their dependencies

Please follow the seq2seq project's installation instructions to set up this Convolutional Sequence to Sequence Learning project.
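
A quick way to verify the environment (the version printed should be at least 1.0):

python -c "import tensorflow as tf; print(tf.__version__)"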

How to use

To prepare a dataset, please follow the seq2seq NMT guide.
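
Based on the environment variables in the training script below, $DATA_PATH is expected to contain tokenized parallel files and vocabularies like these (IWSLT de-en names; adjust for your own language pair):

$DATA_PATH/
  vocab.de   vocab.en    # source / target vocabularies
  train.de   train.en    # parallel training sentences
  valid.de   valid.en    # parallel validation sentences
  test.de    test.en     # parallel test sentences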

The following is an example of how to run the IWSLT de-en translation task.

Train

export PYTHONIOENCODING=UTF-8
export DATA_PATH="your iwslt de-en data path"

export VOCAB_SOURCE=${DATA_PATH}/vocab.de
export VOCAB_TARGET=${DATA_PATH}/vocab.en
export TRAIN_SOURCES=${DATA_PATH}/train.de
export TRAIN_TARGETS=${DATA_PATH}/train.en
export DEV_SOURCES=${DATA_PATH}/valid.de
export DEV_TARGETS=${DATA_PATH}/valid.en
export TEST_SOURCES=${DATA_PATH}/test.de
export TEST_TARGETS=${DATA_PATH}/test.en

export TRAIN_STEPS=1000000

export MODEL_DIR=${TMPDIR:-/tmp}/nmt_conv_seq2seq
mkdir -p $MODEL_DIR

python -m bin.train \
  --config_paths="
      ./example_configs/conv_seq2seq.yml,
      ./example_configs/train_seq2seq.yml,
      ./example_configs/text_metrics_bpe.yml" \
  --model_params "
      vocab_source: $VOCAB_SOURCE
      vocab_target: $VOCAB_TARGET" \
  --input_pipeline_train "
    class: ParallelTextInputPipelineFairseq
    params:
      source_files:
        - $TRAIN_SOURCES
      target_files:
        - $TRAIN_TARGETS" \
  --input_pipeline_dev "
    class: ParallelTextInputPipelineFairseq
    params:
      source_files:
        - $DEV_SOURCES
      target_files:
        - $DEV_TARGETS" \
  --batch_size 32 \
  --eval_every_n_steps 5000 \
  --train_steps $TRAIN_STEPS \
  --output_dir $MODEL_DIR
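
The training script saves checkpoints and standard TensorFlow summaries to $MODEL_DIR, so training progress can typically be monitored with TensorBoard (shipped with TensorFlow):

tensorboard --logdir $MODEL_DIR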

Test

export PRED_DIR=${MODEL_DIR}/pred
mkdir -p ${PRED_DIR}

Decode with greedy search (beam width 1)

python -m bin.infer \
  --tasks "
    - class: DecodeText" \
  --model_dir $MODEL_DIR \
  --model_params "
    inference.beam_search.beam_width: 1 
    decoder.class: seq2seq.decoders.ConvDecoderFairseq" \
  --input_pipeline "
    class: ParallelTextInputPipelineFairseq
    params:
      source_files:
        - $TEST_SOURCES" \
  > ${PRED_DIR}/predictions.txt

Decode with beam search

python -m bin.infer \
  --tasks "
    - class: DecodeText
    - class: DumpBeams
      params:
        file: ${PRED_DIR}/beams.npz" \
  --model_dir $MODEL_DIR \
  --model_params "
    inference.beam_search.beam_width: 5 
    decoder.class: seq2seq.decoders.ConvDecoderFairseqBS" \
  --input_pipeline "
    class: ParallelTextInputPipelineFairseq
    params:
      source_files:
        - $TEST_SOURCES" \
  > ${PRED_DIR}/predictions.txt
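
The DumpBeams task stores the beam search output as a NumPy .npz archive. A quick way to inspect it (the array names inside the archive are not documented here, so list them first):

import numpy as np

beams = np.load("pred/beams.npz")   # the ${PRED_DIR}/beams.npz written above
print(beams.files)                  # names of the stored arrays
for name in beams.files:
    print(name, beams[name].shape)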

Calculate BLEU score

./bin/tools/multi-bleu.perl ${TEST_TARGETS} < ${PRED_DIR}/predictions.txt
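
multi-bleu.perl prints a single summary line with the overall BLEU score, the 1- to 4-gram precisions, and the brevity penalty, roughly in this form (all values are placeholders):

BLEU = <score>, <p1>/<p2>/<p3>/<p4> (BP=<brevity>, ratio=<hyp/ref>, hyp_len=<n>, ref_len=<n>)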

For more detailed instructions, please refer to the seq2seq project.

Issues and contributions are warmly welcome.
