
ufal / Neuralmonkey

License: BSD 3-Clause
An open-source tool for sequence learning in NLP built on TensorFlow.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Neuralmonkey

Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+25.25%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence, nmt
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+147.5%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence, encoder-decoder
Nematus
Open-Source Neural Machine Translation in Tensorflow
Stars: ✭ 730 (+82.5%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence, nmt
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (-3.25%)
Mutual labels:  neural-machine-translation, sequence-to-sequence, nmt, encoder-decoder
Nmt List
A list of Neural MT implementations
Stars: ✭ 359 (-10.25%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence, nmt
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (-82.75%)
Mutual labels:  neural-machine-translation, sequence-to-sequence, nmt
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (-89.25%)
Mutual labels:  neural-machine-translation, sequence-to-sequence, nmt
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+754.5%)
Mutual labels:  neural-machine-translation, sequence-to-sequence, encoder-decoder
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+5%)
Mutual labels:  machine-translation, neural-machine-translation, nmt
Npmt
Towards Neural Phrase-based Machine Translation
Stars: ✭ 175 (-56.25%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Subword Nmt
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
Stars: ✭ 1,819 (+354.75%)
Mutual labels:  machine-translation, neural-machine-translation, nmt
parallel-corpora-tools
Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (-91.25%)
Mutual labels:  machine-translation, neural-machine-translation, nmt
dynmt-py
Neural machine translation implementation using dynet's python bindings
Stars: ✭ 17 (-95.75%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Natural-Language-Processing
Contains various architectures and novel paper implementations for Natural Language Processing tasks like Sequence Modelling and Neural Machine Translation.
Stars: ✭ 48 (-88%)
Mutual labels:  machine-translation, sequence-to-sequence
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-91%)
Mutual labels:  image-captioning, encoder-decoder
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (-91.5%)
Mutual labels:  sequence-to-sequence, encoder-decoder
text-generation-transformer
text generation based on transformer
Stars: ✭ 36 (-91%)
Mutual labels:  sequence-to-sequence, encoder-decoder
Attention-Visualization
Visualization for simple attention and Google's multi-head attention.
Stars: ✭ 54 (-86.5%)
Mutual labels:  machine-translation, neural-machine-translation
banglanmt
This repository contains the code and data of the paper titled "Not Low-Resource Anymore: Aligner Ensembling, Batch Filtering, and New Datasets for Bengali-English Machine Translation" published in Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), November 16 - November 20, 2020.
Stars: ✭ 91 (-77.25%)
Mutual labels:  machine-translation, neural-machine-translation
Ergo
🧠 A tool that makes AI easier.
Stars: ✭ 264 (-34%)
Mutual labels:  gpu, neural-networks

Neural Monkey

Neural Sequence Learning Using TensorFlow


The Neural Monkey package provides a higher-level abstraction for sequential neural network models, most prominently in natural language processing (NLP). It is built on TensorFlow and allows fast prototyping of sequential models, e.g. for neural machine translation or sentence classification.

The higher-level API brings together a collection of standard building blocks (RNN encoder and decoder, multi-layer perceptron) and a simple way of adding new building blocks implemented directly in TensorFlow.
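
To give a flavour of this API: models are assembled declaratively in INI experiment files whose sections instantiate Python classes and reference one another with the <section_name> syntax. The sketch below writes such a skeletal file; the class paths and attribute names follow the project tutorial but are quoted from memory and may differ between Neural Monkey versions, so treat it as an illustration rather than a working configuration.

    # Write a skeletal experiment file (illustrative names; see the note above).
    mkdir -p exp
    cat > exp/translation.ini <<'EOF'
    [main]
    name="example translation"
    output="exp/output"
    batch_size=128
    epochs=10
    tf_manager=<tf_manager>
    trainer=<trainer>
    runners=[<runner>]
    train_dataset=<train_data>
    val_dataset=<val_data>

    [tf_manager]
    class=tf_manager.TensorFlowManager
    num_threads=4
    num_sessions=1

    [encoder]
    class=encoders.recurrent.SentenceEncoder
    rnn_size=300
    embedding_size=300
    data_id="source"
    vocabulary=<source_vocabulary>

    [decoder]
    class=decoders.decoder.Decoder
    encoders=[<encoder>]
    rnn_size=300
    embedding_size=300
    data_id="target"
    vocabulary=<target_vocabulary>

    ; dataset, vocabulary, trainer and runner sections omitted for brevity
    EOF

The design point is that each section names a class and the <references> pass the constructed objects on as constructor arguments, which is why a new building block implemented directly in TensorFlow can be plugged in simply by naming its class path.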

Usage

neuralmonkey-train <EXPERIMENT_INI>
neuralmonkey-run <EXPERIMENT_INI> <DATASETS_INI>
neuralmonkey-server <EXPERIMENT_INI> [OPTION] ...
neuralmonkey-logbook --logdir <EXPERIMENTS_DIR> [OPTION] ...
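
For instance, a typical workflow (with illustrative file names) trains a model from an experiment file and then applies the trained model to data described in a separate dataset file:

    # Train the model described in the experiment file.
    neuralmonkey-train exp/translation.ini

    # Run the trained model on datasets described in a separate INI file.
    neuralmonkey-run exp/translation.ini exp/test_data.ini

    # Browse the logs of running and finished experiments in a web browser.
    neuralmonkey-logbook --logdir exp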

Installation

  • You need Python 3.6 (or higher) to run Neural Monkey.

  • When using a virtual environment, execute these commands to install the Python dependencies:

    $ source path/to/virtualenv/bin/activate
    
    # For GPU-enabled version
    (virtualenv)$ pip install --upgrade -r requirements-gpu.txt
    
    # For CPU-only version
    (virtualenv)$ pip install --upgrade -r requirements.txt
    
  • If you are using the GPU version, make sure that the LD_LIBRARY_PATH environment variable points to the lib and lib64 directories of your CUDA and cuDNN installations. Similarly, your PATH variable should point to the bin subdirectory of the CUDA installation directory (see the sketch after this list).

  • If the training crashes on an unknown dependency, just install it with pip. Remember to keep your virtual environment up to date with the package requirements file, which may change over time. To update the dependencies, re-run the pip install command from above (pay attention to the distinction between the GPU and non-GPU versions).
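
For completeness, here is a sketch of the environment setup described above; the CUDA prefix /usr/local/cuda is an assumption and should be adjusted to match your installation:

    # Create and activate the virtual environment (Python 3.6 or higher).
    $ python3 -m venv path/to/virtualenv
    $ source path/to/virtualenv/bin/activate
    
    # GPU version only: expose the CUDA/cuDNN libraries and binaries.
    # /usr/local/cuda is an assumed installation prefix; adjust as needed.
    (virtualenv)$ export LD_LIBRARY_PATH="/usr/local/cuda/lib64:/usr/local/cuda/lib:$LD_LIBRARY_PATH"
    (virtualenv)$ export PATH="/usr/local/cuda/bin:$PATH"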

Getting Started

There is a tutorial that you can follow, which gives you an overview of how to design your experiments with Neural Monkey.

Package Overview

  • bin: Directory with neuralmonkey executables

  • examples: Example configuration files for ready-made experiments

  • lib: Third-party software

  • neuralmonkey: Python package files

  • scripts: Directory with tools that may come in handy. Note that dependencies for these tools may not be listed in the project requirements.

  • tests: Test files

Documentation

You can find the API documentation of this package here. The documentation files are generated from docstrings using autodoc and Napoleon extensions to the Python documentation package Sphinx. The docstrings should follow the recommendations in the Google Python Style Guide. Additional details on the docstring formatting can be found in the Napoleon documentation as well.
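
Assuming a standard Sphinx layout (an assumption; the repository's actual documentation setup may differ), the HTML documentation can typically be rebuilt with:

    # Napoleon ships with Sphinx (>= 1.3) as sphinx.ext.napoleon.
    $ pip install sphinx
    $ sphinx-build -b html docs/ docs/_build/html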

Related projects

  • tflearn – a more general and less abstract deep learning toolkit built over TensorFlow

  • nlpnet – deep learning tools for tagging and parsing

  • NNBlocks – a library built on Theano containing NLP-specific models

  • Nematus – a tool for training and running Neural Machine Translation models

  • seq2seq – a general-purpose encoder-decoder framework for TensorFlow

  • OpenNMT – open-source NMT in Torch

Citation

If you use the tool for academic purposes, please consider citing the following paper:

@article{NeuralMonkey:2017,
    author = {Jind{\v{r}}ich Helcl and Jind{\v{r}}ich Libovick{\'{y}}},
    title = {{Neural Monkey: An Open-source Tool for Sequence Learning}},
    journal = {The Prague Bulletin of Mathematical Linguistics},
    year = {2017},
    address = {Prague, Czech Republic},
    number = {107},
    pages = {5--17},
    issn = {0032-6585},
    doi = {10.1515/pralin-2017-0001},
    url = {http://ufal.mff.cuni.cz/pbml/107/art-helcl-libovicky.pdf}
}

License

The software is distributed under the BSD License.
