1,905 open-source projects that are alternatives to or similar to Sockeye

Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (-49.39%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+245.25%)
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+268.28%)
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (-57.58%)
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-58.79%)
transformer
Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-93.94%)
dynmt-py
Neural machine translation implementation using DyNet's Python bindings
Stars: ✭ 17 (-98.28%)
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (-60.91%)
Neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (-59.6%)
Gluon2pytorch
Gluon to PyTorch deep neural network model converter
Stars: ✭ 70 (-92.93%)
Mutual labels:  deep-neural-networks, mxnet, gluon
Transformer
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-72.63%)
SequenceToSequence
A seq2seq with attention dialogue/MT model implemented by TensorFlow.
Stars: ✭ 11 (-98.89%)
RNNSearch
An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-95.66%)
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-97.17%)
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (-91.92%)
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-70.3%)
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (-84.24%)
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-80.81%)
Nmt List
A list of Neural MT implementations
Stars: ✭ 359 (-63.74%)
Lingvo
A framework for building neural networks in TensorFlow, particularly sequence models
Stars: ✭ 2,361 (+138.48%)
Speech recognition with tensorflow
Implementation of a seq2seq model for Speech Recognition using the latest version of TensorFlow. Architecture similar to Listen, Attend and Spell.
Stars: ✭ 253 (-74.44%)
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-97.88%)
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-87.98%)
Pytorch Transformer
A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 199 (-79.9%)
Modernmt
Neural Adaptive Machine Translation that adapts to context and learns from corrections.
Stars: ✭ 231 (-76.67%)
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (-94.44%)
Mutual labels:  transformer, seq2seq, encoder-decoder
Word-Level-Eng-Mar-NMT
Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (-96.26%)
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-90.2%)
NLP-paper
An NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-97.68%)
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-97.78%)
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-53.94%)
NiuTrans.NMT
A fast neural machine translation system, developed in C++ and built on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (-88.69%)
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). The playground.py file is included for visualizing otherwise hard-to-grasp concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (-58.48%)
Npmt
Towards Neural Phrase-based Machine Translation
Stars: ✭ 175 (-82.32%)
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+39.19%)
Text summarization with tensorflow
Implementation of a seq2seq model for summarization of textual data, demonstrated on Amazon reviews, GitHub issues, and news articles.
Stars: ✭ 226 (-77.17%)
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (-93.03%)
Embedding
A summary of embedding model code and study notes
Stars: ✭ 25 (-97.47%)
Mutual labels:  transformer, seq2seq, encoder-decoder
Image-Caption
Using an LSTM or a Transformer to solve image captioning in PyTorch
Stars: ✭ 36 (-96.36%)
Keras Attention
Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (-29.6%)
text-generation-transformer
Text generation based on the Transformer
Stars: ✭ 36 (-96.36%)
Pytorch Attention Guided Cyclegan
PyTorch implementation of "Unsupervised Attention-guided Image-to-Image Translation".
Stars: ✭ 67 (-93.23%)
minimal-nmt
A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (-96.36%)
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-98.69%)
Nematus
Open-Source Neural Machine Translation in TensorFlow
Stars: ✭ 730 (-26.26%)
Encoder decoder
Four styles of encoder-decoder models in Python, using Theano, Keras, and Seq2Seq
Stars: ✭ 269 (-72.83%)
Mutual labels:  seq2seq, encoder-decoder
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-72.42%)
Mutual labels:  translation, transformer
Mxnet Centernet
Gluon implementation of "Objects as Points", aka "CenterNet"
Stars: ✭ 29 (-97.07%)
Mutual labels:  mxnet, gluon
Gluon Face
An unofficial Gluon FR Toolkit for face recognition. https://gluon-face.readthedocs.io
Stars: ✭ 264 (-73.33%)
Mutual labels:  mxnet, gluon
Multi Scale Attention
Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"
Stars: ✭ 281 (-71.62%)
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+295.96%)
Mutual labels:  mxnet, gluon
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-69.09%)
Mutual labels:  attention-mechanism, seq2seq
Attention is all you need
A Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Stars: ✭ 303 (-69.39%)
Seq2seq chatbot
A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (-68.89%)
Mutual labels:  attention-mechanism, seq2seq
Deep Learning In Production
In this repository, I will share some useful notes and references about deploying deep learning-based models in production.
Stars: ✭ 3,104 (+213.54%)
Mutual labels:  deep-neural-networks, mxnet
Rezero
Official PyTorch Repo for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (-67.98%)
Deepchatmodels
Conversation models in TensorFlow. (website removed)
Stars: ✭ 312 (-68.48%)
Bytenet Tensorflow
ByteNet for character-level language modelling
Stars: ✭ 319 (-67.78%)
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (-64.65%)
1–60 of 1,905 similar projects