
904 Open source projects that are alternatives of or similar to Nmt Keras

bergamot-translator
Cross-platform C++ library focused on optimized machine translation on consumer-grade devices.
Stars: ✭ 181 (-63.87%)
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+175.05%)
VNMT
Code for "Variational Neural Machine Translation" (EMNLP2016)
Stars: ✭ 54 (-89.22%)
Mutual labels:  theano, nmt
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-73.85%)
Mutual labels:  transformer, attention-mechanism
SequenceToSequence
A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-97.8%)
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-8.98%)
Word-Level-Eng-Mar-NMT
Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (-92.61%)
Machine-Translation-Hindi-to-english-
Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-96.21%)
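The single-network paradigm described above can be sketched as a toy encoder-decoder with randomly initialised weights (a minimal sketch; all dimensions, vocabulary sizes, and parameter names here are hypothetical and not taken from the listed repository):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary sizes and dimensions (hypothetical).
SRC_VOCAB, TGT_VOCAB, EMB, HID = 20, 20, 8, 16

# Randomly initialised parameters stand in for trained weights.
E_src = rng.normal(0, 0.1, (SRC_VOCAB, EMB))
E_tgt = rng.normal(0, 0.1, (TGT_VOCAB, EMB))
W_enc = rng.normal(0, 0.1, (EMB + HID, HID))
W_dec = rng.normal(0, 0.1, (EMB + HID, HID))
W_out = rng.normal(0, 0.1, (HID, TGT_VOCAB))

def rnn_step(x, h, W):
    """One vanilla-RNN step: new hidden state from input and old state."""
    return np.tanh(np.concatenate([x, h]) @ W)

def encode(src_ids):
    """Fold the whole source sentence into a single context vector."""
    h = np.zeros(HID)
    for t in src_ids:
        h = rnn_step(E_src[t], h, W_enc)
    return h

def decode(context, bos=0, eos=1, max_len=10):
    """Greedy decoding: emit the most probable token at each step."""
    h, tok, out = context, bos, []
    for _ in range(max_len):
        h = rnn_step(E_tgt[tok], h, W_dec)
        tok = int(np.argmax(h @ W_out))
        if tok == eos:
            break
        out.append(tok)
    return out

# With untrained weights the "translation" is meaningless; the point is
# that one network maps a whole source sequence to a target sequence.
translation = decode(encode([2, 5, 7, 3]))
```

Real systems replace the vanilla RNN with GRU/LSTM or Transformer layers and train all parameters jointly, but the read-whole-sentence-then-generate structure is the same.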
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-88.62%)
Mutual labels:  transformer, attention-mechanism
Mt Paper Lists
MT paper lists (by conference)
Stars: ✭ 105 (-79.04%)
NLP-paper
🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-95.41%)
Mutual labels:  transformer, attention-mechanism
Transformer-Transducer
PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (-87.82%)
Transformer Clinic
Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (-64.27%)
Mutual labels:  nmt, transformer
Modernmt
Neural Adaptive Machine Translation that adapts to context and learns from corrections.
Stars: ✭ 231 (-53.89%)
Opennmt
Open Source Neural Machine Translation in Torch (deprecated)
Stars: ✭ 2,339 (+366.87%)
Hardware Aware Transformers
[ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
Stars: ✭ 206 (-58.88%)
Mutual labels:  machine-translation, transformer
Quality-Estimation1
Machine translation subtask: translation quality estimation; reproduces the results of the WMT 2018 Alibaba paper.
Stars: ✭ 19 (-96.21%)
Mutual labels:  transformer, nmt
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (-53.49%)
Mutual labels:  transformer, attention-mechanism
zero
Zero -- A neural machine translation system
Stars: ✭ 121 (-75.85%)
minimal-nmt
A minimal NMT example to serve as a seq2seq + attention reference.
Stars: ✭ 36 (-92.81%)
Eqtransformer
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-81.04%)
Mutual labels:  attention-mechanism, transformer
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-74.85%)
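Beam search, used by the captioning project above and by many of the NMT decoders in this list, keeps the k best partial hypotheses by cumulative log-probability instead of greedily taking one token per step. A minimal sketch (the `toy_step` model and all names are hypothetical):

```python
import math

def beam_search(step_fn, bos, eos, beam_width=3, max_len=5):
    """step_fn(seq) -> list of (token, prob) for the next token.
    Keeps the beam_width best partial hypotheses by cumulative log-prob."""
    beams = [([bos], 0.0)]            # (sequence, log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, p in step_fn(seq):
                cand = (seq + [tok], score + math.log(p))
                # Hypotheses that emit eos are done; others stay in the beam.
                (finished if tok == eos else candidates).append(cand)
        if not candidates:
            break
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    finished += beams                 # include still-open hypotheses
    return max(finished, key=lambda c: c[1])[0]

# Toy model: the same next-token distribution at every step.
def toy_step(seq):
    return [(2, 0.6), (3, 0.3), (1, 0.1)]

best = beam_search(toy_step, bos=0, eos=1)
```

With this toy distribution, stopping early at eos (one step of probability 0.1) scores higher than five consecutive 0.6-probability steps, so the search returns the short hypothesis — an illustration of why production decoders usually add a length penalty.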
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (-59.48%)
Sca Cnn.cvpr17
Image Captions Generation with Spatial and Channel-wise Attention
Stars: ✭ 198 (-60.48%)
Mutual labels:  attention-mechanism, theano
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-58.28%)
Mutual labels:  attention-mechanism, transformer
transformer
Build English-Vietnamese machine translation with ProtonX Transformer. :D
Stars: ✭ 41 (-91.82%)
Mutual labels:  machine-translation, transformer
vat nmt
Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019
Stars: ✭ 22 (-95.61%)
Mutual labels:  neural-machine-translation, nmt
OverlapPredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (-41.52%)
Mutual labels:  transformer, attention-mechanism
transformer-slt
Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-81.64%)
enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (-70.86%)
Mutual labels:  transformer, attention-mechanism
Neural-Chatbot
A Neural Network based Chatbot
Stars: ✭ 68 (-86.43%)
transformer
A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-95.01%)
ForestCoverChange
Detecting and Predicting Forest Cover Change in Pakistani Areas Using Remote Sensing Imagery
Stars: ✭ 23 (-95.41%)
Mutual labels:  gru, sequence-to-sequence
Neural-Machine-Translation
Several basic neural machine translation models implemented by PyTorch & TensorFlow
Stars: ✭ 29 (-94.21%)
TS3000 TheChatBOT
A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of Neural Machine Translation. Beware of its being sarcastic, just like its creator 😝. It uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-96.01%)
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-75.85%)
Mutual labels:  transformer, attention-mechanism
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-88.62%)
Mutual labels:  transformer, attention-mechanism
Compact-Global-Descriptor
PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-95.61%)
Hierarchical-attention-network
My implementation of "Hierarchical Attention Networks for Document Classification" in Keras
Stars: ✭ 26 (-94.81%)
Mutual labels:  gru, attention-mechanism
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-94.21%)
Mutual labels:  transformer, attention-model
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-92.81%)
Mutual labels:  transformer, attention-mechanism
sb-nmt
Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (-86.83%)
Mutual labels:  machine-translation, transformer
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (-73.25%)
Mutual labels:  transformer, attention-mechanism
Transformer-in-Transformer
An implementation of Transformer-in-Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-92.02%)
Mutual labels:  transformer, attention-mechanism
Natural-Language-Processing
Contains various architectures and novel paper implementations for Natural Language Processing tasks like Sequence Modelling and Neural Machine Translation.
Stars: ✭ 48 (-90.42%)
visualization
A collection of visualization functions
Stars: ✭ 189 (-62.28%)
Mutual labels:  transformer, attention-mechanism
theano-recurrence
Recurrent Neural Networks (RNN, GRU, LSTM) and their Bidirectional versions (BiRNN, BiGRU, BiLSTM) for word & character level language modelling in Theano
Stars: ✭ 40 (-92.02%)
Mutual labels:  theano, gru
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (-80.64%)
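The core operation of "Attention Is All You Need", shared by most Transformer projects in this list, is scaled dot-product attention: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (shapes and the single-head, unmasked setting are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (n_queries, n_keys)
    weights = softmax(scores)         # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries, d_k = 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out, w = scaled_dot_product_attention(Q, K, V)
```

The full paper adds multiple heads, masking, and learned projections around this kernel, but the weighted sum over values is the piece every "attention-mechanism"-tagged project above builds on.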
speech-transformer
A Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-92.02%)
attention-mechanism-keras
Attention mechanisms in Keras, usable like Dense and RNN layers...
Stars: ✭ 19 (-96.21%)
Attention-Visualization
Visualization for simple attention and Google's multi-head attention.
Stars: ✭ 54 (-89.22%)
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (-76.25%)
Mutual labels:  transformer, attention-mechanism
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (-95.61%)
Awesome Fast Attention
list of efficient attention modules
Stars: ✭ 627 (+25.15%)
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-62.08%)
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-78.64%)
Mutual labels:  transformer, attention-mechanism
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-94.41%)
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: a linear attention without softmax
Stars: ✭ 111 (-77.84%)
Mutual labels:  transformer, attention-mechanism
text-generation-transformer
Text generation based on the Transformer
Stars: ✭ 36 (-92.81%)
banglanmt
This repository contains the code and data of the paper titled "Not Low-Resource Anymore: Aligner Ensembling, Batch Filtering, and New Datasets for Bengali-English Machine Translation" published in Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), November 16 - November 20, 2020.
Stars: ✭ 91 (-81.84%)
61-120 of 904 similar projects