
shvmshukla / Machine-Translation-Hindi-to-english-

Licence: other
Machine translation is the task of converting text from one language into another. Unlike traditional phrase-based translation systems, which consist of many small sub-components that are tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation.
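The sketch below illustrates that idea as a minimal encoder-decoder with attention in TensorFlow 2 / Keras. It is an assumption-level illustration, not code from this notebook; the vocabulary sizes, layer widths, and input names are hypothetical placeholders.

```python
# Minimal sketch of a single encoder-decoder network with attention,
# assuming TensorFlow 2 / Keras. All sizes and names are placeholders.
import tensorflow as tf
from tensorflow.keras import layers

SRC_VOCAB, TGT_VOCAB = 10_000, 10_000   # hypothetical vocabulary sizes
EMB_DIM, UNITS = 256, 512               # hypothetical layer widths

# Encoder: embed source (e.g. Hindi) token ids and read them with an LSTM.
src_in = layers.Input(shape=(None,), dtype="int32", name="src_tokens")
src_emb = layers.Embedding(SRC_VOCAB, EMB_DIM, mask_zero=True)(src_in)
enc_seq, enc_h, enc_c = layers.LSTM(
    UNITS, return_sequences=True, return_state=True)(src_emb)

# Decoder: embed the shifted target (e.g. English) tokens,
# initialised with the encoder's final state.
tgt_in = layers.Input(shape=(None,), dtype="int32", name="tgt_tokens")
tgt_emb = layers.Embedding(TGT_VOCAB, EMB_DIM, mask_zero=True)(tgt_in)
dec_seq = layers.LSTM(UNITS, return_sequences=True)(
    tgt_emb, initial_state=[enc_h, enc_c])

# Attention: each decoder step attends over all encoder steps
# (dot-product, Luong-style) before predicting the next token.
context = layers.Attention()([dec_seq, enc_seq])
dec_cat = layers.Concatenate()([dec_seq, context])
logits = layers.Dense(TGT_VOCAB)(dec_cat)

model = tf.keras.Model([src_in, tgt_in], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```

Training would feed source sentences plus target sentences shifted right, with the unshifted targets as labels; inference decodes token by token, which is the standard seq2seq setup this project and the listed alternatives build on.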

Programming Languages

Jupyter Notebook

Projects that are alternatives of or similar to Machine-Translation-Hindi-to-english-

Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+5110.53%)
Mutual labels:  machine-translation, attention-mechanism
Transformer
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (+1326.32%)
Mutual labels:  machine-translation, attention-mechanism
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (+968.42%)
Mutual labels:  machine-translation, attention-mechanism
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+2536.84%)
Mutual labels:  machine-translation, attention-mechanism
SequenceToSequence
A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-42.11%)
Mutual labels:  machine-translation, attention-mechanism
tai5-uan5 gian5-gi2 kang1-ku7
Taiwanese language tools (臺灣言語工具)
Stars: ✭ 79 (+315.79%)
Mutual labels:  machine-translation
SiGAT
Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021)
Stars: ✭ 37 (+94.74%)
Mutual labels:  attention-mechanism
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+589.47%)
Mutual labels:  attention-mechanism
dgcnn
Clean & Documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (+10.53%)
Mutual labels:  attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+536.84%)
Mutual labels:  attention-mechanism
inmt
Interactive Neural Machine Translation tool
Stars: ✭ 44 (+131.58%)
Mutual labels:  machine-translation
MetricMT
The official code repository for MetricMT - a reward optimization method for NMT with learned metrics
Stars: ✭ 23 (+21.05%)
Mutual labels:  machine-translation
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with Bahdanau & Luong attention, implemented in TensorFlow
Stars: ✭ 18 (-5.26%)
Mutual labels:  attention-mechanism
rtg
Reader Translator Generator - an NMT toolkit based on PyTorch
Stars: ✭ 26 (+36.84%)
Mutual labels:  machine-translation
LanguageModel-using-Attention
PyTorch implementation of a basic language model using attention in an LSTM network
Stars: ✭ 27 (+42.11%)
Mutual labels:  attention-mechanism
efficient-attention
An implementation of the efficient attention module.
Stars: ✭ 191 (+905.26%)
Mutual labels:  attention-mechanism
CIAN
Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora
Stars: ✭ 17 (-10.53%)
Mutual labels:  attention-mechanism
axial-attention
Implementation of Axial attention - attending to multi-dimensional data efficiently
Stars: ✭ 245 (+1189.47%)
Mutual labels:  attention-mechanism
DCAN
[AAAI 2020] Code release for "Domain Conditioned Adaptation Network" https://arxiv.org/abs/2005.06717
Stars: ✭ 27 (+42.11%)
Mutual labels:  attention-mechanism
skt
Sanskrit compound segmentation using seq2seq model
Stars: ✭ 21 (+10.53%)
Mutual labels:  machine-translation
Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].