Pytorch Seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3423.71%)
pynmt - A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13 (-86.6%)
Sockeye - Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990 (+920.62%)
RNNSearch - An implementation of attention-based neural machine translation using PyTorch.
Stars: ✭ 43 (-55.67%)
Nmt Keras - Neural Machine Translation with Keras.
Stars: ✭ 501 (+416.49%)
Joeynmt - Minimalist NMT for educational purposes.
Stars: ✭ 420 (+332.99%)
Transformer Tensorflow - TensorFlow implementation of 'Attention Is All You Need' (2017.6).
Stars: ✭ 319 (+228.87%)
Modernmt - Neural Adaptive Machine Translation that adapts to context and learns from corrections.
Stars: ✭ 231 (+138.14%)
h-transformer-1d - Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning.
Stars: ✭ 121 (+24.74%)
parallel-corpora-tools - Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (-63.92%)
Onnxt5 - Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+47.42%)
TRAR-VQA - [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- official implementation.
Stars: ✭ 49 (-49.48%)
vat nmt - Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019.
Stars: ✭ 22 (-77.32%)
transformer - A PyTorch implementation of "Attention Is All You Need".
Stars: ✭ 28 (-71.13%)
NiuTrans.NMT - A fast Neural Machine Translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+15.46%)
visualization - A collection of visualization functions.
Stars: ✭ 189 (+94.85%)
transformer-slt - Sign Language Translation with Transformers (COLING 2020, ECCV'20 SLRTP Workshop).
Stars: ✭ 92 (-5.15%)
Nlp Tutorial - Natural Language Processing Tutorial for Deep Learning Researchers.
Stars: ✭ 9,895 (+10101.03%)
Nmtpytorch - Sequence-to-Sequence Framework in PyTorch.
Stars: ✭ 392 (+304.12%)
Self Attention Cv - Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+115.46%)
Neural sp - End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+320.62%)
Nlp Tutorials - Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+306.19%)
Pytorch Original Transformer - My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+323.71%)
Graphtransformer - Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+92.78%)
Neural-Machine-Translation - Several basic neural machine translation models implemented in PyTorch & TensorFlow.
Stars: ✭ 29 (-70.1%)
seq2seq-pytorch - Sequence to Sequence Models in PyTorch.
Stars: ✭ 41 (-57.73%)
Transformers.jl - Julia implementation of Transformer models.
Stars: ✭ 173 (+78.35%)
zero - A neural machine translation system.
Stars: ✭ 121 (+24.74%)
CrabNet - Predict materials properties using only the composition information!
Stars: ✭ 57 (-41.24%)
transformer - Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-38.14%)
Medical Transformer - PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation".
Stars: ✭ 153 (+57.73%)
Nmt List - A list of Neural MT implementations.
Stars: ✭ 359 (+270.1%)
Transformer - A TensorFlow implementation of the Transformer: Attention Is All You Need.
Stars: ✭ 3,646 (+3658.76%)
Tf Seq2seq - Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+298.97%)
Ner Bert - BERT-NER (nert-bert) with Google BERT https://github.com/google-research.
Stars: ✭ 339 (+249.48%)
Deep learning nlp - Keras, PyTorch, and NumPy implementations of deep learning architectures for NLP.
Stars: ✭ 407 (+319.59%)
Neuralmonkey - An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (+312.37%)
Cell Detr - Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-73.2%)
Nlp tensorflow project - NLP projects implemented with TensorFlow, e.g., classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-72.16%)
Rnn Nmt - An encoder-decoder neural machine translation model based on bidirectional RNNs with an attention mechanism.
Stars: ✭ 46 (-52.58%)
Nematus - Open-source Neural Machine Translation in TensorFlow.
Stars: ✭ 730 (+652.58%)
Speech Transformer - A PyTorch implementation of Speech Transformer, an end-to-end ASR with a Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+482.47%)
Nlp Models Tensorflow - Gathers machine learning and TensorFlow deep learning models for NLP problems, 1.13 < TensorFlow < 2.0.
Stars: ✭ 1,603 (+1552.58%)
Sightseq - Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection.
Stars: ✭ 116 (+19.59%)
Deeplearning Nlp Models - A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-34.02%)
Rust Bert - Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...).
Stars: ✭ 510 (+425.77%)
Transformer Dynet - An implementation of the Transformer (Attention Is All You Need) in DyNet.
Stars: ✭ 57 (-41.24%)
Xmunmt - An implementation of RNNsearch using TensorFlow.
Stars: ✭ 69 (-28.87%)