
635 open-source projects that are alternatives to or similar to Machine Translation

enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+186.27%)
Mutual labels:  transformer
Laravel Responder
A Laravel Fractal package for building API responses, giving you the power of Fractal with Laravel's elegance.
Stars: ✭ 673 (+1219.61%)
Mutual labels:  transformer
Rezero
Official PyTorch Repo for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (+521.57%)
Mutual labels:  transformer
TS-CAM
Code for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Stars: ✭ 96 (+88.24%)
Mutual labels:  transformer
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+356.86%)
Mutual labels:  transformer
Swin-Transformer-Tensorflow
Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-11.76%)
Mutual labels:  transformer
Conformer
Official code for Conformer: Local Features Coupling Global Representations for Visual Recognition
Stars: ✭ 345 (+576.47%)
Mutual labels:  transformer
Tsai
State-of-the-art Deep Learning with Time Series and Sequences in PyTorch / fastai
Stars: ✭ 407 (+698.04%)
Mutual labels:  transformer
pytorch-transformer-kor-eng
Transformer Implementation using PyTorch for Neural Machine Translation (Korean to English)
Stars: ✭ 40 (-21.57%)
Mutual labels:  sequence-to-sequence
Bentools Etl
PHP ETL (Extract / Transform / Load) library with SOLID principles + almost no dependencies.
Stars: ✭ 45 (-11.76%)
Mutual labels:  transformer
Neuralconvo
Neural conversational model in Torch
Stars: ✭ 773 (+1415.69%)
Mutual labels:  seq2seq
RSTNet
RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (+39.22%)
Mutual labels:  transformer
Pyhgt
Code for "Heterogeneous Graph Transformer" (WWW'20), which is based on pytorch_geometric
Stars: ✭ 313 (+513.73%)
Mutual labels:  transformer
text2keywords
Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (+3.92%)
Mutual labels:  transformer
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (+117.65%)
Mutual labels:  transformer
HE2LaTeX
Converting handwritten equations to LaTeX
Stars: ✭ 84 (+64.71%)
Mutual labels:  sequence-to-sequence
Deepsvg
[NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation". Includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (+690.2%)
Mutual labels:  transformer
paccmann proteomics
PaccMann models for protein language modeling
Stars: ✭ 28 (-45.1%)
Mutual labels:  transformer
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-54.9%)
Mutual labels:  transformer
Attention Is All You Need Keras
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 628 (+1131.37%)
Mutual labels:  attention-is-all-you-need
TextPruner
A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (+84.31%)
Mutual labels:  transformer
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+11.76%)
Mutual labels:  transformer
Cubert
Fast implementation of BERT inference directly on NVIDIA (CUDA, cuBLAS) and Intel MKL
Stars: ✭ 395 (+674.51%)
Mutual labels:  transformer
semantic-segmentation
SOTA Semantic Segmentation Models in PyTorch
Stars: ✭ 464 (+809.8%)
Mutual labels:  transformer
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-37.25%)
Mutual labels:  transformer
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-49.02%)
Mutual labels:  transformer
cnn-seq2seq
No description or website provided.
Stars: ✭ 39 (-23.53%)
Mutual labels:  seq2seq
Graphormer
Graphormer is a deep learning package that allows researchers and developers to train custom models for molecule modeling tasks. It aims to accelerate the research and application in AI for molecule science, such as material design, drug discovery, etc.
Stars: ✭ 1,194 (+2241.18%)
Mutual labels:  transformer
saint
The official PyTorch implementation of recent paper - SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training
Stars: ✭ 209 (+309.8%)
Mutual labels:  transformer
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+13.73%)
Mutual labels:  transformer
Nmtpytorch
Sequence-to-Sequence Framework in PyTorch
Stars: ✭ 392 (+668.63%)
Mutual labels:  seq2seq
BSD
The Business Scene Dialogue corpus
Stars: ✭ 51 (+0%)
Mutual labels:  machine-translation
Laravel5 Jsonapi
Laravel 5 JSON API Transformer Package
Stars: ✭ 313 (+513.73%)
Mutual labels:  transformer
deformer
[ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
Stars: ✭ 111 (+117.65%)
Mutual labels:  transformer
Seq2seq
A general-purpose encoder-decoder framework for TensorFlow
Stars: ✭ 5,455 (+10596.08%)
Mutual labels:  machine-translation
nepali-translator
Neural Machine Translation on the Nepali-English language pair
Stars: ✭ 29 (-43.14%)
Mutual labels:  machine-translation
CSV2RDF
Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (-5.88%)
Mutual labels:  transformer
inmt
Interactive Neural Machine Translation tool
Stars: ✭ 44 (-13.73%)
Mutual labels:  machine-translation
Textclassificationbenchmark
A Benchmark of Text Classification in PyTorch
Stars: ✭ 534 (+947.06%)
Mutual labels:  attention-is-all-you-need
Cognitive Speech Tts
Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (+511.76%)
Mutual labels:  transformer
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (+76.47%)
Mutual labels:  transformer
DolboNet
A Russian-language chatbot for Discord built on the Transformer architecture
Stars: ✭ 53 (+3.92%)
Mutual labels:  transformer
linformer
Implementation of Linformer for PyTorch
Stars: ✭ 119 (+133.33%)
Mutual labels:  transformer
probabilistic nlg
TensorFlow implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-45.1%)
Mutual labels:  seq2seq
laravel-mutate
Mutate Laravel attributes
Stars: ✭ 13 (-74.51%)
Mutual labels:  transformer
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-3.92%)
Mutual labels:  transformer
Multiwoz
Source code for end-to-end dialogue model from the MultiWOZ paper (Budzianowski et al. 2018, EMNLP)
Stars: ✭ 384 (+652.94%)
Mutual labels:  seq2seq
MetricMT
The official code repository for MetricMT - a reward optimization method for NMT with learned metrics
Stars: ✭ 23 (-54.9%)
Mutual labels:  machine-translation
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-56.86%)
Mutual labels:  transformer
Zero-Shot-TTS
Unofficial Implementation of Zero-Shot Text-to-Speech for Text-Based Insertion in Audio Narration
Stars: ✭ 33 (-35.29%)
Mutual labels:  transformer
Wenet
Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 617 (+1109.8%)
Mutual labels:  transformer
text-style-transfer-benchmark
Text style transfer benchmark
Stars: ✭ 56 (+9.8%)
Mutual labels:  transformer
Deepchatmodels
Conversation models in TensorFlow. (website removed)
Stars: ✭ 312 (+511.76%)
Mutual labels:  sequence-to-sequence
MinTL
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (+19.61%)
Mutual labels:  transformer
Keras Question And Answering Web Api
Question answering system developed using seq2seq and memory network models in Keras
Stars: ✭ 21 (-58.82%)
Mutual labels:  seq2seq
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+1347.06%)
Mutual labels:  transformer
Chatlearner
A chatbot implemented in TensorFlow based on the seq2seq model, with certain rules integrated.
Stars: ✭ 528 (+935.29%)
Mutual labels:  sequence-to-sequence
Zhihu
This repo contains the source code from my personal column (https://zhuanlan.zhihu.com/zhaoyeyu), implemented in Python 3.6. It includes Natural Language Processing and Computer Vision projects such as text generation, machine translation, and deep convolutional GANs, along with other hands-on examples.
Stars: ✭ 3,307 (+6384.31%)
Mutual labels:  machine-translation
HRFormer
This is an official implementation of our NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+600%)
Mutual labels:  transformer
Transformer Survey Study
"A survey of Transformer" paper study 👩🏻‍💻🧑🏻‍💻 KoreaUniv. DSBA Lab
Stars: ✭ 166 (+225.49%)
Mutual labels:  transformer
301-360 of 635 similar projects