
1000 open source projects that are alternatives to or similar to Transformer

Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (-72.85%)
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (-86.26%)
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). Additionally includes playground.py for visualizing concepts that are otherwise hard to grasp. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (-88.73%)
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-99.64%)
Pytorch Transformer
PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 199 (-94.54%)
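Several entries above implement the paper "Attention Is All You Need". The core operation those repos share is scaled dot-product attention; a minimal NumPy sketch (not taken from any of the listed repos) looks like this:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core operation from
    "Attention Is All You Need" (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))  # 2 queries of dimension 4
K = rng.standard_normal((3, 4))  # 3 keys
V = rng.standard_normal((3, 4))  # 3 values
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (2, 4): one output vector per query
print(w.sum(axis=-1))  # each attention row sums to 1
```

The listed implementations wrap this kernel in multi-head projections, masking, and positional encodings, but the softmax-weighted value mixing above is the common core.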
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-99.42%)
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-88.81%)
Mutual labels:  attention-mechanism, transformer
Transformer Tts
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (-88.54%)
Mutual labels:  attention-mechanism, transformer
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (-95.91%)
Mutual labels:  attention-mechanism, transformer
Eeg Dl
A deep learning library for EEG (signal) classification tasks, based on TensorFlow.
Stars: ✭ 165 (-95.47%)
Mutual labels:  attention-mechanism, transformer
Keras Attention
Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (-80.88%)
Mutual labels:  translation, attention-mechanism
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (-91.25%)
Mutual labels:  translation, transformer
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-91.94%)
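The backtranslation augmentation that DAB performs amounts to a round trip through another language: translate target-side text forward, then back, to obtain paraphrased synthetic training pairs. A toy sketch, where the word-level lookup "translators" are stand-ins for real trained NMT models:

```python
# Toy backtranslation: the dict-based "translators" below are hypothetical
# stand-ins for forward (en->de) and backward (de->en) NMT models.
en_to_de = {"the": "die", "cat": "katze", "sat": "sass"}
de_to_en = {"die": "the", "katze": "cat", "sass": "sat down"}

def translate(sentence, table):
    # Word-by-word lookup; unknown words pass through unchanged.
    return " ".join(table.get(w, w) for w in sentence.split())

def backtranslate(sentence):
    pivot = translate(sentence, en_to_de)  # forward model: en -> de
    return translate(pivot, de_to_en)      # backward model: de -> en

source = "the cat sat"
augmented = backtranslate(source)
print((source, augmented))  # ('the cat sat', 'the cat sat down')
```

Because the round trip is not perfectly invertible, the output is a paraphrase of the input, which is exactly what makes the synthetic pairs useful extra training data.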
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (-84.5%)
Gpt 2 Tensorflow2.0
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (-95.28%)
Mutual labels:  implementation, transformer
Machine Translation
Stars: ✭ 51 (-98.6%)
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-97.09%)
Mutual labels:  attention-mechanism, transformer
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-96.74%)
Rust Bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Stars: ✭ 510 (-86.01%)
Mutual labels:  translation, transformer
Onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-96.08%)
Mutual labels:  translation, transformer
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-98.44%)
Mutual labels:  transformer, attention-mechanism
transformer
A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-99.31%)
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-87.49%)
NLP-paper
🎨 NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-99.37%)
Mutual labels:  transformer, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-96.68%)
Mutual labels:  transformer, attention-mechanism
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (-93.61%)
Mutual labels:  transformer, attention-mechanism
speech-transformer
Transformer implementation specialized in speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-98.9%)
OverlapPredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (-91.96%)
Mutual labels:  transformer, attention-mechanism
Awesome Fast Attention
list of efficient attention modules
Stars: ✭ 627 (-82.8%)
enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (-96%)
Mutual labels:  transformer, attention-mechanism
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-94.79%)
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (-97.81%)
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-99.01%)
Mutual labels:  transformer, attention-mechanism
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (-96.32%)
Mutual labels:  transformer, attention-mechanism
Eqtransformer
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-97.39%)
Mutual labels:  attention-mechanism, transformer
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-98%)
Mutual labels:  attention-mechanism, transformer
Transformer In Generating Dialogue
An implementation of "Attention Is All You Need" with a Chinese corpus
Stars: ✭ 121 (-96.68%)
Mutual labels:  attention-mechanism, transformer
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (-84.45%)
Mutual labels:  attention-mechanism, transformer
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-94.27%)
Mutual labels:  attention-mechanism, transformer
Linear Attention Transformer
Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (-94.38%)
Mutual labels:  attention-mechanism, transformer
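Several entries here (Linear Attention Transformer, linformer, galerkin-transformer) replace the quadratic softmax attention with a linear-complexity variant. One common formulation, kernelized attention in the style of Katharopoulos et al. (my own sketch, not code from these repos), factors out a fixed-size key-value summary so cost grows linearly with sequence length:

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Softmax-free attention: phi(Q) (phi(K)^T V) / (phi(Q) phi(K)^T 1).
    The (d x d) summary phi(K)^T V is computed once, so the cost is
    O(n * d^2) in sequence length n instead of O(n^2 * d)."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, strictly positive
    Qf, Kf = phi(Q), phi(K)
    kv = Kf.T @ V                  # (d, d) key-value summary, independent of n_q
    z = Qf @ Kf.sum(axis=0)        # (n_q,) per-query normalizer
    return (Qf @ kv) / (z[:, None] + eps)

rng = np.random.default_rng(1)
Q = rng.standard_normal((5, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
out = linear_attention(Q, K, V)
print(out.shape)  # (5, 8)
```

The positive feature map keeps the normalizer well-defined; the individual repos differ in which feature map or low-rank projection they use, but the reordering of the matrix products is the shared trick.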
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (-98.35%)
Transformer-in-Transformer
An Implementation of Transformer in Transformer in TensorFlow for image classification, attention inside local patches
Stars: ✭ 40 (-98.9%)
Mutual labels:  transformer, attention-mechanism
Transformer
A Pytorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-92.57%)
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-97.07%)
Mutual labels:  transformer, attention-mechanism
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-96.41%)
Mutual labels:  transformer, attention-mechanism
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (-96.74%)
Mutual labels:  transformer, attention-mechanism
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-92.51%)
Mutual labels:  translation, transformer
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-98.44%)
Mutual labels:  transformer, attention-mechanism
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-99.23%)
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-97.34%)
Mutual labels:  translation, transformer
visualization
a collection of visualization function
Stars: ✭ 189 (-94.82%)
Mutual labels:  transformer, attention-mechanism
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (-97.34%)
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: a linear attention without softmax
Stars: ✭ 111 (-96.96%)
Mutual labels:  transformer, attention-mechanism
Cognitive Speech Tts
Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (-91.44%)
Mutual labels:  transformer
Vedastr
A scene text recognition toolbox based on PyTorch
Stars: ✭ 290 (-92.05%)
Mutual labels:  transformer
Lbry Desktop
A browser and wallet for LBRY, the decentralized, user-controlled content marketplace.
Stars: ✭ 3,597 (-1.34%)
Mutual labels:  translation
Contextualized Topic Models
A python package to run contextualized topic modeling. CTMs combine BERT with topic models to get coherent topics. Also supports multilingual tasks. Cross-lingual Zero-shot model published at EACL 2021.
Stars: ✭ 318 (-91.28%)
Mutual labels:  transformer
Localize
🏁 Automatically clean your Localizable.strings files
Stars: ✭ 311 (-91.47%)
Mutual labels:  translation
Gotext
Go (Golang) GNU gettext utilities package
Stars: ✭ 292 (-91.99%)
Mutual labels:  translation
Js Lingui
🌍📖 A readable, automated, and optimized (5 kb) internationalization for JavaScript
Stars: ✭ 3,249 (-10.89%)
Mutual labels:  translation