517 Open source projects that are alternatives of or similar to Awesome Fast Attention

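The percentage next to each star count appears to be the entry's star count relative to the reference project's own stars, which the listed figures imply was roughly 627 at the time of this snapshot. That base value is an inference from the numbers below, not something the listing states; a minimal sketch of the apparent formula:

```python
def relative_stars(stars, base=627):
    """Percentage difference between an entry's stars and the (inferred)
    star count of the reference project, rounded as in the listing."""
    return round((stars - base) / base * 100, 2)

# Checking against two entries below:
print(relative_stars(565))    # Speech Transformer -> -9.89
print(relative_stars(3646))   # Transformer -> 481.5
```

The same base reproduces the other figures as well (e.g. 6,070 stars gives +868.1%), which is why ~627 seems to be the reference count.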
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR model using the Transformer network, trained on Mandarin Chinese.
Stars: ✭ 565 (-9.89%)
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently included IWSLT pretrained models.
Stars: ✭ 411 (-34.45%)
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-95.53%)
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-91.87%)
Mutual labels:  transformer, attention
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-27.27%)
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (-49.12%)
Mutual labels:  attention, transformer
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-95.85%)
Mutual labels:  attention, transformer
Pytorch Transformer
A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 199 (-68.26%)
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-84.53%)
Mutual labels:  attention, transformer
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-82.14%)
Mutual labels:  attention, transformer
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-93.46%)
Mutual labels:  transformer, attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-92.19%)
Mutual labels:  transformer, attention
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-70.18%)
Mutual labels:  attention, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-34.93%)
Mutual labels:  attention, transformer
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-90.91%)
Mutual labels:  transformer, attention
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (-37.16%)
Mutual labels:  attention, transformer
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (-87.24%)
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (-39.55%)
Mutual labels:  attention, transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-83.09%)
Mutual labels:  attention, transformer
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+481.5%)
Jddc solution 4th
Solution that placed 4th in the 2018 JDDC competition
Stars: ✭ 235 (-62.52%)
Mutual labels:  attention, transformer
transformer
A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-96.01%)
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-80.7%)
Mutual labels:  transformer, attention
learningspoons
nlp lecture-notes and source code
Stars: ✭ 29 (-95.37%)
Mutual labels:  transformer, attention
Attention Is All You Need Pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Stars: ✭ 6,070 (+868.1%)
visualization
a collection of visualization function
Stars: ✭ 189 (-69.86%)
Mutual labels:  transformer, attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-72.41%)
Mutual labels:  attention, transformer
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+445.14%)
Mutual labels:  attention, transformer
transformer
Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-90.43%)
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (-84.53%)
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (-20.1%)
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-75.6%)
Mutual labels:  attention, transformer
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-89.95%)
Mutual labels:  transformer, attention
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-66.67%)
Mutual labels:  attention, transformer
Machine Translation
Stars: ✭ 51 (-91.87%)
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+57.89%)
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-69.7%)
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-96.65%)
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+1478.15%)
Mutual labels:  attention, transformer
Deeplearning Nlp Models
A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-89.79%)
Mutual labels:  attention, transformer
speech-transformer
Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-93.62%)
Sightseq
Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (-81.5%)
Mutual labels:  attention, transformer
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-56.46%)
Mutual labels:  attention, transformer
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-97.45%)
Mutual labels:  transformer, attention
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-53.11%)
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (-33.01%)
Mutual labels:  transformer
Former
Simple transformer implementation from scratch in pytorch.
Stars: ✭ 500 (-20.26%)
Mutual labels:  transformer
Transformer Tts
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (-33.33%)
Mutual labels:  transformer
Lightseq
LightSeq: A High Performance Inference Library for Sequence Processing and Generation
Stars: ✭ 501 (-20.1%)
Mutual labels:  transformer
Recurrent Visual Attention
A PyTorch Implementation of "Recurrent Models of Visual Attention"
Stars: ✭ 414 (-33.97%)
Mutual labels:  attention
Tsai
State-of-the-art deep learning with time series and sequences in PyTorch / fastai
Stars: ✭ 407 (-35.09%)
Mutual labels:  transformer
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (-35.09%)
Mutual labels:  attention
Typescript Is
Stars: ✭ 595 (-5.1%)
Mutual labels:  transformer
Bert paper chinese translation
Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (-10.05%)
Mutual labels:  transformer
Awesome Visual Transformer
Collect some papers about transformer with vision. Awesome Transformer with Computer Vision (CV)
Stars: ✭ 475 (-24.24%)
Mutual labels:  transformer
Deepsvg
[NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation". Includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (-35.73%)
Mutual labels:  transformer
Nlp Paper
NLP Paper
Stars: ✭ 484 (-22.81%)
Mutual labels:  transformer
Cubert
Fast implementation of BERT inference directly on NVIDIA GPUs (CUDA, cuBLAS) and Intel MKL
Stars: ✭ 395 (-37%)
Mutual labels:  transformer
Pvt
Stars: ✭ 379 (-39.55%)
Mutual labels:  transformer
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (-12.92%)
Mutual labels:  attention
1-60 of 517 similar projects