
7147 Open source projects that are alternatives of or similar to Pytorch Original Transformer

Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, the attention mechanism, and entropy histograms. Both Cora (transductive) and PPI (inductive) examples are supported!
Stars: ✭ 908 (+120.92%)
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-71.05%)
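The windowed-attention idea in the LARNN description — a recurrent cell attending over a sliding window of its own past states — can be sketched roughly as follows. This is an illustrative single-head NumPy sketch, not code from the repository; all function and variable names are invented:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend_to_past_states(h, past_states, window=5):
    """Let the current hidden state h query a window of past cell states.

    h:           (d,)  current query vector
    past_states: list of (d,) past cell states, oldest first
    Returns a (d,) context vector: an attention-weighted mix of the
    last `window` past states.
    """
    window_states = np.stack(past_states[-window:])    # (w, d)
    scores = window_states @ h / np.sqrt(h.shape[-1])  # (w,) scaled scores
    weights = softmax(scores)                          # attention over window
    return weights @ window_states                     # (d,) context vector

rng = np.random.default_rng(0)
past = [rng.standard_normal(16) for _ in range(8)]
ctx = attend_to_past_states(rng.standard_normal(16), past)
print(ctx.shape)  # (16,)
```

In the actual LARNN this context vector would be fed back into the LSTM cell's update; the sketch only shows the windowed-attention step itself.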
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-86.13%)
Jddc solution 4th
4th-place solution to the 2018 JDDC competition.
Stars: ✭ 235 (-42.82%)
visualization
A collection of visualization functions.
Stars: ✭ 189 (-54.01%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+731.63%)
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-54.74%)
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+140.88%)
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-72.75%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-49.15%)
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+787.1%)
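Both this repository and Pytorch Original Transformer implement the scaled dot-product attention at the heart of "Attention Is All You Need". A minimal NumPy sketch of that operation (shapes and names are illustrative, not taken from either codebase):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) scaled scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                            # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 queries of dimension 8
K = rng.standard_normal((6, 8))   # 6 keys
V = rng.standard_normal((6, 8))   # 6 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8): one output per query
```

The real implementations add learned Q/K/V projections, multiple heads, and masking on top of this core.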
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-70.56%)
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (-14.84%)
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-0.73%)
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-55.23%)
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-28.47%)
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system using the Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+37.47%)
Deeplearning Nlp Models
A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-84.43%)
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+2307.54%)
Awesome Fast Attention
A list of efficient attention modules.
Stars: ✭ 627 (+52.55%)
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+21.9%)
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs. It uses an attention mechanism and generates a meaningful reply to most general questions. The trained model is included and can be run directly (if it doesn't run, I'll livestream eating my keyboard).
Stars: ✭ 124 (-69.83%)
Mutual labels:  jupyter-notebook, jupyter, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-93.19%)
Tensorwatch
Debugging, monitoring and visualization for Python Machine Learning and Data Science
Stars: ✭ 3,191 (+676.4%)
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-91.24%)
Mutual labels:  transformer, attention-mechanism
transformer
Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-85.4%)
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-87.59%)
Mutual labels:  transformer, attention
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-95.13%)
Mutual labels:  transformer, deeplearning
SentimentAnalysis
Sentiment Analysis: Deep Bi-LSTM+attention model
Stars: ✭ 32 (-92.21%)
Transformer-in-Transformer
An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Stars: ✭ 40 (-90.27%)
Mutual labels:  transformer, attention-mechanism
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (-67.4%)
Mutual labels:  transformer, attention-mechanism
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (-76.4%)
pynmt
A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13 (-96.84%)
Mutual labels:  transformer, attention-mechanism
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (-71.05%)
Mutual labels:  transformer, attention-mechanism
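Linformer's trick is to project the keys and values along the sequence dimension (n → k) before attending, reducing cost from O(n²) to O(n·k). A rough single-head NumPy sketch, assuming fixed projection matrices (the names E and F follow the paper; everything else is illustrative):

```python
import numpy as np

def linformer_attention(Q, K, V, E, F):
    """Linformer-style attention: compress K and V along the sequence
    dimension with projections E, F (k x n), then attend as usual."""
    d = Q.shape[-1]
    K_proj = E @ K                        # (k, d) compressed keys
    V_proj = F @ V                        # (k, d) compressed values
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k) instead of (n, n)
    scores -= scores.max(axis=-1, keepdims=True)
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)    # softmax over k compressed slots
    return w @ V_proj                     # (n, d)

n, k, d = 32, 8, 16
rng = np.random.default_rng(0)
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
E = rng.standard_normal((k, n)) / np.sqrt(n)  # learned in the real model
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (32, 16)
```

In the actual library E and F are learned parameters and the mechanism is multi-headed; the sketch only shows the low-rank compression that gives linear complexity.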
ai challenger 2018 sentiment analysis
Fine-grained sentiment analysis of user reviews (AI Challenger 2018).
Stars: ✭ 16 (-96.11%)
Mutual labels:  transformer, attention
enformer-pytorch
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (-64.48%)
Mutual labels:  transformer, attention-mechanism
Attention
Code for several different attention mechanisms.
Stars: ✭ 17 (-95.86%)
Mutual labels:  attention, attention-mechanism
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: linear attention without softmax.
Stars: ✭ 111 (-72.99%)
Mutual labels:  transformer, attention-mechanism
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (-0.97%)
Mutual labels:  jupyter-notebook, attention
Spark Jupyter Aws
A guide on how to set up Jupyter with Pyspark painlessly on AWS EC2 clusters, with S3 I/O support
Stars: ✭ 259 (-36.98%)
Mutual labels:  jupyter-notebook, jupyter
Da Rnn
📃 Unofficial PyTorch implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (-37.71%)
Deepsvg
[NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation". Includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (-1.95%)
Mutual labels:  jupyter-notebook, transformer
Deeplearning.ai Assignments
Stars: ✭ 268 (-34.79%)
Mutual labels:  jupyter-notebook, deeplearning
Spacy Notebooks
💫 Jupyter notebooks for spaCy examples and tutorials
Stars: ✭ 255 (-37.96%)
Mutual labels:  jupyter-notebook, jupyter
Icsharp
C# kernel for Jupyter
Stars: ✭ 263 (-36.01%)
Mutual labels:  jupyter-notebook, jupyter
Geopython
Notebooks and libraries for spatial/geo Python explorations
Stars: ✭ 268 (-34.79%)
Mutual labels:  jupyter-notebook, jupyter
Transformer
Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
Stars: ✭ 273 (-33.58%)
Mutual labels:  jupyter-notebook, transformer
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (-33.58%)
Mutual labels:  attention, transformer
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-95.38%)
Mutual labels:  attention, attention-mechanism
Football Crunching
Analysis and datasets about football (soccer)
Stars: ✭ 252 (-38.69%)
Mutual labels:  jupyter-notebook, jupyter
Gophernotes
The Go kernel for Jupyter notebooks and nteract.
Stars: ✭ 3,100 (+654.26%)
Mutual labels:  jupyter-notebook, jupyter
Transformer
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-34.06%)
Tsai
State-of-the-art deep learning for time series and sequences in PyTorch / fastai.
Stars: ✭ 407 (-0.97%)
Mutual labels:  jupyter-notebook, transformer
Adaptiveattention
Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
Stars: ✭ 303 (-26.28%)
Attention is all you need
A Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Stars: ✭ 303 (-26.28%)
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (-4.14%)
Mutual labels:  attention, transformer
Vimpyter
Edit your Jupyter notebooks in Vim/Neovim
Stars: ✭ 308 (-25.06%)
Mutual labels:  jupyter-notebook, jupyter
Spyder Notebook
Jupyter notebook integration with Spyder
Stars: ✭ 298 (-27.49%)
Mutual labels:  jupyter-notebook, jupyter
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-25.55%)
Mutual labels:  attention-mechanism, attention
1-60 of 7147 similar projects