lucidrains / Conformer
License: MIT
Implementation of the convolutional module from the Conformer paper, for use in Transformers
Stars: ✭ 103
Programming Languages
python
139335 projects - #7 most used programming language
Projects that are alternatives of or similar to Conformer
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (+44.66%)
Mutual labels: artificial-intelligence, transformer
Omninet
Official Pytorch implementation of "OmniNet: A unified architecture for multi-modal multi-task learning" | Authors: Subhojeet Pramanik, Priyanka Agrawal, Aman Hussain
Stars: ✭ 448 (+334.95%)
Mutual labels: artificial-intelligence, transformer
Linear Attention Transformer
Transformer based on a variant of attention whose complexity is linear with respect to sequence length
Stars: ✭ 205 (+99.03%)
Mutual labels: artificial-intelligence, transformer
Mixture Of Experts
A Pytorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
Stars: ✭ 68 (-33.98%)
Mutual labels: artificial-intelligence, transformer
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+102.91%)
Mutual labels: artificial-intelligence, transformer
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (-29.13%)
Mutual labels: artificial-intelligence, transformer
Chemgan Challenge
Code for the paper: Benhenda, M. 2017. ChemGAN challenge for drug discovery: can AI reproduce natural chemical diversity? arXiv preprint arXiv:1708.08227.
Stars: ✭ 98 (-4.85%)
Mutual labels: artificial-intelligence
Research And Coding
A curated list of research resources
Stars: ✭ 100 (-2.91%)
Mutual labels: artificial-intelligence
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-5.83%)
Mutual labels: transformer
Papers Literature Ml Dl Rl Ai
Highly cited and useful papers related to machine learning, deep learning, AI, game theory, reinforcement learning
Stars: ✭ 1,341 (+1201.94%)
Mutual labels: artificial-intelligence
Top Deep Learning
Top 200 deep learning Github repositories sorted by the number of stars.
Stars: ✭ 1,365 (+1225.24%)
Mutual labels: artificial-intelligence
Antialiased Cnns
pip install antialiased-cnns to improve stability and accuracy
Stars: ✭ 1,363 (+1223.3%)
Mutual labels: artificial-intelligence
Helix theory
Theory of Helix: an "entropy-reduction machine" theory (applicable to building AGI, complex systems, etc.)
Stars: ✭ 98 (-4.85%)
Mutual labels: artificial-intelligence
Scientificsummarizationdatasets
Datasets I have created for scientific summarization, and a trained BertSum model
Stars: ✭ 100 (-2.91%)
Mutual labels: transformer
Har Keras Cnn
Human Activity Recognition (HAR) with 1D Convolutional Neural Network in Python and Keras
Stars: ✭ 97 (-5.83%)
Mutual labels: artificial-intelligence
Bert ocr.pytorch
Unofficial PyTorch implementation of 2D Attentional Irregular Scene Text Recognizer
Stars: ✭ 101 (-1.94%)
Mutual labels: transformer
Rlai Exercises
Exercise Solutions for Reinforcement Learning: An Introduction [2nd Edition]
Stars: ✭ 97 (-5.83%)
Mutual labels: artificial-intelligence
Text predictor
Character-level RNN (LSTM) text generator.
Stars: ✭ 99 (-3.88%)
Mutual labels: artificial-intelligence
Deep Image Analogy Pytorch
Visual Attribute Transfer through Deep Image Analogy in PyTorch!
Stars: ✭ 100 (-2.91%)
Mutual labels: artificial-intelligence
Conformer
Implementation of the convolutional module from the Conformer paper, for improving the local inductive bias in Transformers.
Install
```shell
$ pip install conformer
```
Usage
The Conformer convolutional module, the main novelty of the paper:

```python
import torch
from conformer import ConformerConvModule

layer = ConformerConvModule(
    dim = 512,
    causal = False,           # if True, the 1d depthwise conv is made causal via padding, for auto-regressive use
    expansion_factor = 2,     # multiple of the dimension to expand to for the depthwise convolution
    kernel_size = 31,         # kernel size; 17-31 was reported as optimal
    dropout = 0.              # dropout applied at the very end
)

x = torch.randn(1, 1024, 512)
x = layer(x) + x
```
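Per the paper, the convolutional module is a stack of pointwise conv → GLU → depthwise conv → BatchNorm → Swish → pointwise conv → dropout, preceded by a LayerNorm. A minimal PyTorch sketch of that stack follows; the class and attribute names are illustrative assumptions, not this library's exact implementation:

```python
import torch
from torch import nn

# Sketch of the paper's convolutional module (illustrative, not the
# library's code): LayerNorm, pointwise conv doubled for GLU, depthwise
# conv, BatchNorm, Swish (SiLU), pointwise conv, dropout.
class ConvModuleSketch(nn.Module):
    def __init__(self, dim, expansion_factor=2, kernel_size=31, dropout=0.):
        super().__init__()
        inner = dim * expansion_factor
        padding = (kernel_size - 1) // 2          # "same" padding for odd kernels
        self.norm = nn.LayerNorm(dim)
        self.pointwise_in = nn.Conv1d(dim, inner * 2, 1)   # doubled channels for GLU
        self.depthwise = nn.Conv1d(inner, inner, kernel_size,
                                   padding=padding, groups=inner)
        self.bn = nn.BatchNorm1d(inner)
        self.pointwise_out = nn.Conv1d(inner, dim, 1)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):                          # x: (batch, seq, dim)
        x = self.norm(x).transpose(1, 2)           # conv layers expect (batch, dim, seq)
        x = nn.functional.glu(self.pointwise_in(x), dim=1)
        x = nn.functional.silu(self.bn(self.depthwise(x)))  # Swish == SiLU
        x = self.dropout(self.pointwise_out(x))
        return x.transpose(1, 2)

x = torch.randn(1, 1024, 512)
out = ConvModuleSketch(512)(x)
print(out.shape)  # torch.Size([1, 1024, 512])
```

The module preserves the sequence length and dimension, which is what makes the residual connection `x = layer(x) + x` above well-formed.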
A single Conformer block:

```python
import torch
from conformer import ConformerBlock

block = ConformerBlock(
    dim = 512,
    dim_head = 64,
    heads = 8,
    ff_mult = 4,
    conv_expansion_factor = 2,
    conv_kernel_size = 31,
    attn_dropout = 0.,
    ff_dropout = 0.,
    conv_dropout = 0.
)

x = torch.randn(1, 1024, 512)
block(x)  # (1, 1024, 512)
```
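The `causal` option above relies on a standard trick: a depthwise 1D convolution becomes auto-regressive if you pad only on the left by `kernel_size - 1`, so the output at position t depends only on inputs at or before t. A small self-contained sketch with a plain `torch` Conv1d (not the library's code; the sizes are arbitrary):

```python
import torch
from torch import nn
import torch.nn.functional as F

# Left-pad by (kernel_size - 1), then apply an unpadded depthwise conv:
# output position t sees only input positions t-(kernel_size-1) .. t.
dim, kernel_size = 8, 5
conv = nn.Conv1d(dim, dim, kernel_size, groups=dim, bias=False)

x = torch.randn(1, dim, 16)                    # (batch, channels, seq)
y = conv(F.pad(x, (kernel_size - 1, 0)))       # causal: pad on the left only
assert y.shape == x.shape                      # sequence length preserved

# Causality check: changing future timesteps must not affect earlier outputs.
x2 = x.clone()
x2[..., 10:] = 0.
y2 = conv(F.pad(x2, (kernel_size - 1, 0)))
print(torch.allclose(y[..., :10], y2[..., :10]))  # True
```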
Citations
```bibtex
@misc{gulati2020conformer,
    title         = {Conformer: Convolution-augmented Transformer for Speech Recognition},
    author        = {Anmol Gulati and James Qin and Chung-Cheng Chiu and Niki Parmar and Yu Zhang and Jiahui Yu and Wei Han and Shibo Wang and Zhengdong Zhang and Yonghui Wu and Ruoming Pang},
    year          = {2020},
    eprint        = {2005.08100},
    archivePrefix = {arXiv},
    primaryClass  = {eess.AS}
}
```
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].