Neural SpEnd-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+204.48%)
enformer-pytorchImplementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+8.96%)
Image-CaptionUsing LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-73.13%)
galerkin-transformer[NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (-17.16%)
Linear Attention TransformerTransformer based on a variant of attention that has linear complexity with respect to sequence length
Stars: ✭ 205 (+52.99%)
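For context, the core trick behind softmax-free, linear-complexity attention (as in the entry above and the Galerkin Transformer) fits in a few lines. The sketch below is illustrative only: the elu+1 feature map and tensor shapes are assumptions, not necessarily what this repository implements.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    """Softmax-free (kernelized) attention with O(n) cost in sequence length.

    q, k, v: (batch, heads, seq_len, dim_head). Sketch under assumed shapes;
    the elu+1 feature map is one common choice, not necessarily the one
    used by the repository above.
    """
    q = F.elu(q) + 1                                    # positive feature map phi(q)
    k = F.elu(k) + 1                                    # positive feature map phi(k)
    kv = torch.einsum('bhnd,bhne->bhde', k, v)          # sum_n phi(k_n) v_n^T; no (n x n) matrix
    z = 1 / (torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2)) + eps)  # per-query normalizer
    return torch.einsum('bhnd,bhde,bhn->bhne', q, kv, z)

# Example usage:
# out = linear_attention(*(torch.randn(2, 8, 1024, 64) for _ in range(3)))
```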
linformerImplementation of Linformer for Pytorch
Stars: ✭ 119 (-11.19%)
Self Attention CvImplementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+55.97%)
visualizationA collection of visualization functions
Stars: ✭ 189 (+41.04%)
EqtransformerEQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-29.1%)
Overlappredator[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-20.9%)
h-transformer-1dImplementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-9.7%)
Transformer TtsA Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (+211.94%)
En-transformerImplementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-2.24%)
Transformer-in-TransformerAn implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-70.15%)
SockeyeSequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+638.81%)
Awesome Bert NlpA curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+323.13%)
Eeg DlA Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+23.13%)
TransformerA TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+2620.9%)
OverlapPredator[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+118.66%)
Pytorch Original TransformerMy implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+206.72%)
Se3 Transformer PytorchImplementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-45.52%)
Nmt KerasNeural Machine Translation with Keras
Stars: ✭ 501 (+273.88%)
Transformers-RLAn easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-20.15%)
pynmtA simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-90.3%)
NLP-paper🎨🎨 NLP (Natural Language Processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-82.84%)
CrabNetPredict materials properties using only the composition information!
Stars: ✭ 57 (-57.46%)
dodrioExploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+73.88%)
DeepPhonemizerGrapheme to phoneme conversion with deep learning.
Stars: ✭ 152 (+13.43%)
LaTeX-OCRpix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+1068.66%)
egfr-attDrug effect prediction using a neural network
Stars: ✭ 17 (-87.31%)
j2j2 is a minimalist concatenative programming language that makes up for its simplicity by its ability to natively bind with C libraries' ABI *and types*, *without glue*
Stars: ✭ 37 (-72.39%)
CSV2RDFStreaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (-64.18%)
keras-deep-learningVarious implementations and projects on CNN, RNN, LSTM, GAN, etc.
Stars: ✭ 22 (-83.58%)
FNet-pytorchUnofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+52.24%)
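FNet's token-mixing idea, referenced in the entry above, is small enough to show inline: the self-attention sublayer is replaced by a parameter-free 2D Fourier transform over the hidden and sequence dimensions, keeping only the real part. A minimal sketch follows; the tensor layout is an assumption and the repository above may differ in details.

```python
import torch

def fnet_mixing(x: torch.Tensor) -> torch.Tensor:
    """FNet-style token mixing: FFT over the hidden dim, then over the
    sequence dim, keeping the real part. x: (batch, seq_len, hidden).
    Parameter-free, so it stands in for the self-attention sublayer."""
    return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real
```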
AoA-pytorchA Pytorch implementation of the Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-75.37%)
graphtransRepresenting Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (-66.42%)
blacklightA stack-based concatenative virtual machine for implementing highly concurrent languages
Stars: ✭ 42 (-68.66%)
TS-CAMCode for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Stars: ✭ 96 (-28.36%)
NTUA-slp-nlp💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-85.82%)
ntua-slp-semeval2018Deep-learning models of the NTUA-SLP team submitted to SemEval 2018 tasks 1, 2, and 3.
Stars: ✭ 79 (-41.04%)
3HANAn original implementation of "3HAN: A Deep Neural Network for Fake News Detection" (ICONIP 2017)
Stars: ✭ 29 (-78.36%)
speech-transformerTransformer implementation specialized in speech recognition tasks, using Pytorch.
Stars: ✭ 40 (-70.15%)
domain-attentionCode for the paper "Domain Attention Model for Multi-Domain Sentiment Classification"
Stars: ✭ 22 (-83.58%)
XpersonaXPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-59.7%)
DAF3DDeep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
Stars: ✭ 60 (-55.22%)
PDNThe official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-67.16%)
Vision-Language-TransformerVision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (-5.22%)
abcnn pytorchImplementation of ABCNN (Attention-Based Convolutional Neural Network) in Pytorch
Stars: ✭ 35 (-73.88%)
RSTNetRSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (-47.01%)
transformer-sltSign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-31.34%)
Transformer-ocrHandwritten text recognition using transformers.
Stars: ✭ 92 (-31.34%)