Pytorch Original Transformer: My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+181.51%)
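The core mechanism behind the original transformer is scaled dot-product attention. As a rough orientation (a minimal NumPy sketch of the textbook formula, not code from the repository above), it looks like this:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al.)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)  # each query's weights sum to 1
    return weights @ V                  # (n_q, d_v) attention-weighted values

rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = rng.normal(size=(3, n, d))   # toy queries, keys, values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Note the explicit (n_q, n_k) score matrix: it is this quadratic cost in sequence length that the linear-attention variants further down this list try to avoid.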
pynmt: a simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-91.1%)
Image-Caption: Using LSTM or Transformer to solve image captioning in PyTorch
Stars: ✭ 36 (-75.34%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (+288.36%)
Linear Attention Transformer: Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (+40.41%)
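The trick behind kernel-based linear attention is associativity: with a positive feature map φ, φ(Q)(φ(K)ᵀV) can be computed without ever forming the n×n score matrix. A minimal sketch using the elu(x)+1 feature map of Katharopoulos et al. (the repository above may use a different variant):

```python
import numpy as np

def feature_map(x):
    # positive kernel feature map phi(x) = elu(x) + 1; an assumption here,
    # not necessarily the exact map used by the repo above
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(n) attention: form the small (d, d_v) matrix K^T V once instead of
    the (n, n) score matrix."""
    Qf, Kf = feature_map(Q), feature_map(K)  # (n, d) feature-mapped queries/keys
    kv = Kf.T @ V                            # (d, d_v) summed key-value products
    z = Qf @ Kf.sum(axis=0)                  # (n,) per-query normalizer
    return (Qf @ kv) / z[:, None]            # (n, d_v) normalized outputs

rng = np.random.default_rng(1)
n, d = 6, 4
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (6, 4)
```

Because φ is strictly positive, the normalizer z is always positive, so no softmax (and no n×n matrix) is ever needed.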
Neural sp: End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+179.45%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-26.71%)
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+578.08%)
Overlappredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-27.4%)
go enrichment: Transcript annotation and GO enrichment Fisher tests
Stars: ✭ 24 (-83.56%)
Transformer: A TensorFlow implementation of the Transformer: "Attention Is All You Need"
Stars: ✭ 3,646 (+2397.26%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-72.6%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+100.68%)
Eqtransformer: EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-34.93%)
Eeg Dl: A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (+13.01%)
FragmentVC: Any-to-any voice conversion by end-to-end extraction and fusion of fine-grained voice fragments with attention
Stars: ✭ 134 (-8.22%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (-60.96%)
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (+186.3%)
Nmt Keras: Neural machine translation with Keras
Stars: ✭ 501 (+243.15%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for equivariant self-attention, in PyTorch. This repository is geared towards integration with an eventual AlphaFold2 replication.
Stars: ✭ 73 (-50%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-10.27%)
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning
Stars: ✭ 121 (-17.12%)
switchde: Inference of switch-like differential expression along single-cell trajectories
Stars: ✭ 19 (-86.99%)
graphsim: R package to simulate expression data from an igraph network using mvtnorm (CRAN; JOSS)
Stars: ✭ 16 (-89.04%)
visualization: a collection of visualization functions
Stars: ✭ 189 (+29.45%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (-23.97%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (-18.49%)
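Linformer takes a different route to sub-quadratic attention: it projects the length-n key and value sequences down to a fixed k ≪ n with learned matrices, so the score matrix is n×k instead of n×n. A minimal NumPy sketch of that idea (random projections stand in for the learned ones; variable names are illustrative, not the repository's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E, F):
    """Low-rank attention: E and F project the sequence axis from n to k,
    shrinking the score matrix from (n, n) to (n, k)."""
    d = Q.shape[-1]
    K_proj, V_proj = E @ K, F @ V       # (k, d) keys, (k, d_v) values
    scores = Q @ K_proj.T / np.sqrt(d)  # (n, k) instead of (n, n)
    return softmax(scores) @ V_proj     # (n, d_v)

rng = np.random.default_rng(2)
n, d, k = 16, 8, 4
Q, K, V = rng.normal(size=(3, n, d))
E, F = rng.normal(size=(2, k, n)) / np.sqrt(n)  # learned in the real model
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (16, 8)
```

With k fixed, both memory and compute scale linearly in n, at the cost of a low-rank approximation of the full attention matrix.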
Self Attention Cv: Implementation of various self-attention mechanisms for computer vision. Ongoing repository.
Stars: ✭ 209 (+43.15%)
NLP-paper: 🎨 NLP (natural language processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-84.25%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+59.59%)
DAF3D: Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
Stars: ✭ 60 (-58.9%)
MERINGUE: Characterizing spatial gene expression heterogeneity in spatially resolved single-cell transcriptomics data with nonuniform cellular densities
Stars: ✭ 33 (-77.4%)
interARTIC: An interactive local web application for viral whole-genome sequencing utilising the ARTIC network pipelines.
Stars: ✭ 22 (-84.93%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (-13.01%)
CSV2RDF: Streaming, transforming, SPARQL-based CSV-to-RDF converter. Apache license.
Stars: ✭ 48 (-67.12%)
Scaff10X: Pipeline for scaffolding and breaking a genome assembly using 10x Genomics linked reads
Stars: ✭ 21 (-85.62%)
transformer-slt: Sign Language Translation with Transformers (COLING 2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-36.99%)
faster lmm d: A faster LMM for GWAS. Supports a GPU backend.
Stars: ✭ 12 (-91.78%)
souporcell: Clustering scRNA-seq by genotype
Stars: ✭ 88 (-39.73%)
text2keywords: Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (-63.7%)
NTUA-slp-nlp: 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA
Stars: ✭ 19 (-86.99%)
Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record
Stars: ✭ 85 (-41.78%)
SANET: Arbitrary Style Transfer with Style-Attentional Networks
Stars: ✭ 105 (-28.08%)
jgi-query: A simple command-line tool to download data from Joint Genome Institute databases
Stars: ✭ 38 (-73.97%)
rnaseq-nf: A proof-of-concept RNA-seq pipeline
Stars: ✭ 44 (-69.86%)
halonet-pytorch: Implementation of the 😇 attention layer from the paper "Scaling Local Self-Attention For Parameter Efficient Visual Backbones"
Stars: ✭ 181 (+23.97%)
unimap: An experimental fork of minimap2 optimized for assembly-to-reference alignment
Stars: ✭ 76 (-47.95%)
LaTeX-OCR: pix2tex, using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+972.6%)
jbrowse-components: Monorepo with JBrowse 2 web, JBrowse 2 desktop, the JB core package, and core plugins. To customize behaviors, write an in-house plugin.
Stars: ✭ 89 (-39.04%)
DeepPhonemizer: Grapheme-to-phoneme conversion with deep learning.
Stars: ✭ 152 (+4.11%)
TS-CAM: Code for "TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization".
Stars: ✭ 96 (-34.25%)
psichomics: Interactive R package to quantify, analyse, and visualise alternative splicing
Stars: ✭ 26 (-82.19%)