
673 open source projects that are alternatives to or similar to h-transformer-1d

Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included playground.py for visualizing concepts that are otherwise hard to grasp. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+239.67%)
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+237.19%)
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-52.89%)
visualization
A collection of visualization functions
Stars: ✭ 189 (+56.2%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+72.73%)
OverlapPredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+142.15%)
Mutual labels:  transformer, attention-mechanism
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-57.85%)
Mutual labels:  transformer, attention
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (-1.65%)
Mutual labels:  transformer, attention-mechanism
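
As a rough illustration of what the linformer entry above implements, here is a minimal PyTorch sketch of Linformer-style low-rank self-attention. It is not code from the linked repository, and the class and parameter names (LowRankSelfAttention, proj_k, proj_v) are illustrative assumptions; the point is only that projecting keys and values from length n down to a fixed k makes the attention matrix n × k rather than n × n.

```python
# Illustrative sketch of the Linformer idea (not the linked repo's code):
# compress the length-n key/value sequences to a fixed k with learned
# projections, so attention costs O(n*k) instead of O(n^2).
import torch
import torch.nn as nn

class LowRankSelfAttention(nn.Module):
    def __init__(self, dim, seq_len, k=64, heads=8):
        super().__init__()
        assert dim % heads == 0
        self.heads = heads
        self.scale = (dim // heads) ** -0.5
        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)
        # learned projections that compress the sequence axis n -> k
        self.proj_k = nn.Parameter(torch.randn(seq_len, k) / seq_len ** 0.5)
        self.proj_v = nn.Parameter(torch.randn(seq_len, k) / seq_len ** 0.5)
        self.to_out = nn.Linear(dim, dim)

    def forward(self, x):                           # x: (batch, n, dim)
        b, n, d = x.shape
        h = self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim=-1)
        # split heads: (b, h, n, d_head)
        q, k, v = (t.reshape(b, n, h, -1).transpose(1, 2) for t in (q, k, v))
        # compress keys/values along the sequence axis: (b, h, k, d_head)
        k = torch.einsum('bhnd,nk->bhkd', k, self.proj_k[:n])
        v = torch.einsum('bhnd,nk->bhkd', v, self.proj_v[:n])
        attn = (q @ k.transpose(-1, -2) * self.scale).softmax(dim=-1)  # (b, h, n, k)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)              # (b, n, dim)
        return self.to_out(out)

attn = LowRankSelfAttention(dim=128, seq_len=256)
print(attn(torch.randn(2, 256, 128)).shape)  # torch.Size([2, 256, 128])
```
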
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+225.62%)
Mutual labels:  transformer, attention
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+418.18%)
Mutual labels:  transformer, attention
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+368.6%)
Mutual labels:  transformer, attention-mechanism
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-78.51%)
Mutual labels:  transformer, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-76.86%)
Mutual labels:  transformer, attention
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-47.93%)
Mutual labels:  transformer, attention
Transformer-in-Transformer
An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-66.94%)
Mutual labels:  transformer, attention-mechanism
enformer-pytorch
Implementation of Enformer, DeepMind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+20.66%)
Mutual labels:  transformer, attention-mechanism
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+2913.22%)
Mutual labels:  transformer, attention-mechanism
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+213.22%)
Mutual labels:  transformer, attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+366.94%)
Mutual labels:  transformer, attention
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: a linear attention without softmax
Stars: ✭ 111 (-8.26%)
Mutual labels:  transformer, attention-mechanism
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-19.83%)
Mutual labels:  transformer, attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-12.4%)
Mutual labels:  transformer, attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-59.5%)
Mutual labels:  transformer, attention
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (+23.14%)
Mutual labels:  transformer, attention-mechanism
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-7.44%)
Mutual labels:  transformer, attention
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+26.45%)
Mutual labels:  transformer, attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (+42.98%)
Mutual labels:  transformer, attention
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+92.56%)
Mutual labels:  transformer, attention-mechanism
NLP-paper
🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-80.99%)
Mutual labels:  transformer, attention-mechanism
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+2724.79%)
Mutual labels:  transformer, attention
Image-Caption
Using an LSTM or a Transformer for image captioning in PyTorch
Stars: ✭ 36 (-70.25%)
Mutual labels:  transformer, attention-mechanism
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+10.74%)
Mutual labels:  transformer, attention-mechanism
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-89.26%)
Mutual labels:  transformer, attention-mechanism
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-76.03%)
Mutual labels:  transformer, attention
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (+163.64%)
Mutual labels:  transformer, attention
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+125.62%)
Mutual labels:  transformer, attention
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-86.78%)
Mutual labels:  transformer, attention
lstm-attention
Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87 (-28.1%)
Mutual labels:  attention, attention-mechanism
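
The lstm-attention entry above follows a common pattern: score each timestep of a bidirectional LSTM, then pool the hidden states with the resulting attention weights before classifying. Below is a minimal PyTorch sketch of that general pattern, not the repository's actual model; all names here are illustrative.

```python
# Generic attention-pooled BiLSTM classifier (illustrative, not the repo's code).
import torch
import torch.nn as nn

class AttnBiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)      # scores each timestep
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, tokens):                        # tokens: (batch, seq_len)
        h, _ = self.lstm(self.embed(tokens))          # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time
        context = (weights * h).sum(dim=1)            # weighted sum of hidden states
        return self.fc(context)

model = AttnBiLSTMClassifier(vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=5)
print(model(torch.randint(0, 10000, (4, 50))).shape)  # torch.Size([4, 5])
```
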
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-11.57%)
Mutual labels:  transformer, attention-mechanism
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+314.05%)
Mutual labels:  transformer, attention-mechanism
Transformer Tts
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (+245.45%)
Mutual labels:  transformer, attention-mechanism
Neat Vision
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks (framework-agnostic).
Stars: ✭ 213 (+76.03%)
Mutual labels:  attention, attention-mechanism
Eqtransformer
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-21.49%)
Mutual labels:  transformer, attention-mechanism
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+8077.69%)
Mutual labels:  transformer, attention
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-12.4%)
Mutual labels:  transformer, attention-mechanism
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-39.67%)
Mutual labels:  transformer, attention-mechanism
Transformer In Generating Dialogue
An implementation of 'Attention Is All You Need' with a Chinese corpus
Stars: ✭ 121 (+0%)
Mutual labels:  transformer, attention-mechanism
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (-4.13%)
Mutual labels:  transformer, attention
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (+36.36%)
Mutual labels:  transformer, attention-mechanism
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-47.11%)
Mutual labels:  transformer, attention
Jddc solution 4th
4th-place solution to the 2018 JDDC competition
Stars: ✭ 235 (+94.21%)
Mutual labels:  transformer, attention
Im2LaTeX
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-86.78%)
Mutual labels:  attention, attention-mechanism
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-66.12%)
Mutual labels:  transformer, attention
Linear Attention Transformer
Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (+69.42%)
Mutual labels:  transformer, attention-mechanism
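
The Linear Attention Transformer entry above (and the galerkin-transformer entry earlier) builds on kernelized, softmax-free attention. Below is a generic PyTorch sketch of the underlying trick, not either project's API, and the feature map chosen here (elu + 1) is just one common assumption: with a positive feature map phi, phi(Q)(phi(K)^T V) can be computed by forming the small d × d product phi(K)^T V first, so cost grows linearly with sequence length.

```python
# Illustrative sketch of linear (kernelized) attention; not code from the listed repos.
# Replacing softmax(QK^T)V with phi(Q)(phi(K)^T V) lets us compute K^T V first,
# giving O(n * d^2) cost instead of O(n^2 * d).
import torch
import torch.nn.functional as F

def linear_attention(q, k, v):
    """Non-causal linear attention. q, k, v: (batch, heads, n, d_head)."""
    q = F.elu(q) + 1                                 # positive feature map phi
    k = F.elu(k) + 1
    kv = torch.einsum('bhnd,bhne->bhde', k, v)       # d_head x d_head summary, O(n d^2)
    z = 1.0 / torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2))  # normalizer per query
    return torch.einsum('bhnd,bhde,bhn->bhne', q, kv, z)

q = k = v = torch.randn(2, 8, 1024, 64)
print(linear_attention(q, k, v).shape)  # torch.Size([2, 8, 1024, 64])
```
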
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+58.68%)
Mutual labels:  attention, attention-mechanism
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository aims to apply GAIN to the fcn8 architecture used for segmentation.
Stars: ✭ 204 (+68.6%)
Mutual labels:  attention, attention-mechanism
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+718.18%)
Mutual labels:  transformer, attention-mechanism
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+54.55%)
Mutual labels:  transformer, attention
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-52.89%)
Mutual labels:  transformer, attention-mechanism
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+8.26%)
Mutual labels:  transformer, attention-mechanism
1-60 of 673 similar projects