508 open-source projects that are alternatives to or similar to Graphtransformer

Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-40.11%)
Mutual labels:  attention, transformer
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+11.76%)
Mutual labels:  attention, transformer
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-86.1%)
Mutual labels:  attention, transformer
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-78.07%)
Mutual labels:  transformer, attention
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+102.67%)
Mutual labels:  attention, transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-43.32%)
Mutual labels:  attention, transformer
Sightseq
Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (-37.97%)
Mutual labels:  attention, transformer
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1727.81%)
Mutual labels:  attention, transformer
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-85.03%)
Mutual labels:  transformer, attention
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+45.99%)
Mutual labels:  attention, transformer
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need' (June 2017)
Stars: ✭ 319 (+70.59%)
Mutual labels:  attention, transformer
Deeplearning Nlp Models
A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-65.78%)
Mutual labels:  attention, transformer
CrabNet
Predict materials properties using only composition information!
Stars: ✭ 57 (-69.52%)
Mutual labels:  transformer, attention
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+119.79%)
Mutual labels:  attention, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+118.18%)
Mutual labels:  attention, transformer
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-72.73%)
Mutual labels:  transformer, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (+1.07%)
Mutual labels:  transformer, attention
Medical Transformer
PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-18.18%)
Mutual labels:  attention, transformer
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-73.8%)
Mutual labels:  transformer, attention
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-35.29%)
Mutual labels:  transformer, attention
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-84.49%)
Mutual labels:  transformer, attention
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-91.44%)
Mutual labels:  transformer, attention
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-66.31%)
Mutual labels:  transformer, attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+202.14%)
Mutual labels:  attention, transformer
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+235.29%)
Mutual labels:  attention, transformer
Jddc solution 4th
4th-place solution to the 2018 JDDC competition
Stars: ✭ 235 (+25.67%)
Mutual labels:  attention, transformer
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-48.13%)
Mutual labels:  attention, transformer
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+110.7%)
Mutual labels:  attention, transformer
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+5191.44%)
Mutual labels:  attention, transformer
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-7.49%)
Mutual labels:  attention, transformer
Onnxt5
Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-23.53%)
Mutual labels:  transformer
End2end Asr Pytorch
End-to-End Automatic Speech Recognition on PyTorch
Stars: ✭ 175 (-6.42%)
Mutual labels:  transformer
Nlp research
NLP research: a TensorFlow-based NLP deep learning project supporting four major tasks: text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (-24.6%)
Mutual labels:  transformer
Prediction Flow
Deep-learning-based CTR models implemented in PyTorch
Stars: ✭ 138 (-26.2%)
Mutual labels:  attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (-2.14%)
Mutual labels:  attention
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (-8.02%)
Mutual labels:  attention
Vqa regat
Research Code for ICCV 2019 paper "Relation-aware Graph Attention Network for Visual Question Answering"
Stars: ✭ 129 (-31.02%)
Mutual labels:  attention
Image Caption Generator
A neural network to generate captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-32.62%)
Mutual labels:  attention
Gpt 2 Tensorflow2.0
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (-8.02%)
Mutual labels:  transformer
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most general questions. The trained model has been uploaded and can be run directly.
Stars: ✭ 124 (-33.69%)
Mutual labels:  attention
Asr syllable
Research on acoustic models for speech recognition based on convolutional neural networks
Stars: ✭ 127 (-32.09%)
Mutual labels:  attention
Self Attentive Tensorflow
Tensorflow implementation of "A Structured Self-Attentive Sentence Embedding"
Stars: ✭ 189 (+1.07%)
Mutual labels:  attention
Pyramid Attention Networks Pytorch
Implementation of Pyramid Attention Networks for Semantic Segmentation.
Stars: ✭ 182 (-2.67%)
Mutual labels:  attention
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (-11.76%)
Mutual labels:  transformer
Absa keras
Keras implementation of aspect-based sentiment analysis
Stars: ✭ 126 (-32.62%)
Mutual labels:  attention
Ccnet Pure Pytorch
Criss-Cross Attention for semantic segmentation in pure PyTorch, with a faster and more precise implementation.
Stars: ✭ 124 (-33.69%)
Mutual labels:  attention
Effective transformer
Running BERT without Padding
Stars: ✭ 169 (-9.63%)
Mutual labels:  transformer
Fastpunct
Punctuation restoration and spell correction experiments.
Stars: ✭ 121 (-35.29%)
Mutual labels:  attention
Fairseq Image Captioning
Transformer-based image captioning extension for pytorch/fairseq
Stars: ✭ 180 (-3.74%)
Mutual labels:  transformer
Hey Jetson
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-13.9%)
Mutual labels:  attention
Transformer In Generating Dialogue
An implementation of 'Attention Is All You Need' with a Chinese corpus
Stars: ✭ 121 (-35.29%)
Mutual labels:  transformer
Nlp Models Tensorflow
Gathers machine learning and TensorFlow deep learning models for NLP problems; requires 1.13 < TensorFlow < 2.0
Stars: ✭ 1,603 (+757.22%)
Mutual labels:  attention
Symfony Jsonapi
JSON API Transformer Bundle for Symfony 2 and Symfony 3
Stars: ✭ 114 (-39.04%)
Mutual labels:  transformer
Mmsegmentation
OpenMMLab Semantic Segmentation Toolbox and Benchmark.
Stars: ✭ 2,875 (+1437.43%)
Mutual labels:  transformer
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+1.6%)
Mutual labels:  transformer
Sentimentanalysis
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Stars: ✭ 186 (-0.53%)
Mutual labels:  transformer
Transformer Clinic
Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (-4.28%)
Mutual labels:  transformer
Hrnet Semantic Segmentation
The OCR approach is rephrased as the Segmentation Transformer (https://arxiv.org/abs/1909.11065). This is an official implementation of semantic segmentation for HRNet (https://arxiv.org/abs/1908.07919).
Stars: ✭ 2,369 (+1166.84%)
Mutual labels:  transformer
Leader Line
Draw a leader line in your web page.
Stars: ✭ 1,872 (+901.07%)
Mutual labels:  attention
Embedding As Service
One-stop solution to encode sentences into fixed-length vectors using various embedding techniques
Stars: ✭ 151 (-19.25%)
Mutual labels:  transformer
1-60 of 508 similar projects