Image Caption Generator: A neural network to generate captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (+250%)
Embedding: Embedding model code and a summary of study notes.
Stars: ✭ 25 (-30.56%)
Sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990 (+2650%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Stars: ✭ 40 (+11.11%)
Image captioning: Generate captions for images using a CNN-RNN model trained on the Microsoft Common Objects in Context (MS COCO) dataset.
Stars: ✭ 51 (+41.67%)
FragmentVC: Any-to-any voice conversion that extracts and fuses fine-grained voice fragments end-to-end with attention.
Stars: ✭ 134 (+272.22%)
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch.
Stars: ✭ 146 (+305.56%)
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network".
Stars: ✭ 418 (+1061.11%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, the attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+1475%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for equivariant self-attention in PyTorch. This repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (+102.78%)
Image Captioning: Image captioning using InceptionV3 and beam search.
Stars: ✭ 290 (+705.56%)
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+66.67%)
pynmt: A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13 (-63.89%)
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+1033.33%)
Lightseq: LightSeq, a high-performance inference library for sequence processing and generation.
Stars: ✭ 501 (+1291.67%)
linformer: Implementation of Linformer for PyTorch.
Stars: ✭ 119 (+230.56%)
Routing Transformer: Fully featured implementation of the Routing Transformer.
Stars: ✭ 149 (+313.89%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+713.89%)
visualization: A collection of visualization functions.
Stars: ✭ 189 (+425%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9394.44%)
Neuralmonkey: An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (+1011.11%)
Adaptiveattention: Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning".
Stars: ✭ 303 (+741.67%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention.
Stars: ✭ 131 (+263.89%)
TS3000 TheChatBOT: A social networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of neural machine translation. Beware of it being sarcastic, just like its creator 😝. By the way, it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-44.44%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (+58.33%)
Aoanet: Code for the paper "Attention on Attention for Image Captioning" (ICCV 2019).
Stars: ✭ 242 (+572.22%)
Sca Cnn.cvpr17: Image caption generation with spatial and channel-wise attention.
Stars: ✭ 198 (+450%)
captioning chainer: A fast implementation of Neural Image Caption in Chainer.
Stars: ✭ 17 (-52.78%)
Transformer: A TensorFlow implementation of the Transformer: "Attention Is All You Need".
Stars: ✭ 3,646 (+10027.78%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer, linear attention without softmax.
Stars: ✭ 111 (+208.33%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). Additionally includes the playground.py file for visualizing otherwise seemingly hard concepts, and currently includes IWSLT pretrained models.
Stars: ✭ 411 (+1041.67%)
Nmt Keras: Neural Machine Translation with Keras.
Stars: ✭ 501 (+1291.67%)
Omninet: Official PyTorch implementation of "OmniNet: A unified architecture for multi-modal multi-task learning" | Authors: Subhojeet Pramanik, Priyanka Agrawal, Aman Hussain.
Stars: ✭ 448 (+1144.44%)
dodrio: Exploring attention weights in Transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+547.22%)
RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021).
Stars: ✭ 71 (+97.22%)
Eeg Dl: A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (+358.33%)
Sightseq: Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection.
Stars: ✭ 116 (+222.22%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. An ongoing repository.
Stars: ✭ 209 (+480.56%)
Linear Attention Transformer: A Transformer based on a variant of attention with linear complexity with respect to sequence length.
Stars: ✭ 205 (+469.44%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning".
Stars: ✭ 107 (+197.22%)
Overlappredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (+194.44%)
udacity-cvnd-projects: My solutions to the projects assigned for the Udacity Computer Vision Nanodegree.
Stars: ✭ 36 (+0%)
catr: Image Captioning Using Transformer.
Stars: ✭ 206 (+472.22%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning.
Stars: ✭ 121 (+236.11%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+975%)
Seq2seq chatbot new: A TensorFlow implementation of a simple dialogue system based on a seq2seq model, with embedding, attention, beam_search, and other features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (+300%)
Eqtransformer: EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (+163.89%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging".
Stars: ✭ 55 (+52.78%)
NLP-paper: 🎨🎨 An NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-36.11%)
minimal-nmt: A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (+0%)