
Top 223 attention-mechanism open source projects

Kac Net
Implementation of Knowledge Aided Consistency for Weakly Supervised Phrase Grounding in Tensorflow
Eqtransformer
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Attention unet
A raw implementation of the attention-gated U-Net in Keras
Grounder
Implementation of Grounding of Textual Phrases in Images by Reconstruction in Tensorflow
Sturcture Inpainting
Source code of AAAI 2020 paper 'Learning to Incorporate Structure Knowledge for Image Inpainting'
Simplednn
SimpleDNN is a lightweight, open-source machine learning library written in Kotlin, designed to support the neural network architectures relevant to natural language processing tasks
Deepaffinity
Protein-compound affinity prediction through unified RNN-CNN
Hierarchical Attention Networks
TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification"
Fake news detection deep learning
Fake News Detection using Deep Learning models in Tensorflow
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This repository is geared towards integration with an eventual Alphafold2 replication.
Group Level Emotion Recognition
Model submitted for the ICMI 2018 EmotiW Group-Level Emotion Recognition Challenge
Pytorch Attention Guided Cyclegan
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Deepattention
Deep Visual Attention Prediction (TIP18)
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Ca Net
Code for Comprehensive Attention Convolutional Neural Networks for Explainable Medical Image Segmentation.
Textclassifier
Text classifier based on "Hierarchical Attention Networks for Document Classification"
Fed Att
Attentive Federated Learning for Private NLM
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Show Attend And Tell
TensorFlow Implementation of "Show, Attend and Tell"
Ag Cnn
A reimplementation of AG-CNN ("Thorax Disease Classification with Attention Guided Convolutional Neural Network"; "Diagnose like a Radiologist: Attention Guided Convolutional Neural Network for Thorax Disease Classification")
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. Both the Cora (transductive) and PPI (inductive) examples are supported!
Chatbot cn
A chatbot for the finance and judicial domains (with some chit-chat ability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front end is integrated via Django, and RESTful interfaces for the NLP and KG components have already been wrapped.
Pointer summarizer
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
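For orientation, the pointer-generator trick mixes the decoder's vocabulary distribution with a copy distribution taken from the attention weights over the source. A minimal NumPy sketch of that mixing step (illustrative variable names, not the repository's code):

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, p_gen):
    """Final distribution = p_gen * P_vocab + (1 - p_gen) * copy distribution.

    p_vocab   : (V,)  softmax over the fixed vocabulary
    attention : (S,)  attention weights over the source tokens
    src_ids   : (S,)  vocabulary ids of those source tokens
    p_gen     : scalar in [0, 1], generate-vs-copy gate
    """
    final = p_gen * p_vocab
    # scatter-add the copy probabilities onto the ids of the source tokens
    np.add.at(final, src_ids, (1.0 - p_gen) * attention)
    return final

# toy check: the result is still a valid probability distribution
print(pointer_generator_dist(np.array([0.7, 0.1, 0.1, 0.1]),
                             np.array([0.6, 0.4]),
                             np.array([2, 3]),
                             p_gen=0.8).sum())      # 1.0
```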
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
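As background, "linear attention" reorders the softmax attention computation so the key-value product is formed once, dropping the quadratic cost in sequence length. A rough NumPy sketch using a simple positive feature map (Performer itself uses random-feature FAVOR+ estimators; this only shows the general idea):

```python
import numpy as np

def feature_map(x):
    # simple positive feature map (elu(x) + 1); Performer approximates the
    # softmax kernel with random features instead
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(n) attention: form phi(K)^T V once, then project the queries onto it."""
    Qp, Kp = feature_map(Q), feature_map(K)       # (n, d) each
    kv = Kp.T @ V                                 # (d, d_v), shared by all queries
    z = Qp @ Kp.sum(axis=0)                       # (n,) per-query normalizer
    return (Qp @ kv) / z[:, None]

# toy usage: 6 tokens, model dim 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((6, 4)) for _ in range(3))
print(linear_attention(Q, K, V).shape)            # (6, 4)
```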
Moran v2
MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition
Keras Self Attention
Attention mechanism for processing sequential data that considers the context for each timestamp.
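To make "considers the context for each timestamp" concrete, plain scaled dot-product self-attention builds each timestep's output as a softmax-weighted sum over every timestep. A bare NumPy sketch, not the library's actual API:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Each timestep's output is a weighted sum over all timesteps."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (T, T) pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the sequence
    return weights @ V                                 # context-aware representation per timestep

T, d = 5, 8
rng = np.random.default_rng(1)
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)             # (5, 8)
```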
Deeplearning.ai Natural Language Processing Specialization
This repository contains my full work and notes for Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai
Transformer Tts
A Pytorch Implementation of "Neural Speech Synthesis with Transformer Network"
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Paperrobot
Code for PaperRobot: Incremental Draft Generation of Scientific Ideas
Simgnn
A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Keras Gat
Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
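For reference, a GAT layer scores each edge with a shared attention vector, softmax-normalizes the scores within each node's neighborhood, and aggregates the neighbors' transformed features. A compact dense-adjacency, single-head sketch in NumPy (illustrative only, not the Keras implementation):

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """One graph-attention layer (dense adjacency, single head).
    H: (N, F) node features   A: (N, N) adjacency with self-loops
    W: (F, Fp) weights        a: (2*Fp,) attention vector"""
    Z = H @ W                                       # (N, Fp) transformed features
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]), computed for all pairs at once
    e = (Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :]
    e = np.where(e > 0, e, alpha * e)
    e = np.where(A > 0, e, -1e9)                    # mask non-edges before softmax
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)           # normalize over each neighborhood
    return att @ Z                                  # aggregate neighbors' features

# toy usage: 4 nodes on a path graph, 3 -> 2 features
rng = np.random.default_rng(2)
H = rng.standard_normal((4, 3))
A = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
W = rng.standard_normal((3, 2))
a = rng.standard_normal(4)
print(gat_layer(H, A, W, a).shape)                  # (4, 2)
```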
Yolo Multi Backbones Attention
Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention, pruning, and quantization
Seq2seq chatbot
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs
Tensorflow end2end speech recognition
An end-to-end speech recognition implementation based on TensorFlow (CTC, attention, and MTL training)
Alphafold2
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Adaptiveattention
Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
Attention is all you need
A Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
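To unpack "only a single transformer encoder": ViT flattens an image into a sequence of patch tokens (plus a class token and positional embeddings) and feeds that sequence to a standard encoder. A minimal patch-extraction sketch in NumPy with the usual 224x224 image and 16x16 patches (assumed for illustration, not the repository's code):

```python
import numpy as np

def patchify(img, patch):
    """Split an (H, W, C) image into flattened (num_patches, patch*patch*C) tokens."""
    H, W, C = img.shape
    rows, cols = H // patch, W // patch
    patches = img[:rows * patch, :cols * patch].reshape(rows, patch, cols, patch, C)
    patches = patches.transpose(0, 2, 1, 3, 4).reshape(rows * cols, patch * patch * C)
    return patches

img = np.zeros((224, 224, 3))
tokens = patchify(img, 16)        # (196, 768)
print(tokens.shape)
# a learned linear projection, a prepended class token, and positional
# embeddings are then applied before the transformer encoder (not shown here)
```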
Multi Scale Attention
Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"
Timesformer Pytorch
Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
Transformer
A Pytorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"