Awesome-Human-Activity-Recognition: An up-to-date & curated list of awesome IMU-based Human Activity Recognition (ubiquitous computing) papers, methods & resources. Note that the collection is based mainly on IMU data.
Stars: ✭ 72 (+140%)
NARRE: Our implementation of NARRE (Neural Attentional Regression with Review-level Explanations).
Stars: ✭ 100 (+233.33%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning.
Stars: ✭ 121 (+303.33%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention.
Stars: ✭ 131 (+336.67%)
hexia: Mid-level PyTorch-based framework for Visual Question Answering.
Stars: ✭ 24 (-20%)
visdial: Visual Dialog, Light-weight Transformer for Many Inputs (ECCV 2020).
Stars: ✭ 27 (-10%)
memory-compressed-attention: Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia by Summarizing Long Sequences".
Stars: ✭ 47 (+56.67%)
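The idea from that paper, in brief: shorten the keys and values along the sequence axis (e.g., with a strided convolution) before computing attention, so the score matrix shrinks by the compression factor. A minimal single-head PyTorch sketch of that idea, not the repo's actual API (class and parameter names here are illustrative):

```python
import torch
from torch import nn

class CompressedAttention(nn.Module):
    """Single-head attention whose keys/values are shortened by a
    strided conv before the score matrix is formed."""
    def __init__(self, dim, compress_rate=4):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_kv = nn.Linear(dim, 2 * dim)
        # Strided 1D conv shrinks the sequence axis by `compress_rate`.
        self.compress = nn.Conv1d(dim, dim, kernel_size=compress_rate,
                                  stride=compress_rate)
        self.scale = dim ** -0.5

    def forward(self, x):                        # x: (batch, seq, dim)
        q = self.to_q(x)
        k, v = self.to_kv(x).chunk(2, dim=-1)
        # Conv1d wants (batch, dim, seq), so transpose around it.
        k = self.compress(k.transpose(1, 2)).transpose(1, 2)
        v = self.compress(v.transpose(1, 2)).transpose(1, 2)
        attn = (q @ k.transpose(1, 2) * self.scale).softmax(dim=-1)
        return attn @ v                          # (batch, seq, dim)
```

With compress_rate=4, a 1024-token sequence attends over only 256 compressed key/value positions instead of 1024.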
SequenceToSequence: A seq2seq dialogue/MT model with attention, implemented in TensorFlow.
Stars: ✭ 11 (-63.33%)
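For context, the attention step in such a seq2seq model typically scores each encoder state against the current decoder state and mixes the encoder states into a context vector. A minimal Bahdanau-style (additive) attention sketch in PyTorch, since that is the classic formulation; names and shapes are illustrative, and the repo itself uses TensorFlow:

```python
import torch
from torch import nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style scoring: score(s, h) = v^T tanh(W [s; h])."""
    def __init__(self, hidden):
        super().__init__()
        self.w = nn.Linear(2 * hidden, hidden)
        self.v = nn.Linear(hidden, 1, bias=False)

    def forward(self, decoder_state, encoder_outputs):
        # decoder_state: (batch, hidden); encoder_outputs: (batch, src_len, hidden)
        src_len = encoder_outputs.size(1)
        s = decoder_state.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.v(torch.tanh(self.w(torch.cat([s, encoder_outputs], -1))))
        weights = scores.softmax(dim=1)                   # (batch, src_len, 1)
        context = (weights * encoder_outputs).sum(dim=1)  # (batch, hidden)
        return context, weights
```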
R2Plus1D-C3D: A PyTorch implementation of R2Plus1D and C3D, based on the CVPR 2018 paper "A Closer Look at Spatiotemporal Convolutions for Action Recognition" and the ICCV 2015 paper "Learning Spatiotemporal Features with 3D Convolutional Networks".
Stars: ✭ 54 (+80%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (+90%)
dgcnn: Clean & documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (-30%)
awesome-egocentric-vision: A curated list of resources on egocentric (first-person) vision and related areas.
Stars: ✭ 103 (+243.33%)
concept-based-xai: Library implementing state-of-the-art concept-based and disentanglement learning methods for Explainable AI.
Stars: ✭ 41 (+36.67%)
SAMN: Our implementation of SAMN (Social Attentional Memory Network).
Stars: ✭ 45 (+50%)
LSTM-Attention: A comparison of LSTMs and attention mechanisms for forecasting financial time series.
Stars: ✭ 53 (+76.67%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+676.67%)
Squeeze-and-Recursion-Temporal-Gates: Code for "Learn to cycle: Time-consistent feature discovery for action recognition" [Pattern Recognit. Lett. 2021] and "Multi-Temporal Convolutions for Human Action Recognition in Videos" [IJCNN 2021].
Stars: ✭ 62 (+106.67%)
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification.
Stars: ✭ 109 (+263.33%)
Im2LaTeX: An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem.
Stars: ✭ 16 (-46.67%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention network, in PyTorch.
Stars: ✭ 473 (+1476.67%)
TS3000 TheChatBOT: A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of Neural Machine Translation. Beware: it can be as sarcastic as its creator 😝. By the way, it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-33.33%)
question-generation: Neural Models for Key Phrase Detection and Question Generation.
Stars: ✭ 29 (-3.33%)
NLP-paper: 🎨 Natural Language Processing (NLP) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-23.33%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch.
Stars: ✭ 103 (+243.33%)
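The "local plus global" combination can be illustrated with a deliberately simplified sketch: each query attends jointly to keys in a local band and to a pooled global summary, with one softmax over both branches. Note this substitutes average pooling for the paper's dynamic low-rank projection and masks a full score matrix rather than computing the local branch efficiently; it only shows the two-branch shape of the idea:

```python
import torch
import torch.nn.functional as F
from torch import nn

class LongShortAttention(nn.Module):
    """Two-branch attention: a local band plus a pooled global summary.
    Single head, illustrative rather than efficient."""
    def __init__(self, dim, window=16, pool=8):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.window, self.pool = window, pool
        self.scale = dim ** -0.5

    def forward(self, x):                         # x: (batch, seq, dim)
        b, n, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Global branch: average-pool keys/values into n // pool tokens.
        kg = F.avg_pool1d(k.transpose(1, 2), self.pool).transpose(1, 2)
        vg = F.avg_pool1d(v.transpose(1, 2), self.pool).transpose(1, 2)
        # Local branch: keep only scores within a +/- window band.
        idx = torch.arange(n, device=x.device)
        band = (idx[None, :] - idx[:, None]).abs() <= self.window
        local = (q @ k.transpose(1, 2) * self.scale).masked_fill(~band, float('-inf'))
        global_ = q @ kg.transpose(1, 2) * self.scale
        # One softmax over both branches, then mix values from each.
        attn = torch.cat([local, global_], dim=-1).softmax(dim=-1)
        return attn[..., :n] @ v + attn[..., n:] @ vg
```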
CIAN: Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora.
Stars: ✭ 17 (-43.33%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-26.67%)
uniformer-pytorch: Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, debuted in ICLR 2022.
Stars: ✭ 90 (+200%)
trove: Weakly supervised medical named-entity classification.
Stars: ✭ 55 (+83.33%)
Learning-From-Rules: Implementation of the experiments in the paper "Learning from Rules Generalizing Labeled Exemplars" (ICLR 2020, https://openreview.net/forum?id=SkeuexBtDr).
Stars: ✭ 46 (+53.33%)
Brain-Tumor-Segmentation: Attention-guided version of 2D UNet for automatic brain tumor segmentation.
Stars: ✭ 125 (+316.67%)
ChangeFormer: Official PyTorch implementation of our IGARSS'22 paper "A Transformer-Based Siamese Network for Change Detection".
Stars: ✭ 220 (+633.33%)
resolutions-2019: A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-36.67%)
Machine-Translation-Hindi-to-english-: Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-36.67%)
Optic-Disc-Unet: Attention U-Net model with post-processing for retinal optic disc segmentation.
Stars: ✭ 77 (+156.67%)
LMFD-PAD: Learnable Multi-level Frequency Decomposition and Hierarchical Attention Mechanism for Generalized Face Presentation Attack Detection.
Stars: ✭ 27 (-10%)
glimpse clouds: PyTorch implementation of the paper "Glimpse Clouds: Human Activity Recognition from Unstructured Feature Points", F. Baradel, C. Wolf, J. Mille, G.W. Taylor, CVPR 2018.
Stars: ✭ 30 (+0%)
efficient-attention: An implementation of the efficient attention module.
Stars: ✭ 191 (+536.67%)
stipcv: Real-time implementation of spatio-temporal local features.
Stars: ✭ 14 (-53.33%)
DCAN: [AAAI 2020] Code release for "Domain Conditioned Adaptation Network" (https://arxiv.org/abs/2005.06717).
Stars: ✭ 27 (-10%)
amta-net: Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images.
Stars: ✭ 26 (-13.33%)
Multi-task-Conditional-Attention-Networks: A prototype version of our submitted paper "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".
Stars: ✭ 21 (-30%)
SA-DL: Sentiment analysis with deep learning models, implemented with TensorFlow and Keras.
Stars: ✭ 35 (+16.67%)
SiGAT: Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021).
Stars: ✭ 37 (+23.33%)
axial-attention: Implementation of axial attention, attending to multi-dimensional data efficiently.
Stars: ✭ 245 (+716.67%)
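The trick axial attention implements: instead of attending over all H*W positions of a 2D feature map at once (quadratic in H*W), attend along each axis in turn, which costs on the order of H*W*(H+W). A rough PyTorch sketch of that decomposition, not the repo's implementation (names are illustrative):

```python
import torch
from torch import nn

class AxialAttention2D(nn.Module):
    """Row attention then column attention over a (batch, H, W, dim)
    feature map; dim must be divisible by heads."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):                      # x: (batch, H, W, dim)
        b, h, w, d = x.shape
        rows = x.reshape(b * h, w, d)          # treat each row as a sequence
        x = self.row_attn(rows, rows, rows)[0].reshape(b, h, w, d)
        cols = x.transpose(1, 2).reshape(b * w, h, d)
        x = self.col_attn(cols, cols, cols)[0].reshape(b, w, h, d)
        return x.transpose(1, 2)               # back to (batch, H, W, dim)
```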
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-33.33%)
dana: DANA: Dimension-Adaptive Neural Architecture (UbiComp '21 / ACM IMWUT).
Stars: ✭ 28 (-6.67%)
stanford-cs231n-assignments-2020: This repository contains my solutions to the assignments for Stanford's CS231n "Convolutional Neural Networks for Visual Recognition" (Spring 2020).
Stars: ✭ 84 (+180%)
weasel: Weakly Supervised End-to-End Learning (NeurIPS 2021).
Stars: ✭ 117 (+290%)
rnn-text-classification-tf: TensorFlow implementation of attention-based bidirectional RNN text classification.
Stars: ✭ 26 (-13.33%)
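The pattern behind that model: a bidirectional RNN encodes the tokens, a learned scoring layer weights each time step, and the weighted sum becomes the sentence representation fed to the classifier. A compact PyTorch sketch of the same pattern (the repo itself is TensorFlow; all names here are illustrative):

```python
import torch
from torch import nn

class AttnBiRNNClassifier(nn.Module):
    def __init__(self, vocab, embed_dim, hidden, n_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden, bidirectional=True,
                          batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)   # scores each time step
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, tokens):                 # tokens: (batch, seq)
        h, _ = self.rnn(self.embed(tokens))    # (batch, seq, 2*hidden)
        weights = self.attn(h).softmax(dim=1)  # (batch, seq, 1)
        sentence = (weights * h).sum(dim=1)    # attention-pooled summary
        return self.out(sentence)              # logits: (batch, n_classes)
```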