dreyeve: [TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learns to reproduce the human driver’s focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (+450%)
External-Attention-pytorch: 🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for further understanding the papers.
Stars: ✭ 7,344 (+45800%)
natural-language-joint-query-search: Search photos on Unsplash based on OpenAI's CLIP model, with support for joint image+text queries and attention visualization.
Stars: ✭ 143 (+793.75%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+21262.5%)
Im2LaTeX: An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem.
Stars: ✭ 16 (+0%)
db-safedelete: Attempts to force-delete a record; if that fails, falls back to soft deletion.
Stars: ✭ 16 (+0%)
crud-app: ❄️ A simple and beautiful CRUD application built with React.
Stars: ✭ 61 (+281.25%)
Appnp: A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019).
Stars: ✭ 234 (+1362.5%)
free-lunch-saliency: Code for "Free-Lunch Saliency via Attention in Atari Agents".
Stars: ✭ 15 (-6.25%)
SQLiteHelper: 🗄 A helper library for writing SQL statements more easily and cleanly.
Stars: ✭ 57 (+256.25%)
Erasure: Chrome extension for deleting your YouTube comment history.
Stars: ✭ 48 (+200%)
tensorflow-chatbot-chinese: Web-based chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings.
Stars: ✭ 50 (+212.5%)
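For reference, the Bahdanau (additive) attention used by the seq2seq entries above scores each encoder state h_i against the decoder state s as score_i = vᵀ tanh(W1·s + W2·h_i), then takes a softmax-weighted sum. A minimal NumPy sketch, with randomly initialized weights and illustrative shapes (not any repository's actual code):

```python
import numpy as np

def bahdanau_attention(s, H, W1, W2, v):
    """Additive attention: align decoder state s against encoder states H."""
    # score_i = v^T tanh(W1 @ s + W2 @ h_i), computed for all i at once
    scores = np.tanh(s @ W1.T + H @ W2.T) @ v         # shape: (num_steps,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax over encoder steps
    context = weights @ H                             # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
d = 8                                                 # hidden size (illustrative)
s = rng.normal(size=d)                                # decoder state
H = rng.normal(size=(5, d))                           # 5 encoder states
W1, W2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)
context, weights = bahdanau_attention(s, H, W1, W2, v)
print(context.shape)  # (8,)
```

The context vector is then concatenated with the decoder input at each step; the attention weights themselves are what chatbot demos typically visualize.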
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning.
Stars: ✭ 121 (+656.25%)
attention-ocr: A PyTorch implementation of attention-based OCR.
Stars: ✭ 44 (+175%)
seq2seq-pytorch: Sequence-to-sequence models in PyTorch.
Stars: ✭ 41 (+156.25%)
AttnSleep: [IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG".
Stars: ✭ 76 (+375%)
Long Range Arena: A benchmark for evaluating efficient Transformers on long-sequence tasks.
Stars: ✭ 235 (+1368.75%)
woodpecker: An HTTP client for Android.
Stars: ✭ 17 (+6.25%)
flow1d: [ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation.
Stars: ✭ 91 (+468.75%)
Self Attention Cv: Implementations of various self-attention mechanisms for computer vision. Ongoing repository.
Stars: ✭ 209 (+1206.25%)
reasoning attention: Unofficial implementations of attention models on the SNLI dataset.
Stars: ✭ 34 (+112.5%)
jeelizGlanceTracker: JavaScript/WebGL library that detects from the webcam video feed whether the user is looking at the screen. Lightweight and robust to all lighting conditions. Useful for playing/pausing videos depending on whether the user is watching, or for person detection. Link to live demo.
Stars: ✭ 68 (+325%)
bert attn viz: Visualize BERT's self-attention layers on text classification tasks.
Stars: ✭ 41 (+156.25%)
rmfr: Node.js implementation of rm -fr, recursive removal of files and directories.
Stars: ✭ 23 (+43.75%)
AiR: Official repository for the ECCV 2020 paper "AiR: Attention with Reasoning Capability".
Stars: ✭ 41 (+156.25%)
datastories-semeval2017-task6: Deep learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (+25%)
keras-utility-layer-collection: Collection of custom layers and utility functions for Keras that are missing from the main framework.
Stars: ✭ 63 (+293.75%)
EBIM-NLI: Enhanced BiLSTM Inference Model for Natural Language Inference.
Stars: ✭ 24 (+50%)
mern-stack-crud: MERN stack (MongoDB, Express, React and Node.js) create, read, update and delete (CRUD) web application example.
Stars: ✭ 142 (+787.5%)
how attentive are gats: Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022).
Stars: ✭ 200 (+1150%)
transformer: A PyTorch implementation of "Attention Is All You Need".
Stars: ✭ 28 (+75%)
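The core operation behind most Transformer repositories in this list is the scaled dot-product attention defined in "Attention Is All You Need": Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch, with illustrative shapes and no batching or masking:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query/key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum of values

# 3 queries, 3 keys/values, dimension 4 (illustrative sizes)
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The 1/√d_k scaling keeps the dot products from saturating the softmax as the key dimension grows; efficient-Transformer projects such as those benchmarked by Long Range Arena replace or approximate exactly this quadratic-cost step.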
wastebasket: A cross-platform Go library for moving files to the trash bin.
Stars: ✭ 30 (+87.5%)
TRAR-VQA: [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering, official implementation.
Stars: ✭ 49 (+206.25%)
lstm-attention: Attention-based bidirectional LSTM for classification tasks (ICASSP).
Stars: ✭ 87 (+443.75%)
MGAN: Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI 2019).
Stars: ✭ 44 (+175%)
RecycleNet: Attentional learning of trash classification.
Stars: ✭ 23 (+43.75%)
Astgcn: ⚠️ [Deprecated] No longer maintained; please use the code at https://github.com/guoshnBJTU/ASTGCN-r-pytorch.
Stars: ✭ 246 (+1437.5%)
aws-s3-bucket-purger: A program that quickly purges any AWS S3 bucket of objects and versions.
Stars: ✭ 18 (+12.5%)
Ai law: All kinds of baseline models for long text classification (text categorization).
Stars: ✭ 243 (+1418.75%)
object.omit: Return a copy of an object without the given keys.
Stars: ✭ 79 (+393.75%)
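object.omit is a JavaScript library, but the idea it implements is language-agnostic: build a shallow copy that excludes a set of keys, leaving the original untouched. A sketch of the same behavior in Python (names are illustrative, not the library's API):

```python
def omit(obj, keys):
    """Return a shallow copy of a dict without the given keys."""
    excluded = set(keys)
    return {k: v for k, v in obj.items() if k not in excluded}

user = {"id": 1, "name": "Ada", "password": "secret"}
print(omit(user, ["password"]))  # {'id': 1, 'name': 'Ada'}
print(user)                      # original dict is unchanged
```

Returning a new dict rather than mutating in place is the design choice that makes such helpers safe to use on shared objects.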
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and code; every file is thoroughly commented, making it a good choice for learning.
Stars: ✭ 94 (+487.5%)
DeepMove: Code for the WWW 2018 paper "DeepMove: Predicting Human Mobility with Attentional Recurrent Networks".
Stars: ✭ 120 (+650%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022.
Stars: ✭ 165 (+931.25%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (+256.25%)
DeepLearningReading: Deep learning and machine learning mini-projects. Current project: DeepMind Attentive Reader (rc-data).
Stars: ✭ 78 (+387.5%)