mmn - Moore Machine Networks (MMN): Learning Finite-State Representations of Recurrent Policy Networks
Stars: ✭ 39
DeepMove - Code for the WWW'18 paper "DeepMove: Predicting Human Mobility with Attentional Recurrent Network"
Stars: ✭ 120
tensorflow-chatbot-chinese - Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50
thermostat - Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126
xai-iml-sota - Interesting resources related to Explainable Artificial Intelligence, Interpretable Machine Learning, Interactive Machine Learning, Human-in-the-Loop, and Visual Analytics.
Stars: ✭ 51
breakout-Deep-Q-Network - Reinforcement learning | TensorFlow implementation of DQN, Dueling DQN, and Double DQN on Atari Breakout
Stars: ✭ 69
mllp - Code for the AAAI 2020 paper "Transparent Classification with Multilayer Logical Perceptrons and Random Binarization"
Stars: ✭ 15
flow1d - [ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91
seq2seq-pytorch - Sequence-to-sequence models in PyTorch
Stars: ✭ 41
bert attn viz - Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41
hierarchical-dnn-interpretations - Using/reproducing ACD from the paper "Hierarchical interpretations for neural network predictions" 🧠 (ICLR 2019)
Stars: ✭ 110
ALPS 2021 - XAI tutorial for the Explainable AI track at the ALPS 2021 winter school
Stars: ✭ 55
h-transformer-1d - Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121
Im2LaTeX - An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16
Transformer-MM-Explainability - [ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.
Stars: ✭ 484
attention-ocr - A PyTorch implementation of attention-based OCR
Stars: ✭ 44
transformers-interpret - Model explainability that works seamlessly with 🤗 Transformers. Explain your transformer model in just two lines of code.
Stars: ✭ 861
dreyeve - [TPAMI 2018] Predicting the Driver's Focus of Attention: the DR(eye)VE Project. A deep neural network trained to reproduce human driver focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88
adversarial-robustness-public - Code for the AAAI 2018 paper "Improving the Adversarial Robustness and Interpretability of Deep Neural Networks by Regularizing their Input Gradients"
Stars: ✭ 49
MGAN - Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44
TRAR-VQA - [ICCV 2021] Official implementation of "TRAR: Routing the Attention Spans in Transformers for Visual Question Answering"
Stars: ✭ 49
concept-based-xai - Library implementing state-of-the-art concept-based and disentanglement learning methods for explainable AI
Stars: ✭ 41
Pytorch Seq2seq - Tutorials on implementing sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418
Long Range Arena - Long Range Arena for benchmarking efficient Transformers
Stars: ✭ 235
natural-language-joint-query-search - Search photos on Unsplash with OpenAI's CLIP model; supports joint image+text queries and attention visualization.
Stars: ✭ 143
RecycleNet - Attentional Learning of Trash Classification
Stars: ✭ 23
Fruit-API - A universal deep reinforcement learning framework
Stars: ✭ 61
AiR - Official repository for the ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41
meg - Molecular Explanation Generator
Stars: ✭ 14
keras-utility-layer-collection - Collection of custom layers and utility functions for Keras that are missing from the main framework.
Stars: ✭ 63
6502.Net - A .NET-based cross-assembler for several 8-bit microprocessors
Stars: ✭ 44
glcapsnet - Global-Local Capsule Network (GLCapsNet), a capsule-based architecture that provides context-based eye-fixation prediction for several autonomous driving scenarios while offering both global and local interpretability.
Stars: ✭ 33
interpretable-ml - Techniques and resources for training interpretable ML models, explaining ML models, and debugging ML models.
Stars: ✭ 17
kernel-mod - NeurIPS 2018: linear-time model comparison tests.
Stars: ✭ 17
fastbasic - FastBasic, a fast BASIC interpreter for the Atari 8-bit computers
Stars: ✭ 108
adaptive-wavelets - Adaptive, interpretable wavelets across domains (NeurIPS 2021)
Stars: ✭ 58
DQN-pytorch - A PyTorch implementation of "Human-Level Control through Deep Reinforcement Learning"
Stars: ✭ 23
how attentive are gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022)
Stars: ✭ 200
DeepLearningReading - Deep learning and machine learning mini-projects. Current project: DeepMind Attentive Reader (rc-data)
Stars: ✭ 78
awesome-list - Awesome lists of retrocomputing resources (6502, Apple II, Atari, ...)
Stars: ✭ 38
External-Attention-pytorch - 🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolutions, helpful for further understanding papers. ⭐⭐⭐
Stars: ✭ 7,344
lstm-attention - Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87
ArenaR - Data generator for Arena, an interactive XAI dashboard
Stars: ✭ 28
atari-leaderboard - A leaderboard of human and machine performance on the Arcade Learning Environment (ALE).
Stars: ✭ 22
EBIM-NLI - Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24
Astgcn - ⚠️ [Deprecated] No longer maintained; please use the code at https://github.com/guoshnBJTU/ASTGCN-r-pytorch
Stars: ✭ 246
EgoCNN - Code for "Distributed, Egocentric Representations of Graphs for Detecting Critical Structures" (ICML 2019)
Stars: ✭ 16
Ai law - All kinds of baseline models for long text classification (text categorization)
Stars: ✭ 243
AttnSleep - [IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76
salvador - A free, open-source compressor for the ZX0 format
Stars: ✭ 35
retrore - A curated list of original and reverse-engineered vintage 6502 game source code.
Stars: ✭ 22
reasoning attention - Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34
Virtual-Jaguar-Rx - Virtual Jaguar, an Atari Jaguar emulator with an integrated debugger
Stars: ✭ 35