CorrelationLayer - Pure PyTorch implementation of the correlation layer commonly used in learning-based optical flow estimators
Stars: ✭ 22 (-75.82%)
TRAR-VQA - [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-46.15%)
mcorr - Inferring bacterial recombination rates from large-scale sequencing datasets.
Stars: ✭ 29 (-68.13%)
dreyeve - [TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (-3.3%)
Astgcn - ⚠️ [Deprecated] No longer maintained; please use the code at https://github.com/guoshnBJTU/ASTGCN-r-pytorch
Stars: ✭ 246 (+170.33%)
tuneta - Intelligently optimizes technical indicators and optionally selects the least intercorrelated for use in machine learning models
Stars: ✭ 77 (-15.38%)
mmflow - OpenMMLab optical flow toolbox and benchmark
Stars: ✭ 711 (+681.32%)
ANCOMBC - Differential abundance (DA) and correlation analyses for microbial absolute abundance data
Stars: ✭ 60 (-34.07%)
MSRGCN - Official implementation of MSR-GCN (ICCV 2021 paper)
Stars: ✭ 42 (-53.85%)
seq2seq-pytorch - Sequence-to-sequence models in PyTorch
Stars: ✭ 41 (-54.95%)
Neat Vision - Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Stars: ✭ 213 (+134.07%)
Machine-Learning-for-Asset-Managers - Implementation of code snippets, exercises, and application to live data from Machine Learning for Asset Managers (Elements in Quantitative Finance) by Prof. Marcos López de Prado.
Stars: ✭ 168 (+84.62%)
TreeCorr - Code for efficiently computing 2-point and 3-point correlation functions. For documentation, go to
Stars: ✭ 85 (-6.59%)
CrowdFlow - Optical Flow Dataset and Benchmark for Visual Crowd Analysis
Stars: ✭ 87 (-4.4%)
DGCA - Differential Gene Correlation Analysis
Stars: ✭ 32 (-64.84%)
how attentive are gats - Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022)
Stars: ✭ 200 (+119.78%)
msda - Library for multi-dimensional, multi-sensor, uni/multivariate time series data analysis, unsupervised feature selection, unsupervised deep anomaly detection, and a prototype of explainable AI for anomaly detectors
Stars: ✭ 80 (-12.09%)
Ai law - All kinds of baseline models for long text classification (text categorization)
Stars: ✭ 243 (+167.03%)
gapdecoder - Google Arts and Culture downloader: a Python script to download high-resolution images from Google Arts & Culture.
Stars: ✭ 78 (-14.29%)
Self Attention Cv - Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+129.67%)
tensorflow-chatbot-chinese - Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and Word2Vec pretrained embeddings
Stars: ✭ 50 (-45.05%)
Doc Han Att - Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (+126.37%)
xmca - Maximum Covariance Analysis in Python
Stars: ✭ 41 (-54.95%)
Graphtransformer - Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs" (DLG-AAAI'21).
Stars: ✭ 187 (+105.49%)
Naos - A mildly opinionated modern cloud service architecture blueprint + reference implementation
Stars: ✭ 19 (-79.12%)
Im2LaTeX - An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-82.42%)
Foveation-Segmentation - PyTorch implementation of Foveation for Segmentation of Ultra-High Resolution Images
Stars: ✭ 38 (-58.24%)
nfc-laboratory - NFC signal and protocol analyzer using an SDR receiver
Stars: ✭ 41 (-54.95%)
attention-ocr - A PyTorch implementation of attention-based OCR
Stars: ✭ 44 (-51.65%)
CorBinian - A toolbox for modelling and simulating high-dimensional binary and count data with correlations
Stars: ✭ 15 (-83.52%)
reasoning attention - Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-62.64%)
heatmaps - Better heatmaps in Python
Stars: ✭ 117 (+28.57%)
SnowflakeNet - (TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-18.68%)
servicestack-request-correlation - A plugin for ServiceStack that creates a correlation ID allowing requests to be tracked across multiple services
Stars: ✭ 12 (-86.81%)
AiR - Official repository for the ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (-54.95%)
InstanceRefer - [ICCV 2021] InstanceRefer: Cooperative Holistic Understanding for Visual Grounding on Point Clouds through Instance Multi-level Contextual Referring
Stars: ✭ 64 (-29.67%)
Pytorch Seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3656.04%)
G-SFDA - Code for our ICCV 2021 paper "Generalized Source-free Domain Adaptation"
Stars: ✭ 88 (-3.3%)
Long Range Arena - Long Range Arena for Benchmarking Efficient Transformers
Stars: ✭ 235 (+158.24%)
keras-utility-layer-collection - Collection of custom layers and utility functions for Keras that are missing from the main framework.
Stars: ✭ 63 (-30.77%)
Appnp - A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019).
Stars: ✭ 234 (+157.14%)
Gam - A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018).
Stars: ✭ 227 (+149.45%)
natural-language-joint-query-search - Search photos on Unsplash using OpenAI's CLIP model; supports joint image+text queries and attention visualization.
Stars: ✭ 143 (+57.14%)
Pen Net For Inpainting - [CVPR 2019] PEN-Net: Learning Pyramid-Context Encoder Network for High-Quality Image Inpainting
Stars: ✭ 206 (+126.37%)
lstm-attention - Attention-based bidirectional LSTM for a classification task (ICASSP)
Stars: ✭ 87 (-4.4%)
Guided Attention Inference Network - Implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (+124.18%)
LLVIP - A Visible-infrared Paired Dataset for Low-light Vision
Stars: ✭ 438 (+381.32%)
Hnatt - Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+110.99%)
MGAN - Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-51.65%)
GuidedNet - Caffe implementation of "Guided Optical Flow Learning"
Stars: ✭ 28 (-69.23%)
humanflow2 - Official repository of Learning Multi-Human Optical Flow (IJCV 2019)
Stars: ✭ 37 (-59.34%)
bert attn viz - Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-54.95%)