
308 open source projects that are alternatives to or similar to flow1d

CorrelationLayer
Pure PyTorch implementation of the correlation layer commonly used in learning-based optical flow estimators (a minimal sketch of such a layer appears after this list)
Stars: ✭ 22 (-75.82%)
Mutual labels:  correlation, optical-flow
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-46.15%)
Mutual labels:  attention, iccv2021
mcorr
Inferring bacterial recombination rates from large-scale sequencing datasets.
Stars: ✭ 29 (-68.13%)
Mutual labels:  correlation
dreyeve
[TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver's focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (-3.3%)
Mutual labels:  attention
Astgcn
⚠️ [Deprecated] No longer maintained; please use the code at https://github.com/guoshnBJTU/ASTGCN-r-pytorch instead
Stars: ✭ 246 (+170.33%)
Mutual labels:  attention
tuneta
Intelligently optimizes technical indicators and optionally selects the least intercorrelated for use in machine learning models
Stars: ✭ 77 (-15.38%)
Mutual labels:  correlation
mmflow
OpenMMLab optical flow toolbox and benchmark
Stars: ✭ 711 (+681.32%)
Mutual labels:  optical-flow
ANCOMBC
Differential abundance (DA) and correlation analyses for microbial absolute abundance data
Stars: ✭ 60 (-34.07%)
Mutual labels:  correlation
MSRGCN
Official implementation of MSR-GCN (ICCV 2021 paper)
Stars: ✭ 42 (-53.85%)
Mutual labels:  iccv2021
Jddc solution 4th
4th-place solution to the 2018 JDDC competition
Stars: ✭ 235 (+158.24%)
Mutual labels:  attention
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-54.95%)
Mutual labels:  attention
Neat Vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks.
Stars: ✭ 213 (+134.07%)
Mutual labels:  attention
Machine-Learning-for-Asset-Managers
Implementation of code snippets, exercises and application to live data from Machine Learning for Asset Managers (Elements in Quantitative Finance) written by Prof. Marcos López de Prado.
Stars: ✭ 168 (+84.62%)
Mutual labels:  correlation
chinese ancient poetry
Keywords: seq2seq, attention, TensorFlow, TextRank, context
Stars: ✭ 30 (-67.03%)
Mutual labels:  attention
TreeCorr
Code for efficiently computing 2-point and 3-point correlation functions; see the project documentation for details.
Stars: ✭ 85 (-6.59%)
Mutual labels:  correlation
CrowdFlow
Optical Flow Dataset and Benchmark for Visual Crowd Analysis
Stars: ✭ 87 (-4.4%)
Mutual labels:  optical-flow
DGCA
Differential Gene Correlation Analysis
Stars: ✭ 32 (-64.84%)
Mutual labels:  correlation
how attentive are gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
Stars: ✭ 200 (+119.78%)
Mutual labels:  attention
msda
Library for multi-dimensional, multi-sensor, uni/multivariate time series data analysis, unsupervised feature selection, unsupervised deep anomaly detection, and a prototype of explainable AI for anomaly detection
Stars: ✭ 80 (-12.09%)
Mutual labels:  correlation
Deep-Matching-Prior
Official implementation of Deep Matching Prior
Stars: ✭ 21 (-76.92%)
Mutual labels:  iccv2021
Ai law
All kinds of baseline models for long text classification (text categorization)
Stars: ✭ 243 (+167.03%)
Mutual labels:  attention
gapdecoder
Google Arts & Culture downloader: a Python script to download high-resolution images from Google Arts & Culture.
Stars: ✭ 78 (-14.29%)
Mutual labels:  high-resolution
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+129.67%)
Mutual labels:  attention
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-45.05%)
Mutual labels:  attention
Doc Han Att
Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (+126.37%)
Mutual labels:  attention
xmca
Maximum Covariance Analysis in Python
Stars: ✭ 41 (-54.95%)
Mutual labels:  correlation
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+105.49%)
Mutual labels:  attention
Naos
A mildly opinionated modern cloud service architecture blueprint + reference implementation
Stars: ✭ 19 (-79.12%)
Mutual labels:  correlation
Im2LaTeX
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-82.42%)
Mutual labels:  attention
Mixpanel-Statistics
Perform statistics on Mixpanel API data
Stars: ✭ 26 (-71.43%)
Mutual labels:  correlation
Foveation-Segmentation
PyTorch implementation of Foveation for Segmentation of Ultra-High Resolution Images
Stars: ✭ 38 (-58.24%)
Mutual labels:  high-resolution
nfc-laboratory
NFC signal and protocol analyzer using SDR receiver
Stars: ✭ 41 (-54.95%)
Mutual labels:  correlation
attention-ocr
A PyTorch implementation of attention-based OCR
Stars: ✭ 44 (-51.65%)
Mutual labels:  attention
CorBinian
CorBinian: A toolbox for modelling and simulating high-dimensional binary and count-data with correlations
Stars: ✭ 15 (-83.52%)
Mutual labels:  correlation
reasoning attention
Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-62.64%)
Mutual labels:  attention
heatmaps
Better heatmaps in Python
Stars: ✭ 117 (+28.57%)
Mutual labels:  correlation
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-18.68%)
Mutual labels:  iccv2021
servicestack-request-correlation
A plugin for ServiceStack that creates a correlation id that allows requests to be tracked across multiple services
Stars: ✭ 12 (-86.81%)
Mutual labels:  correlation
AiR
Official Repository for ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (-54.95%)
Mutual labels:  attention
Optical-Flow-based-Obstacle-Avoidance
Image-based obstacle avoidance using optical flow
Stars: ✭ 24 (-73.63%)
Mutual labels:  optical-flow
Cgnl Network.pytorch
Compact Generalized Non-local Network (NIPS 2018)
Stars: ✭ 252 (+176.92%)
Mutual labels:  attention
InstanceRefer
[ICCV 2021] InstanceRefer: Cooperative Holistic Understanding for Visual Grounding on Point Clouds through Instance Multi-level Contextual Referring
Stars: ✭ 64 (-29.67%)
Mutual labels:  iccv2021
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+3656.04%)
Mutual labels:  attention
G-SFDA
Code for our ICCV 2021 paper 'Generalized Source-free Domain Adaptation'
Stars: ✭ 88 (-3.3%)
Mutual labels:  iccv2021
Long Range Arena
Long Range Arena for Benchmarking Efficient Transformers
Stars: ✭ 235 (+158.24%)
Mutual labels:  attention
keras-utility-layer-collection
Collection of custom layers and utility functions for Keras that are missing from the main framework.
Stars: ✭ 63 (-30.77%)
Mutual labels:  attention
Appnp
A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019).
Stars: ✭ 234 (+157.14%)
Mutual labels:  attention
Joint-Motion-Estimation-and-Segmentation
[MICCAI'18] Joint Learning of Motion Estimation and Segmentation for Cardiac MR Image Sequences
Stars: ✭ 45 (-50.55%)
Mutual labels:  optical-flow
Gam
A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018).
Stars: ✭ 227 (+149.45%)
Mutual labels:  attention
natural-language-joint-query-search
Search photos on Unsplash using OpenAI's CLIP model, with support for joint image+text queries and attention visualization.
Stars: ✭ 143 (+57.14%)
Mutual labels:  attention
Pen Net For Inpainting
[CVPR 2019] PEN-Net: Learning Pyramid-Context Encoder Network for High-Quality Image Inpainting
Stars: ✭ 206 (+126.37%)
Mutual labels:  attention
lstm-attention
Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87 (-4.4%)
Mutual labels:  attention
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (+124.18%)
Mutual labels:  attention
LLVIP
LLVIP: A Visible-infrared Paired Dataset for Low-light Vision
Stars: ✭ 438 (+381.32%)
Mutual labels:  iccv2021
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+110.99%)
Mutual labels:  attention
MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-51.65%)
Mutual labels:  attention
GuidedNet
Caffe implementation for "Guided Optical Flow Learning"
Stars: ✭ 28 (-69.23%)
Mutual labels:  optical-flow
humanflow2
Official repository of Learning Multi-Human Optical Flow (IJCV 2019)
Stars: ✭ 37 (-59.34%)
Mutual labels:  optical-flow
bert attn viz
Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-54.95%)
Mutual labels:  attention
pytorch-attention-augmented-convolution
A PyTorch implementation of attention-augmented convolution (https://arxiv.org/abs/1904.09925)
Stars: ✭ 20 (-78.02%)
Mutual labels:  attention
1-60 of 308 similar projects
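
For readers unfamiliar with the correlation layer mentioned in the CorrelationLayer entry above, the following is a minimal, illustrative PyTorch sketch of the idea behind it: every pixel of one feature map is compared, via channel-wise dot products, against a local search window in a second feature map, producing a cost volume that downstream flow networks decode into motion. The function name correlation and the max_disp parameter are illustrative choices for this sketch, not the API of flow1d or of any project listed above.

# Minimal sketch of a correlation (cost volume) layer, assuming
# (B, C, H, W) feature maps; not taken from any listed repository.
import torch
import torch.nn.functional as F

def correlation(fmap1: torch.Tensor, fmap2: torch.Tensor, max_disp: int = 4) -> torch.Tensor:
    """Return a (B, (2*max_disp+1)**2, H, W) cost volume for a square search window."""
    b, c, h, w = fmap1.shape
    # Pad the second feature map so every displacement stays in bounds.
    fmap2_pad = F.pad(fmap2, [max_disp] * 4)
    costs = []
    for dy in range(2 * max_disp + 1):
        for dx in range(2 * max_disp + 1):
            # Shifted view of fmap2 for displacement (dy - max_disp, dx - max_disp).
            shifted = fmap2_pad[:, :, dy:dy + h, dx:dx + w]
            # Channel-mean of the element-wise product = normalized dot product per pixel.
            costs.append((fmap1 * shifted).mean(dim=1, keepdim=True))
    return torch.cat(costs, dim=1)

if __name__ == "__main__":
    f1, f2 = torch.randn(2, 64, 32, 48), torch.randn(2, 64, 32, 48)
    print(correlation(f1, f2).shape)  # torch.Size([2, 81, 32, 48])

Note that this dense 2D search grows quadratically with the search radius; the 1D-correlation idea behind flow1d reduces that cost by searching along single axes, which is what makes high-resolution optical flow estimation tractable.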