SunghwanHong / Cost-Aggregation-transformers

License: GPL-3.0
Official implementation of CATs

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Cost-Aggregation-transformers

cisip-FIRe
Fast Image Retrieval (FIRe) is an open source project to promote image retrieval research. It implements most of the major binary hashing methods to date, together with different popular backbone networks and public datasets.
Stars: ✭ 40 (-66.67%)
Mutual labels:  neurips, neurips-2021
AIPaperCompleteDownload
Complete download for papers in various top conferences
Stars: ✭ 64 (-46.67%)
Mutual labels:  neurips
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-32.5%)
Mutual labels:  neurips
icml-nips-iclr-dataset
Papers, authors and author affiliations from ICML, NeurIPS and ICLR 2006-2021
Stars: ✭ 21 (-82.5%)
Mutual labels:  neurips
DeepIPR
This is the code repo of our NeurIPS 2019 work that proposes novel passport-based DNN ownership verification schemes, i.e., we embed a passport layer into various deep learning architectures (e.g., AlexNet, ResNet) for Intellectual Property Rights (IPR) protection.
Stars: ✭ 63 (-47.5%)
Mutual labels:  neurips
DiscoBox
The Official PyTorch Implementation of DiscoBox.
Stars: ✭ 95 (-20.83%)
Mutual labels:  semantic-correspondence
Releasing Research Code
Tips for releasing research code in Machine Learning (with official NeurIPS 2020 recommendations)
Stars: ✭ 1,840 (+1433.33%)
Mutual labels:  neurips
Awesome-Computer-Vision-Paper-List
This repository contains the papers accepted at top computer vision conferences, making it convenient to search for related papers.
Stars: ✭ 248 (+106.67%)
Mutual labels:  neurips
NeuroAI
NeuroAI-UW seminar, a regular weekly seminar for the UW community, organized by NeuroAI Shlizerman Lab.
Stars: ✭ 36 (-70%)
Mutual labels:  neurips
stagin
STAGIN: Spatio-Temporal Attention Graph Isomorphism Network
Stars: ✭ 34 (-71.67%)
Mutual labels:  neurips
eeg-gcnn
Resources for the paper titled "EEG-GCNN: Augmenting Electroencephalogram-based Neurological Disease Diagnosis using a Domain-guided Graph Convolutional Neural Network". Accepted for publication (with an oral spotlight!) at ML4H Workshop, NeurIPS 2020.
Stars: ✭ 50 (-58.33%)
Mutual labels:  neurips
score flow
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (-59.17%)
Mutual labels:  neurips-2021
unsup-parts
Unsupervised Part Discovery from Contrastive Reconstruction (NeurIPS 2021)
Stars: ✭ 35 (-70.83%)
Mutual labels:  neurips-2021
NeuroSEED
Implementation of Neural Distance Embeddings for Biological Sequences (NeuroSEED) in PyTorch (NeurIPS 2021)
Stars: ✭ 40 (-66.67%)
Mutual labels:  neurips-2021
SemiSeg-AEL
Semi-Supervised Semantic Segmentation via Adaptive Equalization Learning, NeurIPS 2021 (Spotlight)
Stars: ✭ 79 (-34.17%)
Mutual labels:  neurips-2021
pcan
Prototypical Cross-Attention Networks for Multiple Object Tracking and Segmentation, NeurIPS 2021 Spotlight
Stars: ✭ 294 (+145%)
Mutual labels:  neurips-2021
progressive-coordinate-transforms
Progressive Coordinate Transforms for Monocular 3D Object Detection, NeurIPS 2021
Stars: ✭ 55 (-54.17%)
Mutual labels:  neurips-2021
SoCo
[NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning
Stars: ✭ 125 (+4.17%)
Mutual labels:  neurips-2021
DiGCL
The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS-2021
Stars: ✭ 27 (-77.5%)
Mutual labels:  neurips-2021
Entity-Graph-VLN
Code of the NeurIPS 2021 paper: Language and Visual Entity Relationship Graph for Agent Navigation
Stars: ✭ 34 (-71.67%)
Mutual labels:  neurips-2021

CATs: Cost Aggregation Transformers for Visual Correspondence (NeurIPS'21)

For more information, check out the paper on [arXiv].

Check out our journal extension! It will appear in TPAMI but is currently available at: [arXiv]. The code implementation is available at: https://github.com/KU-CVLAB/CATs-PlusPlus

Network

Our model CATs is illustrated below:

[Figure: overview of the CATs architecture]
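
The core idea is to compute a correlation (cost) volume between the feature maps of the source and target images, then let a transformer refine that volume globally before inferring correspondences. The sketch below illustrates this in PyTorch; the module and variable names are illustrative and not the repository's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CostAggregationSketch(nn.Module):
    """Hedged sketch of cost aggregation with a transformer (not the repo's exact model)."""
    def __init__(self, spatial=16, num_heads=8, depth=1):
        super().__init__()
        # Each "token" is one source position's cost vector over all target positions.
        layer = nn.TransformerEncoderLayer(d_model=spatial * spatial, nhead=num_heads)
        self.aggregator = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, feat_src, feat_trg):
        # feat_src, feat_trg: (B, C, H, W) backbone features with H == W == spatial
        src = F.normalize(feat_src.flatten(2), dim=1)      # (B, C, HW)
        trg = F.normalize(feat_trg.flatten(2), dim=1)      # (B, C, HW)
        cost = torch.einsum('bci,bcj->bij', src, trg)      # raw cost volume (B, HW, HW)
        cost = self.aggregator(cost.transpose(0, 1))       # (HW, B, HW): sequence-first input
        return cost.transpose(0, 1)                        # refined cost volume (B, HW, HW)

feats_a = torch.randn(2, 1024, 16, 16)                     # e.g. late-stage ResNet features
feats_b = torch.randn(2, 1024, 16, 16)
print(CostAggregationSketch()(feats_a, feats_b).shape)     # torch.Size([2, 256, 256])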

Environment Settings

git clone https://github.com/SunghwanHong/CATs
cd CATs

conda create -n CATs python=3.6
conda activate CATs

pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install -U scikit-image
pip install git+https://github.com/albumentations-team/albumentations
pip install tensorboardX termcolor timm tqdm requests pandas
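
After installation, a quick optional sanity check (a minimal example; it assumes a CUDA-capable GPU is present) confirms that the pinned versions were picked up and CUDA is visible:

python -c "import torch, torchvision; print(torch.__version__, torchvision.__version__, torch.cuda.is_available())"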

Evaluation

  • Download the pre-trained weights from Link
  • All datasets are automatically downloaded into the directory specified by the argument datapath

Result on SPair-71k: (PCK 49.9%)

  python test.py --pretrained "/path_to_pretrained_model/spair" --benchmark spair

Result on SPair-71k, feature backbone frozen: (PCK 42.4%)

  python test.py --pretrained "/path_to_pretrained_model/spair_frozen" --benchmark spair

Results on PF-PASCAL: (PCK 75.4%, 92.6%, 96.4%)

  python test.py --pretrained "/path_to_pretrained_model/pfpascal" --benchmark pfpascal

Results on PF-PASCAL, feature backbone frozen: (PCK 67.5%, 89.1%, 94.9%)

  python test.py --pretrained "/path_to_pretrained_model/pfpascal_frozen" --benchmark pfpascal
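
The numbers above are PCK (Percentage of Correct Keypoints): the fraction of transferred keypoints that land within a threshold distance, alpha times a reference size (image or bounding-box size depending on the benchmark; the three PF-PASCAL numbers correspond to increasingly loose thresholds, typically alpha = 0.05, 0.1, 0.15). A minimal, hedged sketch of such a metric, not the repository's evaluation code:

import torch

def pck(pred_kps, gt_kps, ref_size, alpha=0.1):
    """pred_kps, gt_kps: (N, 2) keypoint coordinates; ref_size: reference length (e.g. max bbox side)."""
    dists = torch.norm(pred_kps - gt_kps, dim=1)       # per-keypoint Euclidean error
    return (dists <= alpha * ref_size).float().mean().item()

# Example: PCK@0.1 for 5 keypoints with a 200-pixel reference size
pred = torch.tensor([[10., 10.], [50., 40.], [90., 90.], [120., 60.], [30., 150.]])
gt   = torch.tensor([[12., 11.], [48., 45.], [70., 95.], [121., 61.], [33., 149.]])
print(pck(pred, gt, ref_size=200.))                    # 0.8 (4 of 5 keypoints within 20 px)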

Acknowledgement

We borrow code from public projects (huge thanks to all of them), mainly from DHPF and GLU-Net.

BibTeX

If you find this research useful, please consider citing:

@inproceedings{cho2021cats,
  title={CATs: Cost Aggregation Transformers for Visual Correspondence},
  author={Cho, Seokju and Hong, Sunghwan and Jeon, Sangryul and Lee, Yunsung and Sohn, Kwanghoon and Kim, Seungryong},
  booktitle={Thirty-Fifth Conference on Neural Information Processing Systems},
  year={2021}
}