
yiqun-wang / MGCN

Licence: other
MGCN: Descriptor Learning using Multiscale GCNs (SIGGRAPH 2020)

Programming Languages

python

Projects that are alternatives of or similar to MGCN

hsn
Code for SIGGRAPH paper CNNs on Surfaces using Rotation-Equivariant Features
Stars: ✭ 71 (+129.03%)
Mutual labels:  siggraph
DOT
Decomposed Optimization Time Integration (DOT) is a domain-decomposed optimization method for fast, reliable simulation of deformation dynamics. DOT efficiently converges with frame-rate time-steps across a wide range of extreme conditions.
Stars: ✭ 37 (+19.35%)
Mutual labels:  siggraph
virtual sketching
General Virtual Sketching Framework for Vector Line Art (SIGGRAPH 2021)
Stars: ✭ 111 (+258.06%)
Mutual labels:  siggraph
DenseDescriptorLearning-Pytorch
Official Repo for the paper "Extremely Dense Point Correspondences using a Learned Feature Descriptor" (CVPR 2020)
Stars: ✭ 66 (+112.9%)
Mutual labels:  descriptor-learning
glcic.pytorch
An implementation of the SIGGRAPH 2017 paper "Globally and Locally Consistent Image Completion"
Stars: ✭ 15 (-51.61%)
Mutual labels:  siggraph
HOT
Hierarchical Optimization Time Integration (HOT) for efficient implicit timestepping of the material point method (MPM)
Stars: ✭ 83 (+167.74%)
Mutual labels:  siggraph
photo recoloring
Palette-based Photo Recoloring, implementation of the approach of Huiwen Chang et al.
Stars: ✭ 39 (+25.81%)
Mutual labels:  siggraph
OpenISS
OpenISS -- a unified multimodal motion data delivery framework.
Stars: ✭ 22 (-29.03%)
Mutual labels:  siggraph
bichon
Robust Coarse Curved TetMesh Generation
Stars: ✭ 29 (-6.45%)
Mutual labels:  siggraph

MGCN

demo

This code implements the Multiscale Graph Convolutional Network (MGCN) from our SIGGRAPH 2020 paper:

"MGCN: Descriptor Learning using Multiscale GCNs"

by Yiqun Wang, Jing Ren, Dong-Ming Yan, Jianwei Guo, Xiaopeng Zhang, and Peter Wonka.

Please consider citing the above paper if this code/program (or part of it) benefits your project.

Environment

	conda create -n MGCN python=3.7     # (options: 3.X)
	source activate MGCN                # (create and activate new environment if you use Anaconda)
	
	conda install pytorch=1.5.0 torchvision cudatoolkit=10.1 -c pytorch     # (options: 10.X)
	pip install torch-scatter==latest+cu101 -f https://pytorch-geometric.com/whl/torch-1.5.0.html
	pip install torch-sparse==latest+cu101 -f https://pytorch-geometric.com/whl/torch-1.5.0.html
	pip install torch-cluster==latest+cu101 -f https://pytorch-geometric.com/whl/torch-1.5.0.html
	pip install torch-geometric
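
After installation, the following minimal sanity check (a suggested snippet, not from the original repository) verifies that PyTorch detects a CUDA device and that torch-geometric imports cleanly:

	# check_env.py -- minimal sanity check (suggested snippet, not part of the repository)
	import torch
	import torch_geometric

	print("torch:", torch.__version__)
	print("CUDA available:", torch.cuda.is_available())
	print("torch_geometric:", torch_geometric.__version__)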

Usage

  1. Put the mesh models into the folder MGCN/datasets/Faust/data_mesh/ for the FAUST dataset or MGCN/datasets/Scape/data_mesh/ for the SCAPE dataset.

  2. Generate Wavelet Energy Decomposition Signature (WEDS) descriptors and wavelets using the WEDS repository, or directly download the processed data (training.pt and test.pt) below and put it into the folder MGCN/datasets/Faust/processed/ or MGCN/datasets/Scape/processed/.

  3. Training examples

	# for FAUST
	python MGCN_FAUST.py
	# for SCAPE
	python MGCN_SCAPE.py
  4. Restore checkpoint and generate descriptors
	# for FAUST
	python MGCN_FAUST.py --uc --gd -l --ln=mgcn_faust-300
	# for SCAPE
	python MGCN_SCAPE.py --uc --gd -l --ln=mgcn_scape-300
  5. Dense correspondence can be obtained by directly computing the nearest neighbor in descriptor space under the L2 distance (a minimal example is sketched below), and this Repository can be used to visualize the descriptors.
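
A minimal sketch of this nearest-neighbor matching, assuming the learned per-vertex descriptors of a source and a target shape have been exported to NumPy arrays; the file names source_desc.npy and target_desc.npy are hypothetical placeholders for the output of the --gd step:

	# match_descriptors.py -- minimal sketch; descriptor file names are hypothetical
	import numpy as np

	# each file holds a (num_vertices x descriptor_dim) matrix of learned descriptors
	desc_src = np.load("source_desc.npy")   # descriptors of the source shape
	desc_tgt = np.load("target_desc.npy")   # descriptors of the target shape

	# squared L2 distances between every source/target descriptor pair:
	# ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
	d2 = (np.sum(desc_src**2, axis=1, keepdims=True)
	      + np.sum(desc_tgt**2, axis=1)
	      - 2.0 * desc_src @ desc_tgt.T)

	# for every source vertex, the index of its nearest target vertex
	correspondence = np.argmin(d2, axis=1)
	np.save("correspondence.npy", correspondence)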

Processed data

Processed Dataset   | Download Link      | Description
--------------------|--------------------|----------------------------------------------------------------
FAUST original      | Google Drive, 1G   | 75 models for training and 15 models for testing (6890 points)
FAUST 5 resolutions | Google Drive, 1G   | 15 × 5 models for testing (6890, 8K, 10K, 12K, 15K points)
SCAPE remeshed      | Google Drive, 623M | 61 models for training and 10 models for testing (~5K points)
SCAPE original      | Google Drive, 218M | 10 models for testing (12.5K points)
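
A minimal sketch for inspecting one of these files, under the assumption that they follow PyTorch Geometric's InMemoryDataset convention of storing a pickled (data, slices) pair; the path below is an example and should be adjusted to where the download was placed:

	# inspect_processed.py -- minimal sketch; assumes the .pt files store a
	# (data, slices) pair as saved by PyTorch Geometric's InMemoryDataset
	import torch

	path = "MGCN/datasets/Faust/processed/training.pt"   # adjust to the downloaded file
	data, slices = torch.load(path)

	# 'data' concatenates all shapes into one Data object; 'slices' records
	# the per-shape boundaries used to split it back apart
	print(data)
	print(list(slices.keys()))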

Cite

@article{wang2020mgcn,
  title=      {MGCN: Descriptor Learning using Multiscale GCNs},
  author=     {Wang, Yiqun and Ren, Jing and Yan, Dong-Ming and Guo, Jianwei and Zhang, Xiaopeng and Wonka, Peter},
  journal=    {ACM Trans. on Graphics (Proc. {SIGGRAPH})},
  year=       {2020},
}

License

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.
