
zouchuhang / Silhouette-Guided-3D

License: MIT License
PyTorch code of our WACV 2020 paper: Silhouette Guided Point Cloud Reconstruction beyond Occlusion

Programming Languages

Python
139335 projects - #7 most used programming language
C++
36643 projects - #6 most used programming language
Cuda
1817 projects
C
50402 projects - #5 most used programming language
Matlab
3953 projects
Makefile
30231 projects

Projects that are alternatives to or similar to Silhouette-Guided-3D

3d Machine Learning
A resource repository for 3D machine learning
Stars: ✭ 7,405 (+20469.44%)
Mutual labels:  point-cloud, 3d-reconstruction
Openmvs
open Multi-View Stereo reconstruction library
Stars: ✭ 1,842 (+5016.67%)
Mutual labels:  point-cloud, 3d-reconstruction
3PU pytorch
PyTorch implementation of "Patch-based Progressive 3D Point Set Upsampling"
Stars: ✭ 61 (+69.44%)
Mutual labels:  point-cloud, 3d-reconstruction
BtcDet
Behind the Curtain: Learning Occluded Shapes for 3D Object Detection
Stars: ✭ 104 (+188.89%)
Mutual labels:  point-cloud, occlusion
NeuralPull
Implementation of ICML'2021: Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces
Stars: ✭ 149 (+313.89%)
Mutual labels:  point-cloud, 3d-reconstruction
Meshlab
The open source mesh processing system
Stars: ✭ 2,619 (+7175%)
Mutual labels:  point-cloud, 3d-reconstruction
Awsome deep geometry learning
A list of resources about deep learning solutions on 3D shape processing
Stars: ✭ 105 (+191.67%)
Mutual labels:  point-cloud, 3d-reconstruction
softpool
SoftPoolNet: Shape Descriptor for Point Cloud Completion and Classification - ECCV 2020 oral
Stars: ✭ 62 (+72.22%)
Mutual labels:  point-cloud, 3d-reconstruction
Pcn
Code for PCN: Point Completion Network in 3DV'18 (Oral)
Stars: ✭ 238 (+561.11%)
Mutual labels:  point-cloud, 3d-reconstruction
Msn Point Cloud Completion
Morphing and Sampling Network for Dense Point Cloud Completion (AAAI2020)
Stars: ✭ 196 (+444.44%)
Mutual labels:  point-cloud, 3d-reconstruction
Point2Mesh
Meshing Point Clouds with Predicted Intrinsic-Extrinsic Ratio Guidance (ECCV2020)
Stars: ✭ 61 (+69.44%)
Mutual labels:  point-cloud, 3d-reconstruction
pyRANSAC-3D
A python tool for fitting primitives 3D shapes in point clouds using RANSAC algorithm
Stars: ✭ 253 (+602.78%)
Mutual labels:  point-cloud, 3d-reconstruction
back2future
Unsupervised Learning of Multi-Frame Optical Flow with Occlusions
Stars: ✭ 39 (+8.33%)
Mutual labels:  occlusion
lowshot-shapebias
Learning low-shot object classification with explicit shape bias learned from point clouds
Stars: ✭ 37 (+2.78%)
Mutual labels:  point-cloud
lvr2
Las Vegas Reconstruction 2.0
Stars: ✭ 39 (+8.33%)
Mutual labels:  point-cloud
pc2vid
converts set of point clouds to a video using three.js
Stars: ✭ 18 (-50%)
Mutual labels:  point-cloud
pcljava
A port of the Point Cloud Library (PCL) using Java Native Interface (JNI).
Stars: ✭ 19 (-47.22%)
Mutual labels:  point-cloud
awesome-lidar
😎 Awesome LIDAR list. The list includes LIDAR manufacturers, datasets, point cloud-processing algorithms, point cloud frameworks and simulators.
Stars: ✭ 217 (+502.78%)
Mutual labels:  point-cloud
self-sample
Single shape Deep Point Cloud Consolidation [TOG 2021]
Stars: ✭ 33 (-8.33%)
Mutual labels:  point-cloud
G2LTex
Code for CVPR 2018 paper --- Texture Mapping for 3D Reconstruction with RGB-D Sensor
Stars: ✭ 104 (+188.89%)
Mutual labels:  3d-reconstruction

Silhouette-Guided-3D

PyTorch implementation of our WACV 2020 paper: "Silhouette Guided Point Cloud Reconstruction beyond Occlusion"

Our short introduction video

Network architecture:

Requirements

  • Python 3
  • PyTorch >= 0.4.0
  • numpy, scipy, scikit-image (skimage), scikit-learn (sklearn); pickle, random, and re are part of the Python standard library
  • torchvision
  • Matlab (for the FSSR-based post-processing step)
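
A quick way to sanity-check the environment is shown below; a minimal sketch assuming the package names above map to their usual distributions (scikit-image for skimage, scikit-learn for sklearn; pickle, random, and re ship with Python):

    # Minimal environment check for the requirements listed above.
    import torch
    import torchvision
    import numpy, scipy, skimage, sklearn  # pickle, random, re are in the standard library

    print("torch:", torch.__version__)            # expected >= 0.4.0
    print("torchvision:", torchvision.__version__)
    print("CUDA available:", torch.cuda.is_available())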

Installation

  • Install mve by following its instructions. This is for FSSR-based point cloud refinement.
  • Under the ./matlab folder, install gptoolbox for Matlab-based Poisson-disc sampling.
  • [Optional] Install the Pix3D evaluation toolkit under the current folder. Note that this requires TensorFlow.
  • [Optional] Install the PCN evaluation toolkit under the current folder. Note that this requires TensorFlow. The PCN toolkit is for object-centered evaluation.
  • [Optional] Install Mask-RCNN benchmark under the current folder. This is for obtaining the visible silhouettes used for completion on Pix3D (we've included pre-processed results below).

Download Data and Pre-trained Model

  • Download pre-trained models and put them under the ./model/ folder.
  • Download the pre-processed DYCE dataset and put it under the ./data/ folder.
  • Download the pre-processed Pix3D dataset and put it under the ./data/ folder. This includes pre-computed complete silhouettes and ground-truth point clouds rotated w.r.t. the camera position. We've excluded examples with incorrect Mask-RCNN detections.
  • Download the ShapeNet dataset and put it under the ./data/ folder.
  • Download the pre-processed LSUN dataset and put it under the ./data/ folder.
  • Download the pre-computed results and put them under the ./result/ folder. These include point cloud predictions on ShapeNet and Pix3D after FSSR refinement.
  • [Optional] Download the ShapeNet rendered images and put them under the ./data/ShapeNet/ folder. This is for comparing to the object-centered point cloud reconstruction approach.
  • [Optional] Download the ShapeNet object-centered point cloud ground truth and put it under the ./data/ShapeNet/ folder. This is also for comparing to the object-centered point cloud reconstruction approach.
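
Once everything is downloaded, the training and evaluation scripts expect the folders listed above. A minimal sketch that only verifies the top-level directories (the exact file layout inside each folder is not spelled out here):

    # Check that the top-level folders referenced in this README exist.
    import os

    expected = [
        "./model",          # pre-trained models
        "./data",           # DYCE, Pix3D, ShapeNet, LSUN
        "./data/ShapeNet",  # ShapeNet data (and optional rendered images / GT point clouds)
        "./result",         # pre-computed or saved predictions
    ]

    for path in expected:
        print(path, "ok" if os.path.isdir(path) else "MISSING")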

Training

  • Point cloud reconstruction

    python train.py
    python test.py
    

    This will save network predictions for the downstream FSSR post-refinement step.

  • Silhouette completion. First train on the DYCE dataset:

    python train_sc.py
    

    Then finetune on the Pix3D dataset using 5-fold cross-validation (you will need to run it 5 times, changing the fold number in L32-35 of the script; a fold-splitting sketch appears at the end of this section):

    python train_sc_ft.py
    python test_sc_pix3d.py
    
  • Silhouette-guided point cloud reconstruction

    python train_occ.py
    python test_rec_pix3d.py
    

    Then perform the FSSR post-refinement step as described below.
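
For the Pix3D finetuning step above, the sketch below illustrates how a 5-fold split can be generated with scikit-learn. The actual folds are hard-coded in L32-35 of train_sc_ft.py and may be defined differently (e.g., by image list rather than by index), so treat this only as an illustration of the cross-validation scheme:

    # Illustration of 5-fold cross-validation over example indices.
    # The folds actually used by train_sc_ft.py are defined in the script itself.
    import numpy as np
    from sklearn.model_selection import KFold

    num_examples = 100  # hypothetical dataset size
    indices = np.arange(num_examples)

    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    for fold, (train_idx, test_idx) in enumerate(kf.split(indices)):
        # finetune on train_idx, evaluate on test_idx, then aggregate over the 5 folds
        print("fold", fold, ":", len(train_idx), "train /", len(test_idx), "test")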

Surface-Based Point Cloud Refinement

  • Start matlab
    cd matlab
    matlab
    
  • Pre-compute the FSSR params (per-pixel normal and scale); change the folder name to your saved network predictions path (see the normal/scale sketch at the end of this section for what these quantities are):
    preComputeFssrParam
    
  • FSSR. Here we provide sample batch-process code (you need to go back to the main folder):
    cd ..
    python fssr_batch_process.py
    
  • Smoothing:
    cd matlab
    FssrPostRefine
    cd ..
    
    This produces the refined point clouds for evaluation.
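
The "per-pixel normal and scale" inputs that FSSR consumes are computed here in Matlab (preComputeFssrParam). Purely as an illustration of what those quantities are, below is a NumPy/SciPy sketch that estimates a per-point normal (the smallest-eigenvalue direction of the local neighborhood) and a scale (mean distance to the k nearest neighbors); it is not the repository's implementation:

    # Illustrative per-point normal and scale estimation for an N x 3 point cloud.
    # Not the repository's Matlab code; a generic local-PCA sketch.
    import numpy as np
    from scipy.spatial import cKDTree

    def estimate_normals_and_scales(points, k=16):
        tree = cKDTree(points)
        dists, idx = tree.query(points, k=k + 1)  # first neighbor is the point itself
        scales = dists[:, 1:].mean(axis=1)        # scale ~ mean distance to k neighbors
        normals = np.empty_like(points)
        for i in range(points.shape[0]):
            nbrs = points[idx[i, 1:]] - points[i]
            # normal = singular vector of the smallest singular value of the neighborhood
            _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
            normals[i] = vt[-1]
        return normals, scales

    # Example (hypothetical file name): pts = np.loadtxt("prediction.xyz")
    # normals, scales = estimate_normals_and_scales(pts)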

Evaluation

  • Pix3D
    • ICP-based fitting, since the Pix3D ground truth is object-centered (you can skip this step, since we've included pre-computed ground truth and predictions). The code is derived from 3D-LMNET, which also provides the ground-truth point clouds.
      cd pcn
      python metrics_pix3d.py
      cd ..
      
    • You need TensorFlow to run the evaluation (see the Chamfer distance sketch at the end of this section):
      cd pix3d/eval/
      python eval_pix3d.py
      cd ../../
      
  • ShapeNet
    • You need TensorFlow to run the evaluation:
      cd pcn
      python eval_shapenet.py
      cd ..
      
    • To compare with the object-centered point cloud prediction approach (3D-LMNET), you need to perform ICP-based fitting first:
      cd pcn
      python metrics_shapenet.py
      cd ../pix3d/eval/
      python eval_shapenet_object_centered.py
      cd ../../
      
  • DYCE (silhouette completion)
    • This evaluation is PyTorch-based:
      python test_sc_DYCE.py
      
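Point cloud reconstruction is commonly scored with the Chamfer distance; a minimal PyTorch sketch of the symmetric version is given below. The TensorFlow toolkits above use their own (and possibly differently normalized) implementations, so use this only as a reference for what is being measured:

    # Symmetric Chamfer distance between point clouds of shape (B, N, 3) and (B, M, 3).
    # Brute-force O(N*M) reference; evaluation toolkits typically use optimized kernels.
    import torch

    def chamfer_distance(p1, p2):
        diff = p1.unsqueeze(2) - p2.unsqueeze(1)   # (B, N, M, 3) pairwise differences
        dist = (diff ** 2).sum(-1)                 # (B, N, M) squared distances
        d1 = dist.min(dim=2)[0].mean(dim=1)        # each point in p1 -> nearest in p2
        d2 = dist.min(dim=1)[0].mean(dim=1)        # each point in p2 -> nearest in p1
        return (d1 + d2).mean()

    # Example with random clouds:
    # cd = chamfer_distance(torch.rand(1, 1024, 3), torch.rand(1, 1024, 3))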

Citation

Please cite our paper if you use this code or data.

@inproceedings{zou2020silhouette,
  title={Silhouette Guided Point Cloud Reconstruction beyond Occlusion},
  author={Zou, Chuhang and Hoiem, Derek},
  booktitle={The IEEE Winter Conference on Applications of Computer Vision},
  pages={41--50},
  year={2020}
}