Deep Sliding Shapes for Amodal 3D Object Detection in RGB-D Images

S. Song and J. Xiao (CVPR 2016)

Compile code

Download CUDA 7.5 and cuDNN 3. You will need to register with NVIDIA.

cd code/marvin
./linux.sh
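
If the MATLAB-driven steps below fail to launch Marvin, it can help to confirm from MATLAB that the CUDA toolchain and the compiled binary are visible. This is an optional sketch; the binary name and location are assumptions based on the layout above.

    % Optional sanity check (assumed paths; adjust to your checkout).
    [status, out] = system('nvcc --version');        % is the CUDA toolkit on MATLAB's shell PATH?
    assert(status == 0, 'nvcc not found on PATH: %s', out);
    assert(exist(fullfile('code','marvin','marvin'), 'file') == 2, ...
           'Marvin binary not found -- did ./linux.sh succeed?');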

Prepare data

3D region proposal network:

  • You can download the precomputed region proposals for the NYU and SUN RGB-D datasets by running:

    downloadData('../proposal','http://dss.cs.princeton.edu/Release/result/proposal/RPN_NYU/','.mat');
    downloadData('../proposal','http://dss.cs.princeton.edu/Release/result/proposal/RPN_SUNRGBD/','.mat');
    
  • To train the 3D region proposal network and extract 3D region proposals, cd to code/matlab_code/slidingAnchor, run dss_prepareAnchorbox() to prepare the training data, then run RPN_extract() to extract the proposals (see the sketch after this list). You may need the segmentation results, which you can download with:

    downloadData('../seg','http://dss.cs.princeton.edu/Release/seg/','.mat');
    
  • The pretrained model and network definition can be found here
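
As referenced above, here is a minimal MATLAB sketch of the proposal preparation and extraction flow. The call order follows the bullets; the exact arguments of dss_prepareAnchorbox and RPN_extract, and the use of dss_initPath at this stage, are assumptions, so check the function headers in code/matlab_code/slidingAnchor before running.

    % Sketch of the RPN data preparation / proposal extraction flow.
    % Argument lists are assumed; see the function headers for details.
    cd code/matlab_code/slidingAnchor
    dss_initPath;                                                              % set dataset/output paths first (assumed)
    downloadData('../seg','http://dss.cs.princeton.edu/Release/seg/','.mat');  % segmentation results, if needed
    dss_prepareAnchorbox();                                                    % prepare RPN training data
    RPN_extract();                                                             % extract 3D region proposals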

3D object detection network:

  1. Change the paths in dss_initPath.m.
  2. Run dss_marvin_script(0,100,1,[],1,'RPN_NYU',1,[],0,0); see the sketch after this list.
  3. The pretrained model and network definition can be found here
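
A short sketch of the detection run described above; step 2 is the documented call, and the commented alternative assumes the string argument selects which proposal set (from the downloads above) is used.

    % Run detection with the documented call (step 2), after editing
    % the paths in dss_initPath.m (step 1).
    dss_marvin_script(0,100,1,[],1,'RPN_NYU',1,[],0,0);        % NYU proposals
    % Assumption: swap the proposal set by name to use the SUN RGB-D
    % proposals downloaded earlier:
    % dss_marvin_script(0,100,1,[],1,'RPN_SUNRGBD',1,[],0,0);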

Notes:

  • If a MATLAB system call fails, try running the command directly in a shell.
  • The rotation matrices for some images in the dataset differ from the original SUN RGB-D dataset: here the rotation contains only the camera tilt angle (i.e. the point cloud is not rotated in the x-y plane). We provide this data in ./external/SUNRGBDtoolbox/Metadata/SUNRGBDMeta.mat, and all results and ground-truth boxes in this repo use this rotation matrix. To convert a rotation matrix, see changeRoomR.m; an illustrative sketch of the idea follows this list.
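
The sketch below illustrates the idea from the last note; it is not the repo's changeRoomR.m (whose interface is not shown here), and it assumes the full rotation factors as Rfull = Rz(yaw) * Rtilt.

    % Illustration only: strip the yaw (rotation about the gravity/z axis)
    % from a full rotation matrix so only the camera tilt remains.
    yaw0  = deg2rad(30);  tilt0 = deg2rad(10);                 % dummy angles for the demo
    Rz0   = [cos(yaw0) -sin(yaw0) 0; sin(yaw0) cos(yaw0) 0; 0 0 1];
    Rx0   = [1 0 0; 0 cos(tilt0) -sin(tilt0); 0 sin(tilt0) cos(tilt0)];
    Rfull = Rz0 * Rx0;                                         % stand-in for a SUN RGB-D rotation
    yaw   = atan2(Rfull(2,1), Rfull(1,1));                     % recover the yaw component
    Rz    = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];
    Rtilt = Rz' * Rfull;                                       % tilt-only rotation (here it equals Rx0)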