Single Image 3D Interpreter Network

This repository contains pre-trained models and evaluation code for the project 'Single Image 3D Interpreter Network' (ECCV 2016).

http://3dinterpreter.csail.mit.edu

Prerequisites

Torch

We use Torch 7 (http://torch.ch) for our implementation.

fb.mattorch and Matlab (optional)

We save results as .mat files via fb.mattorch, and use Matlab (R2015a or later, with the Computer Vision System Toolbox) for visualization.
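
A minimal sketch of this workflow (assuming the standard mattorch-style save interface; the variable names and tensor values below are placeholders, not the network's actual outputs):

require 'torch'
local mattorch = require('fb.mattorch')

-- pack named tensors into a table and write them to a .mat file;
-- Matlab can then read the file back with load('example.mat')
local results = {
  vertices = torch.rand(8, 3),   -- placeholder tensor
  params   = torch.rand(1, 6),   -- placeholder tensor
}
mattorch.save('example.mat', results)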

Installation

Our current release has been tested on Ubuntu 14.04.

Clone the repository

git clone git@github.com:jiajunwu/3dinn.git

Download pre-trained models (1.8GB)

cd 3dinn
./download_models.sh

Steps for evaluation

I) List input images in data/[classname].txt
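
For example, a hypothetical data/chair.txt, assuming one input image path per line (the paths below are placeholders):

images/chair_0001.jpg
images/chair_0002.jpg
images/chair_0003.jpg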

II) Estimate 3D object structure

The evaluation script (src/main.lua) accepts the following options.

  • -gpuID: specifies the GPU to run on (1-indexed)
  • -class: which model to use for evaluation. Our current release contains four models: chair, swivelchair, bed, and sofa.
  • -batchSize: the batch size to use

Sample usages include

  • Estimate chair structure for images listed in data/chair.txt
cd src
th main.lua -gpuID 1 -class chair 
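
  • A hypothetical run that also sets the batch size explicitly (the values below are illustrative, not recommended settings)
th main.lua -gpuID 2 -class sofa -batchSize 8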

III) Check the visualizations in www/ and the estimated parameters (.mat files) in results/
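
The filenames written under results/ depend on the class and the input list. As a hedged sketch (assuming the standard mattorch-style load interface and a hypothetical output file), the saved parameters can be inspected from Torch, or opened directly in Matlab with load:

require 'torch'
local mattorch = require('fb.mattorch')

-- hypothetical filename; substitute an actual file produced under results/
local res = mattorch.load('../results/chair.mat')
for name, tensor in pairs(res) do
  print(name, tensor:size())   -- list each saved variable and its dimensions
end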

Sample input & output

Datasets we used

Reference

@inproceedings{3dinterpreter,
  title={{Single Image 3D Interpreter Network}},
  author={Wu, Jiajun and Xue, Tianfan and Lim, Joseph J and Tian, Yuandong and Tenenbaum, Joshua B and Torralba, Antonio and Freeman, William T},
  booktitle={European Conference on Computer Vision},
  pages={365--382},
  year={2016}
}

For any questions, please contact Jiajun Wu ([email protected]) and Tianfan Xue ([email protected]).
