
hassony2 / obman_render

Licence: other
[cvpr19] Code to generate images from the ObMan dataset, synthetic renderings of hands holding objects (or hands in isolation)

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to obman render

graspit interface
A GraspIt! plugin exposing a ROS interface via graspit-ros
Stars: ✭ 29 (-52.46%)
Mutual labels:  grasping, graspit
UniRate
Unity plugin to easily manage the application frame rate and rendering interval. Preventing battery power consumption and device heat, especially on mobile platforms.
Stars: ✭ 26 (-57.38%)
Mutual labels:  rendering
uoais
Codes of paper "Unseen Object Amodal Instance Segmentation via Hierarchical Occlusion Modeling", ICRA 2022
Stars: ✭ 77 (+26.23%)
Mutual labels:  synthetic-data
ssr
React Server-Side Rendering Example
Stars: ✭ 265 (+334.43%)
Mutual labels:  rendering
Nabla
OpenGL/OpenGL ES/Vulkan/CUDA/OptiX Modular Rendering Framework for PC/Linux/Android
Stars: ✭ 235 (+285.25%)
Mutual labels:  rendering
genstar
Generation of Synthetic Populations Library
Stars: ✭ 17 (-72.13%)
Mutual labels:  synthetic-data
pathtracer-webgl2
Path tracing render engine for the web. Both ray tracing for conventional 3d models and ray marching for fractals implemented. Built with WebGL 2 and Angular 2.
Stars: ✭ 45 (-26.23%)
Mutual labels:  rendering
volrend
PlenOctree Volume Rendering (supports CUDA & fragment shader backends)
Stars: ✭ 419 (+586.89%)
Mutual labels:  rendering
gamedevguide
Game Development & Unreal Engine Programming Guide
Stars: ✭ 314 (+414.75%)
Mutual labels:  rendering
SHSoftwareRasterizer
A simple software rasterizer implementation
Stars: ✭ 31 (-49.18%)
Mutual labels:  rendering
SLProject
SLProject is a platform independent 3D computer graphics scene graph library. Read more on:
Stars: ✭ 47 (-22.95%)
Mutual labels:  rendering
prax
Experimental rendering library geared towards hybrid SSR+SPA apps. Focus on radical simplicity and performance. Tiny and dependency-free.
Stars: ✭ 18 (-70.49%)
Mutual labels:  rendering
Fake-Interior-Shader-for-GodotEngine
Interior Mapping shader for the Godot Game Engine 3.x that works with both GLES3 and GLES2.
Stars: ✭ 40 (-34.43%)
Mutual labels:  rendering
bisml
Implementation of the paper: Adaptive BRDF-Oriented Multiple Importance Sampling of Many Lights
Stars: ✭ 26 (-57.38%)
Mutual labels:  rendering
rendiation
Rendiation Rendering Framework
Stars: ✭ 31 (-49.18%)
Mutual labels:  rendering
game-feature-learning
Code for paper "Cross-Domain Self-supervised Multi-task Feature Learning using Synthetic Imagery", Ren et al., CVPR'18
Stars: ✭ 68 (+11.48%)
Mutual labels:  synthetic-data
augraphy
Augmentation pipeline for rendering synthetic paper printing, faxing, scanning and copy machine processes
Stars: ✭ 49 (-19.67%)
Mutual labels:  synthetic-data
SDMetrics
Metrics to evaluate quality and efficacy of synthetic datasets.
Stars: ✭ 67 (+9.84%)
Mutual labels:  synthetic-data
multi-contact-grasping
This project implements a simulated grasp-and-lift process in V-REP using the Barrett Hand, with an interface through a python remote API.
Stars: ✭ 52 (-14.75%)
Mutual labels:  grasping
SoftRenderer
A SoftRenderer for learning purpose.
Stars: ✭ 46 (-24.59%)
Mutual labels:  rendering

Learning Joint Reconstruction of Hands and Manipulated Objects - Demo, Training Code and Models

Yana Hasson, Gül Varol, Dimitris Tzionas, Igor Kalevatykh, Michael J. Black, Ivan Laptev, Cordelia Schmid, CVPR 2019

This code allows you to generate synthetic images of hands holding objects, as in the ObMan dataset.

In addition, hands-only images can also be generated, with hand poses sampled randomly from the MANO hand pose space.

Examples of rendered images (hands+objects and hands-only) are shown in the repository.

Rendering generates:

  • RGB images
  • 3D ground truth for the hand and objects
  • depth maps
  • segmentation maps

For additional information about the project, see the project page and paper linked from the repository.

Installation

Set up Blender

  • Download Blender 2.78c (wget https://download.blender.org/release/Blender2.78/blender-2.78c-linux-glibc219-x86_64.tar.bz2 for instance)
  • Untar it: tar -xvf blender-2.78c-linux-glibc219-x86_64.tar.bz2
  • Download get-pip.py: wget https://bootstrap.pypa.io/get-pip.py
  • Try blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m get-pip.py
    • If this fails, try:
      • Install pip path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/lib/python3.5/ensurepip
      • Try to update pip path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip3 install --upgrade pip
  • Install dependencies
    • path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip install -r requirements.txt
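
Taken together, these steps might look like the following shell session (a sketch, assuming a Linux x86_64 machine; requirements.txt comes from the repository cloned in the next step):

# Download and extract Blender 2.78c
wget https://download.blender.org/release/Blender2.78/blender-2.78c-linux-glibc219-x86_64.tar.bz2
tar -xvf blender-2.78c-linux-glibc219-x86_64.tar.bz2
# Install pip into Blender's bundled Python 3.5
wget https://bootstrap.pypa.io/get-pip.py
blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m get-pip.py
# Install the project dependencies into the same interpreter
blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/pip install -r requirements.txt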

Clone repository

git clone https://github.com/hassony2/obman_render
cd obman_render

Download data dependencies

Download hand and object pickle data-structures

Download SURREAL assets

  • Go to SURREAL dataset request page
  • Create an account, and receive an email with a username and password for data download
  • Download SURREAL data dependencies using the following commands
cd download
sh download_smpl_data.sh ../assets username password
cd ..

Download MANO model

  • Go to MANO website
  • Create an account by clicking Sign Up and provide your information
  • Download Models and Code (the downloaded file should have the format mano_v*_*.zip). Note that all code and data from this download falls under the MANO license.
  • Unzip the file mano_v*_*.zip: unzip mano_v*_*.zip
  • Set the environment variable: export MANO_LOCATION=/path/to/mano_v*_*
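
For example, with MANO version 1.2 the last two steps could look like this (the download location and archive name are placeholders):

# Unzip the downloaded archive and point MANO_LOCATION at it
unzip mano_v1_2.zip
export MANO_LOCATION=/path/to/mano_v1_2
# The models and the Python code edited below should now be visible
ls $MANO_LOCATION/models $MANO_LOCATION/webuser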

Modify MANO code to be Python 3 compatible

  • Remove print 'FINITO' at the end of file webuser/smpl_handpca_wrapper.py (line 144)
-    print 'FINITO'
  • Replace import cPickle as pickle by import pickle
    • at top of webuser/smpl_handpca_wrapper.py (line 23)
    • at top of webuser/serialization.py (line 30)
-    import cPickle as pickle
+    import pickle
  • Fix pickle encoding
    • in webuser/smpl_handpca_wrapper.py (line 74)
-    smpl_data = pickle.load(open(fname_or_dict))
+    smpl_data = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
  • in webuser/serialization.py (line 90)
-    dd = pickle.load(open(fname_or_dict))
+    dd = pickle.load(open(fname_or_dict, 'rb'), encoding='latin1')
  • Fix model paths in webuser/smpl_handpca_wrapper.py (line 81-84)
-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v*_*/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v*_*/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')

At the time of writing these instructions the MANO version is 1.2, so use:

-    with open('/is/ps2/dtzionas/mano/models/MANO_LEFT.pkl', 'rb') as f:
-        hand_l = load(f)
-    with open('/is/ps2/dtzionas/mano/models/MANO_RIGHT.pkl', 'rb') as f:
-        hand_r = load(f)
+    with open('/path/to/mano_v1_2/models/MANO_LEFT.pkl', 'rb') as f:
+        hand_l = load(f, encoding='latin1')
+    with open('/path/to/mano_v1_2/models/MANO_RIGHT.pkl', 'rb') as f:
+        hand_r = load(f, encoding='latin1')
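
A quick way to confirm that the Python 3 fixes took effect is to byte-compile the two edited files with Blender's bundled interpreter; the leftover print 'FINITO' statement would make this fail (paths are placeholders):

# Syntax check of the patched MANO files under Python 3
path/to/blender-2.78c-linux-glibc219-x86_64/2.78/python/bin/python3.5m -m py_compile \
  $MANO_LOCATION/webuser/smpl_handpca_wrapper.py \
  $MANO_LOCATION/webuser/serialization.py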

Download SMPL model

  • Go to SMPL website
  • Create an account by clicking Sign Up and provide your information
  • Download and unzip SMPL for Python users, then copy the models folder to assets/models (see the sketch below). Note that all code and data from this download falls under the SMPL license.
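
A sketch of this step (the archive name and its internal layout are assumptions, check the actual download):

# Unzip the "SMPL for Python users" archive and copy its models folder into assets/
unzip SMPL_python_*.zip
cp -r path/to/unzipped/smpl/models assets/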

OPTIONAL: Download LSUN dataset (to generate images on LSUN backgrounds)

Download LSUN dataset following the instructions.

OPTIONAL: Download ImageNet dataset (to generate images on ImageNet backgrounds)

  • Download original images from here

Download body+hand textures and grasp information

  • Request data on the ObMan webpage

  • Download grasp and texture zips

You should receive two links that will allow you to download bodywithands.zip and shapenet_grasps.zip.

  • Unzip texture zip
cd assets/textures
mv path/to/downloaded/bodywithands.zip .
unzip bodywithands.zip
cd ../..
  • Unzip the grasp information
cd assets/grasps
mv path/to/downloaded/shapenet_grasps.zip .
unzip shapenet_grasps.zip
cd ../../
  • Your structure should look like this:
obman_render/
  assets/
    models/
      SMPLH_female.pkl
      basicModel_f_lbs_10_207_0_v1.0.2.fbx
      basicModel_m_lbs_10_207_0_v1.0.2.fbx
      ...
    grasps/
      shapenet_grasps/
      shapenet_grasps_splits.csv
    SURREAL/
      smpl_data/
        smpl_data.npz
    ...
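
A quick sanity check that the main data dependencies ended up where the structure above expects them:

# Each of these paths should exist before launching the rendering scripts
ls assets/models/SMPLH_female.pkl \
   assets/grasps/shapenet_grasps_splits.csv \
   assets/SURREAL/smpl_data/smpl_data.npz \
   assets/textures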

Launch !

Minimal version on white background

Hands only

path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'

Grasping objects

path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["white"]}'

Full version with image backgrounds

Hands only

path/to/blender -noaudio -t 1 -P blender_hands_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'

Grasping objects

path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- '{"frame_nb": 10, "frame_start": 0, "results_root": "datageneration/tmp", "background_datasets": ["lsun", "imagenet"], "imagenet_path": "/path/to/imagenet", "lsun_path": "/path/to/lsun"}'
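
Because frame_nb and frame_start are plain command-line parameters, longer runs can be split into chunks, for instance (a sketch; the chunk size and the choice of white backgrounds are arbitrary):

# Render 1000 grasp images in chunks of 100 frames
for start in $(seq 0 100 900); do
  path/to/blender -noaudio -t 1 -P blender_grasps_sacred.py -- \
    "{\"frame_nb\": 100, \"frame_start\": $start, \"results_root\": \"datageneration/tmp\", \"background_datasets\": [\"white\"]}"
done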

Citations

If you find this code useful for your research, consider citing:

  • the publication this code has been developed for
@INPROCEEDINGS{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}
  • the publication it builds upon, for synthetic data generation of humans
@INPROCEEDINGS{varol17_surreal,  
  title     = {Learning from Synthetic Humans},  
  author    = {Varol, G{\"u}l and Romero, Javier and Martin, Xavier and Mahmood, Naureen and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},  
  booktitle = {CVPR},  
  year      = {2017}  
}
  • the publication describing the hand model used, MANO:
@article{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  publisher = {ACM},
  month = nov,
  year = {2017},
  url = {http://doi.acm.org/10.1145/3130800.3130883},
  month_numeric = {11}
}