ankurhanda / Sunrgbd Meta Data

Train and test labels for SUN RGB-D

What does this repository contain?

The SUNRGBD2Dseg.mat file in the SUNRGBDtoolbox/Metadata directory needs about 64 GB of RAM to load in either MATLAB or Octave. Therefore, to avoid any dependence on the .mat file, the semantic segmentation labels have been extracted and stored directly in this repository. We also provide links to the RGB data. If you want to do semantic segmentation on the RGB images, this repository is self-contained for that purpose: you should not need to download the dataset from the original links provided in the SUN RGB-D paper. However, if you need the depth data as well, you will have to download the tgz file from the dataset link. We also provide code to turn depth into the DHA features used in the SceneNet paper, using the rotation matrices provided in the SUN RGB-D dataset. To summarise, this repository contains the following:

  • Train and test image path names in SUN RGB-D, provided in sunrgbd_training_images.txt and sunrgbd_testing_images.txt respectively.
  • 37-class labels for both the training and test images, compressed in the sunrgbd_test_train_labels.tar.gz file.
  • The first 5050 images in sunrgbd_test_train_labels.tar.gz are the labels for the test set, while the training set labels run from 5051 to 10335.
  • The training images (5285 jpgs) are available from SUNRGBD-train_images.tgz.
  • The test images (5050 jpgs) are available from SUNRGBD-test_images.tgz.
  • The mappings from the 37 class labels to 13 class labels are provided in the SceneNetv1.0 repository. The 37 class names are stored in seg37list.mat, which ships with the SUNRGBDtoolbox and contains the following list:
>> seg = load('seg37list.mat');
>> seg.seg37list
ans = 
{
  [1,1] = wall
  [1,2] = floor
  [1,3] = cabinet
  [1,4] = bed
  [1,5] = chair
  [1,6] = sofa
  [1,7] = table
  [1,8] = door
  [1,9] = window
  [1,10] = bookshelf
  [1,11] = picture
  [1,12] = counter
  [1,13] = blinds
  [1,14] = desk
  [1,15] = shelves
  [1,16] = curtain
  [1,17] = dresser
  [1,18] = pillow
  [1,19] = mirror
  [1,20] = floor_mat
  [1,21] = clothes
  [1,22] = ceiling
  [1,23] = books
  [1,24] = fridge
  [1,25] = tv
  [1,26] = paper
  [1,27] = towel
  [1,28] = shower_curtain
  [1,29] = box
  [1,30] = whiteboard
  [1,31] = person
  [1,32] = night_stand
  [1,33] = toilet
  [1,34] = sink
  [1,35] = lamp
  [1,36] = bathtub
  [1,37] = bag
}
  • 13-class training and testing labels are provided in train13labels.tgz and test13labels.tgz respectively, in this directory.
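For convenience, the 37 class names shown in the Octave output above can be carried over into Python. The list below is a direct transcription; treating label 0 as "unlabelled" is an assumption following the common SUN RGB-D convention.

```python
# The 37 SUN RGB-D class names, transcribed from seg37list.mat
# (index 0 here corresponds to class label 1, i.e. "wall").
SEG37_CLASSES = [
    "wall", "floor", "cabinet", "bed", "chair", "sofa", "table", "door",
    "window", "bookshelf", "picture", "counter", "blinds", "desk", "shelves",
    "curtain", "dresser", "pillow", "mirror", "floor_mat", "clothes",
    "ceiling", "books", "fridge", "tv", "paper", "towel", "shower_curtain",
    "box", "whiteboard", "person", "night_stand", "toilet", "sink", "lamp",
    "bathtub", "bag",
]

def class_name(label):
    """Map a 1-based class label (1..37) to its name; 0 is treated as unlabelled."""
    return "unlabelled" if label == 0 else SEG37_CLASSES[label - 1]
```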

This alleviates the burden of installing MATLAB (which requires a license) on your computer and parsing the .mat files in the SUN RGB-D dataset.

Training on RGB data for 13 classes

Training on RGB data for 37 classes

  • Get the RGBs as above.
  • Download sunrgbd_train_test_labels.tar.gz in this directory and untar it: tar -xvzf sunrgbd_train_test_labels.tar.gz.
  • Create two directories: mkdir -p labels/train labels/test.
  • Move the first 5050 files into the test directory: mv ../img-00[0-4]*.png test && mv ../img-0050[0-4]*.png test && mv ../img-005050.png test, and the remaining files into the train directory: mv ../img-*.png train.
  • If you need a .txt file pairing the names of corresponding RGBs and labels, run paste -d' ' sunrgbd_rgb_files.txt sunrgbd_labels37_files.txt, where sunrgbd_rgb_files.txt contains the names of the RGB files and sunrgbd_labels37_files.txt the names of the label files. You should see the following:
img-000001.jpg img-005051.png
img-000002.jpg img-005052.png
img-000003.jpg img-005053.png
img-000004.jpg img-005054.png
img-000005.jpg img-005055.png
img-000006.jpg img-005056.png
img-000007.jpg img-005057.png
img-000008.jpg img-005058.png
img-000009.jpg img-005059.png
img-000010.jpg img-005060.png
img-000011.jpg img-005061.png
....
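The steps above can also be sketched in Python rather than shell. The file-name pattern (img-000001.png … img-010335.png, the first 5050 being test labels) follows the convention described above; the directory layout and offsets are otherwise assumptions.

```python
import os
import shutil

def split_labels(label_dir, n_test=5050, n_total=10335):
    """Move the first n_test label images into test/ and the rest into train/,
    following the index convention described above."""
    os.makedirs(os.path.join(label_dir, "test"), exist_ok=True)
    os.makedirs(os.path.join(label_dir, "train"), exist_ok=True)
    for i in range(1, n_total + 1):
        name = "img-%06d.png" % i
        dest = "test" if i <= n_test else "train"
        src = os.path.join(label_dir, name)
        if os.path.exists(src):
            shutil.move(src, os.path.join(label_dir, dest, name))

def pair_rgb_with_labels(n_train=5285, offset=5050):
    """Pair training RGB names with their 37-class label names,
    mimicking the `paste` output shown above."""
    return ["img-%06d.jpg img-%06d.png" % (i, i + offset)
            for i in range(1, n_train + 1)]
```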

Training and test data for depth

We now also provide links to the depth data.

To obtain the depth in meters, divide the png values by 10,000.
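As a minimal sketch, assuming the 16-bit depth PNG has already been loaded into a NumPy array (e.g. with imageio or OpenCV), the conversion to meters is just a scale:

```python
import numpy as np

def depth_png_to_meters(depth_raw):
    """Convert raw 16-bit SUN RGB-D depth values to meters by dividing
    by 10,000. A value of 0 stays 0 and typically means 'missing'."""
    return depth_raw.astype(np.float32) / 10000.0
```

For example, a raw value of 15000 corresponds to 1.5 m.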

How do I compute the DHA features?

  • Download the SUN RGB-D dataset and make sure your paths are set properly. The dataset can be obtained from the SUN RGB-D link; the SUN RGB-D toolbox needed to parse the files is available at toolbox.
  • Depth images are contained in the depth_bfx folders of the SUN RGB-D dataset. After you extract the .zip file, you should see them in folders such as SUNRGBD/kv1/NYUdata/NYU0034/depth_bfx. These depth images have been obtained by running an inpainting algorithm on the raw depth images (which contain holes and missing values), so that every pixel has a depth value.
  • Make sure you have Octave installed on your machine. If not, install it via sudo apt-get install octave on Ubuntu or pacman -Sy octave on Arch Linux.
  • Run the computeDHA_SUNRGBD.m file in Octave (type octave --no-gui in your terminal); it will save the DHA features as .bin files (if you wish to save them in another format, you can easily modify the code).
  • Camera rotation matrices are provided in sunrgbd_train_T_fileNames.txt; they allow the floor normal to be aligned with the gravity vector, from which the height above the ground plane can be computed.
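To illustrate how the rotation matrices are used for the height component, here is a minimal sketch (not the repository's computeDHA_SUNRGBD.m): rotate the back-projected camera-frame points into a gravity-aligned frame and read the height off the up-axis. The function name and the choice of y as the up-axis are assumptions for illustration.

```python
import numpy as np

def height_above_ground(points_cam, R):
    """Rotate camera-frame 3D points (N x 3) by R into a gravity-aligned
    frame, then take height as the distance above the lowest rotated
    point (an assumed proxy for the ground plane)."""
    pts_world = points_cam @ R.T      # rotate into the gravity-aligned frame
    heights = pts_world[:, 1]         # assumed up-axis: y
    return heights - heights.min()    # height relative to the lowest point
```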

How do I benchmark?

getAccuracyNYU.m, available in the SceneNetv1.0 repository, allows you to obtain the average global and class accuracies.
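As a rough Python sketch of the two metrics (not the original getAccuracyNYU.m code): global accuracy is the fraction of correctly labelled pixels overall, while class accuracy averages the per-class recall, so rare classes count as much as common ones.

```python
import numpy as np

def global_and_class_accuracy(conf):
    """conf: square confusion matrix where conf[i, j] counts pixels of
    true class i predicted as class j.
    Returns (global accuracy, mean per-class accuracy)."""
    conf = np.asarray(conf, dtype=np.float64)
    global_acc = np.trace(conf) / conf.sum()
    per_class = np.diag(conf) / np.maximum(conf.sum(axis=1), 1e-12)
    return global_acc, per_class.mean()
```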

What are the classes and where is the mapping from the class number to the class name?

The mapping is also available at SceneNetv1.0 repository.
