
ai4ce / DeepSoRo

Licence: other
[RA-L/ICRA2020] Real-time Soft Body 3D Proprioception via Deep Vision-based Sensing

Programming Languages

python

Projects that are alternatives of or similar to DeepSoRo

eidos-audition
Collection of auditory models.
Stars: ✭ 25 (+56.25%)
Mutual labels:  perception
VoxelGamesLib
Multi-platform, fully-featured, data-driven, abstract and expandable minecraft minigames framework
Stars: ✭ 15 (-6.25%)
Mutual labels:  data-driven
VoxelGamesLibv2
Powerful, feature-packed, abstract and expandable Minecraft minigames framework.
Stars: ✭ 67 (+318.75%)
Mutual labels:  data-driven
python4selftrackers
Presentations on Quantified Self and Self-Tracking with Python
Stars: ✭ 26 (+62.5%)
Mutual labels:  data-driven
pymor
pyMOR - Model Order Reduction with Python
Stars: ✭ 198 (+1137.5%)
Mutual labels:  data-driven
commix
Micro-framework for data-driven composable system architectures
Stars: ✭ 46 (+187.5%)
Mutual labels:  data-driven
Pyllusion
A Parametric Framework to Generate Visual Illusions using Python
Stars: ✭ 35 (+118.75%)
Mutual labels:  perception
MotionNet
CVPR 2020, "MotionNet: Joint Perception and Motion Prediction for Autonomous Driving Based on Bird's Eye View Maps"
Stars: ✭ 141 (+781.25%)
Mutual labels:  perception
Processor
Ontology-driven Linked Data processor and server for SPARQL backends. Apache License.
Stars: ✭ 54 (+237.5%)
Mutual labels:  data-driven
the-Cooper-Mapper
An open source autonomous driving research platform for Active SLAM & Multisensor Data Fusion
Stars: ✭ 38 (+137.5%)
Mutual labels:  perception
odak
🔬 Scientific computing library for optics 🔭, computer graphics 💻 and visual perception 👀
Stars: ✭ 99 (+518.75%)
Mutual labels:  perception
wetlandmapR
Scripts, tools and example data for mapping wetland ecosystems using data driven R statistical methods like Random Forests and open source GIS
Stars: ✭ 16 (+0%)
Mutual labels:  data-driven
data-driven-range-slider
D3.js based data-driven range slider, date time support
Stars: ✭ 21 (+31.25%)
Mutual labels:  data-driven
EZyRB
Easy Reduced Basis method
Stars: ✭ 49 (+206.25%)
Mutual labels:  data-driven
scikit tt
Tensor Train Toolbox
Stars: ✭ 52 (+225%)
Mutual labels:  data-driven
ariyana
Ariyana is an ECS work in progress game engine written in Orthodox C++ and Beef with a focus on cross-platform and multiplayer games
Stars: ✭ 73 (+356.25%)
Mutual labels:  data-driven
Perception-of-Autonomous-mobile-robot
Perception of Autonomous mobile robot,Using ROS,rs-lidar-16,By SLAM,Object Detection with Yolov5 Based DNN
Stars: ✭ 40 (+150%)
Mutual labels:  perception
point-cloud-clusters
A catkin workspace in ROS which uses DBSCAN to identify which points in a point cloud belong to the same object.
Stars: ✭ 43 (+168.75%)
Mutual labels:  perception
M
Data oriented programming language for game developers
Stars: ✭ 22 (+37.5%)
Mutual labels:  data-driven
continuous-fusion
(ROS) Sensor fusion algorithm for camera+lidar.
Stars: ✭ 26 (+62.5%)
Mutual labels:  perception

DeepSoRo

Ruoyu Wang, Shiheng Wang, Songyu Du, Erdong Xiao, Wenzhen Yuan, Chen Feng

This repository contains the PyTorch implementation associated with the paper "Real-time Soft Body 3D Proprioception via Deep Vision-based Sensing", RA-L/ICRA 2020.

Abstract

Soft bodies made from flexible and deformable materials are popular in many robotics applications, but their proprioceptive sensing has been a long-standing challenge. In other words, there has hardly been a method to measure and model the high-dimensional 3D shapes of soft bodies with internal sensors. We propose a framework to measure the high-resolution 3D shapes of soft bodies in real-time with embedded cameras. The cameras capture visual patterns inside a soft body, and a convolutional neural network (CNN) produces a latent code representing the deformation state, which can then be used to reconstruct the body’s 3D shape using another neural network. We test the framework on various soft bodies, such as a Baymax-shaped toy, a latex balloon, and some soft robot fingers, and achieve real-time computation (≤2.5 ms/frame) for robust shape estimation with high precision (≤1% relative error) and high resolution. We believe the method could be applied to soft robotics and human-robot interaction for proprioceptive shape sensing.
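The pipeline in the abstract can be sketched as a two-stage mapping: an encoder that compresses a camera image into a low-dimensional latent code, and a decoder that maps that code to a 3D point cloud. The toy sketch below only illustrates this data flow and the tensor shapes involved; the random linear maps, latent size, and point count are placeholders, not the trained CNNs used by DeepSoRo.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 32    # size of the deformation-state code (illustrative)
NUM_POINTS = 1024  # points in the reconstructed shape (illustrative)

def encode(image, w_enc):
    """Flatten the camera image and project it to a latent code."""
    return image.reshape(-1) @ w_enc            # (H*W,) @ (H*W, LATENT_DIM)

def decode(latent, w_dec):
    """Map the latent code to an (N, 3) point cloud."""
    return (latent @ w_dec).reshape(NUM_POINTS, 3)

image = rng.random((64, 64))                    # stand-in internal camera frame
w_enc = rng.standard_normal((64 * 64, LATENT_DIM))
w_dec = rng.standard_normal((LATENT_DIM, NUM_POINTS * 3))

points = decode(encode(image, w_enc), w_dec)
print(points.shape)  # (1024, 3)
```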

Results

Top row: predicted 3D shape. Bottom row: ground-truth 3D shape.

A video demo is provided through this link.

DeepSoRo Video

Code

The code of this project is released on our GitHub repository.

Requirements

We recommend using conda to install the required packages.

python>=3.6

pytorch>=1.4

pytorch3d>=0.2.0

open3d>=0.10.0

Please see https://github.com/facebookresearch/pytorch3d/blob/master/INSTALL.md for pytorch3d installation instructions.
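A quick way to sanity-check an environment against the minimum versions listed above is a small comparison helper. This helper is not part of DeepSoRo; it is a hypothetical stdlib-only sketch that parses dotted version strings and compares them numerically.

```python
# Hypothetical helper (not part of DeepSoRo) to compare installed package
# versions against the minimums listed in the Requirements section.
def version_tuple(v):
    """Parse 'major.minor[.patch]' into a 3-tuple, padding with zeros."""
    parts = [int(p) for p in v.split(".")[:3]]
    parts += [0] * (3 - len(parts))
    return tuple(parts)

def meets_minimum(installed, minimum):
    """True if the installed version satisfies the required minimum."""
    return version_tuple(installed) >= version_tuple(minimum)

# Minimums from the Requirements section above.
REQUIREMENTS = {
    "python": "3.6",
    "pytorch": "1.4",
    "pytorch3d": "0.2.0",
    "open3d": "0.10.0",
}
```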

Instructions

  1. Download the dataset: https://drive.google.com/file/d/1mrsSqivo2GCJ_frP_ehMg5cE5K0Yj48y/view?usp=sharing
  2. Unzip BaymaxData.zip.

Train

python train_baymax.py -d ${PATH_TO_DATASET}/BaymaxData/train -o ${PATH_TO_OUTPUTS}

Test

python test_baymax.py -d ${PATH_TO_DATASET}/BaymaxData/test -m ${PATH_TO_OUTPUTS}/params/ep_${EPOCH_INDEX}.pth
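The test command above expects a checkpoint saved by training under `params/ep_${EPOCH_INDEX}.pth`. The sketch below, assuming that layout, shows how such a path can be built programmatically (the helper name is hypothetical, not part of the repository's code).

```python
# Hypothetical helper to build the checkpoint path expected by
# test_baymax.py, following the params/ep_<epoch>.pth layout above.
def checkpoint_path(output_dir: str, epoch: int) -> str:
    """Return the path of the checkpoint saved after the given epoch."""
    return f"{output_dir}/params/ep_{epoch}.pth"

print(checkpoint_path("/tmp/outputs", 25))  # /tmp/outputs/params/ep_25.pth
```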

Citation

If you find DeepSoRo useful in your research, please cite:

@article{wang2019real,
  title={Real-time Soft Robot 3D Proprioception via Deep Vision-based Sensing},
  author={Wang, Ruoyu and Wang, Shiheng and Du, Songyu and Xiao, Erdong and Yuan, Wenzhen and Feng, Chen},
  journal={arXiv preprint arXiv:1904.03820},
  year={2019}
}