AustinDeric / yak

Licence: MIT license
yak (yet another kinfu) is a library and ROS wrapper for Truncated Signed Distance Fields (TSDFs).

Programming Languages

C++
36643 projects - #6 most used programming language
Cuda
1817 projects
CMake
9771 projects

Projects that are alternatives of or similar to yak

hybrid planning experiments
Sampler + Optimizing Motion Planning Demonstrations
Stars: ✭ 23 (-8%)
Mutual labels:  ros, ros-industrial
godel
ROS-Industrial Focused Technical Project: Robotic Blending
Stars: ✭ 64 (+156%)
Mutual labels:  ros, ros-industrial
Spatio temporal voxel layer
A new voxel layer leveraging modern 3D graphics tools to modernize navigation environmental representations
Stars: ✭ 246 (+884%)
Mutual labels:  ros
smacha
SMACHA is a meta-scripting, templating, and code generation engine for rapid prototyping of ROS SMACH state machines.
Stars: ✭ 15 (-40%)
Mutual labels:  ros
naive-surface-nets
Implements a simple, readable naive surface nets algorithm
Stars: ✭ 31 (+24%)
Mutual labels:  meshing
wildmeshing-python
Python bindings for TriWild.
Stars: ✭ 37 (+48%)
Mutual labels:  meshing
abb libegm
A C++ library for interfacing with ABB robot controllers supporting Externally Guided Motion (689-1)
Stars: ✭ 71 (+184%)
Mutual labels:  ros-industrial
Opencr
Software for ROS Embedded board (a.k.a. OpenCR). OpenCR means Open-source Control Module for ROS.
Stars: ✭ 240 (+860%)
Mutual labels:  ros
the-Cooper-Mapper
An open source autonomous driving research platform for Active SLAM & Multisensor Data Fusion
Stars: ✭ 38 (+52%)
Mutual labels:  ros
puma
Poisson Surface Reconstruction for LiDAR Odometry and Mapping
Stars: ✭ 302 (+1108%)
Mutual labels:  surface-reconstruction
goroslib
ROS client library for the Go programming language
Stars: ✭ 226 (+804%)
Mutual labels:  ros-industrial
MicroStructPy
Microstructure modeling, mesh generation, analysis, and visualization.
Stars: ✭ 42 (+68%)
Mutual labels:  meshing
docker
ROS-Industrial docker and cloud tools
Stars: ✭ 23 (-8%)
Mutual labels:  ros-industrial
semantic-tsdf
Semantic-TSDF for Self-driving Static Scene Reconstruction
Stars: ✭ 14 (-44%)
Mutual labels:  tsdf
flexgui industrial
Moved to: https://github.com/PPM-Robotics-AS/flexgui4.0
Stars: ✭ 30 (+20%)
Mutual labels:  ros-industrial
gazebo cars
Gazebo Models for different types of cars
Stars: ✭ 31 (+24%)
Mutual labels:  ros
Rdbox
RDBOX is an advanced IT platform for robotics and IoT developers that highly integrates cloud-native and edge computing technologies.
Stars: ✭ 246 (+884%)
Mutual labels:  ros
shape as points
[NeurIPS'21] Shape As Points: A Differentiable Poisson Solver
Stars: ✭ 398 (+1492%)
Mutual labels:  surface-reconstruction
polatory
Fast, memory-efficient 3D spline interpolation and global kriging, via RBF (radial basis function) interpolation.
Stars: ✭ 82 (+228%)
Mutual labels:  surface-reconstruction
staubli
ROS-Industrial Staubli support (http://wiki.ros.org/staubli)
Stars: ✭ 15 (-40%)
Mutual labels:  ros-industrial

yak

yak (yet another kinfu) is a library and ROS wrapper for Truncated Signed Distance Fields (TSDFs).

A TSDF is a probabilistic representation of a solid surface in 3D space. It's a useful tool for combining many noisy, incomplete sensor readings into a single smooth and complete model.

To break down the name:

Distance field: Each voxel in the volume contains a value that represents its metric distance from the closest point on the surface. Voxels very far from the surface have high-magnitude distance values, while those near the surface have values approaching zero.

Signed: Voxels outside the surface have positive distances, while voxels inside the surface have negative distances. This allows the representation of solid objects. The distance field becomes a gradient that shifts from positive to negative as it crosses the surface.

Truncated: Only the distance values of voxels very close to the surface are regularly updated. Distances beyond a certain threshold have their values capped at +/- 1. This decreases the cost of integrating new readings, since not every voxel in the volume needs to be updated.
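
To make that concrete, here is a rough C++ sketch of how a single voxel's truncated signed distance and weight could be fused with a new depth reading. This is an illustration, not yak's actual implementation; the voxel layout, truncation behavior, and weighted-average scheme are assumptions.

#include <algorithm>

// Hypothetical per-voxel storage: a running TSDF value and an update weight.
struct Voxel {
  float tsdf = 1.0f;    // truncated signed distance, in [-1, 1]
  float weight = 0.0f;  // confidence accumulated over sensor readings
};

// Fuse one new observation into a voxel (illustrative only).
//   sdf        : signed distance from the voxel center to the observed surface,
//                positive outside the surface, negative behind/inside it
//   truncation : distance beyond which updates are clamped or skipped
void integrateVoxel(Voxel& v, float sdf, float truncation, float max_weight = 64.0f)
{
  // "Truncated": voxels far behind the surface are occluded and left untouched,
  // while voxels far in front of it are clamped to +1.
  if (sdf < -truncation)
    return;
  float d = std::min(1.0f, sdf / truncation);

  // Weighted running average keeps the field smooth as noisy readings accumulate.
  v.tsdf = (v.tsdf * v.weight + d) / (v.weight + 1.0f);
  v.weight = std::min(v.weight + 1.0f, max_weight);
}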

yak handles two very different use cases. It can reconstruct from an RGBD camera moved around by a human, without any knowledge of the camera's pose relative to the global frame. It can also reconstruct from a sensor mounted on a robot arm, using pose hints provided by TF and robot kinematics. The idea is that the second case doesn't need to deduce sensor motion by comparing the most recent reading to previous readings via ICP, so it should work better in situations with incomplete sensor readings.
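
In the robot-mounted case, the "pose hint" amounts to looking up the camera's pose in the workcell frame from TF instead of estimating it with ICP. A minimal sketch of that lookup is below; the node name and frame names ("base_link", "camera_depth_optical_frame") are placeholders, not yak's actual code.

#include <geometry_msgs/TransformStamped.h>
#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "pose_hint_example");
  ros::NodeHandle nh;

  tf2_ros::Buffer buffer;
  tf2_ros::TransformListener listener(buffer);

  ros::Rate rate(30.0);
  while (ros::ok())
  {
    try
    {
      // Pose of the depth camera in the workcell frame, from robot kinematics via TF.
      geometry_msgs::TransformStamped camera_pose =
          buffer.lookupTransform("base_link", "camera_depth_optical_frame", ros::Time(0));
      // A TSDF node could pass this transform to its integration step
      // instead of estimating camera motion with ICP.
      ROS_INFO_THROTTLE(1.0, "camera at x=%.3f y=%.3f z=%.3f",
                        camera_pose.transform.translation.x,
                        camera_pose.transform.translation.y,
                        camera_pose.transform.translation.z);
    }
    catch (const tf2::TransformException& ex)
    {
      ROS_WARN_THROTTLE(1.0, "TF lookup failed: %s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}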

The human-guided situation should work out of the box without any other packages. You might need to force the sensor to reset to get the volume positioned in a desirable orientation around your target. The easiest way to do this is to cover and uncover the camera lens.

The robot-assisted situation is currently partially hardcoded to use the sensors and work cell models from the Godel blending project. This will change soon to make it more generalized!

yak_meshing

Aluminum part reconstructed with yak and meshed with yak_meshing

yak_meshing is a ROS package to mesh TSDF volumes generated by Kinect Fusion-like packages.

Meshing happens through the /get_mesh service, which in turn calls the kinfu_ros /get_tsdf service. yak_meshing_node expects a serialized TSDF voxel volume: a list of TSDF values and weights for every occupied voxel, along with the coordinates of each occupied voxel. OpenVDB's voxel meshing algorithm generates a triangular mesh along the zero-value isosurface of the TSDF volume. The mesh is saved as a .obj file, which can be viewed and manipulated in a program like MeshLab or Blender.
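
As a rough sketch of that meshing step (not yak_meshing's actual code), OpenVDB's volumeToMesh can extract the zero isosurface from a float grid holding TSDF values. How the grid gets filled from the serialized /get_tsdf response is omitted here and would depend on the service definition.

#include <openvdb/openvdb.h>
#include <openvdb/tools/VolumeToMesh.h>

#include <vector>

// Extract a triangle/quad mesh along the zero-crossing of a TSDF stored in an
// OpenVDB float grid.
void meshTsdfGrid(const openvdb::FloatGrid::Ptr& tsdf_grid)
{
  openvdb::initialize();

  std::vector<openvdb::Vec3s> points;     // mesh vertices
  std::vector<openvdb::Vec3I> triangles;  // triangle indices
  std::vector<openvdb::Vec4I> quads;      // quad indices

  // isovalue = 0.0 selects the surface itself; adaptivity = 0.0 keeps full detail.
  openvdb::tools::volumeToMesh(*tsdf_grid, points, triangles, quads,
                               /*isovalue=*/0.0, /*adaptivity=*/0.0);

  // points/triangles/quads could now be written out as an .obj file.
}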

nbv_planner

Candidate poses generated and evaluated by nbv_planner

nbv_planner is a ROS package to perform Next Best View analysis using data provided from RGBD cameras like the Asus Xtion. It uses octomap to track voxel occupancy and integrate new readings.

Call the /get_nbv service to get a sorted list (best to worst) of candidate poses near the volume that could expose unknown voxels. Currently, poses are evaluated by casting rays corresponding to the camera's field of view into the octomap: more hits on unknown voxels means a better view. A rough sketch of this scoring idea is shown below.
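
The sketch below scores a candidate view with octomap directly. The camera model and ray sampling are simplified assumptions, not nbv_planner's exact implementation.

#include <octomap/OcTree.h>

#include <vector>

// Score a candidate view by casting rays from the sensor origin through the
// volume of interest and counting how many end in unknown (never observed) space.
int scoreCandidateView(const octomap::OcTree& tree,
                       const octomap::point3d& sensor_origin,
                       const std::vector<octomap::point3d>& ray_directions,
                       double max_range = 2.0)
{
  int unknown_hits = 0;
  for (const octomap::point3d& dir : ray_directions)
  {
    octomap::point3d end;
    // castRay returns false if the ray reached max_range or left the known map
    // without hitting an occupied voxel; check whether its endpoint is unknown.
    bool hit_occupied =
        tree.castRay(sensor_origin, dir, end, /*ignoreUnknownCells=*/true, max_range);
    if (!hit_occupied && tree.search(end) == nullptr)
      ++unknown_hits;  // ray ended in unexplored space: this view reveals new voxels
  }
  return unknown_hits;  // higher score = more unknown voxels exposed
}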

Running rosrun nbv_planner exploration_controller_node starts an exploration routine that tries to move the robot to views exposing unknown regions of a user-specified volume, using the NBV evaluation described above. The octomap server must already be running.

Operating Instructions for Human-Guided Reconstruction

  1. Start TSDF/KinFu processes: roslaunch yak launch_xtion_default.launch
  2. Launch the drivers for the RGBD camera. For the Asus Xtion, this is roslaunch openni2_launch openni2.launch.
  3. Start mapping! Since yak doesn't have any way to relate the pose of the camera to the global frame, the initial position of the volume will be centered in front of the camera. You might have to force the camera to reset a few times to get the volume positioned where you need it.
  4. When you decide that the reconstruction is good enough: rosservice call /get_mesh

Operating Instructions for Autonomous Exploration and Reconstruction

  1. If you intend to use an actual robot, make sure that its state and motion servers are running, and that autonomous motion is allowed (deadman switch engaged, or auto mode).
  2. roslaunch a moveit planning/execution launch file. My command looks like: roslaunch godel_irb2400_moveit_config moveit_planning_execution.launch robot_ip:=192.168.125.1 sim:=False use_ftp:=False. Wait for rviz and moveit to start up.
  3. Launch the TSDF reconstruction nodes. For example, roslaunch yak launch_xtion_robot.launch.
  4. Launch the drivers for the RGBD camera. For the Asus Xtion, this is roslaunch openni2_launch openni2.launch.
  5. Start the octomap server: roslaunch nbv_planner octomap_mapping.launch
  6. When you want to start exploration: rosrun nbv_planner exploration_controller_node
  7. When you decide that the reconstruction is good enough: rosservice call /get_mesh

Build with Docker:

nvidia-docker run -v "<absolute path to your yak workspace>:/yak_ws" rosindustrial/yak:kinetic catkin build --workspace /yak_ws -DCMAKE_LIBRARY_PATH=/usr/local/nvidia/lib64/