
NVIDIA-ISAAC-ROS / isaac_ros_visual_odometry

Licence: other
Visual odometry package based on the hardware-accelerated NVIDIA Elbrus library, delivering world-class quality and performance.

Programming Languages

C++: 36,643 projects (#6 most used programming language)
Python: 139,335 projects (#7 most used programming language)
CMake: 9,771 projects
C: 50,402 projects (#5 most used programming language)

Projects that are alternatives to or similar to isaac_ros_visual_odometry

isaac_ros_apriltag
CUDA-accelerated Apriltag detection
Stars: ✭ 42 (-58.42%)
Mutual labels:  ros, ros2, ros2-foxy
Robotics-Object-Pose-Estimation
A complete end-to-end demonstration in which we collect training data in Unity and use that data to train a deep neural network to predict the pose of a cube. This model is then deployed in a simulated robotic pick-and-place task.
Stars: ✭ 153 (+51.49%)
Mutual labels:  ros, perception
point-cloud-clusters
A catkin workspace in ROS which uses DBSCAN to identify which points in a point cloud belong to the same object.
Stars: ✭ 43 (-57.43%)
Mutual labels:  ros, perception
aws-robomaker-robotics-curriculum
Introductory robotics curriculum
Stars: ✭ 78 (-22.77%)
Mutual labels:  ros, ros2
KRS
The Kria Robotics Stack (KRS) is a ROS 2 superset for industry, an integrated set of robot libraries and utilities to accelerate the development, maintenance and commercialization of industrial-grade robotic solutions while using adaptive computing.
Stars: ✭ 26 (-74.26%)
Mutual labels:  ros, ros2
microstrain_inertial
ROS driver for all of MicroStrain's current G and C series products.
Stars: ✭ 44 (-56.44%)
Mutual labels:  ros, ros2-foxy
trac_ik
ROS 2 port of `trac_ik`, an alternative Inverse Kinematics solver to the popular inverse Jacobian methods in KDL.
Stars: ✭ 14 (-86.14%)
Mutual labels:  ros, ros2
Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+1757.43%)
Mutual labels:  ros, ros2
aztarna
aztarna, a footprinting tool for robots.
Stars: ✭ 85 (-15.84%)
Mutual labels:  ros, ros2
NuttX
Official micro-ROS RTOS
Stars: ✭ 63 (-37.62%)
Mutual labels:  ros, ros2
scout_ros
This ROS package supports the Scout 2 and Scout Mini.
Stars: ✭ 30 (-70.3%)
Mutual labels:  ros, ros2
the-Cooper-Mapper
An open source autonomous driving research platform for Active SLAM & Multisensor Data Fusion
Stars: ✭ 38 (-62.38%)
Mutual labels:  ros, perception
linorobot2
Autonomous mobile robots (2WD, 4WD, Mecanum Drive)
Stars: ✭ 97 (-3.96%)
Mutual labels:  ros, ros2
drl_grasping
Deep Reinforcement Learning for Robotic Grasping from Octrees
Stars: ✭ 160 (+58.42%)
Mutual labels:  ros, ros2
dynamixel_control
ros2_control packages for ROBOTIS Dynamixel
Stars: ✭ 69 (-31.68%)
Mutual labels:  ros2, ros2-foxy
rclex
Rclex: ROS 2 Client Library for Elixir
Stars: ✭ 77 (-23.76%)
Mutual labels:  ros, ros2
astuff_sensor_msgs
A set of messages specific to each sensor supported by AutonomouStuff.
Stars: ✭ 37 (-63.37%)
Mutual labels:  ros, ros2
awesome-webots
Awesome Webots
Stars: ✭ 46 (-54.46%)
Mutual labels:  ros, ros2
robosherlock
www.robosherlock.org
Stars: ✭ 23 (-77.23%)
Mutual labels:  ros, perception
ros-gst-bridge
a bidirectional ros to gstreamer bridge and utilities for dynamic pipelines
Stars: ✭ 46 (-54.46%)
Mutual labels:  ros, ros2

Isaac ROS Visual Odometry

This repository provides a ROS2 package that estimates stereo visual inertial odometry using the Isaac Elbrus GPU-accelerated library. It takes in a time-synchronized pair of stereo images (grayscale), along with the respective camera intrinsics, and publishes the current pose of the camera relative to its starting pose.

Elbrus is based on two core technologies: Visual Odometry (VO) and Simultaneous Localization and Mapping (SLAM).

Visual Odometry is a method for estimating a camera's position relative to its starting position. The method is iterative: at each step, it considers the two most recent input frames (stereo pairs), finds a set of keypoints on each, and matches the keypoints between the two sets to estimate the translation and relative rotation of the camera between frames.
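
Elbrus's implementation is proprietary and GPU-accelerated, but the frame-to-frame matching idea can be illustrated with a short, hypothetical Python sketch using OpenCV; the function name and parameter values here are illustrative assumptions, not part of this package:

    # Illustrative sketch only -- not the Elbrus pipeline. Estimates the
    # relative camera motion between two consecutive grayscale frames.
    import cv2
    import numpy as np

    def relative_pose(prev_gray, curr_gray, K):
        """Return rotation R and unit-scale translation t between frames."""
        orb = cv2.ORB_create(2000)                       # keypoint detector
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)              # match the two keypoint sets
        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
        # Robustly estimate the essential matrix, then decompose it.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t  # t is direction only; stereo depth provides metric scale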

Simultaneous Localization and Mapping is a method built on top of the VO predictions. It aims to improve the quality of the VO estimates by leveraging knowledge of previously seen parts of the trajectory: it detects whether the current scene was seen in the past (i.e., a loop in the camera's path) and runs an additional optimization procedure to refine previously obtained poses.
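
The loop-closure idea can be sketched with a toy example (this is a deliberate simplification; Elbrus runs a proper optimization over the pose history, not the linear drift spreading shown here):

    # Toy loop-closure correction: when the camera re-observes the start of
    # a loop, spread the measured drift linearly along the estimated path.
    import numpy as np

    def correct_loop(positions, loop_error):
        """positions: (N, 3) estimated positions along a detected loop.
        loop_error: (3,) drift measured when the start is re-observed."""
        weights = np.linspace(0.0, 1.0, len(positions))[:, None]
        return positions - weights * loop_error

    # Example: the tracker ends 0.3 m off after returning to (0, 0, 0).
    traj = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0], [0.3, 0, 0]], float)
    print(correct_loop(traj, traj[-1] - traj[0]))  # last pose snaps back to start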

Along with visual data, Elbrus can optionally use Inertial Measurement Unit (IMU) measurements. It automatically falls back to the IMU when VO is unable to estimate a pose, for example when the lighting is poor or a long featureless surface fills the camera's view.

Elbrus delivers real-time tracking performance: more than 60 FPS for VGA resolution. For the KITTI benchmark, the algorithm achieves a drift of ~1% in localization and an orientation error of 0.003 degrees per meter of motion.

Elbrus allows for robust tracking in various environments and with different use cases: indoor, outdoor, aerial, HMD, automotive, and robotics.

This has been tested on ROS2 (Foxy) and should build and run on x86_64 and aarch64 (Jetson).

System Requirements

This Isaac ROS package is designed and tested to be compatible with ROS2 Foxy on Jetson hardware.

Jetson

  • AGX Xavier or Xavier NX
  • JetPack 4.6

x86_64

  • CUDA 10.2+ supported discrete GPU
  • Ubuntu 20.04+

Note: For best performance on Jetson, ensure that power settings are configured appropriately (Power Management for Jetson).

Docker

Precompiled ROS2 Foxy packages are not available for JetPack 4.6 (based on Ubuntu 18.04 Bionic). You can either manually compile ROS2 Foxy and the required dependent packages from source, or use the Isaac ROS development Docker image from Isaac ROS Common.

You must first install the NVIDIA Container Toolkit to make use of the Docker container development/runtime environment.

Configure nvidia-container-runtime as the default runtime for Docker by editing /etc/docker/daemon.json to include the following:

    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"

and then restarting Docker: sudo systemctl daemon-reload && sudo systemctl restart docker

Run the following script in isaac_ros_common to build the image and launch the container:

$ scripts/run_dev.sh <optional path>

You can either provide an optional path to mirror your host ROS workspace (with the Isaac ROS packages) into the container, where it will be available as /workspaces/isaac_ros-dev, or you can set up a new workspace in the container.

Package Dependencies

Note: isaac_ros_common is used for running tests and/or creating a development container, and isaac_ros_image_pipeline is used as an executable dependency in launch files.

Rectified vs Smooth Transform

Elbrus provides the user with two possible modes: visual odometry with SLAM (rectified transform) or pure visual odometry (smooth transform).

Using the rectified transform (enable_rectified_pose = true) is most beneficial when the camera path frequently loops back on or crosses itself. Although the rectified transform increases the computational demands of VO, it usually reduces the final position drift dramatically.

On the other hand, pure VO (smooth transform) may be useful on relatively short and straight camera trails. This choice reduces compute usage and increases processing speed without introducing significant pose drift.
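
As a sketch of how the mode might be selected, the following hypothetical launch file disables the rectified pose to run pure VO; the package and executable names here are assumptions, so prefer the launch files listed under Package Reference below:

    # Hypothetical launch sketch (package/executable names are assumptions).
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(
                package='isaac_ros_visual_odometry',
                executable='isaac_ros_visual_odometry',  # assumed name
                parameters=[{
                    # Pure VO: publish only the smooth transform.
                    'enable_rectified_pose': False,
                }],
            ),
        ])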

Coordinate Frames

This section describes the coordinate frames involved in the VisualOdometryNode. The frames discussed below are defined as follows (a minimal TF lookup sketch follows the list):

  1. left_camera_frame: The frame associated with the left eye of the stereo camera. Note that this is not the same as the optical frame.
  2. right_camera_frame: The frame associated with the right eye of the stereo camera. Note that this is not the same as the optical frame.
  3. imu_frame: The frame associated with the IMU sensor (if available).
  4. fixed_frame: The fixed frame that aligns with the start pose of the stereo camera. The tracked poses are published in the TF tree with respect to this frame.
  5. current_smooth_frame: The moving frame that tracks the current smooth pose of the stereo camera. The poses in this frame represent a smooth, continuous motion. Note that this frame can drift over time.
  6. current_rectified_frame: The moving frame that tracks the current rectified pose of the stereo camera. The poses in this frame represent an accurate position and orientation. Note that the frame can have sudden jumps.
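
As a minimal sketch (assuming the default frame names below and the standard tf2 Python API), a node can query the smooth pose relative to the fixed frame like this:

    # Minimal rclpy sketch: look up current_smooth_frame relative to
    # fixed_frame, using the default frame names documented in this section.
    import rclpy
    from rclpy.node import Node
    from rclpy.time import Time
    from tf2_ros.buffer import Buffer
    from tf2_ros.transform_listener import TransformListener

    class PosePrinter(Node):
        def __init__(self):
            super().__init__('pose_printer')
            self.buffer = Buffer()
            self.listener = TransformListener(self.buffer, self)
            self.timer = self.create_timer(1.0, self.on_timer)

        def on_timer(self):
            try:
                t = self.buffer.lookup_transform('odom', 'base_link_smooth', Time())
                p = t.transform.translation
                self.get_logger().info(f'pose: ({p.x:.2f}, {p.y:.2f}, {p.z:.2f})')
            except Exception as e:  # transform may not be available yet
                self.get_logger().warn(str(e))

    rclpy.init()
    rclpy.spin(PosePrinter())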

Quickstart

  1. Create a ROS2 workspace if one is not already prepared:
    mkdir -p your_ws/src
    Note: The workspace can have any name; these steps use the name your_ws.
  2. Clone this package repository to your_ws/src/isaac_ros_visual_odometry. Check that you have Git LFS installed before cloning to pull down all large files.
    sudo apt-get install git-lfs
    cd your_ws/src && git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_odometry
  3. Build and source the workspace:
    cd your_ws && colcon build --symlink-install && source install/setup.bash
  4. (Optional) Run tests to verify complete and correct installation:
    colcon test

This repository comes with pre-configured launch files for a real camera and a simulator.

For RealSense camera

Note: You will need to calibrate the intrinsics of your camera if you want the node to determine the 3D pose of your camera. See here for more details.

  1. Start isaac_ros_visual_odometry using the launch files:
    ros2 launch isaac_ros_visual_odometry isaac_ros_visual_odometry_realsense.launch.py

  2. To connect the RealSense camera's default TF tree to the current_smooth_frame (default value: base_link_smooth), run the following command:
    ros2 run tf2_ros static_transform_publisher 0 0 0 0 0 0 1 base_link_smooth camera_link

  3. In a separate terminal, spin up RViz to see the tracking of the camera:
    rviz2

  4. Add the TF tree in the Displays panel of RViz.

  5. Set the Fixed frame in the Global Options to the name provided in the fixed_frame variable.

  6. Try moving your camera, and you should see something like this:

  7. If you prefer to observe the visual odometry output as text, echo the contents of the /visual_odometry/tf_stamped topic in a separate terminal with the following command:
    ros2 topic echo /visual_odometry/tf_stamped

For Isaac Sim

  1. Make sure you have Isaac Sim set up correctly and choose the appropriate working environment [Native / Docker & Cloud]. For this walkthrough, we are using the native workstation setup for Isaac Sim.

  2. See the Running For The First Time section to launch Isaac Sim from the app launcher, and click on the Isaac Sim button.

  3. Set up the Isaac Sim ROS2 bridge as described here.

  4. Connect to the Nucleus server as shown in the Getting Started section if you have not done it already.

  5. Open up the Isaac ROS Common USD scene located at:

    omniverse://<your_nucleus_server>/Isaac/Samples/ROS/Scenario/carter_warehouse_apriltags_worker.usd.

    Wait for it to load completely.

  6. Enable the right camera for a stereo image pair: go to the Stage tab, select /World/Carter_ROS/ROS_Camera_Stereo_Right, and tick the enabled checkbox.

  7. Press Play to start publishing data from the Isaac Sim application.

  8. In a separate terminal, start isaac_ros_visual_odometry using the launch files:
    ros2 launch isaac_ros_visual_odometry isaac_ros_visual_odometry_isaac_sim.launch.py

  9. In a separate terminal, send the signal to rotate the robot in the sim as follows:
    ros2 topic pub /cmd_vel geometry_msgs/Twist '{linear: {x: 0.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.05}}'

  10. In a separate terminal, run RViz to visualize the tracking of the pose of the camera:
    rviz2

  11. Add the TF tree in the RViz Displays panel.

  12. Set the Fixed frame in the Global Options to the name provided in the fixed_frame variable.

  13. You should see something like this:

  14. If you prefer to observe the visual odometry output as text, echo the contents of the /visual_odometry/tf_stamped topic in a separate terminal with the following command:
    ros2 topic echo /visual_odometry/tf_stamped

For Isaac Sim with Hardware in the loop (HIL)

The following instructions are for a setup in which the sample runs on a Jetson device and Isaac Sim runs on an x86 machine. We will use the ROS_DOMAIN_ID environment variable to give Isaac Sim and the sample application separate logical networks.

NOTE: Before executing any of the ROS commands, make sure to set the ROS_DOMAIN_ID variable first.

  1. Complete step 5 of the For Isaac Sim section if you have not done it already.

  2. Open the location of the Isaac Sim package in the terminal by clicking the Open in Terminal button.

  3. In the terminal opened by the previous step, set the ROS_DOMAIN_ID as shown:

    export ROS_DOMAIN_ID=<some_number>

  4. Launch Isaac Sim from the script as shown:

    ./isaac-sim.sh

  5. Continue with step 7 of the For Isaac Sim section. Make sure to set the ROS_DOMAIN_ID variable before running the sample application.

Package Reference

isaac_ros_visual_odometry

Available Components

VisualOdometryNode

Topics Subscribed

  • /stereo_camera/left/image: The image from the left eye of the stereo camera in grayscale
  • /stereo_camera/left/camera_info: CameraInfo from the left eye of the stereo camera
  • /stereo_camera/right/image: The image from the right eye of the stereo camera in grayscale
  • /stereo_camera/right/camera_info: CameraInfo from the right eye of the stereo camera
  • visual_odometry/imu: Sensor data from the IMU (optional)

Topics Published

  • /visual_odometry/tf_stamped: A consolidated message that provides information about the tracked poses. It consists of two poses: smooth_transform is calculated from pure visual odometry, and rectified_transform is calculated with SLAM enabled. It also gives the states of both poses. See more details below.

Parameters

  • enable_rectified_pose: If enabled, rectified_transform is populated in the message. Enabled by default.
  • denoise_input_images: If enabled, input images are denoised. This can help when images are noisy because of low-light conditions. Disabled by default.
  • rectified_images: Flag to mark whether the images coming in on the subscribed topics are rectified or raw. Enabled by default.
  • enable_imu: If enabled, IMU data is used. Disabled by default.
  • enable_debug_mode: If enabled, a debug dump (image frames, timestamps, and camera info) is saved to disk at the path indicated by debug_dump_path. Disabled by default.
  • debug_dump_path: The path to the directory in which to store the debug dump data. The default value is /tmp/elbrus.
  • gravitational_force: The initial gravity vector defined in the odom frame. If the IMU sensor is not parallel to the floor, update all the axes with appropriate values. The default value is {0.0, 0, -9.8}.
  • left_camera_frame: The name of the left camera frame. The default value is left_cam.
  • right_camera_frame: The name of the right camera frame. The default value is right_cam.
  • imu_frame: The name of the IMU frame. The default value is imu.
  • fixed_frame: The name of the frame in which odometry or poses are tracked. This is a fixed frame. The default value is odom.
  • current_smooth_frame: The name of the frame in which smooth_transform is published. This is a moving frame. The default value is base_link_smooth.
  • current_rectified_frame: The name of the frame in which rectified_transform is published. This is a moving frame. The default value is base_link_rectified.

Launch files

  • isaac_ros_visual_odometry.launch.py: Launch file to bring up the visual odometry node standalone.
  • isaac_ros_visual_odometry_isaac_sim.launch.py: Launch file that brings up the visual odometry node configured for Isaac Sim.
  • isaac_ros_visual_odometry_realsense.launch.py: Launch file that brings up the visual odometry node configured for RealSense.
  • isaac_ros_visual_odometry_zed2.launch.py: Launch file that brings up the visual odometry node remapped for the ZED 2.

visual_odometry/tf_stamped


# The frame id in the header is used as the reference frame of both the transforms below.
std_msgs/Header header

# The frame id of the child frame to which the transforms point.
string child_frame_id

# Translation and rotation in 3 dimensions of child_frame_id from header.frame_id.
# rectified_transform represents the most accurate position and orientation.
# It should be consumed if robust tracking is needed. Note: it can result in sudden jumps.
geometry_msgs/Transform rectified_transform

# smooth_transform represents a position and orientation corresponding to a smooth, continuous motion.
# It should be consumed as an additional signal, e.g., as an input to sensor fusion.
# Note: it might drift over time.
geometry_msgs/Transform smooth_transform

# Pure visual odometry return code:
# 0 - Unknown state
# 1 - Success
# 2 - Failed
# 3 - Success but invalidated by IMU
uint8 vo_state

# Integrator status:
# 0 - Unknown state
# 1 - Static
# 2 - Inertial
# 3 - IMU
uint8 integrator_state
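
To consume this message programmatically rather than echoing it, a subscriber along these lines should work; the interface package and message names used here (isaac_ros_visual_odometry_interfaces, VisualOdometryTransformStamped) are assumptions, so confirm them against this repository's msg definitions:

    # Hedged sketch: the import path below is an assumption -- check the
    # message package shipped with this repository before use.
    import rclpy
    from rclpy.node import Node
    from isaac_ros_visual_odometry_interfaces.msg import VisualOdometryTransformStamped

    VO_STATES = {0: 'unknown', 1: 'success', 2: 'failed', 3: 'invalidated by IMU'}

    class TfStampedListener(Node):
        def __init__(self):
            super().__init__('tf_stamped_listener')
            self.create_subscription(VisualOdometryTransformStamped,
                                     '/visual_odometry/tf_stamped',
                                     self.on_msg, 10)

        def on_msg(self, msg):
            t = msg.smooth_transform.translation
            self.get_logger().info(
                f'vo_state={VO_STATES.get(msg.vo_state)} '
                f'smooth=({t.x:.2f}, {t.y:.2f}, {t.z:.2f})')

    rclpy.init()
    rclpy.spin(TfStampedListener())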

Troubleshooting

  • If RViz is not showing the poses, check the Fixed Frame value.
  • If you see Tracker is lost. messages frequently, possible causes include:
    • Fast motion causing motion blur in the frames
    • Low-light conditions
    • The wrong camera_info being published
  • For better performance:
    • Increase the capture framerate of the camera to yield better tracking results.
    • If input images are noisy, enable the denoise_input_images flag on the node.

Updates

Date        Changes
2021-11-15  Isaac Sim HIL documentation update
2021-10-20  Initial release