vectr-ucla / direct_lidar_odometry

License: MIT
Direct LiDAR Odometry: Fast Localization with Dense Point Clouds

Programming Languages

C++
CMake

Projects that are alternatives of or similar to direct_lidar_odometry

Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+828.71%)
Mutual labels:  robotics, mapping, ros, lidar, slam
Pythonrobotics
Python sample codes for robotics algorithms.
Stars: ✭ 13,934 (+6798.02%)
Mutual labels:  localization, robotics, mapping, slam
Dynamic robot localization
Point cloud registration pipeline for robot localization and 3D perception
Stars: ✭ 339 (+67.82%)
Mutual labels:  localization, robotics, mapping, lidar
Iris lama
LaMa - A Localization and Mapping library
Stars: ✭ 217 (+7.43%)
Mutual labels:  localization, robotics, mapping, slam
Kimera Vio
Visual Inertial Odometry with SLAM capabilities and 3D Mesh generation.
Stars: ✭ 741 (+266.83%)
Mutual labels:  localization, robotics, mapping, slam
Cartographer
Cartographer is a system that provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations.
Stars: ✭ 5,754 (+2748.51%)
Mutual labels:  localization, robotics, mapping, slam
Rtabmap
RTAB-Map library and standalone application
Stars: ✭ 1,376 (+581.19%)
Mutual labels:  localization, robotics, mapping, slam
Door Slam
Distributed, Online, and Outlier Resilient SLAM for Robotic Teams
Stars: ✭ 107 (-47.03%)
Mutual labels:  localization, robotics, mapping, slam
Loam velodyne
Laser Odometry and Mapping (Loam) is a realtime method for state estimation and mapping using a 3D lidar.
Stars: ✭ 1,135 (+461.88%)
Mutual labels:  mapping, ros, lidar, slam
Lego Loam
LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain
Stars: ✭ 1,138 (+463.37%)
Mutual labels:  mapping, ros, lidar, slam
puma
Poisson Surface Reconstruction for LiDAR Odometry and Mapping
Stars: ✭ 302 (+49.5%)
Mutual labels:  mapping, lidar, slam, odometry
Evo
Python package for the evaluation of odometry and SLAM
Stars: ✭ 1,373 (+579.7%)
Mutual labels:  robotics, mapping, ros, slam
UrbanLoco
UrbanLoco: A Full Sensor Suite Dataset for Mapping and Localization in Urban Scenes
Stars: ✭ 147 (-27.23%)
Mutual labels:  localization, mapping, lidar, slam
Hdl localization
Real-time 3D localization using a (velodyne) 3D LIDAR
Stars: ✭ 332 (+64.36%)
Mutual labels:  localization, ros, lidar
Quickmcl
QuickMCL - Monte Carlo localisation for ROS
Stars: ✭ 24 (-88.12%)
Mutual labels:  localization, robotics, ros
Floam
Fast LOAM: Fast and Optimized Lidar Odometry And Mapping for indoor/outdoor localization (Lidar SLAM)
Stars: ✭ 326 (+61.39%)
Mutual labels:  localization, robotics, slam
Urbannavdataset
UrbanNav: an Open-Sourcing Localization Data Collected in Asian Urban Canyons, Including Tokyo and Hong Kong
Stars: ✭ 79 (-60.89%)
Mutual labels:  localization, lidar, slam
Mola
A Modular Optimization framework for Localization and mApping (MOLA)
Stars: ✭ 206 (+1.98%)
Mutual labels:  localization, lidar, slam
lt-mapper
A Modular Framework for LiDAR-based Lifelong Mapping
Stars: ✭ 301 (+49.01%)
Mutual labels:  mapping, lidar, slam
slam gmapping
Slam Gmapping for ROS2
Stars: ✭ 56 (-72.28%)
Mutual labels:  localization, mapping, slam

Direct LiDAR Odometry: Fast Localization with Dense Point Clouds

DLO is a lightweight and computationally efficient frontend LiDAR odometry solution with consistent and accurate localization. It features several algorithmic innovations that increase the speed, accuracy, and robustness of pose estimation in perceptually challenging environments, and it has been extensively tested on aerial and legged robots.

This work was part of NASA JPL Team CoSTAR's research and development efforts for the DARPA Subterranean Challenge, in which DLO was the primary state estimation component for our fleet of autonomous aerial vehicles.



Instructions

DLO requires an input point cloud of type sensor_msgs::PointCloud2 with an optional IMU input of type sensor_msgs::Imu. Note that although IMU data is not required, it can be used for initial gravity alignment and will help with point cloud registration.
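
As a quick sanity check before launching DLO, you can verify that your driver publishes the expected message types using the standard ROS command-line tools. The topic names below are just the examples used later in this README; substitute your own:

rostopic type /robot/velodyne_points   # should print sensor_msgs/PointCloud2
rostopic type /robot/vn100/imu         # should print sensor_msgs/Imu
rostopic hz /robot/velodyne_points     # confirms the point cloud is actually streaming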

Dependencies

Our system has been tested extensively on both Ubuntu 18.04 Bionic with ROS Melodic and Ubuntu 20.04 Focal with ROS Noetic, although other versions may work. The following configuration with required dependencies has been verified to be compatible:

  • Ubuntu 18.04 or 20.04
  • ROS Melodic or Noetic (roscpp, std_msgs, sensor_msgs, geometry_msgs, pcl_ros)
  • C++14
  • CMake >= 3.16.3
  • OpenMP >= 4.5
  • Point Cloud Library >= 1.10.0
  • Eigen >= 3.3.7

Installing the prebuilt binaries via apt should be sufficient:

sudo apt install libomp-dev libpcl-dev libeigen3-dev 
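
If you want to double-check that the installed versions satisfy the minimums listed above, something like the following should work on Ubuntu 18.04/20.04 (assuming pkg-config is installed; exact output formats vary by release):

cmake --version                               # expect >= 3.16.3
pkg-config --modversion eigen3                # expect >= 3.3.7
dpkg -s libpcl-dev | grep Version             # expect >= 1.10.0
echo | gcc -fopenmp -dM -E - | grep _OPENMP   # 201511 corresponds to OpenMP 4.5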

Compiling

Create a catkin workspace, clone the direct_lidar_odometry repository into the src folder, and compile via the catkin_tools package (or catkin_make if preferred):

mkdir ws && cd ws && mkdir src && catkin init && cd src
git clone https://github.com/vectr-ucla/direct_lidar_odometry.git
catkin build
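
Before launching, source the workspace so ROS can resolve the newly built package. The path below assumes the ws/ layout created above; adjust it if your workspace lives elsewhere:

cd ..                      # back to the workspace root (ws/)
source devel/setup.bash    # makes direct_lidar_odometry visible to roslaunch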

Execution

After sourcing the workspace, launch the DLO odometry and mapping ROS nodes via:

roslaunch direct_lidar_odometry dlo.launch \
  pointcloud_topic:=/robot/velodyne_points \
  imu_topic:=/robot/vn100/imu

Make sure to edit the pointcloud_topic and imu_topic input arguments with your specific topics. If an IMU is not being used, set the dlo/imu ROS param to false in cfg/dlo.yaml. However, if IMU data is available, please allow DLO to calibrate and gravity align for three seconds before moving. Note that the current implementation assumes that LiDAR and IMU coordinate frames coincide, so please make sure that the sensors are physically mounted near each other.

If successful, RViz will open and you will see similar terminal outputs to the following:

[screenshots: RViz view and terminal output]
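
If you would rather verify from the command line than through RViz, listing the active nodes and topics is a quick check. The grep pattern below assumes the DLO topics contain "dlo" in their names, which may not match your configuration:

rosnode list                  # the DLO odometry and mapping nodes should be listed
rostopic list | grep -i dlo   # assumed naming; inspect the full list if nothing matches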

Test Data

For your convenience, we provide example test data here (9 minutes, ~4.2GB). To run, first launch DLO (with default point cloud and IMU topics) via:

roslaunch direct_lidar_odometry dlo.launch

In a separate terminal session, play back the downloaded bag:

rosbag play dlo_test.bag
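
Before (or instead of) playing the bag, you can inspect it to confirm that the recorded topics match DLO's defaults:

rosbag info dlo_test.bag    # lists the recorded topics, message types, and duration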


Citation

If you found this work useful, please cite our manuscript:

@article{chen2022direct,
  author={Chen, Kenny and Lopez, Brett T. and Agha-mohammadi, Ali-akbar and Mehta, Ankur},
  journal={IEEE Robotics and Automation Letters}, 
  title={Direct LiDAR Odometry: Fast Localization With Dense Point Clouds}, 
  year={2022},
  volume={7},
  number={2},
  pages={2000-2007},
  doi={10.1109/LRA.2022.3142739}
}

Acknowledgements

We thank the authors of the FastGICP and NanoFLANN open-source packages:

  • Kenji Koide, Masashi Yokozuka, Shuji Oishi, and Atsuhiko Banno, “Voxelized GICP for Fast and Accurate 3D Point Cloud Registration,” in IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 11054–11059.
  • Jose Luis Blanco and Pranjal Kumar Rai, “NanoFLANN: a C++ Header-Only Fork of FLANN, A Library for Nearest Neighbor (NN) with KD-Trees,” https://github.com/jlblancoc/nanoflann, 2014.

License

This work is licensed under the terms of the MIT license.


