
wh200720041 / Iscloam

Licence: other
Intensity Scan Context based full SLAM implementation for autonomous driving. ICRA 2020

Projects that are alternatives of or similar to Iscloam

- Pythonrobotics: Python sample codes for robotics algorithms. (Stars: 13,934, +5906.03%; labels: slam, autonomous-driving, localization)
- JuliaAutonomy: Julia sample codes for autonomy, robotics and self-driving algorithms. (Stars: 21, -90.95%; labels: localization, slam, autonomous-driving)
- Iris lama: LaMa, a localization and mapping library. (Stars: 217, -6.47%; labels: slam, localization)
- Cartographer: A system that provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations. (Stars: 5,754, +2380.17%; labels: slam, localization)
- Mrpt: The Mobile Robot Programming Toolkit (MRPT). (Stars: 1,190, +412.93%; labels: slam, autonomous-driving)
- Mola: A Modular Optimization framework for Localization and mApping (MOLA). (Stars: 206, -11.21%; labels: slam, localization)
- Floam: Fast LOAM, fast and optimized lidar odometry and mapping for indoor/outdoor localization (lidar SLAM). (Stars: 326, +40.52%; labels: slam, localization)
- Deepseqslam: The official deep learning framework for route-based place recognition. (Stars: 49, -78.88%; labels: slam, autonomous-driving)
- gmmloc: Implementation for IROS 2020: "GMMLoc: Structure Consistent Visual Localization with Gaussian Mixture Model". (Stars: 91, -60.78%; labels: localization, slam)
- Door Slam: Distributed, online, and outlier-resilient SLAM for robotic teams. (Stars: 107, -53.88%; labels: slam, localization)
- Rtabmap: RTAB-Map library and standalone application. (Stars: 1,376, +493.1%; labels: slam, localization)
- Awesome Robotic Tooling: Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace. (Stars: 1,876, +708.62%; labels: slam, autonomous-driving)
- Deep Learning Localization Mapping: A collection of deep learning based localization models. (Stars: 300, +29.31%; labels: slam, localization)
- UrbanLoco: A full sensor suite dataset for mapping and localization in urban scenes. (Stars: 147, -36.64%; labels: localization, slam)
- Xivo: X Inertial-aided Visual Odometry. (Stars: 558, +140.52%; labels: slam, localization)
- awesome-lidar: Awesome LIDAR list, covering LIDAR manufacturers, datasets, point cloud-processing algorithms, point cloud frameworks and simulators. (Stars: 217, -6.47%; labels: slam, autonomous-driving)
- Kimera Vio: Visual-inertial odometry with SLAM capabilities and 3D mesh generation. (Stars: 741, +219.4%; labels: slam, localization)
- direct lidar odometry: Direct LiDAR Odometry, fast localization with dense point clouds. (Stars: 202, -12.93%; labels: localization, slam)
- 2019-UGRP-DPoom: 2019 DGIST DPoom project under UGRP: SBC and RGB-D camera based fully autonomous driving system for a mobile robot with indoor SLAM. (Stars: 35, -84.91%; labels: slam, autonomous-driving)
- Urbannavdataset: UrbanNav, an open-source localization dataset collected in Asian urban canyons, including Tokyo and Hong Kong. (Stars: 79, -65.95%; labels: slam, localization)

ISCLOAM

Intensity Scan Context based Full SLAM Implementation (ISC-LOAM)

This work is an implementation of the paper "Intensity Scan Context: Coding Intensity and Geometry Relations for Loop Closure Detection", published at the IEEE International Conference on Robotics and Automation (ICRA) 2020. It is a 3D lidar based Simultaneous Localization And Mapping (SLAM) system that includes both front-end odometry and back-end optimization, running at 20 Hz.
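As a rough illustration of the idea behind the descriptor (not the repository's C++ implementation), an intensity scan context can be sketched as a ring-and-sector grid over the point cloud that keeps the strongest intensity return per cell. The bin counts and maximum range below are illustrative assumptions:

```python
import numpy as np

def intensity_scan_context(points, intensities, n_rings=20, n_sectors=60, max_range=60.0):
    """Sketch of an intensity scan context: bin each point by range (ring)
    and azimuth (sector), keeping the maximum intensity per cell."""
    desc = np.zeros((n_rings, n_sectors))
    for (x, y, _), inten in zip(points, intensities):
        r = np.hypot(x, y)
        if r >= max_range:
            continue  # points beyond the descriptor's range are ignored
        ring = int(r / max_range * n_rings)
        sector = int((np.arctan2(y, x) + np.pi) / (2 * np.pi) * n_sectors) % n_sectors
        desc[ring, sector] = max(desc[ring, sector], inten)
    return desc
```

Each scan is thus summarized as a small 2D matrix that encodes both geometry (which cells are occupied) and intensity (the cell values), which is what makes the descriptor cheap to store and compare.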

Author: Wang Han, Nanyang Technological University, Singapore

For front-end-only odometry, you may visit FLOAM (fast lidar odometry and mapping).

1. Evaluation

1.1. Demo

Watch our demo at Video Link

1.2. Mapping Example

1.3. Localization Example

1.4. Ground Truth Comparison

Green: ISCLOAM; Red: Ground Truth

[Trajectory comparison figures: KITTI sequence 00 (left), KITTI sequence 05 (right)]

1.5. Localization error

Platform: Intel® Core™ i7-8700 CPU @ 3.20GHz

Average translation error : 1.08%

Average rotation error : 0.000073

1.6. Comparison

Dataset            | ISCLOAM | FLOAM
KITTI sequence 00  | 0.24%   | 0.51%
KITTI sequence 05  | 0.22%   | 0.93%
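The translation errors above are KITTI-style relative errors. As a simplified sketch of how such a metric can be computed (end-point drift over fixed-length segments, as a percentage of distance travelled; the official KITTI evaluation averages over several segment lengths, which is omitted here):

```python
import numpy as np

def avg_translation_error(gt, est, seg_len=100.0):
    """Average relative translation error (%) between a ground-truth
    trajectory `gt` and an estimated trajectory `est` (N x D positions),
    measured over segments of roughly seg_len metres of travel."""
    gt, est = np.asarray(gt, float), np.asarray(est, float)
    # cumulative distance travelled along the ground-truth path
    dists = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(gt, axis=0), axis=1))])
    errors = []
    for i in range(len(gt)):
        # first index j at which seg_len metres have been travelled since i
        j = int(np.searchsorted(dists, dists[i] + seg_len))
        if j >= len(gt):
            break
        drift = np.linalg.norm((est[j] - est[i]) - (gt[j] - gt[i]))
        errors.append(drift / (dists[j] - dists[i]) * 100.0)
    return float(np.mean(errors)) if errors else 0.0
```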

2. Prerequisites

2.1 Ubuntu and ROS

Ubuntu 64-bit 18.04.

ROS Melodic. ROS Installation

2.2. Ceres Solver

Follow Ceres Installation.

2.3. PCL

Follow PCL Installation.

2.4. GTSAM

Follow GTSAM Installation.

2.5. OpenCV

Follow OpenCV Installation.

2.6. Trajectory visualization

For visualization purposes, this package uses the hector trajectory server; you may install it with

sudo apt-get install ros-melodic-hector-trajectory-server

Alternatively, you may remove the hector trajectory server node if trajectory visualization is not needed.

3. Build

3.1 Clone repository:

cd ~/catkin_ws/src
git clone https://github.com/wh200720041/iscloam.git
cd ..
catkin_make -j1
source ~/catkin_ws/devel/setup.bash

3.2 Download test rosbag

Download KITTI sequence 05 (10GB) or KITTI sequence 07 (4GB)

Unzip the compressed file 2011_09_30_0018.zip. If your system does not have unzip, install it with

sudo apt-get install unzip 

Unzipping may take a few minutes. By default, the extracted file location should be /home/user/Downloads/2011_09_30_0018.bag

cd ~/Downloads
unzip ~/Downloads/2011_09_30_0018.zip

3.3 Launch ROS

roslaunch iscloam iscloam.launch

3.4 Mapping Node

If you would like to generate a map of the environment at the same time, you can run

roslaunch iscloam iscloam_mapping.launch

Note that the global map can be very large, so global optimization may take a while. Some lag between the trajectory and the map is expected, since they run in separate threads, and CPU usage increases when a loop closure is identified.
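Loop closures in ISC-LOAM are proposed by matching intensity scan context descriptors. A minimal sketch of a rotation-invariant matching score is shown below; column shifts approximate a yaw rotation between revisits. This illustrates the matching idea only, not the repository's exact two-stage geometry/intensity check:

```python
import numpy as np

def isc_similarity(a, b):
    """Compare two intensity scan context matrices (rings x sectors) by
    trying every column shift (candidate yaw offset) and keeping the best
    mean column-wise cosine similarity."""
    best = 0.0
    for shift in range(a.shape[1]):
        shifted = np.roll(b, shift, axis=1)  # rotate descriptor b by `shift` sectors
        num = np.sum(a * shifted, axis=0)
        den = np.linalg.norm(a, axis=0) * np.linalg.norm(shifted, axis=0)
        valid = den > 0  # skip empty columns
        if valid.any():
            best = max(best, float(np.mean(num[valid] / den[valid])))
    return best
```

A pair of scans whose similarity exceeds a threshold would then be passed to the back end as a loop closure candidate, where pose graph optimization (GTSAM in this project) corrects the accumulated drift.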

4. Test other sequence

To generate rosbag files from the KITTI dataset, you may use the tools provided by kitti_to_rosbag or kitti2bag.

5. Other Velodyne sensor

You may use iscloam_velodyne.launch for your own Velodyne sensor, such as the Velodyne VLP-16.
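For a sensor with a different beam count, the launch file typically needs its sensor-specific parameters adjusted. The snippet below is a hypothetical example of that kind of tweak for a 16-beam sensor; the node and parameter names (scan_line, scan_period) are assumptions, so check iscloam_velodyne.launch for the actual ones:

```xml
<launch>
  <!-- Hypothetical example: adjust sensor parameters for a VLP-16.
       Node and parameter names are assumptions; verify against
       the launch files shipped with the repository. -->
  <node pkg="iscloam" type="iscloam_odom_estimation_node" name="odom_estimation_node">
    <param name="scan_line" value="16" />    <!-- number of vertical beams -->
    <param name="scan_period" value="0.1" /> <!-- seconds per rotation at 10 Hz -->
  </node>
</launch>
```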

6. Citation

If you use this work in your research, please consider citing the paper below; your citation will be appreciated.

@inproceedings{wang2020intensity,
  author={H. {Wang} and C. {Wang} and L. {Xie}},
  booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)}, 
  title={Intensity Scan Context: Coding Intensity and Geometry Relations for Loop Closure Detection}, 
  year={2020},
  volume={},
  number={},
  pages={2095-2101},
  doi={10.1109/ICRA40945.2020.9196764}
}

7. Acknowledgements

Thanks to A-LOAM, LOAM (J. Zhang and S. Singh, "LOAM: Lidar Odometry and Mapping in Real-time"), and LOAM_NOTED.
