
FAIRSpace-AdMaLL / ndt_localizer

Licence: GPL-3.0
A robot localisation package for lidar-map-based localisation using multi-sensor state estimation.

Programming Languages

C++, C, CMake

Projects that are alternatives of or similar to ndt_localizer (all share the labels robotics and slam)

- Flame - FLaME: Fast Lightweight Mesh Estimation (Stars: 164)
- Iros2018 Slam Papers - IROS2018 SLAM papers (ref from PaoPaoRobot) (Stars: 224)
- Anms Codes - Efficient adaptive non-maximal suppression algorithms for homogeneous spatial keypoint distribution (Stars: 174)
- Cleanit - Open-source autonomy software in Rust with gRPC for the Roomba series robot vacuum cleaners; under development (Stars: 125)
- direct lidar odometry - Direct LiDAR Odometry: Fast Localization with Dense Point Clouds (Stars: 202)
- Ssl slam - SSL_SLAM: Lightweight 3-D Localization and Mapping for Solid-State LiDAR (IEEE RA-L 2021) (Stars: 144)
- Iris lama - LaMa, a Localization and Mapping library (Stars: 217)
- Door Slam - Distributed, Online, and Outlier Resilient SLAM for Robotic Teams (Stars: 107)
- OPVO - Sample code of the BMVC 2017 paper "Visual Odometry with Drift-Free Rotation Estimation Using Indoor Scene Regularities" (Stars: 40)
- Iros2020 Paper List - IROS2020 paper list by PaoPaoRobot (Stars: 247)
- Se2clam - SE(2)-Constrained Localization and Mapping by Fusing Odometry and Vision (IEEE Transactions on Cybernetics 2019) (Stars: 116)
- SLAM AND PATH PLANNING ALGORITHMS - Solutions to all exercises of the MOOC on SLAM and path-planning algorithms by Professor Claus Brenner at Leibniz University, with personal notes (mostly PDF) and vector graphics illustrating the theoretical concepts (Stars: 107)
- M Loam - Robust Odometry and Mapping for Multi-LiDAR Systems with Online Extrinsic Calibration (Stars: 114)
- Pythonrobotics - Python sample codes for robotics algorithms (Stars: 13,934)
- Awesome Robotic Tooling - Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace (Stars: 1,876)
- Iros2019 Paper List - IROS2019 paper list from PaopaoRobot (Stars: 214)
- Evo - Python package for the evaluation of odometry and SLAM (Stars: 1,373)
- Rtabmap - RTAB-Map library and standalone application (Stars: 1,376)
- Minisam - A general and flexible factor graph non-linear least squares optimization framework (Stars: 246)
- 2019-UGRP-DPoom - 2019 DGIST DPoom project under UGRP: an SBC and RGB-D camera based fully autonomous driving system for a mobile robot with indoor SLAM (Stars: 35)

A ROS-based NDT localizer with multi-sensor state estimation

This repo is a ROS-based multi-sensor robot localisation package. An NDT localizer is loosely coupled with wheeled odometry and an IMU for continuous global localization within a pre-built point cloud map.

Prerequisites

You will need the robot_localization package. The multi-sensor configuration of our robot is detailed in cfgs/global_ekf.yaml and cfgs/local_ekf.yaml.
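For orientation, below is a heavily trimmed sketch of what such a robot_localization EKF config can look like. Parameter names follow the robot_localization documentation; the topic names and enabled fields are illustrative assumptions only, see the cfgs/ files for the actual values used in this repo:

# Illustrative robot_localization EKF config (not the repo's real cfgs/global_ekf.yaml)
frequency: 30
two_d_mode: false
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: map              # the global EKF fuses in the map frame

odom0: /wheel/odometry        # assumed wheel odometry topic
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0: /imu/data               # assumed IMU topic
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              false, false, false]
# In the global EKF the NDT localization result would typically be fused as an
# additional pose/odometry input alongside the two sources above.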

Localization in a point cloud map (pcd)

A demo video on the court yard dataset: [video]

How to use

Build in your ROS workspace

Clone this repo into your ROS workspace's src/ directory, then build with catkin_make (or catkin build):

cd catkin_ws/src/
git clone https://github.com/FAIRSpace-AdMaLL/ndt_localizer.git
cd ..
catkin_make
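After the build finishes, sourcing the workspace should make the package visible (a quick sanity check, assuming your workspace is catkin_ws as above):

source devel/setup.bash
rospack find ndt_localizer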

Get a pcd map

You need a point cloud map (.pcd format) for localization. You can get an HD point cloud map from an HD map supplier, or you can build one yourself (the accuracy will, of course, not be as high as an HD map's).

We use our offline version of lio-sam to build the point cloud map: https://github.com/FAIRSpace-AdMaLL/liosam_mapper

Previously-generated maps can be downloaded from here.
The court yard data (rosbags) for mapping or testing ndt_localizer can be downloaded here: Court Yard Data.
The beach data (rosbags and previously-generated maps) can be downloaded here: Beach Data.

Setup configuration

Config map loader

Move your map pcd file (.pcd) to the map folder inside this project (ndt_localizer/map), and change pcd_path in map_loader.launch to your pcd path, for example:

<arg name="pcd_path"  default="$(find ndt_localizer)/map/court_yard_map.pcd"/>

Then, in ndt_localizer.launch, set the trajectory path (used as a height map for initialization):

<arg name="path_file" default="$(find ndt_localizer)/map/court_yard_map.csv" doc="Mapping trajectory as height map" />

You also need to configure the submap parameters:

<arg name="submap_size_xy" default="50.0" />
<arg name="submap_size_z" default="20.0" />
<arg name="map_switch_thres" default="25.0" />

Config point cloud downsample

Configure your lidar point cloud topic in launch/points_downsample.launch:

<arg name="points_topic" default="/os_cloud_node/points" />

If your lidar data is sparse (e.g. VLP-16), you need to configure a smaller leaf_size in launch/points_downsample.launch, such as 1.0. If your lidar point cloud is dense (VLP-32, Hesai Pandar40P, HDL-64, etc.), keep leaf_size at 2.0.
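For example, the downsample voxel size would be set via the corresponding launch argument (a sketch; the exact form of the leaf_size argument in points_downsample.launch is assumed from the description above):

<arg name="leaf_size" default="2.0" />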

Config static tf

There is a static transform from /world to /map:

<node pkg="tf2_ros" type="static_transform_publisher" name="world_to_map" args="0 0 0 0 0 0 map world" />

Config ndt localizer

You can configure the NDT params in ndt_localizer.launch. The main params of the NDT algorithm are:

<arg name="trans_epsilon" default="0.05" doc="The maximum difference between two consecutive transformations in order to consider convergence" />
<arg name="step_size" default="0.1" doc="The newton line search maximum step length" />
<arg name="resolution" default="3.0" doc="The ND voxel grid resolution" />
<arg name="max_iterations" default="50" doc="The number of iterations required to calculate alignment" />
<arg name="converged_param_transform_probability" default="3.0" doc="" />

These default params work well with 64- and 32-beam lidars.
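For reference, here is a minimal sketch of how these parameters map onto PCL's NDT registration API, which Autoware-style NDT localizers are typically built on. This is an illustrative example, not this package's actual code; point types and ROS plumbing are omitted:

#include <Eigen/Core>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/ndt.h>

// Align a downsampled scan against the map using the launch-file parameters above.
Eigen::Matrix4f alignScan(const pcl::PointCloud<pcl::PointXYZ>::Ptr &scan,
                          const pcl::PointCloud<pcl::PointXYZ>::Ptr &map,
                          const Eigen::Matrix4f &init_guess)
{
  pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
  ndt.setTransformationEpsilon(0.05);  // trans_epsilon
  ndt.setStepSize(0.1);                // step_size
  ndt.setResolution(3.0);              // resolution
  ndt.setMaximumIterations(50);        // max_iterations
  ndt.setInputSource(scan);
  ndt.setInputTarget(map);

  pcl::PointCloud<pcl::PointXYZ> aligned;
  ndt.align(aligned, init_guess);

  // A low transformation probability (cf. converged_param_transform_probability)
  // indicates a poor match between the scan and the map.
  double score = ndt.getTransformationProbability();
  (void)score;
  return ndt.getFinalTransformation();
}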

Run the localizer

Once you get your pcd map and configuration ready, run the localizer with:

cd catkin_ws
source devel/setup.bash
roslaunch ndt_localizer ndt_localizer.launch

Wait a few seconds for the map to load; then you will see your pcd map in rviz.

Give an initial pose of the current vehicle with 2D Pose Estimate in rviz.

This operation sends an initial pose to the /initialpose topic. Then you will see the localization result.
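Alternatively, the initial pose can be published from the command line. This is a hedged example: /initialpose carries a geometry_msgs/PoseWithCovarianceStamped, and the pose values below are placeholders to be replaced with your robot's actual starting pose in the map frame:

rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
  '{header: {frame_id: "map"}, pose: {pose: {position: {x: 0.0, y: 0.0, z: 0.0}, orientation: {x: 0.0, y: 0.0, z: 0.0, w: 1.0}}}}'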

Then, play the rosbag in another terminal (e.g. rosbag play --clock court_yard_wed_repeat_night_2021-03-03-19-07-18.bag).
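Because the bag is replayed with --clock, the nodes should run on simulated time. If the launch files do not already set this, enable it before launching the localizer (standard ROS practice, not specific to this package):

rosparam set use_sim_time true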

The robot will start localization:

The final localization msg is published to /odometry/filtered/global by a multi-sensor state estimator fusing wheeled odometry, IMU and lidar localisation.
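To inspect the fused estimate while the bag is playing:

rostopic echo /odometry/filtered/global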

The localizer also publishes a tf from map to base_link:

---
transforms: 
  - 
    header: 
      seq: 0
      stamp: 
        secs: 1566536121
        nsecs: 251423898
      frame_id: "map"
    child_frame_id: "base_link"
    transform: 
      translation: 
        x: -94.8022766113
        y: 544.097351074
        z: 42.5747337341
      rotation: 
        x: 0.0243843578881
        y: 0.0533175268768
        z: -0.702325920272
        w: 0.709437048124
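
The published transform can also be checked from the command line:

rosrun tf tf_echo map base_link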

Acknowledgement

Thanks to AdamShan's autoware_ros implementation: https://github.com/AbangLZU/ndt_localizer.git.
