MOLAorg / Mola
License: GPL-3.0
A Modular Optimization framework for Localization and mApping (MOLA)
Stars: ✭ 206
Projects that are alternatives of or similar to Mola
direct lidar odometry
Direct LiDAR Odometry: Fast Localization with Dense Point Clouds
Stars: ✭ 202 (-1.94%)
Mutual labels: localization, lidar, slam
awesome-mobile-robotics
Useful links of different content related to AI, Computer Vision, and Robotics.
Stars: ✭ 243 (+17.96%)
Mutual labels: localization, datasets, slam
Urbannavdataset
UrbanNav: an Open-Sourcing Localization Data Collected in Asian Urban Canyons, Including Tokyo and Hong Kong
Stars: ✭ 79 (-61.65%)
Mutual labels: slam, lidar, localization
UrbanLoco
UrbanLoco: A Full Sensor Suite Dataset for Mapping and Localization in Urban Scenes
Stars: ✭ 147 (-28.64%)
Mutual labels: localization, lidar, slam
Loam velodyne
Laser Odometry and Mapping (Loam) is a realtime method for state estimation and mapping using a 3D lidar.
Stars: ✭ 1,135 (+450.97%)
Mutual labels: slam, lidar
Lego Loam
LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain
Stars: ✭ 1,138 (+452.43%)
Mutual labels: slam, lidar
Door Slam
Distributed, Online, and Outlier Resilient SLAM for Robotic Teams
Stars: ✭ 107 (-48.06%)
Mutual labels: slam, localization
Ssl slam2
SSL_SLAM2: Lightweight 3-D Localization and Mapping for Solid-State LiDAR (mapping and localization separated) ICRA 2021
Stars: ✭ 96 (-53.4%)
Mutual labels: slam, lidar
Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+810.68%)
Mutual labels: slam, lidar
Kimera Vio
Visual Inertial Odometry with SLAM capabilities and 3D Mesh generation.
Stars: ✭ 741 (+259.71%)
Mutual labels: slam, localization
Cartographer
Cartographer is a system that provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations.
Stars: ✭ 5,754 (+2693.2%)
Mutual labels: slam, localization
Ov2slam
OV²SLAM is a Fully Online and Versatile Visual SLAM for Real-Time Applications
Stars: ✭ 119 (-42.23%)
Mutual labels: slam, localization
Rtabmap
RTAB-Map library and standalone application
Stars: ✭ 1,376 (+567.96%)
Mutual labels: slam, localization
Overlap localization
chen2020iros: Learning an Overlap-based Observation Model for 3D LiDAR Localization.
Stars: ✭ 120 (-41.75%)
Mutual labels: lidar, localization
Pyicp Slam
Full-python LiDAR SLAM using ICP and Scan Context
Stars: ✭ 155 (-24.76%)
Mutual labels: slam, lidar
Semantic suma
SuMa++: Efficient LiDAR-based Semantic SLAM (Chen et al IROS 2019)
Stars: ✭ 431 (+109.22%)
Mutual labels: slam, lidar
mola
A Modular Optimization framework for Localization and mApping (MOLA).
This repository holds the MOLA git superproject. Refer to the official documentation for build instructions, demos, API reference, etc.
| Demo | Preview |
|---|---|
| 3D LiDAR SLAM from KITTI dataset | ![]() |
| Graph SLAM from G2O dataset | ![]() |
Building
Clone with:

```shell
git clone --recurse-submodules https://github.com/MOLAorg/mola.git
```
Follow these instructions (in RST format here).
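As a rough sketch of the process, the steps below clone the superproject and run a typical out-of-source CMake build. The CMake invocation is an assumption for illustration only; the official RST instructions are authoritative for the actual dependencies and build options:

```shell
# Clone the MOLA superproject together with all its git submodules
git clone --recurse-submodules https://github.com/MOLAorg/mola.git
cd mola

# Hypothetical out-of-source CMake build (check the official docs
# for the exact prerequisites and configuration flags)
mkdir build && cd build
cmake ..
make -j"$(nproc)"
```

If the submodules were skipped at clone time, `git submodule update --init --recursive` fetches them afterwards.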
About the directory structure
The directory layout is as follows:

- `demos`: Example YAML files for `mola-launcher`.
- `docs`: Documentation and placeholder for Doxygen docs.
- `externals`: All external dependencies.
- `modules`: All MOLA module projects.
Citation
MOLA was presented in (PDF):
@INPROCEEDINGS{Blanco-Claraco-RSS-19,
AUTHOR = {Jose Luis Blanco-Claraco},
TITLE = {A Modular Optimization Framework for Localization and Mapping},
BOOKTITLE = {Proceedings of Robotics: Science and Systems},
YEAR = {2019},
ADDRESS = {Freiburg im Breisgau, Germany},
MONTH = {June},
DOI = {10.15607/RSS.2019.XV.043}
}
License
MOLA is released under the GNU GPL v3 license, except where noted otherwise in individual modules. Other licensing options are available upon request.