codac-team / codac

Licenses found: LGPL-3.0 (COPYING.LESSER), GPL-3.0 (COPYING)
Codac is a library for constraint programming over reals, trajectories and sets.

Programming Languages

C++: 36,643 projects (#6 most used programming language)
Python: 139,335 projects (#7 most used programming language)
CMake: 9,771 projects
C: 50,402 projects (#5 most used programming language)

Projects that are alternatives to or similar to codac

awesome-mobile-robotics
Useful links of different content related to AI, Computer Vision, and Robotics.
Stars: ✭ 243 (+683.87%)
Mutual labels:  localization, slam, mobile-robotics
gmmloc
Implementation for IROS2020: "GMMLoc: Structure Consistent Visual Localization with Gaussian Mixture Model"
Stars: ✭ 91 (+193.55%)
Mutual labels:  localization, slam, state-estimation
GA SLAM
🚀 SLAM for autonomous planetary rovers with global localization
Stars: ✭ 40 (+29.03%)
Mutual labels:  localization, slam
Iris lama
LaMa - A Localization and Mapping library
Stars: ✭ 217 (+600%)
Mutual labels:  localization, slam
Iscloam
Intensity Scan Context based full SLAM implementation for autonomous driving. ICRA 2020
Stars: ✭ 232 (+648.39%)
Mutual labels:  localization, slam
Dl Vision Papers
Papers related to deep learning and 3D vision.
Stars: ✭ 123 (+296.77%)
Mutual labels:  localization, slam
Pythonrobotics
Python sample codes for robotics algorithms.
Stars: ✭ 13,934 (+44848.39%)
Mutual labels:  localization, slam
Mola
A Modular Optimization framework for Localization and mApping (MOLA)
Stars: ✭ 206 (+564.52%)
Mutual labels:  localization, slam
JuliaAutonomy
Julia sample codes for Autonomy, Robotics and Self-Driving Algorithms.
Stars: ✭ 21 (-32.26%)
Mutual labels:  localization, slam
ai for robotics
Visualizations of algorithms covered in Sebastian Thrun's excellent Artificial Intelligence for Robotics course on Udacity.
Stars: ✭ 125 (+303.23%)
Mutual labels:  localization, slam
slam gmapping
Slam Gmapping for ROS2
Stars: ✭ 56 (+80.65%)
Mutual labels:  localization, slam
Ov2slam
OV²SLAM is a Fully Online and Versatile Visual SLAM for Real-Time Applications
Stars: ✭ 119 (+283.87%)
Mutual labels:  localization, slam
Door Slam
Distributed, Online, and Outlier Resilient SLAM for Robotic Teams
Stars: ✭ 107 (+245.16%)
Mutual labels:  localization, slam
ros-vrep-slam
ROS and V-REP for Robot Mapping and Localization
Stars: ✭ 39 (+25.81%)
Mutual labels:  localization, slam
Rtabmap
RTAB-Map library and standalone application
Stars: ✭ 1,376 (+4338.71%)
Mutual labels:  localization, slam
Urbannavdataset
UrbanNav: an open-sourced localization dataset collected in Asian urban canyons, including Tokyo and Hong Kong
Stars: ✭ 79 (+154.84%)
Mutual labels:  localization, slam
Landmark Detection Robot Tracking SLAM-
Simultaneous Localization and Mapping (SLAM) gives you a way to track the location of a robot in the world in real time and to identify the locations of landmarks such as buildings, trees, rocks, and other world features.
Stars: ✭ 14 (-54.84%)
Mutual labels:  localization, slam
Cartographer
Cartographer is a system that provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations.
Stars: ✭ 5,754 (+18461.29%)
Mutual labels:  localization, slam
Kimera Vio
Visual Inertial Odometry with SLAM capabilities and 3D Mesh generation.
Stars: ✭ 741 (+2290.32%)
Mutual labels:  localization, slam
Robotics-Resources
List of commonly used robotics libraries and packages
Stars: ✭ 71 (+129.03%)
Mutual labels:  slam, mobile-robotics

Codac: constraint-programming for robotics

Codac (Catalog Of Domains And Contractors) is a C++/Python library providing tools for constraint programming over reals, trajectories and sets. It has many applications in state estimation or robot localization.

See the official website: http://codac.io
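
As a quick illustration of the constraint-programming workflow described above, the sketch below contracts a box of unknowns with respect to a simple constraint using the library's interval and contractor classes. It assumes the codac Python package is installed (e.g. via pip install codac); the constraint x + y = 3 and the initial bounds are illustrative choices, not taken from the project documentation.

```python
# Minimal sketch, assuming the `codac` Python package is installed
# (e.g. `pip install codac`); the constraint and bounds are illustrative.
from codac import *

# Unknowns (x, y), each initially bounded in [-10, 10]
x = IntervalVector(2, Interval(-10, 10))

# Contractor associated with the constraint x + y = 3,
# built from the function f(x, y) = x + y - 3 (contracted to f = 0)
ctc = CtcFunction(Function("x", "y", "x+y-3"))

# Contract the box in place: values that cannot satisfy the constraint are removed
ctc.contract(x)

print(x)  # expected: ([-7, 10] ; [-7, 10])
```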
