
rpng / Open_vins

License: GPL-3.0
An open source platform for visual-inertial navigation research.

Projects that are alternatives of or similar to OpenVINS

Dynslam
Master's Thesis on Simultaneous Localization and Mapping in dynamic environments. Separately reconstructs both the static environment and the dynamic objects from it, such as cars.
Stars: ✭ 446 (-44.6%)
Mutual labels:  slam
Xivo
X Inertial-aided Visual Odometry
Stars: ✭ 558 (-30.68%)
Mutual labels:  slam
Visual slam related research
Tracking of research related to visual (semantic) SLAM
Stars: ✭ 708 (-12.05%)
Mutual labels:  slam
Cube slam
CubeSLAM: Monocular 3D Object Detection and SLAM
Stars: ✭ 464 (-42.36%)
Mutual labels:  slam
Lio Mapping
Implementation of Tightly Coupled 3D Lidar Inertial Odometry and Mapping (LIO-mapping)
Stars: ✭ 520 (-35.4%)
Mutual labels:  slam
Teaser Plusplus
A fast and robust point cloud registration library
Stars: ✭ 607 (-24.6%)
Mutual labels:  slam
Semantic suma
SuMa++: Efficient LiDAR-based Semantic SLAM (Chen et al IROS 2019)
Stars: ✭ 431 (-46.46%)
Mutual labels:  slam
Kimera Vio
Visual Inertial Odometry with SLAM capabilities and 3D Mesh generation.
Stars: ✭ 741 (-7.95%)
Mutual labels:  slam
Sfm Visual Slam
Stars: ✭ 551 (-31.55%)
Mutual labels:  slam
Turtlebot3
ROS packages for Turtlebot3
Stars: ✭ 673 (-16.4%)
Mutual labels:  slam
Slambook
No description or website provided.
Stars: ✭ 5,093 (+532.67%)
Mutual labels:  slam
Slam Book
A draft book on SLAM that aims to clearly introduce both the geometric and the deep-learning methods used in SLAM systems. The final draft should reach roughly 400 pages, and the code corresponding to each chapter will also be organized and released.
Stars: ✭ 505 (-37.27%)
Mutual labels:  slam
Cartographer
Cartographer is a system that provides real-time simultaneous localization and mapping (SLAM) in 2D and 3D across multiple platforms and sensor configurations.
Stars: ✭ 5,754 (+614.78%)
Mutual labels:  slam
Loam noted
LOAM code annotated in Chinese (Chinese-commented version of LOAM)
Stars: ✭ 455 (-43.48%)
Mutual labels:  slam
Probabilistic robotics
solution of exercises of the book "probabilistic robotics"
Stars: ✭ 734 (-8.82%)
Mutual labels:  slam
Icra2020 Paper List
ICRA2020 paperlist by paopaorobot
Stars: ✭ 432 (-46.34%)
Mutual labels:  slam
Robotics Toolbox Matlab
Robotics Toolbox for MATLAB
Stars: ✭ 601 (-25.34%)
Mutual labels:  slam
Gms Feature Matcher
GMS: Grid-based Motion Statistics for Fast, Ultra-robust Feature Correspondence (CVPR 17 & IJCV 20)
Stars: ✭ 797 (-0.99%)
Mutual labels:  slam
Kintinuous
Real-time large scale dense visual SLAM system
Stars: ✭ 740 (-8.07%)
Mutual labels:  slam
Mvision
Robot vision, mobile robots, VS-SLAM, ORB-SLAM2, deep-learning object detection (YOLOv3), action detection, OpenCV, PCL, machine learning, autonomous driving
Stars: ✭ 6,140 (+662.73%)
Mutual labels:  slam

OpenVINS

Welcome to the OpenVINS project! The OpenVINS project houses some core computer vision code along with a state-of-the-art filter-based visual-inertial estimator. The core filter is an Extended Kalman filter which fuses inertial information with sparse visual feature tracks. These visual feature tracks are fused leveraging the Multi-State Constraint Kalman Filter (MSCKF) sliding-window formulation, which allows 3D features to update the state estimate without directly estimating the feature states in the filter. Inspired by graph-based optimization systems, the included filter is modular, allowing for convenient covariance management with a proper type-based state system. Please take a look at the feature list below for full details on what the system supports.
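
To make the sliding-window update described above concrete, the following is a minimal sketch (Eigen-based, and not OpenVINS's actual implementation) of the standard MSCKF nullspace projection: the stacked reprojection residual r ≈ H_x dx + H_f df + n is projected onto the left nullspace of the feature Jacobian H_f, so the 3D feature error df drops out and a plain EKF update can be performed using only quantities already in the state.

// Minimal sketch of MSCKF measurement compression (illustrative, not project code).
#include <Eigen/Dense>

// Removes the feature dependence from r = H_x*dx + H_f*df + n by projecting onto the
// left nullspace of H_f (assumes H_f has full column rank and more rows than columns).
inline void nullspace_project(const Eigen::MatrixXd &H_f,
                              Eigen::MatrixXd &H_x,
                              Eigen::VectorXd &r) {
  Eigen::HouseholderQR<Eigen::MatrixXd> qr(H_f);
  Eigen::MatrixXd Q = qr.householderQ();                    // full orthonormal basis
  Eigen::MatrixXd N = Q.rightCols(H_f.rows() - H_f.cols()); // left-nullspace basis of H_f
  H_x = N.transpose() * H_x;  // compressed measurement Jacobian (feature-free)
  r   = N.transpose() * r;    // compressed residual, ready for a standard EKF update
}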

News / Events

  • December 1, 2020 - Released improved memory management, active feature pointcloud publishing, limiting number of features in update to bound compute, and other small fixes. See v2.3 PR#117 for details.
  • November 18, 2020 - Released groundtruth generation utility package, vicon2gt, to enable creation of groundtruth trajectories in a motion capture room for evaluating VIO methods.
  • July 7, 2020 - Released zero velocity update for vehicle applications and direct initialization when standing still. See PR#79 for details.
  • May 18, 2020 - Released secondary pose graph example repository ov_secondary based on VINS-Fusion. OpenVINS now publishes marginalized feature track, feature 3d position, and first camera intrinsics and extrinsics. See PR#66 for details and discussion.
  • April 3, 2020 - Released v2.0 update to the codebase with some key refactoring, ros-free building, improved dataset support, and single inverse depth feature representation. Please check out the release page for details.
  • January 21, 2020 - Our paper has been accepted for presentation in ICRA 2020. We look forward to seeing everybody there! We have also added links to a few videos of the system running on different datasets.
  • October 23, 2019 - OpenVINS placed first in the IROS 2019 FPV Drone Racing VIO Competition. We will be giving a short presentation at the workshop at 12:45pm in Macau on November 8th.
  • October 1, 2019 - We will be presenting at the Visual-Inertial Navigation: Challenges and Applications workshop at IROS 2019. The submitted workshop paper can be found at this link.
  • August 21, 2019 - Open sourced ov_maplab for interfacing OpenVINS with the maplab library.
  • August 15, 2019 - Initial release of OpenVINS repository and documentation website!

Project Features

  • Sliding window visual-inertial MSCKF
  • Modular covariance type system
  • Comprehensive documentation and derivations
  • Extendable visual-inertial simulator
    • On manifold SE(3) b-spline
    • Arbitrary number of cameras
    • Arbitrary sensor rate
    • Automatic feature generation
  • Six different feature representations (a conversion sketch for the anchored inverse-depth case follows this list)
    1. Global XYZ
    2. Global inverse depth
    3. Anchored XYZ
    4. Anchored inverse depth
    5. Anchored MSCKF inverse depth
    6. Anchored single inverse depth
  • Calibration of sensor intrinsics and extrinsics
    • Camera to IMU transform
    • Camera to IMU time offset
    • Camera intrinsics
  • Environmental SLAM feature
    • OpenCV ARUCO tag SLAM features
    • Sparse feature SLAM features
  • Visual tracking support
    • Monocular camera
    • Stereo camera
    • Binocular camera
    • KLT or descriptor based
  • Static IMU initialization (SfM will be open sourced later)
  • Zero velocity detection and updates
  • Out-of-the-box evaluation on EurocMav and TUM-VI datasets
  • Extensive evaluation suite (ATE, RPE, NEES, RMSE, etc.)
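
As a concrete illustration of the anchored inverse-depth representation named in the feature list, the sketch below converts a feature stored as (alpha, beta, rho) in an anchor camera frame into a global XYZ point, using one common convention from the VIO literature; the exact parameterizations and frame conventions used by OpenVINS may differ, so treat this as an assumption-laden example rather than the project's definition.

// Illustrative conversion: anchored inverse depth (alpha, beta, rho) -> global XYZ.
// Convention assumed here: p_f_in_A = (1/rho) * [alpha, beta, 1]^T in the anchor frame A.
#include <Eigen/Dense>

Eigen::Vector3d anchored_invdepth_to_global(double alpha, double beta, double rho,
                                            const Eigen::Matrix3d &R_AtoG,   // anchor-to-global rotation
                                            const Eigen::Vector3d &p_AinG) { // anchor position in global
  Eigen::Vector3d p_f_in_A(alpha / rho, beta / rho, 1.0 / rho);
  return R_AtoG * p_f_in_A + p_AinG;
}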

Codebase Extensions

  • ov_secondary - This is an example secondary thread which provides loop closure in a loosely coupled manner for OpenVINS. It is a modification of the code originally developed by the HKUST aerial robotics group, which can be found in their VINS-Fusion repository. We stress that this is a loosely coupled method; thus no information is returned to the estimator to improve the underlying OpenVINS odometry. This codebase has been modified in a few key areas, including: exposing more loop closure parameters, subscribing to camera intrinsics, simplifying configuration such that only topics need to be supplied, and some tweaks to the loop closure detection to improve frequency.

  • ov_maplab - This codebase contains the interface wrapper for exporting visual-inertial runs from OpenVINS into the ViMap structure taken by maplab. The state estimates and raw images are appended to the ViMap as OpenVINS runs through a dataset. After completion of the dataset, features are re-extracted and triangulated with maplab's feature system. This can be used to merge multi-session maps, or to perform a batch optimization after first running the data through OpenVINS. Some examples have been provided along with a helper script to export trajectories into the standard groundtruth format.

  • vicon2gt - This utility was created to generate groundtruth trajectories using a motion capture system (e.g. Vicon or OptiTrack) for use in evaluating visual-inertial estimation systems. Specifically, we calculate the inertial IMU state (full 15 dof, sketched below) at the camera frequency rate and generate a groundtruth trajectory similar to those provided by the EurocMav datasets. It fuses inertial and motion capture information and estimates all unknown spatio-temporal calibrations between the two sensors.
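
The sketch below shows one common layout of the "full 15 dof" inertial state mentioned above (orientation, position, velocity, gyroscope bias, accelerometer bias); the field names and frame conventions are illustrative assumptions, not vicon2gt's actual API.

// Illustrative 15-dof IMU state (names and conventions are assumptions, not vicon2gt's API).
#include <Eigen/Dense>

struct ImuState15 {
  Eigen::Quaterniond q_GtoI;  // orientation (3 dof): rotation from the global frame to the IMU frame
  Eigen::Vector3d    p_IinG;  // position of the IMU in the global frame (3 dof)
  Eigen::Vector3d    v_IinG;  // velocity of the IMU in the global frame (3 dof)
  Eigen::Vector3d    bg;      // gyroscope bias (3 dof)
  Eigen::Vector3d    ba;      // accelerometer bias (3 dof)
};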

Demo Videos

Credit / Licensing

This code was written by the Robot Perception and Navigation Group (RPNG) at the University of Delaware. If you have any issues with the code, please open an issue on our GitHub page with relevant implementation details and references. For researchers who have leveraged or compared against this work, please cite the following:

@Conference{Geneva2020ICRA,
  Title      = {OpenVINS: A Research Platform for Visual-Inertial Estimation},
  Author     = {Patrick Geneva and Kevin Eckenhoff and Woosik Lee and Yulin Yang and Guoquan Huang},
  Booktitle  = {Proc. of the IEEE International Conference on Robotics and Automation},
  Year       = {2020},
  Address    = {Paris, France},
  Url        = {\url{https://github.com/rpng/open_vins}}
}

The codebase is licensed under the GNU General Public License v3 (GPL-3).
