
ser94mor / sensor-fusion

License: GPL-3.0
Filters: KF, EKF, UKF || Process Models: CV, CTRV || Measurement Models: Radar, Lidar

Programming Languages

C++, Fortran, CMake, C, Cuda, Shell

Projects that are alternatives to or similar to sensor-fusion

go-estimate
State estimation and filtering algorithms in Go
Stars: ✭ 98 (+2.08%)
Mutual labels:  unscented-kalman-filter, sensor-fusion, kalman-filter, extended-kalman-filter
pyUKF
Unscented kalman filter (UKF) library in python that supports multiple measurement updates
Stars: ✭ 52 (-45.83%)
Mutual labels:  unscented-kalman-filter, ukf, sensor-fusion, kalman-filter
CarND-Extended-Kalman-Filter-P6
Self Driving Car Project 6 - Sensor Fusion(Extended Kalman Filter)
Stars: ✭ 24 (-75%)
Mutual labels:  radar, lidar, sensor-fusion, kalman-filter
fusion-ekf
An extended Kalman Filter implementation in C++ for fusing lidar and radar sensor measurements.
Stars: ✭ 113 (+17.71%)
Mutual labels:  radar, lidar, sensor-fusion, kalman-filter
orientation tracking-unscented kalman filter
Implemented Unscented Kalman Filter (UKF) for orientation tracking. Sensor fusion of accelerometer and gyroscope data.
Stars: ✭ 39 (-59.37%)
Mutual labels:  unscented-kalman-filter, ukf, sensor-fusion
Embedded UKF Library
A compact Unscented Kalman Filter (UKF) library for Teensy4/Arduino system (or any real time embedded system in general)
Stars: ✭ 31 (-67.71%)
Mutual labels:  unscented-kalman-filter, ukf, kalman-filter
Fusion Ukf
An unscented Kalman Filter implementation for fusing lidar and radar sensor measurements.
Stars: ✭ 162 (+68.75%)
Mutual labels:  radar, lidar, kalman-filter
Tracking With Extended Kalman Filter
Object (e.g. pedestrian, vehicle) tracking by an Extended Kalman Filter (EKF), with fused data from both lidar and radar sensors.
Stars: ✭ 393 (+309.38%)
Mutual labels:  radar, lidar
OpenMaterial
3D model exchange format with physical material properties for virtual development, test and validation of automated driving.
Stars: ✭ 23 (-76.04%)
Mutual labels:  radar, lidar
parallel-non-linear-gaussian-smoothers
Companion code in JAX for the paper Parallel Iterated Extended and Sigma-Point Kalman Smoothers.
Stars: ✭ 17 (-82.29%)
Mutual labels:  unscented-kalman-filter, kalman-filter
continuous-fusion
(ROS) Sensor fusion algorithm for camera+lidar.
Stars: ✭ 26 (-72.92%)
Mutual labels:  lidar, sensor-fusion
vision-based estimations
Vision-based Robot 3D Pose and Velocities Estimations
Stars: ✭ 32 (-66.67%)
Mutual labels:  ukf, kalman-filter
Awesome Autonomous Driving Papers
This repository provides awesome research papers for autonomous driving perception. If you do find a problem or have any suggestions, please raise this as an issue or make a pull request with information (format of the repo): Research paper title, datasets, metrics, objects, source code, publisher, and year.
Stars: ✭ 30 (-68.75%)
Mutual labels:  fusion, lidar
Self-Driving-Car-NanoDegree-Udacity
This repository contains code and writeups for projects and labs completed as part of Udacity's first-of-its-kind self-driving car nanodegree program.
Stars: ✭ 29 (-69.79%)
Mutual labels:  unscented-kalman-filter, sensor-fusion
Kalmanfilter altimeter vario
Kalman filter to estimate altitude and climb rate (sink rate) by fusing altitude and acceleration sensor data
Stars: ✭ 31 (-67.71%)
Mutual labels:  sensor-fusion, kalman-filter
imu ekf
6-axis(3-axis acceleration sensor+3-axis gyro sensor) IMU fusion with Extended Kalman Filter.
Stars: ✭ 56 (-41.67%)
Mutual labels:  fusion, ekf
lidar radar fusion ekf ukf
Lidar and Radar Fusion with EKF and UKF
Stars: ✭ 19 (-80.21%)
Mutual labels:  ekf, kalman-filter
FAST LIO SLAM
LiDAR SLAM = FAST-LIO + Scan Context
Stars: ✭ 183 (+90.63%)
Mutual labels:  lidar, kalman-filter
camera lidar calibration
A tool used to calibrate the extrinsic parameters between a 2D laser range finder (LRF) and a camera. ROS version: https://github.com/TurtleZhong/camera_lidar_calibration_v2
Stars: ✭ 48 (-50%)
Mutual labels:  lidar
lidar body tracking
ROS Catkin package to track people using octree and cluster extraction
Stars: ✭ 68 (-29.17%)
Mutual labels:  lidar

Sensor Fusion with KF, EKF, and UKF for CV & CTRV Process Models and Lidar & Radar Measurement Models

This repository contains implementations of the Kalman filter, extended Kalman filter, and unscented Kalman filter for the selected process and measurement models.

Process Models:

  • CV (constant velocity)
  • CTRV (constant turn rate and velocity magnitude)

Measurement Models:

  • Lidar
  • Radar

The project relies on the Eigen library for vector and matrix operations.

A great deal of effort has been put into designing the abstractions of filter, process model, and measurement model. The code relies heavily on C++ templates and avoids dynamic memory allocation, which is crucial for embedded systems such as a self-driving car's onboard computer.
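As a rough, hypothetical illustration of the fixed-size-Eigen idea (the class and member names below are purely illustrative and are not the repository's actual interfaces, which are built around filter, process-model, and measurement-model abstractions), matrix dimensions can be made template parameters so that every Eigen object is stack-allocated:

#include <Eigen/Dense>

// Illustrative sketch only, not the repository's actual API.
// All matrix dimensions are known at compile time, so Eigen uses
// stack-allocated, fixed-size matrices and no dynamic memory allocation.
template <int StateDims, int MeasDims>
class KalmanFilterSketch
{
public:
  using StateVector           = Eigen::Matrix<double, StateDims, 1>;
  using StateCovariance       = Eigen::Matrix<double, StateDims, StateDims>;
  using MeasurementVector     = Eigen::Matrix<double, MeasDims, 1>;
  using MeasurementMatrix     = Eigen::Matrix<double, MeasDims, StateDims>;
  using MeasurementCovariance = Eigen::Matrix<double, MeasDims, MeasDims>;
  using KalmanGain            = Eigen::Matrix<double, StateDims, MeasDims>;

  // Prediction step: x <- F x,  P <- F P F^T + Q.
  void Predict(const StateCovariance& F, const StateCovariance& Q)
  {
    x_ = F * x_;
    P_ = F * P_ * F.transpose() + Q;
  }

  // Update step for a linear measurement model z = H x + noise.
  void Update(const MeasurementVector& z, const MeasurementMatrix& H,
              const MeasurementCovariance& R)
  {
    const MeasurementCovariance S = H * P_ * H.transpose() + R;
    const KalmanGain K = P_ * H.transpose() * S.inverse();
    x_ = x_ + K * (z - H * x_);
    P_ = (StateCovariance::Identity() - K * H) * P_;
  }

  const StateVector& state() const { return x_; }

private:
  StateVector x_     = StateVector::Zero();
  StateCovariance P_ = StateCovariance::Identity();
};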

Configurations

There are several files that define the int main(int, char**) function, each representing a different configuration of filter, process model, and measurement model.

Demo

One of the configurations works with the Udacity Self-Driving Car Engineer Nanodegree Term 2 Simulator.

Dependencies

  • cmake >= 3.5
  • gcc/g++ >= 6.0

Additional:

Build

The project can be built by running the following commands from the project's top-level directory.

$> mkdir build
$> cd build
$> cmake ..
$> make
$> cd ..

Run

Filters

The implementations of the different Kalman filter variations follow the notation from the book "Thrun, S., Burgard, W. and Fox, D., 2005. Probabilistic Robotics. MIT Press."

Kalman Filter

The equations below describe the Kalman filter and are implemented in the KalmanFilter class.
For an explanation of what each variable means, please refer to the comments in the corresponding source files or to the book "Thrun, S., Burgard, W. and Fox, D., 2005. Probabilistic Robotics. MIT Press."
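The original README shows these equations as an image; as a reference, the standard Kalman filter algorithm in the book's notation is reproduced below (prediction with the motion model A_t, B_t and process noise covariance R_t, followed by the update with the measurement matrix C_t and measurement noise covariance Q_t). The variable names in the code may differ slightly.

\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t
\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t
K_t = \bar{\Sigma}_t C_t^T \left( C_t \bar{\Sigma}_t C_t^T + Q_t \right)^{-1}
\mu_t = \bar{\mu}_t + K_t \left( z_t - C_t \bar{\mu}_t \right)
\Sigma_t = \left( I - K_t C_t \right) \bar{\Sigma}_t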

Extended Kalman Filter

The equations below describe the extended Kalman filter and are implemented in the ExtendedKalmanFilter class. For an explanation of what each variable means, please refer to the comments in the corresponding source files or to the book "Thrun, S., Burgard, W. and Fox, D., 2005. Probabilistic Robotics. MIT Press."
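The original equations are an image as well; the standard extended Kalman filter in the book's notation is reproduced below, with g the non-linear process function, h the non-linear measurement function, and G_t, H_t their Jacobians evaluated at \mu_{t-1} and \bar{\mu}_t respectively.

\bar{\mu}_t = g(u_t, \mu_{t-1})
\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + R_t
K_t = \bar{\Sigma}_t H_t^T \left( H_t \bar{\Sigma}_t H_t^T + Q_t \right)^{-1}
\mu_t = \bar{\mu}_t + K_t \left( z_t - h(\bar{\mu}_t) \right)
\Sigma_t = \left( I - K_t H_t \right) \bar{\Sigma}_t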

Unscented Kalman Filter

The equations below describe the unscented Kalman filter and are implemented in the UnscentedKalmanFilter class. For an explanation of what each variable means, please refer to the comments in the corresponding source files or to the book "Thrun, S., Burgard, W. and Fox, D., 2005. Probabilistic Robotics. MIT Press."
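The original equations are an image; the textbook (non-augmented) form of the unscented Kalman filter is reproduced below, where \gamma = \sqrt{n + \lambda} scales the sigma points and w_m^{[i]}, w_c^{[i]} are the mean and covariance weights. Note that implementations for the CTRV model commonly use an augmented state to incorporate the process noise, so the code may differ in that detail.

\mathcal{X}_{t-1} = \left( \mu_{t-1}, \;\; \mu_{t-1} + \gamma \sqrt{\Sigma_{t-1}}, \;\; \mu_{t-1} - \gamma \sqrt{\Sigma_{t-1}} \right)
\bar{\mathcal{X}}_t^{*} = g(u_t, \mathcal{X}_{t-1})
\bar{\mu}_t = \sum_{i=0}^{2n} w_m^{[i]} \bar{\mathcal{X}}_t^{*[i]}
\bar{\Sigma}_t = \sum_{i=0}^{2n} w_c^{[i]} (\bar{\mathcal{X}}_t^{*[i]} - \bar{\mu}_t)(\bar{\mathcal{X}}_t^{*[i]} - \bar{\mu}_t)^T + R_t
\bar{\mathcal{X}}_t = \left( \bar{\mu}_t, \;\; \bar{\mu}_t + \gamma \sqrt{\bar{\Sigma}_t}, \;\; \bar{\mu}_t - \gamma \sqrt{\bar{\Sigma}_t} \right)
\bar{\mathcal{Z}}_t = h(\bar{\mathcal{X}}_t)
\hat{z}_t = \sum_{i=0}^{2n} w_m^{[i]} \bar{\mathcal{Z}}_t^{[i]}
S_t = \sum_{i=0}^{2n} w_c^{[i]} (\bar{\mathcal{Z}}_t^{[i]} - \hat{z}_t)(\bar{\mathcal{Z}}_t^{[i]} - \hat{z}_t)^T + Q_t
\bar{\Sigma}_t^{x,z} = \sum_{i=0}^{2n} w_c^{[i]} (\bar{\mathcal{X}}_t^{[i]} - \bar{\mu}_t)(\bar{\mathcal{Z}}_t^{[i]} - \hat{z}_t)^T
K_t = \bar{\Sigma}_t^{x,z} S_t^{-1}
\mu_t = \bar{\mu}_t + K_t (z_t - \hat{z}_t)
\Sigma_t = \bar{\Sigma}_t - K_t S_t K_t^T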

Process Models

An illustration in the repository shows the geometric meaning of the state vector components.

CV (Constant Velocity)

The CV process model describes an object that moves linearly with constant velocity. In this project, the CV process model deals with a 2D world. The state vector consists of 4 components---px, py, vx, vy---where p* represents the position and v* represents the velocity. The leftmost column in the equation below represents the additive process noise; a* represents acceleration. The CV process model is implemented as the CVProcessModel class.
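A reconstruction of the CV transition equation described above (the original image is not reproduced here; the exact layout in the repository may differ). \Delta t is the time step and a_x, a_y are the noise accelerations:

\begin{pmatrix} p_x' \\ p_y' \\ v_x' \\ v_y' \end{pmatrix} =
\begin{pmatrix} \frac{a_x \Delta t^2}{2} \\ \frac{a_y \Delta t^2}{2} \\ a_x \Delta t \\ a_y \Delta t \end{pmatrix} +
\begin{pmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} p_x \\ p_y \\ v_x \\ v_y \end{pmatrix}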

CTRV (Constant Turn Rate and Velocity Magnitude)

The CTRV process model describes an object that moves with a constant turn rate and a constant velocity magnitude, that is, with zero longitudinal and yaw accelerations. The CTRV process model deals with a 2D world. The state vector consists of 5 components---px, py, v, yaw, yaw_rate---where p* represents the position, v represents the velocity magnitude, yaw represents the yaw angle, and yaw_rate represents the yaw velocity. The leftmost column in the equation below represents the non-linear process noise; a_a represents the longitudinal acceleration and a_psi the yaw acceleration. The position increments are given by two integrals, denoted alpha and beta below, whose closed-form solutions depend on the yaw_rate; see the CTRVProcessModel class.
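A reconstruction of the CTRV transition equation and of the integrals alpha and beta mentioned above (the original images are not reproduced here; the exact layout in the repository may differ). \Delta t is the time step, \psi the yaw angle, \dot{\psi} the yaw rate, a_a the longitudinal acceleration noise, and a_\psi the yaw acceleration noise:

\begin{pmatrix} p_x' \\ p_y' \\ v' \\ \psi' \\ \dot{\psi}' \end{pmatrix} =
\begin{pmatrix} \frac{a_a \Delta t^2}{2} \cos\psi \\ \frac{a_a \Delta t^2}{2} \sin\psi \\ a_a \Delta t \\ \frac{a_\psi \Delta t^2}{2} \\ a_\psi \Delta t \end{pmatrix} +
\begin{pmatrix} p_x + \alpha \\ p_y + \beta \\ v \\ \psi + \dot{\psi} \Delta t \\ \dot{\psi} \end{pmatrix}

where

\alpha = \int_{t}^{t+\Delta t} v \cos\psi(\tau) \, d\tau, \qquad
\beta  = \int_{t}^{t+\Delta t} v \sin\psi(\tau) \, d\tau

For \dot{\psi} \neq 0 these evaluate to

\alpha = \frac{v}{\dot{\psi}} \left( \sin(\psi + \dot{\psi} \Delta t) - \sin\psi \right), \qquad
\beta  = \frac{v}{\dot{\psi}} \left( \cos\psi - \cos(\psi + \dot{\psi} \Delta t) \right)

while for \dot{\psi} = 0 they reduce to \alpha = v \cos\psi \, \Delta t and \beta = v \sin\psi \, \Delta t.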

Measurement Models

Lidar

The Lidar measurement model is a linear measurement model. This project does not deal with the lidar point cloud; it assumes that the point cloud has already been processed and a single measurement vector has been identified for the object under consideration. The measurement vector consists of 2 components---px, py---where p* represents the position. The transformation from the state space to the Lidar measurement space is the linear map shown below. The Lidar measurement model is implemented as the LidarMeasurementModel class.
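Reconstructed from the description above (the original image is not reproduced here). For the CV state vector the measurement matrix simply selects the position components; for the 5-component CTRV state it gains additional zero columns:

\begin{pmatrix} p_x \\ p_y \end{pmatrix} = H x, \qquad
H_{CV} = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}, \qquad
H_{CTRV} = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \end{pmatrix}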

Radar

The Radar measurement model is a non-linear measurement model. The measurement vector consists of 3 components---range, bearing, range_rate---where range is the radial distance from the origin, bearing is the angle between the range direction and the x-axis (which points in the direction of the heading of the vehicle on which the sensors are installed), and range_rate is the radial velocity. The transformation from the state space to the Radar measurement space is the non-linear map shown below. The Radar measurement model is implemented as the RadarMeasurementModel class.
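Reconstructed from the description above (the original image is not reproduced here). Written for the CV state vector; for the CTRV state, the velocity components follow from v_x = v \cos\psi and v_y = v \sin\psi:

\rho = \sqrt{p_x^2 + p_y^2}
\varphi = \arctan\!\left( \frac{p_y}{p_x} \right)
\dot{\rho} = \frac{p_x v_x + p_y v_y}{\sqrt{p_x^2 + p_y^2}}

Here \rho is the range, \varphi the bearing (atan2 is used in practice to resolve the correct quadrant), and \dot{\rho} the range rate.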
