
mithi / Fusion Ukf

License: MIT
An unscented Kalman Filter implementation for fusing lidar and radar sensor measurements.

Projects that are alternatives of or similar to Fusion Ukf

fusion-ekf
An extended Kalman Filter implementation in C++ for fusing lidar and radar sensor measurements.
Stars: ✭ 113 (-30.25%)
Mutual labels:  udacity, radar, lidar, self-driving-car, kalman-filter
sensor-fusion
Filters: KF, EKF, UKF || Process Models: CV, CTRV || Measurement Models: Radar, Lidar
Stars: ✭ 96 (-40.74%)
Mutual labels:  radar, lidar, kalman-filter
CarND-Extended-Kalman-Filter-P6
Self Driving Car Project 6 - Sensor Fusion(Extended Kalman Filter)
Stars: ✭ 24 (-85.19%)
Mutual labels:  radar, lidar, kalman-filter
urban road filter
Real-time LIDAR-based Urban Road and Sidewalk detection for Autonomous Vehicles 🚗
Stars: ✭ 134 (-17.28%)
Mutual labels:  lidar, self-driving-car
Self-Driving-Car-Steering-Simulator
The aim of this project is to allow a self driving car to steer autonomously in a virtual environment.
Stars: ✭ 15 (-90.74%)
Mutual labels:  udacity, self-driving-car
FAST LIO SLAM
LiDAR SLAM = FAST-LIO + Scan Context
Stars: ✭ 183 (+12.96%)
Mutual labels:  lidar, kalman-filter
BtcDet
Behind the Curtain: Learning Occluded Shapes for 3D Object Detection
Stars: ✭ 104 (-35.8%)
Mutual labels:  lidar, self-driving-car
self-driving-car-nd
Udacity's Self-Driving Car Nanodegree project files and notes.
Stars: ✭ 50 (-69.14%)
Mutual labels:  udacity, self-driving-car
Self Driving Car
Udacity Self-Driving Car Engineer Nanodegree projects.
Stars: ✭ 2,103 (+1198.15%)
Mutual labels:  self-driving-car, kalman-filter
Uselfdrivingsimulator
Self-Driving Car Simulator
Stars: ✭ 48 (-70.37%)
Mutual labels:  self-driving-car, udacity
Extended Kalman Filter
Udacity Self-Driving Car Engineer Nanodegree. Project: Extended Kalman Filters
Stars: ✭ 27 (-83.33%)
Mutual labels:  self-driving-car, kalman-filter
Vehicle Detection
Vehicle detection using machine learning and computer vision techniques for Udacity's Self-Driving Car Engineer Nanodegree.
Stars: ✭ 1,093 (+574.69%)
Mutual labels:  self-driving-car, udacity
Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+1058.02%)
Mutual labels:  self-driving-car, lidar
opendlv
OpenDLV - A modern microservice-based software ecosystem powered by libcluon to make vehicles autonomous.
Stars: ✭ 67 (-58.64%)
Mutual labels:  lidar, self-driving-car
mpc
A software pipeline using the Model Predictive Control method to drive a car around a virtual track.
Stars: ✭ 119 (-26.54%)
Mutual labels:  udacity, self-driving-car
highway-path-planning
My path-planning pipeline to navigate a car safely around a virtual highway with other traffic.
Stars: ✭ 39 (-75.93%)
Mutual labels:  udacity, self-driving-car
Model-Predictive-Control
C++ implementation of Model Predictive Control(MPC)
Stars: ✭ 51 (-68.52%)
Mutual labels:  udacity, self-driving-car
LiDAR-GTA-V
A plugin for Grand Theft Auto V that generates a labeled LiDAR point cloud from the game environment.
Stars: ✭ 127 (-21.6%)
Mutual labels:  lidar, self-driving-car
Tracking With Extended Kalman Filter
Object (e.g Pedestrian, vehicles) tracking by Extended Kalman Filter (EKF), with fused data from both lidar and radar sensors.
Stars: ✭ 393 (+142.59%)
Mutual labels:  lidar, radar
Advanced Lane Detection
An advanced lane-finding algorithm using distortion correction, image rectification, color transforms, and gradient thresholding.
Stars: ✭ 71 (-56.17%)
Mutual labels:  self-driving-car, udacity

🐳 ☕️ 🧧

INTRODUCTION

This is an unscented Kalman filter implementation in C++ for fusing lidar and radar sensor measurements. A Kalman filter can be used wherever you have uncertain information about a dynamic system and want to make an educated guess about what the system is going to do next.

In this case, we have two 'noisy' sensors:

  • A lidar sensor that measures a tracked object's position in Cartesian coordinates (x, y)
  • A radar sensor that measures a tracked object's position and radial velocity (the velocity component along the line of sight) in polar coordinates (rho, phi, drho)
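
For illustration, here is a minimal sketch (with hypothetical names, not the project's actual API) of converting a radar's polar measurement into a Cartesian position estimate:

```cpp
#include <cmath>

// Hypothetical radar measurement in polar coordinates.
struct RadarMeasurement {
  double rho;   // range (m)
  double phi;   // bearing (rad)
  double drho;  // range rate (m/s)
};

struct CartesianPosition {
  double x;
  double y;
};

// Convert a polar radar measurement to a Cartesian position estimate.
// Note: drho only gives the velocity component along the line of sight,
// so the full (vx, vy) cannot be recovered from a single radar reading.
CartesianPosition toCartesian(const RadarMeasurement& m) {
  return {m.rho * std::cos(m.phi), m.rho * std::sin(m.phi)};
}
```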

We want to predict a tracked object's position, how fast it's going, in what direction, and how fast it is turning (yaw rate) at any point in time.

  • In essence we want to estimate: the position of the system in Cartesian coordinates, the velocity magnitude, the yaw angle in radians, and the yaw rate in radians per second (x, y, v, yaw, yawrate)
  • We are assuming a constant turn/yaw rate and velocity magnitude model (CTRV) for this particular system
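
As a hedged sketch (illustrative names, not the project's code), the CTRV prediction step propagates the state along a circular arc when the object is turning, and falls back to straight-line motion when the yaw rate is near zero:

```cpp
#include <cmath>

// Hypothetical CTRV (constant turn rate and velocity) state.
struct CtrvState {
  double x, y;     // position (m)
  double v;        // velocity magnitude (m/s)
  double yaw;      // heading (rad)
  double yawrate;  // turn rate (rad/s)
};

// Predict the state dt seconds ahead, assuming constant v and yawrate.
CtrvState predict(const CtrvState& s, double dt) {
  CtrvState out = s;
  if (std::fabs(s.yawrate) > 1e-6) {
    // Turning: the object moves along a circular arc of radius v / yawrate.
    out.x += s.v / s.yawrate *
             (std::sin(s.yaw + s.yawrate * dt) - std::sin(s.yaw));
    out.y += s.v / s.yawrate *
             (std::cos(s.yaw) - std::cos(s.yaw + s.yawrate * dt));
  } else {
    // Going straight: plain constant-velocity motion (avoids division by ~0).
    out.x += s.v * std::cos(s.yaw) * dt;
    out.y += s.v * std::sin(s.yaw) * dt;
  }
  out.yaw += s.yawrate * dt;  // v and yawrate stay constant by assumption
  return out;
}
```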

This unscented Kalman filter does just that.

  • NOTE: Compared with an extended Kalman filter with a constant velocity model, the RMSE should be lower for the unscented Kalman filter, especially for velocity. The CTRV model is more precise than a constant velocity model, and the UKF is also known for handling non-linear equations better than the EKF.
  • Harvard Paper about UKF
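
The note above rests on the unscented transform: instead of linearizing a non-linear function (as the EKF does), the UKF picks a few "sigma points" around the mean, pushes each through the function, and recomputes the mean and variance from the transformed points. A simplified 1-D illustration (not the project's code, which works on the full state vector):

```cpp
#include <cmath>

// A Gaussian described by its mean and variance.
struct Gaussian1D {
  double mean;
  double var;
};

// Simplified 1-D unscented transform: propagate a Gaussian through a
// non-linear function f via weighted sigma points (lambda is the usual
// spreading parameter).
Gaussian1D unscentedTransform1D(const Gaussian1D& in, double (*f)(double),
                                double lambda = 1.0) {
  const int n = 1;  // state dimension
  const double spread = std::sqrt((n + lambda) * in.var);

  // Sigma points: the mean plus one point on each side of it.
  const double pts[3] = {in.mean, in.mean + spread, in.mean - spread};
  const double w0 = lambda / (n + lambda);  // weight of the mean point
  const double wi = 0.5 / (n + lambda);     // weight of each side point

  // Transform each sigma point through the non-linear function.
  const double y[3] = {f(pts[0]), f(pts[1]), f(pts[2])};

  // Recover mean and variance from the transformed, weighted points.
  Gaussian1D out{w0 * y[0] + wi * (y[1] + y[2]), 0.0};
  const double w[3] = {w0, wi, wi};
  for (int i = 0; i < 3; ++i) {
    const double d = y[i] - out.mean;
    out.var += w[i] * d * d;
  }
  return out;
}
```

For a linear function the transform recovers the input Gaussian exactly; for a non-linear one it captures the mean and variance more faithfully than a first-order Taylor expansion.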

CONTENTS

  • Basic Usage
  • Notes

BASIC USAGE

  • Dependencies are the same as in here
  • Clone this repository
$ git clone https://github.com/mithi/fusion-ukf/
  • Go inside the build folder and compile:
$ cd build
$ CC=gcc-6 cmake .. && make
  • To execute inside the build folder, use the following format:
$ ./unscentedKF /PATH/TO/INPUT/FILE /PATH/TO/OUTPUT/FILE
$ ./unscentedKF ../data/data-3.txt ../data/out-3.txt
  • Please use the following format for your input file:
L(for lidar) m_x m_y t r_x r_y r_vx r_vy r_yaw r_yawrate
R(for radar) m_rho m_phi m_drho t r_x r_y r_vx r_vy r_yaw r_yawrate

Where:
(m_x, m_y) - measurements by the lidar
(m_rho, m_phi, m_drho) - measurements by the radar in polar coordinates
(t) - timestamp (unix/epoch time) at which the measurements were taken
(r_x, r_y, r_vx, r_vy, r_yaw, r_yawrate) - the real ground truth state of the system

Example:
L 3.122427e-01  5.803398e-01  1477010443000000  6.000000e-01  6.000000e-01  5.199937e+00  0 0 6.911322e-03
R 1.014892e+00  5.543292e-01  4.892807e+00  1477010443050000  8.599968e-01  6.000449e-01  5.199747e+00  1.796856e-03  3.455661e-04  1.382155e-02
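
A hedged sketch of reading one such line (field and struct names are illustrative and follow the format above, not the project's actual reader):

```cpp
#include <sstream>
#include <string>

// Hypothetical parsed input line. For lidar, (m1, m2) are (m_x, m_y);
// for radar, (m1, m2, m3) are (m_rho, m_phi, m_drho).
struct ParsedLine {
  char sensor;          // 'L' or 'R'
  double m1, m2, m3;    // raw sensor measurements
  long long timestamp;  // unix/epoch time in microseconds
};

// Parse the sensor tag, measurements, and timestamp from one input line;
// the trailing ground-truth fields are left unread here for brevity.
bool parseLine(const std::string& line, ParsedLine& out) {
  std::istringstream ss(line);
  std::string tag;
  if (!(ss >> tag) || tag.empty()) return false;
  out.sensor = tag[0];
  if (out.sensor == 'L') {
    out.m3 = 0.0;  // lidar has only two measurement fields
    return static_cast<bool>(ss >> out.m1 >> out.m2 >> out.timestamp);
  }
  if (out.sensor == 'R') {
    return static_cast<bool>(ss >> out.m1 >> out.m2 >> out.m3 >> out.timestamp);
  }
  return false;  // unknown sensor tag
}
```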
  • The program writes its predictions to the output file path you specified, in the following format:
time_stamp  px_state  py_state  v_state yaw_angle_state yaw_rate_state  sensor_type NIS px_measured py_measured px_ground_truth py_ground_truth vx_ground_truth vy_ground_truth
1477010443000000  0.312243  0.58034 0 0 0 lidar 2.32384e-319  0.312243  0.58034 0.6 0.6 0 0
1477010443050000  0.735335  0.629467  7.20389 9.78669e-18 5.42626e-17 radar 74.6701 0.862916  0.534212  0.859997  0.600045  0.000345533 4.77611e-06
...

NOTES

If you take a look at settings, you'll see the following:

//process noise standard deviations
const double STD_SPEED_NOISE = 0.9; // longitudinal acceleration in m/s^2
const double STD_YAWRATE_NOISE = 0.6; // yaw acceleration in rad/s^2

Here's the terminal output from the given data set

terminal output

Here's a visualization of how it's performing

Visualization

Here's a visualization of the Radar's NIS

Radar's NIS

Here's a visualization of the Lidar's NIS

Lidar's NIS

Here's my UKF algorithm overview

UKF Algorithm Overview

And here's an overview of what the instantiated classes are doing

UKF Algorithm Overview 2

