
JunshengFu / Tracking With Extended Kalman Filter

License: MIT
Object (e.g. pedestrians, vehicles) tracking with an Extended Kalman Filter (EKF), using fused data from both lidar and radar sensors.

Projects that are alternatives of or similar to Tracking With Extended Kalman Filter

CarND-Extended-Kalman-Filter-P6
Self Driving Car Project 6 - Sensor Fusion(Extended Kalman Filter)
Stars: ✭ 24 (-93.89%)
Mutual labels:  radar, lidar
Fusion Ukf
An unscented Kalman Filter implementation for fusing lidar and radar sensor measurements.
Stars: ✭ 162 (-58.78%)
Mutual labels:  lidar, radar
fusion-ekf
An extended Kalman Filter implementation in C++ for fusing lidar and radar sensor measurements.
Stars: ✭ 113 (-71.25%)
Mutual labels:  radar, lidar
OpenMaterial
3D model exchange format with physical material properties for virtual development, test and validation of automated driving.
Stars: ✭ 23 (-94.15%)
Mutual labels:  radar, lidar
sensor-fusion
Filters: KF, EKF, UKF || Process Models: CV, CTRV || Measurement Models: Radar, Lidar
Stars: ✭ 96 (-75.57%)
Mutual labels:  radar, lidar
lidar transfer
Code for Langer et al. "Domain Transfer for Semantic Segmentation of LiDAR Data using Deep Neural Networks", IROS, 2020.
Stars: ✭ 54 (-86.26%)
Mutual labels:  lidar
Open3d Ml
An extension of Open3D to address 3D Machine Learning tasks
Stars: ✭ 284 (-27.74%)
Mutual labels:  lidar
radar-ml
Detect (classify and localize) people, pets and objects using millimeter-wave radar.
Stars: ✭ 47 (-88.04%)
Mutual labels:  radar
AO-Radar
Albion Online Gathering And Player Radar
Stars: ✭ 16 (-95.93%)
Mutual labels:  radar
Interactive slam
Interactive Map Correction for 3D Graph SLAM
Stars: ✭ 372 (-5.34%)
Mutual labels:  lidar
Hdl localization
Real-time 3D localization using a (velodyne) 3D LIDAR
Stars: ✭ 332 (-15.52%)
Mutual labels:  lidar
Sparse Depth Completion
Predict dense depth maps from sparse and noisy LiDAR frames guided by RGB images. (Ranked 1st place on KITTI)
Stars: ✭ 272 (-30.79%)
Mutual labels:  lidar
Pandar40 SDK
Development kit for Pandar40
Stars: ✭ 20 (-94.91%)
Mutual labels:  lidar
Overlapnet
OverlapNet - Loop Closing for 3D LiDAR-based SLAM (chen2020rss)
Stars: ✭ 299 (-23.92%)
Mutual labels:  lidar
Pandora SDK
Development kit for Pandora
Stars: ✭ 14 (-96.44%)
Mutual labels:  lidar
Sc Lego Loam
LiDAR SLAM: Scan Context + LeGO-LOAM
Stars: ✭ 332 (-15.52%)
Mutual labels:  lidar
bom-radar-card
A rain radar card using the new tiled images from the Australian BOM
Stars: ✭ 52 (-86.77%)
Mutual labels:  radar
3d cnn tensorflow
KITTI data processing and 3D CNN for Vehicle Detection
Stars: ✭ 266 (-32.32%)
Mutual labels:  lidar
Lidr
R package for airborne LiDAR data manipulation and visualisation for forestry application
Stars: ✭ 310 (-21.12%)
Mutual labels:  lidar
3dfier
The open-source tool for creating of 3D models
Stars: ✭ 260 (-33.84%)
Mutual labels:  lidar

Object Tracking with Sensor Fusion-based Extended Kalman Filter

Objective

Utilize sensor data from both LIDAR and RADAR measurements for object (e.g. pedestrians, vehicles, or other moving objects) tracking with an Extended Kalman Filter.

Demo: Object tracking with both LIDAR and RADAR measurements

gif_demo1

In this demo, the blue car is the object to be tracked, but the tracked object can be of any type, e.g. a pedestrian, a vehicle, or another moving object. We continuously receive both LIDAR (red circle) and RADAR (blue circle) measurements of the car's location in the defined coordinate system, but there may be noise and errors in the data, so we need a way to fuse the two types of sensor measurements to estimate the location of the tracked object.

Therefore, we use an Extended Kalman Filter to compute the estimated location (green triangle) of the blue car. The estimated trajectory (green triangles) is compared with the ground-truth trajectory of the blue car, and the error is displayed as RMSE in real time.

In the autonomous driving case, a self-driving car obtains both lidar and radar measurements of the objects to be tracked, and then applies the Extended Kalman Filter to track them based on the two types of sensor data.


Code & Files

1. Dependencies & environment

2. My project files

(Note: the hyperlinks only work if you are on the homepage of this GitHub repo; if you are viewing it on "github.io", you can be redirected by clicking View the Project on GitHub at the top.)

  • CMakeLists.txt is the CMake file.

  • data folder contains test lidar and radar measurements.

  • Docs folder contains documents which describe the data.

  • src folder contains the source code.

3. Code Style

4. How to run the code

  1. Clone this repo.
  2. Make a build directory: mkdir build && cd build
  3. Compile: cmake .. && make
    • On Windows, you may need to run: cmake .. -G "Unix Makefiles" && make
  4. Run it by either of the following commands:
    • ./ExtendedKF ../data/obj_pose-laser-radar-synthetic-input.txt ./output.txt
    • ./ExtendedKF ../data/sample-laser-radar-measurement-data-1.txt ./output.txt

5. Release History

  • 0.2.1

    • Docs: Add a sample video for vehicle tracking
    • Date 3 May 2017
  • 0.2.0

    • Fix: Normalize the angle for EKF updates with Radar
    • Fix: Initialize several variables
    • Date 2 May 2017
  • 0.1.1

    • First proper release
    • Date 1 May 2017
  • 0.1.0

    • Initiate the repo and add the functionality of pedestrian tracking with lidar data.
    • Date 28 April 2017

System details

1. Demos

Demo 1: Tracking with both LIDAR and RADAR measurements

In this demo, both LIDAR and RADAR measurements are used for object tracking.

gif_demo1

Demo 2: Tracking with only LIDAR measurements

In this demo, only LIDAR measurements are used for the object tracking.

gif_demo2

Demo 3: Tracking with only RADAR measurements

In this demo, only RADAR measurements are used for object tracking; they are noisier than the LIDAR measurements.

gif_demo3

From these three demos, we can see that:

  • RADAR measurements tend to be noisier than LIDAR measurements.
  • Extended Kalman Filter tracking that fuses measurements from both LIDAR and RADAR reduces the noise/errors in the sensor measurements and provides robust estimates of the tracked object's location.

Note: the advantage of RADAR is that it can estimate the object's speed directly via the Doppler effect.

2. What does a LIDAR measurement look like?

LIDAR produces 3D measurements (px, py, pz). For driving on a road, however, we can simplify the pose of the tracked object to px, py, and one rotation; in other words, px and py indicate the object's position and one rotation indicates its orientation. In the real world, on very steep roads you have to consider the z-axis as well, and in applications such as airplanes and drones you certainly want to consider pz too.

3. What does a RADAR measurement look like?

4. Comparison of LIDAR, RADAR and Camera

Sensor type                       LIDAR    RADAR   Camera
Resolution                        median   low     high
Direct velocity measurement       no       yes     no
All-weather performance           bad      good    bad
Sensor size                       large    small   small
Senses non-line-of-sight objects  no       yes     no

Note:

  • LIDAR wavelengths are in the infrared; RADAR wavelengths are in the millimeter range.
  • LIDAR is the most affected by dirt and small debris.

A comparison figure from another perspective.

5. How does the Extended Kalman Filter work?

6. Extended Kalman Filter vs. Kalman Filter

  • x is the mean state vector.
  • F is the state transition function.
  • P is the state covariance matrix, indicating the uncertainty of the object's state.
  • u is the process noise, which is a Gaussian with zero mean and covariance as Q.
  • Q is the covariance matrix of the process noise.

  • y is the innovation term, i.e. the difference between the measurement and the prediction. To compute the innovation term, we transform the state into measurement space with the measurement function, so that the measurement and the prediction can be compared directly.
  • S is the predicted measurement covariance matrix, or named innovation covariance matrix.
  • H is the measurement function.
  • z is the measurement.
  • R is the covariance matrix of the measurement noise.
  • I is the identity matrix.
  • K is the Kalman filter gain.
  • Hj and Fj are the Jacobian matrices.

All Kalman filters have the same three steps:

  1. Initialization
  2. Prediction
  3. Update

A standard Kalman filter can only handle linear equations. Both the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF, which will be discussed in the next project) allow you to use non-linear equations; the difference between the EKF and the UKF is how they handle them. The Extended Kalman Filter uses the Jacobian matrix to linearize non-linear functions, whereas the Unscented Kalman Filter does not linearize at all and instead propagates representative points (sigma points) drawn from a Gaussian distribution.
