
PRBonn / point-cloud-prediction

License: MIT
Self-supervised Point Cloud Prediction Using 3D Spatio-temporal Convolutional Networks

Programming Languages

python

Projects that are alternatives of or similar to point-cloud-prediction

Displaz.jl
Julia bindings for the displaz lidar viewer
Stars: ✭ 16 (-83.51%)
Mutual labels:  point-cloud, lidar
verif
Software for verifying weather forecasts
Stars: ✭ 70 (-27.84%)
Mutual labels:  prediction, forecasting
forecastVeg
A Machine Learning Approach to Forecasting Remotely Sensed Vegetation Health in Python
Stars: ✭ 44 (-54.64%)
Mutual labels:  prediction, forecasting
Liblas
C++ library and programs for reading and writing ASPRS LAS format with LiDAR data
Stars: ✭ 211 (+117.53%)
Mutual labels:  point-cloud, lidar
ECCV-2020-point-cloud-analysis
ECCV 2020 papers focusing on point cloud analysis
Stars: ✭ 22 (-77.32%)
Mutual labels:  point-cloud, point
Pclpy
Python bindings for the Point Cloud Library (PCL)
Stars: ✭ 212 (+118.56%)
Mutual labels:  point-cloud, lidar
Sales-Prediction
In-depth analysis and forecasting of product sales based on items, stores, transactions, and other dependent variables like holidays and oil prices.
Stars: ✭ 56 (-42.27%)
Mutual labels:  prediction, forecasting
Lidar camera calibration
Light-weight camera LiDAR calibration package for ROS using OpenCV and PCL (PnP + LM optimization)
Stars: ✭ 133 (+37.11%)
Mutual labels:  point-cloud, lidar
LiDAR fog sim
LiDAR fog simulation
Stars: ✭ 101 (+4.12%)
Mutual labels:  point-cloud, lidar
pole-localization
Online Range Image-based Pole Extractor for Long-term LiDAR Localization in Urban Environments
Stars: ✭ 107 (+10.31%)
Mutual labels:  lidar, range-image
Displaz
A hackable lidar viewer
Stars: ✭ 177 (+82.47%)
Mutual labels:  point-cloud, lidar
Python-for-Remote-Sensing
Python code for remote sensing applications will be uploaded here. I will try to teach everything I learn during my projects here.
Stars: ✭ 20 (-79.38%)
Mutual labels:  point-cloud, lidar
Vision3d
Research platform for 3D object detection in PyTorch.
Stars: ✭ 177 (+82.47%)
Mutual labels:  point-cloud, lidar
Spvnas
[ECCV 2020] Searching Efficient 3D Architectures with Sparse Point-Voxel Convolution
Stars: ✭ 239 (+146.39%)
Mutual labels:  point-cloud, lidar
Extrinsic lidar camera calibration
This is a package for extrinsic calibration between a 3D LiDAR and a camera, described in paper: Improvements to Target-Based 3D LiDAR to Camera Calibration. This package is used for Cassie Blue's 3D LiDAR semantic mapping and automation.
Stars: ✭ 149 (+53.61%)
Mutual labels:  point-cloud, lidar
3D Ground Segmentation
A ground segmentation algorithm for 3D point clouds based on the work described in “Fast segmentation of 3D point clouds: a paradigm on LIDAR data for Autonomous Vehicle Applications”, D. Zermas, I. Izzat and N. Papanikolopoulos, 2017. Distinguish between road and non-road points. Road surface extraction. Plane fit ground filter
Stars: ✭ 55 (-43.3%)
Mutual labels:  point-cloud, lidar
Laser Camera Calibration Toolbox
A Laser-Camera Calibration Toolbox extending from that at http://www.cs.cmu.edu/~ranjith/lcct.html
Stars: ✭ 99 (+2.06%)
Mutual labels:  point-cloud, lidar
Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+1834.02%)
Mutual labels:  point-cloud, lidar
WS3D
Official version of 'Weakly Supervised 3D object detection from Lidar Point Cloud'(ECCV2020)
Stars: ✭ 104 (+7.22%)
Mutual labels:  point-cloud, lidar
Diebold-Mariano-Test
This Python function dm_test implements the Diebold-Mariano Test (1995) to statistically test forecast accuracy equivalence for two sets of predictions, with the modification suggested by Harvey et al. (1997).
Stars: ✭ 70 (-27.84%)
Mutual labels:  prediction, forecasting

Self-supervised Point Cloud Prediction Using 3D Spatio-temporal Convolutional Networks

This is a PyTorch Lightning implementation of the paper "Self-supervised Point Cloud Prediction Using 3D Spatio-temporal Convolutional Networks".

Given a sequence of P past point clouds (left in red) at time T, the goal is to predict the F future scans (right in blue).
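
For orientation, here is a minimal, hypothetical sketch of the tensor shapes involved, assuming the point clouds are projected to 64 x 2048 range images and the 5 past / 5 future setting from the paper; the actual projection and model are implemented in pcf/.

import torch

P, F = 5, 5        # number of past / future frames used in the paper
H, W = 64, 2048    # assumed range image resolution for a 64-beam LiDAR
past_range_images = torch.rand(1, P, H, W)  # one batch of P past range images
# A trained model maps this sequence to the F future range images:
# predicted_range_images = model(past_range_images)  # expected shape: (1, F, H, W)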

Table of Contents

  1. Publication
  2. Data
  3. Installation
  4. Training
  5. Testing
  6. Visualization
  7. Download
  8. License

Overview of our architecture

Publication

If you use our code in your academic work, please cite the corresponding paper:

@inproceedings{mersch2021corl,
  author = {B. Mersch and X. Chen and J. Behley and C. Stachniss},
  title = {{Self-supervised Point Cloud Prediction Using 3D Spatio-temporal Convolutional Networks}},
  booktitle = {Proc.~of the Conf.~on Robot Learning (CoRL)},
  year = {2021},
}

Data

Download the KITTI Odometry data from the official website.

Installation

Source Code

Clone this repository and run

cd point-cloud-prediction
git submodule update --init

to install the Chamfer distance submodule. The Chamfer distance code is originally taken from here, with some modifications to use it as a submodule. All parameters are stored in config/parameters.yaml.
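
As a quick sanity check, you can inspect the parameters from Python (a minimal sketch assuming a standard YAML file; only GENERATE_FILES and the train/val/test sequence splits are referred to in this README):

import yaml

# Load the configuration used by the training and testing scripts.
with open("config/parameters.yaml") as f:
    cfg = yaml.safe_load(f)
print(cfg)  # e.g. check GENERATE_FILES and the train/val/test sequence splits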

Dependencies

In this project, we use CUDA 10.2. All other dependencies are managed with Python Poetry and can be found in the poetry.lock file. If you want to use Python Poetry (recommended), install it with:

curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py | python -

Install Python dependencies with Python Poetry

poetry install

and activate the virtual environment in the shell with

poetry shell

Export Environment Variables to dataset

We process the data in advance to speed up training. The preprocessing is automatically done if GENERATE_FILES is set to true in config/parameters.yaml. The environment variable PCF_DATA_RAW points to the directory containing the train/val/test sequences specified in the config file. It can be set with

export PCF_DATA_RAW=/path/to/kitti-odometry/dataset/sequences

and the destination of the processed files PCF_DATA_PROCESSED is set with

export PCF_DATA_PROCESSED=/desired/path/to/processed/data/
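
The following is a minimal sketch (not part of the repository) to check that both variables are set and that the raw directory follows the usual KITTI Odometry layout with one velodyne folder of .bin scans per sequence:

import os

raw_root = os.environ["PCF_DATA_RAW"]              # .../dataset/sequences
processed_root = os.environ["PCF_DATA_PROCESSED"]  # destination for processed files
os.makedirs(processed_root, exist_ok=True)

# Count the raw scans per sequence to verify the dataset is complete.
for seq in sorted(os.listdir(raw_root)):
    velodyne_dir = os.path.join(raw_root, seq, "velodyne")
    if os.path.isdir(velodyne_dir):
        n_scans = len([f for f in os.listdir(velodyne_dir) if f.endswith(".bin")])
        print(f"sequence {seq}: {n_scans} scans")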

Training

Note: If you have not pre-processed the data yet, you need to set GENERATE_FILES: True in config/parameters.yaml. After that, you can set GENERATE_FILES: False to skip this step.

The training script can be run by

python pcf/train.py

using the parameters defined in config/parameters.yaml. Pass the flag --help if you want to see more options, such as resuming from a checkpoint or initializing the weights from a pre-trained model. A directory will be created in pcf/runs, which makes it easier to distinguish between different runs and avoids overwriting existing logs. The script saves everything (the used config, logs, and checkpoints) into a path pcf/runs/COMMIT/EXPERIMENT_DATE_TIME consisting of the current git commit ID (this allows you to check out the git commit used for training), the specified experiment ID (pcf by default), and the date and time.

Example: pcf/runs/7f1f6d4/pcf_20211106_140014

7f1f6d4: Git commit ID

pcf_20211106_140014: Experiment ID, date and time
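
The run directory name is thus reproducible from the repository state. A rough sketch of how such a name can be composed (the exact code in pcf/train.py may differ):

import subprocess
from datetime import datetime

# Short hash of the current git commit, e.g. 7f1f6d4.
commit = subprocess.check_output(["git", "rev-parse", "--short", "HEAD"]).decode().strip()
experiment_id = "pcf"  # default experiment ID
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
run_dir = f"pcf/runs/{commit}/{experiment_id}_{timestamp}"
print(run_dir)  # e.g. pcf/runs/7f1f6d4/pcf_20211106_140014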

Testing

Test your model by running

python pcf/test.py -m COMMIT/EXPERIMENT_DATE_TIME

where COMMIT/EXPERIMENT_DATE_TIME is the relative path to your model in pcf/runs. Note: Use the flag -s if you want to save the predicted point clouds for visualization and -l if you want to test the model on a smaller amount of data.

Example

python pcf/test.py -m 7f1f6d4/pcf_20211106_140014

or

python pcf/test.py -m 7f1f6d4/pcf_20211106_140014 -l 5 -s

if you want to test the model on 5 batches and save the resulting point clouds.

Visualization

After passing the -s flag to the testing script, the predicted range images will be saved as .svg files in pcf/runs/COMMIT/EXPERIMENT_DATE_TIME/range_view_predictions. The predicted point clouds are saved to pcf/runs/COMMIT/EXPERIMENT_DATE_TIME/test/point_clouds. You can visualize them by running

python pcf/visualize.py -p pcf/runs/COMMIT/EXPERIMENT_DATE_TIME/test/point_clouds

Five past and five future ground truth range images, together with our five predicted future range images.

Last received point cloud at time T and the predicted next 5 future point clouds. Ground truth points are shown in red and predicted points in blue.

Download

You can download our best performing model from the paper here. Just extract the zip file into pcf/runs.

License

This project is free software made available under the MIT License. For details see the LICENSE file.
