
yizhou-wang / cruw-devkit

License: MIT
Development kit for the CRUW dataset

Programming Languages

Python
139335 projects - #7 most used programming language
Jupyter Notebook
11667 projects
Shell
77523 projects

Projects that are alternatives of or similar to cruw-devkit

MotionNet
CVPR 2020, "MotionNet: Joint Perception and Motion Prediction for Autonomous Driving Based on Bird's Eye View Maps"
Stars: ✭ 141 (+422.22%)
Mutual labels:  autonomous-driving
Hybrid-A-Star-U-Turn-Solution
Autonomous driving trajectory planning solution for U-Turn scenario
Stars: ✭ 75 (+177.78%)
Mutual labels:  autonomous-driving
patchwork
Official page of Patchwork (RA-L'21 w/ IROS'21)
Stars: ✭ 174 (+544.44%)
Mutual labels:  autonomous-driving
efficient online learning
Efficient Online Transfer Learning for 3D Object Detection in Autonomous Driving
Stars: ✭ 20 (-25.93%)
Mutual labels:  autonomous-driving
racing dreamer
Latent Imagination Facilitates Zero-Shot Transfer in Autonomous Racing
Stars: ✭ 31 (+14.81%)
Mutual labels:  autonomous-driving
lane-detection
Lane detection MATLAB code for Kalman Filter book chapter: Lane Detection
Stars: ✭ 21 (-22.22%)
Mutual labels:  autonomous-driving
Carla-ppo
This repository hosts a customized PPO-based agent for Carla. The goal of this project is to make it easier to interact with and experiment in Carla with reinforcement-learning-based agents by wrapping Carla in a gym-like environment that can handle custom reward functions, custom debug output, etc.
Stars: ✭ 122 (+351.85%)
Mutual labels:  autonomous-driving
opendlv
OpenDLV - A modern microservice-based software ecosystem powered by libcluon to make vehicles autonomous.
Stars: ✭ 67 (+148.15%)
Mutual labels:  autonomous-driving
AutonomousDriving
Java autonomous driving application. Real-time video detection of cars and pedestrians
Stars: ✭ 51 (+88.89%)
Mutual labels:  autonomous-driving
copilot
Lane and obstacle detection for active assistance during driving. Uses windowed sweep for lane detection. Combination of object tracking and YOLO for obstacles. Determines lane change, relative velocity and time to collision
Stars: ✭ 95 (+251.85%)
Mutual labels:  autonomous-driving
carla-data-export
A simple tool for generating training data from the Carla driving simulator
Stars: ✭ 47 (+74.07%)
Mutual labels:  autonomous-driving
conde simulator
Autonomous Driving Simulator for the Portuguese Robotics Open
Stars: ✭ 31 (+14.81%)
Mutual labels:  autonomous-driving
awesome-3d-multi-object-tracking-autonomous-driving
A summary and list of open-source 3D multi-object tracking methods and datasets.
Stars: ✭ 16 (-40.74%)
Mutual labels:  autonomous-driving
YOLOP
You Only Look Once for Panoptic Driving Perception. (https://arxiv.org/abs/2108.11250)
Stars: ✭ 1,228 (+4448.15%)
Mutual labels:  autonomous-driving
ad-xolib
C++ library for Parsing OpenScenario (1.1.1) & OpenDrive files (1.7) ASAM Specifications
Stars: ✭ 56 (+107.41%)
Mutual labels:  autonomous-driving
pillar-motion
Self-Supervised Pillar Motion Learning for Autonomous Driving (CVPR 2021)
Stars: ✭ 98 (+262.96%)
Mutual labels:  autonomous-driving
PyLidar3
PyLidar3 is a Python 3 package to get data from Lidar devices from various manufacturers.
Stars: ✭ 35 (+29.63%)
Mutual labels:  autonomous-driving
OpenHDMap
An open HD map production process for autonomous car simulation
Stars: ✭ 152 (+462.96%)
Mutual labels:  autonomous-driving
Hierarchical-Decision-Making-for-Autonomous-Driving
Rich literature review and discussion on the implementation of "Hierarchical Decision-Making for Autonomous Driving"
Stars: ✭ 38 (+40.74%)
Mutual labels:  autonomous-driving
rtron
r:trån is a road space model transformer library for OpenDRIVE, CityGML and beyond
Stars: ✭ 26 (-3.7%)
Mutual labels:  autonomous-driving

CRUW devkit

The cruw-devkit package is a toolkit for the CRUW dataset, including sensor configurations, sensor calibration parameters, mapping functions, metadata, visualization tools, etc. More components are still in development.

Please refer to our dataset website for more information about the CRUW Dataset.

This repository is maintained by Yizhou Wang. Feel free to raise issues and help improve this repository.

Acknowledgment for CRUW dataset

ACADEMIC OR NON-PROFIT ORGANIZATION NONCOMMERCIAL RESEARCH USE ONLY

This research is mainly conducted by the Information Processing Lab (IPL) at the University of Washington. It was partially supported by CMMB Vision – UWECE Center on Satellite Multimedia and Connected Vehicles. We would also like to thank the colleagues and students in IPL for their help and assistance with the dataset collection, processing, and annotation work.

News

Changelog

  • [2022/02/02] Add dataset script and fix installation bugs.
  • [2022/01/28] v1.1: add coordinate transforms and other utils.
  • [2022/01/27] Handle unavailable camera images in the ROD2021 testing set.
  • [2022/01/25] Add functions to transfer RF images/labels from polar to Cartesian coordinates.
  • [2021/12/03] Add evaluation for RODNet-format results.
  • [2021/11/09] Add sensor config files.
  • [2021/11/01] Add some utility functions for evaluation.
  • [2021/01/18] v1.0: stable version for the ROD2021 Challenge.

Installation

Create a new conda environment. The devkit is tested under Python 3.6, 3.7, and 3.8.

conda create -n cruw-devkit python=3.*

Run the setup tool for this devkit.

conda activate cruw-devkit
pip install .     # regular install
pip install -e .  # editable install for development

Tutorials

Tutorials on using the cruw-devkit package are listed in the tutorial folder.

  • For ROD2021 Challenge: Open In Colab

Annotation Format

ROD2021 Dataset

Each of the 40 training sequences has a txt object annotation file. Each line in a training-set txt file follows the format:

  frame_id range(m) azimuth(rad) class_name
  ...
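As a hedged sketch (not part of the devkit API), the per-line format above can be parsed with a few lines of Python; the function name and return structure here are illustrative:

```python
def parse_rod2021_annotations(txt_path):
    """Parse a ROD2021 annotation txt file.

    Each line follows "frame_id range(m) azimuth(rad) class_name".
    Returns one dict per annotated object.
    """
    objects = []
    with open(txt_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4:
                continue  # skip empty or malformed lines
            frame_id, range_m, azimuth_rad, class_name = parts
            objects.append({
                "frame_id": int(frame_id),
                "range_m": float(range_m),
                "azimuth_rad": float(azimuth_rad),
                "class_name": class_name,
            })
    return objects
```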

General CRUW Dataset

For each sequence, a json file is provided with the annotations:

{
  "dataset": "CRUW",
  "date_collect": "2019_09_29",
  "seq_name": "2019_09_29_onrd000",
  "n_frames": 1694,
  "fps": 30,
  "sensors": "C2R2",                  // <str>: "C1R1", "C2R1", "C2R2"
  "view": "front",                    // <str>: "front", "right-side"
  "setup": "vehicle",                 // <str>: "cart", "vehicle"
  "metadata": [
    {  // metadata for each frame
      "frame_id": 0,
      "cam_0": {
        "folder_name": "images_0",
        "frame_name": "0000000000.jpg",
        "width": 1440,
        "height": 864,
        "n_objects": 5,
        "obj_info": {
          "anno_source": "human",     // <str>: "human", "mrcnn", etc.
          "categories": [],           // <str> [n_objects]: category names
          "bboxes": [],               // <int> [n_objects, 4]: xywh
          "scores": [],               // <float> [n_objects]: confidence scores [0, 1]
          "masks": [],                // <rle_code> [n_objects]: instance masks
          "visibilities": [],         // <float> [n_objects]: [0, 1]
          "truncations": [],          // <float> [n_objects]: [0, 1]
          "translations": []          // <float> [n_objects, 3]: xyz(m)
        }
      },
      "cam_1": {
        "folder_name": "images_1",
        "frame_name": "0000000000.jpg",
        "width": 1440,
        "height": 864,
        "n_objects": 5,
        "obj_info": {
          "anno_source": "human",     // <str>: "human", "mrcnn", etc.
          "categories": [],           // <str> [n_objects]: category names
          "bboxes": [],               // <int> [n_objects, 4]: xywh
          "scores": [],               // <float> [n_objects]: confidence scores [0, 1]
          "masks": [],                // <rle_code> [n_objects]: instance masks
          "visibilities": [],         // <float> [n_objects]: [0, 1]
          "truncations": [],          // <float> [n_objects]: [0, 1]
          "translations": []          // <float> [n_objects, 3]: xyz(m)
        }
      },
      "radar_h": {
        "folder_name": "radar_chirps_win_RISEP_h",
        "frame_name": "000000.npy",
        "range": 128,
        "azimuth": 128,
        "n_chirps": 255,
        "n_objects": 3,
        "obj_info": {
          "anno_source": "human",     // <str>: "human", "co", "crf", etc.
          "categories": [],           // <str> [n_objects]: category names
          "centers": [],              // <float> [n_objects, 2]: range(m), azimuth(rad)
          "center_ids": [],           // <int> [n_objects, 2]: range indices, azimuth indices
          "scores": []                // <float> [n_objects]: confidence scores [0, 1]
        }
      },
      "radar_v": {
        "folder_name": "radar_chirps_win_RISEP_v",
        "frame_name": "000000.npy",
        "range": 128,
        "azimuth": 128,
        "n_chirps": 255,
        "n_objects": 3,
        "obj_info": {
          "anno_source": "human",     // <str>: "human", "co", "crf", etc.
          "categories": [],           // <str> [n_objects]: category names
          "centers": [],              // <float> [n_objects, 2]: range(m), azimuth(rad)
          "center_ids": [],           // <int> [n_objects, 2]: range indices, azimuth indices
          "scores": []                // <float> [n_objects]: confidence scores [0, 1]
        }
      }
    },
    {...}
  ]
}
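As an illustrative sketch under the schema above (the function name is hypothetical, not a devkit API; the field names come from the schema), the per-frame radar annotations can be read with the standard json module:

```python
import json

def load_radar_centers(json_path, radar_key="radar_h"):
    """Yield (frame_id, category, (range_m, azimuth_rad)) for each
    annotated radar object in a CRUW sequence annotation json file."""
    with open(json_path) as f:
        anno = json.load(f)
    for frame in anno["metadata"]:
        obj_info = frame[radar_key]["obj_info"]
        for category, center in zip(obj_info["categories"], obj_info["centers"]):
            yield frame["frame_id"], category, tuple(center)
```

Passing radar_key="radar_v" instead would read the vertical radar annotations, since both blocks share the same obj_info layout.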