
poodarchu / Det3d

License: apache-2.0
A general 3D object detection codebase.

Programming Languages

python

Projects that are alternatives of or similar to Det3d

Yolo3d Yolov4 Pytorch
YOLO3D: End-to-end real-time 3D Oriented Object Bounding Box Detection from LiDAR Point Cloud (ECCV 2018)
Stars: ✭ 119 (-88.39%)
Mutual labels:  object-detection, point-cloud
Frustum Pointnets
Frustum PointNets for 3D Object Detection from RGB-D Data
Stars: ✭ 1,154 (+12.59%)
Mutual labels:  object-detection, point-cloud
3d Bounding Boxes From Monocular Images
A two stage multi-modal loss model along with rigid body transformations to regress 3D bounding boxes
Stars: ✭ 24 (-97.66%)
Mutual labels:  object-detection, point-cloud
Votenet
Deep Hough Voting for 3D Object Detection in Point Clouds
Stars: ✭ 1,183 (+15.41%)
Mutual labels:  object-detection, point-cloud
Openpcdet
OpenPCDet Toolbox for LiDAR-based 3D Object Detection.
Stars: ✭ 2,199 (+114.54%)
Mutual labels:  object-detection, point-cloud
Vision3d
Research platform for 3D object detection in PyTorch.
Stars: ✭ 177 (-82.73%)
Mutual labels:  object-detection, point-cloud
Mmdetection3d
OpenMMLab's next-generation platform for general 3D object detection.
Stars: ✭ 945 (-7.8%)
Mutual labels:  object-detection, point-cloud
Albumentations
Fast image augmentation library and an easy-to-use wrapper around other libraries. Documentation: https://albumentations.ai/docs/ Paper about the library: https://www.mdpi.com/2078-2489/11/2/125
Stars: ✭ 9,353 (+812.49%)
Mutual labels:  object-detection
Flownet3d pytorch
The pytorch implementation of flownet3d: https://github.com/xingyul/flownet3d
Stars: ✭ 39 (-96.2%)
Mutual labels:  point-cloud
Simple Ssd For Beginners
This repository contains easy SSD(Single Shot MultiBox Detector) implemented with Pytorch and is easy to read and learn
Stars: ✭ 33 (-96.78%)
Mutual labels:  object-detection
Keras M2det
Keras implementation of m2det object detection.
Stars: ✭ 32 (-96.88%)
Mutual labels:  object-detection
Channel Pruning
Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
Stars: ✭ 979 (-4.49%)
Mutual labels:  object-detection
Cv Pretrained Model
A collection of computer vision pre-trained models.
Stars: ✭ 995 (-2.93%)
Mutual labels:  object-detection
Point Cloud Filter
Scripts showcasing filtering techniques applied to point cloud data.
Stars: ✭ 34 (-96.68%)
Mutual labels:  point-cloud
Realtime Detectron
Real-time Detectron using webcam.
Stars: ✭ 42 (-95.9%)
Mutual labels:  object-detection
Ros yolo as template matching
Run 3 scripts to (1) Synthesize images (by putting few template images onto backgrounds), (2) Train YOLOv3, and (3) Detect objects for: one image, images, video, webcam, or ROS topic.
Stars: ✭ 32 (-96.88%)
Mutual labels:  object-detection
Tensorflow Lite Rest Server
Expose tensorflow-lite models via a rest API
Stars: ✭ 43 (-95.8%)
Mutual labels:  object-detection
Computervision Recipes
Best Practices, code samples, and documentation for Computer Vision.
Stars: ✭ 8,214 (+701.37%)
Mutual labels:  object-detection
Unity 3mx
Load 3MX/3MXB format LOD model files generated by Bentley ContextCapture into Unity.
Stars: ✭ 38 (-96.29%)
Mutual labels:  point-cloud
Traffic Light Detector
Detect traffic lights and classify the state of them, then give the commands "go" or "stop".
Stars: ✭ 37 (-96.39%)
Mutual labels:  object-detection

Det3D

A general 3D Object Detection codebase in PyTorch.

1. Introduction

Det3D is the first 3D object detection toolbox that provides out-of-the-box implementations of many 3D object detection algorithms such as PointPillars, SECOND, and PIXOR, as well as state-of-the-art methods on major benchmarks like KITTI (ViP) and nuScenes (CBGS). Key features of Det3D include:

  • Multiple dataset support: KITTI, nuScenes, Lyft
  • Point-based and Voxel-based model zoo (see the voxelization sketch after this list)
  • State-of-the-art performance
  • DDP & SyncBN
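
All voxel-based models in the zoo (VoxelNet, SECOND, PointPillars) share the same preprocessing step: scatter raw LiDAR points into a regular grid and cap the number of points kept per cell. Below is a minimal NumPy sketch of that idea, not Det3D's actual voxel generator; the detection range, voxel size, and per-voxel point cap are illustrative defaults.

import numpy as np

def voxelize(points, voxel_size=(0.2, 0.2, 4.0),
             pc_range=(0.0, -40.0, -3.0, 70.4, 40.0, 1.0), max_points=35):
    """Group LiDAR points of shape (N, 3+) into voxels keyed by their integer
    grid coordinate, keeping at most `max_points` points per voxel.
    Illustrative only; Det3D's own voxel generator differs."""
    pc_range = np.asarray(pc_range, dtype=np.float32)
    voxel_size = np.asarray(voxel_size, dtype=np.float32)

    # Drop points outside the detection range.
    keep = np.all((points[:, :3] >= pc_range[:3]) &
                  (points[:, :3] < pc_range[3:]), axis=1)
    points = points[keep]

    # Integer voxel coordinate of every remaining point.
    coords = ((points[:, :3] - pc_range[:3]) / voxel_size).astype(np.int32)

    voxels = {}
    for point, coord in zip(points, map(tuple, coords)):
        buffer = voxels.setdefault(coord, [])
        if len(buffer) < max_points:  # cap the number of points per voxel
            buffer.append(point)
    return voxels

# Example: 10,000 random points inside the range above.
pts = np.random.uniform((0.0, -40.0, -3.0), (70.4, 40.0, 1.0), size=(10000, 3))
print(len(voxelize(pts.astype(np.float32))), "non-empty voxels")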

2. Installation

Please refer to INSTALLATION.md.

3. Quick Start

Please refer to GETTING_STARTED.md.

4. Model Zoo

4.1 nuScenes

Model          mAP    mATE   mASE   mAOE   mAVE   mAAE   NDS    ckpt
CBGS           49.9   0.335  0.256  0.323  0.251  0.197  61.3   link
PointPillars   41.8   0.363  0.264  0.377  0.288  0.198  56.0   link

The original model and prediction files are available in the CBGS README.
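
For context, NDS (the nuScenes detection score) folds mAP and the five true-positive error metrics above into one number: NDS = (5 * mAP + the sum over the five error metrics of (1 - min(1, error))) / 10. A minimal Python sketch of that formula, which reproduces the NDS column of the table:

def nds(mAP, mATE, mASE, mAOE, mAVE, mAAE):
    """nuScenes detection score: weighted sum of mAP and the five
    true-positive error metrics, each error clipped to [0, 1]."""
    tp_scores = [1.0 - min(1.0, err) for err in (mATE, mASE, mAOE, mAVE, mAAE)]
    return (5.0 * mAP + sum(tp_scores)) / 10.0

# Values from the table above (mAP expressed as a fraction).
print(round(100 * nds(0.499, 0.335, 0.256, 0.323, 0.251, 0.197), 1))  # 61.3 (CBGS)
print(round(100 * nds(0.418, 0.363, 0.264, 0.377, 0.288, 0.198), 1))  # 56.0 (PointPillars)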

4.2 KITTI

Second on KITTI(val) Dataset

car  AP @0.70, 0.70,  0.70:
bbox AP:90.54, 89.35, 88.43
bev  AP:89.89, 87.75, 86.81
3d   AP:87.96, 78.28, 76.99
aos  AP:90.34, 88.81, 87.66

PointPillars on KITTI(val) Dataset

car  AP @0.70,  0.70,  0.70:
bbox AP:90.63, 88.86, 87.35
bev  AP:89.75, 86.15, 83.00
3d   AP:85.75, 75.68, 68.93
aos  AP:90.48, 88.36, 86.58
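
In each block, the three numbers per line are the KITTI easy / moderate / hard splits for the car class at a 0.70 IoU threshold. A small sketch comparing the two models' 3D AP, with the values copied from the blocks above (the dictionary keys are just labels for this example):

# 3D AP for the car class (IoU 0.70), ordered as (easy, moderate, hard).
results_3d_ap = {
    "SECOND":       (87.96, 78.28, 76.99),
    "PointPillars": (85.75, 75.68, 68.93),
}

for idx, difficulty in enumerate(("easy", "moderate", "hard")):
    gap = results_3d_ap["SECOND"][idx] - results_3d_ap["PointPillars"][idx]
    print(f"{difficulty}: SECOND leads PointPillars by {gap:.2f} AP")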

4.3 Lyft

4.4 Waymo

5. Functionality

  • Models
    • [x] VoxelNet
    • [x] SECOND
    • [x] PointPillars
  • Features
    • [x] Multi-task learning
    • [x] Distributed Training and Validation (see the DDP + SyncBN sketch after this list)
    • [x] SyncBN
    • [x] Flexible anchor dimensions
    • [x] TensorboardX
    • [x] Checkpointing & resuming from breakpoints
    • [x] Self-contained visualization
    • [x] Finetune
    • [x] Multiscale Training & Validation
    • [x] Rotated RoI Align
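
Distributed training and SyncBN rest on standard PyTorch primitives. The following is a minimal sketch of that pattern, not Det3D's actual trainer code; the function name is a placeholder, and it assumes one process per GPU launched via torch.distributed.launch or torchrun.

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_for_distributed(model: torch.nn.Module, local_rank: int) -> torch.nn.Module:
    """Convert BatchNorm layers to SyncBatchNorm and wrap the model in DDP.
    Illustrative helper; assumes one process per GPU."""
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    # SyncBN reduces batch statistics across all processes, which matters for
    # the small per-GPU batch sizes typical of 3D detection.
    model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
    model = model.cuda(local_rank)
    return DDP(model, device_ids=[local_rank], output_device=local_rank)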

6. TODO List

  • To Be Released

    • [ ] CBGS on Lyft(val) Dataset
  • Models

    • [ ] PointRCNN
    • [ ] PIXOR

7. Call for Contributions

  • Support Waymo Dataset.
  • Add other 3D detection / segmentation models, such as VoteNet, STD, etc.

8. Developers

Benjin Zhu, Bingqi Ma

9. License

Det3D is released under the Apache 2.0 license.

10. Citation

Det3D is a derivative codebase of CBGS. If you find this work useful in your research, please consider citing:

@article{zhu2019class,
  title={Class-balanced Grouping and Sampling for Point Cloud 3D Object Detection},
  author={Zhu, Benjin and Jiang, Zhengkai and Zhou, Xiangxin and Li, Zeming and Yu, Gang},
  journal={arXiv preprint arXiv:1908.09492},
  year={2019}
}

11. Acknowledgement
