appinho / SARosPerceptionKitti
License: MIT
ROS package for the Perception (Sensor Processing, Detection, Tracking and Evaluation) of the KITTI Vision Benchmark Suite
Stars: ✭ 193
Projects that are alternatives of or similar to SARosPerceptionKitti
Mask rcnn ros
The ROS Package of Mask R-CNN for Object Detection and Segmentation
Stars: ✭ 53 (-72.54%)
Mutual labels: object-detection, ros, semantic-segmentation
Self Driving Golf Cart
Be Driven 🚘
Stars: ✭ 147 (-23.83%)
Mutual labels: object-detection, ros, semantic-segmentation
Jacinto Ai Devkit
Training & Quantization of embedded friendly Deep Learning / Machine Learning / Computer Vision models
Stars: ✭ 49 (-74.61%)
Mutual labels: object-detection, semantic-segmentation
Darknet ros
YOLO ROS: Real-Time Object Detection for ROS
Stars: ✭ 1,101 (+470.47%)
Mutual labels: object-detection, ros
Raster Vision
An open source framework for deep learning on satellite and aerial imagery.
Stars: ✭ 1,248 (+546.63%)
Mutual labels: object-detection, semantic-segmentation
Ros yolo as template matching
Run 3 scripts to (1) Synthesize images (by putting few template images onto backgrounds), (2) Train YOLOv3, and (3) Detect objects for: one image, images, video, webcam, or ROS topic.
Stars: ✭ 32 (-83.42%)
Mutual labels: object-detection, ros
Awesome Semantic Segmentation
🤘 awesome-semantic-segmentation
Stars: ✭ 8,831 (+4475.65%)
Mutual labels: semantic-segmentation, evaluation
Vidvrd Helper
To keep updates with VRU Grand Challenge, please use https://github.com/NExTplusplus/VidVRD-helper
Stars: ✭ 81 (-58.03%)
Mutual labels: object-detection, evaluation
Gluon Cv
Gluon CV Toolkit
Stars: ✭ 5,001 (+2491.19%)
Mutual labels: object-detection, semantic-segmentation
Pytorch cpp
Deep Learning sample programs using PyTorch in C++
Stars: ✭ 114 (-40.93%)
Mutual labels: object-detection, semantic-segmentation
Evo
Python package for the evaluation of odometry and SLAM
Stars: ✭ 1,373 (+611.4%)
Mutual labels: ros, evaluation
Pick Place Robot
Object picking and stowing with a 6-DOF KUKA Robot using ROS
Stars: ✭ 126 (-34.72%)
Mutual labels: object-detection, ros
Dodo detector ros
Object detection from images/point cloud using ROS
Stars: ✭ 31 (-83.94%)
Mutual labels: object-detection, ros
Medicaldetectiontoolkit
The Medical Detection Toolkit contains 2D + 3D implementations of prevalent object detectors such as Mask R-CNN, Retina Net, Retina U-Net, as well as a training and inference framework focused on dealing with medical images.
Stars: ✭ 917 (+375.13%)
Mutual labels: object-detection, semantic-segmentation
Efficientdet Pytorch
A PyTorch impl of EfficientDet faithful to the original Google impl w/ ported weights
Stars: ✭ 906 (+369.43%)
Mutual labels: object-detection, semantic-segmentation
Autonomous driving
Ros package for basic autonomous lane tracking and object detection
Stars: ✭ 67 (-65.28%)
Mutual labels: object-detection, ros
Involution
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
Stars: ✭ 252 (+30.57%)
Mutual labels: object-detection, semantic-segmentation
Edgenets
This repository contains the source code of our work on designing efficient CNNs for computer vision
Stars: ✭ 331 (+71.5%)
Mutual labels: object-detection, semantic-segmentation
Frostnet
FrostNet: Towards Quantization-Aware Network Architecture Search
Stars: ✭ 85 (-55.96%)
Mutual labels: object-detection, semantic-segmentation
Paz
Hierarchical perception library in Python for pose estimation, object detection, instance segmentation, keypoint estimation, face recognition, etc.
Stars: ✭ 131 (-32.12%)
Mutual labels: object-detection, semantic-segmentation
SARosPerceptionKitti
ROS package for the Perception (Sensor Processing, Detection, Tracking and Evaluation) of the KITTI Vision Benchmark
Demo
Setup
Sticking to this folder structure is highly recommended:
~                                   # Home directory
├── catkin_ws                       # Catkin workspace
│   ├── src                         # Source folder
│   │   └── SARosPerceptionKitti    # Repo
├── kitti_data                      # Dataset
│   ├── 0012                        # Demo scenario 0012
│   │   └── synchronized_data.bag   # Synchronized ROSbag file
- Install ROS and create a catkin workspace in your home directory:
mkdir -p ~/catkin_ws/src
- Clone this repository into the catkin workspace's source folder (src) and build it:
cd ~/catkin_ws/src
git clone https://github.com/appinho/SARosPerceptionKitti.git
cd ~/catkin_ws
catkin_make
source devel/setup.bash
- Download a preprocessed scenario and unzip it into a separate kitti_data directory, also stored under your home directory:
mkdir ~/kitti_data && cd ~/kitti_data/
mv ~/Downloads/0012.zip .
unzip 0012.zip
rm 0012.zip
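To confirm the scenario was extracted correctly, the bag can be inspected with the standard rosbag tool (assumes ROS is installed and the environment is sourced; the exact topic list depends on the preprocessed KITTI bag):

```shell
# Print duration, message counts, and topics of the demo bag
rosbag info ~/kitti_data/0012/synchronized_data.bag
```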
Usage
- Launch one of the following ROS nodes to perform and visualize the pipeline (Sensor Processing -> Object Detection -> Object Tracking) step-by-step:
source devel/setup.bash
roslaunch sensor_processing sensor_processing.launch home_dir:=/home/YOUR_USERNAME
roslaunch detection detection.launch home_dir:=/home/YOUR_USERNAME
roslaunch tracking tracking.launch home_dir:=/home/YOUR_USERNAME
- Default parameters:
- scenario:=0012
- speed:=0.2
- delay:=3
Without overriding any of the parameters above, the demo scenario 0012 is replayed at 20% of its original speed with a 3 second delay so that RViz has enough time to boot up.
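The defaults can be overridden as regular roslaunch arguments. For example, a sketch of replaying the same scenario at full speed with no startup delay (the argument values here are illustrative):

```shell
# Full-speed replay of scenario 0012, no RViz boot-up delay
roslaunch sensor_processing sensor_processing.launch \
    home_dir:=/home/YOUR_USERNAME scenario:=0012 speed:=1.0 delay:=0
```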
- Write the results to file and evaluate them:
roslaunch evaluation evaluation.launch home_dir:=/home/YOUR_USERNAME
cd ~/catkin_ws/src/SARosPerceptionKitti/benchmark/python
python evaluate_tracking.py
Results for demo scenario 0012
| Class      | MOTA     | MOTP     | MOTAL    | MODA     | MODP     |
|------------|----------|----------|----------|----------|----------|
| Car        | 0.881119 | 0.633595 | 0.881119 | 0.881119 | 0.642273 |
| Pedestrian | 0.546875 | 0.677919 | 0.546875 | 0.546875 | 0.836921 |
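As a guide to reading the table: MOTA (Multi-Object Tracking Accuracy) folds misses, false positives, and identity switches into a single score via the standard CLEAR-MOT formula. A minimal sketch (the helper and the example counts are illustrative, not the actual statistics from this run):

```python
def mota(false_negatives, false_positives, id_switches, num_ground_truth):
    """CLEAR-MOT accuracy: 1 minus the total error rate over ground-truth objects."""
    errors = false_negatives + false_positives + id_switches
    return 1.0 - errors / num_ground_truth

# Hypothetical counts: 15 misses, 2 false positives, 0 switches over 143 boxes
print(round(mota(15, 2, 0, 143), 6))  # → 0.881119, matching the Car row's MOTA
```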
Contact
If you have any questions, things you would love to add, or ideas on how to realize the points in the Area of Improvements, send me an email at [email protected]! I am more than interested in collaborating and hearing any kind of feedback.