RalphMao / VMetrics

A Python library to evaluate mean Average Precision (mAP) for object detection. Provides the same output as PASCAL VOC's MATLAB code.


VMetrics

This repo provides the evaluation code used in our ICCV 2019 paper A Delay Metric for Video Object Detection: What Average Precision Fails to Tell, including:

  • Mean Average Precision (mAP)
  • Average Delay (AD); a rough sketch of the computation follows this list
  • A redesigned NAB metric for the video object detection problem.
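
For intuition, here is a minimal Python sketch of the Average Delay idea. It is not the code shipped in this repo: the data layout, function names, and the fixed confidence/IoU thresholds are illustrative assumptions, and the metric defined in the paper treats confidence thresholds more carefully than a single fixed cut-off. The idea is to count, for each ground-truth instance, how many of its frames pass before the first sufficiently confident detection overlaps it, and then average over instances.

    # Conceptual sketch only. Assumed per-class, per-video inputs:
    #   gt_tracks:  {track_id: [(frame_id, box), ...]} sorted by frame_id
    #   detections: {frame_id: [(score, box), ...]}
    # Boxes are (xmin, ymin, xmax, ymax).

    def iou(a, b):
        # Intersection-over-union of two axis-aligned boxes.
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter + 1e-12)

    def average_delay(gt_tracks, detections, score_thr=0.5, iou_thr=0.5):
        delays = []
        for frames in gt_tracks.values():
            # Delay = number of track frames before the first confident match;
            # an instance that is never detected is charged its full length.
            delay = len(frames)
            for offset, (frame_id, gt_box) in enumerate(frames):
                if any(score >= score_thr and iou(box, gt_box) >= iou_thr
                       for score, box in detections.get(frame_id, [])):
                    delay = offset
                    break
            delays.append(delay)
        return sum(delays) / max(len(delays), 1)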

Prepare the data

Download the groundtruth annotations and the sample detector outputs from Google Drive.

The ground-truth annotations of VIDT are stored in KITTI format because of its simplicity and I/O efficiency.
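
If you want to inspect the annotations yourself, the sketch below shows one way to parse a KITTI-tracking-style label file. It assumes the standard KITTI tracking column order (frame, track id, class, truncation, occlusion, alpha, xmin, ymin, xmax, ymax, ...); whether the converted annotations follow exactly this layout is an assumption, so adjust the indices if the downloaded files differ.

    def load_kitti_tracking_labels(path):
        # Returns {frame_id: [(track_id, class_name, (xmin, ymin, xmax, ymax)), ...]}.
        frames = {}
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) < 10:
                    continue  # skip empty or malformed lines
                frame_id, track_id = int(fields[0]), int(fields[1])
                class_name = fields[2]
                box = tuple(float(v) for v in fields[6:10])
                frames.setdefault(frame_id, []).append((track_id, class_name, box))
        return frames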

We provide the outputs of the following methods. The GitHub repos that generated those outputs are also listed.

Run evaluation

All the evaluation scripts are under the ./experiments folder. For instance, to measure the mAP and AD of FGFA, run:

python experiments/eval_map_ad.py examples/rfcn_fgfa_7 data/ILSVRC2015_KITTI_FORMAT

Evaluate your own detector

For every video sequence, output a file named <sequence_name>.txt. Each line in the file should describe a single object in the format <frame_id> <class_id> <confidence> <xmin> <ymin> <xmax> <ymax>.
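
A minimal helper for dumping detections in this format might look like the sketch below; the function name and the results-directory layout are illustrative, not part of this repo.

    import os

    def write_sequence_results(out_dir, sequence_name, detections):
        # detections: iterable of (frame_id, class_id, confidence,
        # xmin, ymin, xmax, ymax) tuples for one video sequence.
        os.makedirs(out_dir, exist_ok=True)
        with open(os.path.join(out_dir, sequence_name + ".txt"), "w") as f:
            for frame_id, class_id, conf, xmin, ymin, xmax, ymax in detections:
                f.write("%d %d %.4f %.1f %.1f %.1f %.1f\n"
                        % (frame_id, class_id, conf, xmin, ymin, xmax, ymax))

The resulting directory of per-sequence files can then be passed to the evaluation scripts in the same way as the sample outputs above (results directory first, data directory second).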

Acknowledgement

This pure-Python mAP evaluation code is refactored from Cartucho/mAP. It has been tested against the original MATLAB version.
