ucas-vg / Tinybenchmark
License: MIT
Scale Match for Tiny Person Detection (WACV 2020), official link of the dataset
Stars: ✭ 364
Programming Languages
python
139335 projects - #7 most used programming language
Labels
Projects that are alternatives of or similar to Tinybenchmark
Crowdsec
CrowdSec - the open-source and participative IPS able to analyze visitor behavior & provide an adapted response to all kinds of attacks. It also leverages the crowd power to generate a global CTI database to protect the user network.
Stars: ✭ 4,204 (+1054.95%)
Mutual labels: detection
Vott
Visual Object Tagging Tool: An electron app for building end to end Object Detection Models from Images and Videos.
Stars: ✭ 3,684 (+912.09%)
Mutual labels: detection
Sqli Hunter
SQLi-Hunter is a simple HTTP / HTTPS proxy server and a SQLMAP API wrapper that makes digging SQLi easy.
Stars: ✭ 340 (-6.59%)
Mutual labels: detection
Cvpods
All-in-one Toolbox for Computer Vision Research.
Stars: ✭ 277 (-23.9%)
Mutual labels: detection
Tensorflow 2.x Yolov3
YOLOv3 implementation in TensorFlow 2.3.1
Stars: ✭ 300 (-17.58%)
Mutual labels: detection
Php Opencv Examples
Tutorial for computer vision and machine learning in PHP 7/8 by opencv (installation + examples + documentation)
Stars: ✭ 333 (-8.52%)
Mutual labels: detection
Detection
ASP.NET Core Detection with Responsive View for identifying details about client device, browser, engine, platform, & crawler. Responsive middleware for routing base upon request client device detection to specific view.
Stars: ✭ 335 (-7.97%)
Mutual labels: detection
Android Object Detection
☕️ Fast-RCNN and Scene Recognition using Caffe
Stars: ✭ 295 (-18.96%)
Mutual labels: detection
Text Image Augmentation
Geometric Augmentation for Text Image
Stars: ✭ 333 (-8.52%)
Mutual labels: detection
Faster rcnn for dota
Code used for training Faster R-CNN on DOTA
Stars: ✭ 283 (-22.25%)
Mutual labels: detection
Fingerprintjs
Browser fingerprinting library with the highest accuracy and stability.
Stars: ✭ 15,481 (+4153.02%)
Mutual labels: detection
Rectlabel Support
RectLabel - An image annotation tool to label images for bounding box object detection and segmentation.
Stars: ✭ 338 (-7.14%)
Mutual labels: detection
Scale Match for Tiny Person Detection
[paper] [ECCVW] [challenge] [ECCVW summary]
The annotations of the test set have been released!
Baidu Yun (password: pmcq) and Google Drive
For how to use the test-set annotations for evaluation, please see Evaluation.
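The annotation files follow COCO-style JSON (an assumption based on the maskrcnn_benchmark toolchain the repo builds on; the real TinyPerson files may carry extra fields). A minimal sketch of reading such a file and grouping boxes per image, with all file content invented for illustration:

```python
import json

# Hypothetical COCO-style annotation content, for illustration only.
ann_json = json.dumps({
    "images": [{"id": 1, "file_name": "beach_01.jpg", "width": 1920, "height": 1080}],
    "annotations": [
        {"id": 1, "image_id": 1, "category_id": 1, "bbox": [412.0, 300.0, 6.0, 14.0]},
        {"id": 2, "image_id": 1, "category_id": 1, "bbox": [900.0, 510.0, 5.0, 11.0]},
    ],
    "categories": [{"id": 1, "name": "person"}],
})

ann = json.loads(ann_json)

# Group ground-truth boxes by image id, as an evaluator would.
boxes_per_image = {}
for a in ann["annotations"]:
    boxes_per_image.setdefault(a["image_id"], []).append(a["bbox"])

print(len(boxes_per_image[1]))  # number of person boxes on image 1
```

For the official metrics, the repository's Evaluation instructions should be followed; this only shows the expected annotation shape.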
TODO list
- add a tutorial on how to train on TinyPerson with Scale Match on COCO
- add a tutorial on how to train on other datasets
- add a tutorial on how to train a strong baseline for the competition
Dataset
TinyPerson Dataset
The dataset is used for the ECCV 2020 workshop RLQ-TOD'20 @ ECCV, TOD challenge.
Download link:
Official Site: recommended, download may be faster
Baidu Pan (password: pmcq)
Google Drive
For more details about TinyPerson dataset, please see Dataset.
Tiny Citypersons
Baidu Pan (password: vwq2)
Tiny Benchmark
The benchmark is based on maskrcnn_benchmark and citypersons code.
For more details about the benchmark, please see Tiny Benchmark.
Scale Match
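As a rough illustration of the paper's core idea (not the repository's actual implementation): Scale Match rescales an extra dataset such as COCO so that its object-size distribution matches that of TinyPerson. The sketch below, with all function names and the toy size distribution invented for illustration, computes the resize factor for one image by sampling a target object size:

```python
# Hedged sketch of the Scale Match idea: resize a source image so that
# its mean object size matches a size sampled from the target dataset's
# empirical object-size distribution. Names here are illustrative, not
# the repository's actual API.
import math
import random

def object_size(box):
    """sqrt(w * h) of a bounding box (x, y, w, h)."""
    _, _, w, h = box
    return math.sqrt(w * h)

def scale_match_factor(src_boxes, target_sizes, rng=None):
    """Resize factor mapping the mean object size of src_boxes to a
    size drawn from target_sizes (e.g. sizes observed in TinyPerson)."""
    rng = rng or random.Random()
    mean_src = sum(object_size(b) for b in src_boxes) / len(src_boxes)
    sampled = rng.choice(target_sizes)  # sample from the target distribution
    return sampled / mean_src

# Example: COCO-like boxes shrunk toward tiny-person sizes.
rng = random.Random(0)
src = [(10, 10, 100, 200), (50, 60, 80, 160)]
tiny_sizes = [8.0, 12.0, 16.0, 20.0]  # toy target size distribution
factor = scale_match_factor(src, tiny_sizes, rng)
print(factor)  # well below 1: the image is shrunk
```

The image and its boxes would then be resized by `factor` before training; the paper's full method works with the whole size histogram rather than a single mean, so treat this only as a sketch of the direction of the transform.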
Citation
If you use the code and benchmark in your research, please cite:
@inproceedings{yu2020scale,
title={Scale Match for Tiny Person Detection},
author={Yu, Xuehui and Gong, Yuqi and Jiang, Nan and Ye, Qixiang and Han, Zhenjun},
booktitle={The IEEE Winter Conference on Applications of Computer Vision},
pages={1257--1265},
year={2020}
}
If the ECCVW challenge summary helps your research, please also cite:
@article{yu20201st,
title={The 1st Tiny Object Detection Challenge: Methods and Results},
author={Yu, Xuehui and Han, Zhenjun and Gong, Yuqi and Jiang, Nan and Zhao, Jian and Ye, Qixiang and Chen, Jie and Feng, Yuan and Zhang, Bin and Wang, Xiaodi and others},
journal={arXiv preprint arXiv:2009.07506},
year={2020}
}
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].