
liuwei16 / Alfnet

Code for 'Learning Efficient Single-stage Pedestrian Detectors by Asymptotic Localization Fitting' in ECCV2018

Projects that are alternatives of or similar to Alfnet

Seqface
SeqFace : Making full use of sequence information for face recognition
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Geostatsmodels
This is a collection of geostatistical scripts written in Python
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
The Data Science Workshop
A New, Interactive Approach to Learning Data Science
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
Datacamp facebook live nlp
DataCamp Facebook Live Code Along Session 1: Enjoy.
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Dive Into Machine Learning
Dive into Machine Learning with Python Jupyter notebook and scikit-learn! First posted in 2016, maintained as of 2021. Pull requests welcome.
Stars: ✭ 10,810 (+8479.37%)
Mutual labels:  jupyter-notebook
Scir Training Day
a small training program for new crews of HIT-SCIR
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Keras Mdn Layer
An MDN Layer for Keras using TensorFlow's distributions module
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Cmucomputationalphotography
Jupyter Notebooks for CMU Computational Photography Course 15.463
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
First Order Model
This repository contains the source code for the paper First Order Motion Model for Image Animation
Stars: ✭ 11,964 (+9395.24%)
Mutual labels:  jupyter-notebook
Understandingbdl
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
Pytorch Model Zoo
A collection of deep learning models implemented in PyTorch
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Skills Ml
Data Processing and Machine learning methods for the Open Skills Project
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Modular Rl
[ICML 2020] PyTorch Code for "One Policy to Control Them All: Shared Modular Policies for Agent-Agnostic Control"
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
Understanding Pytorch Batching Lstm
Understanding and visualizing PyTorch Batching with LSTM
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Teaching Monolith
Data science teaching materials
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
Choicenet
Implementation of ChoiceNet
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Python Audio
Some Jupyter notebooks about audio signal processing with Python
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Normalizing Flows
Understanding normalizing flows
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook
Meteorological Books
A collection of meteorology-related books (continuously updated)
Stars: ✭ 125 (-0.79%)
Mutual labels:  jupyter-notebook
Distance Encoding
Distance Encoding for GNN Design
Stars: ✭ 126 (+0%)
Mutual labels:  jupyter-notebook

Learning Efficient Single-stage Pedestrian Detectors by Asymptotic Localization Fitting

A Keras implementation of ALFNet, accepted at ECCV 2018.

Introduction

This work takes pedestrian detection a step forward in both speed and accuracy. Specifically, we propose a structurally simple but effective module called Asymptotic Localization Fitting (ALF), which stacks a series of predictors that evolve the default anchor boxes, step by step, into increasingly accurate detection results. As a result, the later predictors see more positive samples of better quality during training, while harder negatives can be mined with increasing IoU thresholds. On top of this module, we design an efficient single-stage pedestrian detection architecture, denoted ALFNet, which achieves state-of-the-art performance on the CityPersons and Caltech benchmarks. For more details, please refer to our paper.

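To make the ALF idea above concrete, here is a minimal NumPy sketch (not the repository's code; the predictor functions and IoU thresholds are illustrative assumptions) of how a stack of predictors can refine the default anchor boxes step by step while the IoU threshold for selecting positives grows:

import numpy as np

def iou_matrix(boxes, gts):
    # boxes: (N, 4), gts: (M, 4), both as [x1, y1, x2, y2]; returns an (N, M) IoU matrix
    x1 = np.maximum(boxes[:, None, 0], gts[None, :, 0])
    y1 = np.maximum(boxes[:, None, 1], gts[None, :, 1])
    x2 = np.minimum(boxes[:, None, 2], gts[None, :, 2])
    y2 = np.minimum(boxes[:, None, 3], gts[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    area_g = (gts[:, 2] - gts[:, 0]) * (gts[:, 3] - gts[:, 1])
    return inter / (area_b[:, None] + area_g[None, :] - inter)

def alf_cascade(anchors, gts, predictors, iou_thresholds=(0.5, 0.7)):
    # anchors: (N, 4) default boxes; predictors: one box-offset regressor per ALF step.
    # Each step refines the boxes produced by the previous step, and positives are
    # re-selected with a stricter IoU threshold, so later steps train on better samples.
    boxes = anchors
    for step, (predict, thr) in enumerate(zip(predictors, iou_thresholds), start=1):
        deltas = predict(boxes)      # hypothetical regressor output: (N, 4) offsets
        boxes = boxes + deltas       # simplified; the paper regresses encoded offsets
        positives = iou_matrix(boxes, gts).max(axis=1) >= thr
        print('step %d: %d positives at IoU >= %.2f' % (step, int(positives.sum()), thr))
    return boxes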

Dependencies

  • Python 2.7
  • NumPy
  • TensorFlow 1.x
  • Keras 2.0.6
  • OpenCV

Contents

  1. Installation
  2. Preparation
  3. Models
  4. Training
  5. Test
  6. Evaluation

Installation

  1. Get the code. We will refer to the cloned directory as '$ALFNet'.
  git clone https://github.com/liuwei16/ALFNet.git
  2. Install the requirements.
  pip install -r requirements.txt

Preparation

  1. Download the dataset. We train and test our model on the CityPersons pedestrian detection dataset, so please download it first. By default, we assume the dataset is stored in '$ALFNet/data/cityperson/'.

  2. Dataset preparation. We provide the cache files of the training and validation subsets. Optionally, you can follow ./generate_data.py to create these cache files yourself. By default, we assume the cache files are stored in '$ALFNet/data/cache/cityperson/' (a loading sketch is given after this list).

  3. Download the initialized models. We use ResNet-50 and MobileNet_v1 as backbones in our experiments. By default, we assume the weight files are stored in '$ALFNet/data/models/'.
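
The following is a minimal sketch of the assumed directory layout and of loading a cache file. The cache file name and its pickle format are assumptions inferred from ./generate_data.py, not guaranteed by this README:

import os
import pickle

# Assumed layout (paths taken from this README):
#   data/cityperson/        CityPersons images and annotations
#   data/cache/cityperson/  cached training/validation records
#   data/models/            ImageNet-pretrained ResNet-50 / MobileNet_v1 weights

cache_file = 'data/cache/cityperson/train'    # hypothetical cache file name
if os.path.exists(cache_file):
    with open(cache_file, 'rb') as f:
        records = pickle.load(f)              # assumed: a list of per-image records
    print('loaded %d training records' % len(records))
else:
    print('cache file not found; run generate_data.py first')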

Models

We provide models trained on the training subset with different numbers of ALF steps and different backbone architectures, to help reproduce the results in our paper:

  1. For ResNet-50:

ALFNet-1s: city_res50_1step.hdf5

ALFNet-2s: city_res50_2step.hdf5

ALFNet-3s: city_res50_3step.hdf5

  2. For MobileNet:

MobNet-1s: city_mobnet_1step.hdf5

MobNet-2s: city_mobnet_2step.hdf5

Training

Before training, adjust the training parameters in ./keras_alfnet/config.py as needed.

  1. Train with different backbone networks.

Follow ./train.py to start training. Modify the parameter 'self.network' in ./keras_alfnet/config.py to switch between backbone networks. By default, the output weight files will be saved in '$ALFNet/output/valmodels/(network)/'.

  2. Train with different ALF steps.

Follow ./train.py to start training. Modify the parameter 'self.steps' in ./keras_alfnet/config.py to change the number of ALF steps (a configuration sketch is given below). By default, the output weight files will be saved in '$ALFNet/output/valmodels/(network)/(num of)steps'.
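
As a rough illustration of the two switches mentioned above, they might look as follows inside ./keras_alfnet/config.py (the attribute names come from this README; the value strings and everything else are assumptions):

class Config(object):
    def __init__(self):
        # backbone network, e.g. ResNet-50 or MobileNet_v1 (exact string values may differ)
        self.network = 'resnet50'
        # number of ALF steps: 1, 2 or 3 for ResNet-50, 1 or 2 for MobileNet
        self.steps = 2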

  3. Update: train with the weight moving average (WMA) strategy.

We provide an example of training ALFNet-2s with WMA (./train_2step_wma.py). WMA was first proposed in Mean Teacher. We find that WMA helps achieve more stable results; one trial is reported in ./results_2step_wma.txt.
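
For clarity, WMA here means keeping an exponential moving average of the model weights during training and evaluating with the averaged weights, as in Mean Teacher. A minimal Keras-style sketch follows; the function and variable names are illustrative, not the repository's API:

def update_wma(avg_weights, model_weights, alpha=0.999):
    # exponential moving average of each weight tensor
    return [alpha * a + (1.0 - alpha) * w for a, w in zip(avg_weights, model_weights)]

# hypothetical usage with a Keras model:
# avg_weights = model.get_weights()
# for each training batch:
#     train_on_batch(...)
#     avg_weights = update_wma(avg_weights, model.get_weights())
# eval_model.set_weights(avg_weights)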

Test

Follow ./test.py to get the detection results. By default, the output .txt files will be saved in '$ALFNet/output/valresults/(network)/(num of)steps'.

Evaluation

  1. Run ./evaluation/dt_txt2json.m to convert the '.txt' result files to '.json'.
  2. Run ./evaluation/eval_script/eval_demo.py to get the Miss Rate (MR) results of the models. By default, the models are evaluated under the Reasonable setting. Optionally, you can modify the parameters in ./evaluation/eval_script/eval_MR_multisetup.py to evaluate the models under different settings, such as different occlusion levels and IoU thresholds.
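
For reference, the reported MR is the standard log-average miss rate: miss rates are sampled at nine false-positives-per-image (FPPI) points spaced logarithmically between 0.01 and 1 and averaged in log space. Below is a minimal NumPy sketch, assuming you already have an FPPI/miss-rate curve (this is not the repository's evaluation code):

import numpy as np

def log_average_miss_rate(fppi, miss_rate):
    # fppi, miss_rate: 1-D arrays describing the detector's curve, with fppi increasing
    refs = np.logspace(-2.0, 0.0, num=9)      # 9 reference points from 0.01 to 1 FPPI
    sampled = []
    for ref in refs:
        idx = np.where(fppi <= ref)[0]
        sampled.append(miss_rate[idx[-1]] if len(idx) else 1.0)
    return float(np.exp(np.mean(np.log(np.maximum(sampled, 1e-10)))))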

Citation

If you find our work useful in your research, please consider citing:

@InProceedings{Liu_2018_ECCV,
author = {Liu, Wei and Liao, Shengcai and Hu, Weidong and Liang, Xuezhi and Chen, Xiao},
title = {Learning Efficient Single-stage Pedestrian Detectors by Asymptotic Localization Fitting},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}