
AlignGAN

This is the official implementation of AlignGAN (ICCV 2019). Please refer to our paper for more details:

[Paper, Poster] RGB-Infrared Cross-Modality Person Re-Identification via Joint Pixel and Feature Alignment

Guan'an Wang, Tianzhu Zhang, Jian Cheng, Si Liu, Yang Yang and Zengguang Hou

Bibtex

If you find the code useful, please consider citing our paper:

@InProceedings{wang2019aligngan,
author = {Wang, Guan'an and Zhang, Tianzhu and Cheng, Jian and Liu, Si and Yang, Yang and Hou, Zengguang},
title = {RGB-Infrared Cross-Modality Person Re-Identification via Joint Pixel and Feature Alignment},
booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
}

Dependencies

Dataset Preparation

  • SYSU-MM01 Dataset [link]
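
As a quick sanity check after downloading, you can verify that the extracted dataset folder contains the expected camera sub-directories. The sketch below assumes the standard public SYSU-MM01 layout (folders cam1 through cam6, where cam3 and cam6 hold the infrared images); it is a hypothetical helper, not part of this repository.

# hypothetical sanity check for the SYSU-MM01 folder layout (assumed cam1..cam6 structure)
import os

def check_sysu_mm01(dataset_path):
    expected = [f"cam{i}" for i in range(1, 7)]  # cam1..cam6; cam3/cam6 are the infrared cameras
    missing = [c for c in expected if not os.path.isdir(os.path.join(dataset_path, c))]
    if missing:
        raise FileNotFoundError(f"SYSU-MM01 folders not found under {dataset_path}: {missing}")
    print("SYSU-MM01 layout looks complete:", dataset_path)

# example: check_sysu_mm01("/data/sysu-mm01")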

Train

# train, please replace sysu-mm01-path with your own path
python main.py --dataset_path sysu-mm01-path --mode train
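
For reference, here is a minimal sketch of how the command-line flags shown in this README could be parsed; the actual argument handling in main.py may differ, and the default values below are assumptions.

# minimal argparse sketch mirroring the flags documented in this README (defaults are assumptions)
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="AlignGAN training/testing entry point")
    parser.add_argument("--dataset_path", type=str, required=True,
                        help="path to the SYSU-MM01 dataset")
    parser.add_argument("--mode", type=str, choices=["train", "test"], default="train",
                        help="run training or evaluation")
    parser.add_argument("--pretrained_model_path", type=str, default=None,
                        help="folder holding the downloaded pretrained checkpoint files")
    parser.add_argument("--pretrained_model_index", type=int, default=250,
                        help="checkpoint index to load when testing")
    return parser.parse_args()

if __name__ == "__main__":
    print(parse_args())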

Test with Pre-trained Model

  • Pretrained model (Google Drive, Baidu Disk (code: zsr8)); please download all 8 files into one folder.
  • test with the pre-trained model
# test with the pretrained model, please replace sysu-mm01-path and pretrained-model-path with your own paths
python main.py --dataset_path sysu-mm01-path --mode test --pretrained_model_path pretrained-model-path --pretrained_model_index 250
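
Before running the test command, it can help to confirm that the checkpoint folder was downloaded completely. The helper below is a hypothetical sketch that only checks the file count against the 8 files mentioned above; it is not part of the repository.

# hypothetical check that the pretrained-model folder holds all 8 downloaded checkpoint files
import os

def check_pretrained_folder(model_dir, expected_count=8):
    files = [f for f in os.listdir(model_dir) if os.path.isfile(os.path.join(model_dir, f))]
    if len(files) < expected_count:
        raise RuntimeError(f"Expected {expected_count} files in {model_dir}, found {len(files)}")
    print(f"Found {len(files)} files in {model_dir}")

# example: check_pretrained_folder("/path/to/pretrained-model")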

Experimental Results

  • Settings

    • We trained our model with 4 GTX 1080 Ti GPUs.
  • Comparison with SOTA

  • Pixel Alignment Module

  • Feature Alignment Module

Contacts

If you have any questions about the project, please feel free to contact me.

E-mail: [email protected]
