
Grapy-ML: Graph Pyramid Mutual Learning for Cross-dataset Human Parsing

This repository contains the PyTorch source code for the AAAI 2020 oral paper Grapy-ML: Graph Pyramid Mutual Learning for Cross-dataset Human Parsing by Haoyu He, Jing Zhang, Qiming Zhang and Dacheng Tao.


[Figure: Grapy-ML overview]

[Figure: the GPM module]


Getting Started:

Environment:

  • PyTorch = 1.1.0

  • torchvision

  • scipy

  • tensorboardX

  • numpy

  • opencv-python

  • matplotlib
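
The repository does not list an install command. As a minimal sketch, assuming a standard pip environment (torchvision 0.3.0 is the release paired with PyTorch 1.1.0; adjust the pins for your platform), the dependencies above could be installed with:

pip install torch==1.1.0 torchvision==0.3.0 scipy tensorboardX numpy opencv-python matplotlib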

Data Preparation:

You need to download the three datasets. The CIHP and ATR datasets can be found in this repository, from which our code is also heavily borrowed.

The datasets should then be arranged under the following folder, with the images rearranged into the provided file structure.

/data/dataset/
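
The exact subfolder names and file layout are defined by the dataloaders in this repository; the paths below are purely hypothetical placeholders (not taken from the repo), shown only to illustrate that the three datasets sit side by side under this root:

/data/dataset/CIHP/                 # hypothetical folder name
/data/dataset/Pascal_Person_Part/   # hypothetical folder name
/data/dataset/ATR/                  # hypothetical folder name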

Testing:

The pretrained models and some trained models are provided here for testing and training.

Model Name | Description | Derived from
deeplab_v3plus_v3.pth | The Deeplab v3+ pretrained weights | —
CIHP_pretrain.pth | The reproduced Deeplab v3+ model trained on the CIHP dataset | deeplab_v3plus_v3.pth
CIHP_trained.pth | GPM model trained on the CIHP dataset | CIHP_pretrain.pth
deeplab_multi-dataset.pth | The reproduced multi-task learning Deeplab v3+ model trained on the CIHP, PASCAL-Person-Part and ATR datasets | deeplab_v3plus_v3.pth
GPM-ML_multi-dataset.pth | Grapy-ML model trained on the CIHP, PASCAL-Person-Part and ATR datasets | deeplab_multi-dataset.pth
GPM-ML_finetune_PASCAL.pth | Grapy-ML model finetuned on the PASCAL-Person-Part dataset | GPM-ML_multi-dataset.pth

To test, run the following two scripts:

bash eval_gpm.sh
bash eval_gpm_ml.sh

Training:

GPM:

Before training, you first need to obtain a Deeplab pretrained model (e.g. CIHP_dlab.pth) on each dataset. This step provides a trustworthy initial raw prediction for the GSA operation in GPM.

bash train_dlab.sh

The ImageNet pretrained model is provided in the table above, and you should switch the dataset name and the number of target classes in the script to the dataset you want (CIHP: 20 classes, PASCAL: 7 classes, ATR: 18 classes).

In the next step, use the Deeplab pretrained model to further train the GPM model.

bash train_gpm.sh 

It is recommended to follow the training settings in our paper to reproduce the results.

GPM-ML:

First, run the Deeplab pretraining step with the following script:

bash train_dlab_ml.sh

Here, the multi-dataset Deeplab v3+ is trained as a simple multi-task learning problem.

Then, you can train the GPM-ML model with the training sets of all three datasets by:

bash train_gpm_ml_all.sh

After this phase, the first two levels of the GPM-ML model become more robust and better generalized.

Finally, you can finetune on each dataset starting from the unified pretrained model:

bash train_gpm_ml_pascal.sh

Citation:

@inproceedings{he2020grapy,
title={Grapy-ML: Graph Pyramid Mutual Learning for Cross-dataset Human Parsing},
author={He, Haoyu and Zhang, Jing and Zhang, Qiming and Tao, Dacheng},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
year={2020}
}

Maintainer:

[email protected]
