
Zheng222 / PPCN

Licence: other
TensorFlow implementation of Perception-Preserving Convolutional Networks for Image Enhancement on Smartphones (ECCV 2018 Workshop PIRM)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to PPCN

Awesome-ICCV2021-Low-Level-Vision
A collection of papers and code for ICCV 2021 low-level vision and image generation
Stars: ✭ 163 (+181.03%)
Mutual labels:  image-enhancement
pytorch-neural-enhance
Experiments on CNN-based image enhancement in pytorch
Stars: ✭ 29 (-50%)
Mutual labels:  image-enhancement
subjectiveqe-esrgan
PyTorch implementation of ESRGAN (ECCVW 2018) for compressed image subjective quality enhancement.
Stars: ✭ 12 (-79.31%)
Mutual labels:  image-enhancement
Awesome-Underwater-Image-Enhancement
A collection of awesome underwater image enhancement methods.
Stars: ✭ 57 (-1.72%)
Mutual labels:  image-enhancement
image-filtering
Image filtering techniques in Python, with examples
Stars: ✭ 19 (-67.24%)
Mutual labels:  image-enhancement
UWCNN
Code and Datasets for "Underwater Scene Prior Inspired Deep Underwater Image and Video Enhancement", Pattern Recognition, 2019
Stars: ✭ 82 (+41.38%)
Mutual labels:  image-enhancement
Mobile Image-Video Enhancement
Sensifai's image and video enhancement module for mobile devices
Stars: ✭ 39 (-32.76%)
Mutual labels:  image-enhancement
StarEnhancer
[ICCV 2021 Oral] StarEnhancer: Learning Real-Time and Style-Aware Image Enhancement
Stars: ✭ 127 (+118.97%)
Mutual labels:  image-enhancement
CURL
Code for the ICPR 2020 paper: "CURL: Neural Curve Layers for Image Enhancement"
Stars: ✭ 177 (+205.17%)
Mutual labels:  image-enhancement
Image-Contrast-Enhancement
C++ implementation of several image contrast enhancement techniques.
Stars: ✭ 139 (+139.66%)
Mutual labels:  image-enhancement
ICCV2021-Single-Image-Desnowing-HDCWNet
Single-image desnowing (HDCWNet); paper accepted at ICCV 2021.
Stars: ✭ 47 (-18.97%)
Mutual labels:  image-enhancement
Awesome-low-level-vision-resources
A curated list of resources for Low-level Vision Tasks
Stars: ✭ 35 (-39.66%)
Mutual labels:  image-enhancement
PESR
Official code (PyTorch) for the paper "Perception-Enhanced Single Image Super-Resolution via Relativistic Generative Networks"
Stars: ✭ 28 (-51.72%)
Mutual labels:  pirm-eccv-2018

Python 3.5

PPCN (Team: Rainbow)

[Paper download] [Paper (CVF)] [Paper (Springer)]

Super-Resolution Task


The schematic of the proposed network for image super-resolution

Training

First, download the DIV2K dataset and unzip it into the train_SR/ folder.

Run the following command to train the SR model:

python train_SR.py
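
For orientation, below is a minimal TensorFlow 1.x sketch of a generic lightweight SR network with sub-pixel (depth-to-space) upsampling. It is illustrative only, not the exact PPCN topology; the real model is defined in train_SR.py, and the scale, features, and num_blocks values are assumptions.

# Illustrative only: a generic lightweight SR network, NOT the exact PPCN model.
import tensorflow as tf

def lightweight_sr(lr_images, scale=4, features=32, num_blocks=4):
    """lr_images: float32 tensor [N, H, W, 3] in [0, 1]."""
    x = tf.layers.conv2d(lr_images, features, 3, padding='same', activation=tf.nn.relu)
    skip = x
    for _ in range(num_blocks):
        y = tf.layers.conv2d(x, features, 3, padding='same', activation=tf.nn.relu)
        y = tf.layers.conv2d(y, features, 3, padding='same')
        x = x + y  # residual connection keeps the compact network easy to train
    x = tf.layers.conv2d(x + skip, 3 * scale * scale, 3, padding='same')
    # Rearrange channels into space: [N, H, W, 3*s*s] -> [N, s*H, s*W, 3]
    return tf.nn.depth_to_space(x, scale)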

Testing

First, download the SR_Test_Datasets and put them into the test/SR_test_data folder.

Run the following command to super-resolve low-resolution images:

python evaluate_super_resolution.py
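
Super-resolved outputs are typically scored with PSNR against the ground-truth HR images. A minimal helper is sketched below, assuming an 8-bit RGB comparison; the official evaluation may crop borders or convert to the Y channel first.

# Minimal PSNR helper (assumption: 8-bit RGB arrays of equal shape).
import numpy as np

def psnr(reference, output, peak=255.0):
    # Mean squared error over all pixels/channels, mapped to decibels.
    mse = np.mean((reference.astype(np.float64) - output.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)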

Enhancement Task


DPED image enhanced by our method


The structure of the proposed generator and discriminator for image enhancement

Training

  • Step 1: download the pre-trained VGG19 model and put it into the train/vgg_pretrained/ folder.
  • Step 2: download the DPED dataset and extract it into the train/dped/ folder.
  • Step 3: train the teacher model by executing the following command:
python train_teacher.py
  • Step 4: train the student model by running the command below; a hedged sketch of the losses involved follows this list.
python train_student.py
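
The teacher-student pair suggests a distillation setup: the compact student is trained against the heavier teacher. The sketch below shows the shape of a VGG19-based perceptual loss (DPED-style) and an output-matching distillation term. Function names are hypothetical; the real losses live in train_teacher.py and train_student.py, and vgg19(...) stands for a feature extractor built from the pre-trained weights in train/vgg_pretrained/.

# Hedged sketch of the loss terms; names are hypothetical, not the project's API.
import tensorflow as tf

def perceptual_loss(enhanced, target, vgg19):
    # DPED-style content loss: compare deep VGG19 features rather than raw pixels.
    return tf.reduce_mean(tf.squared_difference(vgg19(enhanced), vgg19(target)))

def distillation_loss(student_out, teacher_out):
    # Train the compact student to reproduce the frozen teacher's output.
    return tf.reduce_mean(tf.abs(student_out - tf.stop_gradient(teacher_out)))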

Testing

Run the following command to enhance low-quality images:

python evaluate_enhancement.py
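
For reference, a hedged TF 1.x inference sketch is below. The checkpoint path and tensor names ('checkpoints/student', 'input:0', 'output:0') are hypothetical placeholders; evaluate_enhancement.py wires up the real graph and paths.

# Hedged sketch; checkpoint path and tensor names are hypothetical.
import numpy as np
import tensorflow as tf
from PIL import Image

# Load one low-quality image and scale it to [0, 1] (path is hypothetical).
img = np.asarray(Image.open('test/low_quality.png'), dtype=np.float32) / 255.0
with tf.Session() as sess:
    saver = tf.train.import_meta_graph('checkpoints/student.meta')
    saver.restore(sess, 'checkpoints/student')
    graph = tf.get_default_graph()
    inputs = graph.get_tensor_by_name('input:0')
    outputs = graph.get_tensor_by_name('output:0')
    enhanced = sess.run(outputs, feed_dict={inputs: img[None]})
Image.fromarray((np.clip(enhanced[0], 0.0, 1.0) * 255).astype(np.uint8)).save('enhanced.png')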

Citation

If you find PPCN useful in your research, please consider citing:

@inproceedings{Hui-PPCN-2018,
  title={Perception-Preserving Convolutional Networks for Image Enhancement on Smartphones},
  author={Hui, Zheng and Wang, Xiumei and Deng, Lirui and Gao, Xinbo},
  booktitle={ECCV Workshop},
  pages={197--213},
  year={2018}
}

Code References

[1] https://github.com/aiff22/ai-challenge

[2] https://github.com/aiff22/DPED

[3] https://github.com/roimehrez/contextualLoss
