FormResNet-Denoise-Gaussian-noise-TensorFlow

A simple TensorFlow implementation of the paper 'FormResNet: Formatted Residual Learning for Image Restoration'.

Introduction

This code is a simple implementation of the paper FormResNet: Formatted Residual Learning for Image Restoration, although some details of the code differ from the paper.
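
For orientation, the core idea shared by FormResNet and DnCNN is residual learning: the network predicts the noise rather than the clean image, and the restored image is the noisy input minus the predicted residual. Below is a minimal sketch of such a residual denoiser in the TF 1.x API listed under Python packages; the layer count, filter width, and the name `residual_denoiser` are illustrative assumptions, not the exact architecture of this repository, and the paper's formatting layer and loss design are omitted here.

```python
import tensorflow as tf  # TF 1.x API, matching the tensorflow 1.4.0 requirement


def residual_denoiser(noisy, depth=17, width=64, is_training=True):
    """DnCNN-style residual sketch: predict the noise, then subtract it.

    noisy: [batch, H, W, 1] grayscale patches.
    depth/width are illustrative choices, not the paper's exact configuration.
    """
    x = tf.layers.conv2d(noisy, width, 3, padding="same", activation=tf.nn.relu)
    for _ in range(depth - 2):
        x = tf.layers.conv2d(x, width, 3, padding="same", use_bias=False)
        x = tf.layers.batch_normalization(x, training=is_training)
        x = tf.nn.relu(x)
    residual = tf.layers.conv2d(x, 1, 3, padding="same")  # predicted noise
    denoised = noisy - residual  # residual learning: clean = noisy - noise
    return denoised, residual
```

A training objective in the same spirit would regress `denoised` to the clean patch, e.g. `tf.losses.mean_squared_error(clean, denoised)`.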

DataSets

The dataset consists of 400 grayscale images, which I have cropped into 40x40 patches. The cropped dataset can be downloaded from my BaiduYun.
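
A minimal sketch of the 40x40 cropping step, assuming the 400 grayscale images sit in a local directory; the non-overlapping stride and the helper name `crop_patches` are assumptions for illustration, not necessarily how the BaiduYun patches were produced:

```python
import os
import numpy as np
from PIL import Image


def crop_patches(image_dir, patch=40, stride=40):
    """Cut each grayscale image into non-overlapping 40x40 patches."""
    patches = []
    for name in sorted(os.listdir(image_dir)):
        img = np.array(Image.open(os.path.join(image_dir, name)).convert("L"))
        h, w = img.shape
        for i in range(0, h - patch + 1, stride):
            for j in range(0, w - patch + 1, stride):
                patches.append(img[i:i + patch, j:j + patch])
    return np.stack(patches)[..., None]  # [N, 40, 40, 1]
```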

Examples of the training set

Python packages

1. python 3.5
2. tensorflow 1.4.0
3. pillow
4. numpy
5. scipy
6. skimage

Results of the code

Trained for about 1 epoch, noise intensity sigma = 25:

| Raw | Noised | FormResNet[1] (PSNR/SSIM) | DnCNN[2] (PSNR/SSIM) |
|-----|--------|---------------------------|----------------------|
| -   | -      | 26.12/0.82                | 25.01/0.77           |
| -   | -      | 32.01/0.83                | 31.24/0.80           |
| -   | -      | 25.55/0.86                | 24.53/0.82           |
| -   | -      | 27.66/0.85                | 26.30/0.82           |
| -   | -      | 29.392/0.90               | 28.85/0.87           |
| -   | -      | 28.34/0.84                | 28.00/0.80           |
| -   | -      | 22.00/0.83                | 23.83/0.79           |
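
For reference, the PSNR/SSIM numbers above can be computed with the skimage package listed earlier. A minimal sketch, assuming uint8 images in [0, 255] and a recent scikit-image (older releases expose the same metrics as `skimage.measure.compare_psnr` / `compare_ssim`); the noise helper mirrors the sigma = 25 setting used in these experiments:

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity


def add_gaussian_noise(clean, sigma=25.0):
    """Add zero-mean Gaussian noise with the sigma used above (25)."""
    noisy = clean.astype(np.float32) + np.random.normal(0.0, sigma, clean.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)


def evaluate(clean, denoised):
    """PSNR/SSIM on grayscale uint8 images, as reported in the table."""
    psnr = peak_signal_noise_ratio(clean, denoised, data_range=255)
    ssim = structural_similarity(clean, denoised, data_range=255)
    return psnr, ssim
```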

[1] Jiao J, Tu W C, He S, et al. FormResNet: Formatted Residual Learning for Image Restoration. IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2017: 1034-1042.

[2] Zhang K, Zuo W, Chen Y, et al. Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising. IEEE Transactions on Image Processing, 2017, 26(7): 3142-3155.
