
layumi / 2016_super_resolution

License: MIT
ICCV2015 Image Super-Resolution Using Deep Convolutional Networks


Projects that are alternatives of or similar to 2016_super_resolution

Gdax Orderbook Ml
Application of machine learning to the Coinbase (GDAX) orderbook
Stars: ✭ 60 (-23.08%)
Mutual labels:  cuda
Autodock Gpu
AutoDock for GPUs and other accelerators
Stars: ✭ 65 (-16.67%)
Mutual labels:  cuda
Parenchyma
An extensible HPC framework for CUDA, OpenCL and native CPU.
Stars: ✭ 71 (-8.97%)
Mutual labels:  cuda
Ggnn
GGNN: State of the Art Graph-based GPU Nearest Neighbor Search
Stars: ✭ 63 (-19.23%)
Mutual labels:  cuda
Cudadrv.jl
A Julia wrapper for the CUDA driver API.
Stars: ✭ 64 (-17.95%)
Mutual labels:  cuda
Alenka
GPU database engine
Stars: ✭ 1,150 (+1374.36%)
Mutual labels:  cuda
Pycuda
CUDA integration for Python, plus shiny features
Stars: ✭ 1,112 (+1325.64%)
Mutual labels:  cuda
Hiop
HPC solver for nonlinear optimization problems
Stars: ✭ 75 (-3.85%)
Mutual labels:  cuda
Arboretum
Gradient boosting powered by GPU (NVIDIA CUDA)
Stars: ✭ 64 (-17.95%)
Mutual labels:  cuda
Project Currennt Public
CURRENNT codes and scripts
Stars: ✭ 69 (-11.54%)
Mutual labels:  cuda
Cutlass
CUDA Templates for Linear Algebra Subroutines
Stars: ✭ 1,123 (+1339.74%)
Mutual labels:  cuda
Cudadtw
GPU-Suite
Stars: ✭ 63 (-19.23%)
Mutual labels:  cuda
Torch sampling
Efficient reservoir sampling implementation for PyTorch
Stars: ✭ 68 (-12.82%)
Mutual labels:  cuda
Tsne Cuda
GPU Accelerated t-SNE for CUDA with Python bindings
Stars: ✭ 1,120 (+1335.9%)
Mutual labels:  cuda
Titan
A high-performance CUDA-based physics simulation sandbox for soft robotics and reinforcement learning.
Stars: ✭ 73 (-6.41%)
Mutual labels:  cuda
Minkowskiengine
Minkowski Engine is an auto-diff neural network library for high-dimensional sparse tensors
Stars: ✭ 1,110 (+1323.08%)
Mutual labels:  cuda
Build Deep Learning Env With Tensorflow Python Opencv
Tutorial on how to build your own research environment for Deep Learning with OpenCV, Python, and TensorFlow
Stars: ✭ 66 (-15.38%)
Mutual labels:  cuda
Cuda Design Patterns
Some CUDA design patterns and a bit of template magic for CUDA
Stars: ✭ 78 (+0%)
Mutual labels:  cuda
Cudart.jl
Julia wrapper for CUDA runtime API
Stars: ✭ 75 (-3.85%)
Mutual labels:  cuda
Deepjointfilter
The source code of ECCV16 'Deep Joint Image Filtering'.
Stars: ✭ 68 (-12.82%)
Mutual labels:  cuda

2016_super_resolution

Image Super-Resolution Using Deep Convolutional Networks (ICCV2015)

I re-implemented this paper and include my training and test code in this repository. The code is released under the MIT License.

Note:

Thanks to @star4s, I fixed some bugs in the network-training code and made the code easier to use. (2017/4/29)

Training data

I randomly selected about 60,000 images from the ILSVRC2014 training set (for academic use only). You can download them from (sorry, my Google Drive is out of storage, so I removed the link) or BaiduYun.

Result

This code achieves better performance than bicubic interpolation for 2× enlargement. It can be trained and tested now.

original image -> super-resolution image (trained with MatConvNet)
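For reference, here is a minimal MATLAB sketch of how the bicubic-vs-network comparison can be measured with PSNR. The file names, the saved model SR_net.mat, and the variable names 'input'/'prediction' are assumptions for illustration; testSRnet_result.m remains the authoritative evaluation script.

% Hedged sketch: compare bicubic 2x upscaling with the network output via PSNR.
lowRes  = im2single(imread('low_res.png'));        % small input image (assumed file)
target  = im2single(imread('ground_truth.png'));   % full-size reference (assumed file)
bicubic = imresize(lowRes, 2, 'bicubic');          % classical baseline

load('SR_net.mat', 'net');                         % trained DagNN model (assumed file)
net.mode = 'test';
net.conserveMemory = false;                        % keep intermediate outputs readable
net.eval({'input', bicubic});                      % the network refines the bicubic image
output  = gather(net.vars(net.getVarIndex('prediction')).value);

mseOf   = @(a, b) mean((a(:) - b(:)).^2);
psnrOf  = @(a, b) 10 * log10(1 / mseOf(a, b));     % images are in [0, 1]
fprintf('bicubic: %.2f dB, network: %.2f dB\n', ...
        psnrOf(bicubic, target), psnrOf(output, target));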

How to train & test

1. Compile MatConvNet first by running gpu_compile.m (you need to change some settings in it).

For more details on compiling, see www.vlfeat.org/matconvnet/install/#compiling; a minimal compile sketch is shown below.
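As a hedged illustration (gpu_compile.m in this repository is the authoritative script), a GPU-enabled MatConvNet build typically amounts to a call like the following; the CUDA path is an assumption and should match your installation.

% Minimal sketch of a GPU-enabled MatConvNet build (run from the MatConvNet root).
% The CUDA location below is an assumption; change it to your setup.
addpath matlab;
vl_compilenn('enableGpu', true, ...
             'cudaRoot', '/usr/local/cuda', ...
             'cudaMethod', 'nvcc');
vl_setupnn;   % add the compiled toolbox to the MATLAB path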

2. Run testSRnet_result.m to get the test results.

3. If you want to train the network yourself, download my data and use prepare_ur_data.m to produce imdb.mat, which contains the path of every image (a rough sketch of this file's layout is shown below).
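As a rough illustration of what such an imdb.mat can look like (prepare_ur_data.m is the authoritative script; the folder layout and the train/validation split below are assumptions):

% Hedged sketch: build an imdb structure that stores the path of every image.
% 'data/train' and the 500-image validation split are assumptions.
files = dir(fullfile('data', 'train', '*.jpg'));
imdb.images.name = fullfile('data', 'train', {files.name});  % cell array of image paths
imdb.images.set  = ones(1, numel(files));                    % 1 = training
imdb.images.set(end-499:end) = 2;                            % 2 = validation
save('imdb.mat', '-struct', 'imdb');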

4. Use train_SRnet.m to train the network and have fun~

(I also provide a version for gray-scale images, but the improvement is limited. See train_SRnet_gray.m and testSRnet_gray.m.)
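For orientation, here is a minimal sketch of an SRCNN-style 9-1-5 network in MatConvNet's DagNN API. train_SRnet.m defines the actual network used in this repository; the single-channel input and the layer sizes below simply follow the original paper, so treat this as an assumption rather than the exact model.

% Hedged sketch of an SRCNN-style network (9-1-5 filters, as in the paper).
net = dagnn.DagNN();
net.addLayer('conv1', dagnn.Conv('size', [9 9 1 64], 'pad', 4, 'hasBias', true), ...
             {'input'}, {'x1'}, {'f1', 'b1'});
net.addLayer('relu1', dagnn.ReLU(), {'x1'}, {'x2'}, {});
net.addLayer('conv2', dagnn.Conv('size', [1 1 64 32], 'pad', 0, 'hasBias', true), ...
             {'x2'}, {'x3'}, {'f2', 'b2'});
net.addLayer('relu2', dagnn.ReLU(), {'x3'}, {'x4'}, {});
net.addLayer('conv3', dagnn.Conv('size', [5 5 32 1], 'pad', 2, 'hasBias', true), ...
             {'x4'}, {'prediction'}, {'f3', 'b3'});
net.initParams();   % fill in the weights (see the note on initialization under Small Tricks)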

Small Tricks

1. I fix the scale factor at 2 (rather than 2+2*rand). A fixed factor seems to make it easier for the network to learn.

2. How to initialize the network? (You can learn more from /matlab/+dagnn/@DagNN/initParam.m.) In this work, the weight initialization is important! A small initialization sketch is given after this list.
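As a hedged example of what tuning the initialization can look like (the 0.001 scale and the parameter names are assumptions; the repo's initParam.m is authoritative):

% Hedged sketch (continues from the DagNN sketch above): overwrite the default
% initialization of the first conv layer with small Gaussian weights.
idx = net.getParamIndex('f1');
sz  = size(net.params(idx).value);                % shape set by initParams()
net.params(idx).value = 0.001 * randn(sz, 'single');
net.params(net.getParamIndex('b1')).value(:) = 0; % zero the biases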

Citation

We would greatly appreciate it if you cite this repository in your publications:

@misc{2016_super_resolution,
  title = {{2016_super_resolution}},
  howpublished = "\url{https://github.com/layumi/2016_super_resolution}",
}