twtygqyy / Pytorch Lapsrn
License: MIT
PyTorch implementation of LapSRN (CVPR 2017)
Stars: ✭ 215
Programming Languages
python
139335 projects - #7 most used programming language
Projects that are alternatives of or similar to Pytorch Lapsrn
Srrescgan
Code repo for "Deep Generative Adversarial Residual Convolutional Networks for Real-World Super-Resolution" (CVPRW NTIRE2020).
Stars: ✭ 44 (-79.53%)
Mutual labels: deep-neural-networks, super-resolution
Wdsr ntire2018
Code of our winning entry to NTIRE super-resolution challenge, CVPR 2018
Stars: ✭ 570 (+165.12%)
Mutual labels: deep-neural-networks, super-resolution
Pytorch Vdsr
VDSR (CVPR2016) pytorch implementation
Stars: ✭ 313 (+45.58%)
Mutual labels: deep-neural-networks, super-resolution
Cfsrcnn
Coarse-to-Fine CNN for Image Super-Resolution (IEEE Transactions on Multimedia,2020)
Stars: ✭ 84 (-60.93%)
Mutual labels: deep-neural-networks, super-resolution
Learnopencv
Learn OpenCV : C++ and Python Examples
Stars: ✭ 15,385 (+7055.81%)
Mutual labels: deep-neural-networks
Chameleon recsys
Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (-6.05%)
Mutual labels: deep-neural-networks
Tensorflow Deep Learning
All course materials for the Zero to Mastery Deep Learning with TensorFlow course.
Stars: ✭ 170 (-20.93%)
Mutual labels: deep-neural-networks
Generative inpainting
DeepFill v1/v2 with Contextual Attention and Gated Convolution, CVPR 2018, and ICCV 2019 Oral
Stars: ✭ 2,659 (+1136.74%)
Mutual labels: deep-neural-networks
Paddlegan
PaddlePaddle GAN library, including lots of interesting applications like First-Order motion transfer, wav2lip, picture repair, image editing, photo2cartoon, image style transfer, and so on.
Stars: ✭ 4,987 (+2219.53%)
Mutual labels: super-resolution
Oneflow
OneFlow is a performance-centered and open-source deep learning framework.
Stars: ✭ 2,868 (+1233.95%)
Mutual labels: deep-neural-networks
Tsne Umap Embedding Visualisation
A Simple and easy to use way to Visualise Embeddings!
Stars: ✭ 203 (-5.58%)
Mutual labels: deep-neural-networks
Chanlun
The file 笔和线段的一种划分.py: just feed in k-line high/low data and it automatically partitions strokes, segments, pivots, buy/sell points, and trend types; sh.csv can be used as the input file. A personal resume is in the .pdf. The power of time. Some say market timing is hard, some say stock picking is easy, some say the IT infrastructure needed for statistical arbitrage is what matters; others say the system is subject to an uncertainty principle. Opinions differ. In a distributed system, only when your own influence is negligible can you achieve what Chairman Jiang called quietly making a fortune.
Stars: ✭ 206 (-4.19%)
Mutual labels: deep-neural-networks
Iseebetter
iSeeBetter: Spatio-Temporal Video Super Resolution using Recurrent-Generative Back-Projection Networks | Python3 | PyTorch | GANs | CNNs | ResNets | RNNs | Published in Springer Journal of Computational Visual Media, September 2020, Tsinghua University Press
Stars: ✭ 202 (-6.05%)
Mutual labels: super-resolution
Ignn
Code repo for "Cross-Scale Internal Graph Neural Network for Image Super-Resolution" (NeurIPS'20)
Stars: ✭ 210 (-2.33%)
Mutual labels: super-resolution
Halite Ii
Season 2 of @twosigma's artificial intelligence programming challenge
Stars: ✭ 201 (-6.51%)
Mutual labels: deep-neural-networks
Deep Mri Reconstruction
Deep Cascade of Convolutional Neural Networks for MR Image Reconstruction: Implementation & Demo
Stars: ✭ 204 (-5.12%)
Mutual labels: deep-neural-networks
Ml Examples
Arm Machine Learning tutorials and examples
Stars: ✭ 207 (-3.72%)
Mutual labels: deep-neural-networks
Pytorch realtime multi Person pose estimation
Pytorch version of Realtime Multi-Person Pose Estimation project
Stars: ✭ 205 (-4.65%)
Mutual labels: cvpr-2017
Character Based Cnn
Implementation of character based convolutional neural network
Stars: ✭ 205 (-4.65%)
Mutual labels: deep-neural-networks
PyTorch LapSRN
Implementation of the CVPR 2017 paper "Deep Laplacian Pyramid Networks for Fast and Accurate Super-Resolution" (http://vllab.ucmerced.edu/wlai24/LapSRN/) in PyTorch
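The paper's central idea is a Laplacian pyramid: the network upsamples by 2x at each level and predicts a residual that is added to the coarser image. As a rough illustration of the pyramid structure itself (not the repo's code, which learns these operators), a classic image Laplacian pyramid can be built and exactly reconstructed with plain NumPy, here using toy nearest-neighbour resampling:

```python
import numpy as np

def downsample2(img):
    # Nearest-neighbour 2x downsample (toy stand-in for a learned operator).
    return img[::2, ::2]

def upsample2(img):
    # Nearest-neighbour 2x upsample.
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def build_laplacian_pyramid(img, levels):
    """Decompose img into `levels` residual bands plus a low-res base."""
    bands = []
    cur = img
    for _ in range(levels):
        low = downsample2(cur)
        bands.append(cur - upsample2(low))  # residual at this scale
        cur = low
    return bands, cur

def reconstruct(bands, base):
    """Invert the pyramid: upsample the base and add residuals coarse-to-fine."""
    cur = base
    for band in reversed(bands):
        cur = upsample2(cur) + band
    return cur

img = np.random.rand(16, 16)
bands, base = build_laplacian_pyramid(img, 2)
rec = reconstruct(bands, base)
assert np.allclose(rec, img)  # the decomposition is exactly invertible
```

LapSRN replaces the fixed residual bands with convolutional sub-networks that predict the high-frequency detail at each 2x step.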
Usage
Training
usage: main.py [-h] [--batchSize BATCHSIZE] [--nEpochs NEPOCHS] [--lr LR]
[--step STEP] [--cuda] [--resume RESUME]
[--start-epoch START_EPOCH] [--threads THREADS]
[--momentum MOMENTUM] [--weight-decay WEIGHT_DECAY]
[--pretrained PRETRAINED]
PyTorch LapSRN
optional arguments:
-h, --help show this help message and exit
--batchSize BATCHSIZE
training batch size
--nEpochs NEPOCHS number of epochs to train for
--lr LR Learning Rate. Default=1e-4
--step STEP Sets the learning rate to the initial LR decayed every n epochs, Default: n=10
--cuda Use cuda?
--resume RESUME Path to checkpoint (default: none)
--start-epoch START_EPOCH
Manual epoch number (useful on restarts)
--threads THREADS Number of threads for data loader to use, Default: 1
--momentum MOMENTUM Momentum, Default: 0.9
--weight-decay WEIGHT_DECAY, --wd WEIGHT_DECAY
weight decay, Default: 1e-4
--pretrained PRETRAINED
path to pretrained model (default: none)
An example training command:
python main_lapsrn.py --cuda
Evaluation
usage: eval.py [-h] [--cuda] [--model MODEL] [--dataset DATASET]
[--scale SCALE]
PyTorch LapSRN Eval
optional arguments:
-h, --help show this help message and exit
--cuda use cuda?
--model MODEL model path
--dataset DATASET dataset name, Default: Set5
--scale SCALE scale factor, Default: 4
Demo
usage: demo.py [-h] [--cuda] [--model MODEL] [--image IMAGE] [--scale SCALE]
PyTorch LapSRN Demo
optional arguments:
-h, --help show this help message and exit
--cuda use cuda?
--model MODEL model path
--image IMAGE image name
--scale SCALE scale factor, Default: 4
We convert the Set5 test set images to .mat format using Matlab; for PSNR numbers comparable to the paper, please evaluate with Matlab.
Prepare Training dataset
- We provide a simple HDF5-format training sample in the data folder with 'data', 'label_x2', and 'label_x4' keys. The training data is generated with Matlab bicubic interpolation; please refer to Code for Data Generation for creating training files.
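A minimal sketch of what such a training file looks like when opened with h5py (the dataset keys 'data', 'label_x2', and 'label_x4' come from the description above; the file name and array shapes here are illustrative assumptions, not the repo's actual sample):

```python
import numpy as np
import h5py

# Create a tiny stand-in file with the same keys the repo's sample uses.
# Shapes (N, C, H, W) are assumptions for illustration only.
with h5py.File("toy_train.h5", "w") as f:
    f.create_dataset("data", data=np.random.rand(4, 1, 32, 32).astype(np.float32))
    f.create_dataset("label_x2", data=np.random.rand(4, 1, 64, 64).astype(np.float32))
    f.create_dataset("label_x4", data=np.random.rand(4, 1, 128, 128).astype(np.float32))

# Reading back mirrors what a training DataLoader would do:
# 'data' is the low-res input, the labels are the 2x and 4x targets.
with h5py.File("toy_train.h5", "r") as f:
    lr = f["data"][:]
    hr2 = f["label_x2"][:]
    hr4 = f["label_x4"][:]
print(lr.shape, hr2.shape, hr4.shape)
```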
Performance
- We provide a pretrained LapSRN x4 model trained on the T91 and BSDS200 images from SR_training_datasets, with data augmentation as described in the paper
- No bias terms are used in this implementation; another difference from the paper is that the Adam optimizer with a 1e-4 learning rate is used instead of SGD
- Performance in PSNR on Set5, Set14, and BSD100
DataSet/Method | LapSRN Paper | LapSRN PyTorch |
---|---|---|
Set5 | 31.54 | 31.65 |
Set14 | 28.19 | 28.27 |
BSD100 | 27.32 | 27.36 |
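The PSNR values above are in dB. For reference, PSNR between two images can be computed as follows (a generic sketch, not the repo's eval.py, which additionally crops borders and evaluates on the Y channel):

```python
import numpy as np

def psnr(ref, est, data_range=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((ref.astype(np.float64) - est.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)

ref = np.full((8, 8), 128.0)
est = ref + 4.0  # uniform error of 4 -> MSE = 16
print(round(psnr(ref, est), 2))  # 10*log10(255^2/16) ≈ 36.09
```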
ToDos
- LapSRN x8
- LapGAN Evaluation
Citation
If you find the code and datasets useful in your research, please cite:
@inproceedings{LapSRN,
author = {Lai, Wei-Sheng and Huang, Jia-Bin and Ahuja, Narendra and Yang, Ming-Hsuan},
title = {Deep Laplacian Pyramid Networks for Fast and Accurate Super-Resolution},
booktitle = {IEEE Conference on Computer Vision and Pattern Recognition},
year = {2017}
}
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].