

Master Thesis: Bayesian Convolutional Neural Networks

Thesis submitted to the Computer Science department at the University of Kaiserslautern.

License: MIT

Author

Supervisors

  • Prof. Marcus Liwicki (Professor at Luleå University of Technology, Sweden)
  • Felix Laumann (PhD candidate at Imperial College London)

Abstract

Artificial Neural Networks are connectionist systems that perform a given task by learning from examples, without any prior knowledge about the task. This is done by finding an optimal point estimate for the weights in every node. Networks using point estimates as weights generally perform well with large datasets, but they fail to express uncertainty in regions with little or no data, leading to overconfident decisions.

In this thesis, a Bayesian Convolutional Neural Network (BayesCNN) using Variational Inference is proposed, which introduces a probability distribution over the weights. Furthermore, the proposed BayesCNN architecture is applied to tasks like Image Classification, Image Super-Resolution and Generative Adversarial Networks.

BayesCNN is based on Bayes by Backprop, which derives a variational approximation to the true posterior. Our proposed method not only achieves performance equivalent to frequentist inference in identical architectures but also incorporates a measure of uncertainty and regularisation. It further eliminates the use of dropout in the model. Moreover, we predict how certain the model's prediction is, based on epistemic and aleatoric uncertainties, and finally we propose ways to prune the Bayesian architecture and make it more computationally and time efficient.
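To illustrate the Bayes-by-Backprop idea described above, here is a minimal sketch of a linear layer with a factorised Gaussian posterior over the weights, written in PyTorch (the class and variable names are hypothetical and not taken from the thesis codebase):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    """Linear layer with a factorised Gaussian over the weights (Bayes by Backprop)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Variational parameters: a mean and a softplus-transformed std per weight.
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))

    def forward(self, x):
        sigma = F.softplus(self.w_rho)       # ensures std > 0
        eps = torch.randn_like(sigma)
        w = self.w_mu + sigma * eps          # reparameterised weight sample
        # Closed-form KL(q(w) || N(0, 1)) between two Gaussians, summed over weights.
        self.kl = (-torch.log(sigma) + 0.5 * (sigma ** 2 + self.w_mu ** 2) - 0.5).sum()
        return F.linear(x, w)

layer = BayesLinear(4, 2)
out = layer(torch.randn(8, 4))               # stochastic forward pass
# Variational free energy: data term plus (weighted) KL term.
loss = F.mse_loss(out, torch.zeros(8, 2)) + 1e-3 * layer.kl
loss.backward()
```

Because the weights are sampled through the reparameterisation, gradients flow into both the mean and the scale parameters, so the whole posterior is trained by ordinary backpropagation.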

In the first part of the thesis, the Bayesian Neural Network is explained and applied to an Image Classification task. The results are compared to point-estimate-based architectures on the MNIST, CIFAR-10, CIFAR-100 and STL-10 datasets. Moreover, uncertainties are calculated, the architecture is pruned, and the results are compared.

In the second part of the thesis, the concept is further applied to other computer vision tasks, namely Image Super-Resolution and Generative Adversarial Networks. The concept of BayesCNN is tested and compared against other approaches in a similar domain.


Code base

The proposed work has been implemented in PyTorch and is available here: BayesianCNN


Chapter Overview

Chapter 1 : Introduction

  • Why is there a need for Bayesian Networks?

  • Problem Statement

  • Current Situation

  • Our Hypothesis

  • Our Contribution

Chapter 2: Background

  • Neural Networks and Convolutional Neural Networks

  • Overview of Variational Inference and the local reparameterization trick in Bayesian Neural Networks.

  • Backpropagation in Bayesian Networks using Bayes by Backprop.

  • Estimation of Uncertainties in a network.

  • Pruning a network to reduce the overall number of parameters without affecting its performance.
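The local reparameterization trick mentioned above can be sketched as follows: instead of sampling a weight matrix, the Gaussian pre-activations it induces are sampled directly, which lowers the variance of the gradient estimates. A minimal PyTorch sketch (function and variable names are hypothetical, not from the thesis codebase):

```python
import torch
import torch.nn.functional as F

def lrt_linear(x, w_mu, w_sigma):
    """Local reparameterization trick for a linear layer.

    For a factorised Gaussian weight posterior, the pre-activation b = x @ W^T
    is itself Gaussian, so we sample b directly rather than sampling W.
    """
    act_mu = F.linear(x, w_mu)                     # mean of the pre-activation
    act_var = F.linear(x.pow(2), w_sigma.pow(2))   # variance of the pre-activation
    eps = torch.randn_like(act_mu)
    return act_mu + act_var.clamp(min=1e-12).sqrt() * eps

x = torch.randn(16, 10)
w_mu = torch.randn(5, 10) * 0.1
w_sigma = torch.full((5, 10), 0.05)
b = lrt_linear(x, w_mu, w_sigma)   # one activation sample per data point
```

Note that each row of the batch receives its own independent noise, which is exactly what makes this estimator lower-variance than sharing a single sampled weight matrix across the batch.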

Chapter 3: Related Work

  • How Bayesian methods were applied to Neural Networks to deal with the intractable true posterior distribution.

  • Various ways of training the posterior probability distributions of Neural Networks: Laplace approximations, Monte Carlo methods and Variational Inference.

  • Proposals on Dropout and Gaussian Dropout as Variational Inference schemes.

  • Past work on uncertainty estimation in Neural Networks.

  • Ways to reduce the number of parameters in a model.

Chapter 4: Concept

  • Bayesian CNN with Variational Inference based on Bayes by Backprop.

  • Bayesian convolutional operations with mean and variance.

  • Local reparameterization trick for Bayesian CNN.

  • Uncertainty estimation in a Bayesian network.

  • Using the L1 norm to reduce the number of parameters in a Bayesian network.
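The L1-norm pruning idea in the last bullet can be sketched as magnitude-based pruning: weights with the smallest absolute value are zeroed out, keeping only a fixed fraction of parameters. A minimal sketch in PyTorch (the function name and `keep_ratio` parameter are hypothetical, not from the thesis codebase):

```python
import torch

def prune_by_l1(weight, keep_ratio=0.5):
    """Zero out the weights with the smallest absolute value (L1 magnitude).

    Keeps the top `keep_ratio` fraction of weights by magnitude and returns
    the pruned tensor together with the binary mask that was applied.
    """
    flat = weight.abs().flatten()
    k = int(flat.numel() * keep_ratio)
    threshold = flat.topk(k).values.min()   # k-th largest magnitude
    mask = (weight.abs() >= threshold).float()
    return weight * mask, mask

w = torch.randn(4, 4)
pruned, mask = prune_by_l1(w, keep_ratio=0.25)   # keep the 4 largest of 16 weights
```

In a Bayesian layer the same criterion could be applied to the posterior means, with the mask then zeroing both the mean and the scale parameters of the pruned weights.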

Chapter 5: Empirical Analysis

  • Applying Bayesian CNN for the task of Image Recognition on MNIST, CIFAR-10, CIFAR-100 and STL-10 datasets.

  • Comparison of the results of Bayesian CNNs with conventional CNN architectures on the same datasets.

  • Regularization effect of Bayesian Networks, compared with dropout.

  • Distribution of mean and variance in Bayesian CNN over time.

  • Parameters comparison before and after model pruning.

Chapter 6: Applications

  • Empirical analysis of BayesCNN compared with a conventional architecture for Image Super-Resolution.

  • Empirical analysis of BayesCNN compared with a conventional architecture for Generative Adversarial Networks.

Chapter 7: Conclusion and Outlook

  • Conclusion

Appendix A

  • Experiment Specification

Appendix B

  • How to replicate results

Paper

@article{shridhar2019comprehensive,
  title={A Comprehensive guide to Bayesian Convolutional Neural Network with Variational Inference},
  author={Shridhar, Kumar and Laumann, Felix and Liwicki, Marcus},
  journal={arXiv preprint arXiv:1901.02731},
  year={2019}
}

Thesis Template


Contact

