da-molchanov / Variance Networks
Licence: apache-2.0
Variance Networks: When Expectation Does Not Meet Your Expectations, ICLR 2019
Stars: ✭ 38
Programming Languages
python
139335 projects - #7 most used programming language
Labels
Projects that are alternatives of or similar to Variance Networks
Lstms.pth
PyTorch implementations of LSTM Variants (Dropout + Layer Norm)
Stars: ✭ 111 (+192.11%)
Mutual labels: dropout
LoL-Match-Prediction
Win probability predictions for League of Legends matches using neural networks
Stars: ✭ 34 (-10.53%)
Mutual labels: dropout
Deepnet
Implementation of CNNs, RNNs, and many deep learning techniques in plain Numpy.
Stars: ✭ 285 (+650%)
Mutual labels: dropout
Densenet Sdr
repo that holds code for improving on dropout using Stochastic Delta Rule
Stars: ✭ 148 (+289.47%)
Mutual labels: dropout
ALRA
Imputation method for scRNA-seq based on low-rank approximation
Stars: ✭ 48 (+26.32%)
Mutual labels: dropout
Machine-Learning-in-Python-Workshop
My workshop on machine learning using python language to implement different algorithms
Stars: ✭ 89 (+134.21%)
Mutual labels: dropout
Icellr
Single (i) Cell R package (iCellR) is an interactive R package to work with high-throughput single cell sequencing technologies (i.e scRNA-seq, scVDJ-seq, ST and CITE-seq).
Stars: ✭ 80 (+110.53%)
Mutual labels: dropout
Dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
Stars: ✭ 466 (+1126.32%)
Mutual labels: dropout
numpy-neuralnet-exercise
Implementation of key neural network concepts in plain NumPy
Stars: ✭ 49 (+28.95%)
Mutual labels: dropout
dropclass speaker
DropClass and DropAdapt - repository for the paper accepted to Speaker Odyssey 2020
Stars: ✭ 20 (-47.37%)
Mutual labels: dropout
Tensorflow Mnist Cnn
MNIST classification using a Convolutional Neural Network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Stars: ✭ 182 (+378.95%)
Mutual labels: dropout
Targeted Dropout
Complementary code for the Targeted Dropout paper
Stars: ✭ 251 (+560.53%)
Mutual labels: dropout
Daguan 2019 rank9
datagrand 2019 information extraction competition rank9
Stars: ✭ 121 (+218.42%)
Mutual labels: dropout
Sdr Densenet Pytorch
Stochastic Delta Rule implemented in Pytorch on DenseNet
Stars: ✭ 102 (+168.42%)
Mutual labels: dropout
NIPS-Global-Paper-Implementation-Challenge
Selective Classification For Deep Neural Networks.
Stars: ✭ 11 (-71.05%)
Mutual labels: dropout
Satania.moe
Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you
Stars: ✭ 486 (+1178.95%)
Mutual labels: dropout
Tensorflow Tutorial
TensorFlow tutorial from basic to advanced; Chinese AI teaching materials by 莫烦Python
Stars: ✭ 4,122 (+10747.37%)
Mutual labels: dropout
CS231n
My solutions for Assignments of CS231n: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 30 (-21.05%)
Mutual labels: dropout
Variance Networks
The code for our ICLR 2019 paper on Variance Networks: When Expectation Does Not Meet Your Expectations.
Talk video
Code
We actually have two versions of the code:
- The TensorFlow implementation uses Python 2.7 and reproduces the CIFAR results, i.e. training variance layers via variational dropout.
- The PyTorch implementation is considerably more accurate and reproduces the results on MNIST and the toy problem. It requires Python 3.6 and PyTorch 0.3.
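To give a feel for what "training variance layers" means: in a variance layer the weight means are fixed at zero and only the per-weight variances are learned, so all information is carried by the noise. The sketch below is an illustrative NumPy forward pass using the local reparameterization trick (each pre-activation is a zero-mean Gaussian whose variance is x² @ σ²); the function name and parameterization are our own, not the repository's API.

```python
import numpy as np

def variance_linear(x, log_sigma2, rng):
    """Forward pass of a variance layer (illustrative sketch).

    Weight means are identically zero; only log-variances are learned.
    With the local reparameterization trick, sampling weights is replaced
    by sampling pre-activations: y_j ~ N(0, sum_i x_i^2 * sigma_ij^2).
    """
    act_var = (x ** 2) @ np.exp(log_sigma2)   # per-activation variance
    eps = rng.standard_normal(act_var.shape)  # fresh noise for every sample
    return np.sqrt(np.maximum(act_var, 1e-16)) * eps

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 10))
log_sigma2 = np.full((10, 5), -4.0)  # the learnable parameter
y = variance_linear(x, log_sigma2, rng)
print(y.shape)  # (3, 5)
```

Because the output is pure noise whose scale depends on the input, a single forward pass looks useless; predictions come from averaging over many stochastic passes, which is the point the paper makes about expectations of such networks.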
Citation
If you find this code useful, please cite our paper:
@article{neklyudov2018variance,
  title={Variance Networks: When Expectation Does Not Meet Your Expectations},
  author={Neklyudov, Kirill and Molchanov, Dmitry and Ashukha, Arsenii and Vetrov, Dmitry},
  journal={7th International Conference on Learning Representations},
  year={2019}
}
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].