artificial neural networks - A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (-44.44%)
DUN - Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-9.72%)
noisy-K-FAC - Natural Gradient, Variational Inference
Stars: ✭ 29 (-59.72%)
BayesByHypernet - Code for the paper "Implicit Weight Uncertainty in Neural Networks"
Stars: ✭ 63 (-12.5%)
SafeAI - Reusable, easy-to-use uncertainty module package built with TensorFlow and Keras
Stars: ✭ 13 (-81.94%)
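As a rough illustration of the Monte Carlo dropout approach that uncertainty packages like the one above typically build on (this is not the SafeAI API; the model and helper names below are made up), here is a minimal tf.keras sketch:

```python
# Generic Monte Carlo dropout sketch, NOT the SafeAI API.
# Assumes an untrained toy classifier; names are illustrative only.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])

def mc_dropout_predict(model, x, n_samples=30):
    """Average predictions with dropout left on; the spread is a crude uncertainty."""
    samples = np.stack([model(x, training=True).numpy() for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.random.rand(4, 20).astype("float32")  # dummy inputs
mean_probs, std_probs = mc_dropout_predict(model, x)
```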
Tensorflow Mnist Cnn - MNIST classification using a Convolutional Neural Network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Stars: ✭ 182 (+152.78%)
Svhn Cnn - Classifying the Google Street View House Number (SVHN) dataset with a CNN
Stars: ✭ 44 (-38.89%)
UString - [ACM MM 2020] Uncertainty-based Traffic Accident Anticipation
Stars: ✭ 38 (-47.22%)
AI Learning Hub - AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (-26.39%)
Tensorflow Tutorial - TensorFlow tutorial from basic to advanced; Chinese-language AI lessons by 莫烦Python
Stars: ✭ 4,122 (+5625%)
Targeted Dropout - Complementary code for the Targeted Dropout paper
Stars: ✭ 251 (+248.61%)
pytorch ard - PyTorch implementation of "Variational Dropout Sparsifies Deep Neural Networks"
Stars: ✭ 76 (+5.56%)
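For orientation, a minimal sketch of the sparse variational dropout idea from that paper (Molchanov et al., 2017) is shown below; it is not the pytorch_ard API, and the class name `VDLinear` and the pruning threshold are assumptions chosen for illustration:

```python
# Sketch of a linear layer with sparse variational dropout, using the local
# reparameterization trick. NOT taken from the pytorch_ard repository.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VDLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # log(sigma^2) of the per-weight Gaussian noise; log(alpha) is derived from it
        self.log_sigma2 = nn.Parameter(torch.full((out_features, in_features), -10.0))

    @property
    def log_alpha(self):
        return torch.clamp(self.log_sigma2 - torch.log(self.weight ** 2 + 1e-8), -10.0, 10.0)

    def forward(self, x):
        if self.training:
            # local reparameterization: sample pre-activations instead of weights
            mean = F.linear(x, self.weight, self.bias)
            var = F.linear(x ** 2, torch.exp(self.log_sigma2))
            return mean + torch.sqrt(var + 1e-8) * torch.randn_like(mean)
        # at test time, prune weights with high dropout rate (illustrative threshold)
        mask = (self.log_alpha < 3.0).float()
        return F.linear(x, self.weight * mask, self.bias)

    def kl(self):
        # Approximation of the KL term from Molchanov et al. (2017)
        k1, k2, k3 = 0.63576, 1.87320, 1.48695
        la = self.log_alpha
        neg_kl = k1 * torch.sigmoid(k2 + k3 * la) - 0.5 * F.softplus(-la) - k1
        return -neg_kl.sum()
```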
Daguan 2019 rank9 - Rank 9 solution for the DataGrand 2019 information extraction competition
Stars: ✭ 121 (+68.06%)
Deep Learning 101 - The tools and syntax you need to code neural networks from day one.
Stars: ✭ 59 (-18.06%)
SelSum - Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference paper.
Stars: ✭ 36 (-50%)
Satania.moe - Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you
Stars: ✭ 486 (+575%)
Dropblock - Implementation of DropBlock: A regularization method for convolutional networks, in PyTorch.
Stars: ✭ 466 (+547.22%)
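As a rough sketch of the DropBlock idea (Ghiasi et al., 2018), dropping contiguous spatial regions of a feature map rather than individual units, the snippet below may help; it is not the linked repository's API, assumes an odd block size no larger than the feature map, and simplifies the paper's gamma computation:

```python
# Simplified DropBlock sketch, NOT the API of the Dropblock repository.
import torch
import torch.nn.functional as F

def dropblock(x, drop_prob=0.1, block_size=7, training=True):
    """Drop contiguous block_size x block_size regions of an NCHW feature map."""
    if not training or drop_prob == 0.0:
        return x
    n, c, h, w = x.shape
    # probability of sampling a block centre so that roughly drop_prob of units are dropped
    gamma = drop_prob * (h * w) / (block_size ** 2) / ((h - block_size + 1) * (w - block_size + 1))
    centres = (torch.rand(n, c, h, w, device=x.device) < gamma).float()
    # expand each sampled centre into a full block via max pooling
    block_mask = F.max_pool2d(centres, kernel_size=block_size, stride=1, padding=block_size // 2)
    keep_mask = 1.0 - block_mask
    # rescale so activations keep the same expected magnitude
    return x * keep_mask * keep_mask.numel() / (keep_mask.sum() + 1e-6)
```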
BGCN - A Tensorflow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
Stars: ✭ 129 (+79.17%)
Deepnet - Implementation of CNNs, RNNs, and many deep learning techniques in plain NumPy.
Stars: ✭ 285 (+295.83%)
CS231n - My solutions for assignments of CS231n: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 30 (-58.33%)
LoL-Match-Prediction - Win probability predictions for League of Legends matches using neural networks
Stars: ✭ 34 (-52.78%)
vireo - Demultiplexing pooled scRNA-seq data with or without genotype reference
Stars: ✭ 34 (-52.78%)
ALRA - Imputation method for scRNA-seq based on low-rank approximation
Stars: ✭ 48 (-33.33%)
active-inference - A toy model of Friston's active inference in Tensorflow
Stars: ✭ 36 (-50%)
Ailearnnotes - Artificial Intelligence Learning Notes.
Stars: ✭ 195 (+170.83%)
Densenet Sdr - Repository with code for improving on dropout using the Stochastic Delta Rule
Stars: ✭ 148 (+105.56%)
PyLDA - A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (-29.17%)
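For readers unfamiliar with the workflow such an LDA implementation covers, here is a comparable bag-of-words-to-topics example using gensim rather than PyLDA (the toy corpus and parameter values are made up for illustration):

```python
# LDA workflow sketch using gensim, NOT the PyLDA API.
from gensim import corpora, models

texts = [["bayesian", "inference", "dropout"],
         ["variational", "dropout", "network"],
         ["topic", "model", "inference"]]
dictionary = corpora.Dictionary(texts)                 # map tokens to integer ids
corpus = [dictionary.doc2bow(doc) for doc in texts]    # bag-of-words vectors
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
print(lda.print_topics())
```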
Lstms.pth - PyTorch implementations of LSTM variants (Dropout + Layer Norm)
Stars: ✭ 111 (+54.17%)
prosper - A Python Library for Probabilistic Sparse Coding with Non-Standard Priors and Superpositions
Stars: ✭ 17 (-76.39%)
Icellr - Single (i) Cell R package (iCellR): an interactive R package for working with high-throughput single-cell sequencing technologies (i.e., scRNA-seq, scVDJ-seq, ST and CITE-seq).
Stars: ✭ 80 (+11.11%)
adaptive-f-divergence - A tensorflow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (-72.22%)
Cplxmodule - Complex-valued neural networks for PyTorch and Variational Dropout for real and complex layers.
Stars: ✭ 51 (-29.17%)
VINF - Repository for a DTU Special Course focusing on Variational Inference using Normalizing Flows (VINF). Supervised by Michael Riis Andersen.
Stars: ✭ 23 (-68.06%)
Variance Networks - Variance Networks: When Expectation Does Not Meet Your Expectations (ICLR 2019)
Stars: ✭ 38 (-47.22%)
rss - Regression with Summary Statistics.
Stars: ✭ 42 (-41.67%)
haskell-vae - Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-75%)
SIVI - Uses neural networks to build expressive hierarchical distributions; a variational method to accurately estimate posterior uncertainty; a fast and general method for Bayesian inference (ICML 2018).
Stars: ✭ 49 (-31.94%)
Theano lstm - 🔬 Nano-size Theano LSTM module
Stars: ✭ 310 (+330.56%)
dropclass speaker - DropClass and DropAdapt: repository for the paper accepted to Speaker Odyssey 2020
Stars: ✭ 20 (-72.22%)
Celeste.jl - Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (+97.22%)
Good Papers - I try my best to keep up with cutting-edge knowledge in Machine Learning/Deep Learning and Natural Language Processing. These are my notes on some good papers.
Stars: ✭ 248 (+244.44%)
ccube - Bayesian mixture models for estimating and clustering cancer cell fractions
Stars: ✭ 23 (-68.06%)
Probabilistic Models - Collection of probabilistic models and inference algorithms
Stars: ✭ 217 (+201.39%)
normalizing-flows - PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+276.39%)
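To make the normalizing-flow idea concrete, a minimal planar flow (Rezende & Mohamed, 2015) is sketched below; it is not code from the linked repository, the class name is illustrative, and the usual constraint on `u` that guarantees invertibility is omitted for brevity:

```python
# Minimal planar flow sketch, NOT from the normalizing-flows repository.
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """f(z) = z + u * tanh(w^T z + b), with a tractable log|det J|."""
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # z: (batch, dim)
        lin = z @ self.w + self.b                              # (batch,)
        f = z + self.u * torch.tanh(lin).unsqueeze(-1)         # transformed sample
        # log|det df/dz| = log|1 + u^T psi(z)|, psi(z) = (1 - tanh^2(lin)) * w
        psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1.0 + psi @ self.u) + 1e-8)
        return f, log_det
```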
ReactiveMP.jl - Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (-19.44%)
spatial-smoothing - (ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (-5.56%)
probai-2021-pyro - Repository for the tutorials of Days 1-3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (-37.5%)
boundary-gp - Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
Stars: ✭ 21 (-70.83%)