gradientinstitute / Aboleth

Licence: apache-2.0
A bare-bones TensorFlow framework for Bayesian deep learning and Gaussian process approximation

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Aboleth

Exoplanet
Fast & scalable MCMC for all your exoplanet needs!
Stars: ✭ 122 (-3.94%)
Mutual labels:  bayesian-inference, gaussian-processes
models
Forecasting 🇫🇷 elections with Bayesian statistics 🥳
Stars: ✭ 24 (-81.1%)
Mutual labels:  bayesian-inference, gaussian-processes
Stheno.jl
Probabilistic Programming with Gaussian processes in Julia
Stars: ✭ 318 (+150.39%)
Mutual labels:  bayesian-inference, gaussian-processes
approxposterior
A Python package for approximate Bayesian inference and optimization using Gaussian processes
Stars: ✭ 36 (-71.65%)
Mutual labels:  bayesian-inference, gaussian-processes
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (-16.54%)
Mutual labels:  bayesian-inference, gaussian-processes
GPJax
A didactic Gaussian process package for researchers in Jax.
Stars: ✭ 159 (+25.2%)
Mutual labels:  bayesian-inference, gaussian-processes
FBNN
Code for "Functional variational Bayesian neural networks" (https://arxiv.org/abs/1903.05779)
Stars: ✭ 67 (-47.24%)
Mutual labels:  bayesian-inference, gaussian-processes
lgpr
R-package for interpretable nonparametric modeling of longitudinal data using additive Gaussian processes. Contains functionality for inferring covariate effects and assessing covariate relevances. Various models can be specified using a convenient formula syntax.
Stars: ✭ 22 (-82.68%)
Mutual labels:  bayesian-inference, gaussian-processes
Neural Tangents
Fast and Easy Infinite Neural Networks in Python
Stars: ✭ 1,357 (+968.5%)
Mutual labels:  bayesian-inference, gaussian-processes
Deep Kernel Gp
Deep Kernel Learning. Gaussian Process Regression where the input is a neural network mapping of x that maximizes the marginal likelihood
Stars: ✭ 58 (-54.33%)
Mutual labels:  deep-neural-networks, gaussian-processes
TemporalGPs.jl
Fast inference for Gaussian processes in problems involving time. Partly built on results from https://proceedings.mlr.press/v161/tebbutt21a.html
Stars: ✭ 89 (-29.92%)
Mutual labels:  bayesian-inference, gaussian-processes
Vbmc
Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (-3.15%)
Mutual labels:  bayesian-inference, gaussian-processes
TrendinessOfTrends
The Trendiness of Trends
Stars: ✭ 14 (-88.98%)
Mutual labels:  bayesian-inference, gaussian-processes
Survival Analysis Using Deep Learning
This repository contains modern Bayesian statistics and deep learning based research articles and software for survival analysis
Stars: ✭ 139 (+9.45%)
Mutual labels:  bayesian-inference, gaussian-processes
Ipynotebook machinelearning
This contains a number of IP[y]: Notebooks that hopefully shed light on areas of Bayesian machine learning.
Stars: ✭ 27 (-78.74%)
Mutual labels:  bayesian-inference, gaussian-processes
Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+8640.16%)
Mutual labels:  bayesian-inference, gaussian-processes
Bcpd
Bayesian Coherent Point Drift (BCPD/BCPD++); Source Code Available
Stars: ✭ 116 (-8.66%)
Mutual labels:  bayesian-inference, gaussian-processes
Imagecluster
Cluster images based on image content using a pre-trained deep neural network, optional time distance scaling and hierarchical clustering.
Stars: ✭ 122 (-3.94%)
Mutual labels:  deep-neural-networks
Echo
Python package containing all custom layers used in Neural Networks (Compatible with PyTorch, TensorFlow and MegEngine)
Stars: ✭ 126 (-0.79%)
Mutual labels:  deep-neural-networks

=======
Aboleth
=======

.. |copy| unicode:: 0xA9

.. image:: https://circleci.com/gh/data61/aboleth/tree/develop.svg?style=svg&circle-token=f02db635cf3a7e998e17273c91f13ffae7dbf088
   :target: https://circleci.com/gh/data61/aboleth/tree/develop
   :alt: circleCI

.. image:: https://readthedocs.org/projects/aboleth/badge/?version=stable
   :target: http://aboleth.readthedocs.io/en/stable/?badge=stable
   :alt: Documentation Status

A bare-bones `TensorFlow <https://www.tensorflow.org/>`_ framework for Bayesian deep learning and Gaussian process approximation [1]_ with stochastic gradient variational Bayes inference [2]_.

Features
--------

Some of the features of Aboleth:

- Bayesian fully-connected, embedding and convolutional layers using SGVB [2]_ for inference.
- Random Fourier and arc-cosine features for approximate Gaussian processes, with optional variational optimisation of these feature weights as per [1]_ (see the sketch after this list).
- Imputation layers with parameters that are learned as part of a model.
- Noise Contrastive Priors [3]_ for better out-of-domain uncertainty estimation.
- Very flexible construction of networks, e.g. multiple inputs, ResNets etc.
- Compatible and interoperable with other neural net frameworks such as `Keras <https://keras.io/>`_ (see the `demos <https://github.com/data61/aboleth/tree/develop/demos>`_ for more information).
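
For instance, the random-feature Gaussian process approximation composes in the same way as the classifier shown under *Why?* below. The following is a minimal sketch: the ``RandomFourier`` and ``RBF`` argument names follow the Aboleth documentation, and the parameter values are purely illustrative, not tuned.

.. code-block:: python

    import aboleth as ab

    # Sketch of an approximate GP regressor: random Fourier features
    # of an RBF kernel [1], followed by a variational output layer.
    # Illustrative values only.
    kernel = ab.RBF(lenscale=1.)
    net = (
        ab.InputLayer(name="X", n_samples=5) >>
        ab.RandomFourier(n_features=100, kernel=kernel) >>
        ab.DenseVariational(output_dim=1)
    )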

Why?
----

The purpose of Aboleth is to provide a set of high-performance, lightweight components for building Bayesian neural nets and approximate (deep) Gaussian process computational graphs. We aim for minimal abstraction over pure TensorFlow, so you can still assign parts of the computational graph to different hardware, use your own data feeds/queues, and manage your own sessions, etc.

Here is an example of building a simple Bayesian neural net classifier with one hidden layer and Normal prior/posterior distributions on the network weights:

.. code-block:: python

    import tensorflow as tf
    import aboleth as ab

    D = 20     # number of input features (example value)
    N = 10000  # number of training examples (example value)

    # Define the network, ">>" implements function composition,
    # the InputLayer gives a kwarg for this network, and
    # allows us to specify the number of samples for stochastic
    # gradient variational Bayes.
    net = (
        ab.InputLayer(name="X", n_samples=5) >>
        ab.DenseVariational(output_dim=100) >>
        ab.Activation(tf.nn.relu) >>
        ab.DenseVariational(output_dim=1)
    )

    X_ = tf.placeholder(tf.float32, shape=(None, D))
    Y_ = tf.placeholder(tf.float32, shape=(None, 1))

    # Build the network, nn, and the parameter regularization, kl
    nn, kl = net(X=X_)

    # Define the likelihood model
    likelihood = tf.distributions.Bernoulli(logits=nn).log_prob(Y_)

    # Build the final loss function to use with TensorFlow train
    loss = ab.elbo(likelihood, kl, N)

    # Now your TensorFlow training code here!
    ...
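
The ``ab.elbo`` loss corresponds (up to sign) to the stochastic evidence lower bound optimised by SGVB [2]_, roughly

.. math::

    \mathcal{L} = \frac{N}{M} \sum_{i=1}^{M}
        \mathbb{E}_{q(\mathbf{w})}\left[\log p(y_i \mid \mathbf{x}_i, \mathbf{w})\right]
        - \mathrm{KL}\left[ q(\mathbf{w}) \,\|\, p(\mathbf{w}) \right],

where :math:`M` is the mini-batch size and the expectation is estimated with the ``n_samples`` posterior draws requested in the ``InputLayer``. To make the elided training code concrete, here is a minimal sketch using standard TensorFlow 1.x machinery; ``batches`` (a mini-batch generator) and ``X_test`` are hypothetical stand-ins rather than parts of Aboleth:

.. code-block:: python

    optimizer = tf.train.AdamOptimizer()
    train = optimizer.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for X_batch, Y_batch in batches:  # your own data feed/queue
            sess.run(train, feed_dict={X_: X_batch, Y_: Y_batch})

        # Predictive class probabilities, averaged over the posterior
        # samples drawn by the InputLayer (leading dimension of nn).
        probs = sess.run(tf.reduce_mean(tf.nn.sigmoid(nn), axis=0),
                         feed_dict={X_: X_test})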

At the moment the focus of Aboleth is on supervised tasks; however, this is subject to change in subsequent releases if there is interest in other capabilities.

Installation
------------

NOTE: Aboleth is a Python 3 library only. Some of the functionality within it depends on features only found in Python 3. Sorry.

To get up and running quickly you can use pip and get the Aboleth package from `PyPI <https://pypi.python.org/pypi>`_::

    $ pip install aboleth
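
You can quickly check that the install worked by importing the package (assuming it exposes the conventional ``__version__`` attribute)::

    $ python -c "import aboleth; print(aboleth.__version__)"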

For the best performance on your architecture, we recommend `installing TensorFlow from sources <https://www.tensorflow.org/install/install_sources>`_.

Or, to install additional dependencies required by the `demos <https://github.com/data61/aboleth/tree/develop/demos>`_::

    $ pip install aboleth[demos]

To install in develop mode with packages required for development we recommend you clone the repository from GitHub::

    $ git clone git@github.com:data61/aboleth.git

Then in the directory that you cloned into, issue the following::

    $ pip install -e .[dev]

Getting Started
---------------

See the `quick start guide <http://aboleth.readthedocs.io/en/latest/quickstart.html>`_ to get started, and for a more in-depth guide, have a look at our `tutorials <http://aboleth.readthedocs.io/en/latest/tutorials/tutorials.html>`_. Also see the `demos <https://github.com/data61/aboleth/tree/develop/demos>`_ folder for more examples of creating and training algorithms with Aboleth.

The full project documentation can be found on `readthedocs <http://aboleth.readthedocs.io>`_.

References
----------

.. [1] Cutajar, K., Bonilla, E., Michiardi, P. and Filippone, M. Random Feature Expansions for Deep Gaussian Processes. In ICML, 2017.
.. [2] Kingma, D. P. and Welling, M. Auto-encoding Variational Bayes. In ICLR, 2014.
.. [3] Hafner, D., Tran, D., Irpan, A., Lillicrap, T. and Davidson, J. Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors. arXiv preprint arXiv:1807.09289, 2018.

License
-------

Copyright 2017 CSIRO (Data61)

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
