AugmentedGaussianProcesses.jl

[Badges: Docs Latest · Docs Stable · Build Status · Coverage · DOI]

AugmentedGaussianProcesses.jl is a Julia package in development for Data Augmented Sparse Gaussian Processes. It contains a collection of models for different Gaussian and non-Gaussian likelihoods, which are transformed via data augmentation into conditionally conjugate likelihoods, allowing for extremely fast inference via block coordinate updates. More traditional variational inference via quadrature or Monte Carlo integration is also available.
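
As a rough illustration of that choice, the sketch below contrasts the augmented analytic updates with quadrature-based variational inference for the same logistic classification model. The constructor and likelihood names follow the package's interface, but treat the exact signatures as assumptions and check the documentation of your installed version; X_train and Y_train are placeholders as in the usage example further down.

using AugmentedGaussianProcesses, KernelFunctions

kernel = SqExponentialKernel()

# Augmented model: the logistic likelihood is made conditionally conjugate,
# so AnalyticVI performs closed-form block coordinate updates.
augmented_model = VGP(X_train, Y_train, kernel, LogisticLikelihood(), AnalyticVI())

# Non-augmented alternative: classical variational inference where the
# intractable expectations are approximated by Gauss-Hermite quadrature.
numerical_model = VGP(X_train, Y_train, kernel, LogisticLikelihood(), QuadratureVI())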

The theory behind the augmentation is given in the following paper: Automated Augmented Conjugate Inference for Non-conjugate Gaussian Process Models

You can also use the package in Python via PyJulia!

Package models:

Two GP classification likelihoods

  • Logistic : A classifier with a Bernoulli likelihood and the logistic link, made conjugate via Pólya-Gamma augmentation (see the AAAI 19' reference below)
  • BayesianSVM : A classifier with a likelihood based on the Bayesian interpretation of the SVM (see the ECML 17' reference below)

Four GP regression likelihoods

  • Gaussian : The standard Gaussian Process regression model with a Gaussian likelihood (no data augmentation is needed here) IJulia example/Reference
  • StudentT : Gaussian Process regression with a Student-t likelihood (the degrees-of-freedom parameter ν is not optimizable for the moment) IJulia example/Reference
  • Laplace : Gaussian Process regression with a Laplace likelihood IJulia example/(no reference at the moment)
  • Heteroscedastic : Regression with non-stationary noise, whose scale is given by an additional GP (no reference at the moment); see the construction sketch below

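A minimal construction sketch for the regression likelihoods listed above; the constructors shown are assumptions based on the package's naming conventions and may differ slightly between versions, and X_train/Y_train are placeholders as in the usage example below.

using AugmentedGaussianProcesses, KernelFunctions

kernel = SqExponentialKernel()

# Exact GP regression with a Gaussian likelihood (no augmentation needed).
gaussian_gp = GP(X_train, Y_train, kernel)

# Robust regression with a Student-t likelihood (ν = 3), made conjugate by augmentation.
studentt_gp = VGP(X_train, Y_train, kernel, StudentTLikelihood(3.0), AnalyticVI())

# Regression with a Laplace likelihood follows the same pattern.
laplace_gp = VGP(X_train, Y_train, kernel, LaplaceLikelihood(), AnalyticVI())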

Two GP event counting likelihoods

  • Discrete Poisson Process : Estimating the Poisson rate λ at every point, parametrized as λ₀σ(f) (no reference at the moment)
  • Negative Binomial : Estimating the success probability at every point for a negative binomial distribution (no reference at the moment); see the sketch below

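A hypothetical sketch for the counting likelihoods above; Y_counts stands for a vector of integer counts, and the constructor arguments are assumptions, so check the documentation of your installed version.

using AugmentedGaussianProcesses, KernelFunctions

# Poisson model: the rate at each point is λ₀σ(f), here with λ₀ = 10.
poisson_gp = VGP(X_train, Y_counts, SqExponentialKernel(),
                 PoissonLikelihood(10.0), AnalyticVI())

# Negative binomial model with 10 failures until stopping.
negbin_gp = VGP(X_train, Y_counts, SqExponentialKernel(),
                NegBinomialLikelihood(10), AnalyticVI())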

One Multi-Class Classification Likelihood

  • Logistic-SoftMax : A modified version of the softmax where the exponential is replaced by the logistic function IJulia example/Reference


Multi-Output models

  • It is also possible to create a multi-output model where the outputs are a linear combination of the inducing variables (IJulia example in preparation; see the NeurIPS 18' reference below)

More models in development

  • Probit : A classifier with a Bernoulli likelihood and the probit link
  • Online : Allowing for all algorithms to work online as well

Install the package

The package requires at least Julia 1.3. Run julia, press ] to enter the package manager, and type add AugmentedGaussianProcesses; this will install the package and all its dependencies.
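
Equivalently, from a script or the REPL without entering Pkg mode:

using Pkg
Pkg.add("AugmentedGaussianProcesses")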

Use the package

A complete documentation is available in the docs. For a quick start, you can use the following very basic example, where X_train is an N x D matrix (N training points in D dimensions) and Y_train is a vector of outputs (or a matrix of independent outputs).

using AugmentedGaussianProcesses
using KernelFunctions

# Sparse variational GP classifier with 64 inducing points,
# trained by stochastic analytic updates on minibatches of size 100.
model = SVGP(X_train, Y_train, SqExponentialKernel(), LogisticLikelihood(), AnalyticSVI(100), 64)
train!(model, 100) # run 100 training iterations
Y_predic = predict_y(model, X_test) # get the predicted labels directly
Y_predic_prob, Y_predic_prob_var = proba_y(model, X_test) # get the likelihood (and its uncertainty) of predicting class 1

Both documentation and examples/tutorials are available.
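
For a fully self-contained toy run, the sketch below generates synthetic binary labels and goes through the same steps with the non-sparse VGP model; the data generation is purely illustrative, and the calls mirror the basic example above.

using AugmentedGaussianProcesses, KernelFunctions, Random

Random.seed!(42)
N, D = 200, 2
X_train = rand(N, D)                                      # N x D input matrix
Y_train = ifelse.(X_train[:, 1] .> X_train[:, 2], 1, -1)  # labels in {-1, 1} (check the docs for the expected label convention)
X_test = rand(50, D)

model = VGP(X_train, Y_train, SqExponentialKernel(),
            LogisticLikelihood(), AnalyticVI())
train!(model, 20)                        # 20 iterations of analytic updates
Y_pred = predict_y(model, X_test)        # predicted labels
p_pred, p_var = proba_y(model, X_test)   # class probabilities and their uncertainty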

References:

Check out my website for more news

"Gaussian Processes for Machine Learning" by Carl Edward Rasmussen and Christopher K.I. Williams

AISTATS 20' "Automated Augmented Conjugate Inference for Non-conjugate Gaussian Process Models" by Théo Galy-Fajou, Florian Wenzel and Manfred Opper. https://arxiv.org/abs/2002.11451

UAI 19' "Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation" by Théo Galy-Fajou, Florian Wenzel, Christian Donner and Manfred Opper https://arxiv.org/abs/1905.09670

ECML 17' "Bayesian Nonlinear Support Vector Machines for Big Data" by Florian Wenzel, Théo Galy-Fajou, Matthäus Deutsch and Marius Kloft. https://arxiv.org/abs/1707.05532

AAAI 19' "Efficient Gaussian Process Classification using Polya-Gamma Variables" by Florian Wenzel, Théo Galy-Fajou, Christian Donner, Marius Kloft and Manfred Opper. https://arxiv.org/abs/1802.06383

NeurIPS 18' "Heterogeneous Multi-output Gaussian Process Prediction" by Pablo Moreno-Muñoz, Antonio Artés and Mauricio Álvarez. https://papers.nips.cc/paper/7905-heterogeneous-multi-output-gaussian-process-prediction

UAI 13' "Gaussian Processes for Big Data" by James Hensman, Nicolo Fusi and Neil D. Lawrence. https://arxiv.org/abs/1309.6835

JMLR 11' "Robust Gaussian Process Regression with a Student-t Likelihood" by Pasi Jylänki, Jarno Vanhatalo and Aki Vehtari. http://www.jmlr.org/papers/v12/jylanki11a.html
