
rnburn / bbai

License: CC-BY-4.0
Set model hyperparameters using deterministic, exact algorithms.

Programming Languages

python

Projects that are alternatives to or similar to bbai

optkeras
OptKeras: wrapper around Keras and Optuna for hyperparameter optimization
Stars: ✭ 29 (+52.63%)
Mutual labels:  hyperparameter-optimization
shadho
Scalable, structured, dynamically-scheduled hyperparameter optimization.
Stars: ✭ 17 (-10.53%)
Mutual labels:  hyperparameter-optimization
optuna-dashboard
Real-time Web Dashboard for Optuna.
Stars: ✭ 240 (+1163.16%)
Mutual labels:  hyperparameter-optimization
go-bayesopt
A library for doing Bayesian Optimization using Gaussian Processes (blackbox optimizer) in Go/Golang.
Stars: ✭ 47 (+147.37%)
Mutual labels:  hyperparameter-optimization
TransformVariables.jl
Transformations to constrained variables from ℝⁿ.
Stars: ✭ 52 (+173.68%)
Mutual labels:  bayesian-statistics
mango
Parallel Hyperparameter Tuning in Python
Stars: ✭ 241 (+1168.42%)
Mutual labels:  hyperparameter-optimization
bess
Best Subset Selection algorithm for Regression, Classification, Count, Survival analysis
Stars: ✭ 14 (-26.32%)
Mutual labels:  regression-models
Sales-Prediction
In-depth analysis and forecasting of product sales based on the items, stores, transactions, and other dependent variables like holidays and oil prices.
Stars: ✭ 56 (+194.74%)
Mutual labels:  regression-models
sparsereg
A collection of modern sparse (regularized) linear regression algorithms.
Stars: ✭ 55 (+189.47%)
Mutual labels:  regression-models
statistical-rethinking-solutions
Solutions of practice problems from Richard McElreath's "Statistical Rethinking" book.
Stars: ✭ 60 (+215.79%)
Mutual labels:  bayesian-statistics
Luminescence
Development of the R package 'Luminescence'
Stars: ✭ 13 (-31.58%)
Mutual labels:  bayesian-statistics
broomExtra
Helpers for regression analyses using `{broom}` & `{easystats}` packages 📈 🔍
Stars: ✭ 45 (+136.84%)
Mutual labels:  regression-models
geostan
Bayesian spatial analysis
Stars: ✭ 40 (+110.53%)
Mutual labels:  bayesian-statistics
joineRML
R package for fitting joint models to time-to-event data and multivariate longitudinal data
Stars: ✭ 24 (+26.32%)
Mutual labels:  regression-models
cli
Polyaxon Core Client & CLI to streamline MLOps
Stars: ✭ 18 (-5.26%)
Mutual labels:  hyperparameter-optimization
scikit-hyperband
A scikit-learn compatible implementation of hyperband
Stars: ✭ 68 (+257.89%)
Mutual labels:  hyperparameter-optimization
glmnetUtils
Utilities for glmnet
Stars: ✭ 60 (+215.79%)
Mutual labels:  regression-models
aeppl
Tools for an Aesara-based PPL.
Stars: ✭ 46 (+142.11%)
Mutual labels:  bayesian-statistics
srqm
An introductory statistics course for social scientists, using Stata
Stars: ✭ 43 (+126.32%)
Mutual labels:  regression-models
blangSDK
Blang's software development kit
Stars: ✭ 21 (+10.53%)
Mutual labels:  bayesian-statistics

bbai


Many models, even basic ones, have hyperparameters.

Hyperparameters are used, among other things, to prevent overfitting and can have a big impact on a model's performance.

Typically, hyperparameters are treated separately from other model parameters and are set using non-deterministic processes. For example, a hyperparameter specifying regularization strength might be set by estimating out-of-sample performance with a randomized cross-validation and searching through the space of parameters with a brute-force or black-box strategy.
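This conventional workflow can be sketched with scikit-learn's randomized search; the sampling bounds, iteration count, and fold count below are arbitrary illustrative choices:

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import RandomizedSearchCV

# synthetic regression problem
X, y = make_regression(n_samples=100, n_features=10, noise=1.0, random_state=0)

# Randomly sample regularization strengths and estimate out-of-sample
# performance with 5-fold cross-validation; the answer depends on the
# random seed and on how densely the space happens to be sampled.
search = RandomizedSearchCV(
    Ridge(),
    {"alpha": loguniform(1e-4, 1e2)},
    n_iter=20,
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```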

But such methods rarely land on the best value, and manual tweaking is frequently needed to squeeze out better performance.

By contrast, this project provides models that set hyperparameters using deterministic, exact algorithms. Because the process is exact, tweaking is removed from model fitting, and you can focus instead on model specification.

Installation

bbai supports Linux and macOS on x86-64.

pip install bbai

Usage

Ridge Regression

Fit a ridge regression model with the regularization parameter set exactly so as to minimize the mean squared error on a leave-one-out cross-validation of the training data set:

# load example data set
# (note: load_boston was removed in scikit-learn 1.2; use an older
#  scikit-learn or substitute another regression data set)
from sklearn.datasets import load_boston
from sklearn.preprocessing import StandardScaler
X, y = load_boston(return_X_y=True)
X = StandardScaler().fit_transform(X)

# fit model
from bbai.glm import RidgeRegression
model = RidgeRegression()
model.fit(X, y)
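What makes an exact approach possible here is that, for ridge regression, the leave-one-out error has a closed form via the hat matrix, so no refitting is required. A minimal sketch of that identity in plain NumPy (illustrating the mathematics, not bbai's internals):

```python
import numpy as np

def ridge_loo_mse(X, y, alpha):
    """Exact leave-one-out MSE for ridge regression via the hat-matrix
    shortcut: the LOO residual is (y_i - yhat_i) / (1 - h_ii)."""
    H = X @ np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T)
    loo_residuals = (y - H @ y) / (1.0 - np.diag(H))
    return np.mean(loo_residuals ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=50)

# LOO MSE evaluated for several alphas without ever refitting the model
scores = {alpha: ridge_loo_mse(X, y, alpha) for alpha in (0.01, 0.1, 1.0, 10.0)}
```

Because the LOO error is a smooth, cheaply evaluated function of the regularization parameter, its minimizer can be located deterministically rather than by random search.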

Logistic Regression

Fit a logistic regression model with the regularization parameter set exactly so as to maximize the likelihood on an approximate leave-one-out cross-validation of the training data set:

# load example data set
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# fit model
from bbai.glm import LogisticRegression
model = LogisticRegression()
model.fit(X, y)
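For comparison, scikit-learn's LogisticRegressionCV sets the same hyperparameter by cross-validating over a discrete grid of candidate values, so the selected regularization strength is only as good as the grid:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegressionCV
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Only the Cs candidate values are ever considered, so the chosen
# regularization strength is limited by the resolution of the grid.
model = LogisticRegressionCV(Cs=10, cv=5, max_iter=5000)
model.fit(X, y)
print(model.C_)
```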

Bayesian Ridge Regression

Fit a Bayesian ridge regression model where the hyperparameter controlling the regularization strength is integrated over:

# load example data set
# (note: load_boston was removed in scikit-learn 1.2; use an older
#  scikit-learn or substitute another regression data set)
from sklearn.datasets import load_boston
from sklearn.preprocessing import StandardScaler
X, y = load_boston(return_X_y=True)
X = StandardScaler().fit_transform(X)

# fit model
from bbai.glm import BayesianRidgeRegression
model = BayesianRidgeRegression()
model.fit(X, y)
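The idea of integrating over the hyperparameter, rather than committing to a single value, can be illustrated with a small numerical sketch. The fixed noise variance, the alpha grid, and the quadrature weights below are illustrative assumptions, not bbai's actual algorithm:

```python
import numpy as np

def log_evidence(X, y, alpha, sigma2=1.0):
    """Log marginal likelihood after integrating out the weights:
    y ~ N(0, sigma2*I + (1/alpha) X X^T) under the prior w ~ N(0, (1/alpha) I)."""
    n = len(y)
    C = sigma2 * np.eye(n) + (X @ X.T) / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

def integrated_posterior_mean(X, y, alphas, sigma2=1.0):
    # Weight each candidate alpha by its evidence, then average the
    # corresponding conditional posterior means over alpha.
    log_w = np.array([log_evidence(X, y, a, sigma2) for a in alphas])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    p = X.shape[1]
    means = [np.linalg.solve(X.T @ X / sigma2 + a * np.eye(p), X.T @ y / sigma2)
             for a in alphas]
    return sum(wi * mi for wi, mi in zip(w, means))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.0, 0.0, -1.0]) + rng.normal(size=40)
coef = integrated_posterior_mean(X, y, alphas=np.logspace(-2, 2, 25))
```

Averaging over the hyperparameter in this way avoids the overconfidence that comes from conditioning on a single point estimate of the regularization strength.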

Logistic Regression MAP with Jeffreys Prior

Fit a logistic regression MAP model with Jeffreys prior:

# load example data set
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# fit model
from bbai.glm import LogisticRegressionMAP
model = LogisticRegressionMAP()
model.fit(X, y)
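For logistic regression, the Jeffreys prior is proportional to the square root of the determinant of the Fisher information, so the MAP objective adds ½ log det(XᵀWX) to the log-likelihood (equivalently, Firth's bias-reduced estimator). A minimal sketch with NumPy and SciPy on synthetic data, independent of bbai's implementation:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(beta, X, y):
    """Negative log-posterior for logistic regression under the Jeffreys
    prior: -(loglik + 0.5 * logdet(X^T W X)), with W = diag(p * (1 - p))."""
    z = X @ beta
    p = 1.0 / (1.0 + np.exp(-z))
    loglik = np.sum(y * z - np.logaddexp(0.0, z))
    W = p * (1.0 - p)
    _, logdet = np.linalg.slogdet(X.T @ (W[:, None] * X))
    return -(loglik + 0.5 * logdet)

# synthetic binary classification data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
true_beta = np.array([1.5, -1.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)

result = minimize(neg_log_posterior, np.zeros(2), args=(X, y), method="BFGS")
beta_map = result.x
```

Unlike the maximum-likelihood estimate, the Jeffreys-prior MAP estimate stays finite even when the classes are perfectly separated.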

How it works

Examples

  • 01-digits: Fit a multinomial logistic regression model to predict digits.
  • 02-iris: Fit a multinomial logistic regression model to the Iris data set.
  • 03-bayesian: Fit a Bayesian ridge regression model with hyperparameter integration.
  • 04-curve-fitting: Use a Bayesian ridge regression model with hyperparameter integration for curve fitting.
  • 05-jeffreys1: Fit a logistic regression MAP model with Jeffreys prior and a single regressor.
  • 06-jeffreys2: Fit a logistic regression MAP model with Jeffreys prior and two regressors.
  • 07-jeffreys-breast-cancer: Fit a logistic regression MAP model with Jeffreys prior to the breast cancer data set.

Documentation
