
BlackHC / batchbald_redux

License: Apache-2.0
Reusable BatchBALD implementation

Programming Languages

Jupyter Notebook
Python
Makefile

Projects that are alternatives to or similar to batchbald_redux

src
tools for fast reading of docs
Stars: ✭ 40 (-9.09%)
Mutual labels:  active-learning
Generalizing-Lottery-Tickets
This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
Stars: ✭ 48 (+9.09%)
Mutual labels:  neurips-2019
small-text
Active Learning for Text Classification in Python
Stars: ✭ 241 (+447.73%)
Mutual labels:  active-learning
nnanno
Sample, annotate and apply computer vision to the Newspaper Navigator dataset
Stars: ✭ 16 (-63.64%)
Mutual labels:  nbdev
Active-Explainable-Classification
A set of tools for leveraging pre-trained embeddings, active learning and model explainability for efficient document classification
Stars: ✭ 28 (-36.36%)
Mutual labels:  active-learning
trunklucator
Python module for data scientists for quickly creating annotation projects.
Stars: ✭ 80 (+81.82%)
Mutual labels:  active-learning
molpal
active learning for accelerated high-throughput virtual screening
Stars: ✭ 110 (+150%)
Mutual labels:  active-learning
al-fk-self-supervision
Official PyTorch code for CVPR 2020 paper "Deep Active Learning for Biased Datasets via Fisher Kernel Self-Supervision"
Stars: ✭ 28 (-36.36%)
Mutual labels:  active-learning
EntityTargetedActiveLearning
No description or website provided.
Stars: ✭ 17 (-61.36%)
Mutual labels:  active-learning
MONAILabel
MONAI Label is an intelligent open source image labeling and learning tool.
Stars: ✭ 249 (+465.91%)
Mutual labels:  active-learning
differential-privacy-bayesian-optimization
This repo contains the underlying code for all the experiments from the paper: "Automatic Discovery of Privacy-Utility Pareto Fronts"
Stars: ✭ 22 (-50%)
Mutual labels:  active-learning
UPIT
A fastai/PyTorch package for unpaired image-to-image translation.
Stars: ✭ 94 (+113.64%)
Mutual labels:  nbdev
JCLAL
JCLAL is a general purpose framework developed in Java for Active Learning.
Stars: ✭ 22 (-50%)
Mutual labels:  active-learning
ml-afv
No description or website provided.
Stars: ✭ 44 (+0%)
Mutual labels:  neurips-2019
scikit-activeml
Our package scikit-activeml is a Python library for active learning on top of SciPy and scikit-learn.
Stars: ✭ 46 (+4.55%)
Mutual labels:  active-learning
activelearning
Active Learning in R
Stars: ✭ 43 (-2.27%)
Mutual labels:  active-learning
human-in-the-loop-machine-learning-tool-tornado
Tornado is a human-in-the-loop machine learning framework that helps you exploit your unlabelled data to train models through a simple, easy-to-use web interface.
Stars: ✭ 37 (-15.91%)
Mutual labels:  active-learning
asreview-visualization
Visualization extension for ASReview
Stars: ✭ 16 (-63.64%)
Mutual labels:  active-learning
doccano-client
A simple client wrapper for the doccano API.
Stars: ✭ 52 (+18.18%)
Mutual labels:  active-learning
AlpacaTag
AlpacaTag: An Active Learning-based Crowd Annotation Framework for Sequence Tagging (ACL 2019 Demo)
Stars: ✭ 126 (+186.36%)
Mutual labels:  active-learning

BatchBALD Redux

Clean reimplementation of "BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning"

For an introduction & more information, see http://batchbald.ml/. The paper can be found at http://arxiv.org/abs/1906.08158.

The original implementation used in the paper is available at https://github.com/BlackHC/BatchBALD.

We are grateful for fastai's nbdev, which powers this package.

For more information, explore the sections and notebooks in the left-hand menu. The code is available at https://github.com/BlackHC/batchbald_redux, and the website at https://blackhc.github.io/batchbald_redux.

Install

pip install batchbald_redux

Motivation

BatchBALD is an algorithm and acquisition function for active learning in a Bayesian setting, using Bayesian neural networks (BNNs) and MC dropout.

The acquisition function is the mutual information between the joint distribution of a candidate batch's predictions $(y_b)_B$ and the model parameters $\omega$:

$$a_{\text{BatchBALD}}((y_b)_B) = I[(y_b)_B;\omega]$$

The best candidate batch is one that maximizes this acquisition function.
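
Expanding the mutual information shows what must be computed: the joint entropy of the batch's predictions minus the expected joint entropy given the model parameters:

$$I[(y_b)_B;\omega] = H[(y_b)_B] - \mathbb{E}_{p(\omega)}\left[H[(y_b)_B \mid \omega]\right]$$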

In the paper, we show that this acquisition function is submodular, which gives a $1 - 1/e$ optimality guarantee for the greedy algorithm. The candidate batch is therefore selected by greedy expansion, adding one point at a time (see the sketch below).
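
As a rough sketch of greedy expansion (hypothetical code, not the package's actual internals; joint_mutual_information stands in for an estimator of $I[(y_b)_B;\omega]$):

def greedy_batchbald(pool_indices, batch_size, joint_mutual_information):
    # Greedily grow the batch: each step adds the pool point that
    # maximizes the joint mutual information with the current batch.
    batch = []
    for _ in range(batch_size):
        remaining = (i for i in pool_indices if i not in batch)
        best = max(remaining, key=lambda i: joint_mutual_information(batch + [i]))
        batch.append(best)
    return batch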

Joint entropies are hard to estimate and, for everything to work, one also has to use consistent MC dropout, which keeps a set of dropout masks fixed while scoring the pool set.
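
To illustrate the mechanism (a minimal sketch, not the package's implementation), a consistent-dropout layer samples its K masks once and applies the same masks to every pool point:

import torch
from torch import nn

class FixedMaskDropout(nn.Module):
    # Sketch: sample K dropout masks once and reuse them across the pool set.
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p
        self.masks = None

    def resample_masks(self, k, num_features, device=None):
        # One Bernoulli keep-mask per MC sample, scaled for inverted dropout.
        keep = 1.0 - self.p
        probs = torch.full((k, num_features), keep, device=device)
        self.masks = torch.bernoulli(probs) / keep

    def forward(self, x_b_k_f):
        # x_b_k_f: [batch, K, features]; broadcasting applies the same
        # K masks to every input, making the MC samples consistent.
        return x_b_k_f * self.masks.unsqueeze(0)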

To aid reproducibility and baseline reproduction, we provide this simpler and clearer reimplementation.

Please cite us

@misc{kirsch2019batchbald,
    title={BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning},
    author={Andreas Kirsch and Joost van Amersfoort and Yarin Gal},
    year={2019},
    eprint={1906.08158},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

How to use

We provide a simple example experiment that uses this package here.

To get a candidate batch using BatchBALD, we provide a simple API in batchbald_redux.batchbald:

get_batchbald_batch(log_probs_N_K_C: Tensor, batch_size: int, num_samples: int, dtype=None, device=None)

log_probs_N_K_C holds log-probabilities for N pool points, K consistent MC-dropout samples, and C classes; num_samples controls how many MC samples are drawn to estimate the joint entropies.
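
For example, with random log-probabilities standing in for real model output (a sketch; in the releases we have looked at, the result is a named tuple carrying the selected indices and their scores, but the field names may differ):

import torch
from batchbald_redux.batchbald import get_batchbald_batch

N, K, C = 1000, 20, 10  # pool points, MC-dropout samples, classes
log_probs_N_K_C = torch.randn(N, K, C).log_softmax(dim=-1)

candidate_batch = get_batchbald_batch(log_probs_N_K_C, batch_size=5, num_samples=10000)
print(candidate_batch.indices, candidate_batch.scores)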

We also provide a simple implementation of consistent MC dropout in batchbald_redux.consistent_mc_dropout.
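
A model built on it would follow roughly the pattern below (a sketch based on the package's notebooks; BayesianModule and ConsistentMCDropout are the names as we recall them, so check the module itself):

import torch
from torch import nn
from batchbald_redux.consistent_mc_dropout import BayesianModule, ConsistentMCDropout

class BayesianNet(BayesianModule):
    # Sketch: a small classifier whose dropout masks stay fixed
    # across the K MC samples of a forward pass.
    def __init__(self, num_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.drop = ConsistentMCDropout()
        self.fc2 = nn.Linear(128, num_classes)

    def mc_forward_impl(self, x):
        x = torch.relu(self.fc1(x))
        x = self.drop(x)
        return torch.log_softmax(self.fc2(x), dim=-1)

model = BayesianNet()
log_probs_B_K_C = model(torch.randn(64, 784), 20)  # shape [64, 20, 10]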
