
tensorflow / Adanet

License: Apache-2.0
Fast and flexible AutoML with learning guarantees.

Programming Languages

Python
139335 projects - #7 most used programming language
Jupyter Notebook
11667 projects

Projects that are alternatives of or similar to Adanet

H2o 3
H2O is an Open Source, Distributed, Fast & Scalable Machine Learning Platform: Deep Learning, Gradient Boosting (GBM) & XGBoost, Random Forest, Generalized Linear Modeling (GLM with Elastic Net), K-Means, PCA, Generalized Additive Models (GAM), RuleFit, Support Vector Machine (SVM), Stacked Ensembles, Automatic Machine Learning (AutoML), etc.
Stars: ✭ 5,656 (+69.34%)
Mutual labels:  jupyter-notebook, gpu, automl
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-99.13%)
Mutual labels:  automl, neural-architecture-search
Codefun
DataStructure(SwordOffer、LeetCode)、Deep Learning(Tensorflow、Keras、Pytorch)、Machine Learning(sklearn、spark)、AutoML、AutoDL、ModelDeploying、SQL
Stars: ✭ 319 (-90.45%)
Mutual labels:  jupyter-notebook, automl
Hetu
A high-performance distributed deep learning system targeting large-scale and automated distributed training.
Stars: ✭ 78 (-97.66%)
Mutual labels:  gpu, distributed-training
HyperKeras
An AutoDL tool for Neural Architecture Search and Hyperparameter Optimization on Tensorflow and Keras
Stars: ✭ 29 (-99.13%)
Mutual labels:  automl, neural-architecture-search
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-99.22%)
Mutual labels:  automl, neural-architecture-search
HyperGBM
A full pipeline AutoML tool for tabular data
Stars: ✭ 172 (-94.85%)
Mutual labels:  automl, distributed-training
Flaml
A fast and lightweight AutoML library.
Stars: ✭ 205 (-93.86%)
Mutual labels:  jupyter-notebook, automl
Adaptnlp
An easy to use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving up state-of-the-art NLP models.
Stars: ✭ 278 (-91.68%)
Mutual labels:  jupyter-notebook, gpu
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+17.37%)
Mutual labels:  automl, neural-architecture-search
Pycaret
An open-source, low-code machine learning library in Python
Stars: ✭ 4,594 (+37.54%)
Mutual labels:  jupyter-notebook, gpu
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-93.38%)
Mutual labels:  automl, neural-architecture-search
managed ml systems and iot
Managed Machine Learning Systems and Internet of Things Live Lesson
Stars: ✭ 35 (-98.95%)
Mutual labels:  automl, tpu
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-96.26%)
Mutual labels:  automl, neural-architecture-search
AutoSpeech
[InterSpeech 2020] "AutoSpeech: Neural Architecture Search for Speaker Recognition" by Shaojin Ding*, Tianlong Chen*, Xinyu Gong, Weiwei Zha, Zhangyang Wang
Stars: ✭ 195 (-94.16%)
Mutual labels:  automl, neural-architecture-search
hfta
Boost hardware utilization for ML training workloads via Inter-model Horizontal Fusion
Stars: ✭ 18 (-99.46%)
Mutual labels:  gpu, tpu
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (-90.75%)
Mutual labels:  automl, neural-architecture-search
Auto ts
Automatically build ARIMA, SARIMAX, VAR, FB Prophet and XGBoost Models on Time Series data sets with a Single Line of Code. Now updated with Dask to handle millions of rows.
Stars: ✭ 195 (-94.16%)
Mutual labels:  jupyter-notebook, automl
Keras Acgan
Auxiliary Classifier Generative Adversarial Networks in Keras
Stars: ✭ 196 (-94.13%)
Mutual labels:  jupyter-notebook, gpu
Awesome Automl Papers
A curated list of automated machine learning papers, articles, tutorials, slides and projects
Stars: ✭ 3,198 (-4.25%)
Mutual labels:  automl, neural-architecture-search

AdaNet

[AdaNet tangram logo]


AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models.

This project is based on the AdaNet algorithm, presented in “AdaNet: Adaptive Structural Learning of Artificial Neural Networks” at ICML 2017, for learning the structure of a neural network as an ensemble of subnetworks.

AdaNet has the following goals:

  • Ease of use: Provide familiar APIs (e.g. Keras, Estimator) for training, evaluating, and serving models.
  • Speed: Scale with available compute and quickly produce high quality models.
  • Flexibility: Allow researchers and practitioners to extend AdaNet to novel subnetwork architectures, search spaces, and tasks.
  • Learning guarantees: Optimize an objective that offers theoretical learning guarantees (see the objective sketched below).
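
For reference, the objective from the ICML 2017 paper balances the ensemble's training loss against the complexity of its subnetworks. A sketch of its form (a summary of the paper's notation, not a verbatim reproduction):

F(w) = \frac{1}{m}\sum_{i=1}^{m} \Phi\bigl(1 - y_i f(x_i)\bigr) + \sum_{j=1}^{N} \bigl(\lambda r_j + \beta\bigr)\,\lvert w_j \rvert, \qquad f = \sum_{j=1}^{N} w_j h_j

where the h_j are candidate subnetworks with mixture weights w_j, \Phi is a convex surrogate loss, r_j measures the (Rademacher) complexity of the class containing h_j, and \lambda, \beta \ge 0 trade off loss against complexity.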

The following animation shows AdaNet adaptively growing an ensemble of neural networks. At each iteration, it measures the ensemble loss for each candidate, and selects the best one to move on to the next iteration. At subsequent iterations, the blue subnetworks are frozen, and only yellow subnetworks are trained:

[Animation: AdaNet adaptively growing an ensemble of neural networks]
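
A conceptual sketch of that loop in plain Python (toy stand-ins only; propose_candidates and ensemble_loss are illustrative names, not part of the adanet API):

import random

# Toy stand-ins for AdaNet's moving parts; none of these names exist in the
# adanet package.
def propose_candidates(ensemble):
    # Each candidate extends the current ensemble with one new subnetwork,
    # for example of varying depth.
    return [ensemble + [f"subnet(iter={len(ensemble)}, depth={d})"]
            for d in (1, 2, 3)]

def ensemble_loss(candidate):
    # Stand-in for training the new subnetwork and evaluating the ensemble.
    return random.random()

ensemble = []
for iteration in range(3):
    candidates = propose_candidates(ensemble)
    # Keep the candidate whose ensemble loss is lowest; its earlier members
    # stay frozen in the next iteration.
    ensemble = min(candidates, key=ensemble_loss)

print(ensemble)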

AdaNet was first announced on the Google AI research blog: "Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees".

This is not an official Google product.

Features

AdaNet provides AutoML features such as adaptive architecture search and ensemble learning; see the project documentation for the complete list.

Example

A simple example of learning to ensemble linear and neural network models:

import adanet
import tensorflow as tf

# Define the model head for computing loss and evaluation metrics.
head = tf.estimator.MultiClassHead(n_classes=10)

# Feature columns define how to process examples.
feature_columns = ...

# Learn to ensemble linear and neural network models.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool={
        "linear":
            tf.estimator.LinearEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=...),
        "dnn":
            tf.estimator.DNNEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=...,
                hidden_units=[1000, 500, 100])},
    max_iteration_steps=50)

estimator.train(input_fn=train_input_fn, steps=100)
metrics = estimator.evaluate(input_fn=eval_input_fn)
predictions = estimator.predict(input_fn=predict_input_fn)
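
The input functions above are left undefined. A minimal sketch of how one might be built with tf.data, assuming in-memory NumPy data (make_input_fn and the feature name "x" are illustrative, not part of adanet):

import numpy as np
import tensorflow as tf

def make_input_fn(features, labels, batch_size=32, shuffle=True):
    # Builds a tf.estimator-style input_fn; the keys in `features` must
    # match the names used by `feature_columns`.
    def input_fn():
        ds = tf.data.Dataset.from_tensor_slices((features, labels))
        if shuffle:
            ds = ds.shuffle(buffer_size=len(labels))
        return ds.batch(batch_size)
    return input_fn

# Example usage with a single dense feature named "x".
train_input_fn = make_input_fn(
    {"x": np.random.rand(100, 4).astype(np.float32)},
    np.random.randint(0, 10, size=100))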

Getting Started

To get you started, work through the example above and the tutorials in the repository.

Requirements

Requires Python 3.6 or above.

adanet is built on TensorFlow 2.1. It depends on bug fixes and enhancements not present in TensorFlow releases prior to 2.1. You must install or upgrade your TensorFlow package to at least 2.1:

$ pip install "tensorflow==2.1"

Installing with Pip

You can use the pip package manager to install the official adanet package from PyPI:

$ pip install adanet
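
A quick sanity check that the install succeeded (assuming the package exposes __version__, as recent releases do):

$ python -c "import adanet; print(adanet.__version__)"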

Installing from Source

To install from source, you'll first need to install Bazel by following its installation instructions.

Next, clone the adanet repository:

$ git clone https://github.com/tensorflow/adanet
$ cd adanet

From the adanet root directory, build the project and run the tests:

$ bazel build -c opt //...
$ python3 -m nose

Once you have verified that the tests pass, install adanet from source as a pip package.
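
One common way to do that, assuming the repository root contains standard Python packaging files (check the repo for the exact instructions):

$ pip install .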

You are now ready to experiment with adanet.

import adanet

Citing this Work

If you use this AdaNet library for academic research, you are encouraged to cite the following paper from the ICML 2019 AutoML Workshop:

@misc{weill2019adanet,
    title={AdaNet: A Scalable and Flexible Framework for Automatically Learning Ensembles},
    author={Charles Weill and Javier Gonzalvo and Vitaly Kuznetsov and Scott Yang and Scott Yak and Hanna Mazzawi and Eugen Hotaj and Ghassen Jerfel and Vladimir Macko and Ben Adlam and Mehryar Mohri and Corinna Cortes},
    year={2019},
    eprint={1905.00080},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

License

AdaNet is released under the Apache License 2.0.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].