AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models.
This project is based on the AdaNet algorithm, presented in “AdaNet: Adaptive Structural Learning of Artificial Neural Networks” at ICML 2017, for learning the structure of a neural network as an ensemble of subnetworks.
AdaNet has the following goals:
The following animation shows AdaNet adaptively growing an ensemble of neural networks. At each iteration, it measures the ensemble loss for each candidate and selects the best one to move on to the next iteration. At subsequent iterations, the blue subnetworks are frozen, and only the yellow subnetworks are trained:
AdaNet was first announced on the Google AI research blog: "Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees".
This is not an official Google product.
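The adaptive loop described above can be sketched in a few lines of plain Python. This is a conceptual illustration only, not the library's implementation; `generate_candidates`, `train`, and `ensemble_loss` are hypothetical callbacks standing in for AdaNet's candidate generation, subnetwork training, and ensemble evaluation:

```python
def adanet_loop(generate_candidates, train, ensemble_loss, num_iterations):
    """Conceptual sketch: grow an ensemble one subnetwork at a time."""
    ensemble = []  # previously selected subnetworks stay frozen
    for _ in range(num_iterations):
        # Train each new candidate alongside the frozen ensemble.
        candidates = [train(c, ensemble) for c in generate_candidates(ensemble)]
        # Keep the candidate whose addition gives the lowest ensemble loss.
        best = min(candidates, key=lambda c: ensemble_loss(ensemble + [c]))
        ensemble.append(best)
    return ensemble
```

Only the newly generated candidates are trained at each step; everything already in `ensemble` is treated as frozen, matching the blue/yellow distinction in the animation.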
AdaNet provides the following AutoML features:

- A `tf.estimator.Estimator` API for training, evaluation, prediction, and serving models.
- An `adanet.AutoEnsembleEstimator` for learning to ensemble user-defined `tf.estimator.Estimator`s.
- Support for defining subnetworks with `tf.layers` via the `adanet.subnetwork` API.

A simple example of learning to ensemble linear and neural network models:
```python
import adanet
import tensorflow as tf

# Define the model head for computing loss and evaluation metrics.
head = tf.estimator.MultiClassHead(n_classes=10)

# Feature columns define how to process examples.
feature_columns = ...

# Learn to ensemble linear and neural network models.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool={
        "linear":
            tf.estimator.LinearEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=...),
        "dnn":
            tf.estimator.DNNEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=...,
                hidden_units=[1000, 500, 100])},
    max_iteration_steps=50)

estimator.train(input_fn=train_input_fn, steps=100)
metrics = estimator.evaluate(input_fn=eval_input_fn)
predictions = estimator.predict(input_fn=predict_input_fn)
```
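The `train_input_fn`, `eval_input_fn`, and `predict_input_fn` arguments above are standard Estimator input functions. A minimal sketch of one using `tf.data` follows; the feature name `"x"`, the 784-dimensional inputs, and the random stand-in data are illustrative assumptions, not part of the AdaNet API:

```python
import numpy as np
import tensorflow as tf

def make_input_fn(features, labels, batch_size=32, shuffle=True):
    """Builds an Estimator-style input_fn over in-memory NumPy arrays."""
    def input_fn():
        ds = tf.data.Dataset.from_tensor_slices(({"x": features}, labels))
        if shuffle:
            ds = ds.shuffle(buffer_size=len(labels))
        return ds.batch(batch_size)
    return input_fn

# Random stand-in data shaped like flattened 28x28 grayscale images.
train_input_fn = make_input_fn(
    np.random.rand(64, 784).astype(np.float32),
    np.random.randint(0, 10, size=64).astype(np.int32))
```

The same factory can produce `eval_input_fn` and `predict_input_fn` by passing the corresponding arrays, typically with `shuffle=False`.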
To get you started:
Requires Python 3.6 or above.
`adanet` is built on TensorFlow 2.1 and depends on bug fixes and enhancements not present in earlier TensorFlow releases. You must install or upgrade your TensorFlow package to at least 2.1:

```shell
$ pip install "tensorflow==2.1"
```
You can use the pip package manager to install the official `adanet` package from PyPI:

```shell
$ pip install adanet
```
To install from source, you'll first need to install `bazel` by following its installation instructions. Next, clone the `adanet` repository:

```shell
$ git clone https://github.com/tensorflow/adanet
$ cd adanet
```
From the `adanet` root directory, run the tests:

```shell
$ bazel build -c opt //...
$ python3 -m nose
```
Once you have verified that the tests pass, install `adanet` from source as a pip package. You are now ready to experiment with `adanet`:

```python
import adanet
```
If you use this AdaNet library for academic research, you are encouraged to cite the following paper from the ICML 2019 AutoML Workshop:
```
@misc{weill2019adanet,
  title={AdaNet: A Scalable and Flexible Framework for Automatically Learning Ensembles},
  author={Charles Weill and Javier Gonzalvo and Vitaly Kuznetsov and Scott Yang and Scott Yak and Hanna Mazzawi and Eugen Hotaj and Ghassen Jerfel and Vladimir Macko and Ben Adlam and Mehryar Mohri and Corinna Cortes},
  year={2019},
  eprint={1905.00080},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
```
AdaNet is released under the Apache License 2.0.