autotorch.searcher

RandomSearcher

class autotorch.searcher.RandomSearcher(configspace)[source]

Random-sampling searcher for a ConfigSpace search space.

Parameters

configspace (ConfigSpace.ConfigurationSpace) – The configuration space to sample from. It contains the full specification of the hyperparameters and their priors.

Examples

>>> import numpy as np
>>> import autotorch as at
>>> @at.args(
...     lr=at.Real(1e-3, 1e-2, log=True),
...     wd=at.Real(1e-3, 1e-2))
... def train_fn(args, reporter):
...     print('lr: {}, wd: {}'.format(args.lr, args.wd))
...     for e in range(10):
...         dummy_accuracy = 1 - np.power(1.8, -np.random.uniform(e, 2*e))
...         reporter(epoch=e, accuracy=dummy_accuracy, lr=args.lr, wd=args.wd)
>>> searcher = at.searcher.RandomSearcher(train_fn.cs)
>>> searcher.get_config()
{'lr': 0.0031622777, 'wd': 0.0055}
get_config(**kwargs)[source]

Sample a new configuration at random.

Returns

config – a valid configuration sampled from the configuration space
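
For reference, a minimal sketch of the equivalent direct call into the ConfigSpace package, which is assumed here to be what get_config wraps:

>>> import ConfigSpace as CS
>>> import ConfigSpace.hyperparameters as CSH
>>> cs = CS.ConfigurationSpace()
>>> cs.add_hyperparameter(CSH.UniformFloatHyperparameter('lr', 1e-3, 1e-2, log=True))
>>> cs.add_hyperparameter(CSH.UniformFloatHyperparameter('wd', 1e-3, 1e-2))
>>> cs.sample_configuration().get_dictionary()  # one random, valid configuration
{'lr': 0.0031622777, 'wd': 0.0055}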

GridSearcher

class autotorch.searcher.GridSearcher(configspace)[source]

Grid searcher; supports only search spaces made up of autotorch.space.Choice hyperparameters.

Requires scikit-learn to be installed. You can install scikit-learn with the command: pip install scikit-learn.

Examples

>>> import autotorch as at
>>> @at.args(
...     x=at.space.Choice(0, 1, 2),
...     y=at.space.Choice('a', 'b', 'c'))
... def train_fn(args, reporter):
...     pass
>>> searcher = at.searcher.GridSearcher(train_fn.cs)
>>> searcher.get_config()
Number of configurations for grid search is 9
{'x.choice': 2, 'y.choice': 2}
get_config()[source]

Sample a new configuration.

This function is called inside TaskScheduler to query a new configuration.

Returns

(config, info_dict) – a valid configuration and a (possibly empty) info dict
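
GridSearcher requires scikit-learn, and its grid can be reproduced with sklearn's ParameterGrid; the sketch below is an assumption about the enumeration, not the class's actual internals (the 'x.choice'/'y.choice' index keys mirror the example output above):

>>> from sklearn.model_selection import ParameterGrid
>>> grid = ParameterGrid({'x.choice': [0, 1, 2], 'y.choice': [0, 1, 2]})
>>> len(grid)  # Cartesian product of the two Choice spaces
9
>>> for config in grid:
...     pass  # each config dict is one grid point, e.g. {'x.choice': 0, 'y.choice': 0}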

BayesOptSearcher

class autotorch.searcher.BayesOptSearcher(configspace, lazy_configs=None, random_state=None, ac_kind='ucb', **kwargs)[source]

A Bayesian optimization searcher; a wrapper around BayesOpt.

Requires scikit-learn to be installed. You can install scikit-learn with the command: pip install scikit-learn.

Parameters
  • configspace (ConfigSpace.ConfigurationSpace) – The configuration space to sample from. It contains the full specification of the hyperparameters and their priors.

  • lazy_configs (list of dict) – Manual configurations to handle special cases. Not all configurations in a space are necessarily valid; lazy_configs lets you pre-sample configurations that satisfy extra constraints the space itself cannot express, as shown in the sketch below.
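
A sketch of pre-sampling with lazy_configs, assuming the train_fn space from the example below and a hypothetical extra constraint lr * wd < 1e-5:

>>> lazy = []
>>> while len(lazy) < 20:
...     c = train_fn.cs.sample_configuration().get_dictionary()
...     if c['lr'] * c['wd'] < 1e-5:  # constraint the space itself cannot express
...         lazy.append(c)
>>> searcher = at.searcher.BayesOptSearcher(train_fn.cs, lazy_configs=lazy)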

Examples

>>> import numpy as np
>>> import autotorch as at
>>> @at.args(
...     lr=at.Real(1e-3, 1e-2, log=True),
...     wd=at.Real(1e-3, 1e-2))
... def train_fn(args, reporter):
...     print('lr: {}, wd: {}'.format(args.lr, args.wd))
...     for e in range(10):
...         dummy_accuracy = 1 - np.power(1.8, -np.random.uniform(e, 2*e))
...         reporter(epoch=e, accuracy=dummy_accuracy, lr=args.lr, wd=args.wd)
>>> searcher = at.searcher.BayesOptSearcher(train_fn.cs)
>>> searcher.get_config()
{'lr': 0.0031622777, 'wd': 0.0055}
acq_max(n_warmup=10000, n_iter=20)[source]

Find the maximum of the acquisition function using a combination of random sampling (cheap) and the L-BFGS-B optimization method: first sample the acquisition function at n_warmup random points, then run L-BFGS-B from n_iter random starting points.

Parameters
  • n_warmup (int) – number of points at which to randomly sample the acquisition function

  • n_iter (int) – number of random starting points from which to run scipy.optimize.minimize (L-BFGS-B)

Returns

x_max – the arg max of the acquisition function
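
A minimal sketch of this two-stage maximization; the helper name, the vectorized acquisition function ac(X) -> np.ndarray, and the (dim, 2) bounds array are all assumptions for illustration:

>>> import numpy as np
>>> from scipy.optimize import minimize
>>> def acq_max_sketch(ac, bounds, n_warmup=10000, n_iter=20):
...     # stage 1: cheap random warm-up over the box bounds
...     x_tries = np.random.uniform(bounds[:, 0], bounds[:, 1],
...                                 size=(n_warmup, bounds.shape[0]))
...     ys = ac(x_tries)
...     x_max, y_max = x_tries[ys.argmax()], ys.max()
...     # stage 2: refine with L-BFGS-B from n_iter random starting points
...     for x0 in np.random.uniform(bounds[:, 0], bounds[:, 1],
...                                 size=(n_iter, bounds.shape[0])):
...         res = minimize(lambda x: -ac(x.reshape(1, -1))[0], x0,
...                        bounds=bounds, method='L-BFGS-B')
...         if res.success and -res.fun > y_max:
...             x_max, y_max = res.x, -res.fun
...     return np.clip(x_max, bounds[:, 0], bounds[:, 1])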

get_config(**kwargs)[source]

Sample a new configuration.

Returns

config – a valid configuration

update(config, reward, **kwargs)[source]

Update the searcher with the newest metric report.

Parameters
  • config (dict) – the configuration that was evaluated

  • reward (float) – the metric value observed for config
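
A sketched search loop tying get_config and update together; the quadratic stand-in reward is an assumption, and in practice it would come from an actual training run:

>>> searcher = at.searcher.BayesOptSearcher(train_fn.cs)
>>> for _ in range(10):
...     config = searcher.get_config()
...     reward = 1.0 - (config['lr'] - 5e-3) ** 2  # stand-in for a real metric
...     searcher.update(config, reward)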