autotorch¶
Real¶
- class autotorch.Real(lower, upper, default=None, log=False)[source]¶
  Real-valued search space, sampled on a linear scale (or a log scale when log=True).
- Parameters
  - lower (float) – the lower bound of the search space
  - upper (float) – the upper bound of the search space
  - default (float, optional) – default value
  - log (bool) – if True, sample the search space on a log scale
Examples
>>> learning_rate = at.Real(0.01, 0.1, log=True)
>>> learning_rate.rand
0.013396492756434304
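Under the hood, a log-scale space samples uniformly in log space and exponentiates the result back, so each order of magnitude is equally likely. A minimal sketch of this behaviour (the `RealSpace` class below is illustrative, not autotorch's actual internals):

```python
import math
import random

class RealSpace:
    """Sketch of a continuous search space with optional log scale."""
    def __init__(self, lower, upper, default=None, log=False):
        self.lower, self.upper = lower, upper
        self.default, self.log = default, log

    @property
    def rand(self):
        if self.log:
            # Uniform in log space, then map back: favours no magnitude.
            return math.exp(random.uniform(math.log(self.lower),
                                           math.log(self.upper)))
        return random.uniform(self.lower, self.upper)

learning_rate = RealSpace(0.01, 0.1, log=True)
sample = learning_rate.rand  # a float between 0.01 and 0.1
```

Without `log=True`, half the samples would fall in [0.055, 0.1]; with it, learning rates near 0.01 are drawn as often as those near 0.1.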
Int¶
Choice¶
List¶
- class autotorch.List(*args)[source]¶
  A searchable list (nested search space).
- Parameters
  - args (list) – a list of search spaces
Examples
>>> sequence = at.List(
>>>     at.Choice('conv3x3', 'conv5x5', 'conv7x7'),
>>>     at.Choice('BatchNorm', 'InstanceNorm'),
>>>     at.Choice('relu', 'sigmoid'),
>>> )
>>> sequence.rand
['conv3x3', 'InstanceNorm', 'relu']
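The nested behaviour can be mimicked in plain Python: sampling a list space draws one value from each child space, in order. A sketch under that assumption (`ChoiceSpace` and `ListSpace` are hypothetical stand-ins for autotorch's classes):

```python
import random

class ChoiceSpace:
    """Sketch of a categorical space: pick one of the given values."""
    def __init__(self, *values):
        self.values = values

    @property
    def rand(self):
        return random.choice(self.values)

class ListSpace:
    """Sketch of a nested space: sample every child space in order."""
    def __init__(self, *spaces):
        self.spaces = spaces

    @property
    def rand(self):
        return [space.rand for space in self.spaces]

sequence = ListSpace(
    ChoiceSpace('conv3x3', 'conv5x5', 'conv7x7'),
    ChoiceSpace('BatchNorm', 'InstanceNorm'),
    ChoiceSpace('relu', 'sigmoid'),
)
config = sequence.rand  # e.g. ['conv3x3', 'InstanceNorm', 'relu']
```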
Dict¶
Bool¶
args¶
- autotorch.args(default={}, **kwvars)[source]¶
  Decorator for a customized training script, registering arguments or searchable spaces to the decorated function. The arguments should be Python built-in objects, autotorch objects (see autotorch.obj()), or autotorch search spaces (autotorch.space.Int, autotorch.space.Real, …).
  Examples
>>> @at.args(batch_size=10, lr=at.Real(0.01, 0.1))
>>> def my_train(args):
...     print('Batch size is {}, LR is {}'.format(args.batch_size, args.lr))
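The decorator's effect can be sketched in plain Python: it captures the keyword defaults, resolves any search-space values to concrete ones (here by random sampling), and hands them to the wrapped function as an attribute-style namespace. Everything below (`register_args`, the sampling step) is an illustration, not autotorch's implementation:

```python
import functools
import random
from types import SimpleNamespace

class Real:
    """Stand-in for a continuous search space (illustrative)."""
    def __init__(self, lower, upper):
        self.lower, self.upper = lower, upper

    @property
    def rand(self):
        return random.uniform(self.lower, self.upper)

def register_args(**kwvars):
    """Sketch of an @args-style decorator: resolve spaces, bundle a namespace."""
    def decorator(train_fn):
        @functools.wraps(train_fn)
        def wrapper():
            resolved = {k: (v.rand if isinstance(v, Real) else v)
                        for k, v in kwvars.items()}
            return train_fn(SimpleNamespace(**resolved))
        return wrapper
    return decorator

@register_args(batch_size=10, lr=Real(0.01, 0.1))
def my_train(args):
    return args.batch_size, args.lr

batch_size, lr = my_train()  # batch_size stays 10; lr is sampled from the space
```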
obj¶
- autotorch.obj(**kwvars)[source]¶
  Register arguments or searchable spaces to a class.
- Returns
  a lazily initialized object, which allows distributed training.
- Return type
  instance of autotorch.space.AutoTorchObject
Examples
>>> import autotorch as at
>>> import torch
>>> @at.obj(
>>>     lr=at.Real(1e-4, 1e-1, log=True),
>>>     weight_decay=at.Real(1e-4, 1e-1),
>>> )
>>> class Adam(torch.optim.Adam):
>>>     pass
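A lazy-init object records the class and its (possibly searchable) hyperparameters without constructing anything; construction happens later, once concrete values are chosen, which is what makes the pattern friendly to distributed training. A minimal sketch, where `LazyObject`, `lazy_obj`, and the `init` method are illustrative names rather than autotorch's API:

```python
import random

class Real:
    """Stand-in for a continuous search space (illustrative)."""
    def __init__(self, lower, upper):
        self.lower, self.upper = lower, upper

    @property
    def rand(self):
        return random.uniform(self.lower, self.upper)

class LazyObject:
    """Sketch of a lazy-init wrapper: stores class + kwargs, defers construction."""
    def __init__(self, cls, **kwvars):
        self.cls, self.kwvars = cls, kwvars

    def init(self, **overrides):
        # Resolve search spaces to concrete values (here: random sample),
        # apply any overrides, then finally construct the object.
        kwargs = {k: (v.rand if isinstance(v, Real) else v)
                  for k, v in self.kwvars.items()}
        kwargs.update(overrides)
        return self.cls(**kwargs)

def lazy_obj(**kwvars):
    """Sketch of an @obj-style decorator returning a lazy-init wrapper."""
    def decorator(cls):
        return LazyObject(cls, **kwvars)
    return decorator

@lazy_obj(lr=Real(1e-4, 1e-1), weight_decay=Real(1e-4, 1e-1))
class Config:
    def __init__(self, lr, weight_decay):
        self.lr, self.weight_decay = lr, weight_decay

cfg = Config.init()  # constructed only now, with sampled hyperparameters
```

Because only the class and the kwargs cross process boundaries, each worker can resolve its own hyperparameter values before constructing the object.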
func¶
- autotorch.func(**kwvars)[source]¶
  Register arguments or searchable spaces to a function.
- Returns
  a lazily initialized object, which allows distributed training.
- Return type
  instance of autotorch.space.AutoTorchObject
Examples
>>> import torchvision.models as models
>>>
>>> @at.func(pretrained=at.Bool())
>>> def resnet18(pretrained):
...     return models.resnet18(pretrained=pretrained)
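The same lazy pattern applies to functions: the call is recorded but not executed until concrete values exist for every searchable argument. A sketch with a boolean space (the `Bool`, `LazyFunction`, and `lazy_func` names are illustrative, and a plain dict stands in for `models.resnet18` to keep the example self-contained):

```python
import random

class Bool:
    """Stand-in for a True/False search space (illustrative)."""
    @property
    def rand(self):
        return random.choice([True, False])

class LazyFunction:
    """Sketch of a lazy wrapper: stores the function + kwargs, defers the call."""
    def __init__(self, fn, **kwvars):
        self.fn, self.kwvars = fn, kwvars

    def init(self):
        # Resolve search spaces, then actually invoke the function.
        kwargs = {k: (v.rand if isinstance(v, Bool) else v)
                  for k, v in self.kwvars.items()}
        return self.fn(**kwargs)

def lazy_func(**kwvars):
    """Sketch of an @func-style decorator."""
    def decorator(fn):
        return LazyFunction(fn, **kwvars)
    return decorator

@lazy_func(pretrained=Bool())
def build_model(pretrained):
    # Stand-in for models.resnet18(pretrained=pretrained).
    return {'arch': 'resnet18', 'pretrained': pretrained}

model = build_model.init()  # the function runs only now, with a sampled flag
```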