# brainiak.hyperparamopt package¶

Hyperparameter optimization package

## brainiak.hyperparamopt.hpo module¶

Hyper Parameter Optimization (HPO)

This implementation is based on the work in [Bergstra2011] and [Bergstra2013].

Bergstra2011

“Algorithms for Hyper-Parameter Optimization”, James S. Bergstra, Rémi Bardenet, Yoshua Bengio, and Balázs Kégl. NIPS 2011

Bergstra2013

“Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures”, James Bergstra, Daniel Yamins, and David Cox. JMLR W&CP 28 (1): 115–123, 2013

brainiak.hyperparamopt.hpo.fmin(loss_fn, space, max_evals, trials, init_random_evals=30, explore_prob=0.2)

Find the minimum of a function through hyperparameter optimization.

Parameters
• loss_fn (function(*args) -> float) – Function that takes in a dictionary and returns a real value. This is the function to be minimized.

• space (dictionary) – Custom dictionary specifying the range and distribution of the hyperparameters. E.g. space = {'x': {'dist':scipy.stats.uniform(0,1), 'lo':0, 'hi':1}} for a 1-dimensional space with variable x in range [0,1].

• max_evals (int) – Maximum number of evaluations of loss_fn allowed

• trials (list) – Holds the output of the optimization trials. Need not be empty to begin with; new trials are appended at the end.

• init_random_evals (Optional[int], default 30) – Number of random trials to initialize the optimization.

• explore_prob (Optional[float], default 0.2) – Controls the exploration-vs-exploitation ratio. Value should be in [0,1]. By default, 20% of trials are random samples.

Returns

Best hyperparameter setting found. E.g. {'x': 5.6, 'loss': 0.5}, where x is the best hyperparameter value found and loss is the value of the function for the best hyperparameter value(s).

Return type

trial entry (dictionary of hyperparameters)

Raises

ValueError – Raised if a distribution specified in space does not provide an rvs() method for generating random samples.
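A minimal usage sketch, assuming BrainIAK and SciPy are installed. The loss function, space dictionary, and parameter values below are illustrative, not taken from the library; the space format follows the docstring above (each entry supplies a distribution with an rvs() method plus lo/hi bounds):

```python
import scipy.stats

def loss_fn(params):
    # Toy objective: a quadratic in one variable, minimized at x = 0.3.
    x = params['x']
    return (x - 0.3) ** 2

# 1-dimensional search space for x in [0, 1], as described in the
# docstring; the 'dist' object must expose rvs(), or fmin raises ValueError.
space = {'x': {'dist': scipy.stats.uniform(0, 1), 'lo': 0.0, 'hi': 1.0}}

# fmin appends one entry per evaluation; the list may be pre-populated
# with results from earlier runs to warm-start the optimization.
trials = []

# Hypothetical invocation (uncomment with BrainIAK installed):
# from brainiak.hyperparamopt.hpo import fmin
# best = fmin(loss_fn, space, max_evals=50, trials=trials)
# best would be a dict such as {'x': ..., 'loss': ...}
```

The commented-out call mirrors the signature documented above; only loss_fn, space, max_evals, and trials are required, with init_random_evals and explore_prob left at their defaults.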