hyper_opt_oat

Hyperparameter optimizer using orthogonal array tuning.

class HyperOptOAT(params: Parameters, data=None, use_pkl_checkpoints=False)[source]

Bases: HyperOpt

Hyperparameter optimizer using Orthogonal Array Tuning.

Based on https://link.springer.com/chapter/10.1007/978-3-030-36808-1_31.

Parameters:
  • params (mala.common.parameters.Parameters) – Parameters used to create this hyperparameter optimizer.

  • data (mala.datahandling.data_handler.DataHandler) – DataHandler holding the data for the hyperparameter optimization.

  • use_pkl_checkpoints (bool) – If true, .pkl checkpoints will be created.

add_hyperparameter(opttype='categorical', name='', choices=None, **kwargs)[source]

Add hyperparameter.

The hyperparameter list will automatically be sorted w.r.t. the number of choices.

Parameters:

opttype (string) – Datatype of the hyperparameter. Follows optuna’s naming conventions, but currently only supports “categorical” (a list).
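To illustrate the automatic sorting mentioned above, here is a minimal, self-contained sketch using hypothetical hyperparameter names and choices (this is not MALA's internal data structure, and descending order is an assumption here, following the convention of listing orthogonal-array factor levels in non-increasing order):

```python
# Hypothetical hyperparameters: names and candidate values (illustrative only).
hyperparameters = [
    ("activation", ["ReLU", "Sigmoid"]),
    ("layer_size", [32, 64, 128, 256]),
    ("learning_rate", [1e-4, 1e-3, 1e-2]),
]

# Sort by number of choices; descending order is assumed for this sketch.
hyperparameters.sort(key=lambda hp: len(hp[1]), reverse=True)

# The resulting factor levels feed into the orthogonal array construction.
factor_levels = [len(choices) for _, choices in hyperparameters]
print(factor_levels)  # [4, 3, 2]
```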

get_best_trial_results()[source]

Get the best trial out of the list, including the value.

get_optimal_parameters()[source]

Find the optimal set of hyperparameters by doing range analysis.

Unlike the paper, which uses accuracy, this implementation uses the loss.
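Range analysis picks, for each factor independently, the level with the best mean response over all runs in which that level appeared. A self-contained sketch of the idea with loss as the response (lower is better); the runs and loss values are made up for illustration and do not come from MALA:

```python
from collections import defaultdict

# Each run: the level index chosen for every factor, plus the observed loss.
# Two factors with two levels each over four runs; losses are illustrative.
runs = [
    ((0, 0), 0.90),
    ((0, 1), 0.70),
    ((1, 0), 0.50),
    ((1, 1), 0.40),
]

num_factors = 2
optimal_levels = []
for factor in range(num_factors):
    # Mean loss of each level of this factor, averaged over all runs.
    sums = defaultdict(float)
    counts = defaultdict(int)
    for levels, loss in runs:
        sums[levels[factor]] += loss
        counts[levels[factor]] += 1
    means = {lvl: sums[lvl] / counts[lvl] for lvl in sums}
    # Minimize the mean loss (the paper maximizes accuracy instead).
    optimal_levels.append(min(means, key=means.get))

print(optimal_levels)  # [1, 1]
```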

get_orthogonal_array()[source]

Generate the best OA used for optimal hyperparameter sampling.

This function is taken from the example notebook of OApackage.
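For intuition, a strength-2 orthogonal array guarantees that in every pair of columns each level combination appears equally often. The following sketch verifies this balance property for the standard L4 (2^3) design; this is a hand-written example array, not necessarily the one OApackage would return:

```python
from itertools import combinations, product

# The standard L4 (2^3) orthogonal array: 4 runs, 3 two-level factors.
oa = [
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
]

# Strength 2: every pair of columns contains each (level, level)
# combination exactly once.
for c1, c2 in combinations(range(3), 2):
    pairs = [(row[c1], row[c2]) for row in oa]
    assert all(pairs.count(combo) == 1 for combo in product([0, 1], repeat=2))

print("strength-2 balance verified")
```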

classmethod load_from_file(params, file_path, data)[source]

Load a hyperparameter optimizer from a file.

Parameters:
  • params (mala.common.parameters.Parameters) – Parameters used to create the hyperparameter optimizer.

  • file_path (string) – Path to the file from which the hyperparameter optimizer should be loaded.

  • data (mala.datahandling.data_handler.DataHandler) – DataHandler holding the data for the hyperparameter optimization.

Returns:

loaded_hyperopt – The hyperparameter optimizer that was loaded from the file.

Return type:

HyperOptOAT

number_of_runs()[source]

Calculate the minimum number of runs required for an orthogonal array.

This is based on the factor levels and the strength of the array requested. See also: https://oapackage.readthedocs.io/en/latest/examples/example_minimal_number_of_runs_oa.html
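A lower bound on the run count follows directly from the strength condition: in an array of strength t, every t-tuple of columns must contain each level combination equally often, so the number of runs must be a multiple of the product of the levels of any t factors. A sketch of that bound (this is a necessary condition only; the minimal array OApackage actually finds can be larger):

```python
from itertools import combinations
from math import lcm, prod

def runs_lower_bound(factor_levels, strength):
    """Smallest run count divisible by the level product of every
    strength-sized subset of factors (necessary, not sufficient)."""
    return lcm(*(prod(subset)
                 for subset in combinations(factor_levels, strength)))

# Three factors with 4, 3 and 2 levels, strength 2:
print(runs_lower_bound([4, 3, 2], 2))  # 24 = lcm(4*3, 4*2, 3*2)
```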

perform_study()[source]

Perform the study, i.e. the optimization.

Uses Optuna's TPE sampler.

classmethod resume_checkpoint(checkpoint_name, no_data=False, use_pkl_checkpoints=False)[source]

Prepare resumption of hyperparameter optimization from a checkpoint.

Please note that to actually resume the optimization, HyperOptOAT.perform_study() still has to be called.

Parameters:
  • checkpoint_name (string) – Name of the checkpoint to be loaded.

  • no_data (bool) – If True, the data won’t actually be loaded into RAM or scaled. This can be useful for cases where a checkpoint is loaded for analysis purposes.

  • use_pkl_checkpoints (bool) – If true, .pkl checkpoints will be loaded.

Returns:

  • loaded_params (mala.common.parameters.Parameters) – The parameters saved in the checkpoint.

  • new_datahandler (mala.datahandling.data_handler.DataHandler) – The data handler reconstructed from the checkpoint.

  • new_hyperopt (HyperOptOAT) – The hyperparameter optimizer reconstructed from the checkpoint.

set_optimal_parameters()[source]

Set the optimal parameters found in the present study.

The parameters will be written to the parameter object with which the hyperparameter optimizer was created.

show_order_of_importance()[source]

Print the order of importance of the hyperparameters.