hyper_opt_optuna
Hyperparameter optimizer using optuna.
- class HyperOptOptuna(params: Parameters, data=None, use_pkl_checkpoints=False)[source]
Bases:
HyperOpt
Hyperparameter optimizer using Optuna.
- Parameters:
params (mala.common.parameters.Parameters) – Parameters used to create this hyperparameter optimizer.
data (mala.datahandling.data_handler.DataHandler) – DataHandler holding the data for the hyperparameter optimization.
use_pkl_checkpoints (bool) – If true, .pkl checkpoints will be created.
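A minimal construction sketch, assuming the class lives under mala.network.hyper_opt_optuna (the module path is not stated on this page) and that the Parameters and DataHandler objects have already been configured elsewhere:

    from mala.common.parameters import Parameters
    from mala.datahandling.data_handler import DataHandler
    from mala.network.hyper_opt_optuna import HyperOptOptuna  # module path assumed

    params = Parameters()
    data_handler = DataHandler(params)
    # ... snapshots would be added to and prepared by the data handler here ...

    # Create the optimizer with the documented constructor arguments.
    hyperopt = HyperOptOptuna(params, data=data_handler, use_pkl_checkpoints=False)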
- get_trials_from_study()[source]
Return the trials from the last study.
- Returns:
last_trials – A list of optuna.FrozenTrial objects.
- Return type:
list
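As a usage sketch (continuing from a hyperopt object on which perform_study() has already been called), the returned frozen trials can be inspected with standard Optuna attributes:

    trials = hyperopt.get_trials_from_study()
    for trial in trials:
        # number, state, value and params are standard optuna.trial.FrozenTrial attributes.
        print(trial.number, trial.state, trial.value, trial.params)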
- classmethod load_from_file(params, file_path, data)[source]
Load a hyperparameter optimizer from a file.
- Parameters:
params (mala.common.parameters.Parameters) – Parameters object with which the hyperparameter optimizer should be created. Has to be compatible with data.
file_path (string) – Path to the file from which the hyperparameter optimizer should be loaded.
data (mala.datahandling.data_handler.DataHandler) – DataHandler holding the training data.
- Returns:
loaded_trainer – The hyperparameter optimizer that was loaded from the file.
- Return type:
HyperOptOptuna
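A brief loading sketch; the file name is a placeholder, and params and data_handler are assumed to exist and to be compatible with the saved optimizer:

    loaded_hyperopt = HyperOptOptuna.load_from_file(
        params, "my_hyperopt.pkl", data_handler  # file name is hypothetical
    )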
- perform_study()[source]
Perform the study, i.e. the optimization.
This is done by sampling a certain subset of network architectures. In this case, optuna is used.
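A sketch of a typical call sequence; add_hyperparameter() and set_optimal_parameters() belong to the HyperOpt base class rather than to this page, so their use and signatures here are assumptions:

    # Define a (hypothetical) search space on the optimizer, then run the study.
    hyperopt.add_hyperparameter("float", "learning_rate", 1e-5, 1e-1)
    hyperopt.add_hyperparameter("int", "ff_neurons_layer_00", 10, 100)
    hyperopt.perform_study()
    hyperopt.set_optimal_parameters()  # assumed: writes the best trial back into params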
- static requeue_zombie_trials(study_name, rdb_storage)[source]
Put zombie trials back into the queue to be investigated.
When using Optuna with scheduling systems on HPC infrastructure, zombie trials can occur. These are trials that are still marked as “RUNNING” but are, in actuality, dead, since the HPC job has ended. This function takes a saved hyperparameter study and sets all “RUNNING” trials to “WAITING”. Upon the next execution from the checkpoint, they will be executed.
BE CAREFUL! DO NOT APPLY THIS TO A RUNNING STUDY, IT WILL MESS THE STUDY UP! ONLY USE THIS ONCE ALL JOBS HAVE FINISHED, TO CLEAN UP, AND THEN RESUBMIT!
- Parameters:
rdb_storage (string) – Address of the RDB storage to be cleaned.
study_name (string) – Name of the study in the storage. Same as the checkpoint name.
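A clean-up sketch, to be run only after all HPC jobs of a distributed study have finished; the study name and storage URL are placeholders:

    # Only call this once no job is touching the study any more.
    HyperOptOptuna.requeue_zombie_trials(
        "ex_checkpoint",                                   # checkpoint/study name (placeholder)
        "mysql://user:password@hostname/optuna_database",  # RDB storage address (placeholder)
    )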
- classmethod resume_checkpoint(checkpoint_name, alternative_storage_path=None, no_data=False, use_pkl_checkpoints=False)[source]
Prepare resumption of hyperparameter optimization from a checkpoint.
Please note that to actually resume the optimization, HyperOptOptuna.perform_study() still has to be called.
- Parameters:
checkpoint_name (string) – Name of the checkpoint to be loaded.
alternative_storage_path (string) – Alternative storage string to load the study from. For applications on an HPC cluster it might be necessary to slightly modify the storage path between runs, since the SQL server might be running on different nodes each time.
no_data (bool) – If True, the data won’t actually be loaded into RAM or scaled. This can be useful for cases where a checkpoint is loaded for analysis purposes.
use_pkl_checkpoints (bool) – If true, .pkl checkpoints will be loaded.
- Returns:
loaded_params (mala.common.parameters.Parameters) – The Parameters saved in the checkpoint.
new_datahandler (mala.datahandling.data_handler.DataHandler) – The data handler reconstructed from the checkpoint.
new_hyperopt (HyperOptOptuna) – The hyperparameter optimizer reconstructed from the checkpoint.
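A resumption sketch, assuming a checkpoint named "ex_checkpoint" exists and the SQL server may have moved to a different node between runs; both the name and the storage URL are placeholders:

    loaded_params, new_datahandler, new_hyperopt = HyperOptOptuna.resume_checkpoint(
        "ex_checkpoint",
        alternative_storage_path="mysql://user:password@new_node/optuna_database",
    )
    # Resuming only prepares the optimizer; the study still has to be run.
    new_hyperopt.perform_study()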