optframework.kernel_opt.opt_base_ray module

Optimization algorithm based on Ray Tune.

class optframework.kernel_opt.opt_base_ray.OptBaseRay(base)[source]

Bases: object

print_current_actors()[source]

Print the current number of active Ray actors.

Returns

None

multi_optimierer_ray(opt_params_space, exp_data_paths=None, known_params=None)[source]

Optimize PBE parameters using multiple Ray Tuners, managed via Ray multiprocessing.

This method runs several instances of Ray Tune concurrently, each tuning against a different experimental data set and/or set of known parameters. It splits the job queue according to the number of available jobs and distributes the tasks across multiple processes managed by Ray’s multiprocessing pool. A usage sketch follows the Returns section below.

Parameters

opt_params_space : dict

A dictionary of optimization parameters, where each key corresponds to a parameter name and its value contains information about the bounds and scaling (logarithmic or linear).

exp_data_paths : list of str, optional

Paths to the experimental data for each tuning instance.

known_params : list of dict, optional

Known parameters to be passed to the optimization process, one dictionary per tuning instance.

Returns

list

A list of dictionaries, where each dictionary contains the optimized parameters and the corresponding objective score (delta).
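The following is a minimal usage sketch, assuming a search-space layout with explicit bounds and a log-scale flag; the file paths, parameter names, and values are placeholders and not taken from the package. Each result entry is assumed to use the same keys as the return value of optimierer_ray documented below.

from optframework.kernel_opt.opt_base_ray import OptBaseRay

opt = OptBaseRay(base)  # `base` is the pre-configured optimization core object

# Hypothetical search space: each entry carries bounds and a scaling flag.
opt_params_space = {
    "corr_agg_0": {"bounds": (1e-3, 1e2), "log_scale": True},
    "pl_v": {"bounds": (0.5, 2.0), "log_scale": False},
}

results = opt.multi_optimierer_ray(
    opt_params_space=opt_params_space,
    exp_data_paths=["data/exp_run_1.xlsx", "data/exp_run_2.xlsx"],  # one path per job
    known_params=[{"pl_P1": 1e-2}, {"pl_P1": 5e-3}],                # one dict per job
)
for res in results:
    print(res["opt_score"], res["opt_params_space"])  # delta and optimized parameters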

check_num_jobs(queue_length)[source]

Adjust the number of concurrent Tuners to the available job queue length.

Parameters

queue_length : int

The length of the job queue, representing the number of data sets available for tuning.

Returns

None

optimierer_ray(opt_params_space=None, exp_data_paths=None, known_params=None, evaluated_params=None)[source]

Optimize PBE parameters using Ray’s Tune module, based on the delta calculated by the method calc_delta_agg.

This method uses Ray Tune for hyperparameter optimization. It sets up the search space and runs the optimization process for either single- or multi-dimensional PBEs. The results are saved and returned as a dictionary containing the optimized parameters and the corresponding objective score. A usage sketch follows the Returns section below.

Parameters

opt_params_space : dict, optional

A dictionary of optimization parameters, where each key corresponds to a parameter name and its value contains information about the bounds and scaling (logarithmic or linear).

exp_data_paths : list of str, optional

Paths to the experimental or synthetic data to be used for optimization. For the multi-dimensional case, this should be a list containing the paths to the 1D and 2D data.

known_params : dict, optional

Known parameters to be passed to the optimization process.

Returns

dict

A dictionary containing:

• “opt_score”: The optimized objective score (delta value).

• “opt_params_space”: The optimized parameters from the search space.

• “file_path”: The path(s) to the experimental data used for optimization.
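A minimal sketch of a single run, continuing the multi_optimierer_ray example above (the opt object and opt_params_space dictionary are reused from there); the data path and the known parameter are placeholders.

result = opt.optimierer_ray(
    opt_params_space=opt_params_space,
    exp_data_paths=["data/exp_run_1d.xlsx"],  # for the multi-case: [path_to_1d_data, path_to_2d_data]
    known_params={"pl_P1": 1e-2},
)
print(result["opt_score"])         # optimized delta value
print(result["opt_params_space"])  # optimized parameters
print(result["file_path"])         # data path(s) used for the run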

create_algo(batch=False, evaluated_params=None, evaluated_rewards=None)[source]

Create and return the search algorithm to be used for hyperparameter optimization.

This method creates a search algorithm based on the method attribute of the core object. It supports a variety of search algorithms from the Optuna library, including Gaussian-process Bayesian optimization (GP), the tree-structured Parzen estimator (TPE), the covariance matrix adaptation evolution strategy (Cmaes), NSGA-II (NSGA), and quasi-Monte Carlo sampling (QMC). Optionally, it can also limit the number of concurrent trials.

The number of concurrent trials controls the parallelism of the optimization process. In theory, increasing the number of concurrent trials speeds up the calculation, but it may reduce the convergence rate due to less frequent information sharing between trials. Empirically, a range of 4-12 concurrent trials tends to work well.

The batch parameter controls whether a new batch of trials is submitted only after the current batch has finished all of its trials. Note that for some algorithms the batch setting is fixed; the HEBO algorithm, for example, always uses batching. A sketch of how such a search algorithm can be assembled follows the Returns section below.

Parameters

batch : bool, optional

Whether to use batch mode for the concurrency limiter. Default is False.

Returns

search_algobject

The search algorithm instance to be used for optimization.
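A minimal sketch of how such a search algorithm can be assembled from Ray Tune and Optuna. The sampler mapping, the metric name, and the function name make_search_algo are illustrative assumptions; the actual create_algo implementation may differ.

import optuna
from ray.tune.search import ConcurrencyLimiter
from ray.tune.search.optuna import OptunaSearch

# Optuna samplers corresponding to the algorithm names listed above; the exact
# mapping used by create_algo is an assumption made for this sketch.
SAMPLERS = {
    "GP": optuna.samplers.GPSampler,        # Bayesian optimization (Optuna >= 3.6)
    "TPE": optuna.samplers.TPESampler,      # tree-structured Parzen estimator
    "Cmaes": optuna.samplers.CmaEsSampler,  # CMA-ES
    "NSGA": optuna.samplers.NSGAIISampler,  # NSGA-II
    "QMC": optuna.samplers.QMCSampler,      # quasi-Monte Carlo
}

def make_search_algo(method="TPE", max_concurrent=8, batch=False,
                     evaluated_params=None, evaluated_rewards=None):
    # Wrap the chosen Optuna sampler in Ray Tune's OptunaSearch; previously
    # evaluated configurations can be replayed via points_to_evaluate and
    # evaluated_rewards.
    algo = OptunaSearch(
        sampler=SAMPLERS[method](),
        metric="loss", mode="min",  # metric name is a placeholder
        points_to_evaluate=evaluated_params,
        evaluated_rewards=evaluated_rewards,
    )
    # Cap the number of trials running at the same time (see the note on
    # concurrency above); batch=True submits a new batch only after the
    # current one has finished.
    return ConcurrencyLimiter(algo, max_concurrent=max_concurrent, batch=batch)

The returned object is then passed to a Ray Tune Tuner; the range of 4-12 concurrent trials suggested above is applied through the ConcurrencyLimiter wrapper.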