HyperTuning: Automated hyperparameter tuning in Julia.
MIT License
HyperTuning aims to be intuitive, to handle multiple problem instances, and to provide easy parallelization.
This package can be installed on Julia v1.7 and above. Use one of the following options.
Via Pkg module:
julia> import Pkg; Pkg.add("HyperTuning")
Via the Julia package manager: press ] to enter Pkg mode, then run
pkg> add HyperTuning
Let's begin using HyperTuning to optimize $f(x,y)=(1-x)^2+(y-1)^2$, whose minimum value 0 is attained at $(x, y) = (1, 1)$. First, the objective function is defined; after that, the hyperparameters and budget are given in a new scenario. Once the scenario and the objective function are defined, the optimization process begins.
julia> using HyperTuning
julia> function objective(trial)
           @unpack x, y = trial
           (1 - x)^2 + (y - 1)^2
       end
objective (generic function with 1 method)
julia> scenario = Scenario(x = (-10.0..10.0),
                           y = (-10.0..10.0),
                           max_trials = 200);
julia> HyperTuning.optimize(objective, scenario)
Scenario: evaluated 200 trials.
parameters: x, y
space cardinality: Huge!
instances: 1
batch_size: 8
sampler: BCAPSampler{Random.Xoshiro}
pruner: NeverPrune
max_trials: 200
max_evals: 200
stop_reason: HyperTuning.BudgetExceeded("Due to max_trials")
best_trial:
    Trial:     198
    x:         0.996266
    y:         1.00086
    Pruned:    false
    Success:   false
    Objective: 1.46779e-5
The best hyperparameter values found can be retrieved from the scenario:

julia> @unpack x, y = scenario
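Search spaces need not be purely continuous. As a minimal sketch (the objective and parameter names here are hypothetical, and it assumes, as elsewhere in the HyperTuning examples, that a vector of values defines a categorical parameter), a scenario can mix a continuous range with a discrete set of candidates:

```julia
using HyperTuning

# Hypothetical objective mixing a continuous hyperparameter (lr)
# with a categorical one (layers); its best value is 0 at
# lr = 0.01, layers = 3.
function my_objective(trial)
    @unpack lr, layers = trial
    (lr - 0.01)^2 + abs(layers - 3)
end

# Continuous range for lr, discrete candidate set for layers.
scenario = Scenario(lr = (1e-4..1.0),
                    layers = [1, 2, 3, 4],
                    max_trials = 50)

HyperTuning.optimize(my_objective, scenario)
```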
See https://github.com/jmejia8/hypertuning-examples for more examples.
Parallelization: start Julia with julia -t8 if you have 8 available threads, or julia -p4 if you want 4 distributed processes, and HyperTuning does the rest.
Samplers: BCAPSampler for a heuristic search, GridSampler for brute force, and RandomSampler for an unbiased search.
Pruners: NeverPrune to never prune a trial, and MedianPruner for early-stopping the algorithm being configured.
Examples for different Julia packages.
Further examples can be found at https://github.com/jmejia8/hypertuning-examples
Please cite us if you use this package in your research work.
Mejía-de-Dios, J.-A., Mezura-Montes, E. & Quiroz-Castellanos, M. Automated parameter tuning as a bilevel optimization problem solved by a surrogate-assisted population-based approach. Appl Intell 51, 5978–6000 (2021). https://doi.org/10.1007/s10489-020-02151-y
@article{MejadeDios2021,
author = {Jes{\'{u}}s-Adolfo Mej{\'{\i}}a-de-Dios and Efr{\'{e}}n Mezura-Montes and Marcela Quiroz-Castellanos},
title = {Automated parameter tuning as a bilevel optimization problem solved by a surrogate-assisted population-based approach},
journal = {Applied Intelligence},
doi = {10.1007/s10489-020-02151-y},
url = {https://doi.org/10.1007/s10489-020-02151-y},
year = {2021},
month = jan,
publisher = {Springer Science and Business Media {LLC}},
volume = {51},
number = {8},
pages = {5978--6000},
}
To start contributing to the codebase, consider opening an issue describing the possible changes. PRs fixing typos or grammar issues are always welcome.