Perform optimization using evolution strategies. OptimizerMies and TunerMies implement a standard ES optimization algorithm, first performing initialization and then repeating a loop of performance evaluation, survival selection, parent selection, mutation, and recombination to generate new individuals to be evaluated. Currently, two different survival modes ("comma" and "plus") are supported. Multi-fidelity optimization, similar to the "rolling tide" algorithm described in Fieldsend (2014), is supported. The modular design and the reliance on MiesOperator objects for the central parts of the optimization algorithm make this Optimizer highly flexible and configurable. In combination with OperatorCombination mutators and recombinators, an algorithm as presented in Li (2013) can easily be implemented.

OptimizerMies implements a standard evolution strategies loop:

  1. Prime operators, using mies_prime_operators()

  2. Initialize and evaluate population, using mies_init_population()

  3. Generate offspring by selecting parents, recombining and mutating them, using mies_generate_offspring()

  4. Evaluate performance, using mies_evaluate_offspring()

  5. Select survivors, using either mies_survival_plus() or mies_survival_comma(), depending on the survival_strategy configuration parameter

  6. Optionally, evaluate survivors with higher fidelity if the multi-fidelity functionality is being used

  7. Jump to 3.
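
The same loop can also be driven manually with the mies_*() helpers named above, which OptimizerMies calls internally. The following is only a rough sketch: the argument names shown are assumed and simplified, multi-fidelity arguments are omitted, and each helper's documentation should be consulted for the exact signatures.

library("bbotk")
library("miesmuschel")

# Operators; the particular choices and their parameter values are arbitrary.
mutator <- mut("gauss", sdev = 0.1)
recombinator <- rec("xounif", p = 0.3)
parent_selector <- sel("random")
survival_selector <- sel("best")

# A small example objective; trm("none") so the manual loop controls termination.
objective <- ObjectiveRFun$new(
  fun = function(xs) list(y = xs$x^2 + xs$y^2),
  domain = ps(x = p_dbl(-2, 2), y = p_dbl(-2, 2)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)
oi <- OptimInstanceSingleCrit$new(objective, terminator = trm("none"))

# 1. Prime operators on the search space
mies_prime_operators(oi$search_space, mutators = list(mutator),
  recombinators = list(recombinator),
  selectors = list(parent_selector, survival_selector))

# 2. Initialize and evaluate the population
mies_init_population(oi, mu = 10)

for (generation in seq_len(10)) {
  # 3. Parent selection, recombination, mutation
  offspring <- mies_generate_offspring(oi, lambda = 5,
    parent_selector = parent_selector, mutator = mutator,
    recombinator = recombinator)
  # 4. Evaluate offspring
  mies_evaluate_offspring(oi, offspring)
  # 5. "Plus" survival selection (step 6 is skipped: no multi-fidelity here)
  mies_survival_plus(oi, mu = 10, survival_selector = survival_selector)
}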

Terminating

As with all optimizers, Terminators are used to end the optimization after a specific number of evaluations has been performed, a given time has elapsed, or other conditions are satisfied. Of particular interest is TerminatorGenerations, which terminates after a given number of generations have been evaluated in OptimizerMies. The initial population counts as generation 1, its offspring as generation 2, etc. Fidelity refinements (step 6 in the algorithm description above) are always included in their generation, so TerminatorGenerations avoids terminating right before they are evaluated. Other terminators may, however, end the optimization process at any time.
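
For example, an OptimInstance that lets OptimizerMies run for exactly ten generations (reusing the objective defined in the sketch above) could be constructed as follows:

# Terminate after 10 generations; the initial population is generation 1.
oi_gens <- OptimInstanceSingleCrit$new(objective,
  terminator = trm("gens", generations = 10))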

Multi-Fidelity

miesmuschel provides a simple multi-fidelity optimization mechanism that allows both refining fidelity as the optimization progresses and refining fidelity within each generation. When multi_fidelity is TRUE, one search space component of the OptimInstance must have the "budget" tag; it is then used as the "budget" component, meaning its value is determined by the fidelity / fidelity_offspring configuration parameters, which are functions that are called whenever individuals are evaluated. The fidelity function is evaluated before step 2 and before every occurrence of step 6 in the algorithm; it returns the value of the budget search space component with which all individuals that survive the current generation should be evaluated. fidelity_offspring is called before step 4 and determines the fidelity with which newly sampled offspring individuals are evaluated; it may be desirable to set this to a lower value than fidelity to save budget when preliminarily evaluating newly sampled individuals that may or may not perform well compared with already sampled individuals. Individuals that survive the generation and are not removed in step 5 are re-evaluated with the fidelity value in step 6 before the next loop iteration.
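
For illustration (the component name nrounds and its range are made up), a search space containing a budget-tagged component could look like this:

library("paradox")
search_space <- ps(
  x = p_dbl(-2, 2),
  # the component tagged "budget" is controlled by fidelity / fidelity_offspring
  nrounds = p_int(1, 64, tags = "budget")
)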

fidelity and fidelity_offspring must have the arguments inst, budget_id, last_fidelity and last_fidelity_offspring. inst is the OptimInstance being optimized; the functions can use it to determine the progress of the optimization, e.g. to query the current generation with mies_generation(). budget_id identifies the search space component being used as the budget parameter. last_fidelity and last_fidelity_offspring contain the values returned by the last calls to fidelity / fidelity_offspring. Should the offspring fidelity (as returned by fidelity_offspring) always be the same as the parent generation fidelity (as returned by fidelity), for example, then fidelity_offspring can be set to a function that simply returns last_fidelity; this is in fact the behaviour that fidelity_offspring is initialized with.

OptimizerMies avoids re-evaluating individuals if the fidelity parameter does not change. This means that setting fidelity and fidelity_offspring to the same value avoids re-evaluating individuals in step 6. When fidelity_monotonic is TRUE, re-evaluation is also avoided should the desired fidelity parameter value decrease. When fidelity_current_gen_only is TRUE, step 6 only re-evaluates individuals that were created in the current generation (in the previous step 4) and sets the fidelity for individuals created in step 6, but it does not re-evaluate individuals that survived from earlier generations or were already in the OptimInstance when optimization started; it is recommended to leave this value at FALSE, which it is initialized with (see the fidelity_current_gen_only configuration parameter below).
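
The following sketch shows one way such fidelity schedules could be configured. The schedule itself (doubling the survivors' budget each generation, capped at 64, and evaluating offspring at half that budget) is made up for illustration; mies_generation() is used only as described above to query the current generation.

# Construct a multi-fidelity OptimizerMies; operator choices are arbitrary.
mies_mf <- opt("mies",
  mutator = mut("gauss", sdev = 0.1), recombinator = rec("xounif", p = 0.3),
  parent_selector = sel("random"), survival_selector = sel("best"),
  multi_fidelity = TRUE, mu = 10, lambda = 5)

# Budget of surviving individuals: doubles each generation, capped at 64.
mies_mf$param_set$values$fidelity <- function(inst, budget_id,
    last_fidelity, last_fidelity_offspring) {
  min(2^mies_generation(inst), 64)
}

# Budget of newly sampled offspring: half the survivors' budget, at least 1.
mies_mf$param_set$values$fidelity_offspring <- function(inst, budget_id,
    last_fidelity, last_fidelity_offspring) {
  max(last_fidelity / 2, 1)
}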

Additional Components

The search space over which the optimization is performed is fundamentally tied to the Objective, and therefore to the OptimInstance given to OptimizerMies$optimize(). However, some advanced Evolution Strategy based algorithms may need to make use of additional search space components that are independent of the particular objective. An example is self-adaption as implemented in OperatorCombination, where one or several components can be used to adjust operator behaviour. These additional components are supplied to the optimizer through the additional_component_sampler configuration parameter, which takes a Sampler object. This object both has an associated ParamSet, which represents the additional components that are present, and provides a method for generating the initial values of these components. The search space seen by the MiesOperators is then the union of the OptimInstance's ParamSet and the Sampler's ParamSet.
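
As a brief sketch (the component name sigma and its bounds are made up, and paradox's SamplerUnif is used only as one example of a Sampler), an additional component could be supplied like this:

library("paradox")
# OptimizerMies with default (proxy) operators; "sigma" is a hypothetical extra
# component, e.g. a step size that self-adaptive operators could act on.
mies_sa <- opt("mies", mu = 10, lambda = 5)
mies_sa$param_set$values$additional_component_sampler <-
  SamplerUnif$new(ps(sigma = p_dbl(0.05, 0.5)))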

Configuration Parameters

OptimizerMies exposes the configuration parameters of the mutator, recombinator, parent_selector, survival_selector, init_selector and, if given, elite_selector operators supplied during construction, prefixed with the name of the corresponding argument (the mutator's configuration parameters are prefixed "mutator." etc.). When the construction arguments are left at their default values, these operators are all "proxy" operators: MutatorProxy, RecombinatorProxy and SelectorProxy. The respective configuration parameters then become mutator.operation, recombinator.operation etc., so the operators themselves can be set via configuration parameters in this case.
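
For example (a sketch using the construction sugar from the Examples section below), the concrete operations can then be supplied entirely through configuration parameters:

# Default construction: all operators are proxies ...
mies_proxy <- opt("mies", mu = 10, lambda = 5)
# ... so the operations themselves are configuration parameters:
mies_proxy$param_set$values$mutator.operation <- mut("gauss", sdev = 0.1)
mies_proxy$param_set$values$recombinator.operation <- rec("xounif", p = 0.3)
mies_proxy$param_set$values$parent_selector.operation <- sel("random")
mies_proxy$param_set$values$survival_selector.operation <- sel("best")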

Further configuration parameters are:

  • lambda :: integer(1)
    Offspring size: Number of individuals that are created and evaluated anew for each generation. This is equivalent to the lambda parameter of mies_generate_offspring(), see there for more information. Must be set by the user.

  • mu :: integer(1)
    Population size: Number of individuals that are sampled in the beginning, and which are selected with each survival step. This is equivalent to the mu parameter of mies_init_population(), see there for more information. Must be set by the user.

  • survival_strategy :: character(1)
    May be "plus" or, if the elite_selector construction argument is not NULL, "comma": chooses whether mies_survival_plus() or mies_survival_comma() is used for survival selection. Initialized to "plus". See the example following this list for configuring "comma" survival.

  • n_elite :: integer(1)
    Only if the elite_selector construction argument is not NULL, and only valid when survival_strategy is "comma": Number of elites, i.e. individuals from the parent generation, to keep during "Comma" survival. This is equivalent to the n_elite parameter of mies_survival_comma(), see there for more information.

  • initializer :: function
    Function that generates the initial population as a Design object, with arguments param_set and n, functioning like paradox::generate_design_random or paradox::generate_design_lhs. This is equivalent to the initializer parameter of mies_init_population(), see there for more information. Initialized to generate_design_random().

  • additional_component_sampler :: Sampler | NULL
    Additional components that may be part of individuals as seen by mutation, recombination, and selection MiesOperators, but that are not part of the search space of the OptimInstance being optimized. This is equivalent to the additional_component_sampler parameter of mies_init_population(), see there for more information. Initialized to NULL (no additional components).

  • fidelity :: function
    Only if the multi_fidelity construction argument is TRUE: Function that determines the value of the "budget" component of surviving individuals being evaluated when doing multi-fidelity optimization. It must have arguments named inst, budget_id, last_fidelity and last_fidelity_offspring, see the "Multi-Fidelity"-section for more details. Its return value is given to mies_init_population() and mies_step_fidelity(). When this configuration parameter is present (i.e. multi_fidelity is TRUE), then it is initialized to a function returning the value 1.

  • fidelity_offspring :: function
    Only if the multi_fidelity construction argument is TRUE: Function that determines the value of the "budget" component of newly sampled offspring individuals being evaluated when doing multi-fidelity optimization. It must have arguments named inst, budget_id, last_fidelity and last_fidelity_offspring, see the "Multi-Fidelity"-section for more details. Its return value is given to mies_evaluate_offspring(). When this configuration parameter is present (i.e. multi_fidelity is TRUE), then it is initialized to a function returning the value of last_fidelity, i.e. the value returned by the last call to the fidelity configuration parameter. This is the recommended value when fidelity should not change within a generation, since this means that survivor selection is performed with individuals that were evaluated with the same fidelity (at least if fidelity_current_gen_only is also set to FALSE).

  • fidelity_current_gen_only :: logical(1)
    Only if the multi_fidelity construction argument is TRUE: When doing fidelity refinement in mies_step_fidelity(), whether to refine all individuals with different budget component, or only individuals created in the current generation. This is equivalent to the current_gen_only parameter of mies_step_fidelity(), see there for more information.
    When this configuration parameter is present (i.e. multi_fidelity is TRUE), then it is initialized to FALSE, the recommended value.

  • fidelity_monotonic :: logical(1)
    Only if the multi_fidelity construction argument is TRUE: Whether to only do fidelity refinement in mies_step_fidelity() for individuals for which the budget component value would increase. This is equivalent to the monotonic parameter of mies_step_fidelity(), see there for more information.
    When this configuration parameter is present (i.e. multi_fidelity is TRUE), then it is initialized to TRUE. When optimization is performed on problems that have a categorical "budget" parameter, then this value should be set to FALSE.
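
As an example of the survival-related settings (a sketch; operator choices and numbers are arbitrary), "comma" survival with two elites could be configured as follows:

# elite_selector must be given at construction for "comma" survival.
mies_comma <- opt("mies",
  mutator = mut("gauss", sdev = 0.1), recombinator = rec("xounif", p = 0.3),
  parent_selector = sel("random"), survival_selector = sel("best"),
  elite_selector = sel("best"),
  mu = 10, lambda = 20, survival_strategy = "comma", n_elite = 2)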

References

Fieldsend, Jonathan E., Everson, Richard M. (2014). “The rolling tide evolutionary algorithm: A multiobjective optimizer for noisy optimization problems.” IEEE Transactions on Evolutionary Computation, 19(1), 103--117.

Li, Rui, Emmerich, Michael T. M., Eggermont, Jeroen, Bäck, Thomas, Schütz, Martin, Dijkstra, Jouke, Reiber, Johan H. C. (2013). “Mixed integer evolution strategies for parameter optimization.” Evolutionary Computation, 21(1), 29--64.

Super class

bbotk::Optimizer -> OptimizerMies

Active bindings

mutator

(Mutator)
Mutation operation to perform during mies_generate_offspring().

recombinator

(Recombinator)
Recombination operation to perform during mies_generate_offspring().

parent_selector

(Selector)
Parent selection operation to perform during mies_generate_offspring().

survival_selector

(Selector)
Survival selection operation to use in mies_survival_plus() or mies_survival_comma().

elite_selector

(Selector | NULL)
Elite selector used in mies_survival_comma().

init_selector

(Selector)
Selection operation to use when there are more than mu individuals present at the beginning of the optimization.

param_set

(ParamSet)
Configuration parameters of the optimization algorithm.

Methods

Inherited methods


Method new()

Initialize the OptimizerMies object.

Usage

OptimizerMies$new(
  mutator = MutatorProxy$new(),
  recombinator = RecombinatorProxy$new(),
  parent_selector = SelectorProxy$new(),
  survival_selector = SelectorProxy$new(),
  elite_selector = NULL,
  init_selector = survival_selector,
  multi_fidelity = FALSE
)

Arguments

mutator

(Mutator)
Mutation operation to perform during mies_generate_offspring(), see there for more information. Default is MutatorProxy, which exposes the operation as a configuration parameter of the optimizer itself.
The $mutator field will reflect this value.

recombinator

(Recombinator)
Recombination operation to perform during mies_generate_offspring(), see there for more information. Default is RecombinatorProxy, which exposes the operation as a configuration parameter of the optimizer itself. Note: the default RecombinatorProxy has $n_indivs_in set to 2, so to use recombination operations with more than two inputs, or to use a population size of 1, it may be necessary to construct this argument explicitly.
The $recombinator field will reflect this value.

parent_selector

(Selector)
Parent selection operation to perform during mies_generate_offspring(), see there for more information. Default is SelectorProxy, which exposes the operation as a configuration parameter of the optimizer itself.
The $parent_selector field will reflect this value.

survival_selector

(Selector)
Survival selection operation to use in mies_survival_plus() or mies_survival_comma() (depending on the survival_strategy configuration parameter), see there for more information. Default is SelectorProxy, which exposes the operation as a configuration parameter of the optimizer itself.
The $survival_selector field will reflect this value.

elite_selector

(Selector | NULL)
Elite selector used in mies_survival_comma(), see there for more information. "Comma" selection is only available when this argument is not NULL. Default NULL.
The $elite_selector field will reflect this value.

init_selector

(Selector)
Survival selection operation to give to the survival_selector argument of mies_init_population(); it is used if the OptimInstance being optimized already contains more (alive) individuals than mu. Default is the value given to survival_selector. The $init_selector field will reflect this value.

multi_fidelity

(logical(1))
Whether to enable multi-fidelity optimization. When this is TRUE, the OptimInstance being optimized must contain a Param tagged "budget", which is then used as the "budget" search space component, determined by fidelity and fidelity_offspring instead of by the MiesOperators themselves. For multi-fidelity optimization, the fidelity, fidelity_offspring, fidelity_current_gen_only, and fidelity_monotonic configuration parameters determine the multi-fidelity behaviour. (While the initial values of most of these are likely adequate whenever more budget implies higher fidelity, the fidelity configuration parameter itself should be adjusted in most cases.) Default is FALSE.


Method clone()

The objects of this class are cloneable with this method.

Usage

OptimizerMies$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerMies

Methods

Inherited methods


Method new()

Initialize the TunerMies object.

Usage

TunerMies$new(
  mutator = MutatorProxy$new(),
  recombinator = RecombinatorProxy$new(),
  parent_selector = SelectorProxy$new(),
  survival_selector = SelectorProxy$new(),
  elite_selector = NULL,
  init_selector = survival_selector,
  multi_fidelity = FALSE
)

Arguments

mutator

(Mutator)

recombinator

(Recombinator)

parent_selector

(Selector)

survival_selector

(Selector)

elite_selector

(Selector | NULL)

init_selector

(Selector)

multi_fidelity

(logical(1))


Method clone()

The objects of this class are cloneable with this method.

Usage

TunerMies$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# \donttest{
lgr::threshold("warn")

op.m <- mut("gauss", sdev = 0.1)
op.r <- rec("xounif", p = .3)
op.parent <- sel("random")
op.survival <- sel("best")

#####
# Optimizing a Function
#####

library("bbotk")

# Define the objective to optimize
objective <- ObjectiveRFun$new(
  fun = function(xs) {
    z <- exp(-xs$x^2 - xs$y^2) + 2 * exp(-(2 - xs$x)^2 - (2 - xs$y)^2)
    list(Obj = z)
  },
  domain = ps(x = p_dbl(-2, 4), y = p_dbl(-2, 4)),
  codomain = ps(Obj = p_dbl(tags = "maximize"))
)

# Get a new OptimInstance
oi <- OptimInstanceSingleCrit$new(objective,
  terminator = trm("evals", n_evals = 100)
)

# Create OptimizerMies object
mies_opt <- opt("mies", mutator = op.m, recombinator = op.r,
  parent_selector = op.parent, survival_selector = op.survival,
  mu = 10, lambda = 5)

# mies_opt$optimize performs MIES optimization and returns the optimum
mies_opt$optimize(oi)
#>           x        y  x_domain      Obj
#>       <num>    <num>    <list>    <num>
#> 1: 2.136216 1.970314 <list[2]> 1.961718

#####
# Optimizing a Machine Learning Method
#####

# Note that this is a short example, aiming at clarity and short runtime.
# The settings are not optimal for hyperparameter tuning. The resampling
# in particular should not be "holdout" for small datasets where this gives
# a very noisy estimate of performance.

library("mlr3")
library("mlr3tuning")

# The Learner to optimize
learner = lrn("classif.rpart")

# The hyperparameters to optimize
learner$param_set$values[c("cp", "maxdepth")] = list(to_tune())

# Get a TuningInstance
ti = TuningInstanceSingleCrit$new(
  task = tsk("iris"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.acc"),
  terminator = trm("gens", generations = 10)
)

# Create TunerMies object
mies_tune <- tnr("mies", mutator = op.m, recombinator = op.r,
  parent_selector = op.parent, survival_selector = op.survival,
  mu = 10, lambda = 5)

# mies_tune$optimize performs MIES optimization and returns the optimum
mies_tune$optimize(ti)
#>           cp maxdepth learner_param_vals  x_domain classif.acc
#>        <num>    <num>             <list>    <list>       <num>
#> 1: 0.3009208       14          <list[3]> <list[2]>        0.94
# }