Lecture 4 - Move Acceptance in Local Search Metaheuristics Flashcards
What are the three different types of parameter setting for metaheuristics?
Static - either there is no parameter to set, or parameters are fixed at a constant value e.g. intensity of mutation IOM = 5
Dynamic - parameter values vary with respect to time/iteration count.
Adaptive - given the same candidate and current solutions at the same elapsed time or iteration count, the acceptance threshold or acceptance probability is not guaranteed to be the same, because one or more components depend on the search history
What is Threshold move acceptance?
Determine a threshold in the vicinity of a chosen solution quality, e.g. the quality of the best solution found so far or of the current solution, and accept all candidate solutions whose objective value is below that threshold (assuming minimisation).
What are some examples of threshold move acceptance?
Static - accept a worsening solution if the worsening of the objective value is no worse than a fixed amount (sketched after this list)
Dynamic - Great Deluge or Flex Deluge
Adaptive - Extended Great Deluge, Modified Great Deluge
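A minimal Python sketch of the static case, assuming minimisation; the fixed tolerance epsilon is an illustrative placeholder:

    def threshold_accept(current_value, candidate_value, epsilon):
        # Static threshold acceptance for minimisation: accept the candidate
        # if it worsens the current objective value by no more than epsilon.
        return candidate_value <= current_value + epsilon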
How does Great Deluge work?
Choose an initial solution
Choose a rain speed
Choose the initial water level
Then repeat this loop:
Choose a new solution which is a perturbation of the old one
Compute the objective value of the new solution
If the value is smaller than the water level, the candidate replaces the current solution, and the water level is lowered by a pre-set decay rate (the rain speed).
If the value is not smaller, the candidate is rejected and the loop continues; the loop terminates when there has been no improvement for a long time or too many iterations have elapsed (see the sketch below).
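A minimal Python sketch of this loop, assuming a minimisation problem; f (objective), perturb (neighbourhood move), rain_speed and max_iters are illustrative placeholders rather than lecture-given values:

    def great_deluge(init, f, perturb, rain_speed, max_iters=100000):
        current = init
        water_level = f(init)               # initial water level: quality of the start
        best, best_value = init, f(init)
        for _ in range(max_iters):
            candidate = perturb(current)
            value = f(candidate)
            if value < water_level:         # accept anything below the water level
                current = candidate
                water_level -= rain_speed   # lower the level by the decay rate
                if value < best_value:
                    best, best_value = candidate, value
        return best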
How does the Extended Great Deluge work?
Same as the standard version, but feedback is received during the search and the decay rate is updated/reset whenever there has been no improvement for a long time.
What are some examples of stochastic move acceptance?
Static - Naive acceptance: P is fixed e.g. if improving P=1.0, else P=0.5 (sketched after this list)
Dynamic - Simulated Annealing: P changes over time with respect to the difference in quality between the candidate and current solutions; the temperature parameter changes dynamically
Adaptive - Simulated Annealing with reheating: P is modified by increasing the temperature from time to time, causing a partial restart and increasing the probability of accepting non-improving solutions
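The static naive-acceptance rule as a Python sketch, where delta is the change in objective value (negative for an improving move under minimisation):

    import random

    def naive_accept(delta):
        # Accept improving moves with P = 1.0, worsening moves with P = 0.5.
        return delta < 0 or random.random() < 0.5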
What is Simulated Annealing?
A stochastic local search algorithm inspired by the physical process of annealing (letting something cool down over time, rather than rushing the cooling process)
How does Simulated Annealing work?
Generates an initial solution, and initialises temperature to T0
Then repeats this loop:
Chooses a neighbouring solution to the current solution.
Then computes the difference in objective value between the candidate and the current solution, i.e. delta = f(candidate) - f(current).
Then, if either the difference is smaller than 0, or a uniform random value in [0, 1] is smaller than e^(-difference/Temperature), it accepts the candidate.
It then updates the temperature according to the cooling schedule and keeps repeating the loop until the termination criteria are satisfied (sketched below).
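A minimal Python sketch of this loop, again assuming minimisation; f, neighbour, t0, cool and max_iters are illustrative placeholders:

    import math
    import random

    def simulated_annealing(init, f, neighbour, t0, cool, max_iters=100000):
        current, temp = init, t0
        for _ in range(max_iters):
            candidate = neighbour(current)
            delta = f(candidate) - f(current)   # positive delta = worsening move
            # Accept improving moves always; accept worsening moves with
            # probability e^(-delta/T) (guard against T hitting 0 under
            # some cooling schedules).
            if delta < 0 or (temp > 0 and random.random() < math.exp(-delta / temp)):
                current = candidate
            temp = cool(temp)                   # apply the cooling schedule
        return current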
What does the temperature mean in Simulated Annealing?
T is initially high - many inferior moves are accepted
T is low - inferior moves are nearly always rejected
As the temperature decreases, the probability of accepting worsening moves decreases.
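For example, for a worsening move with difference 5: at T = 100 the acceptance probability is e^(-5/100) ≈ 0.95, at T = 10 it is e^(-0.5) ≈ 0.61, and at T = 0.5 it is e^(-10) ≈ 0.00005, so by that point the move is almost certainly rejected.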
What are the 4 parts that make up the cooling schedule?
Starting temperature
Final temperature
Temperature decrement
Iterations at each temperature
What are the features of starting and final temperature?
Starting temperature:
Hot enough - to allow acceptance of almost any neighbour
Not so hot - otherwise the search behaves like a random search for some time
Estimate a suitable starting temperature
Final temperature:
Usually 0; however, in practice reaching 0 is not required
When T is low - the probability of accepting a worse move is practically the same as at T = 0
What are the three types of temperature decrement strategies?
Linear:
T = T - x (x being an arbitrary value)
Geometric:
T = T * alpha (alpha is typically in the interval [0.9, 0.99])
Lundy & Mees:
T = T / (1 + beta * T)
One iteration at each T, but decrease T very slowly; beta is typically a small value close to 0 e.g. 0.0001 (all three strategies are sketched below)
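The three strategies as one-line Python update rules, each usable as the cool function in the Simulated Annealing sketch above; the default x, alpha and beta values are just examples within the typical ranges:

    def linear(t, x=0.1):
        return t - x                  # T = T - x

    def geometric(t, alpha=0.95):
        return t * alpha              # T = T * alpha

    def lundy_mees(t, beta=0.0001):
        return t / (1 + beta * t)     # T = T / (1 + beta * T)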
What are the features of iterating at each temperature?
One iteration at each T
A constant number of iterations at each T
Dynamically change the number of iterations at each T:
At higher Ts - fewer iterations
At lower Ts - more iterations, so the local optimum is fully exploited
Reheating - if stuck at a local optimum for a while, increase the current temperature at a certain rate (an illustrative rule is sketched below)
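An illustrative reheating rule in Python; patience and factor are assumed parameters, not values given in the lecture:

    def reheat_if_stuck(temp, iters_without_improvement, patience=5000, factor=2.0):
        # If no improvement has been seen for `patience` iterations, raise the
        # temperature by `factor` (a partial restart) and reset the counter.
        if iters_without_improvement >= patience:
            return temp * factor, 0
        return temp, iters_without_improvement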
What parameter tuning methods are there?
Traditional approaches e.g. use of an arbitrary setting or trial & error
Sequential Tuning - fix parameter values successively
Design of experiments
Meta-optimisation - use a metaheuristic to obtain ‘optimal’ parameter settings
What does Design of Experiments mean?
A systematic method (controlled experiments) for determining the relationship between the controllable and uncontrollable factors affecting a process (its inputs/variables), their levels (settings), and the response (output) of that process.