Metaheuristics Flashcards

1
Q

Metaheuristics summary

A

Basically a 2 level structure:

  • Low level: custom heuristic
  • High level: guiding system that helps escape local optima.

Characteristics:

  • Simple: regarding implementation and computational complexity.
  • Effective: yields solutions of “good” quality
  • General Purpose: makes few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions.

Classes:

  • Multi-start: Iterated Local Search (ILS), Variable Neighborhood Search (VNS), Variable Neighborhood Descent (VND), Greedy Randomized Adaptive Search Procedure (GRASP), Scatter Search (SS), Ant Colony Optimization (ACO), Evolutionary Algorithms (EA).
  • Neighborhood-based: Tabu Search (TS), Simulated Annealing (SA), Guided Local Search (GLS)
  • Hybrid: a metaheuristic used in combination with an exact method or a heuristic.
2
Q

Multi-start metaheuristics

A

Philosophy: diversify to avoid local optima. Restart whenever a subregion has been examined.

Key features:

  • Memory (remember examined regions): used to identify elements that are common to good solutions.
    • E.g. TSP: a sequence of customers appears often in solutions
  • Randomness: randomly generating starting solutions diversifies portfolio.
  • Degree of rebuild: number of elements of a solution that don’t change.
    • reduces complexity.
    • solution construction:
      • built from scratch
      • selective fixing of solution elements (e.g. path relinking)

Use cases: when it’s easier to build new solutions than to apply local search. This usually happens when finding neighbouring feasible solutions is hard (high computational complexity), e.g. the machine scheduling problem.

Phases:

  • Solution generation
    • Can be done through solution building:
      • constructive (GRASP, ACO) or perturbation (ILS, VNS, VND, EA, SS)
    • can be done through solution population (pop size = 1 or pop size >> 1)
      • usually more complex because requires
        • managing many solutions at the same time
        • effective operations to combine solutions (e.g. Crossover in G. Algos)
  • Solution improvement (improvement means: a better score, or a worse score that helps escape a local optimum); see the sketch below.
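A minimal multi-start skeleton covering these two phases; build_solution, local_search and score are hypothetical placeholders, and a minimization problem is assumed:

```python
import math
import random

def multi_start(build_solution, local_search, score, n_starts=20):
    """Generic multi-start loop: build a solution, improve it, and keep
    the best local optimum found (minimization assumed)."""
    best, best_score = None, math.inf
    for _ in range(n_starts):
        candidate = build_solution()          # solution generation phase
        candidate = local_search(candidate)   # solution improvement phase
        s = score(candidate)
        if s < best_score:                    # keep the incumbent
            best, best_score = candidate, s
    return best, best_score

# Toy usage: minimize x**2 over integers in [-100, 100] with one greedy step.
if __name__ == "__main__":
    score = lambda x: x * x
    build = lambda: random.randint(-100, 100)
    improve = lambda x: min((x - 1, x, x + 1), key=score)
    print(multi_start(build, improve, score))
```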
3
Q

Neighbourhood-based metaheuristics

A
  • Iterative improvement algorithm
  • Accepts worsening solutions to escape local optima
  • Move strategies:
    • Guided Local Search
    • Tabu Search
    • Simulated Annealing
4
Q

Guided Local Search

A

* Penalize Features of the solution to escape local optima

* Generalization of GENET (1991): neural network.
* Principles:
* define set of features of candidate solution:
    * e.g. for TSP: customer C precedes E
* When local search gets stuck, some features get penalized (initial penalty = 0)
* Local search looks for new solutions optimizing for objective function that contains penalties.
    * Penalize the features with the highest utility U<sub>i</sub> = c<sub>i</sub>/(1 + p<sub>i</sub>), where c<sub>i</sub> is the feature cost and p<sub>i</sub> its current penalty.

Penalized objective function:

h(s) = g(s) + λ · Σ<sub>i</sub> p<sub>i</sub> · y<sub>i</sub>(s)

where:

s: candidate solution
g: original objective function
h: penalized objective function
y<sub>i</sub>(s) = 1 if s has feature i, 0 otherwise
p<sub>i</sub>: penalty for feature i (initialized to 0)
λ: parameter weighting the penalties
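A minimal sketch of the GLS loop under this penalized objective; g, cost, features and local_search are hypothetical placeholders, and minimization is assumed:

```python
def guided_local_search(s0, g, cost, features, local_search, lam, n_iter=50):
    """Sketch of Guided Local Search (minimization).
    features(s): set of feature ids present in solution s
    cost(i):     feature cost c_i (assumed helper)."""
    penalties = {}                                    # p_i, initialized to 0

    def h(s):                                         # penalized objective
        return g(s) + lam * sum(penalties.get(i, 0) for i in features(s))

    best = current = local_search(s0, h)
    for _ in range(n_iter):
        # Local search is stuck: penalize the features of highest utility
        # U_i = c_i / (1 + p_i), then resume the search on h.
        utility = {i: cost(i) / (1 + penalties.get(i, 0))
                   for i in features(current)}
        if not utility:
            break
        u_max = max(utility.values())
        for i, u in utility.items():
            if u == u_max:
                penalties[i] = penalties.get(i, 0) + 1
        current = local_search(current, h)
        if g(current) < g(best):                      # track the best on g, not h
            best = current
    return best
```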

5
Q

Simulated Annealing

A

Annealing: a concept from thermodynamics; when a molten metal is cooled slowly, it tends to solidify into a minimum-energy structure.

A modification of first-improvement search, designed to avoid getting trapped in local optima.

Steps:

  1. Init.
    • creation of initial solution x1 ∈ X
    • update current solution x~ = x1
    • I = 0 (iteration counter)
  2. Neighbourhood N(x~) selection starting from current solution
    • Select randomly from N(x~) a solution x^
    • I++
    • if F(x^) <= F(x~) (improvement; minimization assumed) set x~ = x^
    • else if ( p = random(0,1) ) < ( p^ = exp( -(F(x^) - F(x~)) / T(I) ) ) set x~ = x^
  3. Termination test
    • Solution improved by less than a threshold value
    • Number of accepted moves ( p < p^ ) is less than threshold value for nL consecutive times (L = plateau, n = parameter)
    • Time limit reached
    • Number of iterations limit reached

Selection of temperature and cooling schedule:

  • L = γ |N(x~)|, with γ ∈ [10, 20]
  • T0 such that p0^ ≈ 0.5 w.r.t. an average neighbouring solution.
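A minimal SA sketch following the steps above (minimization assumed); the neighbourhood function, the initial temperature T0 and the geometric cooling factor alpha are hypothetical choices:

```python
import math
import random

def simulated_annealing(x1, F, random_neighbor, T0=10.0, alpha=0.95,
                        max_iter=10_000):
    """SA for minimization: always accept improvements, accept worsening
    moves with probability exp(-(F(x_new) - F(x_cur)) / T)."""
    x_cur, T = x1, T0
    x_best = x_cur
    for _ in range(max_iter):
        x_new = random_neighbor(x_cur)          # random pick from N(x~)
        delta = F(x_new) - F(x_cur)
        if delta <= 0 or random.random() < math.exp(-delta / T):
            x_cur = x_new
            if F(x_cur) < F(x_best):
                x_best = x_cur
        T *= alpha                              # one simple choice of T(I)
    return x_best
```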
6
Q

SA applied to Graph Partitioning Problem

A
  • We want to subdivide graph nodes into 2 sets of size |N| / 2
  • N1 ∪ N2 = N, N1 ∩ N2 = ∅
  • Encoding: N1 = {1, 2, 3} N2 = {4, 5, 6}
  • Straightforward formulation:
    • F(·): number of bridge (cut) edges between the two sets (e.g. F(x1) = 7 for the example graph)
    • objective: min F(x)
    • Neighborhood of current solution (Ni, Nj) has size O(|N|²)
  • Other formulation
    • Solution space: all bipartitions Ni ∪ Nj = N, |Ni| can be != |Nj|
    • objective function includes a term that penalizes bipartitions that aren’t balanced.
    • Neighborhood of current solution (Ni, Nj) has size O(|N|)
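A hedged sketch of the relaxed formulation's objective, assuming the imbalance penalty is a quadratic term weighted by a hypothetical parameter alpha:

```python
def partition_cost(edges, n1, n2, alpha=1.0):
    """Relaxed graph-partitioning objective: number of cut edges plus a
    penalty for unbalanced bipartitions (alpha is an assumed weight)."""
    cut = sum(1 for u, v in edges if (u in n1) != (v in n1))
    return cut + alpha * (len(n1) - len(n2)) ** 2

# Example: N1 = {1,2,3}, N2 = {4,5,6}; a move transfers one node,
# so the neighborhood has O(|N|) members.
edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 4)]
print(partition_cost(edges, {1, 2, 3}, {4, 5, 6}))   # 2 cut edges, balanced -> 2
```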
7
Q

Tabu Search

A
  • Uses a steepest-descent local search approach.
  • At each iteration, every solution in the neighborhood of the current solution (or a subset of it, if the neighborhood is too big) is evaluated and the best one becomes the new current solution; the pair (new solution, old solution) is then inserted in the tabu list to avoid looping.
  • Avoids revisiting the last k solutions by maintaining a tabu list.
  • Tabu move: a prohibited move, that would take us to a solution that has already been visited.
  • Acceptance is deterministic (not based on probability like in SA)
  • Aspiration criterion: condition that if true allows a tabu move to be performed, for example: if applying the move would create a solution that is better than any of the previously found ones.
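A minimal tabu search sketch (minimization) under these points; neighborhood(x) is assumed to yield (move, new_solution) pairs, and for brevity the tabu list stores moves rather than the (new solution, old solution) pairs mentioned above:

```python
from collections import deque

def tabu_search(x0, F, neighborhood, tabu_tenure=7, max_iter=100):
    """Steepest-descent tabu search: pick the best non-tabu neighbor,
    or a tabu one that beats the best-so-far (aspiration criterion)."""
    x_cur, x_best = x0, x0
    tabu = deque(maxlen=tabu_tenure)            # fixed-length tabu list
    for _ in range(max_iter):
        best_move, best_neigh = None, None
        for move, x_new in neighborhood(x_cur):
            aspiration = F(x_new) < F(x_best)   # overrides tabu status
            if move in tabu and not aspiration:
                continue
            if best_neigh is None or F(x_new) < F(best_neigh):
                best_move, best_neigh = move, x_new
        if best_neigh is None:                  # every move is tabu
            break
        x_cur = best_neigh
        tabu.append(best_move)                  # forbid undoing this move
        if F(x_cur) < F(x_best):
            x_best = x_cur
    return x_best
```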
8
Q

Tabu Search applied to Graph Coloring

A

An undirected graph G(N, E) is r-chromatic if its nodes can be colored with r colors so that no two adjacent nodes have the same color.

The problem is used to solve register allocation and the machine job assignment problem.

Chromatic number: min r such that G(N, E) is r-chromatic.

For planar graphs (such as geographic maps), the chromatic number is at most 4.

TS model:

  • Strategy: consider coloring the graph with c colors, but do not constrain node colors to form a proper coloring; just penalize infeasible solutions.
    Every time we find a solution with F(x) = 0 (feasible), we use it as the starting solution for the problem with c-1 colors.
  • Objective function: min F(x) = Σ over colors c of |Ec|, where |Ec| is the number of edges that connect two nodes of the same color c (see the sketch below).
  • Moving from the current solution (x~) to the best of the neighborhood (x^) happens when the move from x~ to x^ is not tabu, or when x^ is the best solution found so far (aspiration criterion).
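A small sketch of the penalized objective F(x) above; the coloring is assumed to be a dict mapping node to color:

```python
def coloring_conflicts(edges, coloring):
    """F(x) = sum over colors of |Ec|: count edges whose two endpoints
    share the same color (0 means a proper coloring was found)."""
    return sum(1 for u, v in edges if coloring[u] == coloring[v])

# Triangle colored with 2 colors: at least one monochromatic edge remains.
edges = [(1, 2), (2, 3), (1, 3)]
print(coloring_conflicts(edges, {1: "red", 2: "blue", 3: "red"}))  # -> 1
```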
9
Q

Iterated Local Search

A

Extension of the multi-start approach, combined with a local search.

It iteratively applies local search to obtain a sequence of local minima; each time it finds a local minimum, it perturbs it.

The perturbation typically generates a random neighbor from a neighborhood much larger than the one considered in the local search.

An acceptance test determines whether the new local minimum is accepted (see the sketch below).
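A minimal ILS sketch (minimization); perturb, local_search and the better-only acceptance test used here are hypothetical choices:

```python
def iterated_local_search(x0, F, local_search, perturb, n_iter=100):
    """ILS: improve, perturb the local minimum, improve again, and accept
    the new local minimum only if the acceptance test passes."""
    x_cur = local_search(x0)
    x_best = x_cur
    for _ in range(n_iter):
        x_pert = perturb(x_cur)            # jump in a much larger neighborhood
        x_new = local_search(x_pert)       # new local minimum
        if F(x_new) < F(x_cur):            # acceptance test (better-only here)
            x_cur = x_new
        if F(x_cur) < F(x_best):
            x_best = x_cur
    return x_best
```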

10
Q

Variable Neighborhood Search

A

Variation of Iterated Local Search.

Uses a series of neighborhood structures that are evaluated in a hierarchical way, usually sorted by the size of the neighborhood.

The local search phase is typically first improvement.

A new local minimum is accepted only if it improves the current solution; otherwise the neighborhood used in the perturbation phase changes. The routine ends when all the neighborhoods have been tried in the perturbation phase (see the sketch below).
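A minimal VNS sketch following this description, assuming neighborhoods is a list of shaking functions sorted by increasing neighborhood size:

```python
def variable_neighborhood_search(x0, F, neighborhoods, local_search):
    """Basic VNS (minimization): shake in neighborhood k, apply local search
    (typically first improvement); on improvement restart from the smallest
    neighborhood, otherwise move to the next one; stop when all have been tried."""
    x_cur = local_search(x0)
    k = 0
    while k < len(neighborhoods):
        x_shaken = neighborhoods[k](x_cur)    # perturbation in N_k
        x_new = local_search(x_shaken)
        if F(x_new) < F(x_cur):
            x_cur, k = x_new, 0               # improvement: back to smallest N
        else:
            k += 1                            # try the next (larger) neighborhood
    return x_cur
```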

11
Q

Variable Neighborhood Descent

A

Generalization of VNS: we still use a hierarchy of neighborhood structures, but they are used directly in the local search phase rather than in the perturbation phase.

12
Q

GRASP

A

Greedy randomized adaptive search procedure.

Steps:

  • Init: incumbent solution x_ with F(x_) = +inf (for a minimization problem)
  • for N iterations
    • generate a solution xk with a greedy randomized constructive procedure (at each step, pick at random one of the best y candidate decisions to apply)
    • apply classic local search (usually first improvement) to xk, obtaining xk’
    • if xk’ is the best solution found so far (F(xk’) < F(x_)), set x_ := xk’ (see the sketch below)
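A minimal GRASP sketch (minimization), assuming greedy_randomized builds a solution by picking at random among the best candidate decisions at each step (the restricted candidate list); all helper names are placeholders:

```python
import math

def grasp(greedy_randomized, local_search, F, n_iter=50):
    """GRASP: repeat (randomized greedy construction + local search) and
    keep the incumbent best solution (minimization assumed)."""
    x_best, f_best = None, math.inf            # F(x_) = +inf initially
    for _ in range(n_iter):
        x_k = greedy_randomized()              # construction phase (uses the RCL)
        x_k = local_search(x_k)                # improvement phase
        if F(x_k) < f_best:                    # keep the best solution so far
            x_best, f_best = x_k, F(x_k)
    return x_best, f_best
```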
13
Q

Genetic Algorithms

A

Simulate natural evolution processes, exploiting some well-known characteristics:

  • evolution operates on the chromosomes that characterise an individual, not directly on the individual
  • natural selection operates on individuals, in such a way that the chromosomes encoding the most adaptive individuals have a higher probability of surviving.
  • evolution happens through breeding: recombination of the parents’ genetic material and mutations enable children to be different from their parents.
  • the evolution process has no memory: all information is stored in the genetic heritage of the population.

Differs from SA and TS since it operates on populations of solutions, not on single solutions. Parallelism is intrinsic: the population can be evaluated in parallel.

Genetic algorithms can easily be hybridized with local search approaches, generating “memetic” algorithms; these differ from genetic algorithms because each new member of the population is “optimized” (steepest descent / first improvement) before being inserted into the population.

14
Q

Genetic Algorithms: operators

A
  • Crossover is performed on a pair of solutions
    • standard: select two cutting points and swap the resulting sections; this can lead to infeasibility (repeated elements)
    • order: as above, but avoids duplicates
    • partially mapped: each element of the cutting sections is mapped to its counterpart, and every conflicting instance is substituted with the corresponding mapping (see the sketch after this list).
      • p1: 1 2 | 3 4 5 | 6
      • p2: 4 3 | 2 1 6 | 5
      • 3 <=> 2
      • 4 <=> 1
      • 5 <=> 6
      • p1’ : 4 3 2 1 6 5
  • mutation: operates on a single chromosome, usually replacing a random gene with a random value. This operation is applied to a small fraction of the chromosomes (chosen at random).
    • note: this operator cannot be applied directly to TSP (replacing a gene would break the permutation).
  • inversion: similar to mutation, it modifies a single chromosome by inverting the order of a cutting section or by swapping its first and last elements.

These operators are applied with varying probabilities: usually the crossover percentage is high at the beginning and is then reduced to avoid local optima.
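A sketch of the partially mapped crossover described above, reproducing the p1/p2 example (the cut points are assumed to be given):

```python
def pmx(p1, p2, cut1, cut2):
    """Partially mapped crossover (one child): keep p1 outside the cut
    section, copy p2's cut section, and repair duplicates via the mapping
    between the two sections."""
    child = list(p1)
    child[cut1:cut2] = p2[cut1:cut2]                    # copy p2's section
    mapping = dict(zip(p2[cut1:cut2], p1[cut1:cut2]))   # p2 value -> p1 value
    for i in list(range(cut1)) + list(range(cut2, len(p1))):
        while child[i] in mapping:          # follow the mapping until no clash
            child[i] = mapping[child[i]]
    return child

# Example from the card: p1 = 1 2 | 3 4 5 | 6, p2 = 4 3 | 2 1 6 | 5
print(pmx([1, 2, 3, 4, 5, 6], [4, 3, 2, 1, 6, 5], 2, 5))  # -> [4, 3, 2, 1, 6, 5]
```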

15
Q

Genetic Algorithms: main steps

A
  1. Solutions are encoded as strings of finite length, composed of symbols from a finite alphabet. Each solution is called a chromosome, and each element of the solution is called a gene.
    A set of solutions is selected randomly to form the initial population. Usually solutions are generated randomly, differently from SA and TS, which use greedy algorithms.
  2. Each chromosome has a corresponding fitness value that indicates the quality of the solution; the fitness value may or may not coincide with the objective function.
  3. A subset of the population is selected randomly to breed; each solution is selected with a probability proportional to its fitness.
  4. From the selected subset a number of descendants are produced applying genetic operators.
  5. Some of the old chromosomes are removed from the population.
  6. Fitness of the new chromosomes is computed. With this step we begin a new iteration. Usually new members are kept in the population for at least one generation.
  7. The stopping criterion is usually based on the desired number of iterations (a minimal loop is sketched below).
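A minimal GA loop following these steps; random_chromosome, fitness, crossover and mutate are hypothetical placeholders, fitness is maximized and assumed non-negative:

```python
import random

def genetic_algorithm(random_chromosome, fitness, crossover, mutate,
                      pop_size=50, n_generations=100, p_mut=0.05):
    """Plain GA: fitness-proportional (roulette wheel) selection, crossover,
    occasional mutation, full generational replacement."""
    population = [random_chromosome() for _ in range(pop_size)]       # step 1
    for _ in range(n_generations):
        fits = [fitness(c) for c in population]                       # steps 2/6
        new_population = []
        while len(new_population) < pop_size:
            p1, p2 = random.choices(population, weights=fits, k=2)    # step 3
            child = crossover(p1, p2)                                 # step 4
            if random.random() < p_mut:
                child = mutate(child)
            new_population.append(child)
        population = new_population        # step 5: old chromosomes replaced
    return max(population, key=fitness)
```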
16
Q

Genetic Algorithms: Encoding

A

Strictly depends on the problem. Most problems use a binary encoding.

The alternative is defining a custom alphabet, to get easier decoding or to define custom genetic operators.

17
Q

Genetic Algorithms: Fitness

A

Measures the quality of a solution corresponding to a chromosome.

Used to compute the probability of breeding of a chromosome.

If we are dealing with a min f(x) problem, the fitness value could be set to 1/f(x).

We could also define a fitness function that favours fit individuals more strongly than non-fit ones, or vice versa (see the sketch below).
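A tiny sketch of the 1/f(x) fitness and the resulting breeding probabilities, assuming a minimization objective f with strictly positive values:

```python
def breeding_probabilities(population, f):
    """Fitness = 1 / f(x) for a minimization problem (f(x) > 0 assumed);
    each chromosome breeds with probability proportional to its fitness."""
    fitness = [1.0 / f(x) for x in population]
    total = sum(fitness)
    return [fit / total for fit in fitness]

# Lower objective value -> higher fitness -> higher selection probability.
print(breeding_probabilities([2.0, 4.0, 8.0], f=lambda x: x))
# -> [0.571..., 0.285..., 0.142...]
```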

18
Q

Genetic algorithms: parameter tuning

A

Usually a setup phase is needed to understand what values could work.

Parameters that influence the performance should be accurately tuned.

Main parameters:

  • Population size
  • Replacement percentage: how many new members?
  • Number of generations: depends on the computational time available.
  • Probability of applying the different genetic operators.
19
Q

Ant colony optimization

A