Heuristics Flashcards

1
Q

Heuristics

A

Algorithms that guarantee feasible solutions, but not necessarily optimal ones.

Can be grouped into: constructive polynomial heuristics and local search heuristics.

2
Q

Polynomial constructive heuristics

A
  • Myopic (greedy) rules: the solution is generated progressively; at each iteration we have a partial solution, and an extension of it is built by selecting one of the available options.
    • To prevent greedy techniques from getting stuck in local optima, randomization techniques can be used (see the sketch after this list).
  • Relaxation of exact methods:
    • Beam search: relaxes the branch-and-bound method.
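
A minimal sketch of a randomized greedy construction (in the spirit of GRASP); `items` and `score` are hypothetical placeholders for the problem's available options and its myopic evaluation:

```python
import random

def randomized_greedy(items, score, k=3, seed=None):
    """Greedy construction with randomization.

    Hypothetical interface: `items` are the available options and
    `score(partial, item)` rates how good it is to extend the current
    partial solution with `item` (higher is better).
    """
    rng = random.Random(seed)
    remaining, partial = list(items), []
    while remaining:
        # Rank the available extensions of the partial solution.
        ranked = sorted(remaining, key=lambda it: score(partial, it), reverse=True)
        # Instead of always taking the best option (pure greedy),
        # pick at random among the top k to escape bad myopic choices.
        chosen = rng.choice(ranked[:k])
        partial.append(chosen)
        remaining.remove(chosen)
    return partial
```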
3
Q

Beam search

A

We expand the search tree breadth-first; at each level of the tree we only keep the best k nodes (k is the beam size), e.g. the nodes with the highest upper bound for a maximization problem.

This is a heuristic because the way the solution is constructed does not guarantee optimality; on the other hand, the resulting algorithm is not exponential.

Node evaluation can be done:

  • using a greedy algorithm
  • considering the optimal solution of the relaxed problem associated with the node
  • computing the average of the past nodes

Complexity: O(αkβ), where α is the height of the tree and β the number of children of a generic node.
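
A minimal sketch of the expansion loop under these assumptions; `children` and `bound` are hypothetical hooks for node expansion and evaluation (e.g. the upper bound of the relaxed problem):

```python
def beam_search(root, children, bound, k):
    """Breadth-first expansion keeping only the best k nodes per level.

    Hypothetical interface: `children(node)` returns the extensions of a
    partial solution (empty for a complete one) and `bound(node)` its
    evaluation, e.g. the node's upper bound for a maximization problem.
    """
    beam, complete = [root], []
    while beam:
        candidates = []
        for node in beam:
            kids = children(node)
            if kids:
                candidates.extend(kids)
            else:
                complete.append(node)  # leaf: a full solution
        # Keep only the k most promising nodes for the next level.
        beam = sorted(candidates, key=bound, reverse=True)[:k]
    # Return the best complete solution found (None if none was reached).
    return max(complete, key=bound, default=None)
```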

4
Q

Recovering Beam Search

A

Beam search, but able to recover from bad decisions using a recovery step that, at each level of the tree, looks for partial solutions better than the selected ones (obtained by modifying those solutions, e.g. with swap operations).
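
A minimal sketch of a swap-based recovery step, assuming a partial solution is encoded as a sequence and evaluated by the same hypothetical `bound` function as in the previous card:

```python
from itertools import combinations

def recovery_step(partial, bound):
    """Swap-based recovery: keep the best variant of a partial solution.

    Hypothetical setup: `partial` is a sequence of decisions and
    `bound(sol)` evaluates a solution (higher is better). If swapping
    two positions beats the original, we recover to that solution.
    """
    best = list(partial)
    for i, j in combinations(range(len(partial)), 2):
        candidate = list(partial)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        if bound(candidate) > bound(best):
            best = candidate
    return best
```

In recovering beam search this step is applied to the nodes that survive the pruning, so a bad myopic extension can be undone before expanding the next level.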

5
Q

Steepest Descent

A

At each iteration, evaluate all the available neighbors of the current solution and select the most promising one. Stop when no neighbor improves on the current solution.
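
A minimal sketch, assuming a minimization problem with hypothetical `neighbors` and `cost` hooks:

```python
def steepest_descent(start, neighbors, cost):
    """Move to the best neighbor until no neighbor improves the solution.

    Hypothetical interface: `neighbors(sol)` generates the whole
    neighborhood of `sol`; `cost(sol)` is the objective to minimize.
    """
    current = start
    while True:
        # Exhaustively evaluate the neighborhood and take the best move.
        best = min(neighbors(current), key=cost, default=current)
        if cost(best) >= cost(current):
            return current  # local optimum: no improving neighbor
        current = best
```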

6
Q

First improvement

A

First improvement takes the first solution that is better than the current one. It suffers from getting stuck in local optima.

First improvement and steepest descent can be combined to get the best of both worlds, as sketched below:

  • First improvement is used at the beginning to quickly reduce the solution space.
  • Steepest descent is then used to exhaustively search the neighborhood of the solution found with first improvement.
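
A minimal sketch of the combination; `neighbors` and `cost` are the same hypothetical hooks as before, and `steepest_descent` is the sketch from the previous card:

```python
def first_improvement(start, neighbors, cost):
    """Scan the neighborhood and accept the first improving move."""
    current, improved = start, True
    while improved:
        improved = False
        for candidate in neighbors(current):
            if cost(candidate) < cost(current):
                current = candidate  # take the first improving neighbor
                improved = True
                break
    return current

def combined_descent(start, neighbors, cost):
    """First improvement to descend quickly, then steepest descent to
    exhaustively search the neighborhood of the result."""
    rough = first_improvement(start, neighbors, cost)
    return steepest_descent(rough, neighbors, cost)  # sketch from the previous card
```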
7
Q

Local search

A
  • Main features (a generic skeleton follows after this list):
    • Common to all LS algorithms:
      • Initialization: find an initial feasible solution.
        • Not always trivial; when it is hard, we can set up a new optimization problem that minimizes the infeasibility and use its solution to initialize the real problem.
      • Neighborhood generation and evaluation:
        • Solution encoding.
        • Definition of the neighborhood and its size: the operators to apply to move from one solution to another.
        • Complexity of the objective function.
    • Depending on the heuristic used:
      • Acceptance test.
      • Stopping condition:
        • maximum number of iterations
        • allocated time
        • % of improvement
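
A generic skeleton tying these features together; every hook (`neighbors`, `cost`, `accept`, `stop`) is a hypothetical placeholder for the problem- and heuristic-specific choices listed above:

```python
def local_search(initial, neighbors, cost, accept, stop):
    """Generic local search skeleton.

    Hypothetical hooks: `accept(old_cost, new_cost)` is the acceptance
    test and `stop(iteration, current_cost)` the stopping condition
    (max iterations, allocated time, % of improvement, ...); both
    depend on the specific heuristic.
    """
    current = initial                  # initialization: a feasible solution
    iteration = 0
    while not stop(iteration, cost(current)):
        for candidate in neighbors(current):   # neighborhood generation
            if accept(cost(current), cost(candidate)):
                current = candidate            # acceptance test passed
                break
        iteration += 1
    return current
```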