Exact Methods Flashcards

1
Q

Lower bound computation (for exact methods)

A

The computation procedure is determined by the structure of the problem, especially for NP-hard problems whose ILP (integer linear programming) model is impractical to use (e.g. too many integer variables).

There are 4 ways of calculating the lower bound:

  • Problem relaxation by constraint elimination
  • Modifying the objective function
  • Lagrangian Relaxation
  • Surrogate Relaxation (hardly used in practice, since the surrogate problem is usually not polynomial)
2
Q

Problem relaxation by constraint elimination

A

Constraints are removed or relaxed (e.g. integrality constraints are dropped, so integer variables become continuous ones).
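For illustration, a minimal Python sketch of this idea on a hypothetical 0/1 knapsack instance (the same numbers as the small example in a later card): dropping the integrality constraints gives the LP relaxation, which for a maximization knapsack is solved greedily by profit/weight ratio with at most one fractional item (Dantzig's bound).

def knapsack_lp_bound(profits, weights, capacity):
    # LP relaxation of 0/1 knapsack: integrality is dropped, so items may be taken fractionally.
    # The relaxed optimum is reached greedily by decreasing profit/weight ratio.
    items = sorted(range(len(profits)), key=lambda j: profits[j] / weights[j], reverse=True)
    bound, remaining = 0.0, capacity
    for j in items:
        take = min(1.0, remaining / weights[j])   # fraction of item j that still fits
        bound += take * profits[j]
        remaining -= take * weights[j]
        if remaining <= 0:
            break
    return bound

print(knapsack_lp_bound([10, 4, 14], [3, 1, 4], 4))   # 14.5, a bound for this max instance (integer optimum is 14)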

3
Q

Problem relaxation by modifying the objective function

A

The modified problem P1 should be solvable in polynomial time; otherwise computing the bound would not be efficient enough.

4
Q

Lagrangian Relaxation

A
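A sketch of the standard definition, using the notation of the subgradient card below: the hard constraints are removed from the constraint set and moved into the objective function, weighted by Lagrange multipliers u. For a minimization problem min cx s.t. Dx >= d, x in X, relaxing the hard constraints Dx >= d gives the Lagrangian subproblem

w(u) = min { cx + u(d - Dx) : x in X }, with u >= 0.

For every such u, w(u) is a lower bound on the optimal value, and the remaining subproblem over the easy constraints X should be efficiently solvable.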
5
Q

Lagrangian Relaxation: equality and inequality constraints

A
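A sketch of the standard convention: relaxed inequality constraints take sign-restricted multipliers (u >= 0 for Dx >= d in a minimization problem, so that u(d - Dx) <= 0 for every feasible x and the bound property is preserved), while relaxed equality constraints Dx = d take free multipliers (unrestricted in sign), since u(d - Dx) = 0 for every feasible x regardless of the sign of u.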
6
Q

Solving the Lagrangian dual

A

When the constraints (both easy and hard) are linear in x, general exact methods exist.

The subgradient method is one such method; it converges to the optimal lower bound.

Algorithm:

  1. k = 0; u_0 is given.
  2. Solve the Lagrangian subproblem w(u_k) = min L(x, u_k) over x in X, and get an optimal x* = x(u_k).
  3. Using x*, derive a subgradient of w at u_k:
    set u_{k+1} = max(0, u_k + a_k (d - D x(u_k))).
  4. Set k = k + 1 and go to 2 until u converges or the bound falls within a pre-defined range.
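A minimal Python sketch of this loop, assuming the relaxed hard constraints are Dx >= d in a minimization problem and that a problem-specific oracle solve_subproblem(u) returns both x(u) and the value w(u); the diminishing step size a_k = 1/(k+1) is one classic (here assumed) choice.

import numpy as np

def subgradient_method(solve_subproblem, d, D, u0, steps=50):
    # Projected subgradient method for the Lagrangian dual of min cx s.t. Dx >= d, x in X,
    # where the constraints Dx >= d have been relaxed with multipliers u >= 0.
    u = np.asarray(u0, dtype=float)
    best_bound = -np.inf
    for k in range(steps):
        x, w_u = solve_subproblem(u)              # step 2: solve w(u_k) = min L(x, u_k), get x(u_k)
        best_bound = max(best_bound, w_u)         # every w(u) is a valid lower bound; keep the best
        g = d - D @ np.asarray(x, dtype=float)    # step 3: a subgradient of w at u_k
        a_k = 1.0 / (k + 1)                       # diminishing step size
        u = np.maximum(0.0, u + a_k * g)          # projected update keeps u >= 0
    return u, best_bound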
7
Q

Lagrangian Example Exact Solution

A

In this example, an optimal solution of the Lagrangian subproblem w(u) can be found by setting to 1 every variable x_j whose coefficient in L(x, u) is > 0 (and to 0 otherwise).

8
Q

Lagrangian Example Subgradient

A

max 10x1 + 4x2 + 14x3

s.t. 3x1 + x2 + 4x3 <= 4

x1, x2, x3 binary.
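A minimal Python sketch of the subgradient iterations on this instance; since the example is a maximization problem, the relaxation w(u) is an upper bound, so the multiplier is moved against the subgradient to tighten it (the step size 1/(k+1) is an assumption).

def solve_subproblem(u):
    # w(u) = max 10x1 + 4x2 + 14x3 + u*(4 - 3x1 - x2 - 4x3) over binary x:
    # each x_j is set to 1 iff its coefficient p_j - u*w_j is positive (as in the previous card).
    p, w = [10, 4, 14], [3, 1, 4]
    x = [1 if pj - u * wj > 0 else 0 for pj, wj in zip(p, w)]
    slack = 4 - sum(wj * xj for wj, xj in zip(w, x))
    value = sum(pj * xj for pj, xj in zip(p, x)) + u * slack
    return x, value, slack            # slack is a subgradient of w at u

u = 0.0
for k in range(10):
    x, w_u, g = solve_subproblem(u)
    print(f"k={k}  u={u:.3f}  w(u)={w_u:.3f}  subgradient={g}")
    u = max(0.0, u - g / (k + 1))     # move opposite to the subgradient to decrease the upper bound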

9
Q

Dynamic Programming Optimality Principle

A

Also called the Bellman principle: if we consider an optimal solution and a sub-solution of it, then that sub-solution is an optimal solution of the corresponding sub-problem.

The shortest path algorithm is an example of it: any sub-path from i to j of a shortest path is itself an optimal solution of the shortest-path problem from node i to node j.
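A small Python sketch illustrating the principle on a hypothetical graph: Dijkstra's algorithm is correct precisely because any prefix of a shortest path is itself a shortest path, so optimal labels can be extended edge by edge.

import heapq

def dijkstra(graph, source):
    # Shortest distances from source; optimal labels are extended one edge at a time.
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:          # the optimal label of v extends the optimal label of u
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

# Hypothetical example graph as an adjacency list.
graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 5)], "c": [("d", 1)], "d": []}
print(dijkstra(graph, "a"))   # {'a': 0, 'b': 1, 'c': 3, 'd': 4}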

10
Q

Relationship between UB and LB of a node i and its parent j

A

In min. problems: the lower bound of a child node is greater than or equal to the lower bound of its father. No such relationship holds for the upper bound.

In max. problems: the upper bound of the father is >= the maximum of the upper bounds of its children.

11
Q

KP 0/1 Dynamic Programming optimality recursive relation

A
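A sketch of the standard recursion, assuming profits p_j, weights w_j and integer capacity c: f(j, c) = max( f(j-1, c), f(j-1, c - w_j) + p_j ) when item j fits, f(j, c) = f(j-1, c) otherwise, with f(0, c) = 0. A minimal Python version:

def knapsack_dp(profits, weights, capacity):
    # f[j][c] = best profit using the first j items with capacity c.
    n = len(profits)
    f = [[0] * (capacity + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        for c in range(capacity + 1):
            f[j][c] = f[j - 1][c]                          # item j not taken
            if weights[j - 1] <= c:                        # item j taken, if it fits
                f[j][c] = max(f[j][c], f[j - 1][c - weights[j - 1]] + profits[j - 1])
    return f[n][capacity]

print(knapsack_dp([10, 4, 14], [3, 1, 4], 4))   # 14 on the small instance used earlier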
12
Q

Proof that the Lagrangian relaxation is a lower bound of a problem

A
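A sketch of the standard argument, in the notation of the subgradient card (minimization problem min cx s.t. Dx >= d, x in X, with the hard constraints Dx >= d relaxed): for any u >= 0 and any x feasible for the original problem, d - Dx <= 0, hence u(d - Dx) <= 0 and L(x, u) = cx + u(d - Dx) <= cx. Minimizing L(x, u) over the larger set X can only give a smaller value, so w(u) = min { L(x, u) : x in X } <= min { cx : x feasible } = z*. Therefore w(u) is a lower bound for every u >= 0, and so is the Lagrangian dual max { w(u) : u >= 0 }.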