Autumn 2021 Paper (Long Questions) Flashcards

1
Q

What is meant by polynomial time complexity in the context of linear programs? Comment on the computational complexity of the Simplex Algorithm, the Ellipsoid Algorithm, and the interior point method for solving a linear program. Which of these is known to run in polynomial time and which are fast in practice?

A

Polynomial Time Complexity:
An algorithm runs in polynomial time if its running time can be expressed as a polynomial function of the size of the input data.

Simplex Algorithm:
The simplex algorithm does not run in polynomial time in the worst case: on pathological instances (such as the Klee-Minty cube) it can take exponential time.
Practical Speed: The simplex algorithm is very efficient and is the preferred method for most LP problems, thanks to its typically fast performance on real-world instances.

Ellipsoid Algorithm:
The Ellipsoid Algorithm runs in polynomial time. It was the first algorithm proved to solve LP problems in polynomial time.
Practical Speed: Despite its theoretical importance, the Ellipsoid algorithm is not efficient in practice when compared to other methods and is rarely used for actual LP problems.

Interior Point Methods:
Interior point methods also run in polynomial time and are more efficient in practice than the Ellipsoid Algorithm.
Practical Speed: Very efficient for large-scale LPs and commonly used in practice; they are often preferred over simplex when the problem is very large.
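The geometry behind simplex's worst-case behavior can be sketched with a toy script (an illustration, not a real solver; the LP instance and helper names below are invented). It enumerates the vertices of a tiny 2-variable LP's feasible region, which are exactly the points the simplex algorithm pivots between; the number of such vertices can grow exponentially with the number of constraints.

```python
from itertools import combinations

# Toy LP (invented for illustration): maximize 3x + 2y
#   subject to  x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
# Constraints are stored as (a, b, c) meaning a*x + b*y <= c.
constraints = [
    (1.0, 1.0, 4.0),   # x + y <= 4
    (1.0, 0.0, 2.0),   # x <= 2
    (-1.0, 0.0, 0.0),  # x >= 0
    (0.0, -1.0, 0.0),  # y >= 0
]

def feasible(x, y, tol=1e-9):
    return all(a * x + b * y <= c + tol for a, b, c in constraints)

def vertices():
    # In 2-D, each vertex is the intersection of two constraint lines;
    # checking every pair is the brute-force analogue of what simplex
    # walks through vertex by vertex.
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel lines never intersect
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if feasible(x, y):
            yield x, y

# An optimal basic solution of an LP always lies at a vertex.
best = max(vertices(), key=lambda v: 3 * v[0] + 2 * v[1])
print(best, 3 * best[0] + 2 * best[1])  # (2.0, 2.0) with objective 10.0
```

Here the polytope has only four vertices, but with n variables and 2n bound-style constraints a polytope can have 2^n vertices, which is why vertex-walking methods are not polynomial in the worst case.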

2
Q

Explain what is the meaning of the sensitivity interval associated with the right-hand side term of a constraint. If the right-hand side term of a constraint changes, but without leaving its sensitivity interval, which of the following will remain unchanged
(i) the composition of the basis, or (ii) the value of the basic variables, or (iii) the value of the objective function?

A

Sensitivity Interval:
This term refers to the range of values that the RHS of a constraint can take without changing the optimal basis of the solution. Within the interval, the same basis remains optimal, but the specific values of the basic variables and of the objective function may change.

Changes within the Sensitivity Interval:
(i) Composition of the basis: Does not change
(ii) Value of the basic variables: May change
(iii) Value of objective function: Changes linearly, at a rate equal to the constraint's shadow price (dual value)
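A small worked example makes the three answers concrete (the LP and its numbers are invented for illustration). For max 3x + 2y subject to x + y ≤ b1 and x ≤ 2 (with x, y ≥ 0), the optimal basis has both constraints binding, so the basic solution is x = 2, y = b1 − 2; this basis stays optimal for all b1 ≥ 2, which is the sensitivity interval here.

```python
# Toy LP (invented for illustration): max 3x + 2y
#   s.t.  x + y <= b1,  x <= 2,  x >= 0,  y >= 0.
# With both constraints binding, the basis determines x = 2, y = b1 - 2.
def basic_solution(b1):
    x, y = 2.0, b1 - 2.0
    if y < 0:
        raise ValueError("b1 left the sensitivity interval; basis infeasible")
    return (x, y), 3 * x + 2 * y

objectives = []
for b1 in (3.0, 4.0, 5.0):
    (x, y), obj = basic_solution(b1)
    objectives.append(obj)
    print(f"b1={b1}: basis unchanged, y={y}, objective={obj}")

# The basis never changes, the basic variable y moves with b1, and the
# objective rises by exactly 2 (the shadow price) per unit increase in b1.
```

The printed values (8, 10, 12) show the linear dependence: each unit of extra right-hand side is worth the shadow price of 2.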

3
Q

State the definition for an optimization problem to be convex. Moreover, give a simple example for both, (a) a convex optimization problem, and (b) a non-convex
optimization problem.

A

A function 𝑓:𝑅^𝑛→𝑅 is convex if for any two points
𝑥, 𝑦 in its domain and for any 𝑡 ∈[0,1], the following inequality holds:
𝑓(𝑡𝑥+(1−𝑡)𝑦)≤𝑡𝑓(𝑥)+(1−𝑡)𝑓(𝑦)

An optimization problem is convex if its objective function is convex, all of its inequality constraint functions are convex, and its equality constraints are affine; equivalently, a convex objective is minimized over a convex feasible set.

Examples:
Convex: Minimize f(x) = x^2 subject to x ≥ 0.
Non-Convex: Minimize f(x) = cos(x) subject to x ∈ (0, 2π).
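The two examples can be checked numerically against the defining inequality above. This is a rough sampled check rather than a proof, and `convex_on_samples` is a made-up helper name:

```python
import math

# Test f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y) over a grid of points
# and interpolation weights (small tolerance absorbs rounding error).
def convex_on_samples(f, xs, ts):
    return all(
        f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
        for x in xs for y in xs for t in ts
    )

xs = [i * 0.1 for i in range(1, 63)]   # sample points inside (0, 2*pi)
ts = [0.0, 0.25, 0.5, 0.75, 1.0]

print(convex_on_samples(lambda x: x * x, xs, ts))  # True: x^2 is convex
print(convex_on_samples(math.cos, xs, ts))         # False: cos violates it here
```

For instance, cos fails the inequality with x = 0.2, y = 1.2, t = 0.5: cos(0.7) ≈ 0.76 exceeds the chord value (cos(0.2) + cos(1.2)) / 2 ≈ 0.67.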

4
Q

What is the difference between gradient method and Newton method? What is the basic idea behind the Newton method?

A

Gradient Method:
Uses the gradient of the function to find the direction in which the function decreases the fastest. The method iteratively updates the variable x in the direction opposite to the gradient.

Newton Method:
Uses both the gradient and the Hessian matrix to determine the update direction. It typically converges in fewer iterations than gradient methods because it accounts for the curvature of the function, at the cost of computing (and inverting) the Hessian at each step.
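The contrast shows up clearly on a toy one-dimensional quadratic (the objective and step size below are invented for illustration): gradient descent inches toward the minimizer with many fixed-size steps, while Newton's method divides the gradient by the second derivative and, on a quadratic, lands on the minimizer in a single step.

```python
# Toy objective (invented for illustration): f(x) = (x - 3)^2 + 1,
# with gradient f'(x) = 2(x - 3) and constant Hessian f''(x) = 2.
def grad(x):
    return 2.0 * (x - 3.0)

def hess(x):
    return 2.0

def gradient_descent(x, lr=0.1, steps=50):
    # Move opposite the gradient with a fixed learning rate.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def newton(x, steps=1):
    # Scale the gradient step by the inverse curvature.
    for _ in range(steps):
        x -= grad(x) / hess(x)
    return x

print(gradient_descent(0.0))  # approaches 3 through many small steps
print(newton(0.0))            # reaches 3 in a single step on a quadratic
```

On non-quadratic functions Newton's method still needs several iterations, but near the optimum its use of curvature gives much faster (quadratic) convergence than the gradient method's linear rate.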

5
Q

What is the difference between local and global optima? Can a convex optimization
problem have multiple local minima?

A

Local Optima:
Points where the function value is lower than at all nearby feasible points. A function may have multiple local optima.

Global Optima:
The absolute best value of the function over its entire domain.

Convex Function:
A convex function has the property that any local minimum is also a global minimum. Thus, a convex optimization problem cannot have multiple local minima; it has only one global minimum.
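This difference can be demonstrated with multi-start gradient descent (a toy sketch; the step size and starting points are invented). On a non-convex function, different starts get trapped in different local minima; on a convex function, every start reaches the same global minimum.

```python
import math

# Plain gradient descent from a given start (toy illustration).
def descend(f_grad, x, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * f_grad(x)
    return x

starts = [2.0, 8.0]

# Non-convex f(x) = cos(x), f'(x) = -sin(x): the two starts land in
# different local minima (near pi and near 3*pi).
mins_nonconvex = [round(descend(lambda x: -math.sin(x), s), 3) for s in starts]

# Convex f(x) = x^2, f'(x) = 2x: every start reaches the single
# global minimum at 0.
mins_convex = [round(descend(lambda x: 2 * x, s), 3) for s in starts]

print(mins_nonconvex)  # two distinct local minima
print(mins_convex)     # [0.0, 0.0]
```

This is why convexity matters so much in practice: for convex problems, any local descent method that converges has found the global optimum.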

6
Q

Explain what is meant by NP-Hard optimization problems

A

NP-Hard problems are at least as hard as the hardest problems in NP. An optimization problem is NP-Hard if the decision version (yes/no question) of the problem is NP-Complete.

This classification suggests that there is no known polynomial time algorithm to solve these problems optimally for all possible instances, and it is unlikely one exists (assuming P != NP)
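As a concrete illustration (the instance below is invented): the 0/1 knapsack problem is a classic NP-hard optimization problem, and the straightforward exact method enumerates all 2^n item subsets, so its running time grows exponentially with the number of items.

```python
from itertools import combinations

# Tiny 0/1 knapsack instance (invented for illustration):
# pick items maximizing total value without exceeding the capacity.
values = [10, 13, 7, 8]
weights = [3, 4, 2, 3]
capacity = 6

best = 0
n = len(values)
# Brute force: try every one of the 2^n subsets of items.
for r in range(n + 1):
    for subset in combinations(range(n), r):
        if sum(weights[i] for i in subset) <= capacity:
            best = max(best, sum(values[i] for i in subset))

print(best)  # 20, from taking items 1 and 2 (weight 4 + 2 = 6)
```

For four items this is only 16 subsets, but for 100 items it would be about 10^30 — the exponential blow-up that the NP-hardness classification predicts for exact worst-case solution, which is why such problems are usually attacked with heuristics, approximation algorithms, or branch-and-bound in practice.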
