Lecture 9 - Univariate Optimisation Flashcards

1
Q

What is optimisation?

A

Finding the maximum or minimum of a function

2
Q

What is the objective function for least squares?

A

The sum of the residuals squared

3
Q

How to find maximum (log)likelihood?

A

Propose a model f(x, θ) for the data.
Form the likelihood L(θ) = the product of f(xi, θ) over the observations (this is the objective function).
Find the parameter values that maximise the likelihood (equivalently, the log-likelihood).
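The steps above can be sketched in a few lines of Python. This is a minimal illustration, not from the lecture: the data are hypothetical, the model is assumed to be Exponential(rate), and the maximisation is a crude grid search over candidate rates.

```python
import math

# Hypothetical data, assumed drawn from an Exponential(rate) model
data = [0.8, 1.3, 0.4, 2.1, 0.9]

def log_likelihood(rate, xs):
    # log L(theta) = sum of log f(x_i, theta); the log turns the product into a sum
    return sum(math.log(rate) - rate * x for x in xs)

# Crude maximisation over a grid of candidate rates
candidates = [0.01 * k for k in range(1, 500)]
mle = max(candidates, key=lambda r: log_likelihood(r, data))
```

For the exponential model the MLE has a closed form (1 / sample mean), so the grid result can be checked against it.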

4
Q

What is a systematic line search?

A

Start by bracketing the maximum (find an interval [a, b] known to contain the maximum).
Divide this interval into a regular set of values x, spaced e apart.
Evaluate the function g at each x.
Choose the x with the largest g(x).
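A minimal Python sketch of the systematic line search described above (the function name and example objective are mine, not from the lecture):

```python
def line_search(g, a, b, e):
    """Systematic line search: evaluate g on a regular grid of spacing e
    over the bracketing interval [a, b] and return the best x."""
    n = int((b - a) / e)
    xs = [a + k * e for k in range(n + 1)]
    return max(xs, key=g)

# Example: maximise g(x) = -(x - 2)^2 on [0, 5] with spacing 0.1
best = line_search(lambda x: -(x - 2) ** 2, 0.0, 5.0, 0.1)
```

The answer is only accurate to within the grid spacing e, which is why iterative methods can be much more efficient.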

5
Q

What are pros and cons of iterative methods?

A

Pro:
Can be considerably more efficient

Con:
Can be less reliable

6
Q

What is the general iterative algorithm?

A
  • Start at iteration t = 0 with an initial guess x(0)
  • Start the next iteration, t = t + 1
  • At iteration t, produce an improved guess x(t) using an updating equation, x(t) = f(x(t-1))
  • Evaluate whether the new guess is sufficiently accurate using a stopping rule
  • If yes, stop and return x* = x(t) and g(x*) = g(x(t))
  • If no, consider whether to report non-convergence; otherwise go back to step 2.
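The general algorithm above can be written as a reusable skeleton. This is a sketch under my own naming (`iterate`, `update`); it uses the absolute convergence criterion from later cards as its stopping rule and the fixed-point map x → cos(x) as a stand-in updating equation.

```python
import math

def iterate(update, x0, tol=1e-8, max_iter=100):
    """Generic iterative scheme: x(t) = update(x(t-1)) until the absolute
    convergence criterion |x(t) - x(t-1)| < tol is met.
    Returns (x, converged); converged=False reports non-convergence."""
    x = x0
    for _ in range(max_iter):
        x_new = update(x)
        if abs(x_new - x) < tol:
            return x_new, True
        x = x_new
    return x, False

# Example updating equation: x -> cos(x), whose fixed point is ~0.739
root, ok = iterate(math.cos, 1.0)
```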
7
Q

What is the algorithm for the bisection method?

A
  • Start at iteration t = 0 with a bracketing interval (a(0), b(0))
  • Set the starting value x(0) = (a(0) + b(0))/2
  • Start the next iteration, t = t + 1
  • Update (a(t), b(t), x(t)) according to the sign check: if g'(a(t-1))g'(x(t-1)) <= 0, set (a(t), b(t)) = (a(t-1), x(t-1)); otherwise set (a(t), b(t)) = (x(t-1), b(t-1)). Then x(t) = (a(t) + b(t))/2
  • If the stopping rule is met, stop and return x* = x(t)
    else go to 3.
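The bisection steps can be sketched directly in Python (function name and example objective are mine; the sign check and midpoint update are the standard bisection rule):

```python
def bisection(g_prime, a, b, tol=1e-8, max_iter=200):
    """Bisection for a root of g'(x) = 0, assuming g' is continuous
    on [a, b] with g'(a) * g'(b) <= 0."""
    x = (a + b) / 2
    for _ in range(max_iter):
        if g_prime(a) * g_prime(x) <= 0:
            b = x  # a root lies in [a, x]
        else:
            a = x  # a root lies in [x, b]
        x_new = (a + b) / 2
        if abs(x_new - x) < tol:  # absolute convergence criterion
            return x_new
        x = x_new
    return x

# Maximise g(x) = -(x - 3)^2, so g'(x) = -2(x - 3) with root at x = 3
opt = bisection(lambda x: -2 * (x - 3), 0.0, 10.0)
```

Each iteration halves the bracketing interval, so the error shrinks by a fixed factor per step (linear convergence).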
8
Q

What is the bisection method?

A

A simple example of an iterative method for finding a root of g'(x) = 0

9
Q

What is implied if g'(x) is continuous on some interval [a, b] and g'(a)g'(b) <= 0?

A

There is at least one x* in [a, b] for which g'(x*) = 0, making x* a local optimum of g.

10
Q

What is convergence order?

A

An index of how fast convergence happens: a method has convergence order β if the error satisfies |ε(t+1)| ≈ c|ε(t)|^β for some constant c (e.g. β = 1 is linear, β = 2 is quadratic).

11
Q

What stopping criteria are not reliable?

A

Criteria based on how close g’(x) is to 0

12
Q

What are the two main types of convergence criteria?

A

Absolute and relative

13
Q

What is the absolute convergence criterion?

A

Stop when |x(t) - x(t-1)| < e, for some desired maximum imprecision e.

14
Q

What is the problem with the absolute convergence criterion?

A

It demands the same absolute imprecision regardless of the scale of the numbers involved, which is too strict for large values and too loose for small ones.

15
Q

What is the relative convergence criterion?

A

( |x(t) - x(t-1)| / |x(t-1)| ) < e
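Both criteria are one-liners in Python. A minimal sketch (function names are mine) showing why the relative criterion scales with the size of the iterates while the absolute one does not:

```python
def absolute_converged(x_t, x_prev, e=1e-8):
    # Absolute criterion: |x(t) - x(t-1)| < e
    return abs(x_t - x_prev) < e

def relative_converged(x_t, x_prev, e=1e-8):
    # Relative criterion: |x(t) - x(t-1)| / |x(t-1)| < e
    return abs(x_t - x_prev) / abs(x_prev) < e

# With iterates near a million, a step of 0.5 is relatively tiny:
# the relative criterion accepts it while the absolute one does not.
rel = relative_converged(1000000.5, 1000000.0, e=1e-3)  # True
abs_ = absolute_converged(1000000.5, 1000000.0, e=1e-3)  # False
```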

16
Q

What are partial remedies when numerical (computer) imprecision prevents convergence?

A
  • Use more numerically stable updates
  • Accept "good enough" convergence
  • Stop after n iterations whether or not convergence has been achieved
  • Stop if the absolute/relative convergence criteria or |g'(x)| fail to decrease over several iterations
17
Q

Why might you miss the global optimum?

A

You may have failed to bracket it, or there may be multiple optima and the one bracketed is not the best one.