Week 2 Flashcards

1
Q

What is a generic optimization algorithm (or problem)?

A

Finding a parameter vector theta that minimizes (or maximizes) an objective function f(theta). A generic iterative algorithm starts from an initial guess theta_0 and repeatedly updates theta_{k+1} = theta_k + a_k d_k, where d_k is a search direction and a_k a step size.
2
Q

What is the Finite Differences (FD) method?

A

We have the direction found using a finite-difference approximation of the gradient: each component is estimated as g_i(theta) ≈ (f(theta + c e_i) - f(theta - c e_i)) / (2c), where e_i is the i-th unit vector and c > 0 is a small perturbation size.

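A minimal sketch of a two-sided (central) finite-difference gradient estimate. The function and variable names here are illustrative, and a one-sided (forward) difference is an equally common variant:

```python
import numpy as np

def fd_gradient(f, theta, c=1e-4):
    """Two-sided finite-difference estimate of the gradient of f at theta."""
    theta = np.asarray(theta, dtype=float)
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = 1.0
        # Central difference along coordinate i: 2 evaluations per coordinate.
        g[i] = (f(theta + c * e) - f(theta - c * e)) / (2 * c)
    return g

# Quadratic f(x) = x'x has exact gradient 2x, so the estimate should match it.
g = fd_gradient(lambda x: x @ x, np.array([1.0, -2.0]))
```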
3
Q

Why is FD biased?

A

Because c > 0 is finite: a Taylor expansion of f around theta shows that the central-difference estimate equals the true derivative plus an O(c^2) remainder involving higher derivatives of f, which vanishes only in the limit c -> 0.
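The O(c^2) bias can be checked numerically; this is an illustrative sketch, not course material. For f(x) = x^3 at x = 1 the central-difference error is exactly c^2 (since f'''(x) = 6 and the leading error term is f'''(x) c^2 / 6):

```python
def central_fd(f, x, c):
    # Noise-free two-sided difference; error for smooth f is f'''(x)*c**2/6 + ...
    return (f(x + c) - f(x - c)) / (2 * c)

f = lambda x: x ** 3                   # true derivative at x = 1 is 3
est_big = central_fd(f, 1.0, 0.1)      # bias c**2 = 0.01
est_small = central_fd(f, 1.0, 0.01)   # bias c**2 = 0.0001
```

Shrinking c by a factor of 10 shrinks the bias by a factor of 100, as expected for an O(c^2) error.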
4
Q

How to compute the variance of the FD-estimator? What do we want the value of c to be?

A

With noisy evaluations y = f(theta) + eps, where eps has variance sigma^2, the central-difference estimator g_i = (y_plus - y_minus) / (2c) has variance sigma^2 / (2 c^2). We therefore want c small enough to keep the O(c^2) bias down, but large enough to keep the variance down: the choice of c trades off bias against variance.
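A seeded Monte Carlo sketch of how measurement noise inflates the FD estimator's variance when c is small (sigma and c are arbitrary illustrative values). For a central difference with i.i.d. noise of variance sigma^2, the estimator variance is sigma^2 / (2 c^2):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, c, n = 0.1, 0.01, 200_000

# Taking f identically 0 isolates the noise: ghat = (eps_plus - eps_minus)/(2c).
eps_plus = rng.normal(0.0, sigma, n)
eps_minus = rng.normal(0.0, sigma, n)
ghat = (eps_plus - eps_minus) / (2 * c)

empirical = ghat.var()
theoretical = sigma ** 2 / (2 * c ** 2)  # sigma^2/(2 c^2) ≈ 50 for these values
```

Note how large the variance is relative to the noise level sigma = 0.1: dividing by 2c amplifies the noise, which is why c cannot be made arbitrarily small.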
5
Q

How to implement FD?

A

For each coordinate i, evaluate f at theta + c e_i and theta - c e_i and form the difference quotient; in p dimensions this takes 2p evaluations per gradient estimate. The estimated gradient is then plugged into a gradient-descent update.
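A minimal FD-based descent loop, as a sketch: the objective, step size, perturbation size, and iteration count below are illustrative choices, not the course's exact recipe.

```python
import numpy as np

def fd_gradient(f, theta, c):
    """Central-difference gradient estimate: 2 evaluations per coordinate."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = 1.0
        g[i] = (f(theta + c * e) - f(theta - c * e)) / (2 * c)
    return g

def fd_descent(f, theta0, a=0.1, c=1e-4, iters=200):
    """Gradient descent where each step costs 2 * dim(theta) evaluations of f."""
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(iters):
        theta -= a * fd_gradient(f, theta, c)
    return theta

# Minimize f(x) = (x0 - 1)^2 + (x1 + 2)^2; the minimizer is (1, -2).
opt = fd_descent(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, [0.0, 0.0])
```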
6
Q

What are the hyperparameters we set when optimizing with FD?

A

The step size a (or step-size sequence a_k), the perturbation size c (or sequence c_k), the number of iterations (or another stopping criterion), and the starting point theta_0.
7
Q

What is gradient descent?

A

An iterative minimization method that repeatedly steps in the direction of steepest descent: theta_{k+1} = theta_k - a_k * grad f(theta_k), where a_k is the step size.
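The update rule, sketched with a known gradient (the objective, step size, and iteration count are illustrative):

```python
def gradient_descent(grad, theta, a=0.1, iters=100):
    # theta_{k+1} = theta_k - a * grad(theta_k)
    for _ in range(iters):
        theta = theta - a * grad(theta)
    return theta

# f(x) = x^2 has gradient 2x and minimizer 0; each step contracts x by (1 - 2a).
x_star = gradient_descent(lambda x: 2 * x, theta=5.0)
```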
8
Q

What is the SPSA algorithm?

A

A stochastic gradient-free method that perturbs all coordinates simultaneously with a random vector Delta (e.g. independent +-1 entries) and estimates every gradient component from just two function evaluations: g_i = (f(theta + c_k Delta) - f(theta - c_k Delta)) / (2 c_k Delta_i), followed by the update theta_{k+1} = theta_k - a_k g. The cost per iteration is independent of the dimension.
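A sketch of the SPSA loop with Rademacher (+-1) perturbations. The gain-sequence exponents 0.602 and 0.101 are Spall's commonly cited practical recommendations; the specific constants, objective, and iteration count here are illustrative assumptions, not necessarily the course's choices:

```python
import numpy as np

def spsa(f, theta0, a=0.1, c=0.1, A=10, alpha=0.602, gamma=0.101,
         iters=2000, seed=0):
    """SPSA: two evaluations of f per iteration, regardless of dimension."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(iters):
        a_k = a / (k + 1 + A) ** alpha              # decaying step size
        c_k = c / (k + 1) ** gamma                  # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.size)  # Rademacher vector
        # All components of the gradient estimate share the same two evaluations.
        ghat = (f(theta + c_k * delta) - f(theta - c_k * delta)) / (2 * c_k * delta)
        theta -= a_k * ghat
    return theta

# Minimize f(x) = sum((x - 1)^2) in 5 dimensions; the minimizer is all-ones.
opt = spsa(lambda x: np.sum((x - 1.0) ** 2), np.zeros(5))
```

Dividing elementwise by `delta` is valid because its entries are +-1, whose reciprocals are themselves; this is also why distributions with mass near zero are ruled out.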
9
Q

What are conditions for the SPSA algorithm?

A

The gain sequences must satisfy a_k, c_k > 0 with a_k -> 0 and c_k -> 0, sum a_k = infinity, and sum (a_k / c_k)^2 < infinity. The perturbation components Delta_{k,i} must be independent, zero-mean, symmetrically distributed, and have bounded inverse moments E[|Delta_{k,i}|^{-1}] (which rules out Gaussian and uniform perturbations; +-1 Bernoulli works).
10
Q

When does SPSA have to be used with caution?

A

Among other cases, when theta is constrained: the perturbed points theta +- c_k Delta may fall outside the feasible region. The convergence theory also assumes f is sufficiently smooth near theta, so SPSA should be applied carefully to non-smooth objectives.
11
Q

What does SPSA stand for?

A

Simultaneous Perturbation Stochastic Approximation

12
Q

What does GSFA stand for?

A

Gaussian Smoothed Functional Approximation

13
Q

What is the GSFA estimation?

A

The gradient of the Gaussian-smoothed objective f_sigma(theta) = E[f(theta + sigma*eps)], with eps ~ N(0, I), via the identity grad f_sigma(theta) = E[f(theta + sigma*eps) * eps] / sigma, estimated by Monte Carlo: average f(theta + sigma*eps_j) * eps_j / sigma over samples eps_j.
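A sketch of the standard Gaussian-smoothing gradient estimator; the exact form used in the course may differ. Subtracting f(theta) as a baseline is a common variance-reduction trick that adds no bias, since E[eps] = 0:

```python
import numpy as np

def gsfa_gradient(f, theta, sigma=0.1, n=100_000, seed=0):
    """Monte Carlo estimate of grad f_sigma(theta), where
    f_sigma(theta) = E[f(theta + sigma * eps)], eps ~ N(0, I).
    Uses grad f_sigma(theta) = E[f(theta + sigma*eps) * eps] / sigma,
    with f(theta) subtracted as a zero-bias baseline (control variate)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta, dtype=float)
    eps = rng.standard_normal((n, theta.size))
    vals = np.array([f(theta + sigma * e) for e in eps]) - f(theta)
    return (vals[:, None] * eps).mean(axis=0) / sigma

# Gaussian smoothing of a quadratic only shifts it by a constant (sigma^2 * dim),
# so the smoothed gradient still equals the true gradient 2*theta.
g = gsfa_gradient(lambda x: x @ x, np.array([1.0, -2.0]))
```

Unlike FD and SPSA, this estimator targets the gradient of the smoothed objective rather than of f itself, which is what makes it usable even when f is non-smooth.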