Week 2 Flashcards
1
Q
What is a generic optimization algorithm (or problem)?
A
Minimize a cost function J(θ) over θ in a feasible set Θ. A generic optimization algorithm iteratively updates a candidate θ_k using (possibly noisy) evaluations of J, and possibly of its gradient, until a stopping criterion is met.
2
Q
What is the Finite Differences (FD) method?
A
We have the direction found using two-sided differences, one coordinate at a time:
ĝ_i(θ) = [J(θ + c·e_i) − J(θ − c·e_i)] / (2c),
where e_i is the i-th unit vector and c > 0 is the perturbation size. Estimating the full gradient costs 2d evaluations of J in dimension d.
3
Q
Why is FD biased?
A
A Taylor expansion gives E[ĝ_i(θ)] = ∂J/∂θ_i + O(c²): the third- and higher-order terms of J do not cancel in the two-sided difference, so for any finite c the estimator carries a bias of order c². The bias vanishes only as c → 0.
4
Q
How to compute the variance of the FD-estimator? What do we want the value of c to be?
A
With noisy measurements y = J(θ) + ε, Var(ε) = σ², each coordinate satisfies Var(ĝ_i) ≈ 2σ² / (2c)² = σ² / (2c²), since two independent noise terms are scaled by 1/(2c). A small c reduces the O(c²) bias but inflates the variance, so we want c to balance the two, e.g. a slowly decaying sequence c_k.
5
Q
How to implement FD?
A
At each iteration k: pick c_k; for each coordinate i, evaluate J(θ_k + c_k·e_i) and J(θ_k − c_k·e_i) to form ĝ(θ_k); update θ_{k+1} = θ_k − a_k ĝ(θ_k) with step size a_k; stop after a fixed evaluation budget or when the update is small.
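The recipe above can be sketched in Python. This is a minimal illustration, not a reference implementation; the function names and the quadratic test objective are my own.

```python
import numpy as np

def fd_gradient(J, theta, c=1e-4):
    """Two-sided finite-difference gradient estimate of J at theta.

    Uses 2*d evaluations of J, perturbing one coordinate at a time.
    """
    theta = np.asarray(theta, dtype=float)
    d = theta.size
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = 1.0
        # central difference along coordinate i
        g[i] = (J(theta + c * e) - J(theta - c * e)) / (2.0 * c)
    return g

# Sanity check on a quadratic: the exact gradient of ||theta||^2 is 2*theta.
J = lambda th: float(np.sum(th ** 2))
g = fd_gradient(J, [1.0, -2.0])  # ≈ [2, -4]
```

For a noiseless quadratic the central difference is exact up to floating-point error, which makes it a convenient test case.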
6
Q
What are the hyperparameters we set when optimizing with FD?
A
The step-size (gain) sequence a_k, the perturbation-size sequence c_k (or the constants defining their decay), the iteration/evaluation budget, and the initial point θ_0.
7
Q
What is gradient descent?
A
The iterative update θ_{k+1} = θ_k − a_k ∇J(θ_k) (or − a_k ĝ(θ_k) when only a gradient estimate is available), which moves θ in the direction of steepest descent; with suitable step sizes a_k it converges to a local minimum.
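The update rule can be sketched as follows; the quadratic example and names are illustrative only.

```python
import numpy as np

def gradient_descent(grad, theta0, a=0.1, n_iters=100):
    """Plain gradient descent: theta_{k+1} = theta_k - a * grad(theta_k)."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iters):
        theta = theta - a * grad(theta)
    return theta

# Minimize J(theta) = ||theta||^2, whose gradient is 2*theta;
# the minimizer is the origin.
theta_star = gradient_descent(lambda th: 2 * th, [3.0, -1.5], a=0.1, n_iters=200)
```

A fixed step size a suffices here; in the stochastic setting a decaying sequence a_k is used instead.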
8
Q
What is the SPSA algorithm?
A
Instead of perturbing one coordinate at a time, perturb all coordinates simultaneously with a random vector Δ_k (typically i.i.d. Rademacher ±1 entries) and estimate every component from the same two measurements:
ĝ_i(θ_k) = [J(θ_k + c_k Δ_k) − J(θ_k − c_k Δ_k)] / (2 c_k Δ_{k,i}),
then update θ_{k+1} = θ_k − a_k ĝ(θ_k). Only two evaluations of J per iteration, independent of the dimension d.
9
Q
What are conditions for the SPSA algorithm?
A
Gain sequences: a_k, c_k > 0 with a_k → 0, c_k → 0, Σ a_k = ∞, and Σ (a_k / c_k)² < ∞. Perturbations: the Δ_{k,i} are i.i.d., zero-mean, symmetric, bounded, and have bounded inverse moments E[|1/Δ_{k,i}|] < ∞ (Rademacher ±1 works; Gaussian or uniform does not). J must be sufficiently smooth near the optimum.
10
Q
When does SPSA have to be used with caution?
A
When J is non-smooth or the measurement noise is large; when the perturbation distribution violates the inverse-moment condition (e.g., Gaussian or uniform Δ); near constraint boundaries, since θ_k ± c_k Δ_k may leave the feasible set; and when the gain constants are poorly tuned, since SPSA can then diverge or stall.
11
Q
What does SPSA stand for?
A
Simultaneous Perturbation Stochastic Approximation
12
Q
What does GSFA stand for?
A
Gaussian Smoothed Functional Approximation
13
Q
What is the GSFA estimation?
A
Smooth J with a Gaussian: J_σ(θ) = E_{u~N(0,I)}[J(θ + σu)]. Its gradient has the closed form
∇J_σ(θ) = (1/σ) E_{u~N(0,I)}[J(θ + σu) u],
which can be estimated by Monte Carlo; a variance-reduced version uses [J(θ + σu) − J(θ)] / σ · u as the per-sample term.
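The Monte Carlo estimator above can be sketched as follows; the sample count, σ, and the quadratic test objective are illustrative assumptions. It uses the variance-reduced form with J(θ) as a baseline.

```python
import numpy as np

def gsfa_gradient(J, theta, sigma=0.1, n_samples=2000, seed=0):
    """Monte Carlo estimate of the Gaussian-smoothed gradient:
    grad J_sigma(theta) ~ mean over u ~ N(0, I) of
    (J(theta + sigma * u) - J(theta)) / sigma * u.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta, dtype=float)
    g = np.zeros_like(theta)
    J0 = J(theta)  # baseline evaluation, reused for every sample
    for _ in range(n_samples):
        u = rng.standard_normal(theta.size)
        g += (J(theta + sigma * u) - J0) / sigma * u
    return g / n_samples

# For J(theta) = ||theta||^2 the smoothed gradient equals the true
# gradient 2*theta, so the estimate should be close to it.
g = gsfa_gradient(lambda th: float(np.sum(th ** 2)), [1.0, -2.0])
```

Unlike FD and SPSA, each sample needs only one new evaluation of J (plus the shared baseline J(θ)), at the cost of Monte Carlo variance that shrinks like 1/√n.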