Essentials Flashcards

1
Q

What is the formula for the entropy of a discrete probability distribution?

A
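
One standard form, writing p(i) for the probability of outcome i:

H(p) = -\sum_i p(i) \log p(i)
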
2
Q

What is the formula for the KL-divergence between two discrete probability distributions?

A
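
One standard form, for discrete distributions p and q defined over the same outcomes:

D_{KL}(p \| q) = \sum_i p(i) \log \frac{p(i)}{q(i)}
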
3
Q

What is the formula for the entropy of a continuous probability distribution?

A
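
One standard form (the differential entropy), writing p(x) for the density:

H(p) = -\int p(x) \log p(x) \, dx
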
4
Q

What is the formula for the KL-divergence between two continuous probability distributions?

A
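
One standard form, for densities p(x) and q(x):

D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx
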
5
Q

What is the entropy of a Gaussian Distribution?

A
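
For a univariate Gaussian with variance \sigma^2 (differential entropy, natural log):

H = \frac{1}{2} \log(2\pi e \sigma^2)
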
6
Q

What is the entropy of a d-dimensional Gaussian distribution?

A
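
For a d-dimensional Gaussian with covariance \Sigma:

H = \frac{d}{2} \log(2\pi e) + \frac{1}{2} \log|\Sigma| = \frac{1}{2} \log\left((2\pi e)^d |\Sigma|\right)
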
7
Q

What is the KL-divergence between two d-dimensional multivariate Gaussian Distributions?

A
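
One standard form, for N(\mu_1, \Sigma_1) and N(\mu_2, \Sigma_2) in d dimensions:

D_{KL}(N_1 \| N_2) = \frac{1}{2}\left[\mathrm{tr}(\Sigma_2^{-1}\Sigma_1) + (\mu_2 - \mu_1)^\top \Sigma_2^{-1} (\mu_2 - \mu_1) - d + \log\frac{|\Sigma_2|}{|\Sigma_1|}\right]
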
8
Q

What is the Wasserstein distance between two multivariate Gaussian distributions?

A
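
The squared 2-Wasserstein distance between N(\mu_1, \Sigma_1) and N(\mu_2, \Sigma_2):

W_2^2 = \|\mu_1 - \mu_2\|_2^2 + \mathrm{tr}\left(\Sigma_1 + \Sigma_2 - 2\left(\Sigma_2^{1/2} \Sigma_1 \Sigma_2^{1/2}\right)^{1/2}\right)
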
9
Q

What is the cross entropy error for a binary classification task?

A
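
One standard form, writing t_n for the target label (0 or 1) and y_n for the predicted probability of the positive class for example n:

E = -\sum_n \left[t_n \log y_n + (1 - t_n)\log(1 - y_n)\right]
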
10
Q

What is the Gaussian Distribution equation?

A
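
With mean \mu and variance \sigma^2:

N(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)
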
11
Q

What is the multivariate Gaussian Distribution equation?

A
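
With mean \mu and covariance \Sigma in d dimensions:

N(x \mid \mu, \Sigma) = \frac{1}{(2\pi)^{d/2}|\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x - \mu)^\top \Sigma^{-1} (x - \mu)\right)
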
12
Q

For softmax, what is Prob(i)?

A
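
Writing z_i for the logit (pre-softmax score) of class i:

\mathrm{Prob}(i) = \frac{e^{z_i}}{\sum_j e^{z_j}}
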
13
Q

For softmax, what is log Prob(i)?

A
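
With the same notation as the previous card:

\log \mathrm{Prob}(i) = z_i - \log \sum_j e^{z_j}
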
14
Q

What is the equation for the gradient using softmax?

A
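
Assuming this refers to the gradient of log Prob(i) with respect to the logits z_j (the natural follow-up to the previous card):

\frac{\partial \log \mathrm{Prob}(i)}{\partial z_j} = \delta_{ij} - \mathrm{Prob}(j)

Equivalently, with a cross-entropy loss and one-hot target t, the gradient of the loss with respect to z_j is \mathrm{Prob}(j) - t_j.
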
15
Q

What is the true value (V*) of the current state?

A
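
One standard form is the Bellman optimality equation, writing P(s' \mid s, a) for the transition probability, R for the reward, and \gamma for the discount factor (conventions for where the reward attaches vary):

V^*(s) = \max_a \sum_{s'} P(s' \mid s, a)\left[R(s, a, s') + \gamma V^*(s')\right]

Equivalently, V^*(s) = \max_a Q^*(s, a).
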
16
Q

What is the formula for Q*(s,a)?

A
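
With the same notation as the previous card:

Q^*(s, a) = \sum_{s'} P(s' \mid s, a)\left[R(s, a, s') + \gamma \max_{a'} Q^*(s', a')\right]
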
17
Q

What is the formula for the number of weights per filter?

A

(filter width) x (filter height) x (input depth) + 1 (for bias)
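
For example, assuming a hypothetical 5 x 5 filter over a 3-channel input: 5 x 5 x 3 + 1 = 76 weights.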

18
Q

Number of neurons in a convolutional layer?

A

(output width) x (output height) x (output depth, i.e. number of filters), where output width = ((input width - filter width) / stride) + 1 (and output height analogously)
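
For example, assuming a hypothetical 32 x 32 x 3 input, 5 x 5 filters, stride 1, and 10 filters: output width = ((32 - 5) / 1) + 1 = 28, so 28 x 28 x 10 = 7,840 neurons.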

19
Q

Number of connections into the neurons in a layer?

A

(num neurons) x (connections per neuron), where connections per neuron = weights per filter minus the bias
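
Continuing the hypothetical example above: 7,840 neurons x 75 connections per neuron (76 weights minus the bias) = 588,000 connections.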

20
Q

Number of independent parameters?

A

(num filters) x (weights per filter)
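
Continuing the hypothetical example above: 10 filters x 76 weights per filter = 760 independent parameters.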