Last-minute Flashcards

1
Q

MLE vs Bayes Inference

A

MLE treats the parameters as fixed but unknown constants and finds the values that maximize the likelihood of the observed data.

Bayesian inference treats the parameters as random variables: it combines a prior distribution over them with the likelihood to obtain a posterior, and estimates come from that posterior.
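A minimal sketch of the contrast, assuming a Gaussian likelihood with known variance and a hypothetical conjugate normal prior on the mean (all values made up for illustration):

```python
import numpy as np

# Toy data from a Gaussian with unknown mean (true mean 2.0, variance known).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=20)
sigma2 = 1.0  # known observation variance (an assumption for this sketch)

# MLE: the mean is a fixed unknown constant; the sample mean maximizes the likelihood.
mu_mle = x.mean()

# Bayes: put a N(mu0, tau2) prior on the mean and combine it with the likelihood.
mu0, tau2 = 0.0, 4.0  # prior mean and variance (assumed for illustration)
post_var = 1.0 / (1.0 / tau2 + len(x) / sigma2)
post_mean = post_var * (mu0 / tau2 + x.sum() / sigma2)

print(f"MLE estimate:         {mu_mle:.3f}")
print(f"Posterior mean / var: {post_mean:.3f} / {post_var:.3f}")
```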

2
Q

Cross validation

A

The validation set is used to estimate how well the model generalizes (e.g. to choose hyperparameters or compare models) before the final, one-time evaluation on the test set.
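A rough sketch of k-fold cross-validation with scikit-learn (the dataset and model here are placeholders, not from the course):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Keep the test set completely untouched; cross-validate on the rest.
X_trainval, y_trainval, X_test, y_test = X[:150], y[:150], X[150:], y[150:]

scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X_trainval):
    model = LogisticRegression().fit(X_trainval[train_idx], y_trainval[train_idx])
    scores.append(model.score(X_trainval[val_idx], y_trainval[val_idx]))

print("validation accuracy per fold:", np.round(scores, 3))
# Only once model choices are fixed do we evaluate on the test set, a single time.
final = LogisticRegression().fit(X_trainval, y_trainval)
print("test accuracy:", final.score(X_test, y_test))
```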

3
Q

E step of EM algorithm

A

Compute the expected values of the hidden variables, given the observed data and the current parameter values.
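A minimal E-step sketch for a 1-D Gaussian mixture, assuming the current means, standard deviations, and weights are given (toy values):

```python
import numpy as np
from scipy.stats import norm

def e_step(x, means, stds, weights):
    """E step: expected values of the hidden component indicators (responsibilities),
    given the data and the current parameter values."""
    joint = weights * norm.pdf(x[:, None], loc=means, scale=stds)  # p(x_n | G_i) P(G_i)
    return joint / joint.sum(axis=1, keepdims=True)                # normalize over components

x = np.array([0.1, 0.2, 3.9, 4.1])
resp = e_step(x, means=np.array([0.0, 4.0]), stds=np.array([1.0, 1.0]),
              weights=np.array([0.5, 0.5]))
print(resp.round(3))  # each row sums to 1
```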

4
Q

M step of EM algorithm

A

Re-estimate the maximum-likelihood values of the parameters, using the expected hidden-variable values from the E step together with the observed data.
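The matching M-step sketch, re-estimating the ML parameters from the responsibilities and the observed data (same toy 1-D setup as the E-step sketch above):

```python
import numpy as np

def m_step(x, resp):
    """M step: maximum-likelihood re-estimates of the parameters,
    given the responsibilities from the E step and the observed data."""
    Nk = resp.sum(axis=0)                                      # effective count per component
    weights = Nk / len(x)                                      # new mixture weights
    means = (resp * x[:, None]).sum(axis=0) / Nk               # responsibility-weighted means
    var = (resp * (x[:, None] - means) ** 2).sum(axis=0) / Nk  # weighted variances
    return means, np.sqrt(var), weights

x = np.array([0.1, 0.2, 3.9, 4.1])
resp = np.array([[0.99, 0.01], [0.98, 0.02], [0.02, 0.98], [0.01, 0.99]])
print(m_step(x, resp))
```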

5
Q

Gaussian parameters

A

k (the number of components), the component means, the covariances, and the mixture weights.
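A quick sketch of where these parameters live after fitting a mixture with scikit-learn's GaussianMixture (the toy data is made up):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])

k = 2  # number of components, chosen by us
gmm = GaussianMixture(n_components=k, random_state=0).fit(X)

print(gmm.means_)        # component means
print(gmm.covariances_)  # component covariances
print(gmm.weights_)      # mixture weights P(G_i), sum to 1
```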

6
Q

p(x) = Σ_{i=1}^{k} p(x | G_i) P(G_i)

A

G_i - the components
P(G_i) - the priors (mixture weights)
p(x | G_i) - the component densities
p(x) - the mixture model (overall density)
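A tiny numerical check of the formula, with made-up priors and 1-D Gaussian components:

```python
import numpy as np
from scipy.stats import norm

priors = np.array([0.3, 0.7])                             # P(G_i), sum to 1
means, stds = np.array([0.0, 4.0]), np.array([1.0, 1.0])  # component parameters

x = 1.5
component_densities = norm.pdf(x, loc=means, scale=stds)  # p(x | G_i)
p_x = np.sum(component_densities * priors)                # p(x), the mixture density
print(p_x)
```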

7
Q

Which lines do you remove from AdaBoost?

A

Remove lines 7-12 from the AdaBoost pseudocode.

8
Q

What is p in the AdaBoost question?

A

p is the probability of sampling each data point

9
Q

What is β in the AdaBoost question?

A

β controls how much the sampling probabilities are adjusted after each boosting round.
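A hedged sketch of one boosting round in the Alpaydin-style AdaBoost these cards appear to reference, where β = error / (1 - error) and correctly classified points have their sampling probability p multiplied by β; the exact exam pseudocode isn't reproduced here:

```python
import numpy as np

def adaboost_round(p, y_true, y_pred):
    """One boosting round: update sampling probabilities p using
    beta = error / (1 - error), assuming the Alpaydin-style AdaBoost."""
    wrong = y_pred != y_true
    error = p[wrong].sum()         # weighted training error of this base learner
    beta = error / (1.0 - error)   # beta < 1 whenever error < 1/2
    p = p.copy()
    p[~wrong] *= beta              # shrink probability of correctly classified points
    return p / p.sum(), beta       # renormalize so p stays a distribution

y_true = np.array([1, 1, 0, 0, 1])
y_pred = np.array([1, 0, 0, 0, 1])  # one mistake
p = np.full(5, 0.2)                 # start uniform: p_i = 1/N
p, beta = adaboost_round(p, y_true, y_pred)
print(beta, p)                      # the misclassified point keeps the most probability
```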

10
Q

What effect does the number of bags have in ensemble learning?

A

More bags: higher bias, lower variance.

Fewer bags: lower bias, higher variance.
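A small sketch (made-up data, scikit-learn's BaggingRegressor) showing where the number of bags enters when bagging decision trees; it only illustrates the knob, not the bias/variance claim itself:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 300)
X_train, y_train, X_test, y_test = X[:200], y[:200], X[200:], y[200:]

# The only thing varied here is the number of bootstrap samples (bags).
for n_bags in (1, 10, 100):
    model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=n_bags,
                             random_state=0).fit(X_train, y_train)
    print(n_bags, "bags -> test R^2:", round(model.score(X_test, y_test), 3))
```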

11
Q

In the 2018 pseudocode what is N?

A

Number of data points

12
Q

In the 2018 pseudocode what is d?

A

Number of features

13
Q

What’s a benefit of online (stochastic) gradient descent?

A

It can converge faster: the weights are updated one example at a time as inputs are seen, instead of waiting for a full pass over the data.
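A minimal online gradient-descent sketch for linear regression (toy data), updating the weights after every single example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, 100)

w = np.zeros(3)
lr = 0.05
for epoch in range(5):
    for x_i, y_i in zip(X, y):        # update after every single example
        grad = (x_i @ w - y_i) * x_i  # gradient of the squared error on one point
        w -= lr * grad

print(w.round(2))                     # close to true_w after a few passes
```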

14
Q

11.1 What is the weight?

A

wᵀH

15
Q

What do you need to replace in 11.1?

A

Replace lines 6-8 with: y_i ← sigmoid(v_i^T z)
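The exact 11.1 pseudocode isn't reproduced here; this is only a sketch of the replacement line itself, computing each output as y_i = sigmoid(v_i^T z) from assumed hidden-layer values z and output weights v_i:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

z = np.array([1.0, 0.3, -0.7])    # hidden-layer outputs (made-up values)
V = np.array([[0.5, -0.2, 0.1],   # one row v_i of output weights per output unit
              [0.0,  0.8, -0.4]])

y = sigmoid(V @ z)                # y_i = sigmoid(v_i^T z), the replacement line
print(y.round(3))
```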

16
Q

What are bias and variance?

A

Bias is the difference between the model's average prediction and the actual value: how far off the predictions are on average.

Variance is how much the predictions scatter around their own average when the model is trained on different samples of the data.
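A small simulation sketch of the distinction (the linear-fit setup is made up): bias compares the average prediction to the truth, variance measures the spread of predictions across retrainings.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = np.sin(1.0)                  # the quantity we try to predict at x = 1.0

preds = []
for _ in range(200):                      # retrain on many resampled datasets
    X = rng.uniform(0, 2, 30)
    y = np.sin(X) + rng.normal(0, 0.3, 30)
    coeffs = np.polyfit(X, y, deg=1)      # a simple (high-bias) linear fit
    preds.append(np.polyval(coeffs, 1.0))

preds = np.array(preds)
print("bias^2  :", (preds.mean() - true_value) ** 2)  # average prediction vs truth
print("variance:", preds.var())                       # spread around the average prediction
```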