Metrics Flashcards

1
Q

What is a type 1 error?

A

a false positive; the type 1 error rate is the false positive rate (α)

2
Q

What is a type 2 error?

A

a false negative; the type 2 error rate is the false negative rate (β)

3
Q

What is precision?

A

TP / (TP + FP)

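A minimal sketch of the formula in code; the TP/FP counts are made-up values for illustration, not from any dataset.

```python
# Precision from raw confusion-matrix counts (hypothetical numbers).
tp, fp = 40, 10                 # true positives, false positives

precision = tp / (tp + fp)
print(precision)                # 0.8 -> of all positive predictions, 80% were correct
```
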
4
Q

What is recall?

A

TP / (TP + FN)

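The matching sketch for recall, again with invented counts.

```python
# Recall from raw confusion-matrix counts (hypothetical numbers).
tp, fn = 40, 20                 # true positives, false negatives

recall = tp / (tp + fn)
print(recall)                   # ~0.667 -> of all actual positives, two thirds were found
```
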
5
Q

What is accuracy?

A

(TP + TN) / all = (TP + TN) / (TP + TN + FP + FN)

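A small sketch that counts matches between labels and predictions; both lists are invented for illustration.

```python
# Accuracy = fraction of predictions that agree with the true labels.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]    # hypothetical ground-truth labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]    # hypothetical model predictions

correct = sum(t == p for t, p in zip(y_true, y_pred))
accuracy = correct / len(y_true)
print(accuracy)                       # 0.75 -> (TP + TN) / all
```
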
6
Q

What is the ROC curve?

A

receiver operating characteristic curve

x-axis is FPR = 1 - specificity
y-axis is TPR = sensitivity = recall

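One way to see where the curve comes from is to sweep a decision threshold over predicted scores and record (FPR, TPR) at each step; the labels and scores below are invented, and scikit-learn's roc_curve does the same bookkeeping if it is available.

```python
# Sweep thresholds over predicted scores and collect (FPR, TPR) points.
y_true  = [0, 0, 1, 1, 0, 1]                 # hypothetical ground-truth labels
y_score = [0.1, 0.4, 0.35, 0.8, 0.6, 0.7]    # hypothetical model scores

points = []
for thr in sorted(set(y_score), reverse=True):
    y_pred = [1 if s >= thr else 0 for s in y_score]
    tp = sum(p == 1 and t == 1 for p, t in zip(y_pred, y_true))
    fp = sum(p == 1 and t == 0 for p, t in zip(y_pred, y_true))
    fn = sum(p == 0 and t == 1 for p, t in zip(y_pred, y_true))
    tn = sum(p == 0 and t == 0 for p, t in zip(y_pred, y_true))
    points.append((fp / (fp + tn), tp / (tp + fn)))   # (FPR, TPR)

print(points)   # plotting these points traces the ROC curve
```
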
7
Q

What is the F-score?

A

2 × (precision × recall) / (precision + recall)

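The F-score is the harmonic mean of precision and recall; the values plugged in below are just the ones from the earlier sketches.

```python
# F1 = harmonic mean of precision and recall (example values from above).
precision, recall = 0.8, 0.667

f1 = 2 * (precision * recall) / (precision + recall)
print(f1)       # ~0.727 -> stays low unless both precision and recall are high
```
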
8
Q

What is bias?

A

high model bias = underfitting

the model is not as complex as the data

9
Q

What is variance?

A

high model variance = overfitting

10
Q

What is overfitting?

A

the model has high variance

the model is more complex than the data

11
Q

What is underfitting?

A

the model has high bias

the model is less complex than the data

12
Q

An increase in overfitting is an increase in?

A

variance

13
Q

An increase in underfitting is an increase in?

A

bias

14
Q

What is regularization for?

A
  • controls model complexity
  • helps generalization
  • adds a penalty term to the cost function (see the sketch below)
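
A minimal sketch of "add a penalty to the cost function" for a linear model; the data, weights, and lambda are all made-up illustrative values.

```python
import numpy as np

# Regularized cost = data-fit error + lambda * penalty on the weights.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])   # hypothetical features
y = np.array([1.0, 2.0, 3.0])                         # hypothetical targets
w = np.array([0.5, -0.25])                            # hypothetical weights
lam = 0.1                                             # regularization strength

mse = np.mean((X @ w - y) ** 2)      # unregularized error term
penalty = np.sum(w ** 2)             # here an L2 penalty; L1 would use np.sum(np.abs(w))
cost = mse + lam * penalty           # larger lam -> simpler (smaller-weight) model
print(cost)
```
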
15
Q

L1 regularization is

A
  • L1 norm = sum of the absolute values of the beta weights
  • added as a penalty to the minimum-error cost function
  • drives some weights to exactly zero, so it performs feature selection (see the sketch below)
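
A sketch of the feature-selection effect, assuming scikit-learn is available; the synthetic data makes only the first feature informative, so L1 should zero out most of the others.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # 5 features, only the first matters
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=200)

model = Lasso(alpha=0.1).fit(X, y)                  # alpha scales the L1 penalty
print(model.coef_)                                  # uninformative features get weights near 0
```
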
16
Q

L2 regularization is

A
  • L2 norm = sum of the squared beta weights
  • added as a penalty to the minimum-error cost function
  • penalizes large weights (shrinks them toward zero, but rarely to exactly zero; see the sketch below)
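
The L2 counterpart, again assuming scikit-learn; compared with the L1 sketch above, the weights are shrunk but generally stay nonzero.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # same synthetic setup as the L1 sketch
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=200)

model = Ridge(alpha=1.0).fit(X, y)                  # alpha scales the L2 penalty
print(model.coef_)                                  # large weights are penalized, none forced to 0
```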