Performance Metrics Flashcards

1
Q
  • How is misfit different for classification and regression?
A
  • Classification – whether you are wrong (discrete)
  • Regression – by how much you are wrong (continuous); the sketch below contrasts the two
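A minimal sketch of the contrast, with made-up values and with 0/1 loss and squared error standing in for the two kinds of misfit (the card names neither; these are illustrative choices):

```python
# Hypothetical values; 0/1 loss and squared error chosen for illustration.
y_true_cls, y_pred_cls = 1, 0
zero_one_misfit = int(y_true_cls != y_pred_cls)   # discrete: wrong or not (here 1)

y_true_reg, y_pred_reg = 3.0, 2.2
squared_misfit = (y_true_reg - y_pred_reg) ** 2   # continuous: how much wrong (~0.64)
```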
2
Q
  • Type I vs Type II errors
A
  • Type I – mistaken significance (false positive); Type II – missed significance (false negative)
  • Which error is worse is case dependent; both are counted in the sketch below
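A minimal counting sketch, assuming made-up labels with 1 as the positive class:

```python
# Hypothetical labels, positive class = 1.
y_true = [1, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1]

type_1 = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
type_2 = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives
print(type_1, type_2)  # 1 1
```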
3
Q
  • What do the rows of the confusion matrix sum up to?
A
  • Rows sum to the actual (true) numbers of +ve and -ve cases in the data, assuming rows represent the actual classes (see the sketch below)
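A minimal check assuming scikit-learn's convention (rows = actual classes, columns = predicted classes):

```python
from sklearn.metrics import confusion_matrix

# Hypothetical labels: three actual negatives (0) and two actual positives (1).
y_true = [0, 0, 0, 1, 1]
y_pred = [0, 1, 0, 1, 0]

cm = confusion_matrix(y_true, y_pred)  # rows = actual classes, cols = predicted
print(cm)              # [[2 1]
                       #  [1 1]]
print(cm.sum(axis=1))  # [3 2] -- row sums match the actual class counts
```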
4
Q
  • How could class imbalance cause problems for some accuracy measures?
A
  • The measure favours the majority class, producing misleadingly high accuracy results (demonstrated in the sketch below)
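A minimal sketch with made-up imbalanced data and a predictor that only ever outputs the majority class:

```python
# Hypothetical imbalanced data: 95 negatives, 5 positives.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100  # a useless "model" that always predicts the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(accuracy)  # 0.95 -- looks strong, yet every positive case is missed
```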
5
Q
  • What is the role of an accuracy measure during training of a model?
A
  • Assess how well the model is learning patterns in the data
6
Q
  • Which accuracy measure penalises larger misfits during model training?
A
  • Lp norms
  • They penalise misfits according to how wrong they are; higher p penalises large misfits more heavily (see the sketch below)
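A minimal sketch with a made-up misfit vector, comparing the L1 and L2 norms:

```python
# Hypothetical misfit vector: two small errors and one large one.
errors = [0.5, 0.5, 3.0]

def lp_norm(e, p):
    return sum(abs(x) ** p for x in e) ** (1 / p)

print(lp_norm(errors, 1))  # 4.0   -- every misfit weighted equally
print(lp_norm(errors, 2))  # ~3.08 -- the large misfit dominates the total
```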

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
7
Q
  • What should be the AUC value for a model that makes perfect predictions?
A
  • 1.0 (see the sketch below)
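A minimal check with made-up, perfectly separated scores, using scikit-learn's roc_auc_score:

```python
from sklearn.metrics import roc_auc_score

# Hypothetical scores in which every positive outranks every negative.
y_true   = [0, 0, 1, 1]
y_scores = [0.1, 0.2, 0.8, 0.9]
print(roc_auc_score(y_true, y_scores))  # 1.0
```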
8
Q
  • How does threshold selection in a binary model form a value judgement? Who makes the selection?
A
  • The engineer makes the judgement on where to place the threshold, trading false positives against false negatives (see the sketch below)
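A minimal sketch with made-up predicted probabilities, showing how the chosen threshold changes the split:

```python
# Hypothetical predicted probabilities from a binary model.
probs = [0.30, 0.45, 0.55, 0.90]

for threshold in (0.4, 0.6):
    labels = [int(p >= threshold) for p in probs]
    print(threshold, labels)
# 0.4 [0, 1, 1, 1] -- permissive split: more positives, more Type I risk
# 0.6 [0, 0, 0, 1] -- strict split: fewer positives, more Type II risk
```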
9
Q
  • 3 reasons to measure model accuracy.
A
  • Credibility – stakeholder confidence
  • Improvement – how, where, whether
  • Comparison – assist with model or feature selection