Performance Metrics Flashcards
1
Q
- How does misfit differ between classification and regression?
A
- Classification – whether you are wrong (discrete)
- Regression – how much you are wrong (continuous)
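A minimal sketch of the contrast (the values here are illustrative, not from the cards): classification misfit is a discrete right/wrong indicator, while regression misfit is a continuous residual.

```python
# Classification: misfit is 0/1 – you are simply wrong or not.
y_true_class, y_pred_class = 1, 0
class_misfit = int(y_true_class != y_pred_class)  # 1 ("wrong")

# Regression: misfit measures how much you are wrong.
y_true_reg, y_pred_reg = 3.0, 2.2
reg_misfit = abs(y_true_reg - y_pred_reg)  # ~0.8 ("how much")

print(class_misfit, reg_misfit)
```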
2
Q
- Type 1 vs Type 2 errors?
A
- Type 1: false positive (mistaken detection)
- Type 2: false negative (missed detection)
- Which matters more is case dependent
3
Q
What do the rows of the confusion matrix sum up to?
A
- Rows sum to the actual (true) class counts – the total real positives and negatives
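A small sketch of this, assuming the common convention that rows index the actual class and columns the predicted class (the sample data is made up for illustration):

```python
# Build a 2x2 confusion matrix: matrix[actual][predicted].
actual    = [1, 1, 1, 0, 0, 1, 0, 1]
predicted = [1, 0, 1, 0, 1, 1, 0, 1]

matrix = [[0, 0], [0, 0]]
for a, p in zip(actual, predicted):
    matrix[a][p] += 1

# Row 0 = [TN, FP], row 1 = [FN, TP].
row_sums = [sum(row) for row in matrix]
print(row_sums)  # equals [count of actual negatives, count of actual positives]
```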
4
Q
- How could class imbalance cause problems for some accuracy measures?
A
- A model can favour the majority class and still score highly, giving misleading accuracy results
5
Q
- What is the role of an accuracy measure during training of a model?
A
- Assess how well the model is learning patterns in the data
6
Q
- Which accuracy measure penalises larger misfits more heavily during model training?
A
- Lp norms (p > 1)
- Penalise misfits based on how wrong they are; larger p weights large errors more heavily
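A minimal sketch of an Lp norm over a misfit vector (error values chosen for illustration): compared with L1, the L2 norm lets the single large error dominate.

```python
def lp_norm(errors, p):
    # Lp norm: (sum |e|^p)^(1/p) – larger p emphasises larger errors.
    return sum(abs(e) ** p for e in errors) ** (1 / p)

errors = [0.5, 0.5, 3.0]  # two small misfits, one large misfit

print(lp_norm(errors, 1))  # L1: each error contributes linearly
print(lp_norm(errors, 2))  # L2: the 3.0 error dominates (9.0 of 9.5 squared mass)
```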
7
Q
- What should be the AUC value for a model that makes perfect predictions?
A
1.0
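One way to see why a perfect model gives AUC = 1.0 (a sketch using the rank interpretation of AUC – the probability that a random positive scores above a random negative; data below is illustrative):

```python
def auc(scores, labels):
    # Rank-based AUC: fraction of (positive, negative) pairs where the
    # positive outscores the negative; ties count half.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    pairs = [(p > n) + 0.5 * (p == n) for p in pos for n in neg]
    return sum(pairs) / len(pairs)

labels = [0, 0, 1, 1]
perfect = [0.1, 0.2, 0.8, 0.9]  # every positive scores above every negative

print(auc(perfect, labels))  # 1.0
```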
8
Q
- How does threshold selection in a binary classification model form a value judgement? Who makes the selection?
A
- The engineer makes the judgement on where to place the threshold, trading off false positives against false negatives
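A sketch of why this is a value judgement (scores and labels below are made up): the same model scores produce different false-positive/false-negative trade-offs depending on the chosen threshold, and no threshold is "correct" in itself.

```python
scores = [0.2, 0.4, 0.6, 0.9]
labels = [0, 1, 0, 1]

trade_offs = {}
for threshold in (0.3, 0.5, 0.7):
    preds = [int(s >= threshold) for s in scores]
    fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
    trade_offs[threshold] = (fp, fn)

# Lower threshold -> fewer misses but more false alarms, and vice versa.
print(trade_offs)
```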
9
Q
- 3 reasons to measure model accuracy.
A
- Credibility – stakeholder confidence
- Improvement – how, where, whether
- Comparison – assist with model or feature selection