ML-05 - Model evaluation Flashcards
ML-05 - Model evaluation
Describe the typical ML workflow. (7)
(See image)
ML-05 - Model evaluation
Describe k-fold cross-validation.
(See image)
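Not in the original card, but as a minimal sketch of k-fold cross-validation with scikit-learn (the dataset and model below are placeholder assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data and model; any estimator/dataset works the same way.
X, y = make_classification(n_samples=500, random_state=0)
model = LogisticRegression(max_iter=1000)

# k = 5: train on 4 folds, validate on the held-out fold, rotate 5 times.
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean(), scores.std())
```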
ML-05 - Model evaluation
Describe the validation workflow.
(See image)
ML-05 - Model evaluation
Describe the test workflow.
(See image)
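The slide images aren't reproduced here, but as a hedged sketch, the validation and test workflows usually amount to a three-way split: compare models on the validation set, then score the final model exactly once on the untouched test set (the 60/20/20 proportions below are an assumption):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)  # placeholder data

# 60% train, 20% validation, 20% test (example proportions).
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Fit on train, compare models/hyperparameters on val,
# then evaluate the single chosen model once on test.
```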
ML-05 - Model evaluation
What is a requirement for using accuracy as a metric?
Your classes have to be balanced; otherwise a trivial majority-class predictor already scores high accuracy.
ML-05 - Model evaluation
What should you use in place of accuracy, if your data is imbalanced?
Precision, recall, F1 or ROC-AUC.
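For illustration (not from the slides), all four metrics via scikit-learn on a tiny made-up imbalanced sample:

```python
from sklearn.metrics import f1_score, precision_score, recall_score, roc_auc_score

y_true  = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]   # made-up labels, 20% positives
y_pred  = [0, 0, 0, 0, 0, 0, 1, 0, 1, 0]   # hard 0/1 predictions
y_score = [0.1, 0.2, 0.1, 0.3, 0.2, 0.1, 0.6, 0.4, 0.9, 0.4]  # predicted probabilities

print(precision_score(y_true, y_pred))   # TP / (TP + FP)
print(recall_score(y_true, y_pred))      # TP / (TP + FN)
print(f1_score(y_true, y_pred))
print(roc_auc_score(y_true, y_score))    # AUC needs scores, not hard labels
```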
ML-05 - Model evaluation
What is the formula for precision?
TP/PP = TP / (TP + FP) = True positives / predicted positives.
ML-05 - Model evaluation
What is the formula for recall?
TP/P = TP / (TP + FN) = True positives / actual positives.
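A quick worked example for both formulas, with made-up counts (TP = 8, FP = 2, FN = 4):

Precision = 8 / (8 + 2) = 0.80
Recall = 8 / (8 + 4) ≈ 0.67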
ML-05 - Model evaluation
Fill in the labels “Predicted” and “Actual”. (See image)
(See image)
ML-05 - Model evaluation
Which one is used to calculate precision? (See image)
2
ML-05 - Model evaluation
Which one is used to calculate recall? (See image)
1
ML-05 - Model evaluation
What’s the formula for F1 score?
F1 = 2 · (precision · recall) / (precision + recall), i.e. the harmonic mean of precision and recall. (See image)
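Continuing the made-up counts from the precision/recall example above (precision 0.80, recall ≈ 0.67):

F1 = 2 · (0.80 · 0.67) / (0.80 + 0.67) ≈ 0.73

Note that F1 lands near the smaller of the two values: the harmonic mean punishes a large gap between precision and recall.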
ML-05 - Model evaluation
What are some benefits of the F1 score? (2)
- One metric instead of two (precision and recall) makes it easier to make decisions.
- Optimizing it picks a decision threshold automatically (use the threshold with the highest F1).
ML-05 - Model evaluation
What is a drawback of the F1 score?
Difficult to interpret.
ML-05 - Model evaluation
How can you visualize precision/recall? (3 mentioned)
- Confusion matrix
- ROC curve (strictly, it plots TPR vs. FPR; recall = TPR, but precision is not shown)
- Precision/recall curve
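Not from the slides, but all three plots are one-liners with scikit-learn's display helpers (the data and model below are placeholders):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (ConfusionMatrixDisplay, PrecisionRecallDisplay,
                             RocCurveDisplay)
from sklearn.model_selection import train_test_split

# Placeholder imbalanced data (80/20) and a simple fitted classifier.
X, y = make_classification(n_samples=500, weights=[0.8], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

ConfusionMatrixDisplay.from_estimator(model, X_test, y_test)   # TP/FP/FN/TN counts
RocCurveDisplay.from_estimator(model, X_test, y_test)          # TPR vs. FPR
PrecisionRecallDisplay.from_estimator(model, X_test, y_test)   # precision vs. recall
plt.show()
```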