Evaluation Flashcards
Notes on Evaluation Strategies that may help with the exam
What three methods are commonly used to measure Generalisation Error?
- Training/Validation/Test split
- Cross-Validation
- Independent Test Set
How does K-fold Cross Validation work? Go through step-by-step.
- Split the training data into K folds
- For each fold, train on all the other folds and make predictions on the held-out test fold
- Repeat so that every fold serves as the test fold exactly once
- Combine the results from all K folds, so that every data point has been used exactly once as a test point
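A minimal sketch of the procedure in Python, assuming scikit-learn is available; the toy dataset and logistic regression model are illustrative placeholders, not prescribed by the notes:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=200, random_state=0)  # toy data for illustration
kf = KFold(n_splits=5, shuffle=True, random_state=0)       # K = 5 folds

all_preds = np.empty_like(y)
for train_idx, test_idx in kf.split(X):
    model = LogisticRegression(max_iter=1000)
    model.fit(X[train_idx], y[train_idx])                  # train on the other K-1 folds
    all_preds[test_idx] = model.predict(X[test_idx])       # predict the held-out fold

# every data point has been used exactly once as a test point
print("Cross-validated accuracy:", accuracy_score(y, all_preds))
```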
What is the difference between Nested K-Fold Cross Validation and K-Fold Cross Validation?
Nested K-Fold adds an inner cross-validation loop that tunes the hyperparameters within each outer fold, whereas traditional K-Fold keeps the hyperparameters fixed throughout.
How does Nested K-Fold Cross Validation work?
- Splits the data into K outer folds.
- For each outer fold, holds it out as the test set and runs an inner cross-validation on the remaining data to select the best hyperparameters.
- Retrains on the whole outer training portion with the selected hyperparameters and evaluates on the held-out outer fold.
- Combines the results of the K outer folds to estimate the generalisation error.
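A minimal nested cross-validation sketch, again assuming scikit-learn; the parameter grid and classifier here are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, KFold

X, y = make_classification(n_samples=200, random_state=0)
outer = KFold(n_splits=5, shuffle=True, random_state=0)  # outer loop: generalisation estimate
param_grid = {"C": [0.01, 0.1, 1.0, 10.0]}               # hyperparameters searched by the inner loop

outer_scores = []
for train_idx, test_idx in outer.split(X):
    # inner loop: cross-validated hyperparameter search on the outer training portion only
    search = GridSearchCV(LogisticRegression(max_iter=1000), param_grid, cv=3)
    search.fit(X[train_idx], y[train_idx])
    best_model = search.best_estimator_                  # refit with the selected hyperparameters
    preds = best_model.predict(X[test_idx])              # evaluate on the untouched outer fold
    outer_scores.append(accuracy_score(y[test_idx], preds))

print("Nested CV accuracy:", np.mean(outer_scores))
```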
What is the equation for Classification Accuracy?
Classification Accuracy = (TP + TN) / (TP + FP + FN + TN)
Where:
TP = True Positive
TN = True Negative
FP = False Positive
FN = False Negative
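A tiny worked example of the formula; the confusion-matrix counts are made up for illustration:

```python
TP, TN, FP, FN = 40, 45, 5, 10              # illustrative counts
accuracy = (TP + TN) / (TP + FP + FN + TN)
print(accuracy)                             # (40 + 45) / 100 = 0.85
```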
What is the equation for Balanced Classification Accuracy?
Balanced Classification Accuracy = (Sensitivity + Specificity) / 2
What is the equation for Precision with regard to Evaluation?
Precision = TP / (TP + FP)
Where:
TP = True Positive
FP = False Positive
What is the equation for Recall with regard to Evaluation?
Recall = TP / (TP + FN)
Where:
TP = True Positive
FN = False Negative
What does Precision mean with regard to Evaluation?
Percentage of correct positive predictions among all positive predictions made, i.e. of the cases predicted positive, how many really are positive
What does Recall mean with regard to Evaluation?
Percentage of real positive cases that are correctly predicted as positive, e.g. important when you don’t want to miss any diseased cases
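The same illustrative counts plugged into the Precision and Recall formulas above:

```python
TP, FP, FN = 40, 5, 10
precision = TP / (TP + FP)   # correct positives among all predicted positives: 40/45 ≈ 0.889
recall = TP / (TP + FN)      # correct positives among all real positives: 40/50 = 0.8
print(precision, recall)
```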
How is Sensitivity calculated?
Sensitivity is the same as Recall: Sensitivity = TP / (TP + FN)
How is Specificity calculated?
Specificity = TN / (TN + FP)
Where:
TN = True Negative
FP = False Positive
What does Specificity mean with regard to Evaluation?
Percentage of correct negative predictions among all real negative cases
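Sensitivity, Specificity, and Balanced Classification Accuracy computed from the same illustrative counts:

```python
TP, TN, FP, FN = 40, 45, 5, 10
sensitivity = TP / (TP + FN)                          # = Recall = 0.8
specificity = TN / (TN + FP)                          # 45 / 50 = 0.9
balanced_accuracy = (sensitivity + specificity) / 2   # 0.85
print(sensitivity, specificity, balanced_accuracy)
```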
What is the F Measure with regard to Evaluation?
The F Measure combines Recall and Precision into a single measure that ranges from 0 to 1, where 1 indicates perfect precision and recall.
What is the equation for F Measure?
F_b = (1 + b^2) * (Precision * Recall) / ((b^2 * Precision) + Recall), where b (beta) controls the relative weight of Recall; b = 1 gives the standard F1 score.
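A small sketch of the F Measure, reusing the illustrative Precision and Recall values from above:

```python
def f_beta(precision, recall, beta=1.0):
    # (1 + b^2) * P * R / (b^2 * P + R); beta = 1 gives the usual F1 score
    return (1 + beta ** 2) * (precision * recall) / ((beta ** 2 * precision) + recall)

precision, recall = 40 / 45, 40 / 50
print(f_beta(precision, recall))           # F1 ≈ 0.842
print(f_beta(precision, recall, beta=2))   # F2 weights Recall more heavily than Precision
```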