Classifier Evaluation Flashcards
Classifier Evaluation
It is usually advised that a classification algorithm (binary or multi-class) is evaluated. This applies to all classification algorithms, whether learning is supervised or unsupervised, and whether they are based on metaheuristics or other approaches.
The purpose of evaluations
is to determine how well the classification algorithm performs on some test data.
Classifier Evaluation using Test Data
The objective is to evaluate how the classifier performs on held-out test data. A common split uses 80% of the data for training and 20% for testing. Some algorithms also monitor a separate held-out set during training to guide learning. The test dataset is input to the classifier, and the predicted classes are tabulated against the actual classes.
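A minimal sketch of the 80/20 split described above (the function name, seed, and data are illustrative assumptions, not part of the cards):

```python
import random

def train_test_split(data, test_fraction=0.2, seed=0):
    """Shuffle the data, then split it into training and test sets."""
    shuffled = data[:]                      # copy so the input list is untouched
    random.Random(seed).shuffle(shuffled)   # seeded shuffle for repeatable splits
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]   # e.g. 80% train, 20% test

train, test = train_test_split(list(range(100)))
```

Shuffling before splitting matters: if the data is ordered by class, a plain head/tail split would leave some classes out of the test set entirely.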
confusion matrix
We use a confusion matrix to summarize the classifier's performance: each cell counts how often data points of a given actual class were predicted as each possible class.
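One way to sketch a confusion matrix is as counts of (actual, predicted) label pairs; the example labels below are made up for illustration:

```python
from collections import Counter

def confusion_matrix(actual, predicted):
    """Count every (actual, predicted) label pair."""
    return Counter(zip(actual, predicted))

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
cm = confusion_matrix(actual, predicted)
# With 1 = positive and 0 = negative:
# cm[(1, 1)] = TP, cm[(0, 0)] = TN, cm[(0, 1)] = FP, cm[(1, 0)] = FN
```

The same Counter works unchanged for multi-class problems, since it keys on arbitrary label pairs rather than assuming two classes.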
Binary Classifier Evaluation
TP, TN, FP, FN
True Positives (TP)
True positives are the cases when the actual class of the data point was true and the predicted was also true.
True Negatives (TN)
True negatives are the cases when the actual class of the data point was false and the predicted was also false.
False Positives (FP)
False positives are the cases when the actual class of the data point was false and the predicted was true.
False Negatives (FN)
False negatives are the cases when the actual class of the data point was true and the predicted was false.
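The four definitions above can be tallied directly from the actual and predicted labels. A minimal sketch, assuming the convention that 1 is the positive class and 0 the negative class:

```python
def binary_counts(actual, predicted):
    """Tally TP, TN, FP, FN for a binary classifier (1 = positive, 0 = negative)."""
    pairs = list(zip(actual, predicted))
    tp = sum(1 for a, p in pairs if a == 1 and p == 1)  # actual true,  predicted true
    tn = sum(1 for a, p in pairs if a == 0 and p == 0)  # actual false, predicted false
    fp = sum(1 for a, p in pairs if a == 0 and p == 1)  # actual false, predicted true
    fn = sum(1 for a, p in pairs if a == 1 and p == 0)  # actual true,  predicted false
    return tp, tn, fp, fn

tp, tn, fp, fn = binary_counts([1, 0, 1, 1, 0, 0, 1, 0],
                               [1, 0, 0, 1, 0, 1, 1, 0])
```

Note that the four counts always sum to the total number of test points, which is a useful sanity check.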
Recall or Sensitivity or True Positive Rate (TPR)
Number of items correctly identified as positive out of total actual positives. TP/(TP+FN).
Specificity or True Negative Rate (TNR)
Number of items correctly identified as negative out of total actual negatives. TN/(TN+FP).
Precision
Number of items correctly identified as positive out of total items identified as positive. TP/(TP+FP).
False Positive Rate or Type I Error
Number of items wrongly identified as positive out of total actual negatives: FP/(FP+TN).
False Negative Rate or Type II Error
Number of items wrongly identified as negative out of total actual positives: FN/(TP+FN).
F1 Score
Harmonic mean of precision and recall given by:
2 × (Precision × Recall) / (Precision + Recall)
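All of the rates on the cards above can be derived from the four confusion-matrix counts. A sketch (function name and return structure are assumptions; the formulas are the ones on the cards):

```python
def evaluation_metrics(tp, tn, fp, fn):
    """Derive the standard rates from the four confusion-matrix counts."""
    recall      = tp / (tp + fn)   # sensitivity / true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    precision   = tp / (tp + fp)
    fpr         = fp / (fp + tn)   # Type I error rate
    fnr         = fn / (tp + fn)   # Type II error rate
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return {"recall": recall, "specificity": specificity,
            "precision": precision, "fpr": fpr, "fnr": fnr, "f1": f1}

m = evaluation_metrics(tp=3, tn=3, fp=1, fn=1)
```

Two identities worth memorizing alongside the cards: FPR = 1 − specificity and FNR = 1 − recall.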