Losses Flashcards
What is the purpose of Mean Squared Error (MSE) in regression?
MSE calculates the average squared difference between predicted and actual values, penalizing larger errors more heavily.
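A minimal plain-Python sketch of MSE (framework implementations such as those in PyTorch or Keras are vectorized, but the arithmetic is the same):

```python
def mse(y_true, y_pred):
    # Average of squared differences; squaring penalizes large errors more.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```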
What does Mean Absolute Error (MAE) measure in regression tasks?
MAE measures the average absolute difference between predicted and actual values, providing a more robust metric against outliers compared to MSE.
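A matching sketch for MAE; note the error enters linearly, so a single outlier shifts the loss far less than it would with MSE:

```python
def mae(y_true, y_pred):
    # Average of absolute differences; linear in the error, so outliers
    # contribute proportionally rather than quadratically.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```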
What is Log-Cosh Loss, and why is it used?
Log-Cosh Loss is the logarithm of the hyperbolic cosine of the prediction error. It behaves like MSE for small errors and like MAE for large errors, and it is differentiable everywhere.
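A direct translation of the definition; for small errors x, log(cosh(x)) is approximately x²/2 (MSE-like), and for large errors it grows approximately as |x| (MAE-like):

```python
import math

def log_cosh(y_true, y_pred):
    # log(cosh(error)), averaged over samples.
    return sum(math.log(math.cosh(p - t)) for t, p in zip(y_true, y_pred)) / len(y_true)
```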
What is Binary Cross-Entropy used for?
Binary Cross-Entropy measures the difference between predicted probabilities and true labels (encoded as 0 and 1) for binary classification tasks.
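A minimal sketch; the `eps` clamp is a common numerical-stability guard against log(0), as used in most library implementations:

```python
import math

def bce(y_true, y_pred, eps=1e-12):
    # -[t*log(p) + (1-t)*log(1-p)], averaged; eps avoids log(0).
    return -sum(t * math.log(max(p, eps)) + (1 - t) * math.log(max(1 - p, eps))
                for t, p in zip(y_true, y_pred)) / len(y_true)
```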
What does Categorical Cross-Entropy measure in classification?
Categorical Cross-Entropy is a loss function to evaluate the performance of a model by comparing predicted probabilities with actual one-hot encoded labels.
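For a one-hot label, only the log-probability assigned to the true class survives the sum, which this sketch makes explicit:

```python
import math

def categorical_ce(y_true_onehot, y_pred_probs, eps=1e-12):
    # Cross-entropy for one sample: -sum over classes of t * log(p).
    # With a one-hot target this reduces to -log(p_true_class).
    return -sum(t * math.log(max(p, eps))
                for t, p in zip(y_true_onehot, y_pred_probs))
```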
What is Focal Loss designed to address in classification?
Focal Loss focuses on hard-to-classify examples by reducing the loss contribution from easy examples, helping with class imbalance.
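A sketch of the binary form: the (1 - pt)^gamma factor shrinks the loss for well-classified examples (pt near 1), so training focuses on the hard ones. The defaults gamma=2 and alpha=0.25 follow common practice:

```python
import math

def focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25, eps=1e-12):
    total = 0.0
    for t, p in zip(y_true, y_pred):
        pt = p if t == 1 else 1 - p          # probability of the true class
        w = alpha if t == 1 else 1 - alpha   # optional class-balance weight
        total += -w * (1 - pt) ** gamma * math.log(max(pt, eps))
    return total / len(y_true)
```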
What is the purpose of Triplet Loss in ranking tasks?
Triplet Loss optimizes the relative distances among anchor, positive, and negative samples to improve ranking models.
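A sketch using squared Euclidean distances on embedding vectors: the loss is zero once the negative is farther from the anchor than the positive by at least the margin:

```python
def triplet_loss(anchor, positive, negative, margin=1.0):
    # Squared Euclidean distances anchor-positive and anchor-negative.
    d_ap = sum((a - p) ** 2 for a, p in zip(anchor, positive))
    d_an = sum((a - n) ** 2 for a, n in zip(anchor, negative))
    # Hinge: push the negative at least `margin` farther than the positive.
    return max(d_ap - d_an + margin, 0.0)
```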
What is IoU Loss and how is it used in object detection?
IoU Loss penalizes the mismatch between predicted and ground-truth bounding boxes, typically as 1 minus their intersection-over-union. It is used in object detection to directly optimize localization accuracy.
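A sketch for axis-aligned boxes in (x1, y1, x2, y2) format, assuming valid boxes with x2 > x1 and y2 > y1:

```python
def iou_loss(box_a, box_b):
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(ix2 - ix1, 0) * max(iy2 - iy1, 0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return 1.0 - inter / union
```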
What is Smooth L1 Loss, and where is it applied?
Smooth L1 Loss behaves quadratically (like L2) for small errors and linearly (like L1) for large errors. It is used in object detection, typically for bounding-box regression, because it is less sensitive to outliers than MSE.
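A sketch with the `beta` threshold that controls where the loss switches from quadratic to linear (beta=1 is the common default):

```python
def smooth_l1(y_true, y_pred, beta=1.0):
    total = 0.0
    for t, p in zip(y_true, y_pred):
        d = abs(t - p)
        # Quadratic inside the beta band, linear outside it.
        total += 0.5 * d * d / beta if d < beta else d - 0.5 * beta
    return total / len(y_true)
```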
What is the role of Focal Loss in object detection?
Focal Loss helps address class imbalance by focusing training on hard-to-detect objects.
What is Dice Loss?
Dice Loss is 1 minus the Dice coefficient: twice the intersection of the prediction and the ground truth, divided by the sum of their sizes. It is widely used in segmentation, where it handles class imbalance better than pixel-wise cross-entropy.
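A sketch over flattened binary masks (or soft probabilities); the small `eps` keeps the ratio defined when both masks are empty:

```python
def dice_loss(y_true, y_pred, eps=1e-7):
    # 1 - 2|A∩B| / (|A| + |B|), with soft intersection t*p.
    inter = sum(t * p for t, p in zip(y_true, y_pred))
    return 1.0 - (2 * inter + eps) / (sum(y_true) + sum(y_pred) + eps)
```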
What does Jaccard Loss measure?
Jaccard Loss, also called IoU loss, is 1 minus the intersection over the union of the predicted and ground-truth sets.
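The same idea as Dice but with the union in the denominator, again sketched over flattened masks:

```python
def jaccard_loss(y_true, y_pred, eps=1e-7):
    # 1 - |A∩B| / |A∪B|, with |A∪B| = |A| + |B| - |A∩B|.
    inter = sum(t * p for t, p in zip(y_true, y_pred))
    union = sum(y_true) + sum(y_pred) - inter
    return 1.0 - (inter + eps) / (union + eps)
```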
What is Tversky Loss, and how does it differ from Dice Loss?
Tversky Loss generalizes Dice Loss by adding an option to give different weights to false positives and false negatives, making it suitable for imbalanced data.
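A sketch in terms of true positives, false positives, and false negatives; with alpha = beta = 0.5 it reduces exactly to Dice Loss, while raising alpha penalizes false negatives more (useful when missing a region is worse than over-segmenting):

```python
def tversky_loss(y_true, y_pred, alpha=0.7, beta=0.3, eps=1e-7):
    tp = sum(t * p for t, p in zip(y_true, y_pred))        # true positives
    fp = sum((1 - t) * p for t, p in zip(y_true, y_pred))  # false positives
    fn = sum(t * (1 - p) for t, p in zip(y_true, y_pred))  # false negatives
    return 1.0 - (tp + eps) / (tp + alpha * fn + beta * fp + eps)
```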
What is Adversarial Loss in generative models?
Adversarial Loss is used in GANs to train the generator and discriminator against each other: the discriminator learns to distinguish real samples from generated ones, while the generator learns to fool the discriminator.
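A sketch of the per-sample losses, assuming `d_real` and `d_fake` are the discriminator's probabilities for a real and a generated sample; the generator term uses the common non-saturating form -log(D(fake)):

```python
import math

def gan_losses(d_real, d_fake, eps=1e-12):
    # Discriminator: classify real as 1, fake as 0.
    d_loss = -(math.log(max(d_real, eps)) + math.log(max(1 - d_fake, eps)))
    # Generator (non-saturating): make the discriminator output 1 on fakes.
    g_loss = -math.log(max(d_fake, eps))
    return d_loss, g_loss
```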
What is Reconstruction Loss in generative models?
Reconstruction Loss measures how well the generated output matches the input, commonly used in Autoencoders.
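One common choice when inputs are scaled to [0, 1] (e.g. image pixels) is a pixel-wise binary cross-entropy, sketched here; plain MSE between input and reconstruction is the other frequent option:

```python
import math

def bce_reconstruction(x, x_hat, eps=1e-12):
    # Pixel-wise binary cross-entropy between input x and reconstruction
    # x_hat, both assumed to lie in [0, 1].
    total = 0.0
    for t, p in zip(x, x_hat):
        p = min(max(p, eps), 1 - eps)  # clamp to avoid log(0)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(x)
```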