Losses Flashcards
What is the purpose of Mean Squared Error (MSE) in regression?
MSE calculates the average squared difference between predicted and actual values, penalizing larger errors more heavily.
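A minimal NumPy sketch of the formula (the function name and sample arrays are illustrative):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of squared residuals; squaring penalizes large errors quadratically.
    return np.mean((y_true - y_pred) ** 2)

print(mse(np.array([1.0, 2.0, 3.0]), np.array([1.5, 2.0, 2.0])))  # ~0.4167
```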
What does Mean Absolute Error (MAE) measure in regression tasks?
MAE measures the average absolute difference between predicted and actual values, making it more robust to outliers than MSE.
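A matching NumPy sketch (names are illustrative), highlighting the linear penalty:

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean of absolute residuals; each error contributes linearly,
    # so a single outlier influences the loss far less than under MSE.
    return np.mean(np.abs(y_true - y_pred))
```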
What is Log-Cosh Loss, and why is it used?
Log-Cosh Loss is the logarithm of the hyperbolic cosine of the prediction error; it behaves like MSE for small errors and like MAE for large errors, and is differentiable everywhere.
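A short NumPy sketch of the definition (illustrative; the naive form can overflow for very large errors, so stable variants exist):

```python
import numpy as np

def log_cosh_loss(y_true, y_pred):
    # log(cosh(e)) ~= e**2 / 2 for small e and ~= |e| - log(2) for large e,
    # giving MSE-like curvature near zero and MAE-like growth in the tails.
    return np.mean(np.log(np.cosh(y_pred - y_true)))
```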
What is Binary Cross-Entropy used for?
Binary Cross-Entropy measures the difference between predicted and true probabilities for binary classification tasks.
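A minimal sketch, assuming predictions are already probabilities in (0, 1) (the epsilon clip is a common stability trick):

```python
import numpy as np

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    # Clip to avoid log(0); loss is -log(probability assigned to the true label).
    p = np.clip(p_pred, eps, 1 - eps)
    return np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))
```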
What does Categorical Cross-Entropy measure in classification?
Categorical Cross-Entropy evaluates multi-class classifiers by comparing predicted class probabilities against one-hot encoded true labels.
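A sketch assuming a (batch, classes) probability matrix and one-hot labels (names are illustrative):

```python
import numpy as np

def categorical_cross_entropy(y_onehot, p_pred, eps=1e-12):
    # The sum over classes picks out -log(predicted probability of the true class).
    p = np.clip(p_pred, eps, 1.0)
    return np.mean(-np.sum(y_onehot * np.log(p), axis=1))
```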
What is Focal Loss designed to address in classification?
Focal Loss focuses on hard-to-classify examples by reducing the loss contribution from easy examples, helping with class imbalance.
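A binary-classification sketch with the commonly cited defaults gamma=2, alpha=0.25 (these values are assumptions, not prescribed here):

```python
import numpy as np

def focal_loss(y_true, p_pred, gamma=2.0, alpha=0.25, eps=1e-12):
    # p_t is the probability assigned to the true class; the factor
    # (1 - p_t)**gamma shrinks the loss of easy, well-classified examples.
    p = np.clip(p_pred, eps, 1 - eps)
    p_t = np.where(y_true == 1, p, 1 - p)
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    return np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t))
```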
What is the purpose of Triplet Loss in ranking tasks?
Triplet Loss optimizes the relative distances among anchor, positive, and negative samples to improve ranking models.
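A Euclidean-distance sketch with an assumed margin of 1.0 (embedding arrays are illustrative):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Push the anchor-positive distance below the anchor-negative
    # distance by at least `margin`; loss is zero once the gap is met.
    d_pos = np.linalg.norm(anchor - positive, axis=-1)
    d_neg = np.linalg.norm(anchor - negative, axis=-1)
    return np.mean(np.maximum(d_pos - d_neg + margin, 0.0))
```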
What is IoU Loss used for in object detection?
IoU Loss measures the intersection-over-union between predicted and ground-truth bounding boxes, focusing on localization accuracy.
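A single-box sketch; the corner format (x1, y1, x2, y2) is an assumption:

```python
def iou_loss(box_a, box_b):
    # Intersection rectangle, clamped to zero when the boxes do not overlap.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return 1.0 - inter / (area_a + area_b - inter)
```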
What is Smooth L1 Loss, and where is it applied?
Smooth L1 Loss is quadratic for small errors and linear for large ones, making it more robust to outliers; it is widely used for bounding-box regression in object detection.
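A sketch with transition point `beta` (beta=1.0 is a common but assumed default):

```python
import numpy as np

def smooth_l1(y_true, y_pred, beta=1.0):
    # Quadratic for |e| < beta (MSE-like near zero), linear beyond
    # (MAE-like), so outlier errors do not dominate the gradient.
    e = np.abs(y_true - y_pred)
    return np.mean(np.where(e < beta, 0.5 * e ** 2 / beta, e - 0.5 * beta))
```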
What is the role of Focal Loss in object detection?
Focal Loss helps address class imbalance by focusing training on hard-to-detect objects.
What is Dice Loss used for in segmentation?
Dice Loss measures the overlap between predicted and ground-truth masks, rewarding predictions with higher overlap.
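A soft-Dice sketch over probability masks (the smoothing epsilon is an assumed convention):

```python
import numpy as np

def dice_loss(mask_true, mask_pred, eps=1e-6):
    # Soft Dice coefficient: 2 * overlap / total mass; loss = 1 - Dice.
    inter = np.sum(mask_true * mask_pred)
    return 1.0 - (2.0 * inter + eps) / (np.sum(mask_true) + np.sum(mask_pred) + eps)
```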
What does Jaccard Loss measure in segmentation tasks?
Jaccard Loss, or IoU loss, evaluates the similarity between predicted and ground-truth masks, favoring higher overlap.
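The mask analogue of the box IoU above, as a soft-Jaccard sketch (illustrative):

```python
import numpy as np

def jaccard_loss(mask_true, mask_pred, eps=1e-6):
    # Soft IoU on masks: intersection over union; loss = 1 - IoU.
    inter = np.sum(mask_true * mask_pred)
    union = np.sum(mask_true) + np.sum(mask_pred) - inter
    return 1.0 - (inter + eps) / (union + eps)
```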
What is Tversky Loss, and how does it differ from Dice Loss?
Tversky Loss generalizes Dice Loss by weighting false positives and false negatives differently, making it suitable for imbalanced data.
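A sketch with assumed weights alpha=0.3, beta=0.7 (penalizing false negatives more); setting both to 0.5 recovers Dice Loss:

```python
import numpy as np

def tversky_loss(mask_true, mask_pred, alpha=0.3, beta=0.7, eps=1e-6):
    # alpha weights false positives, beta weights false negatives.
    tp = np.sum(mask_true * mask_pred)
    fp = np.sum((1 - mask_true) * mask_pred)
    fn = np.sum(mask_true * (1 - mask_pred))
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)
```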
What is Adversarial Loss in generative models?
Adversarial Loss is used in GANs to optimize the generator to produce realistic outputs that can fool the discriminator.
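A sketch of the generator side in its non-saturating form, assuming `d_on_fake` holds the discriminator's real-vs-fake probabilities for generated samples:

```python
import numpy as np

def generator_adversarial_loss(d_on_fake, eps=1e-12):
    # Non-saturating GAN objective: minimize -log D(G(z)), i.e. the
    # generator is rewarded when the discriminator rates fakes as real.
    return np.mean(-np.log(np.clip(d_on_fake, eps, 1.0)))
```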
What is Reconstruction Loss in generative models?
Reconstruction Loss measures how well the generated output matches the input, commonly used in Autoencoders.
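A squared-error sketch (MSE is one common choice; BCE on pixel values is another):

```python
import numpy as np

def reconstruction_loss(x, x_reconstructed):
    # Per-sample squared error between input and decoder output,
    # averaged over the batch.
    return np.mean(np.sum((x - x_reconstructed) ** 2, axis=-1))
```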