Losses Flashcards

1
Q

What is the purpose of Mean Squared Error (MSE) in regression?

A

MSE calculates the average squared difference between predicted and actual values, penalizing larger errors more heavily.
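A minimal NumPy sketch (function name and shapes are illustrative, not from the deck):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of squared residuals; squaring penalizes large errors quadratically.
    return np.mean((np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)) ** 2)
```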

2
Q

What does Mean Absolute Error (MAE) measure in regression tasks?

A

MAE measures the average absolute difference between predicted and actual values, providing a more robust metric against outliers compared to MSE.
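An illustrative NumPy sketch; note that a single large outlier moves MAE linearly, while it moves MSE quadratically:

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean of absolute residuals; each error contributes linearly.
    return np.mean(np.abs(np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)))
```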

3
Q

What is Log-Cosh Loss, and why is it used?

A

Log-Cosh Loss is the logarithm of the hyperbolic cosine of the prediction error. It behaves like MSE for small errors and like MAE for large errors, and is differentiable everywhere.
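A small illustrative sketch: log(cosh(e)) is approximately e²/2 for small e and |e| − log 2 for large |e|:

```python
import numpy as np

def log_cosh(y_true, y_pred):
    # log(cosh(e)) ~ e^2 / 2 for small e, ~ |e| - log(2) for large |e|.
    e = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return np.mean(np.log(np.cosh(e)))
```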

4
Q

What is Binary Cross-Entropy used for?

A

Binary Cross-Entropy measures the difference between predicted and true probabilities for binary classification tasks.
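A minimal NumPy sketch (the clipping constant is an illustrative guard against log(0)):

```python
import numpy as np

def binary_cross_entropy(y_true, p, eps=1e-12):
    # Clip predicted probabilities away from 0 and 1 to avoid log(0).
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```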

5
Q

What does Categorical Cross-Entropy measure in classification?

A

Categorical Cross-Entropy evaluates multi-class classification by comparing the predicted probability distribution over classes with the one-hot encoded true labels.
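An illustrative sketch, assuming predictions are already normalized probabilities (e.g. softmax outputs):

```python
import numpy as np

def categorical_cross_entropy(y_onehot, p, eps=1e-12):
    # -sum over classes of (true one-hot * log(predicted prob)), averaged over samples.
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    return -np.mean(np.sum(np.asarray(y_onehot, dtype=float) * np.log(p), axis=-1))
```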

6
Q

What is Focal Loss designed to address in classification?

A

Focal Loss focuses on hard-to-classify examples by reducing the loss contribution from easy examples, helping with class imbalance.
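A sketch of the binary form, with the (1 − p_t)^γ modulating factor; the defaults γ = 2, α = 0.25 follow the original Focal Loss paper, and the function name is illustrative:

```python
import numpy as np

def focal_loss(y_true, p, gamma=2.0, alpha=0.25, eps=1e-12):
    # (1 - p_t)^gamma shrinks the loss of well-classified (easy) examples.
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    p_t = np.where(y == 1, p, 1 - p)          # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t))
```

With γ = 0 and equal class weights it reduces to (scaled) binary cross-entropy.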

7
Q

What is the purpose of Triplet Loss in ranking tasks?

A

Triplet Loss optimizes the relative distances among anchor, positive, and negative samples to improve ranking models.
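An illustrative sketch using squared Euclidean distances on embedding rows (margin value is arbitrary):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Pull anchor toward positive, push it from negative by at least `margin`.
    d_pos = np.sum((np.asarray(anchor) - np.asarray(positive)) ** 2, axis=-1)
    d_neg = np.sum((np.asarray(anchor) - np.asarray(negative)) ** 2, axis=-1)
    return np.mean(np.maximum(d_pos - d_neg + margin, 0.0))
```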

8
Q

What is IoU Loss used for in object detection?

A

IoU Loss measures the intersection-over-union between predicted and ground-truth bounding boxes, focusing on localization accuracy.
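A minimal sketch for axis-aligned boxes in (x1, y1, x2, y2) format (assumes valid, non-degenerate boxes):

```python
def iou_loss(box_a, box_b):
    # Loss = 1 - intersection / union of the two boxes.
    xa1, ya1, xa2, ya2 = box_a
    xb1, yb1, xb2, yb2 = box_b
    iw = max(0.0, min(xa2, xb2) - max(xa1, xb1))   # intersection width
    ih = max(0.0, min(ya2, yb2) - max(ya1, yb1))   # intersection height
    inter = iw * ih
    union = (xa2 - xa1) * (ya2 - ya1) + (xb2 - xb1) * (yb2 - yb1) - inter
    return 1.0 - inter / union
```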

9
Q

What is Smooth L1 Loss, and where is it applied?

A

Smooth L1 Loss is quadratic for small errors and linear for large ones, making it less sensitive to outliers than MSE; it is widely used for bounding-box regression in object detectors such as Faster R-CNN.
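An illustrative sketch with a `beta` threshold separating the quadratic and linear regimes:

```python
import numpy as np

def smooth_l1(y_true, y_pred, beta=1.0):
    # Quadratic for |e| < beta, linear beyond: blends MSE and MAE behaviour.
    e = np.abs(np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float))
    return np.mean(np.where(e < beta, 0.5 * e ** 2 / beta, e - 0.5 * beta))
```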

10
Q

What is the role of Focal Loss in object detection?

A

Focal Loss helps address class imbalance by focusing training on hard-to-detect objects.

11
Q

What is Dice Loss used for in segmentation?

A

Dice Loss measures the overlap between predicted and ground-truth masks, which keeps it informative even when foreground pixels are rare.
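A soft-Dice sketch on flattened masks (the epsilon smoothing term is an illustrative guard against empty masks):

```python
import numpy as np

def dice_loss(y_true, y_pred, eps=1e-7):
    # 1 - (2 * |A ∩ B|) / (|A| + |B|), computed on soft (probabilistic) masks.
    y = np.asarray(y_true, dtype=float).ravel()
    p = np.asarray(y_pred, dtype=float).ravel()
    return 1.0 - (2.0 * np.sum(y * p) + eps) / (np.sum(y) + np.sum(p) + eps)
```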

12
Q

What does Jaccard Loss measure in segmentation tasks?

A

Jaccard Loss, or IoU loss, evaluates the similarity between predicted and ground-truth masks, favoring higher overlap.
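The same idea as box IoU, applied to flattened soft masks (names and epsilon are illustrative):

```python
import numpy as np

def jaccard_loss(y_true, y_pred, eps=1e-7):
    # 1 - |A ∩ B| / |A ∪ B| on soft masks.
    y = np.asarray(y_true, dtype=float).ravel()
    p = np.asarray(y_pred, dtype=float).ravel()
    inter = np.sum(y * p)
    union = np.sum(y) + np.sum(p) - inter
    return 1.0 - (inter + eps) / (union + eps)
```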

13
Q

What is Tversky Loss, and how does it differ from Dice Loss?

A

Tversky Loss generalizes Dice Loss by weighting false positives and false negatives differently, making it suitable for imbalanced data.
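A sketch where `alpha` weights false positives and `beta` false negatives; with α = β = 0.5 the Tversky index reduces to the Dice coefficient (the default values here are illustrative):

```python
import numpy as np

def tversky_loss(y_true, y_pred, alpha=0.5, beta=0.5, eps=1e-7):
    # 1 - TP / (TP + alpha * FP + beta * FN); alpha = beta = 0.5 recovers Dice.
    y = np.asarray(y_true, dtype=float).ravel()
    p = np.asarray(y_pred, dtype=float).ravel()
    tp = np.sum(y * p)
    fp = np.sum((1 - y) * p)
    fn = np.sum(y * (1 - p))
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)
```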

14
Q

What is Adversarial Loss in generative models?

A

Adversarial Loss is used in GANs to optimize the generator to produce realistic outputs that can fool the discriminator.
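A rough sketch of the standard (non-saturating) GAN objectives, operating on discriminator output probabilities; the function names and the NumPy formulation are illustrative, not a training loop:

```python
import numpy as np

def bce(y, p, eps=1e-12):
    # Binary cross-entropy helper used by both objectives below.
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    y = np.asarray(y, dtype=float)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def discriminator_loss(d_real, d_fake):
    # Discriminator wants real samples scored 1 and generated samples scored 0.
    return bce(np.ones_like(np.asarray(d_real, dtype=float)), d_real) + \
           bce(np.zeros_like(np.asarray(d_fake, dtype=float)), d_fake)

def generator_loss(d_fake):
    # Non-saturating form: generator wants the discriminator to output 1 on fakes.
    return bce(np.ones_like(np.asarray(d_fake, dtype=float)), d_fake)
```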

15
Q

What is Reconstruction Loss in generative models?

A

Reconstruction Loss measures how well the generated output matches the input, commonly used in Autoencoders.

16
Q

What is the role of KL Divergence in generative models?

A

KL Divergence measures the difference between two probability distributions, often used in Variational Autoencoders to regularize latent space.
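An illustrative sketch for discrete distributions (assumes `p` and `q` are normalized probability vectors):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # D_KL(P || Q) = sum p * log(p / q); asymmetric, >= 0, zero iff P == Q.
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    q = np.clip(np.asarray(q, dtype=float), eps, 1.0)
    return np.sum(p * np.log(p / q))
```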

17
Q

What are Weighted Losses, and why are they used?

A

Weighted Losses assign different importance to specific samples or classes, addressing class imbalance or sample-specific priorities.
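One common instance, sketched here: binary cross-entropy with an up-weighted positive class (the `pos_weight` name and default are illustrative):

```python
import numpy as np

def weighted_bce(y_true, p, pos_weight=5.0, eps=1e-12):
    # Up-weight the positive class, e.g. when positives are rare.
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    return -np.mean(pos_weight * y * np.log(p) + (1 - y) * np.log(1 - p))
```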

18
Q

What is Multi-Task Loss in custom loss functions?

A

Multi-Task Loss combines losses from multiple objectives, balancing the training of different tasks in a single model.
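In its simplest form this is a weighted sum of per-task losses, sketched below (weight selection in practice is a tuning or learned choice):

```python
def multi_task_loss(task_losses, weights=None):
    # Weighted sum of per-task losses; weights balance tasks on different scales.
    if weights is None:
        weights = [1.0] * len(task_losses)
    return sum(w * l for w, l in zip(weights, task_losses))
```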