Losses Flashcards

1
Q

What is the purpose of Mean Squared Error (MSE) in regression?

A

MSE calculates the average squared difference between predicted and actual values, penalizing larger errors more heavily.
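A minimal NumPy sketch of this definition (function name and numbers are illustrative, not from the source):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of squared residuals; squaring penalizes large errors more heavily.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

# One error of 2 contributes 4 to the sum: (0 + 0 + 4) / 3
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))
```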


2
Q

What does Mean Absolute Error (MAE) measure in regression tasks?

A

MAE measures the average absolute difference between predicted and actual values, providing a more robust metric against outliers compared to MSE.
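The outlier robustness can be seen numerically in a small sketch (toy numbers, illustrative names):

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean of absolute residuals; an outlier enters linearly, not squared.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(np.abs(y_true - y_pred))

# A single outlier of size 10: MAE grows linearly, MSE quadratically.
print(mae([0.0, 0.0, 0.0], [0.0, 0.0, 10.0]))          # 10 / 3
print(np.mean(np.array([0.0, 0.0, 10.0]) ** 2))        # 100 / 3
```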

3
Q

What is Log-Cosh Loss, and why is it used?

A

Log-Cosh Loss is the logarithm of the hyperbolic cosine of the prediction error. It behaves like MSE for small errors and like MAE for large errors, and is differentiable everywhere.
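A short numeric sketch of both regimes (illustrative values, not from the source):

```python
import numpy as np

def log_cosh(y_true, y_pred):
    # log(cosh(e)) ~ e**2 / 2 for small e, and ~ |e| - log(2) for large e.
    e = np.asarray(y_pred, float) - np.asarray(y_true, float)
    return np.mean(np.log(np.cosh(e)))

small, large = 0.01, 10.0
print(log_cosh([0.0], [small]), small ** 2 / 2)          # nearly equal (MSE-like)
print(log_cosh([0.0], [large]), large - np.log(2.0))     # nearly equal (MAE-like)
```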

4
Q

What is Binary Cross-Entropy used for?

A

Binary Cross-Entropy measures the difference between predicted probabilities and true labels (encoded as 0 and 1) for binary classification tasks.
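A minimal sketch with NumPy (toy probabilities; the clipping constant is an implementation convenience, not from the source):

```python
import numpy as np

def binary_cross_entropy(y_true, p):
    # Clip to avoid log(0); y_true is 0/1, p is the predicted P(y = 1).
    p = np.clip(np.asarray(p, float), 1e-7, 1 - 1e-7)
    y = np.asarray(y_true, float)
    return np.mean(-(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Confident correct predictions give a small loss; confident wrong ones a large one.
print(binary_cross_entropy([1, 0], [0.9, 0.1]))
print(binary_cross_entropy([1, 0], [0.1, 0.9]))
```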

5
Q

What does Categorical Cross-Entropy measure in classification?

A

Categorical Cross-Entropy evaluates a multi-class classifier by comparing predicted class probabilities with one-hot encoded labels.
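Because the labels are one-hot, only the log-probability assigned to the true class survives per sample, as this sketch shows (toy numbers):

```python
import numpy as np

def categorical_cross_entropy(y_onehot, p):
    # Only the log-probability of the true class contributes per sample.
    p = np.clip(np.asarray(p, float), 1e-7, 1.0)
    return np.mean(-np.sum(np.asarray(y_onehot, float) * np.log(p), axis=1))

y = [[0, 1, 0]]          # true class is index 1
p = [[0.1, 0.7, 0.2]]    # the model assigns it probability 0.7
print(categorical_cross_entropy(y, p))  # -log(0.7)
```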

6
Q

What is Focal Loss designed to address in classification?

A

Focal Loss focuses on hard-to-classify examples by reducing the loss contribution from easy examples, helping with class imbalance.
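A binary sketch of the down-weighting effect (the class-balancing alpha factor is omitted for brevity; names and numbers are illustrative):

```python
import numpy as np

def focal_loss(y_true, p, gamma=2.0):
    # The (1 - p_t)**gamma factor shrinks the loss of easy (high p_t) examples.
    p = np.clip(np.asarray(p, float), 1e-7, 1 - 1e-7)
    p_t = np.where(np.asarray(y_true) == 1, p, 1 - p)
    return np.mean(-((1 - p_t) ** gamma) * np.log(p_t))

easy = focal_loss([1], [0.95])           # well classified: heavily down-weighted
ce   = focal_loss([1], [0.95], gamma=0)  # gamma = 0 recovers cross-entropy
print(easy, ce)
```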

7
Q

What is the purpose of Triplet Loss in ranking tasks?

A

Triplet Loss optimizes the relative distances among anchor, positive, and negative samples to improve ranking models.
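A minimal sketch with Euclidean distances and a margin (the margin value and vectors are illustrative):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Penalize when the positive is not at least `margin` closer than the negative.
    d_pos = np.linalg.norm(np.asarray(anchor, float) - np.asarray(positive, float))
    d_neg = np.linalg.norm(np.asarray(anchor, float) - np.asarray(negative, float))
    return max(0.0, d_pos - d_neg + margin)

a, p, n = [0.0, 0.0], [0.1, 0.0], [3.0, 0.0]
print(triplet_loss(a, p, n))  # negative is already far enough: loss is 0
```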

8
Q

What is IoU Loss and how is it used in object detection?

A

IoU Loss measures the intersection-over-union between predicted and ground-truth bounding boxes. It is used in object detection to focus training on localization accuracy.
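A sketch for axis-aligned boxes in (x1, y1, x2, y2) form (toy coordinates, not from the source):

```python
def iou_loss(box_a, box_b):
    # Boxes as (x1, y1, x2, y2); loss = 1 - intersection / union.
    xa1, ya1, xa2, ya2 = box_a
    xb1, yb1, xb2, yb2 = box_b
    iw = max(0.0, min(xa2, xb2) - max(xa1, xb1))
    ih = max(0.0, min(ya2, yb2) - max(ya1, yb1))
    inter = iw * ih
    union = (xa2 - xa1) * (ya2 - ya1) + (xb2 - xb1) * (yb2 - yb1) - inter
    return 1.0 - inter / union

print(iou_loss((0, 0, 2, 2), (1, 0, 3, 2)))  # intersection 2, union 6 -> loss 2/3
```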

9
Q

What is Smooth L1 Loss, and where is it applied?

A

Smooth L1 Loss is quadratic for small errors and linear for large ones, making it more robust to outliers than pure L2. It is widely used for bounding-box regression in object detection.
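A sketch of the standard piecewise form with the transition at |e| = 1 (illustrative numbers):

```python
import numpy as np

def smooth_l1(y_true, y_pred):
    # Quadratic for |e| < 1 (smooth near zero), linear beyond (outlier-robust).
    e = np.abs(np.asarray(y_pred, float) - np.asarray(y_true, float))
    return np.mean(np.where(e < 1.0, 0.5 * e ** 2, e - 0.5))

print(smooth_l1([0.0], [0.5]))   # 0.5 * 0.25 = 0.125
print(smooth_l1([0.0], [10.0]))  # 10 - 0.5 = 9.5 (not 50, as L2 would give)
```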

10
Q

What is the role of Focal Loss in object detection?

A

Focal Loss helps address class imbalance by focusing training on hard-to-detect objects.

11
Q

What is Dice Loss?

A

Dice Loss is 1 minus the Dice coefficient, which is twice the intersection of two sets divided by the sum of their sizes: Dice = 2|A∩B| / (|A| + |B|). It is commonly used in segmentation.
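A sketch on binary masks (the epsilon smoothing term is an implementation convenience, not from the source):

```python
import numpy as np

def dice_loss(mask_a, mask_b, eps=1e-7):
    # 1 - 2|A ∩ B| / (|A| + |B|) on binary masks; eps avoids division by zero.
    a, b = np.asarray(mask_a, float), np.asarray(mask_b, float)
    inter = np.sum(a * b)
    return 1.0 - (2.0 * inter + eps) / (np.sum(a) + np.sum(b) + eps)

print(dice_loss([1, 1, 0, 0], [1, 0, 0, 0]))  # Dice = 2*1/(2+1) = 2/3 -> loss 1/3
```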

12
Q

What does Jaccard Loss measure?

A

Jaccard Loss, or IoU Loss, is 1 minus the intersection over the union of two sets: J = |A∩B| / |A∪B|.
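The same mask example as above, with the Jaccard formula (epsilon smoothing is an implementation convenience):

```python
import numpy as np

def jaccard_loss(mask_a, mask_b, eps=1e-7):
    # 1 - |A ∩ B| / |A ∪ B| on binary masks.
    a, b = np.asarray(mask_a, float), np.asarray(mask_b, float)
    inter = np.sum(a * b)
    union = np.sum(a) + np.sum(b) - inter
    return 1.0 - (inter + eps) / (union + eps)

print(jaccard_loss([1, 1, 0, 0], [1, 0, 0, 0]))  # IoU = 1/2 -> loss 1/2
```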

13
Q

What is Tversky Loss, and how does it differ from Dice Loss?

A

Tversky Loss generalizes Dice Loss by adding an option to give different weights to false positives and false negatives, making it suitable for imbalanced data.
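A sketch showing the alpha/beta weighting; with alpha = beta = 0.5 it matches Dice on this example (toy masks, illustrative names):

```python
import numpy as np

def tversky_loss(y_true, y_pred, alpha=0.5, beta=0.5, eps=1e-7):
    # 1 - TP / (TP + alpha*FP + beta*FN); alpha = beta = 0.5 recovers Dice.
    t, p = np.asarray(y_true, float), np.asarray(y_pred, float)
    tp = np.sum(t * p)
    fp = np.sum((1 - t) * p)
    fn = np.sum(t * (1 - p))
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)

t, p = [1, 1, 1, 0], [1, 0, 0, 0]       # tp=1, fp=0, fn=2
print(tversky_loss(t, p))               # 0.5, same as Dice here
print(tversky_loss(t, p, alpha=0.3, beta=0.7))  # larger: FNs weighted more
```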

14
Q

What is Adversarial Loss in generative models?

A

Adversarial Loss is used in GANs to train the generator and the discriminator jointly in a minimax game: the discriminator learns to tell real samples from generated ones, and the generator learns to fool it.

15
Q

What is Reconstruction Loss in generative models?

A

Reconstruction Loss measures how well the generated output matches the input, commonly used in Autoencoders.

16
Q

What is KL Divergence?

A

KL Divergence measures the difference between two probability distributions.
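A sketch of the discrete form D_KL(P || Q) = Σ p·log(p/q) (toy distributions; note it is asymmetric and zero only when the distributions match):

```python
import numpy as np

def kl_divergence(p, q):
    # D_KL(P || Q) = sum p * log(p / q); >= 0, and 0 iff P == Q.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0
print(kl_divergence(p, q))  # > 0, and generally != kl_divergence(q, p)
```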

17
Q

What are Weighted Losses, and why are they used?

A

Weighted Losses assign different importance to specific samples or classes, addressing class imbalance or sample-specific priorities.

18
Q

What is Multi-Task Loss in custom loss functions?

A

Multi-Task Loss combines losses from multiple objectives, balancing the training of different tasks in a single model.

19
Q

What is the generator's goal in the adversarial loss of a GAN?

A

To minimise log(1 - D(G(z))).

20
Q

What is the discriminator's goal in the adversarial loss of a GAN?

A

To maximise log(D(x)) + log(1 - D(G(z))).
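The two objectives from the cards above can be evaluated on toy discriminator scores (d_real and d_fake are illustrative numbers, not from any trained model):

```python
import numpy as np

# D(x): discriminator score on a real sample; D(G(z)): score on a generated one.
d_real, d_fake = 0.9, 0.2

# Discriminator objective (to maximise): log D(x) + log(1 - D(G(z)))
disc_obj = np.log(d_real) + np.log(1 - d_fake)

# Generator objective (to minimise): log(1 - D(G(z)))
gen_obj = np.log(1 - d_fake)

print(disc_obj, gen_obj)
```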

21
Q

Why are we using MSE for loss in reconstruction of images in Auto-Encoder?

A

Because we assume that the pixel values follow a Gaussian distribution, and because we want to minimise the negative log-likelihood of the input given the output.

22
Q

How does minimising the negative log-likelihood turn into MSE?

A

Taking the negative log of the Gaussian density cancels the exponential and leaves the squared distance between the output and the target (up to additive constants), so minimising the negative log-likelihood is equivalent to minimising MSE.
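A numeric check of this equivalence for a unit-variance Gaussian (toy values; the constant term is 0.5·log(2π)):

```python
import numpy as np

# -log N(y | y_hat, 1) = 0.5*(y - y_hat)**2 + 0.5*log(2*pi), so minimising
# the negative log-likelihood is minimising the squared error plus a constant.
y, y_hat = 3.0, 2.2
density = np.exp(-0.5 * (y - y_hat) ** 2) / np.sqrt(2 * np.pi)
nll = -np.log(density)
sq_plus_const = 0.5 * (y - y_hat) ** 2 + 0.5 * np.log(2 * np.pi)
print(nll, sq_plus_const)  # equal
```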