Loss Functions Flashcards

1
Q

nn.CrossEntropyLoss

A
  • Combines softmax and cross-entropy loss in a single, more numerically stable operation (internally, log-softmax followed by negative log-likelihood).
  • Expects raw, unnormalized scores (logits) from the network as input, not probabilities, along with integer class indices as targets. A usage sketch follows below.
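
A minimal sketch of how this works in practice (the tensor shapes and names here are illustrative, not from the card). It also checks the equivalence claimed above against an explicit log-softmax + NLL computation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

criterion = nn.CrossEntropyLoss()

logits = torch.randn(4, 3)            # batch of 4, 3 classes; raw, unnormalized scores
targets = torch.tensor([0, 2, 1, 2])  # class indices, not one-hot vectors

loss = criterion(logits, targets)     # no softmax applied beforehand

# Equivalent to log-softmax followed by negative log-likelihood loss
log_probs = F.log_softmax(logits, dim=1)
same_loss = F.nll_loss(log_probs, targets)
assert torch.allclose(loss, same_loss)
```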