Loss Functions Flashcards
1
Q
nn.CrossEntropyLoss
A
- Combines softmax and cross-entropy in a single, more numerically stable operation. CrossEntropyLoss expects raw, unnormalized scores from the network (logits), not probabilities.
- Equivalent to applying log-softmax followed by negative log-likelihood loss (nn.LogSoftmax + nn.NLLLoss).
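A minimal sketch of the equivalence described above (random logits and arbitrary target classes, chosen here just for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Raw, unnormalized scores (logits) for a batch of 3 samples, 5 classes
logits = torch.randn(3, 5)
targets = torch.tensor([1, 0, 4])  # ground-truth class indices

# nn.CrossEntropyLoss is applied directly to the logits
loss = nn.CrossEntropyLoss()(logits, targets)

# Equivalent: log-softmax followed by negative log-likelihood
manual = F.nll_loss(F.log_softmax(logits, dim=1), targets)

print(torch.allclose(loss, manual))  # True
```

Passing probabilities (e.g. softmax output) instead of logits is a common bug: the loss still computes, but the gradients are wrong because softmax gets applied twice.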