Machine_Learning_General Flashcards

1
Q

What is batch normalization?

A

Problem: The distribution of each layer's inputs changes during training as the parameters of previous layers change. This slows down learning and requires careful parameter initialization.

Batch norm ameliorates this problem by normalizing layer inputs for each mini-batch. It allows much higher learning rates, reduces sensitivity to initialization, and acts as a regularizer (in some cases eliminating the need for dropout).

Pioneered by Google in their 2015 paper: https://arxiv.org/pdf/1502.03167.pdf
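A minimal NumPy sketch of the training-time forward pass (the function name batch_norm and the parameters gamma/beta are illustrative; running statistics for inference are omitted):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch x of shape (batch_size, features),
    then scale and shift with learnable gamma and beta."""
    mu = x.mean(axis=0)                    # per-feature mean over the mini-batch
    var = x.var(axis=0)                    # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to zero mean, unit variance
    return gamma * x_hat + beta            # scale/shift restores representational capacity

# Example: normalize a mini-batch of 4 samples with 3 features
x = np.random.randn(4, 3) * 10 + 5
gamma, beta = np.ones(3), np.zeros(3)
out = batch_norm(x, gamma, beta)
```

At inference time, frameworks typically replace the mini-batch statistics with running averages collected during training.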
