W7 NN2 (eta optimization) Flashcards

1
Q

What are the different ways of using the training data to train a classifier?

A

Batch gradient descent, mini-batch gradient descent, and stochastic gradient descent (SGD)
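A minimal sketch showing all three variants as one parameterized loop (my own illustration; `sgd_epoch`, `grad_fn`, and the other names are hypothetical, not from the card):

```python
import numpy as np

def sgd_epoch(w, X, y, grad_fn, eta=0.01, batch_size=32):
    """One epoch of gradient descent; batch_size selects the variant:
    batch_size=len(X) -> batch, batch_size=1 -> stochastic, else mini-batch."""
    idx = np.random.permutation(len(X))  # shuffle each epoch
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        w = w - eta * grad_fn(w, X[b], y[b])  # step on this batch's gradient
    return w
```

Setting `batch_size=len(X)` recovers full-batch gradient descent and `batch_size=1` recovers SGD; in between is mini-batch.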

2
Q

What is the problem with using the same eta (learning rate) on all weights?

A

Too much diversity in the network: gradient magnitudes differ widely across weights and layers, so a single global eta is too large for some weights and too small for others. This motivates per-weight adaptive learning rates.

3
Q

Running average (exponential moving average)?

A

v_t = β·v_{t−1} + (1 − β)·x_t: a running estimate in which the weight of older values decays geometrically, so recent values dominate. Used to smooth gradient statistics in momentum, RMSProp, and Adam.
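A sketch of the standard EMA recursion in code (`ema` is my own name, not from the card):

```python
def ema(values, beta=0.9):
    """Exponential moving average: v_t = beta*v_{t-1} + (1-beta)*x_t.
    Older values decay geometrically, so recent values dominate."""
    v = 0.0
    out = []
    for x in values:
        v = beta * v + (1 - beta) * x
        out.append(v)
    return out
```

Note the zero initialization biases early estimates toward 0; Adam's bias correction (next card) compensates for exactly this.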
4
Q

What is ADAM?

A

Adam combines the momentum rule (an exponential moving average of the gradient, the first moment) with adaptive per-weight learning rates (an EMA of the squared gradient, the second moment, as in Adagrad/RMSProp), plus bias correction for the zero-initialized averages.
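A sketch of one Adam update step in the standard formulation (`adam_step` and its signature are my own names):

```python
import numpy as np

def adam_step(w, g, m, v, t, eta=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: momentum (m) plus adaptive per-weight scale (v)."""
    m = b1 * m + (1 - b1) * g          # EMA of gradient (1st moment / momentum)
    v = b2 * v + (1 - b2) * g * g      # EMA of squared gradient (2nd moment)
    m_hat = m / (1 - b1 ** t)          # bias correction for zero initialization
    v_hat = v / (1 - b2 ** t)
    w = w - eta * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```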

5
Q

Momentum rule?

A

v_t = γ·v_{t−1} + η·∇L(w), then w ← w − v_t. The velocity accumulates a decaying sum of past gradients, which damps oscillations and speeds up progress along directions where successive gradients agree.
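A minimal sketch of the momentum update in its standard heavy-ball form (names are my own):

```python
def momentum_step(w, g, velocity, eta=0.01, gamma=0.9):
    """Momentum rule: the velocity is a decaying sum of past gradients,
    damping oscillations and accelerating along consistent directions."""
    velocity = gamma * velocity + eta * g
    return w - velocity, velocity
```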
6
Q

Adagrad

A

A per-weight adaptive learning rate: accumulate the sum of squared gradients G for each weight and scale the step by η/√(G + ε), i.e. G ← G + g², w ← w − η·g/√(G + ε). Weights with a history of large gradients take smaller steps; weights with rare or small gradients keep larger ones.
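A sketch of one Adagrad update in the standard form (`adagrad_step` is my own name):

```python
import numpy as np

def adagrad_step(w, g, G, eta=0.1, eps=1e-8):
    """Adagrad: accumulate squared gradients per weight; weights with a
    history of large gradients get a smaller effective learning rate."""
    G = G + g * g
    w = w - eta * g / (np.sqrt(G) + eps)
    return w, G
```

A known drawback: G only grows, so the effective learning rate decays toward zero; RMSProp and Adam replace the sum with an EMA to avoid this.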
7
Q

Ways of regularization in NN

A

Weight decay, Early stopping, Perturbation (noise injection), Dropout, Batch Normalization
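As one concrete example from this list, a sketch of inverted dropout (my own illustration; `dropout` is a hypothetical helper, not from the card):

```python
import numpy as np

def dropout(h, p=0.5, training=True, rng=np.random):
    """Inverted dropout: zero each activation with probability p during
    training, and rescale survivors by 1/(1-p) so the expected activation
    is unchanged; at inference time, pass activations through untouched."""
    if not training or p == 0.0:
        return h
    mask = (rng.random(h.shape) >= p) / (1.0 - p)
    return h * mask
```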

8
Q

Batch normalization

A

Normalize each feature to zero mean and unit variance over the mini-batch, then apply a learned scale γ and shift β. It stabilizes and speeds up training by keeping layer inputs in a consistent range, and has a regularizing side effect from the batch noise.
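A sketch of the batch-norm forward pass in training mode (standard formulation; `batch_norm` is my own name, and the running statistics used at inference are omitted):

```python
import numpy as np

def batch_norm(X, gamma, beta, eps=1e-5):
    """Batch normalization (training mode): normalize each feature
    across the mini-batch, then apply learned scale and shift."""
    mu = X.mean(axis=0)                 # per-feature batch mean
    var = X.var(axis=0)                 # per-feature batch variance
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta
```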