QA Flashcards

1
Q

What does data normalization prevent?

A

It reduces the risk of sigmoid saturation, which can stall minimization, cause premature convergence, or drive the estimated weights to excessively large values.
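A minimal sketch of one common form of normalization, zero-mean/unit-variance standardization; the function name and sample values are illustrative, not from the card. Standardized inputs stay near the steep central region of the sigmoid, away from its saturated tails.

```python
# Standardization: one common form of data normalization.
# Illustrative sketch; names and values are not from the card.
def standardize(xs):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    std = var ** 0.5 or 1.0  # guard against zero variance
    return [(x - mean) / std for x in xs]

raw = [100.0, 200.0, 300.0, 400.0]
scaled = standardize(raw)  # zero mean, unit variance
```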

2
Q

T or F: Data normalization prevents the vanishing-gradient problem.

A

F

3
Q

T or F: For a different task, you always need a different dataset.

A

T — at a minimum, the labels change.

4
Q
A
5
Q

T or F: For small batch sizes, the number of iterations required increases as the batch size increases.

A

F
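Illustrative arithmetic (numbers are my own, not from the card): with a fixed dataset size, the number of iterations per epoch falls, not rises, as the batch size grows, which is why the statement is false.

```python
import math

# Iterations per epoch = ceil(dataset size / batch size).
# The dataset size and batch sizes below are illustrative.
n_samples = 1024
iters = {b: math.ceil(n_samples / b) for b in (8, 32, 128)}
print(iters)  # {8: 128, 32: 32, 128: 8} — larger batches, fewer iterations
```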

6
Q

T or F: A CNN relies on filters to reduce the image size.

A

F
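A quick sketch of the standard output-size formulas (the 28-pixel input is an illustrative assumption): a 3×3 convolution with padding 1 preserves the spatial size, while it is the 2×2 pooling step that shrinks it.

```python
# Standard spatial-size formulas for conv and pooling layers.
# The 28-pixel input is an illustrative assumption.
def conv_out(size, kernel, stride=1, padding=0):
    return (size - kernel + 2 * padding) // stride + 1

def pool_out(size, window=2, stride=2):
    return (size - window) // stride + 1

print(conv_out(28, 3, padding=1))  # 28 — the filter keeps the size
print(pool_out(28))                # 14 — pooling halves it
```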

7
Q

T or F: "Activation map" is a synonym for "feature map".

A

T

8
Q

T or F: Hidden neurons are inactive most of the time when the sparsity factor approaches 0.5.

A

T
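For context, a sketch of the KL sparsity penalty used in sparse autoencoders, where the sparsity factor `rho` is the target average activation and `rho_hat` a neuron's measured average activation; the numeric values are illustrative assumptions, not from the card.

```python
import math

# KL-divergence sparsity penalty for one hidden neuron in a sparse
# autoencoder. rho = target sparsity, rho_hat = measured average
# activation. Values below are illustrative.
def kl_sparsity(rho, rho_hat):
    return (rho * math.log(rho / rho_hat)
            + (1 - rho) * math.log((1 - rho) / (1 - rho_hat)))

print(kl_sparsity(0.05, 0.05))      # 0.0 — neuron matches the target
print(kl_sparsity(0.05, 0.5) > 0)   # penalty grows when it fires too often
```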

9
Q

T or F: In a GAN, the discriminator maximizes its objective by driving D(G(z)) toward 0.

A

T
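A sketch of the generated-sample term of the GAN value function, log(1 − D(G(z))); `d_gz` stands for the scalar D(G(z)) and the sample values are illustrative. The discriminator maximizes this term precisely by pushing D(G(z)) toward 0.

```python
import math

# Generated-sample term of the GAN value function: log(1 - D(G(z))).
# d_gz stands for D(G(z)); the probed values are illustrative.
def fake_term(d_gz):
    return math.log(1.0 - d_gz)

print(fake_term(0.01))  # near 0, the term's maximum: D rejects the fake
print(fake_term(0.99))  # strongly negative: the generator fooled D
```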

10
Q

T or F: During training, a GAN learns the distribution of the images in the reference dataset.

A

T
