Module 5 Flashcards

1
Q

Binary classification

A

Only two possible classes

2
Q

Multi-class classification

A

More than two possible classes; each input belongs to exactly one class

3
Q

Multi-label classification

A

Each input can belong to more than one class
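
The three settings differ mainly at the output layer. A minimal NumPy sketch (logits and class counts are illustrative), assuming sigmoid for binary, softmax for multi-class, and independent per-class sigmoids for multi-label:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())            # subtract max for numerical stability
    return e / e.sum()

logits = np.array([1.2, -0.3, 0.8])    # illustrative raw network outputs

# Binary: a single sigmoid output = probability of the positive class
p_pos = sigmoid(logits[0])

# Multi-class: softmax over all outputs, probabilities sum to 1, pick argmax
p_classes = softmax(logits)

# Multi-label: one sigmoid per class, each thresholded independently,
# so an input can receive several labels at once
labels = sigmoid(logits) > 0.5
```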

4
Q

Batching

A
  • combining the vectors of several data points into one matrix
  • improves speed (one matrix operation instead of many vector operations)
  • reduces noise in the gradient estimate
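
A minimal NumPy sketch of the first point, assuming a single fully-connected layer with illustrative shapes: stacking the input vectors lets one matrix multiplication replace a loop of vector products.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))           # layer weights: 4 features -> 3 outputs
xs = [rng.standard_normal(4) for _ in range(8)]

# Unbatched: one vector-matrix product per data point
ys = [x @ W for x in xs]

# Batched: stack the vectors into an (8, 4) matrix and multiply once
X = np.stack(xs)                          # shape (batch, features)
Y = X @ W                                 # shape (batch, outputs)
assert np.allclose(Y, np.array(ys))       # same numbers, fewer but larger ops
```
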
5
Q

Backpropagation

A

Propagating the gradients of the loss backwards through the network layers via the chain rule
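
A minimal sketch of the idea for a two-layer network with a squared-error loss, written out as the chain rule in NumPy; the architecture and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(3)                   # input
W1 = rng.standard_normal((4, 3))             # layer 1 weights
W2 = rng.standard_normal((1, 4))             # layer 2 weights
t = np.array([1.0])                          # target

# Forward pass, keeping intermediates for the backward pass
h = np.tanh(W1 @ x)                          # hidden activations
y = W2 @ h                                   # output
loss = 0.5 * np.sum((y - t) ** 2)

# Backward pass: gradients flow from the loss back towards the input
dy = y - t                                   # dL/dy
dW2 = np.outer(dy, h)                        # dL/dW2
dh = W2.T @ dy                               # gradient entering the hidden layer
dW1 = np.outer(dh * (1 - h ** 2), x)         # dL/dW1; tanh'(z) = 1 - tanh(z)^2
```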

6
Q

Low learning rate

A

The model takes tiny steps and converges very slowly

7
Q

Learning rate too high

A

The optimiser keeps stepping over the optimal values and can oscillate or diverge
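
Both failure modes (this card and the previous one) show up on a toy problem: gradient descent on f(x) = x², whose gradient is 2x and whose minimum is at x = 0. The step counts and rates below are illustrative.

```python
def descend(lr, steps=20, x=5.0):
    for _ in range(steps):
        x -= lr * 2 * x              # gradient of x**2 is 2x
    return x

print(descend(lr=0.01))   # too low: still ~3.3 after 20 steps (slow convergence)
print(descend(lr=0.4))    # reasonable: essentially at the minimum
print(descend(lr=1.1))    # too high: |x| grows every step (overshooting, divergence)
```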

8
Q

Learning rate decay

A

Gradually reducing the learning rate during training, e.g. multiplying it by a fixed factor every few epochs
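
A sketch of one common scheme, step decay, assuming an illustrative base rate of 0.1 halved every 10 epochs:

```python
base_lr, factor, every = 0.1, 0.5, 10

for epoch in range(30):
    lr = base_lr * factor ** (epoch // every)
    # ... train for one epoch using lr ...
    if epoch % every == 0:
        print(f"epoch {epoch:2d}: lr = {lr:g}")
# epoch  0: lr = 0.1
# epoch 10: lr = 0.05
# epoch 20: lr = 0.025
```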

9
Q

Weight initialisation

A
  • zeros (problematic: every neuron in a layer then receives the same gradient, so the neurons never differentiate)
  • draw randomly from N(0, 1)
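
A NumPy sketch of both options for one illustrative layer shape; the comments note why the zero option fails in practice.

```python
import numpy as np

shape = (4, 3)                          # illustrative layer shape

# Option 1: zeros. All neurons compute the same output and receive the
# same gradient, so they never differentiate (the symmetry is never broken).
W_zeros = np.zeros(shape)

# Option 2: draw from N(0, 1). Random values break the symmetry.
rng = np.random.default_rng(0)
W_random = rng.standard_normal(shape)
```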

10
Q

Network capacity

A

A network's capacity grows with its number of neurons/layers; the higher the capacity, the more easily it can overfit

11
Q

Network underfitting

A

Increase the number of neurons/layers (add capacity)

12
Q

Network overfitting

A

Lower the number of neurons/layers (reduce capacity), or regularise
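
Capacity is typically adjusted through layer widths and depth. A sketch counting the parameters of two illustrative fully-connected architectures:

```python
def n_params(layer_sizes):
    """Weights plus biases of a fully-connected net with these layer sizes."""
    return sum(m * n + n for m, n in zip(layer_sizes, layer_sizes[1:]))

print(n_params([784, 16, 10]))         # small net: low capacity, may underfit
print(n_params([784, 512, 512, 10]))   # large net: high capacity, may overfit
```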

13
Q

Regularisation

A

Adding extra information or constraints to the model to stop it from overfitting

14
Q

L1 regularisation

A

Adding the sum of the absolute weight values to the loss function (encourages sparse weights)

15
Q

L2 regularisation

A

Adding the sum of the squared weight values to the loss function (encourages small weights)
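
A NumPy sketch of both penalties from the last two cards; lam (the regularisation strength) is an illustrative hyperparameter.

```python
import numpy as np

def regularised_loss(base_loss, weights, lam=1e-3, kind="l2"):
    if kind == "l1":
        penalty = lam * np.sum(np.abs(weights))  # L1: drives weights to exactly 0
    else:
        penalty = lam * np.sum(weights ** 2)     # L2: drives weights towards small values
    return base_loss + penalty
```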

16
Q

Dropout

A

training: randomly set some neuron activations to zero
testing: use all neurons but scale the activations (by the probability of keeping a neuron)
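
A NumPy sketch matching the card's scheme, with an illustrative keep probability of 0.8: zero activations at random while training, scale them at test time so the expected values match.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_keep=0.8, training=True):
    if training:
        mask = rng.random(activations.shape) < p_keep
        return activations * mask       # randomly silence ~20% of the neurons
    return activations * p_keep         # test: all neurons on, scaled so the
                                        # expected activation matches training
```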