classification (neural nets) Flashcards

1
Q

What is a neural network?

A

A sequence of matrix multiplies (with activation functions between them) applied to X to get Y
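As a minimal NumPy sketch (shapes and weights here are made up for illustration):

```python
import numpy as np

# A tiny feed-forward network: Y is produced from X by a chain of
# matrix multiplies, with a nonlinearity between layers.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))      # 4 instances, 3 features
W1 = rng.normal(size=(3, 5))     # input -> hidden weights
W2 = rng.normal(size=(5, 2))     # hidden -> output weights

hidden = np.tanh(X @ W1)         # weighted sums, then activation
Y = hidden @ W2                  # network output, shape (4, 2)
```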

2
Q

Why is it called feed-forward?

A

None of the weights cycle back to an input unit or a unit in an earlier layer; information flows only forward, from input to output

3
Q

What does fully connected mean?

A

Each unit provides input to each unit in the next forward layer

4
Q

Give a general summary of what feed forward neural networks do.

A

Each unit takes a weighted sum of the outputs from units in the previous layer, then applies an activation function to that weighted sum
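For a single unit, this can be sketched as follows (values are made up; sigmoid is chosen here only as an example activation):

```python
import numpy as np

# One unit: weighted sum of the previous layer's outputs,
# then an activation function applied to that sum.
prev_outputs = np.array([0.2, -1.0, 0.5])   # outputs of previous layer
weights = np.array([0.4, 0.1, -0.6])        # this unit's incoming weights
bias = 0.05

z = np.dot(weights, prev_outputs) + bias    # weighted input
activation = 1.0 / (1.0 + np.exp(-z))       # sigmoid activation
```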

5
Q

What happens at the input layer?

A

Inputs are fed into the input layer, weighted, and fed simultaneously to the hidden layer

6
Q

What is the hidden layer?

A

Second layer of “neuronlike” units. Outputs of the hidden layer can be input to another hidden layer

7
Q

What is the output layer?

A

It takes the weighted outputs of the last hidden layer and emits the network’s prediction for the given tuples

8
Q

What can you add to a 2 layer neural network to make it a logistic model?

A

A sigmoid function
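A sketch of this equivalence (weights and inputs made up for illustration): a 2-layer network is a linear model, and applying a sigmoid to its output gives logistic regression.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 2.0])      # input features
w = np.array([0.5, -0.25])    # weights of the 2-layer network
b = 0.1                       # bias

linear = x @ w + b            # plain linear (2-layer) output
prob = sigmoid(linear)        # adding the sigmoid -> logistic model
```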

9
Q

What does adding bias do?

A

It’s like adding a constant “hot” feature to your instance. It acts as an additional node at every layer except for the output layer.
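The "constant hot feature" view can be checked numerically (values below are made up for illustration): appending a constant 1 to the input and the bias to the weight vector gives the same result as a separate bias term.

```python
import numpy as np

x = np.array([0.3, -0.7])    # instance features
w = np.array([1.0, 2.0])     # weights
b = 0.5                      # bias

with_bias_term = x @ w + b              # separate bias term

x_aug = np.append(x, 1.0)               # constant "hot" feature
w_aug = np.append(w, b)                 # bias absorbed into the weights
as_extra_node = x_aug @ w_aug           # same computation
```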

10
Q

What is backpropagation?

A

An algorithm for iteratively improving the prediction of the model by updating the model’s weights based on prediction error (loss)

11
Q

What makes backpropagation “backwards”?

A

Weight updates are propagated in the backward direction, from the output layer through each hidden layer down to the first hidden layer

12
Q

What are the 4 steps of backpropagation?

A

1) Initialize the weights
2) Propagate inputs forward
3) Backpropagate the error
4) Check the terminating condition
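The four steps can be sketched for the simplest possible case, a single linear unit trained by gradient descent on squared error (a made-up toy setup, not a full multi-layer implementation):

```python
import numpy as np

# Toy data: targets follow y = 2x + 1
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# 1) Initialize the weights (small random weight, zero bias)
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1)
b = 0.0
lr = 0.05

for epoch in range(2000):
    # 2) Propagate inputs forward
    pred = X * w + b
    # 3) Backpropagate the error: gradient of mean squared error
    err = pred - y
    loss = np.mean(err ** 2)
    w -= lr * np.mean(err * X)
    b -= lr * np.mean(err)
    # 4) Terminating condition: stop once the loss is small enough
    if loss < 1e-8:
        break
```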

13
Q

What is an epoch?

A

One complete pass through the entire training set, i.e. after every training instance has been included in some batch
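As an illustrative sketch (dataset and batch size made up), the batch/epoch bookkeeping looks like:

```python
import numpy as np

data = np.arange(10)       # 10 training instances (illustrative)
batch_size = 4

epochs_run = 0
for epoch in range(3):                           # run 3 epochs
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]   # batches of 4, 4, 2
        # ... forward pass, backpropagation, weight update ...
    epochs_run += 1    # every instance seen once -> one epoch done
```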

14
Q

What is stopping criteria?

A

A rule that decides when the number of epochs is sufficient, e.g. when the weight updates become very small, the error falls below a threshold, or a maximum number of epochs is reached
