Neural Networks Flashcards

1
Q

Activation Function

A

The ‘gate’ between a neuron’s weighted input and its output to the next layer. A nonlinear function (typically sigmoid or ReLU) applied to the weighted sum of the neuron’s inputs.
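A minimal sketch of one common choice, the sigmoid, which squashes the weighted sum into (0, 1) (the function choice here is just an illustration):

```python
import math

def sigmoid(x):
    # Maps any real-valued weighted sum into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Weighted sum of inputs, then the activation 'gate'
weights, inputs, bias = [0.4, -0.2], [1.0, 2.0], 0.1
z = sum(w * x for w, x in zip(weights, inputs)) + bias
output = sigmoid(z)
```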


2
Q

ReLU

A

Rectified Linear Unit – an activation function that outputs 0 for negative inputs and the input itself otherwise: f(x) = max(0, x).
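The definition above is a one-liner in code:

```python
def relu(x):
    # 0 for negative inputs, the input unchanged otherwise
    return max(0.0, x)
```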

3
Q

Feed Forward Neural Network

A

Simplest form of neural network: information flows in one direction, from input to output. This is in contrast to RNNs, where connections can form loops. The term describes the network’s architecture and has nothing to do with backpropagation.

4
Q

Backpropagation

A

The training algorithm that propagates the error backward from the output layer, using the chain rule to work out how much each weight contributed to how off the output was, so gradient descent can update the weights accordingly.
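A toy single-neuron sketch of the idea, assuming a sigmoid activation and squared-error loss (both are illustrative assumptions): the chain rule carries the error backward from the loss to the weight and bias:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(w, b, x, target, lr=0.1):
    # Forward pass
    y = sigmoid(w * x + b)
    # Backward pass: chain rule through loss L = (y - target)^2
    dL_dy = 2 * (y - target)
    dy_dz = y * (1 - y)           # derivative of sigmoid
    grad_w = dL_dy * dy_dz * x    # how much w contributed to the error
    grad_b = dL_dy * dy_dz
    # Gradient descent update: step opposite the gradient
    return w - lr * grad_w, b - lr * grad_b

w, b = 0.5, 0.0
for _ in range(200):
    w, b = train_step(w, b, x=1.0, target=1.0)
```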

5
Q

Gradient Descent

A

For each weight, the gradient of the cost function tells us which direction of change increases the cost most steeply. Gradient descent repeatedly steps the weights in the opposite direction to minimize the cost.
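A minimal one-dimensional sketch: given the gradient of a cost function, repeatedly step against it (the example cost (x − 3)² and its minimum at 3 are chosen for illustration):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Repeatedly step opposite the gradient to minimize the cost
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Example cost: f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=10.0)
```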

6
Q

Gradient Checking

A

Verifying the gradients computed during backprop by comparing them against a numerical (finite-difference) approximation, in order to make sure the backprop implementation is correct.
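A minimal sketch of the check on a one-variable function (the function x² and its analytic gradient 2x stand in for the network’s cost and backprop gradients):

```python
def numerical_grad(f, x, eps=1e-5):
    # Central finite-difference approximation of df/dx
    return (f(x + eps) - f(x - eps)) / (2 * eps)

f = lambda x: x ** 2
analytic_grad = lambda x: 2 * x  # what backprop would compute

# The check: the two gradients should agree to within a small tolerance
x = 1.5
difference = abs(numerical_grad(f, x) - analytic_grad(x))
```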
