Neural Networks Flashcards
Activation Function
A function applied to a neuron's weighted sum of inputs (plus a bias) to produce its output, acting as the 'gate' between the neuron and the next layer. Typically a nonlinear function such as sigmoid or ReLU.
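A minimal sketch of this idea, assuming NumPy and made-up input, weight, and bias values:

    import numpy as np

    def sigmoid(z):
        # Squashes any real value into (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    # Hypothetical neuron: weighted sum of inputs plus bias, passed through the activation
    x = np.array([0.5, -1.2, 3.0])   # inputs
    w = np.array([0.4, 0.7, -0.2])   # weights
    b = 0.1                          # bias
    output = sigmoid(np.dot(w, x) + b)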
ReLU
Rectified Linear Unit – an activation function that outputs 0 for negative inputs and the input itself for positive inputs: f(x) = max(0, x).
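A one-line NumPy sketch of ReLU:

    import numpy as np

    def relu(z):
        # 0 for negative inputs, identity for positive inputs
        return np.maximum(0.0, z)

    relu(np.array([-2.0, 0.0, 3.5]))  # -> [0.0, 0.0, 3.5]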
Feed Forward Neural Network
The simplest form of neural network: information flows in one direction, from the input layer through any hidden layers to the output layer, with no cycles. This is in contrast to RNNs, where connections can form loops. It has nothing to do with backpropagation.
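A minimal sketch of a two-layer feed-forward pass, assuming NumPy and hypothetical weight matrices W1, W2 and biases b1, b2:

    import numpy as np

    def forward(x, W1, b1, W2, b2):
        # Information flows strictly input -> hidden -> output; no loops
        h = np.maximum(0.0, W1 @ x + b1)   # hidden layer with ReLU
        return W2 @ h + b2                 # linear output layer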
Backpropagation
The training algorithm that propagates the error backward from the output layer, using the chain rule to compute the gradient of the cost with respect to each weight; gradient descent then uses these gradients to determine how the weights need to be updated based on how far off the output was.
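A minimal sketch of backpropagation for a one-hidden-layer network with a squared-error loss (illustrative only; the network shape and variable names are assumptions, not from the original card):

    import numpy as np

    def backprop(x, y, W1, b1, W2, b2):
        # Forward pass
        z1 = W1 @ x + b1
        h = np.maximum(0.0, z1)       # ReLU hidden layer
        y_hat = W2 @ h + b2           # linear output
        # Backward pass: chain rule from the loss toward the input
        d_out = y_hat - y             # dLoss/dy_hat for 0.5*(y_hat - y)^2
        dW2 = np.outer(d_out, h)
        db2 = d_out
        d_h = W2.T @ d_out
        d_z1 = d_h * (z1 > 0)         # ReLU gradient: 1 where z1 > 0, else 0
        dW1 = np.outer(d_z1, x)
        db1 = d_z1
        return dW1, db1, dW2, db2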
Gradient Descent
An optimization algorithm that computes, for each weight, the direction of change that most steeply reduces the cost function (the negative gradient), then updates the weights by a small step in that direction, repeating until the cost converges.
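A minimal sketch of a single vanilla gradient-descent update (the learning rate lr is an assumed hyperparameter):

    def gradient_descent_step(weights, grads, lr=0.01):
        # Step each parameter opposite its gradient to reduce the cost
        return [w - lr * g for w, g in zip(weights, grads)]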
Gradient Checking
Verifying the gradients computed by backpropagation against numerical gradients (finite-difference approximations of the cost function) in order to make sure the implementation is working correctly.
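A minimal sketch of gradient checking via central differences (the cost function and epsilon value are assumptions for illustration):

    import numpy as np

    def numerical_gradient(cost, w, eps=1e-5):
        # Central-difference approximation of dCost/dw, one weight at a time
        grad = np.zeros_like(w)
        for i in range(w.size):
            w_plus, w_minus = w.copy(), w.copy()
            w_plus.flat[i] += eps
            w_minus.flat[i] -= eps
            grad.flat[i] = (cost(w_plus) - cost(w_minus)) / (2 * eps)
        return grad

    # Compare against the analytic gradient from backprop; they should agree closely:
    # assert np.allclose(numerical_gradient(cost, w), analytic_grad, atol=1e-6)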