Intro to Neural Networks Flashcards
What is backpropagation? What is its goal?
Backpropagation is a common method for training a neural network.
Our goal with backpropagation is to update each of the weights in the network so that they move the actual output closer to the target output, thereby minimizing the error for each output neuron and for the network as a whole.
What is the equation for calculating total error?
What is it called?
Squared error function
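The equation itself appears to have been an image on the original card; the standard form of the squared error function this card refers to is (using the common convention where the ½ cancels the exponent on differentiation):

```latex
E_{total} = \sum \tfrac{1}{2}\left(target - output\right)^2
```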
How do you successfully perform a backward pass?
What is the power rule and what do you need to remember about it?
When the inner function is more complicated, you have to remember to take the derivative of the inner function as well (the chain rule). In the simple case this step goes unnoticed because the derivative of x is just 1, but the calculation was still performed regardless.
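As a worked instance of that note (my own example, not from the card), compare the power rule on a bare x against a more complicated inner function, where the chain rule contributes a factor beyond 1:

```latex
\frac{d}{dx}\,x^2 = 2x \cdot \frac{d}{dx}x = 2x \cdot 1
\qquad\text{vs.}\qquad
\frac{d}{dx}\left(3x^2 + 1\right)^2 = 2\left(3x^2 + 1\right) \cdot 6x
```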
What is the derivative of the logistic function?
Note the final form of f(x)(1-f(x))
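The derivation itself seems to have been an image; the standard derivative of the logistic function, ending in the f(x)(1-f(x)) form the card highlights, is:

```latex
f(x) = \frac{1}{1 + e^{-x}}
\qquad
f'(x) = \frac{e^{-x}}{\left(1 + e^{-x}\right)^2}
      = \frac{1}{1 + e^{-x}} \cdot \frac{e^{-x}}{1 + e^{-x}}
      = f(x)\left(1 - f(x)\right)
```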
What are the parts of the node you need to remember when taking the derivative for new weight calculations?
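The answer to this card is missing (likely an image). A standard decomposition consistent with the question: each node is split into its net input and its output, because the chain rule for a weight passes through both:

```latex
net = \sum_i w_i x_i + b,
\qquad
out = f(net),
\qquad
\frac{\partial E}{\partial w_i}
  = \frac{\partial E}{\partial out}
  \cdot \frac{\partial out}{\partial net}
  \cdot \frac{\partial net}{\partial w_i}
```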
What are the steps to calculating a new weight with backpropagation?
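The steps themselves appear to be missing from this card, so here is a hedged sketch of the chain-rule update for a single output-layer weight with a logistic activation and squared error. All names and numbers are illustrative assumptions, not taken from the cards:

```python
import math

def sigmoid(x):
    """Logistic activation f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def update_weight(w, h, b, target, lr):
    """One gradient-descent step on a single weight w,
    where net = w*h + b and out = sigmoid(net)."""
    net = w * h + b
    out = sigmoid(net)
    # Chain rule: dE/dw = dE/dout * dout/dnet * dnet/dw
    dE_dout = -(target - out)        # from E = 0.5*(target - out)^2
    dout_dnet = out * (1 - out)      # logistic derivative f(x)(1 - f(x))
    dnet_dw = h                      # net is linear in w
    grad = dE_dout * dout_dnet * dnet_dw
    return w - lr * grad             # step against the gradient
```

One update with these (hypothetical) values should move the output toward the target and reduce the squared error.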
What do you need to remember with this calculation?
This is just the calculation for one weight in one layer, so a few more calculations need to be performed for the remaining weights and layers.
And when backpropagating the error to the hidden layer, you use the original output-layer weights, not the newly updated ones.
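The points above can be sketched as a minimal two-layer backward pass. Everything here (one input, one hidden neuron, one output neuron, the values in the comments) is an illustrative assumption; the key line is that the hidden-layer delta uses the original output weight:

```python
import math

def sigmoid(x):
    """Logistic activation f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def backward_pass(i, w_ih, w_ho, target, lr):
    """One forward + backward pass for a tiny 1-1-1 network
    (no biases, for brevity). Returns the updated weights."""
    # Forward pass
    h = sigmoid(w_ih * i)
    o = sigmoid(w_ho * h)
    # Output-layer delta: dE/dnet_o for squared error + logistic
    delta_o = -(target - o) * o * (1 - o)
    new_w_ho = w_ho - lr * delta_o * h
    # Hidden-layer delta uses the ORIGINAL w_ho, not new_w_ho
    delta_h = delta_o * w_ho * h * (1 - h)
    new_w_ih = w_ih - lr * delta_h * i
    return new_w_ih, new_w_ho
```

With both weights updated this way, a single step should shrink the network's squared error on that training pair.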