Error Back Propagation Flashcards
What is back propagation?
a supervised learning algorithm for training multi-layer perceptrons
What does the backpropagation algorithm look for and how?
the minimum value of the error function in weight space using a technique called the delta rule or gradient descent
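A minimal sketch of that update rule, assuming a single weight w, an error function E(w) whose derivative we can evaluate, and an illustrative learning rate eta (the names and the toy error function are assumptions for the example, not from the source):

```python
# Gradient-descent sketch for a single weight: step against the derivative
# of the error so the error decreases (illustrative names throughout).
def gradient_descent_step(w, dE_dw, eta=0.1):
    """One delta-rule style update: move the weight downhill on the error."""
    return w - eta * dE_dw

# Toy error E(w) = (w - 3)**2 has its minimum at w = 3; dE/dw = 2*(w - 3).
w = 0.0
for _ in range(100):
    w = gradient_descent_step(w, dE_dw=2 * (w - 3))
print(round(w, 4))  # converges towards 3.0
```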
What is considered the solution to the learning problem in backpropagation?
The weights that minimize the error function
-> the weight values at which the error reaches its minimum
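Written out, assuming a squared-error cost over the network outputs (a common choice, though not the only one), the "solution" is the weight vector that minimizes E:

```latex
E(\mathbf{w}) = \tfrac{1}{2}\sum_{k}\bigl(t_k - y_k(\mathbf{w})\bigr)^2,
\qquad
\mathbf{w}^{*} = \arg\min_{\mathbf{w}} E(\mathbf{w})
```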
How do we reach a solution in backpropagation?
figure out whether we need to increase or decrease the weight value
What happens after weight value is increased or decreased?
keep updating the weight value in that direction until the error reaches its minimum
How do we know when to stop changing the weight value?
you reach a point where any further update to the weight would increase the error; at that point you stop, and that is your final weight value
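A sketch of that stopping rule, assuming we can evaluate the error E(w) directly and simply probe a small step in each direction (a toy illustration of the idea, not the actual algorithm, which uses the gradient rather than probing):

```python
# Illustrative descent loop for one weight: keep stepping in whichever
# direction lowers the error, stop once any further step would raise it.
def minimize_weight(E, w, step=0.01, max_iters=10_000):
    for _ in range(max_iters):
        current = E(w)
        if E(w + step) < current:      # increasing the weight helps
            w += step
        elif E(w - step) < current:    # decreasing the weight helps
            w -= step
        else:                          # any further update raises the error
            break
    return w

# Example with E(w) = (w - 2)**2: the loop settles near w = 2.
print(round(minimize_weight(lambda w: (w - 2) ** 2, w=0.0), 2))
```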
What are the steps to error back propagation?
Step 1: Forward Propagation
Step 2: Backward Propagation
Step 3: Putting all the values together and calculating the updated weight value
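Putting the three steps above together, a compact sketch for one training example, assuming a tiny 2-2-1 network with sigmoid units, a squared-error cost, and no bias terms; the layer sizes, learning rate, and data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative data: one input pattern and its target.
x = np.array([0.05, 0.10])          # inputs
t = np.array([0.99])                # target
W1 = rng.normal(size=(2, 2))        # hidden-layer weights
W2 = rng.normal(size=(1, 2))        # output-layer weights
eta = 0.5                           # learning rate

# Step 1: forward propagation.
h = sigmoid(W1 @ x)                 # hidden activations
y = sigmoid(W2 @ h)                 # network output

# Step 2: backward propagation of the error.
delta_out = (y - t) * y * (1 - y)               # dE/dz at the output layer
delta_hid = (W2.T @ delta_out) * h * (1 - h)    # dE/dz at the hidden layer

# Step 3: combine the pieces into the weight updates.
W2 -= eta * np.outer(delta_out, h)
W1 -= eta * np.outer(delta_hid, x)

print("error before update:", 0.5 * np.sum((t - y) ** 2))
```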
How does EBP relate to perceptrons?
It is a generalization of the delta rule for perceptrons to multilayer feedforward neural networks.
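In symbols (one common notation, not necessarily the source's): the delta rule for a single-layer perceptron uses the directly observable output error, while backpropagation keeps the same form of update but with an error term delta_j obtained by propagating errors backwards through the layers:

```latex
\Delta w_{ji} = \eta\,(t_j - y_j)\,x_i
  % delta rule, single layer
\qquad
\Delta w_{ji} = -\eta\,\frac{\partial E}{\partial w_{ji}} = \eta\,\delta_j\,x_i
  % backpropagation, any layer
```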
Why is it called backwards propagation?
the name stems from the fact that the calculation of the gradient proceeds backwards through the network: the gradient of the final layer of weights is calculated first and the gradient of the first layer of weights is calculated last
How is information reused in backwards propagation?
Partial computations of the gradient from one layer are reused in the computation of the gradient for the previous layer
Why is it useful to have backwards flow?
the backward flow of error information allows the gradient at each layer to be computed efficiently, compared with the naive approach of calculating the gradient for each layer separately
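A sketch of that reuse, assuming a stack of sigmoid layers with activations cached from the forward pass: the delta computed for one layer is exactly the partial result needed to compute the delta for the layer before it (function and variable names are illustrative):

```python
import numpy as np

def backward(weights, activations, delta_out):
    """Propagate error deltas backwards, reusing each delta for the next layer.

    weights[l] maps activations[l] to activations[l + 1]; activations are the
    cached sigmoid outputs of the forward pass; delta_out is dE/dz at the output.
    """
    grads = []
    delta = delta_out
    for W, a_prev in zip(reversed(weights), reversed(activations[:-1])):
        # The gradient for this layer's weights comes straight from the
        # already-computed delta and the cached activation below it.
        grads.append(np.outer(delta, a_prev))
        # Partial computation handed backwards: the delta for the layer below
        # (the final one, nominally for the input layer, is simply unused).
        delta = (W.T @ delta) * a_prev * (1 - a_prev)
    return list(reversed(grads))

# Illustrative use with cached activations for a 3-2-1 stack.
acts = [np.array([0.2, 0.4, 0.6]), np.array([0.5, 0.7]), np.array([0.3])]
Ws = [np.ones((2, 3)), np.ones((1, 2))]
for g in backward(Ws, acts, delta_out=np.array([0.1])):
    print(g.shape)  # (2, 3) then (1, 2): one gradient per weight matrix
```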
What do synaptic weight modifications depend on?
only on the activity of presynaptic and postsynaptic neurons
What features of the brain have been related to error back propagation?
spike-timing-dependent plasticity, patterns of neural activity during learning, and the properties of pyramidal neurons and cortical microcircuits
In what way are neuronal synaptic processes related to error back propagation?
they include both feedforward and feedback connections, which allow information about the errors made by the network to propagate through it without requiring an external program to compute those errors
What common framework exists between error back propagation and neuroplasticity?
energy minimisation