Error Back Propagation Flashcards

1
Q

What is back propagation?

A

A supervised learning algorithm for training multi-layer perceptrons.

2
Q

What does the backpropagation algorithm look for and how?

A

The minimum value of the error function in weight space, which it finds using a technique called gradient descent (the delta rule).
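
A minimal sketch of that update rule, assuming a single weight w and a known error gradient (the names eta and dE_dw are mine, not from the card):

```python
# Gradient descent / delta rule: move the weight a small step
# against the gradient of the error function.
def gradient_descent_step(w, dE_dw, eta=0.1):
    """One update: w_new = w - eta * dE/dw (eta is the learning rate)."""
    return w - eta * dE_dw
```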

3
Q

What is considered the solution to the learning problem in backpropagation?

A

The weights that minimize the error function

-> the weight values at which the error becomes minimal

4
Q

How do we reach a solution in backpropagation?

A

We figure out whether we need to increase or decrease the weight value; the sign of the error gradient with respect to that weight tells us which.
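
For instance, with a toy error function E(w) = (w - 3)^2 (the function and values here are illustrative, not from the card), the sign of the derivative at the current weight tells us which way to move:

```python
# Toy error function E(w) = (w - 3)^2, minimised at w = 3.
def gradient(w):
    return 2 * (w - 3)        # dE/dw

w = 5.0
g = gradient(w)               # g = 4.0 > 0: the error rises as w rises
print("decrease" if g > 0 else "increase")   # -> decrease the weight
```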

5
Q

What happens after weight value is increased or decreased?

A

We keep updating the weight value in that direction until the error reaches its minimum.

6
Q

How do we know when to stop changing the weight value?

A

We reach a point where any further update to the weight would increase the error. At that point we stop, and that is the final weight value.
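
A sketch of that stopping rule on the same kind of toy error function (values and names are illustrative):

```python
def error(w):                          # toy error: E(w) = (w - 3)^2
    return (w - 3) ** 2

w, eta = 5.0, 0.1
while True:
    w_next = w - eta * 2 * (w - 3)     # step against dE/dw = 2(w - 3)
    if error(w_next) >= error(w):      # a further update would not reduce error
        break                          # stop: w is the final weight value
    w = w_next
print(round(w, 3))                     # ~3.0, where the error is minimal
```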

7
Q

What are the steps to error back propagation?

A

Step 1: Forward Propagation

Step 2: Backward Propagation

Step 3: Putting all the values together and calculating the updated weight value
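
A worked sketch of the three steps for a toy network with one hidden sigmoid unit (the architecture, initial weights, and names are mine, chosen for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x1, x2 = 0.5, 0.1            # inputs
t = 1.0                      # target output
w1, w2 = 0.4, -0.2           # input -> hidden weights
v = 0.7                      # hidden -> output weight
eta = 0.5                    # learning rate

# Step 1: forward propagation
h = sigmoid(w1 * x1 + w2 * x2)          # hidden activation
y = sigmoid(v * h)                      # network output
E = 0.5 * (t - y) ** 2                  # squared error

# Step 2: backward propagation (chain rule; sigmoid' = y * (1 - y))
delta_out = (y - t) * y * (1 - y)       # error signal at the output
dE_dv = delta_out * h                   # gradient for hidden -> output weight
delta_h = delta_out * v * h * (1 - h)   # error signal pushed back to hidden
dE_dw1, dE_dw2 = delta_h * x1, delta_h * x2

# Step 3: put the values together and compute the updated weights
v, w1, w2 = v - eta * dE_dv, w1 - eta * dE_dw1, w2 - eta * dE_dw2
```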

8
Q

How does EBP relate to perceptrons?

A

It is a generalization of the delta rule for perceptrons to multilayer feedforward neural networks.

9
Q

Why is it called backwards propagation?

A

The name stems from the fact that the calculation of the gradient proceeds backwards through the network: the gradient of the final layer of weights is calculated first, and the gradient of the first layer of weights is calculated last.

10
Q

How is information reused in backwards propagation?

A

Partial computations of the gradient from one layer are reused in the computation of the gradient for the previous layer
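
A sketch of that reuse for a stack of sigmoid layers (the function name and indexing convention are mine): the delta vector already computed for the layer above is combined with a single matrix product to get the delta for the layer below, instead of re-deriving each layer's gradient from scratch:

```python
import numpy as np

def backward_deltas(delta_last, weights, activations):
    """delta_last: error signal of the output layer.
    weights: one matrix per connection, ordered from input side to output side.
    activations: sigmoid outputs of each hidden layer, in the same order.
    Returns the delta (error signal) of every layer."""
    deltas = [delta_last]
    for W, a in zip(reversed(weights), reversed(activations)):
        # reuse the delta of the layer above: one matrix product suffices
        deltas.insert(0, (W.T @ deltas[0]) * a * (1 - a))  # a*(1-a) = sigmoid'
    return deltas
```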

11
Q

Why is it useful to have backwards flow?

A

The backwards flow of error information allows efficient computation of the gradient at each layer, in contrast to the naive approach of calculating the gradient of each layer separately.

12
Q

What do synaptic weight modifications depend on?

A

only on the activity of presynaptic and postsynaptic neurons

13
Q

What features of the brain does error back propagation include?

A

spike-timing-dependent plasticity, patterns of neural activity during learning, and properties of pyramidal neurons and cortical microcircuits

14
Q

In what way are neuronal synaptic processes related to error back propagation?

A

They include both feedforward and feedback connections, thereby allowing information about the errors made by the network to propagate through it without requiring an external program to compute the errors.

15
Q

What common framework exists between error back propagation and neuroplasticity?

A

energy minimisation

16
Q

What is the purpose of forward propagation in EBP?

A

To check the network's current performance.

The calculation flow goes in the natural forward direction: from the input -> through the neural network -> to the output.
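
A minimal forward-pass sketch, assuming fully connected sigmoid layers (shapes and names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate the input through each layer in order and return the output."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)   # each layer's output feeds the next layer
    return a                     # final output, compared against the target
```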