M06 - Cutting edge paper Flashcards

1
Q

What is this article about?

A

Hebbian deep learning without feedback: replacing backpropagation with a training method that more closely matches how the brain learns.

2
Q

What problem does this paper address?

A

Deep learning models are not biologically realistic in how they assign credit for errors (backpropagation is not a plausible mechanism in the brain).

3
Q

What is the goal of this paper?

A
  • Train a deep learning model with a Hebbian learning rule
  • Create bio-plausible Machine Learning
4
Q

What were the results? (in sum)

A

With up to 5 hidden layers plus an added linear classifier, the network reached relatively high accuracies.

5
Q

What is synaptic plasticity?

A

Activity-dependent change in the strength (weight) of a synapse: when the pre- and post-synaptic neurons are co-active, the weight between them is updated.
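A minimal sketch of this idea as a plain Hebbian update (a generic illustration, not necessarily the exact rule used in the paper; the learning rate and shapes are made up for the example):

```python
import numpy as np

def hebbian_step(w, x, lr=0.1):
    """One Hebbian update: a weight grows in proportion to the
    co-activity of its pre-synaptic input (x) and post-synaptic
    output (y). Only locally available signals are used."""
    y = w @ x                        # post-synaptic activity
    return w + lr * np.outer(y, x)   # dw = lr * y * x^T

w = np.ones((2, 3))                  # toy weight matrix
x = np.array([1.0, 0.0, 2.0])        # toy pre-synaptic activity
w_new = hebbian_step(w, x)
# weights to the inactive input (x[1] == 0) are unchanged
```

Note that the update for each weight depends only on the two neurons it connects, which is what makes the rule local.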

6
Q

What is Spike-timing dependent plasticity?

A

The relative timing of the pre- and post-synaptic spikes determines the sign of the weight update.
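The classic exponential STDP window can be sketched as follows (the amplitudes and time constant are illustrative placeholders, not values from the paper):

```python
import numpy as np

def stdp_update(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Sign of the weight change depends on spike timing (ms).
    Pre fires before post (dt > 0): potentiation (LTP).
    Post fires before pre (dt <= 0): depression (LTD)."""
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * np.exp(-dt / tau)    # strengthen
    return -a_minus * np.exp(dt / tau)       # weaken

# pre at 10 ms, post at 15 ms -> pre "caused" post -> positive update
# post at 10 ms, pre at 15 ms -> negative update
```

The magnitude decays exponentially with the time gap, so near-coincident spikes produce the largest changes.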

7
Q

What are the problems with back-propagation?

A
  • weight transport
  • non-local plasticity
  • update locking
  • global loss function
8
Q

What does the weight-transport problem in backpropagation mean in connection to the neural network?

A

Backpropagation of the error requires the feedback pathway to reuse the forward synaptic weights, i.e. synapses (=weights) would have to be inverted or work backwards. This is not biologically plausible.

9
Q

What is the problem with non-local plasticity in back propagation in connection to the neural network?

A

In biological neural networks, weights are updated when the pre- and post-synaptic cells are co-active (a local signal). With BP, the teaching signal is computed later and at a different place (from the output loss), so plasticity is non-local.
= computational inefficiency
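The contrast above can be made concrete by comparing what information each update needs. This is a toy sketch with made-up shapes and a stand-in error vector; the point is only which quantities appear in each formula:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)            # pre-synaptic (input) activity
W1 = rng.normal(size=(4, 3))      # hidden-layer weights
W2 = rng.normal(size=(2, 4))      # downstream (output) weights
h = np.tanh(W1 @ x)               # post-synaptic (hidden) activity

# Local (Hebbian) update for W1: uses only pre (x) and post (h) activity,
# both available at the synapse itself.
dW1_local = np.outer(h, x)

# Backprop update for W1: also needs the output error AND the downstream
# weights W2 (weight transport) -- signals that are not local to W1.
err = rng.normal(size=2)          # stand-in output error
dW1_bp = np.outer((W2.T @ err) * (1 - h**2), x)
```

Both updates have the same shape as W1, but only the first can be computed from information available at the synapse at the time of activity.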
