M06 - Cutting edge paper Flashcards
What is this article about?
Hebbian deep learning without feedback: the paper replaces backpropagation with a different way of training a network that matches more closely how the brain learns.
What problem does this paper address?
Deep learning models are not biologically realistic in how they assign credit to synapses (backpropagation is not biologically plausible).
What is the goal of this paper?
- Train a deep learning model with a Hebbian learning rule
- Create bio-plausible Machine Learning
What were the results? (in sum)
A network with up to 5 hidden layers plus an added linear classifier reached relatively high accuracies.
What is synaptic plasticity?
Activity-dependent change in the strength (weight) of a synapse. In Hebbian plasticity, the weight is increased when the pre- and post-synaptic neurons are co-active.
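A minimal sketch of a plain Hebbian weight update in NumPy (the learning rate, shapes, and values are illustrative assumptions, not the paper's exact rule):

```python
import numpy as np

def hebbian_update(w, x_pre, y_post, lr=0.01):
    """Plain Hebbian rule: strengthen a synapse when pre- and
    post-synaptic activity occur together (delta_w = lr * y * x)."""
    return w + lr * np.outer(y_post, x_pre)

# toy example: 3 pre-synaptic neurons, 2 post-synaptic neurons
w = np.zeros((2, 3))
x = np.array([1.0, 0.0, 1.0])        # pre-synaptic activity
y = w @ x + np.array([0.5, 0.0])     # post-synaptic activity (toy values)
w = hebbian_update(w, x, y)          # only co-active pairs change the weight
```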
What is Spike-timing dependent plasticity?
The relative timing of the pre- and post-synaptic spikes determines the sign of the weight update: if the pre-synaptic neuron fires shortly before the post-synaptic neuron, the synapse is strengthened; if it fires after, the synapse is weakened.
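A minimal sketch of a common exponential STDP window (the amplitudes and time constant are illustrative assumptions):

```python
import numpy as np

def stdp_delta_w(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one spike pair.
    dt = t_post - t_pre (ms): pre before post (dt > 0) -> potentiation,
    post before pre (dt < 0) -> depression."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau)
    return -a_minus * np.exp(dt / tau)

print(stdp_delta_w(+5.0))   # positive update: pre fired before post
print(stdp_delta_w(-5.0))   # negative update: post fired before pre
```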
What are the problems with back-propagation?
- weight transport
- non-local plasticity
- update locking
- global loss function
What does the weight-transport problem of backpropagation mean for a neural network?
Backpropagation of the error requires the same synapses (weights) to be used in reverse: the backward pass relies on the transpose of the forward weights. Biological synapses transmit in one direction, so this is not plausible.
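A minimal toy sketch of why this is called weight transport: the backward pass reuses the transpose of the forward weights (the network sizes and values below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # layer-1 weights (forward direction)
W2 = rng.normal(size=(2, 4))   # layer-2 weights (forward direction)

x = rng.normal(size=3)
h = np.tanh(W1 @ x)                      # forward pass
y = W2 @ h
delta_out = y - np.array([1.0, 0.0])     # output error (toy target)

# Backprop: the hidden-layer error is computed with W2.T, i.e. the
# *same* synapses must also carry signals backwards ("weight transport").
delta_hidden = (W2.T @ delta_out) * (1 - h**2)
```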
What does the non-local plasticity problem of backpropagation mean for a neural network?
In the brain, a weight is updated from locally available signals: the activity of its pre- and post-synaptic cells. With BP, the teaching signal for a weight is computed later and at a different place (from the output loss).
= computational inefficiency
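A minimal toy sketch of the contrast, assuming a one-hidden-layer network (all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x = rng.normal(size=3)
h = np.tanh(W1 @ x)                        # hidden activity, local to W1
delta_out = W2 @ h - np.array([1.0, 0.0])  # error, only known at the output

# Local (Hebbian) update: uses only signals available at the synapse.
dW1_local = np.outer(h, x)

# Backprop update: needs an error propagated back from the output,
# i.e. a teaching signal computed later and at a different place.
dW1_backprop = np.outer((W2.T @ delta_out) * (1 - h**2), x)
```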