Exercise 5 - Hebbian Learning Flashcards
3 generations of neuron models
1st generation: binary outputs (McCulloch-Pitts units)
2nd generation: real-valued outputs (e.g. sigmoidal units)
3rd generation: spiking neurons that communicate via action potentials
learning
finding the right weights to solve the task
universal approximation theorem
- a feed-forward network with a single hidden layer (and sufficiently many hidden units) can approximate any continuous function arbitrarily well
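A rough illustration of this statement, as a minimal sketch (all names, sizes, and constants below are illustrative, not part of the exercise): a single hidden layer of random tanh units with a least-squares readout fits a smooth target function, and the error shrinks as the number of hidden units grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function to approximate on [-pi, pi]
x = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(3 * x)

# Single hidden layer of random tanh features; only the readout is fitted.
n_hidden = 100                       # more units -> better approximation
W = rng.normal(scale=2.0, size=(1, n_hidden))
b = rng.normal(scale=2.0, size=n_hidden)
H = np.tanh(x @ W + b)               # hidden activations, shape (200, n_hidden)

# Linear readout fitted by least squares
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ w_out

print("max abs error:", np.max(np.abs(y - y_hat)))
```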
What type of learning is Hebbian learning?
unsupervised learning
Two classes of learning rules in Hebbian learning
rate-based Hebbian rules
precise-timing-based rules (e.g. STDP)
Hebbian learning
When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process takes place such that A’s efficiency is increased.
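A minimal sketch of the simplest rate-based rule implied by this postulate, dw = eta * (presynaptic rate) * (postsynaptic rate); the variable names and constants are illustrative assumptions:

```python
import numpy as np

def hebb_step(w, x, y, eta=0.01):
    """Plain rate-based Hebb rule: dw = eta * (presynaptic rate) * (postsynaptic rate).
    Without extra mechanisms (saturation, competition) the weights grow without bound."""
    return w + eta * y * x

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=5)      # weights from 5 presynaptic neurons
for _ in range(1000):
    x = rng.random(5)                  # presynaptic firing rates
    y = w @ x                          # linear postsynaptic rate
    w = hebb_step(w, x, y)
print(w)                               # weight norm keeps increasing -> motivates Oja/BCM-type bounds
```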
Properties of hebbian learning
- saturation: avoids unbounded growth of the synaptic weights
- competition -> selectivity: prevents all weights from converging to the same value
- locality: the weight change depends only on locally available variables (pre- and postsynaptic activity and the current weight)
Does Oja’s rule satisfy all properties?
Yes - the decay term bounds the weight norm (saturation), the weights compete for this fixed norm (selectivity), and the update uses only local variables (locality).
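A minimal sketch of Oja's rule, dw = eta * y * (x - y * w), showing that the weight vector stays bounded and aligns with the leading principal component of the input; the constants and the input distribution are illustrative assumptions:

```python
import numpy as np

def oja_step(w, x, eta=0.005):
    """Oja's rule: dw = eta * y * (x - y * w).
    The -y^2 * w term keeps ||w|| bounded (saturation) and the update is local."""
    y = w @ x
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(1)
# Zero-mean inputs with one dominant direction of variance
C = np.array([[3.0, 1.0], [1.0, 1.0]])
X = rng.multivariate_normal(mean=[0, 0], cov=C, size=5000)

w = rng.normal(size=2)
for x in X:
    w = oja_step(w, x)

print("learned w:", w, "norm:", np.linalg.norm(w))
print("top eigenvector:", np.linalg.eigh(C)[1][:, -1])   # w aligns with this (up to sign)
```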
Does covariance learning with sliding threshold fulfil all properties?
Yes - the sliding threshold prevents runaway potentiation (saturation) and makes inputs compete (selectivity), and the update depends only on local variables (locality).
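One common instantiation, as a hedged sketch (a BCM-like form; the exact rule used in the exercise may differ): dw = eta * x * y * (y - theta), where the sliding threshold theta tracks a running average of y^2, which is what provides saturation and competition. Names and parameters are illustrative:

```python
import numpy as np

def sliding_threshold_step(w, theta, x, eta=0.001, tau=100.0):
    """BCM-like Hebbian rule with a sliding threshold:
    dw = eta * x * y * (y - theta); theta tracks a running average of y^2.
    High recent activity raises theta, which limits further potentiation."""
    y = max(w @ x, 0.0)                      # rectified postsynaptic rate
    w = np.clip(w + eta * x * y * (y - theta), 0.0, None)
    theta = theta + (y ** 2 - theta) / tau   # sliding threshold
    return w, theta

rng = np.random.default_rng(3)
w, theta = rng.random(4) * 0.5 + 0.5, 1.0
patterns = np.eye(4)                         # four competing input patterns
for _ in range(20000):
    w, theta = sliding_threshold_step(w, theta, patterns[rng.integers(4)])
print("weights:", w)                         # often ends up selective to one pattern
```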
Spike-timing dependent plasticity (STDP)
- a temporally asymmetric form of Hebbian learning induced by tight temporal correlations between the spikes of pre- and postsynaptic neurons
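A minimal sketch of a pair-based STDP window, with potentiation when the presynaptic spike precedes the postsynaptic spike and depression otherwise; the amplitudes and time constants are illustrative assumptions:

```python
import numpy as np

def stdp(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window.
    delta_t = t_post - t_pre (ms): positive -> pre before post -> potentiation (LTP);
    negative -> post before pre -> depression (LTD)."""
    if delta_t >= 0:
        return a_plus * np.exp(-delta_t / tau_plus)
    return -a_minus * np.exp(delta_t / tau_minus)

for dt in (-40, -10, -1, 1, 10, 40):
    print(f"t_post - t_pre = {dt:+4d} ms  ->  dw = {stdp(dt):+.4f}")
```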