Computational Neuroscience 3 Flashcards
Function of the hidden layer in a multilayer network
Contains the distributed representation
Training for similar stimuli …
…reinforces the original one, e.g. training a network already trained on yellow with orange also reinforces yellow. The same effect is seen in animals, so the distributed representation is supported in vivo.
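This generalisation effect can be sketched with a toy delta-rule (Widrow-Hoff) network. The feature vectors for "yellow" and "orange" below are hypothetical, chosen only so that the two stimuli overlap:

```python
import numpy as np

# Hypothetical overlapping feature vectors for two similar stimuli
yellow = np.array([1.0, 1.0, 0.0, 0.0])
orange = np.array([1.0, 1.0, 1.0, 0.0])   # shares two features with yellow

w = np.zeros(4)
lr, target = 0.1, 1.0

# Partially train the network on yellow (Widrow-Hoff / delta rule)
for _ in range(3):
    w += lr * (target - w @ yellow) * yellow
response_before = w @ yellow

# One training step on orange...
w += lr * (target - w @ orange) * orange
response_after = w @ yellow   # ...also strengthens the response to yellow

print(response_before, response_after)
```

Because the two stimuli share features, and therefore share weights, training on orange strengthens the shared weights and so reinforces the response to yellow as well.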
Backpropagation
A mechanism in which weights are adjusted starting from those in the output layer and working back towards the input layer. It is a generalisation of the Widrow-Hoff rule.
It is now the standard algorithm for training networks, as the Widrow-Hoff rule cannot train multilayer networks (MLNs).
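A minimal sketch of backpropagation on a one-hidden-layer network (sigmoid units, squared error). The XOR task, learning rate, and layer sizes are illustrative choices, not from the source; XOR is used because a single-layer Widrow-Hoff network cannot solve it:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: a task the Widrow-Hoff rule (single layer) cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

losses = []
for _ in range(5000):
    # Forward pass: the hidden layer holds the distributed representation
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    losses.append(np.mean((Y - T) ** 2))
    # Backward pass: start from the output-layer error...
    dY = (Y - T) * Y * (1 - Y)
    # ...and propagate it back through the hidden layer
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print(losses[0], losses[-1])   # the error shrinks as training proceeds
```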
Unsupervised learning
Learning by simple exposure, e.g. the autoassociative networks in the hippocampus
Unsupervised learning requires…
…a different neuronal architecture and/or a different rule
Animals with hippocampal damage…
…cannot learn unsupervised.
Network structure compatible with unsupervised learning
CA3
Pattern completion
(A) Two different patterns are stored in the network
(B) An incomplete pattern is presented
(C) Because of the synapses linking the nodes in the pattern assembly, activity spreads.
(D) The pattern is recovered.
Pattern recognition
(A) Two different patterns are stored in the network
(B) A pattern not stored in the network is presented
(C) Strong synapses spread the activity.
(D) Activity in cells that do not belong to a stored pattern decays, and a stored pattern becomes active.
The Hopfield Model
A model for autoassociative memory.
Every neurone synapses onto every other neurone.
Weights are set according to a Hebbian rule.
Each vector (pattern) is represented as the desired activation state of the network for that pattern.
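The Hebbian weight-setting step can be sketched for ±1 patterns as an outer-product sum (a common formulation of the Hopfield storage rule; the example patterns here are arbitrary):

```python
import numpy as np

def store(patterns):
    """Hebbian rule: w_ij grows when units i and j are co-active across patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n    # sum of outer products p p^T
    np.fill_diagonal(W, 0.0)         # no neurone synapses onto itself
    return W

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]], dtype=float)
W = store(patterns)

# Each stored pattern is a fixed point of the network: sign(W p) == p
for p in patterns:
    assert np.array_equal(np.sign(W @ p), p)
```

The weight matrix is symmetric with a zero diagonal, which is what makes the desired activation states stable.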
Behaviour of the Hopfield Model
The model has a temporal behaviour (dynamics) such that it can evolve over time.
The ‘memories’ stored are stable attractors of the dynamics.
If the model is initialised close to a memory (e.g. with a partial pattern), it will evolve towards the closest attractor, thus retrieving the stored memory by association.
The dynamics always move downhill in an energy function, settling into the nearest trough (local minimum).
When presented with an incomplete or distorted pattern, the Hopfield network…
…evolves to the closest attractor (stored memory)
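Retrieval from a distorted cue can be sketched by iterating the update rule until the state settles into the nearest attractor. The patterns, the corruption of the cue, and the sweep-based update schedule below are illustrative choices:

```python
import numpy as np

def energy(W, x):
    return -0.5 * x @ W @ x          # each update can only lower this quantity

def recall(W, x, sweeps=10):
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):      # update one unit at a time
            x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    return x

# Hebbian storage of two patterns (outer-product rule, zero diagonal)
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]], dtype=float)
W = patterns.T @ patterns / patterns.shape[1]
np.fill_diagonal(W, 0.0)

cue = patterns[0].copy()
cue[0] = -cue[0]                     # present a distorted version of pattern 0
out = recall(W, cue)

print(np.array_equal(out, patterns[0]))   # the full memory is retrieved
print(energy(W, cue), energy(W, out))     # the state rolled downhill in energy
```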
The hippocampus and autoassociation
Area CA3 receives input from the entorhinal cortex (EC), carrying highly processed information from sensory cortex. It also receives input from the dentate gyrus (DG), which in turn receives its input from EC. CA3 then sends information to CA1 and to the fornix, and CA1 projects on to S.
The connections from DG to CA3 are called Mossy Fibers. They make sparse and powerful synapses onto CA3 neurons.
CA3 has a high degree of internal recurrency and random connectivity.
Each CA3 neuron in the rat receives 4000 synapses from EC and 12000 from CA3 itself. That is a level of recurrency orders of magnitude higher than in other brain areas.
The connections between CA3 neurons are modified by LTP very quickly (within minutes to hours).
Difference between sensory system architecture and hippocampal architecture
Unlike sensory systems, which have a clear direction of information flow, the hippocampal architecture is heavily recurrent.
Storage
(A) Synapses from Mossy Fibers elicit strong activity in a cell assembly
(B) Synapses within the assembly strengthen via LTP (Hebbian learning)