Lecture 10 Flashcards
3 principles of neural network processing
- Distributed representations
- Connectivity
- Excitation/inhibition
Why are representations distributed?
They are robust against noise and damage, give increased storage capacity, and allow more dynamic and flexible network behaviour
Connections between neurons (the synapses) are …
Plastic
LTP (long-term potentiation)
What fires together, wires together (Hebbian plasticity)
Long term neuronal representation
Relatively strong synapses between the neurons of the ensemble
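The Hebbian rule behind the two cards above ("fire together, wire together" leading to strong synapses within an ensemble) can be sketched numerically. This is a minimal illustration with assumed sizes and learning rate, not a biological model:

```python
import numpy as np

# Minimal Hebbian-learning sketch: the weight between two neurons grows in
# proportion to the product of their activities, so co-active neurons end up
# strongly interconnected (the "ensemble").

rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=8).astype(float)  # one ensemble's activity (0/1)

eta = 0.1                              # learning rate (assumed value)
W = np.zeros((8, 8))                   # synaptic weights within one layer
W += eta * np.outer(pattern, pattern)  # Hebbian update: dW_ij = eta * x_i * x_j
np.fill_diagonal(W, 0.0)               # no self-connections

# W is now nonzero exactly between pairs of neurons that were co-active.
```

After this update, the sub-network of co-active neurons shares relatively strong synapses, which is the long-term representation the card describes.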
What happens when you remember something?
A stimulus activates part of the neuronal representation (the cue); the cued part of the ensemble then activates the rest of the pattern
Autoassociation
Local synaptic strengthening (within one layer) inside a single representation
Heteroassociation
Synaptic strengthening between different representations: different aspects of a stimulus, coded in different layers
General principles of connectivity
Divergence
Convergence
Point-to-point connectivity
(these patterns have a role in parallel processing)
Divergence
Spread of information from one source cell to multiple target cells
Convergence
Compression (combination) of information from multiple source cells onto one target cell
Point-to-point (Topological)
The number of source and target neurons is the same; information is copied from the source layer to the target layer
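The three connectivity patterns above can be expressed as connection matrices between a source and a target layer. A toy sketch with assumed layer sizes:

```python
import numpy as np

# Connectivity patterns as connection matrices (toy sizes, all assumed):
# rows index target cells, columns index source cells.

divergence = np.ones((3, 1))       # 1 source cell -> 3 target cells
convergence = np.ones((1, 3))      # 3 source cells -> 1 target cell
point_to_point = np.eye(4)         # 4 sources -> 4 targets, one-to-one

source = np.array([2.0])
spread = divergence @ source       # one value is spread to all targets

layer = np.array([1.0, 2.0, 3.0, 4.0])
combined = convergence @ layer[:3]   # several inputs compressed into one cell
copied = point_to_point @ layer      # information copied unchanged
```

The matrix shapes make the parallel-processing role visible: divergence fans information out, convergence compresses it, and point-to-point (topological) connectivity preserves it.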
Where in the hierarchy are connections often more plastic?
On higher levels
Pattern completion
In memory retrieval: the cued part of the ensemble activates the rest of the pattern (the most strongly interconnected neurons)
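Pattern completion can be sketched with a Hopfield-style autoassociative memory (an assumed computational model, not from the lecture): Hebbian weights store one pattern, and a partial cue recovers the whole ensemble.

```python
import numpy as np

# Hopfield-style pattern-completion sketch (assumed model, +1/-1 activities):
# the most strongly interconnected neurons pull the cued part of the
# ensemble toward the full stored pattern.

pattern = np.array([1, 1, -1, -1, 1, -1, 1, -1])  # stored ensemble
W = np.outer(pattern, pattern).astype(float)       # Hebbian weights
np.fill_diagonal(W, 0.0)                           # no self-connections

cue = pattern.astype(float)
cue[4:] = 0.0        # the stimulus activates only part of the representation

state = cue.copy()
for _ in range(5):              # iterate until the pattern completes
    state = np.sign(W @ state)  # each neuron sums its weighted inputs

# state now matches the full stored pattern
```

With a single stored pattern the cued half recruits the rest in one update; with several stored patterns the same dynamics settle on whichever pattern overlaps the cue most.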
Mechanisms through which inhibition is executed
Feedforward and feedback
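A toy rate model (all weights and values assumed) contrasts the two inhibition mechanisms: in feedforward inhibition the input itself drives an inhibitory interneuron, while in feedback inhibition the target's own output does.

```python
# Toy rate-model sketch of feedforward vs. feedback inhibition
# (weights and input are assumed illustration values).

x = 1.0                      # excitatory input
w_exc, w_inh = 1.0, 0.5      # excitatory and inhibitory weights

# Feedforward: the input drives the interneuron, which suppresses the
# target alongside the direct excitation.
target_ff = max(0.0, w_exc * x - w_inh * x)

# Feedback: the target's output drives the interneuron, which suppresses
# the target on the next step (a negative feedback loop).
y = w_exc * x
for _ in range(20):
    y = max(0.0, w_exc * x - w_inh * y)   # settles at a reduced steady rate
```

Feedforward inhibition scales with the input; feedback inhibition scales with the output, so it acts as a self-regulating loop.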