Computational Neuroscience 3 Flashcards

1
Q

Function of the hidden layer in a multilayer network

A

Contains the distributed representation

2
Q

Training for similar stimuli …

A

…reinforces the original one: e.g. training a network already trained on yellow with orange reinforces yellow. This happens both in the network and in animals, so the distributed representation is supported in vivo.

3
Q

Backpropagation

A

A mechanism in which weights are adjusted starting from the output layer and working back towards the input layers. It is a generalisation of the Widrow-Hoff rule, which on its own cannot train multilayer networks.
It is now the standard algorithm for training networks.
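The Widrow-Hoff (delta) rule that backpropagation generalises can be sketched in a few lines. This is an illustrative toy example (the task, learning rate, and variable names are assumptions, not from the source): a single linear unit trained online on a linearly separable problem, which is exactly the setting where the rule works and a multilayer network is unnecessary.

```python
import numpy as np

# Widrow-Hoff (delta) rule sketch: a single linear unit learns
# y = x1 OR x2, a linearly separable toy task (an assumption for
# illustration). Backpropagation generalises this error-driven
# update to hidden layers.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

w = rng.normal(scale=0.1, size=2)
b = 0.0
eta = 0.1  # learning rate

for _ in range(200):
    for x_i, t in zip(X, y):
        out = w @ x_i + b      # linear unit output
        err = t - out          # error signal (target minus output)
        w += eta * err * x_i   # Widrow-Hoff weight update
        b += eta * err

# Thresholding the trained linear output reproduces OR.
pred = (X @ w + b > 0.5).astype(int)
```

Because the update only uses the error at the output, it cannot assign credit to hidden units; that is the gap backpropagation fills.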

4
Q

Unsupervised learning

A

Learning by simple exposure, e.g. the autoassociative networks in the hippocampus

5
Q

Unsupervised learning requires…

A

…a different neuronal architecture and/or a different rule

6
Q

Animals with hippocampal damage…

A

…cannot learn unsupervised.

7
Q

Network structure compatible with unsupervised learning

A

CA3

8
Q

Pattern completion

A

(A) Two different patterns are stored in the network
(B) An incomplete pattern is presented
(C) Because of the synapses linking the nodes in the pattern assembly, activity spreads
(D) The pattern is recovered

9
Q

Pattern recognition

A

(A) Two different patterns are stored in the network
(B) A pattern not stored in the network is presented
(C) Strong synapses spread the activity
(D) Activity in cells that do not belong to a stored pattern decays, and a stored pattern becomes active

10
Q

The Hopfield Model (a model for auto-associative memory)

A

Every neurone synapses onto every other neurone.
Weights are set according to a Hebbian rule.
Each vector (pattern) is represented as the desired activation state of the network for that pattern.
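The three properties above (all-to-all connectivity, Hebbian weight setting, patterns as activation states) can be sketched directly. The patterns and network size below are illustrative assumptions:

```python
import numpy as np

# Hebbian weight setting in a Hopfield network (sketch; the two
# patterns and N = 8 are illustrative assumptions).
N = 8
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])

# Hebbian rule: each stored pattern adds its outer product to the
# weight matrix, strengthening synapses between co-active neurons.
W = np.zeros((N, N))
for p in patterns:
    W += np.outer(p, p)
np.fill_diagonal(W, 0)  # no self-connections
W /= N
```

The resulting weight matrix is symmetric, reflecting that every neurone synapses onto every other neurone with a reciprocal strength.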

11
Q

Behaviour of the Hopfield Model

A

The model has temporal behaviour (dynamics): it evolves over time.
The ‘memories’ stored are stable attractors of the dynamics.
If the model is initialised close to a memory (with a partial pattern), it evolves towards the closest attractor, thus retrieving the stored memory by association.
The model always moves towards the closest trough in the energy function.
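The attractor dynamics described above can be sketched as asynchronous updates that descend the energy function. The patterns, network size, and random seed below are illustrative assumptions:

```python
import numpy as np

# Hopfield dynamics sketch: from a corrupted cue, asynchronous
# updates descend the energy function to the closest attractor.
# Patterns, N = 16, and the seed are illustrative assumptions.
rng = np.random.default_rng(1)
N = 16
patterns = rng.choice([-1, 1], size=(2, N))

# Hebbian storage of the two patterns.
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def energy(s):
    # Standard Hopfield energy; stored memories sit in its troughs.
    return -0.5 * s @ W @ s

# Corrupt a stored memory to make a partial cue.
state = patterns[0].copy()
state[:4] *= -1
e0 = energy(state)

# Asynchronous updates: flip one neuron at a time until no neuron
# changes, i.e. the state is a fixed point (an attractor).
changed = True
while changed:
    changed = False
    for i in rng.permutation(N):
        new = 1 if W[i] @ state >= 0 else -1
        if new != state[i]:
            state[i] = new
            changed = True

# Each accepted flip lowers (or keeps) the energy, so the loop halts
# at an attractor; typically this recovers the stored pattern (or
# its sign-flipped mirror image).
```

The energy never increases along the trajectory, which is why the "closest trough" picture in the card above is an accurate description of retrieval.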

12
Q

When presented with an incomplete or distorted pattern, the Hopfield network…

A

…evolves to the closest attractor (stored memory)

13
Q

The hippocampus and autoassociation

A

Area CA3 receives input from EC (highly processed information from sensory cortex). It also receives input from DG, which in turn receives its input from EC. CA3 then sends information to CA1 and the fornix, and from CA1 to S.
The connections from DG to CA3 are called mossy fibres. They make sparse and powerful synapses onto CA3 neurons.
CA3 has a high degree of internal recurrency and random connectivity.
Each CA3 neuron in the rat receives 4,000 synapses from EC and 12,000 from CA3 itself, a level of recurrency orders of magnitude higher than in other brain areas.
The connections between CA3 neurons are modified by LTP very quickly (minutes or hours).

14
Q

Difference between sensory system architecture and hippocampal architecture

A

Sensory systems have a defined direction of information flow (feedforward), whereas the hippocampal architecture is heavily recurrent.

15
Q

Storage

A

(A) Synapses from mossy fibres elicit strong activity in a cell assembly
(B) Synapses within the assembly strengthen via LTP (Hebbian learning)

16
Q

Retrieval

A

(C) Weaker synapses from EC induce low-level activity in a cell population
(D) Because of the strong synapses the previously stored pattern is recovered.

17
Q

It has been proposed that mossy fibres are…

A

…responsible for the strong activation of CA3 ‘nodes’. Since these activations drive Hebbian learning (LTP), mossy fibres could be responsible for storage.

18
Q

Limitation of autoassociative networks

A

Catastrophic interference occurs when the number of stored patterns increases
(occurs in the Hopfield model)

19
Q

Marr proposed that…

A

CA3 can only hold memories for a limited period of time, after which the memories are transferred to the neocortex.