Week 6: Extended BVC Model using BCM Rule Flashcards

1
Q

But the BVC and sharp models cannot account for the long-term dynamics of place fields in similar environments

A

nor for the importance of synaptic plasticity in place field stability

2
Q

Recall: global remapping is not the whole story, as place cells show long-term dynamics (4)

A

Global remapping is akin to a change between attractors

But there are also slow, experience-dependent changes to place fields

As the attractor is slowly deformed over time (i.e. the representation of one environment changes through experience of new environments)

This likely reflects synaptic plasticity and cannot be explained by a ‘fixed’ model (leading to the extended BVC model…)

3
Q

Extended BVC model where we added

A

a learning rule: the Bienenstock-Cooper-Munro (BCM) rule

4
Q

BCM rule is a modification of

A

Hebbian plasticity

5
Q

In the extended BVC model we update the weights using the BCM rule, which (3)

A
  • Requires both pre- and post-synaptic activity (xj and yi)
  • Weights decrease if post-synaptic activity is low
  • Weights increase if post-synaptic activity is high
6
Q

In the extended BVC model using the BCM rule, we change the weights and update the activity of yi (place cells) as

A

the sum of the inputs xj (BVCs) times the weights, passed through a transfer function
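A minimal sketch of this activity update, assuming a rectified-linear transfer function (the transfer function and all numbers here are illustrative assumptions, not part of the model as given):

```python
import numpy as np

def place_cell_activity(x, W, threshold=0.0):
    """Place-cell activities yi: weighted sum of BVC inputs xj,
    passed through a transfer function (here threshold-linear)."""
    drive = W @ x                              # sum_j w_ij * x_j for each place cell i
    return np.maximum(drive - threshold, 0.0)  # rectify: no negative firing rates

# Toy example: 3 BVC inputs driving 2 place cells
x = np.array([0.2, 0.8, 0.5])          # BVC firing rates
W = np.array([[0.1, 0.9, 0.0],
              [0.4, 0.0, 0.6]])        # weights from BVCs onto place cells
y = place_cell_activity(x, W)
```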

7
Q

There is a sliding threshold in the extended BVC model using the

A

BCM rule

8
Q

The threshold on yi output firing in the extended BVC model using the BCM rule is

A

proportional to the square of the recent activity of yi (place cells)

9
Q

BCM rule explained: a pattern of activity x drives a neuron y (3)

A

y < threshold → weights decrease (unlearn weak inputs)
y > threshold → weights increase (learning)
The threshold is updated based on the average activity of y
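The rule above can be sketched as follows (the variable names, learning rate and averaging constant are illustrative assumptions, not values from the lecture):

```python
import numpy as np

def bcm_update(w, x, y, theta, lr=0.01):
    """One BCM weight step: dw is proportional to x * y * (y - theta).
    Weights fall when y < theta (unlearn weak inputs), grow when y > theta."""
    return w + lr * x * y * (y - theta)

def update_threshold(theta, y, tau=0.1):
    """Sliding threshold: running estimate of the square of recent activity of y."""
    return theta + tau * (y**2 - theta)

# Demonstrate the two regimes on a pair of weights
w = np.array([1.0, 0.5])
x = np.array([1.0, 1.0])
w_low  = bcm_update(w, x, y=0.5, theta=1.0)   # y below threshold: weights decrease
w_high = bcm_update(w, x, y=2.0, theta=1.0)   # y above threshold: weights increase
```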

10
Q

Applying the BCM rule to BVCs (4)

A
  • Drive a place cell (PC) from the BVCs to get its place-field representation
  • Repeat to get a field representation for every PC
  • Update the weights between BVCs and PCs using the BCM rule
  • If a PC has a weak secondary peak at a specific location in the environment, the learning rule will weaken the weights of the neuron that produces this weak secondary peak over time
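A toy simulation of that last point, under assumed numbers (two BVCs onto one place cell, one location with a strong main field and one with a weak secondary peak; the constants and the weight clipping are modelling choices for illustration only):

```python
import numpy as np

W = np.array([0.9, 0.3])            # weights from 2 BVCs onto one place cell
x_main = np.array([1.0, 0.0])       # BVC input at the main-field location
x_weak = np.array([0.0, 1.0])       # BVC input at the weak secondary peak
theta, lr, tau = 0.25, 0.1, 0.1

for _ in range(200):                # alternate visits to the two locations
    for x in (x_main, x_weak):
        y = W @ x                         # place-cell activity at this location
        W = W + lr * x * y * (y - theta)  # BCM weight change
        theta += tau * (y**2 - theta)     # sliding threshold tracks recent y^2
    W = np.clip(W, 0.0, 2.0)        # keep weights bounded (modelling choice)
# The main-field weight survives; the secondary-peak weight is driven toward zero.
```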
11
Q

Unsupervised learning (no teaching signal imposed) examples (3)

A
  • Hebbian learning
  • Competitive learning rule
  • BCM Rule