Problem 4 Flashcards

1
Q

Hebb's law

A

Suggests that if 2 neurons on either side of a synapse are activated simultaneously, the strength of that synapse is selectively increased

2
Q

Weight

A

Is determined by the degree of correlated activity between the 2 neurons

e.g.: activate simultaneously = increase; activate separately = reduce

3
Q

Strong positive vs strong negative weights

A
  1. Strong positive weights
    - -> connect nodes that tend to both be positive or both be negative at the same time
  2. Strong negative weights
    - -> connect nodes that tend to take opposite values at the same time
4
Q

There are certain drawbacks to the Hebbian learning method.

Name them.

A
  1. Interference
    - -> the number of associations that can be stored before they begin to interfere with each other is limited
  2. Saturation
    - -> if the weights keep increasing, eventually all units become active whenever an input is presented to the network
5
Q

Neo-Hebbian learning

Solution to Saturation

A

Adds a “forgetting” term that lets every weight decay a little on each update, so the weights cannot grow without bound
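As an illustration, a minimal numeric sketch of such an update (the function name and the rates epsilon and phi are assumptions for illustration, not taken from the card):

```python
# Neo-Hebbian weight update: Hebbian growth plus a "forgetting" (decay) term.
# epsilon (learning rate) and phi (forgetting rate) are illustrative values.
def neo_hebbian_update(w, a_i, a_j, epsilon=0.1, phi=0.05):
    # The weight grows with correlated activity but decays a little on
    # every update, so it cannot increase without bound.
    return w + epsilon * a_i * a_j - phi * w

# With both units fully active, the weight settles toward epsilon/phi = 2.0
# instead of growing forever.
w = 1.0
for _ in range(200):
    w = neo_hebbian_update(w, a_i=1.0, a_j=1.0)
```

Without the `- phi * w` term this reduces to the plain Hebb rule, which is exactly the saturating case described in the previous card.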

6
Q

Differential Hebbian learning

A

Interested in whether the neurons change their activity in the same way

–> not interested in whether the neurons are active at the same time

7
Q

Drive reinforcement theory

A

Builds upon differential Hebbian learning and introduces time + a time difference

–> in order to capture classical conditioning, we have to account for the prediction of the US based on the CS

8
Q

Pattern associator

A

Refers to a particular kind of network, which is presented with pairs of patterns during training

9
Q

What are the results after successful learning of the pattern associator ?

A
  1. Recalls one of the patterns at output when the other is presented at input
  2. Responds to novel inputs by generalizing from its experience with similar patterns
10
Q

Name the advantages of the pattern associator NWs.

A
  1. Tolerance of noisy input
  2. Resistance to internal damage
  3. Ability to extract a central prototype from similar examples
11
Q

Explain the NS model on the basis of the “taste and sight of chocolate”.

A

Suggests that during learning in the NS, 2 patterns are presented simultaneously

  1. Representing the taste of chocolate
    - -> reaching the dendrites via unmodifiable synapses
  2. Representing the sight of chocolate
    - -> reaching the dendrites via modifiable synapses

THUS: learning takes place via the modification of synapses

12
Q
CN model 
(Connectionist network model)
A

Like in the NS, this model suggests that during learning 2 patterns are presented simultaneously

  1. P1 must be produced at the output units
  2. P2 is presented to the input units

THUS: pattern association takes place by modifying the strength of the connections between input + output units

13
Q

In the CN model there are several terms that are equivalent to the terms used in the brain.

Name the equivalent terms for the following:

  1. Axon
  2. Dendrite
  3. Synaptic strength
A
  1. Input line
  2. Output unit
  3. Weight
14
Q

Hebb rule for weight change

A

Δwij = e · ai · aj

–> the rule is in multiplicative form because, in order for a synapse to increase in strength, both pre- and postsynaptic activity must be present
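A one-line numeric illustration of the multiplicative form (the function name and values are illustrative):

```python
# Hebb rule for a single synapse: delta_w = e * a_i * a_j.
# Because the terms multiply, the weight changes only when BOTH the pre-
# and postsynaptic units are active.
def hebb_delta(a_i, a_j, e=0.5):
    return e * a_i * a_j

hebb_delta(1.0, 1.0)  # both active -> delta_w = 0.5 (synapse strengthens)
hebb_delta(1.0, 0.0)  # one unit silent -> delta_w = 0.0 (no change)
```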

15
Q

e

A

Refers to the learning rate constant which specifies how much a synapse alters in any one pairing of the two patterns

16
Q

ai

A

Refers to the activity of element i in pattern 1

17
Q

aj

A

Refers to the activity of element j in pattern 2

18
Q

Pattern associators have several important properties.

Name 6 of them.

A
  1. Generalization
    - -> if a recall cue is similar to an already learnt pattern, the network produces a similar response to the new pattern
  2. Fault tolerance
    - -> even if some synapses on neuron i are damaged, the net input might still be a good approximation
  3. Distributed representations
    - -> information is spread across many elements, so one needs the state of most elements to know which pattern is represented
  4. Prototype extraction + noise removal
  5. Speed
    - -> recall is fast
  6. Interference
    - -> not necessarily a bad thing
19
Q

Autoassociative memories

A

Are capable of retrieving a piece of data upon presentation of only partial info from that piece of data

e.g.: Hopfield NWs, capable of recalling data when shown only a portion of that data

20
Q

Hopfield NWs can take on 2 different forms.

Name them.

A
  1. Asynchronous
    - -> one unit is updated at a time
  2. Synchronous
    - -> all units are updated at the same time
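The two schedules can be sketched for a tiny Hopfield network over ±1 units (the weight matrix here just stores the single pattern [1, -1, 1] via the outer-product rule; all names are illustrative):

```python
# Asynchronous vs synchronous updates in a toy Hopfield network.
def sign(x):
    return 1 if x >= 0 else -1  # ties go to +1

def synchronous_step(W, s):
    # All units are updated at the same time, from the old state.
    n = len(s)
    return [sign(sum(W[i][j] * s[j] for j in range(n))) for i in range(n)]

def asynchronous_step(W, s, i):
    # Only unit i is updated; the rest keep their current state.
    s = list(s)
    s[i] = sign(sum(W[i][j] * s[j] for j in range(len(s))))
    return s

# Weights storing the pattern p = [1, -1, 1] (symmetric, zero diagonal).
p = [1, -1, 1]
W = [[(p[i] * p[j] if i != j else 0) for j in range(3)] for i in range(3)]

synchronous_step(W, [1, 1, 1])       # corrupted state -> [1, -1, 1]
asynchronous_step(W, [1, 1, 1], 1)   # fixes only unit 1 -> [1, -1, 1]
```

Starting from the corrupted state [1, 1, 1], either schedule recovers the stored pattern, which is the partial-cue retrieval described in the previous card.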
21
Q

Autoassociator

A

Refers to a particular form of pattern associator which is trained with the delta rule

–> its aim is to reproduce the same pattern at output that was present at input

22
Q

What are the 3 most advantageous assets of autoassociators ?

A
  1. Store independent memories on the same set of connections
  2. Perform well with incomplete/noisy input
  3. Automatically form prototypical instances of categories
23
Q

Recurrent connections

A

Refer to connections whereby the output line of each unit is connected to the dendrites of the other units

–> present in autoassociators

24
Q

Competitive learning

A

Refers to a variant of Hebbian learning

–> here output units are in competition for input patterns

25
Q

Why do these competitive neurons compete with each other in competitive learning?

A

To see which one of them is the most similar to that particular input vector

–> thus the more strongly one particular unit responds to an incoming stimulus, the more it shuts down the other members of the cluster

26
Q

What is the basic structure of competitive learning models ?

A

A set of hierarchically layered units, with each layer connecting to the layer above it

–> connections within a layer are inhibitory; connections between layers are excitatory
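A winner-take-all sketch of this competition (the learning rate, the dot-product similarity measure, and all names are assumptions; the within-cluster inhibition is modelled simply by letting only the winner update):

```python
# Competitive learning step: output units compete for the input vector;
# the most similar unit wins and moves its weights toward the input,
# while the losers (inhibited members of the cluster) are left unchanged.
def competitive_step(weights, x, lr=0.2):
    # Similarity of each unit's weight vector to the input (dot product).
    sims = [sum(wi * xi for wi, xi in zip(w, x)) for w in weights]
    winner = sims.index(max(sims))
    # Only the winning unit learns.
    weights[winner] = [wi + lr * (xi - wi)
                       for wi, xi in zip(weights[winner], x)]
    return winner

weights = [[1.0, 0.0], [0.0, 1.0]]
competitive_step(weights, [0.9, 0.1])  # unit 0 wins and moves toward input
```

Over repeated presentations, each unit's weight vector drifts toward the cluster of inputs it keeps winning, which is how the competition sorts input patterns among the output units.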