8 - Recurrent Neural Networks Flashcards

1
Q

What are the problems with using a time-as-space model?

A

-expensive
-how far back in time should you go?
-semantically incorrect

2
Q

How do you deal with the problems of the time-as-space model?

A

allow activity to reverberate through the network by using RECURRENT CONNECTIONS -> dynamic network activity

3
Q

What analogy can you use to explain reverberating activity?

A

throwing a stone into a puddle: the waves in the puddle can tell you where the input (the stone) came from

4
Q

In Elman Networks specifically, what is the purpose of feedback weights?

A

connect the hidden layer to the context layer

5
Q

Elman Network - What sort of activity is seen between hidden layer and context layer?

A

reverberating activity

6
Q

Why can't a perceptron (single rate neuron model) perform the XOR function?

A

XOR is not linearly separable, and a single layer can only draw a linear decision boundary

7
Q

What can solve the XOR problem?

A

with added layers of neurons (a hidden layer)! See the sketch below.
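
A minimal sketch of this fix, assuming Python/numpy; the hand-picked weights and thresholds are illustrative, not from the lecture. One hidden layer computing OR and AND makes XOR linearly separable for the output unit:

    import numpy as np

    def step(z):
        return (z > 0).astype(int)  # threshold activation

    def xor_net(x1, x2):
        x = np.array([x1, x2])
        # hidden layer: unit 1 computes OR, unit 2 computes AND
        h = step(np.array([[1, 1], [1, 1]]) @ x - np.array([0.5, 1.5]))
        # output: OR minus AND is XOR, now linearly separable in hidden space
        return step(np.array([1, -1]) @ h - 0.5)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor_net(a, b))  # prints the XOR truth table: 0, 1, 1, 0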

8
Q

Are there any feedback connections in the feedforward model? (outputs which feed back into the network)

A

no (they are called feedforward rate networks for a reason)

9
Q

What is the additional layer in the Elman Network compared to a Feedforward Network?

A

context layer
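
A minimal sketch of one Elman forward step, assuming Python/numpy; the tanh units, the weight names (W_in, W_ctx, W_out) and the random initial weights are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out = 3, 5, 3
    W_in = rng.normal(size=(n_hid, n_in))    # input -> hidden
    W_ctx = rng.normal(size=(n_hid, n_hid))  # context -> hidden (feedback path)
    W_out = rng.normal(size=(n_out, n_hid))  # hidden -> output

    def elman_step(x, context):
        # the hidden layer sees the current input plus the context layer,
        # which holds a copy of the hidden activity from the previous step
        hidden = np.tanh(W_in @ x + W_ctx @ context)
        return W_out @ hidden, hidden  # hidden is copied back into the context layer

    context = np.zeros(n_hid)  # context layer starts empty
    for x in np.eye(n_in):     # a toy 3-step input sequence
        y, context = elman_step(x, context)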

10
Q

What tool is used to learn sequence information and predict the sequential XOR input?

A

uses backpropagation

11
Q

When the error is reduced, does the ability of the network to predict the next letter/number in the sequence increase or decrease?

A

increases

12
Q

Do words with similar semantics have similar or different patterns of activity? (when predicting letters and the level of error)

A

words with similar semantics have similar patterns of activity

13
Q

What does adding recurrent connections between layers (reverberating activity) allow?

A

-introduces a new kind of ‘memory’ capacity which enables networks to represent temporal structure in the sequences of training patterns

14
Q

Why can you read a text which has letters omitted in the words?
What sort of memory is this called?
What network is this memory seen in?

A

because there is a pattern in language which the brain has stored (Elman)
-associative memory
-Hopfield Network

15
Q

What is a cell assembly in memory?
What is a cell assembly able to do?

A

-a selection of neurons connected with increased synaptic weights
-this increase in synaptic weights of connections between neurons allows you to store an item in memory

16
Q

How do cell assemblies contribute to associative memory? (Hebbian)

A

-items stored via creation of cell assembly
-associated items can be recalled due to activation of the cell assembly (because of the increased synaptic weights between neurons in a Hebbian network)

17
Q

What learning is this: neurons that fire together, wire together?
What does this mean?

A

Hebbian Learning: neurons that are active together develop strengthened connections between them
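
A minimal sketch of this rule, assuming rate-coded activity vectors; the learning rate eta and the example activities are illustrative:

    import numpy as np

    def hebbian_update(W, pre, post, eta=0.1):
        # strengthen w_ij in proportion to the joint activity of
        # postsynaptic neuron i and presynaptic neuron j
        return W + eta * np.outer(post, pre)

    W = np.zeros((3, 3))
    pre = np.array([1.0, 0.0, 1.0])   # presynaptic activity
    post = np.array([0.0, 1.0, 1.0])  # postsynaptic activity
    W = hebbian_update(W, pre, post)  # only co-active pairs are strengthened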

18
Q

In ferromagnets, is the direction of an atom’s spin independent of the neighbouring atoms’ spins?

A

no, they are not independent - they can have strong or weak interactions which influence each other

19
Q

If an atom in a ferromagnet gets excited and flips so that it points in the opposite direction, what happens afterwards if you apply the Hebbian dynamics eqn.?
How does this relate to Hebbian Learning/Network?

A

-the atom flips back to its original orientation
-this is an example of recalling a memory: the original orientation is the ‘stored’ memory

20
Q

What is the definition of a Hopfield Network?

A

a recurrent network with symmetric weights and no self-connections

21
Q

In the definition of a Hopfield Network, what does it mean when the weights are symmetric?

A

the weight of the connection from neuron j to neuron i is equal to the weight of the connection from i to j (wij = wji)

22
Q

In a Hopfield Network, what form is the output?
What is the output in a Hopfield Network?

A

-either active (+1) or inactive (-1)
-the output OF EACH UNIT is the weighted sum of its inputs pushed through a step function to generate +1 or -1

23
Q

Do individual neurons interact with themselves in a Hopfield network?

A

no, there are no self-connections

24
Q

How are memories stored in a Hopfield Network?

A

stored as low-energy states of the dynamics

25
Q

Finish the sentence:
When an axon of cell j repeatedly or persistently takes part in firing cell i,
-What networks is this rule seen in?

A

then j’s efficiency as one of the cells firing i is increased
-it is the basis of Hebbian Learning but it is also used in Hopfield Network

26
Q

What does yi/yj represent in Hopfield Network?
“ xi/xj
“ wij

A

-the activity of the network
-the pattern we want to store
-the weights

27
Q

What is the dynamics equation for the Hopfield Network?
What does the 𝑗≠𝑖 mean in this eqn.?

A

𝑦𝑖 (𝑡+1)=𝑠𝑖𝑔𝑛(∑(𝑗≠𝑖) 𝑤𝑖𝑗 𝑦𝑗 (𝑡) )

the weighted sum runs over all of the neuron’s connections except the connection to itself, hence 𝑗≠𝑖

28
Q

What is the energy between two neurons in a Hopfield Network equal to?

A

Energy: 𝐸𝑖𝑗 = −𝑤𝑖𝑗 𝑦𝑖 𝑦𝑗
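
A sketch of this energy in code, assuming a numpy weight matrix W with zero diagonal; the 1/2 factor in the total energy (to avoid counting each pair twice) is an assumption not stated on the card:

    import numpy as np

    def pair_energy(W, y, i, j):
        return -W[i, j] * y[i] * y[j]  # Eij = -wij yi yj

    def total_energy(W, y):
        return -0.5 * y @ W @ y        # sums Eij over all pairs i != j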

29
Q

Why do we set the weights wij in a Hopfield Network?

A

-to store a pattern x (memorisation)

30
Q

How do we store and recall a pattern in a Hopfield Network? (two steps)

A

-set the weights to wij=xixj to store a pattern 𝒙 (memorisation)
-start the activity at a 𝒚 that is similar to 𝒙; the dynamics then moves 𝒚 toward 𝒙 (recall)

31
Q

How do you set the weights in memorisation/pattern storage in a Hopfield Network?

A

1) Start with 𝑤_𝑖𝑗=0
2) To memorise pattern 𝒙^𝑝, update weights: 𝑤𝑖𝑗←𝑤𝑖𝑗+𝑥𝑖^𝑝 𝑥𝑗^𝑝
3) Repeat step (2) for every other pattern
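
A sketch of these three steps, assuming patterns are numpy vectors with +1/-1 entries:

    import numpy as np

    def store(patterns, n):
        W = np.zeros((n, n))     # 1) start with wij = 0
        for x in patterns:       # 3) repeat for every pattern
            W += np.outer(x, x)  # 2) wij <- wij + xi xj
        np.fill_diagonal(W, 0)   # no self-connections (wii = 0)
        return W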

32
Q

In a Hopfield Network, what does it mean when connections enforce correlation?

A

-a positive weight wij pushes neurons i and j toward the same state (+1,+1 or -1,-1); a negative weight pushes them toward opposite states
-during recall the weights therefore recreate the correlations of the stored pattern

33
Q

How do you recall a pattern in a Hopfield Network? (associative memory, e.g. you see an incomplete letter/pattern M which is similar/associated with the stored memory of M (x^p))

A

-start with network activity at y(1)
-For each neuron 𝑖 update activity by 𝑦𝑖(𝑡+1)=𝑠𝑖𝑔𝑛(∑(𝑗≠i) 𝑤𝑖𝑗𝑦𝑗 (𝑡)) until 𝑦𝑖 (𝑡+1)=𝑦𝑖 (𝑡) for every 𝑖
-This “stable” activity will match the pattern 𝑥 that 𝑦(1) is most similar to.
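
A sketch of this recall loop, reusing the store() sketch above; it updates all neurons at once for brevity, whereas the card’s per-neuron update is the asynchronous version (the zeroed diagonal of W makes the 𝑗≠𝑖 restriction automatic):

    import numpy as np

    def recall(W, y, max_iters=100):
        y = y.copy()                      # start at y(1), e.g. a noisy copy of x
        for _ in range(max_iters):
            y_new = np.sign(W @ y)        # yi(t+1) = sign(sum_{j!=i} wij yj(t))
            y_new[y_new == 0] = 1         # tie-breaking convention (assumption)
            if np.array_equal(y_new, y):  # stable: y(t+1) = y(t) for every i
                return y                  # the stored pattern x nearest to y(1)
            y = y_new
        return y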

34
Q

Hopfield Network: Why are memories stored as patterns which are random/abstract concepts in the brain? (instead of as images)

A

to avoid interference as images would be too similar

35
Q

What are attractor states in Hopfield Networks?
How are they involved in recall?
What are they similar to?

A

-where the pattern is stored
-during recall the system is pushed towards the attractor state to retrieve the original stored pattern
-fixed points

36
Q

What do Hopfield Dynamics do to energy?
What is the eqn. for this?

A

-minimise energy as the activity moves toward attractor states (down into an energy well, like a fixed point)
𝐸𝑖𝑗 = − 𝑤𝑖𝑗 𝑦𝑖 𝑦𝑗

37
Q

When do spurious attractors occur in Hopfield Networks?
What are they?

A

-when a network tries to store too many memories, spurious attractors appear
-they are attractors which are very hard to reach compared to the desired attractors (they have shallow energy wells)

38
Q

What can Hopfield Networks be trained to do?

A

trained to store memories as fixed-point attractor states

39
Q

If a neuron is in a learned pattern state, what is the energy equal to?
“ not in a learned pattern state?
Why is it this way around?

A

E = -1
E = +1
-because energy is always minimised when moving in the direction of the attractor state

40
Q

What type of learning do attractors in Hopfield Networks employ?

A

Hebbian Learning

41
Q

How does Elman introduce time into networks?

A

by introducing recurrent connections from the hidden layer to the context layer

42
Q

How are memories stored in a Hopfield Network?

A

As fixed-point attractor states

43
Q

How are neurons connected in a Hopfield network?

A

Neurons are connected to all other neurons in the network but not themselves, and these connections are symmetric.

44
Q

What is the output of a Hopfield Network defined as? (include what it generates)

A

output of each unit (yi) = a weighted sum of the inputs pushed through a step function to generate +1 or -1

45
Q

In the Hopfield model, how are memories stored?

A

random patterns (M) of binary unit states corresponding to a stored concept

-not pictures

46
Q

In terms of attractor states, what is the difference between memorisation and recall in the Hopfield Model?

A

Memorisation is creating the attractor states, recall is reaching an attractor state

47
Q

In the Hopfield Network Model, how are memories recalled?
What do x and y represent?

A

-Getting y to iterate until it matches the pattern x correctly
-y is the process of recall (activity of the network)
x is the stored pattern in the brain/memory (the correct pattern that y is trying to reach)

48
Q

Hopfield model: in memory recall, when is an attractor state (a fixed point of the model) reached?

A

when the pattern of y activity is the same as x (after all the iterations)