Simple recurrent networks Flashcards

1
Q

What are the gates of a Long Short-Term Memory (LSTM) network?

A

input, output, forget
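The three gates can be sketched as a single LSTM timestep. This is a minimal illustration, not a reference implementation; the parameter names (`Wi`, `Ui`, `bi`, …) and the `params` layout are assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM timestep: the three gates modulate the cell state.
    params maps each gate name to (input weights, recurrent weights, bias)."""
    Wi, Ui, bi = params["input"]   # input gate: how much new info enters the cell
    Wf, Uf, bf = params["forget"]  # forget gate: how much old cell state is kept
    Wo, Uo, bo = params["output"]  # output gate: how much of the cell is exposed
    Wc, Uc, bc = params["cell"]    # candidate cell update (not a gate)

    i = sigmoid(Wi @ x + Ui @ h_prev + bi)
    f = sigmoid(Wf @ x + Uf @ h_prev + bf)
    o = sigmoid(Wo @ x + Uo @ h_prev + bo)
    c_tilde = np.tanh(Wc @ x + Uc @ h_prev + bc)

    c = f * c_prev + i * c_tilde   # forget part of the old state, add gated candidate
    h = o * np.tanh(c)             # output gate controls what leaves the cell
    return h, c
```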

2
Q

How can Simple Recurrent Networks be trained?

A

They can be trained with the standard backpropagation algorithm (backpropagation through time), because the temporal structure can be "unrolled" into space.

Each weight is then adjusted with the sum of the changes computed at each level (timestep) of the unrolled network.
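The unrolled training can be sketched for a toy recurrent network with scalar state; the function name, scalar shapes, and per-timestep squared-error loss are assumptions made for this example.

```python
import numpy as np

def bptt_grads(xs, targets, w_in, w_rec):
    """Unroll the network over the sequence, then sum the weight
    gradients produced at each timestep (level) of the unrolled net."""
    # Forward pass: record the state at every timestep.
    hs = [0.0]
    for x in xs:
        hs.append(np.tanh(w_in * x + w_rec * hs[-1]))

    # Backward pass: squared-error loss at every timestep.
    g_in = g_rec = 0.0
    dh = 0.0  # gradient flowing back through the recurrent connection
    for t in reversed(range(len(xs))):
        dh += 2.0 * (hs[t + 1] - targets[t])  # loss gradient at this step
        dz = dh * (1.0 - hs[t + 1] ** 2)      # back through tanh
        g_in += dz * xs[t]                    # contribution at this level...
        g_rec += dz * hs[t]                   # ...summed over all levels
        dh = dz * w_rec                       # pass back one timestep
    return g_in, g_rec
```

The summed gradients agree with a numerical finite-difference check, which is the usual sanity test for a hand-written unrolling.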

3
Q

The difference between Elman and Jordan networks

A

Jordan nets use the output representation as context/memory, while Elman nets use the internal (hidden-layer) representations as context/memory, which makes Elman nets more powerful.
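The difference can be sketched as two one-step update functions that differ only in what gets copied into the context units; the weight names (`W_xh`, `W_ch`, `W_hy`) are illustrative assumptions.

```python
import numpy as np

def elman_step(x, context, W_xh, W_ch, W_hy):
    """Elman: the *hidden* layer is copied into the context units."""
    h = np.tanh(W_xh @ x + W_ch @ context)
    y = W_hy @ h
    return y, h          # next context = current hidden state

def jordan_step(x, context, W_xh, W_ch, W_hy):
    """Jordan: the *output* layer is copied into the context units."""
    h = np.tanh(W_xh @ x + W_ch @ context)
    y = W_hy @ h
    return y, y          # next context = current output
```

Note that the context layer therefore has the hidden size in an Elman net but only the output size in a Jordan net, which is one way to see why Elman nets can carry richer state.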

4
Q

What does the feedback loop do?

A

It provides the network with a memory (a copy) of the last output state, which is stored in «context» neurons.

5
Q

The self-feedback loop:

A

It allows the network to keep a trace of past states.
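A self-recurrent context unit can be sketched as a leaky accumulator; the fixed self-feedback weight `alpha` and the example values are assumptions made for illustration.

```python
def update_context(context, new_state, alpha=0.5):
    """The self-feedback weight alpha makes the context a decaying
    trace of *all* past states, not just a copy of the last one."""
    return alpha * context + new_state

trace = 0.0
for state in [1.0, 0.0, 0.0]:
    trace = update_context(trace, state)
# after three steps the first state still contributes alpha**2 = 0.25
```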
