Simple recurrent networks Flashcards
What are the gates of a Long Short-Term Memory (LSTM) network?
input, output, forget
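The three gates can be written out as one step of the standard LSTM update. The sketch below is a minimal NumPy illustration, not any particular library's implementation; the stacked parameter layout (i, f, o, g blocks in one matrix) and all variable names are assumptions for compactness.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b hold the stacked parameters for the
    input (i), forget (f), output (o) gates and the cell candidate (g)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # pre-activations, shape (4*H,)
    i = sigmoid(z[0:H])                  # input gate: how much new info enters the cell
    f = sigmoid(z[H:2*H])                # forget gate: how much old cell state survives
    o = sigmoid(z[2*H:3*H])              # output gate: how much cell state is exposed
    g = np.tanh(z[3*H:4*H])              # candidate cell update
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

# toy dimensions: 3 inputs, 4 hidden units (arbitrary choices)
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H), W, U, b)
```

Note how the forget gate multiplies the old cell state while the input gate scales the candidate update; this gating is what lets the cell retain information over long spans.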
How can Simple Recurrent Networks be trained?
trained with the standard backpropagation algorithm, because the temporal structure can be “unrolled” into space (this is backpropagation through time).
Weights are adjusted with the sum of the changes at each level (timestep).
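Summing the weight changes across the unrolled levels can be shown on a tiny scalar SRN. This is a minimal sketch under assumed notation (h_t = tanh(w·h_{t-1} + u·x_t), squared loss on the final state); it is not a general-purpose trainer.

```python
import numpy as np

def bptt_grad(w, u, xs, y):
    """Unroll the scalar SRN h_t = tanh(w*h_{t-1} + u*x_t) over the
    sequence xs, then sum the gradient contribution of the recurrent
    weight w from every timestep (every level of the unrolled net),
    for the final-step loss (h_T - y)**2."""
    hs = [0.0]                               # h_0 = 0
    for x in xs:                             # forward pass: unroll in time
        hs.append(np.tanh(w * hs[-1] + u * x))
    dL_dh = 2.0 * (hs[-1] - y)               # loss gradient at the last state
    dw = 0.0
    for t in range(len(xs), 0, -1):          # backward pass through the levels
        dpre = dL_dh * (1.0 - hs[t] ** 2)    # through the tanh at step t
        dw += dpre * hs[t - 1]               # this level's contribution to dL/dw
        dL_dh = dpre * w                     # propagate back one timestep
    return dw

grad = bptt_grad(0.5, 0.3, [1.0, -0.5, 0.2], 0.1)
```

The single scalar `dw` accumulated in the backward loop is exactly the "sum of the changes at each level" the card describes.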
The difference between Elman and Jordan networks
Jordan nets use the output representation as context / memory, while Elman nets use the internal (hidden) representation as context / memory, which makes Elman nets more powerful.
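The difference amounts to one line: which vector is copied into the context units after each step. Below is a minimal NumPy sketch; the weight names and toy dimensions are assumptions for illustration.

```python
import numpy as np

def elman_step(x, context, Wx, Wc, Wo):
    """Elman net: the context is a copy of the previous *hidden* layer."""
    h = np.tanh(Wx @ x + Wc @ context)
    y = Wo @ h
    return y, h                 # next context = hidden state

def jordan_step(x, context, Wx, Wc, Wo):
    """Jordan net: the context is a copy of the previous *output*."""
    h = np.tanh(Wx @ x + Wc @ context)
    y = Wo @ h
    return y, y                 # next context = output

# toy sizes: 3 inputs, 4 hidden units, 2 outputs (arbitrary)
rng = np.random.default_rng(1)
D, H, O = 3, 4, 2
Wx, Wo = rng.normal(size=(H, D)), rng.normal(size=(O, H))
x = rng.normal(size=D)
y_e, ctx_e = elman_step(x, np.zeros(H), Wx, rng.normal(size=(H, H)), Wo)
y_j, ctx_j = jordan_step(x, np.zeros(O), Wx, rng.normal(size=(H, O)), Wo)
```

Note the context sizes differ: the Elman context has the hidden-layer dimension, the Jordan context has the (usually smaller) output dimension, which is one way to see why the Elman memory can carry richer internal state.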
What does the feedback loop do?
provides the network with a memory (copy) of the last output state, which is stored in «context» neurons
The self-feedback loop:
allows the network to keep a trace of past states
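With self-feedback, each context unit mixes the new state with a decayed copy of its old value, so the context holds an exponentially weighted trace of all past states rather than only the most recent one. A minimal sketch (the decay factor `alpha` and the update rule are illustrative assumptions):

```python
import numpy as np

def update_context(context, state, alpha=0.5):
    """Self-feedback on the context units: new context = decayed old
    context + current state, i.e. an exponentially weighted trace of
    the whole state history."""
    return alpha * context + state

# feed two one-hot states: the older one survives with weight alpha
ctx = np.zeros(2)
ctx = update_context(ctx, np.array([1.0, 0.0]))   # ctx = [1.0, 0.0]
ctx = update_context(ctx, np.array([0.0, 1.0]))   # ctx = [0.5, 1.0]
```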