RNN Flashcards
For what kind of problems could we use a many-to-one RNN?
Sentence classification, speech recognition (sound waves to one word), video classification
For what kind of problems could we use a one-to-many RNN?
Image captioning, music composition, natural text generation.
For what kind of problems could we use a many-to-many RNN?
Frame-level video classification, speech enhancement, continuous emotion prediction.
What makes an RNN different from feedforward networks?
It has feedback: hidden states from earlier timesteps are fed back into the network.
What is the formula for updating hidden state and calculating output at a timestep in a simple RNN cell?
h_t = tanh(W_{hh} h_{t-1} + W_{xh} x_t), y_t = W_{hy} h_t
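A minimal NumPy sketch of this update; the dimensions and random weights are illustrative assumptions, not from the source:

```python
import numpy as np

# One simple-RNN timestep, matching the formulas above.
def rnn_step(x_t, h_prev, W_hh, W_xh, W_hy):
    h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t)  # update hidden state
    y_t = W_hy @ h_t                           # readout at this timestep
    return h_t, y_t

rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim = 3, 4, 2    # assumed sizes
W_hh = rng.standard_normal((hidden_dim, hidden_dim))
W_xh = rng.standard_normal((hidden_dim, input_dim))
W_hy = rng.standard_normal((output_dim, hidden_dim))

h = np.zeros(hidden_dim)
for x_t in rng.standard_normal((5, input_dim)):  # a sequence of 5 inputs
    h, y = rnn_step(x_t, h, W_hh, W_xh, W_hy)
```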
What is direct feedback?
The hidden state is used in the same cell at the next timestep
What is indirect feedback?
The hidden state is connected to a previous cell at the next timestep
What is lateral feedback?
A cell is connected to a cell in the same layer.
How does lateral feedback often affect the output of a layer?
Cells strengthen themselves while weakening the others; the strongest cell becomes active (a winner-take-all effect).
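A minimal sketch of this winner-take-all dynamic; the self-excitation and inhibition constants are illustrative assumptions:

```python
import numpy as np

def winner_take_all(x, self_excite=1.2, inhibit=0.3, steps=10):
    a = x.copy()
    for _ in range(steps):
        total = a.sum()
        # each cell strengthens itself and is weakened by all the others
        a = np.clip(self_excite * a - inhibit * (total - a), 0.0, 1.0)
    return a

print(winner_take_all(np.array([0.4, 0.9, 0.5])))  # only the strongest cell stays active
```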
What is an RNN with symmetrical connections to all other cells called?
A Hopfield network
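A minimal Hopfield-network sketch, assuming binary ±1 states and Hebbian weights (details not specified above):

```python
import numpy as np

pattern = np.array([1, -1, 1, -1, 1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)                 # symmetric weights, no self-connections

state = np.array([1, -1, -1, -1, 1])     # noisy version of the stored pattern
for _ in range(5):
    for i in range(len(state)):          # asynchronous update, one cell at a time
        state[i] = 1 if W[i] @ state >= 0 else -1

print(np.array_equal(state, pattern))    # True: the network recalls the pattern
```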
What is the main challenge of using deep RNNs?
Vanishing/exploding gradients. Batch normalization and dropout layers help mitigate this.
What is a bidirectional RNN?
Cells see inputs both from the past and the future.
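A minimal sketch of a bidirectional pass over a sequence; the tanh step, shapes, and random weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
input_dim, hidden_dim, T = 3, 4, 6
Wx_f, Wh_f = rng.standard_normal((hidden_dim, input_dim)), rng.standard_normal((hidden_dim, hidden_dim))
Wx_b, Wh_b = rng.standard_normal((hidden_dim, input_dim)), rng.standard_normal((hidden_dim, hidden_dim))
xs = rng.standard_normal((T, input_dim))

h_f, fwd = np.zeros(hidden_dim), []
for x in xs:                              # past -> future
    h_f = np.tanh(Wh_f @ h_f + Wx_f @ x)
    fwd.append(h_f)

h_b, bwd = np.zeros(hidden_dim), [None] * T
for t in reversed(range(T)):              # future -> past
    h_b = np.tanh(Wh_b @ h_b + Wx_b @ xs[t])
    bwd[t] = h_b

# each timestep sees information from both directions
states = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
```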
What is an LSTM (long short-term memory) cell?
An LSTM cell has a separate path for the cell state, ensuring better gradient propagation.
What is the forget gate in an LSTM cell?
Controls how much of the previous cell state to remember.
What is the input gate in an LSTM cell?
Controls how much of the new input is written to the cell state.
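A minimal sketch of one LSTM timestep showing the forget and input gates acting on the separate cell state; the stacked-gate layout, shapes, and random weights are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    z = W @ np.concatenate([h_prev, x]) + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)  # forget, input, output gates
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # forget old state, write gated new input
    h = o * np.tanh(c)                            # hidden state read from the cell state
    return h, c

rng = np.random.default_rng(2)
input_dim, hidden_dim = 3, 4
W = rng.standard_normal((4 * hidden_dim, hidden_dim + input_dim))
b = np.zeros(4 * hidden_dim)

h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
for x in rng.standard_normal((5, input_dim)):     # a sequence of 5 inputs
    h, c = lstm_step(x, h, c, W, b)
```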