Recurrent Networks Flashcards

1
Q

Give an example of a sequence-to-vector RNN

A

Genre classification of a song
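A minimal NumPy sketch (not from the card) of the sequence-to-vector pattern: the RNN reads the whole input sequence but only its final hidden state is kept as the output vector, e.g. as features for a genre classifier. All weights here are random placeholders.

```python
import numpy as np

def seq_to_vector_rnn(xs, Wx, Wh, b):
    """Run a vanilla RNN over a sequence and return only the final
    hidden state -- the single 'vector' output of the network."""
    h = np.zeros(Wh.shape[0])
    for x in xs:                      # one step per time frame
        h = np.tanh(Wx @ x + Wh @ h + b)
    return h                          # final state summarizes the sequence

rng = np.random.default_rng(0)
T, d_in, d_h = 20, 8, 4               # sequence length, input dim, state dim
xs = rng.normal(size=(T, d_in))       # e.g. T audio frames of a song
Wx = rng.normal(size=(d_h, d_in)) * 0.1
Wh = rng.normal(size=(d_h, d_h)) * 0.1
v = seq_to_vector_rnn(xs, Wx, Wh, np.zeros(d_h))
print(v.shape)                        # (4,) -- one vector for the whole sequence
```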

2
Q

Give an example of a vector-to-sequence RNN

A

Image captioning

3
Q

Give an example of an RNN with different input and output sequence lengths

A

Speech interpretation/machine translation

4
Q

Name all three gating mechanisms of LSTMs

A

Input gate (blocks/permits writing to the cell state), forget gate (blocks/permits keeping the stored memory), and output gate (blocks/permits reading from the cell state).
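A minimal NumPy sketch (not from the card) of one LSTM step, showing where each of the three gates acts; the single weight matrix `W` stacking all gate pre-activations is a common convention, assumed here for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM step. W maps [x; h] to the stacked pre-activations
    of the input, forget, and output gates and the candidate update."""
    d = h.shape[0]
    z = W @ np.concatenate([x, h]) + b
    i = sigmoid(z[0*d:1*d])          # input gate: blocks/permits writing
    f = sigmoid(z[1*d:2*d])          # forget gate: blocks/permits keeping memory
    o = sigmoid(z[2*d:3*d])          # output gate: blocks/permits reading
    g = np.tanh(z[3*d:4*d])          # candidate values to write
    c = f * c + i * g                # gated memory update
    h = o * np.tanh(c)               # gated read-out
    return h, c

rng = np.random.default_rng(1)
d_in, d_h = 3, 5
W = rng.normal(size=(4*d_h, d_in + d_h)) * 0.1
h, c = np.zeros(d_h), np.zeros(d_h)
h, c = lstm_step(rng.normal(size=d_in), h, c, W, np.zeros(4*d_h))
print(h.shape, c.shape)              # (5,) (5,)
```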

5
Q

How do we create deep RNNs?

A

Stacking, i.e. stacked RNNs: just as a multilayer perceptron has n hidden layers, we take the sequence of hidden states produced by one RNN layer and use it as the input sequence to the next RNN layer.
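A minimal NumPy sketch (not from the card) of stacking: each layer emits its full sequence of hidden states, which becomes the input sequence of the layer above. Dimensions are arbitrary placeholders.

```python
import numpy as np

def rnn_layer(xs, Wx, Wh, b):
    """Vanilla RNN layer: return the full sequence of hidden states,
    so they can be fed as inputs to the next layer."""
    h = np.zeros(Wh.shape[0])
    hs = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        hs.append(h)
    return np.stack(hs)

rng = np.random.default_rng(2)
T, d_in, d1, d2 = 10, 6, 5, 3
xs = rng.normal(size=(T, d_in))
# layer 1 consumes the raw inputs ...
hs1 = rnn_layer(xs, rng.normal(size=(d1, d_in)) * 0.1,
                rng.normal(size=(d1, d1)) * 0.1, np.zeros(d1))
# ... layer 2 consumes layer 1's hidden-state sequence
hs2 = rnn_layer(hs1, rng.normal(size=(d2, d1)) * 0.1,
                rng.normal(size=(d2, d2)) * 0.1, np.zeros(d2))
print(hs1.shape, hs2.shape)   # (10, 5) (10, 3)
```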

6
Q

What are the memory, compute, and serial-step costs of feedforward, backprop, and backprop using targets?

A
  1. Feedforward: O(1) memory, O(T) compute, O(T) serial steps
  2. Backprop: O(T) memory, O(T) compute, O(T) serial steps
  3. Backprop using targets: O(1) memory, O(T) compute, O(1) serial steps
7
Q

Why is it useful to process data sequentially?

A

Some information arrives in a sequential setting, e.g. speech recognition.
Sequential processing also suits cases requiring an immediate response, or with limited bandwidth, limited computational power, or limited storage capacity.

8
Q

What aspect of an RNN does an LSTM improve upon?

A

Learning long-term dependencies, accounting for exploding/vanishing gradients.
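A small NumPy demo (not from the card, a hypothetical illustration): the gradient through T vanilla-RNN steps contains a product of T Jacobians, so when the recurrent weights' spectral radius is below 1 the gradient shrinks exponentially (vanishes); above 1 it explodes. The LSTM's additive, gated cell-state path is designed to avoid this.

```python
import numpy as np

rng = np.random.default_rng(3)
Wh = rng.normal(size=(4, 4))
# rescale so the spectral radius is exactly 0.5 (< 1: vanishing regime)
Wh *= 0.5 / np.max(np.abs(np.linalg.eigvals(Wh)))

grad = np.eye(4)
for t in range(50):          # chain rule through 50 time steps
    grad = Wh.T @ grad
print(np.linalg.norm(grad))  # tiny: the gradient has vanished
```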
