RNN Flashcards

1
Q

When are RNNs used?

A

For sequential data like text, speech, video, and time series.

RNNs have a form of memory: they can remember past inputs, which is why they work well on sequential data.

2
Q

Why RNN over ANN?

A
  • The input length may vary for sequential data, whereas an ANN requires a fixed input length.
  • An ANN would be computationally expensive: for sequential data like text, the number of words (and hence input units) can be very large.
  • At test time, sequences may be longer than any sequence seen during training.
  • An ANN ignores word order, and therefore the semantic structure of text data.
3
Q

Zero Padding in RNN

A

Pad all sequences with zeros up to the length of the longest sequence, so that every input has the same length.
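A minimal sketch of zero padding in plain Python (`zero_pad` is a hypothetical helper, not a library API; frameworks such as Keras ship their own padding utilities):

```python
def zero_pad(sequences, pad_value=0):
    """Pad every sequence with zeros up to the length of the longest one."""
    max_len = max(len(seq) for seq in sequences)
    return [seq + [pad_value] * (max_len - len(seq)) for seq in sequences]

batch = [[4, 7], [1, 2, 3, 5], [9]]
padded = zero_pad(batch)
# Every sequence now has length 4:
# [[4, 7, 0, 0], [1, 2, 3, 5], [9, 0, 0, 0]]
```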

4
Q

Type of RNNs

A
  1. Many-to-one - sentiment analysis
  2. One-to-many - image captioning
  3. Many-to-many
    - Synchronous - NER, POS tagging
    - Asynchronous - machine translation, text summarization, chatbots, Q/A, speech-to-text
5
Q

What does the input to an RNN look like?

A

Each input sample has the shape (timesteps, no. of features).
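A quick sketch with NumPy, using illustrative sizes (5 timesteps, 3 features); most frameworks also expect a leading batch axis:

```python
import numpy as np

timesteps, n_features = 5, 3                    # e.g. 5 days, 3 measurements per day
sample = np.random.rand(timesteps, n_features)  # one input sample
batch = np.stack([sample, sample])              # frameworks add a batch dimension

print(sample.shape)  # (5, 3)    -> (timesteps, no. of features)
print(batch.shape)   # (2, 5, 3) -> (batch, timesteps, features)
```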

6
Q

Why is an RNN called "recurrent"?

A

Because the same hidden layer, with the same weights, is applied again at every timestep: it re-occurs (recurs) across the sequence.
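The recurrence can be sketched as a forward pass in NumPy (names and sizes are illustrative, not any library's API): note that the same `W_x` and `W_h` are reused at every timestep.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, hidden = 3, 4
W_x = rng.normal(size=(n_features, hidden))  # input-to-hidden weights
W_h = rng.normal(size=(hidden, hidden))      # hidden-to-hidden weights
b = np.zeros(hidden)

x = rng.normal(size=(5, n_features))         # one sample: 5 timesteps
h = np.zeros(hidden)                         # initial hidden state
for x_t in x:                                # the SAME weights at every step
    h = np.tanh(x_t @ W_x + h @ W_h + b)     # hence "recurrent"
```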

7
Q

Problem with RNNs

A
  1. Long-term dependencies (vanishing gradient problem)
    As the number of timesteps grows, the gradient in the weight update becomes a product of many factors, each in the range [-1, 1] because of tanh, so the product shrinks toward zero and early timesteps stop influencing learning.
  2. Unstable training (exploding gradient problem) - the analogous case where the repeated factors exceed 1 (e.g., with ReLU activations or large recurrent weights), so the gradient grows without bound.
    Can be mitigated by gradient clipping or a smaller learning rate.
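A minimal sketch of gradient clipping by norm (`clip_by_norm` is a hypothetical helper; deep learning frameworks provide built-in equivalents):

```python
import numpy as np

def clip_by_norm(grad, max_norm=1.0):
    """Rescale the gradient if its L2 norm exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

exploding = np.array([30.0, 40.0])            # norm 50, far above the threshold
clipped = clip_by_norm(exploding, max_norm=5.0)
# clipped has norm 5.0 but the same direction: [3.0, 4.0]
```

Clipping caps the size of each update without changing its direction, which keeps training stable when gradients occasionally explode.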