Block 3: Recurrent Neural Networks (RNNs) Flashcards

1
Q

What is a key feature that distinguishes RNNs from other neural networks?

A

A key feature of RNNs is their ability to maintain a ‘memory’ of previous inputs in the network’s internal (hidden) state, which is carried forward from one time step to the next.
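
A minimal sketch of this idea (illustrative only; NumPy and the sizes below are assumptions, not from the deck). The hidden state h is the only value carried between steps, so it is where the ‘memory’ lives:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up dimensions for illustration.
input_size, hidden_size = 3, 4
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)                     # initial 'memory' is empty
sequence = rng.normal(size=(5, input_size))   # 5 time steps of input

for x_t in sequence:
    # The new state depends on the current input AND the previous state,
    # so information from earlier inputs persists in h.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b)

print(h)  # final hidden state summarizes the whole sequence
```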

2
Q

How do RNNs handle temporal dependencies in data?

A

RNNs handle temporal dependencies by feeding the hidden state from one time step into the next, so the output at each step can depend on inputs seen earlier in the sequence.

3
Q

What are some common applications of RNNs?

A

Common applications include speech recognition, language translation, and time series prediction.

4
Q

Describe a challenge often encountered with RNNs.

A

A common challenge with RNNs is the vanishing gradient problem, which makes it difficult to train them on long sequences.

5
Q

Explain the vanishing gradient problem in RNNs.

A

The vanishing gradient problem occurs when gradients shrink exponentially as they are backpropagated through time: each step multiplies them by the recurrent weight matrix and by the derivative of a saturating activation (at most 1), so the signal from distant time steps fades and long-range dependencies become hard to learn.
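
A toy numeric sketch of why this happens (NumPy; the sizes and weight scale are assumptions chosen to make the effect visible). Each backward step multiplies the gradient by roughly W_hh transposed times the tanh derivative, and repeated multiplication by factors of norm below 1 shrinks it exponentially:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden_size = 8

# Smallish recurrent weights: a regime where gradients tend to vanish.
W_hh = 0.4 * rng.normal(scale=1 / np.sqrt(hidden_size),
                        size=(hidden_size, hidden_size))
h = rng.normal(size=hidden_size)   # stand-in hidden state (held fixed here for
                                   # simplicity; real BPTT uses each step's state)
grad = np.ones(hidden_size)        # gradient arriving at the final time step

for t in range(1, 31):
    # One step back through h_t = tanh(W_hh @ h_{t-1} + ...):
    # dL/dh_{t-1} = W_hh.T @ (dL/dh_t * (1 - tanh**2))
    grad = W_hh.T @ (grad * (1 - np.tanh(h) ** 2))
    if t % 10 == 0:
        print(f"{t:2d} steps back: ||grad|| = {np.linalg.norm(grad):.2e}")
```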

6
Q

What advances have been made to overcome the limitations of traditional RNNs?

A

Advances include LSTMs (Long Short-Term Memory networks) and GRUs (Gated Recurrent Units), which use gating mechanisms to better capture long-term dependencies.
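
A hedged sketch of what these look like in practice (PyTorch assumed; the dimensions are made up). Both are drop-in recurrent layers; the LSTM additionally carries a cell state that gives gradients a more direct path through time:

```python
import torch
import torch.nn as nn

# Made-up dimensions for illustration.
batch, seq_len, input_size, hidden_size = 2, 50, 16, 32
x = torch.randn(batch, seq_len, input_size)

# LSTM: gates plus a separate cell state c_t that gradients flow through.
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
out, (h_n, c_n) = lstm(x)          # out: (batch, seq_len, hidden_size)

# GRU: a lighter design with update/reset gates and no separate cell state.
gru = nn.GRU(input_size, hidden_size, batch_first=True)
out, h_n = gru(x)

print(out.shape, h_n.shape)        # torch.Size([2, 50, 32]) torch.Size([1, 2, 32])
```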

7
Q

How do RNNs differ in processing data compared to CNNs?

A

Unlike CNNs, which are ideal for spatial data, RNNs are designed for sequential data, processing inputs over time and maintaining a memory of past information.
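
One concrete way to see the difference is in the input shapes each expects (PyTorch conventions; illustrative sizes only):

```python
import torch
import torch.nn as nn

# CNN: spatial input, shaped (batch, channels, height, width).
image = torch.randn(1, 3, 32, 32)
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
print(conv(image).shape)           # torch.Size([1, 8, 30, 30])

# RNN: sequential input, shaped (batch, time, features) with batch_first=True.
sequence = torch.randn(1, 20, 10)  # 20 time steps of 10 features each
rnn = nn.RNN(input_size=10, hidden_size=16, batch_first=True)
out, h_n = rnn(sequence)
print(out.shape)                   # torch.Size([1, 20, 16]): one output per step
```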

8
Q

What is the basic building block of an RNN?

A

The basic building block of an RNN is a recurrent unit or cell that processes one input at a time while maintaining information about previous inputs through its internal state.
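
A sketch of driving a single cell step by step (PyTorch's nn.RNNCell; the sizes are made up). Each call takes one input and the previous state and returns the new state:

```python
import torch
import torch.nn as nn

input_size, hidden_size = 10, 16
cell = nn.RNNCell(input_size, hidden_size)   # one recurrent unit

h = torch.zeros(1, hidden_size)              # initial state, batch of 1
sequence = torch.randn(20, 1, input_size)    # 20 time steps

for x_t in sequence:
    h = cell(x_t, h)   # new state from current input + previous state

print(h.shape)         # torch.Size([1, 16])
```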

9
Q

Do RNNs share weights across different time steps?

A

Yes. In RNNs, the same weight matrices are applied at every time step, which keeps the parameter count independent of sequence length and helps the network learn temporal patterns that generalize across positions in the sequence.
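
A quick illustration (PyTorch assumed): the layer's parameter count is fixed no matter how long the sequence is, because the same matrices are reused at every step:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=16, batch_first=True)
n_params = sum(p.numel() for p in rnn.parameters())

# The same module (same weights) handles short and long sequences alike.
rnn(torch.randn(1, 5, 10))     # 5 time steps
rnn(torch.randn(1, 500, 10))   # 500 time steps

print(n_params)  # fixed: the weights are shared across all time steps
```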

10
Q

How do RNNs handle variable-length input sequences?

A

RNNs can handle variable-length inputs by processing one element of the sequence at a time until the entire sequence is consumed. This makes them flexible for different lengths of input data.
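
A sketch of both idioms (PyTorch assumed; sizes are made up): fed one at a time, any length works because the recurrence simply runs longer; for batches of mixed lengths, padding plus packing is a common approach:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_sequence

rnn = nn.RNN(input_size=4, hidden_size=8, batch_first=True)

# One sequence at a time: any length works.
for T in (3, 7, 12):
    out, h_n = rnn(torch.randn(1, T, 4))
    print(T, out.shape)   # (1, T, 8): one output per time step

# Batched: pad to a common length, then pack so the RNN skips the padding.
seqs = [torch.randn(T, 4) for T in (3, 7, 12)]
padded = pad_sequence(seqs, batch_first=True)       # shape (3, 12, 4)
packed = pack_padded_sequence(padded, lengths=[3, 7, 12],
                              batch_first=True, enforce_sorted=False)
out, h_n = rnn(packed)    # h_n holds each sequence's true final state
```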

11
Q

What is the significance of the hidden state in an RNN?

A

The hidden state in an RNN acts as a form of memory. It captures information about previous inputs, allowing the network to make informed predictions based on past data.
