M8 Flashcards

For studying the material covered in module 8 (LSTM)

1
Q

RNN (Recurrent Neural Network)

A

Unlike most neural networks, information in an RNN is fed BACK into the system after each step
- Ideal when context is important
- The same weight matrix is shared across time steps (re-used at every time step)
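A minimal sketch of the shared-weight recurrence (sizes and names are made up, not from the module): the same `W_xh` and `W_hh` are re-used at every time step, and the hidden state carries context forward.

```python
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4))   # input -> hidden
W_hh = rng.normal(size=(4, 4))   # hidden -> hidden (the recurrence)

def rnn_forward(xs):
    """Feed a sequence through, carrying the hidden state step to step."""
    h = np.zeros(4)                        # initial hidden state
    for x in xs:                           # one step per sequence element
        h = np.tanh(x @ W_xh + h @ W_hh)   # SAME weights every step
    return h

seq = rng.normal(size=(5, 3))    # 5 time steps, 3 features each
h_final = rnn_forward(seq)       # summary of the whole sequence
```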

2
Q

FNN (Feedforward Neural Network)

A

Neural network where connections between nodes do not form cycles. Data flows in one direction (input -> output), doesn’t retain info from previous inputs.
- Good for individual inputs (like images)
- Not good for sequential data
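For contrast, a toy feedforward pass (sizes made up): data flows strictly input -> hidden -> output, and no state survives between calls, so the same input always yields the same output.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(3, 5))   # input -> hidden
W2 = rng.normal(size=(5, 2))   # hidden -> output

def fnn_forward(x):
    # One direction only, no cycles, nothing retained between calls.
    return np.maximum(x @ W1, 0) @ W2   # ReLU hidden layer

x = rng.normal(size=3)
y1 = fnn_forward(x)
y2 = fnn_forward(x)   # identical: the network has no memory of the first call
```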

3
Q

Recurrent neurons (key component of RNN)

A

Holds hidden state with information about previous inputs

4
Q

RNN unfolding/unrolling (key component of RNN)

A

“Process of expanding the recurrent structure over time steps”, with each step of the sequence represented as a separate layer. Enables BPTT (Backpropagation Through Time), which propagates gradients back through every unrolled step to update the shared weights.
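A toy illustration of unrolling and BPTT (scalar linear RNN `h_t = w*h_{t-1} + x_t`; a made-up example, not from the module): the unrolled states form one "layer" per step, and the gradient of the loss w.r.t. the single shared weight `w` is a sum of contributions, one per unrolled step.

```python
def forward(w, xs):
    hs = [0.0]                    # unrolled hidden states h_0..h_T
    for x in xs:
        hs.append(w * hs[-1] + x)
    return hs

def bptt_grad(w, xs):
    """dL/dw for loss L = h_T: step t contributes w^(T-t) * h_{t-1}."""
    hs = forward(w, xs)
    T = len(xs)
    return sum(w ** (T - t) * hs[t - 1] for t in range(1, T + 1))

w, xs = 0.5, [1.0, 2.0, 3.0]
analytic = bptt_grad(w, xs)

# Sanity check against a numerical derivative of h_T w.r.t. w
eps = 1e-6
numeric = (forward(w + eps, xs)[-1] - forward(w - eps, xs)[-1]) / (2 * eps)
```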

5
Q

RNN Process Sequences can be…

A

- 1:1 / ”vanilla neural network” (e.g. image classification)
- 1:M (generate image caption: input = 1 image, output = many words / a sentence)
- M:1 (sentiment analysis: many words in, one label out)
- M:N (machine translation: sequence in, sequence out)
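The four mappings can be summarized as (input shape, output shape) pairs; all sizes here are made up for illustration.

```python
# (input shape, output shape) for each sequence-processing pattern
shapes = {
    "1:1": ((64, 64), (10,)),       # image -> class scores
    "1:M": ((64, 64), (12, 5000)),  # image -> 12 words over a 5000-word vocab
    "M:1": ((12, 300), (2,)),       # 12 word vectors -> pos/neg sentiment
    "M:N": ((12, 300), (15, 300)),  # 12 source words -> 15 target words
}
```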
