M8 Flashcards
For studying the material covered in module 8 (LSTM)
RNN (Recurrent Neural Network)
Unlike most neural networks, information in an RNN is fed BACK into the system after each step
- Ideal when context is important
- Weight matrices are shared across time steps (the same weight matrices are re-used at every time step)
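The weight sharing above can be sketched with a minimal numpy RNN step (names like `W_xh`, `W_hh` are my own, not from the material):

```python
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 3))   # input -> hidden weights, shared across ALL time steps
W_hh = rng.normal(size=(4, 4))   # hidden -> hidden weights, also shared
b_h = np.zeros(4)

def rnn_step(x_t, h_prev):
    # The SAME W_xh and W_hh are re-used at every step (weight sharing)
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(4)                     # initial hidden state
sequence = rng.normal(size=(5, 3))  # 5 time steps, 3 features each
for x_t in sequence:
    h = rnn_step(x_t, h)            # h carries context forward to the next step
```

Note there is only one copy of the weights no matter how long the sequence is; only `h` changes from step to step.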
FNN (Feedforward Neural Network)
Neural network where connections between nodes do not form cycles. Data flows in one direction (input -> output), and it doesn't retain info from previous inputs.
- Good for individual inputs (like images)
- Not good for sequential data
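A tiny feedforward pass as a contrast (a sketch with random weights, sizes are my own choice):

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)

def forward(x):
    # Data flows strictly input -> hidden -> output; no state is kept
    # between calls, so each input is processed independently
    h = np.maximum(0.0, W1 @ x + b1)  # ReLU hidden layer
    return W2 @ h + b2

y1 = forward(rng.normal(size=3))
y2 = forward(rng.normal(size=3))  # unaffected by the previous call
```

Calling `forward` twice on the same input gives the same answer, which is exactly the "no memory of previous inputs" property.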
Recurrent neurons (key component of RNN)
Holds hidden state with information about previous inputs
RNN unfolding/unrolling (key component of RNN)
"Process of expanding the recurrent structure over timesteps", with each step of the sequence represented as a separate layer. Enables BPTT (Backpropagation Through Time), where gradients are accumulated across all time steps to update the shared weights
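Unrolling can be sketched as a loop where each iteration is one "layer" of the unrolled network, and every hidden state is stored so BPTT can later walk back through them (a sketch; `W_xh`/`W_hh` are my own names):

```python
import numpy as np

rng = np.random.default_rng(2)
W_xh = rng.normal(size=(4, 3))
W_hh = rng.normal(size=(4, 4))

xs = rng.normal(size=(6, 3))  # 6-step input sequence
hs = [np.zeros(4)]            # h_0, the initial hidden state
for x_t in xs:                # each iteration = one unrolled "layer"
    hs.append(np.tanh(W_xh @ x_t + W_hh @ hs[-1]))
# BPTT would now walk hs in REVERSE, accumulating gradients for the
# single shared W_xh / W_hh from all 6 steps before updating them once
```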
RNN Process Sequences can be…
1:1/"vanilla neural network" (e.g. image classification: one image in, one label out), 1:M (generate image caption, input = 1 image, output = many words/a sentence), M:1 (sentiment analysis), M:N (machine translation)
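The M:1 vs M:N distinction is mostly about which outputs you keep. A shape sketch with a toy RNN (sizes and names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
W_xh = rng.normal(size=(4, 3))
W_hh = rng.normal(size=(4, 4))
W_hy = rng.normal(size=(2, 4))  # hidden -> output

def run(xs):
    h, outs = np.zeros(4), []
    for x_t in xs:
        h = np.tanh(W_xh @ x_t + W_hh @ h)
        outs.append(W_hy @ h)   # an output is available at every step
    return outs

xs = rng.normal(size=(5, 3))    # M = 5 inputs
outs = run(xs)
y_many_to_one = outs[-1]        # M:1 (e.g. sentiment): keep only the final output
ys_many_to_many = outs          # M:N-style: keep an output per step
```

(Real machine translation typically uses an encoder-decoder pair rather than one output per input step; this only illustrates the input/output shapes.)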