W7 L1 Deep Learning NLP Flashcards
What are some problems with feedforward neural networks?
Limited context, and there's no way to incorporate the history of previously made decisions.
What's an alternative to FNNs?
Recurrent neural networks (RNNs).
What is the key difference between FNNs and RNNs?
The input to the hidden layer at time t is expanded with the value of the hidden layer at time t-1.
How is the RNN architecture impacted by expanding the current input with the previous hidden layer?
The network learns an extra parameter: a matrix of weights applied to the hidden layer from the previous time step.
Check formula in slides.
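The recurrence described above can be sketched as follows, assuming the standard simple-RNN update h_t = g(W·x_t + U·h_{t-1} + b); the array shapes and names here are illustrative, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 4, 3  # illustrative input and hidden sizes

W = rng.standard_normal((d_hid, d_in))   # input -> hidden weights
U = rng.standard_normal((d_hid, d_hid))  # the extra parameter: hidden(t-1) -> hidden(t)
b = np.zeros(d_hid)

def rnn_step(x_t, h_prev):
    # The current input's contribution is expanded with the previous hidden state.
    return np.tanh(W @ x_t + U @ h_prev + b)

h = np.zeros(d_hid)  # initial hidden state
for x_t in rng.standard_normal((5, d_in)):  # a sequence of 5 input vectors
    h = rnn_step(x_t, h)

print(h.shape)  # the hidden state keeps the same shape at every step
```

Note that without U the loop would reduce to a feedforward network applied independently at each time step; U is what carries history forward.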
What are the problems with RNNs?
They may focus too much on the immediate context, or completely ignore the immediate context.
What are RNNs with a memory unit called, and why is it good to have a memory unit?
The exploding and vanishing gradient problems are mitigated by "storing" information in a memory unit.
RNNs that have a memory unit are called Long Short-Term Memory (LSTM) networks.
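One step of the memory-unit mechanism can be sketched as follows, assuming the standard LSTM gate equations (forget, input, and output gates around a memory cell c_t); the names and sizes are illustrative and biases are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_hid = 4, 3  # illustrative input and hidden sizes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, each acting on the concatenation [x_t ; h_{t-1}].
Wf, Wi, Wo, Wc = (rng.standard_normal((d_hid, d_in + d_hid)) for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([x_t, h_prev])
    f = sigmoid(Wf @ z)                    # forget gate: what to erase from memory
    i = sigmoid(Wi @ z)                    # input gate: what to write to memory
    o = sigmoid(Wo @ z)                    # output gate: what to expose as output
    c = f * c_prev + i * np.tanh(Wc @ z)   # memory cell carries long-range information
    h = o * np.tanh(c)                     # hidden state is a gated view of the cell
    return h, c

h = c = np.zeros(d_hid)
for x_t in rng.standard_normal((5, d_in)):
    h, c = lstm_step(x_t, h, c)
print(h.shape, c.shape)
```

Because the cell update c = f * c_prev + i * … is additive rather than repeatedly squashed through a nonlinearity, gradients can flow across many time steps, which is what eases the vanishing-gradient problem.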