Recurrent Neural Nets Flashcards
What is a recurrent neural network?
A type of neural network that can model sequential data, such as sentences or the movement of a ball through space
What is a feed-forward neural network?
Takes in a fixed-size input and returns a fixed-size output
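For contrast, here is a minimal NumPy sketch with made-up layer sizes: the weight-matrix shapes fix both the input size and the output size, so the network cannot accept anything else.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: every input must have exactly 4 features,
# and the network always produces exactly 2 outputs.
W1 = rng.normal(size=(4, 8))   # input layer -> hidden layer
W2 = rng.normal(size=(8, 2))   # hidden layer -> output layer

def feed_forward(x):
    """x must always be a vector of length 4."""
    hidden = np.tanh(x @ W1)   # (4,) @ (4, 8) -> (8,)
    return hidden @ W2         # (8,) @ (8, 2) -> (2,)

print(feed_forward(rng.normal(size=4)))   # works: output of length 2
# feed_forward(rng.normal(size=7))        # would fail: shapes don't match
```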
What is the inherent problem with feed-forward neural networks?
They cannot handle sequential data on their own; any information about the past has to be supplied explicitly as part of the input
What types of applications are RNNs well-suited for?
Applications that involve sequences of data that change over time, such as natural language processing, sentiment classification, DNA sequence classification, speech recognition, and language translation
What is a benefit of sharing parameters in an RNN?
Gives the network the ability to look for a given feature anywhere in the sequence rather than only in one fixed position
What do RNNs do that vanilla neural networks don’t?
- Deal with variable length sequences
- Maintain sequence order
- Keep track of long-term dependencies
- Share parameters across the sequence
What kind of architecture does an RNN use?
Uses a feedback loop in the hidden layer: the hidden state computed at one time step is fed back in as an input at the next step, which lets the network operate effectively on sequences of variable input length
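A minimal NumPy sketch of that feedback loop (the layer sizes and data are made up): the same weights are reused at every time step, and the hidden state h carries information forward, so sequences of any length can be processed with one set of shared parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 5                       # hypothetical sizes

W_xh = rng.normal(size=(input_size, hidden_size))    # input -> hidden
W_hh = rng.normal(size=(hidden_size, hidden_size))   # hidden -> hidden (the feedback loop)
b_h = np.zeros(hidden_size)

def run_rnn(sequence):
    """Process a sequence of any length using the SAME shared weights at every step."""
    h = np.zeros(hidden_size)          # initial hidden state
    for x_t in sequence:               # one step per element, in order
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    return h                           # final hidden state summarizes the sequence

short_seq = rng.normal(size=(4, input_size))    # 4 time steps
long_seq = rng.normal(size=(10, input_size))    # 10 time steps
print(run_rnn(short_seq).shape, run_rnn(long_seq).shape)   # both (5,)
```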
How is an RNN trained?
Uses the backpropagation algorithm, BUT it is applied at every time step of the sequence: the network is unrolled across time and gradients flow back through each step, which is called the backpropagation through time (BPTT) algorithm
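One way to sketch this, assuming PyTorch is available (the sizes, data, and readout layer here are made up): the RNN cell is unrolled over the sequence, a loss is computed at the end, and loss.backward() propagates gradients back through every time step.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
cell = nn.RNNCell(input_size=3, hidden_size=5)   # hypothetical sizes
readout = nn.Linear(5, 1)                        # maps final hidden state to a prediction
optimizer = torch.optim.SGD(list(cell.parameters()) + list(readout.parameters()), lr=0.01)

sequence = torch.randn(6, 3)          # made-up sequence with 6 time steps
target = torch.tensor([[1.0]])        # made-up training target

h = torch.zeros(1, 5)
for x_t in sequence:                  # unroll the network across time
    h = cell(x_t.unsqueeze(0), h)

loss = nn.functional.mse_loss(readout(h), target)
loss.backward()                       # gradients flow back through every time step (BPTT)
optimizer.step()                      # update weights and biases using those gradients
```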
What is a limitation of using RNNs?
The vanishing gradient problem, which amounts to short-term memory: as gradients are passed back through many time steps they shrink toward zero, so the network has trouble retaining information from earlier steps
It’s like trying to remember all the digits of pi - you might remember 3.14, but you’re probably going to forget the rest over time
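A tiny illustration of why gradients vanish (the per-step factor here is made up): backpropagating through time multiplies the gradient by a factor at each step, and if that factor is below 1 the signal from early steps shrinks toward zero.

```python
# Made-up per-step factor: tanh saturation and small recurrent weights
# typically give a Jacobian-like factor below 1.
per_step_factor = 0.6

gradient = 1.0
for step in range(1, 31):
    gradient *= per_step_factor        # one multiplication per time step
    if step in (1, 5, 10, 20, 30):
        print(f"after {step:2d} steps: {gradient:.10f}")

# After 30 steps the gradient is roughly 2e-7, so the earliest
# time steps contribute almost nothing to the weight updates.
```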
During RNN training, what is used to make adjustments to weights and biases?
Gradients
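A minimal sketch of how a gradient adjusts a weight (plain gradient descent, made-up numbers): the weight is nudged in the direction that reduces the loss.

```python
learning_rate = 0.1    # hypothetical value
weight = 2.0           # current weight
gradient = 0.5         # dLoss/dWeight, computed by backpropagation through time

weight = weight - learning_rate * gradient   # step against the gradient
print(weight)                                # 1.95
```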
What are two variants of RNN that can address the short-term memory problem?
LSTMs (Long Short-Term Memory networks) and GRUs (Gated Recurrent Units), because they are capable of learning long-term dependencies using gates
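As a rough illustration, here is one step of an LSTM cell sketched in NumPy (hypothetical sizes; this follows the common textbook gate equations, not any particular library's implementation): the forget, input, and output gates control what the cell state keeps, adds, and exposes, which is what lets it hold long-term dependencies.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 5                     # hypothetical sizes

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix and bias per gate, acting on [h_prev, x_t] concatenated.
W_f, W_i, W_o, W_c = (rng.normal(size=(hidden_size + input_size, hidden_size)) for _ in range(4))
b_f, b_i, b_o, b_c = (np.zeros(hidden_size) for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([h_prev, x_t])
    f = sigmoid(z @ W_f + b_f)         # forget gate: what to erase from the cell state
    i = sigmoid(z @ W_i + b_i)         # input gate: what new information to write
    o = sigmoid(z @ W_o + b_o)         # output gate: what to expose as the hidden state
    c_tilde = np.tanh(z @ W_c + b_c)   # candidate values to write
    c = f * c_prev + i * c_tilde       # cell state carries long-term information
    h = o * np.tanh(c)
    return h, c

h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for x_t in rng.normal(size=(8, input_size)):       # made-up 8-step sequence
    h, c = lstm_step(x_t, h, c)
print(h.shape, c.shape)                            # (5,) (5,)
```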