Long Short-Term Memory (LSTM) Flashcards

1
Q

Long Short-Term Memory (LSTM)

A

Long Short-Term Memory (LSTM) is a type of Recurrent Neural Network (RNN) designed to process entire sequences of data.

2
Q
  1. Introduction
A

LSTM is an artificial recurrent neural network (RNN) architecture used in the field of deep learning. It was developed to deal with the exploding and vanishing gradient problems that can be encountered when training traditional RNNs.

3
Q
  2. Memory Cells
A

Key to the LSTM’s ability to learn effectively from sequence data is the memory cell, which maintains its state over time, together with a set of gates that control when the cell updates its state and when it outputs its value.

4
Q
  3. Gates
A

LSTMs have three types of gates: the input gate (decides how much information from the current input should be stored in the cell state), the forget gate (decides how much information should be discarded from the cell state), and the output gate (decides what the next hidden state should be).
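The three gates and the memory cell can be sketched as a single time step in NumPy. This is a minimal illustrative implementation, not any library's API; the function name `lstm_step` and the stacked weight layout are assumptions made for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative sketch).
    W: (4H, D), U: (4H, H), b: (4H,) with the four gate
    pre-activations stacked as [input, forget, candidate, output]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # all gate pre-activations at once, shape (4H,)
    i = sigmoid(z[0:H])                  # input gate: how much new info to store
    f = sigmoid(z[H:2*H])                # forget gate: how much old state to keep
    g = np.tanh(z[2*H:3*H])              # candidate cell update
    o = sigmoid(z[3*H:4*H])              # output gate: how much state to expose
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

# Toy dimensions with random weights, just to show the shapes involved.
rng = np.random.default_rng(0)
D, H = 3, 4
x = rng.standard_normal(D)
h0, c0 = np.zeros(H), np.zeros(H)
W = rng.standard_normal((4 * H, D))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h1, c1 = lstm_step(x, h0, c0, W, U, b)
```

Note the additive form of the cell update `c = f * c_prev + i * g`: it is this structure, rather than repeated matrix multiplication, that lets information (and gradients) persist across many time steps.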

5
Q
  4. Handling Long Sequences
A

LSTMs are explicitly designed to avoid the long-term dependency problem. They can remember information for long periods of time, which is an advantage when dealing with sequences and lists.

6
Q
  5. Backpropagation Through Time (BPTT)
A

Like other RNNs, LSTMs are trained with BPTT, but the additive cell-state update lets gradients flow across many time steps with far less attenuation than in a plain RNN, which is what makes learning from long sequences of data feasible.

7
Q
  6. Use Cases
A

LSTMs are used in a variety of sequence-modeling applications, including univariate time series forecasting, multivariate time series forecasting, and natural language processing tasks.
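For forecasting use cases, a sequence model is typically fed sliding windows of past values and trained to predict the next value. A minimal sketch of that data preparation step (the helper name `make_windows` is illustrative, not from any library):

```python
import numpy as np

def make_windows(series, window):
    """Turn a univariate series into (X, y) pairs for one-step-ahead
    forecasting: each row of X is `window` consecutive values, and the
    corresponding y is the value that immediately follows."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X, y

series = np.arange(10.0)              # toy series: 0, 1, ..., 9
X, y = make_windows(series, window=3)
# X[0] = [0., 1., 2.] and y[0] = 3.0
```

The resulting `X` (reshaped to add a feature dimension) and `y` are what would be passed to an LSTM layer in any deep learning framework.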

8
Q
  7. Variations
A

There are several variations of the LSTM, such as the Bi-directional LSTM, the Peephole LSTM, and the Gated Recurrent Unit (GRU). Each has its own structure and use cases, but all aim to capture temporal dependencies of different types and lengths in sequence data.
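The GRU illustrates how a variation simplifies the LSTM: it merges the cell and hidden state and uses only two gates. A minimal NumPy sketch of one GRU step (names and weight layout are illustrative; gate conventions vary slightly between references):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, W, U, b):
    """One GRU time step (illustrative sketch).
    W: (3H, D), U: (3H, H), b: (3H,) with pre-activations stacked as
    [update gate z, reset gate r, candidate state]."""
    H = h_prev.shape[0]
    a = W @ x + b
    z = sigmoid(a[0:H] + U[0:H] @ h_prev)        # update gate: blend old vs new state
    r = sigmoid(a[H:2*H] + U[H:2*H] @ h_prev)    # reset gate: how much history feeds the candidate
    h_tilde = np.tanh(a[2*H:] + U[2*H:] @ (r * h_prev))  # candidate state
    return (1 - z) * h_prev + z * h_tilde        # no separate cell state, unlike the LSTM

rng = np.random.default_rng(1)
D, H = 3, 4
x = rng.standard_normal(D)
h0 = np.zeros(H)
W = rng.standard_normal((3 * H, D))
U = rng.standard_normal((3 * H, H))
b = np.zeros(3 * H)
h1 = gru_step(x, h0, W, U, b)
```

With two gates instead of three and no cell state, the GRU has fewer parameters per hidden unit, which is one reason it is a popular lighter-weight alternative.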

9
Q
  8. Challenges
A

Despite their success, LSTMs can be difficult and time-consuming to train, and they require a significant amount of computational resources. They may also struggle with tasks that require more complex forms of memory, such as reasoning over facts or constructing complex plans.
