Chapter 4 - Recurrent Neural Networks: Hopfield & BAM Networks Flashcards
What is the primary function of Recurrent Neural Networks (RNNs)?
RNNs are designed to process sequential data by maintaining a memory of previous inputs.
What is a key difference between RNNs and feedforward neural networks?
RNNs have feedback loops, allowing the output of a neuron to be fed back into the network as input for the next time step, whereas feedforward networks do not.
What is the “hidden state” in an RNN?
The hidden state is the “memory” of the network that captures information from previous time steps, updated with each new input.
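The hidden-state update can be sketched in a few lines of NumPy. This is a minimal vanilla-RNN step, with sizes and weight initializations chosen only for illustration:

```python
import numpy as np

# Hypothetical sizes for illustration only.
input_size, hidden_size = 3, 4
rng = np.random.default_rng(0)

# The same weight matrices are reused at every time step.
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One vanilla-RNN update: the new hidden state mixes the
    current input with the previous hidden state (the "memory")."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)                     # initial hidden state
for x_t in rng.normal(size=(5, input_size)):  # a 5-step input sequence
    h = rnn_step(x_t, h)                      # h carries information forward
print(h.shape)  # (4,)
```

Because the same W_xh and W_hh are applied at every step, this loop also shows why shared weights let one network handle sequences of any length.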
How do RNNs handle variable-length input sequences?
RNNs apply the same (shared) weights at every time step, so a single network can process sequences of any length and generalize across them.
What does “unfolding in time” mean in the context of RNNs?
“Unfolding in time” is a way to visualize an RNN as a series of layers, each representing a time step in the input sequence.
What is Backpropagation Through Time (BPTT)?
BPTT is a variant of backpropagation used to train RNNs by adjusting weights based on gradients calculated over all time steps.
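A toy BPTT sketch for a scalar RNN (assumed setup: shared recurrent weight w, squared loss on the final step): run the forward pass caching hidden states, then walk backward through time, accumulating the gradient of the shared weight at every step. A numerical gradient confirms the result.

```python
import numpy as np

w, v = 0.5, 1.0                        # shared recurrent and input weights
xs = np.array([0.2, -0.1, 0.4])        # 3-step input sequence
target = 0.3

# Forward pass, caching hidden states h_t = tanh(w*h_{t-1} + v*x_t).
hs = [0.0]
for x in xs:
    hs.append(np.tanh(w * hs[-1] + v * x))
loss = 0.5 * (hs[-1] - target) ** 2

# Backward pass: dL/dw sums contributions from every time step.
dh = hs[-1] - target                   # gradient at the last hidden state
dw = 0.0
for t in range(len(xs), 0, -1):
    dpre = dh * (1 - hs[t] ** 2)       # back through tanh
    dw += dpre * hs[t - 1]             # shared weight accumulates per step
    dh = dpre * w                      # pass gradient to h_{t-1}

# Sanity check against a central-difference numerical gradient.
def loss_at(wv):
    h = 0.0
    for x in xs:
        h = np.tanh(wv * h + v * x)
    return 0.5 * (h - target) ** 2

eps = 1e-6
num = (loss_at(w + eps) - loss_at(w - eps)) / (2 * eps)
print(abs(dw - num) < 1e-6)  # analytic and numerical gradients agree
```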
What is a limitation of standard RNNs regarding long-term dependencies?
Standard RNNs struggle to capture long-term dependencies because gradients shrink (vanish) or grow (explode) as they are propagated back through many time steps.
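The vanishing effect is easy to demonstrate numerically. In this toy setup (not from the flashcards), backpropagating through T steps multiplies the gradient by the recurrent Jacobian each step; with a spectral radius below 1, the gradient norm decays exponentially:

```python
import numpy as np

W = 0.5 * np.eye(4)          # recurrent weights with largest eigenvalue 0.5
grad = np.ones(4)            # gradient arriving at the last time step

norms = []
for _ in range(20):          # push the gradient back 20 time steps
    grad = W.T @ grad
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])   # the norm collapses toward zero
```

After 20 steps the gradient norm has shrunk by a factor of 0.5^20, so the earliest time steps receive almost no learning signal.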
How do LSTMs and GRUs address the limitation of standard RNNs?
LSTMs and GRUs use gating mechanisms to control the flow of information, allowing them to remember and forget information over longer periods.
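A minimal GRU-style cell illustrates the gating idea (parameter names and sizes are assumptions for this sketch, not a library API). The update gate z decides how much of the old state to overwrite, and the reset gate r decides how much history feeds the candidate state:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n_in, n_h = 3, 4
Wz, Uz = rng.normal(scale=0.1, size=(n_h, n_in)), rng.normal(scale=0.1, size=(n_h, n_h))
Wr, Ur = rng.normal(scale=0.1, size=(n_h, n_in)), rng.normal(scale=0.1, size=(n_h, n_h))
Wh, Uh = rng.normal(scale=0.1, size=(n_h, n_in)), rng.normal(scale=0.1, size=(n_h, n_h))

def gru_step(x, h):
    z = sigmoid(Wz @ x + Uz @ h)              # update gate: how much to overwrite
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate: how much history to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde          # gated blend of old and new

h = np.zeros(n_h)
for x in rng.normal(size=(5, n_in)):
    h = gru_step(x, h)
print(h.shape)  # (4,)
```

When z is near 0 the old state passes through almost unchanged, which is exactly how the gate preserves information over long spans.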
How do Hopfield Networks work as recurrent networks?
Hopfield networks are fully connected, with each neuron feeding back to every other, creating a recurrent structure; the network iterates over its states until it converges to a stored pattern.
What is the main function of a Hopfield network?
Hopfield networks are designed to store and retrieve memory patterns, acting as a form of associative memory.
How does a Hopfield Network reach a stable state?
Hopfield networks adjust their states by minimizing an energy function, iterating until the network reaches a stable state corresponding to a stored pattern.
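This can be sketched end to end: store one bipolar pattern with the Hebbian rule, corrupt it, and let asynchronous updates pull the state back to the stored pattern while the energy E(s) = -1/2 s^T W s never increases. The pattern and update schedule here are arbitrary choices for illustration:

```python
import numpy as np

pattern = np.array([1, -1, 1, 1, -1, 1, -1, -1])  # one stored bipolar pattern
n = pattern.size
W = np.outer(pattern, pattern).astype(float)      # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                          # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

s = pattern.copy()
s[:2] *= -1                  # flip two bits to make a corrupted probe
e_before = energy(s)
for _ in range(5):           # iterate until the state settles
    for i in range(n):       # asynchronous update, one neuron at a time
        s[i] = 1 if W[i] @ s >= 0 else -1

print(np.array_equal(s, pattern))   # recalled the stored pattern
print(energy(s) <= e_before)        # energy did not increase
```

The recovered state is the stored pattern, and the final energy is no higher than the starting energy, which is exactly the "minimize energy until stable" behavior the card describes.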
What is a key difference between Hopfield networks and modern RNNs like LSTMs or GRUs?
Hopfield networks settle into a static equilibrium point (an attractor), while modern RNNs process sequences of data over time, maintaining an evolving hidden state.
What is the primary use of Hopfield Networks?
Hopfield networks are mainly used for associative memory.
What is the “memory problem” in the context of neural networks?
The memory problem refers to the challenge of storing and recalling data efficiently without confusion or interference.
What is a limitation of Hopfield networks?
Hopfield networks have low storage capacity (roughly 0.14N patterns for N neurons before recall degrades), and they are limited to auto-association (storing and recalling the same pattern), unlike BAM networks, which perform hetero-association between pattern pairs.