Chapter 5 - Hebbian and Competitive Neural Networks Flashcards
What is the core principle of Hebbian neural networks?
The core principle is Hebbian learning, often summarized as “Cells that fire together, wire together.” This means that if two neurons frequently activate together, their connection strengthens.
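The rule is often written as Δw = η·x·y, where η is a learning rate, x the presynaptic activity, and y the postsynaptic activity. A minimal sketch (the variable names and values are illustrative, not from the chapter):

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Basic Hebbian rule: strengthen w in proportion to the
    co-activation of presynaptic input x and postsynaptic output y."""
    return w + eta * x * y

# Two presynaptic neurons fire together with the postsynaptic neuron,
# so both connection weights strengthen.
w = np.zeros(2)
x = np.array([1.0, 1.0])   # presynaptic activities
y = 1.0                    # postsynaptic activity
w = hebbian_update(w, x, y)
print(w)                   # both weights increase from 0.0 to 0.1
```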
What type of learning do Hebbian networks use?
Hebbian networks use unsupervised learning, where the network learns through association rather than explicit target labels.
What is Hebb’s Law also known as?
Hebb’s Law is also known as the Hebbian learning rule.
What is a key strength of Hebbian networks?
A key strength is their ability to perform unsupervised learning, which is useful when labeled data is scarce. They find patterns and relationships by strengthening connections between neurons that activate together.
How is Hebbian learning related to biological neural networks?
Hebbian learning is inspired by how biological neural networks in the brain learn: the principle that “neurons that fire together, wire together” closely mirrors the synaptic strengthening observed in real neural tissue.
What is a limitation of basic Hebbian learning?
A limitation is uncontrolled weight growth, as weights continually increase if neurons keep firing together.
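A short sketch of this failure mode: with the plain rule Δw = η·x·y and a constantly active input, the weight grows geometrically and never stabilizes (the specific values here are illustrative):

```python
# With a constant input x driving the output y = w * x, each plain
# Hebbian update multiplies the weight by (1 + eta * x**2), so it
# grows without bound instead of converging.
eta, x = 0.5, 1.0
w = 0.1
history = []
for _ in range(10):
    y = w * x            # postsynaptic activity driven by the weight
    w += eta * x * y     # Hebbian update: strengthens whenever x and y co-fire
    history.append(w)

print(history)           # strictly increasing, factor 1.5 per step
```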
What does Hebbian learning lack in terms of weakening connections?
Hebbian learning lacks mechanisms to weaken connections when neurons do not fire together.
Why might Hebbian networks struggle to learn complex patterns?
They struggle because they rely solely on pairwise correlations between neurons, which cannot capture the higher-order relationships needed for complex patterns.
What is a main use case for Hebbian networks as unsupervised models?
They can be used for clustering data, finding natural groupings based on which neurons activate together.
How can Hebbian networks be used in pattern recognition?
They can be used to recognize patterns in data. For example, in image recognition, patterns that frequently co-occur strengthen synaptic connections.
What is a primary application of Hebbian learning in neuroscience and cognitive modeling?
It is used to simulate learning in biological brains, helping researchers understand how memories form, how associations are learned, and how responses to repeated stimuli develop.
What are Self-Organizing Maps (SOMs)?
SOMs are a type of unsupervised learning neural network that uses competitive learning to map high-dimensional input data to a lower-dimensional grid. They are also known as Kohonen maps.
How are SOMs related to Hebbian learning?
SOMs are an extension of Hebbian learning, using the principle of neurons strengthening their connections through frequent activation but introducing competition among neurons.
What is competitive learning in the context of SOM?
Competitive learning is a process where neurons compete to become the most activated in response to a given input. The “winning” neuron gets updated to better represent the input.
What happens to the weights of the “winning” neuron in a SOM?
The weights of the “winning” neuron (and its neighbors) are adjusted to more closely match the input pattern.
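A minimal sketch of one SOM training step: the best-matching unit (BMU) wins the competition, and it and its grid neighbors are pulled toward the input. The grid size, learning rate, and Gaussian neighborhood below are illustrative choices, not prescribed by the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)
grid_w, grid_h, dim = 5, 5, 3
weights = rng.random((grid_w * grid_h, dim))   # one weight vector per grid neuron
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], dtype=float)

def som_step(x, weights, eta=0.5, sigma=1.0):
    """One competitive-learning step: find the BMU, then pull it and
    its neighbors on the grid toward the input x."""
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = np.argmin(dists)                        # winning neuron
    grid_dist = np.linalg.norm(coords - coords[bmu], axis=1)
    h = np.exp(-grid_dist**2 / (2 * sigma**2))    # Gaussian neighborhood, 1 at the BMU
    weights += eta * h[:, None] * (x - weights)   # winner moves most, neighbors less
    return bmu

x = np.array([0.2, 0.8, 0.5])
dists_before = np.linalg.norm(weights - x, axis=1)
bmu = som_step(x, weights)
dists_after = np.linalg.norm(weights - x, axis=1)
# The winning neuron's weight vector is now closer to the input.
```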