Week 8: The Perceptron - A Supervised Learning Algorithm: How is supervised learning used to perform pattern recognition in a perceptron? Flashcards
**Main goal of these flashcards:** Learn to describe how supervised learning is used to perform
pattern recognition in a perceptron
Rosenblatt’s Perceptron (2)
- Describes how a set of examples of stimuli and correct responses can be used to train an artificial neural network to respond correctly via changes in synaptic weights
- Learning is governed by the firing rates of the pre- and post-synaptic neurons and the correct post-synaptic firing rate (i.e., a teaching signal) => “supervised learning”
Rosenblatt’s Perceptron is an important historical example: instructive in understanding the aim of using neural networks for pattern classification.
What is a teaching signal?
Tells the network what the correct output should be
The different types of learning rules (3)
- Unsupervised
- Supervised
- Reinforcement
What is unsupervised learning? (2)
There is no ‘teacher’ or feedback about right and wrong outputs
We simply present a pattern to the network, apply the learning rule, and see what happens
Examples of unsupervised learning (3)
- Hebbian learning
- Competitive learning rule
- BCM Rule
Supervised learning is
providing a teaching signal
What is reinforcement learning?
Occasional reward or punishment (‘reinforcement learning’)
Reinforcement vs supervised learning (2)
In RL there is no teaching signal for every input-output combination, unlike in SL
The network only receives occasional reward or punishment
Perceptron uses
standard artificial neurons with no dynamics
Simple Perceptron (4)
- One output neuron (O1) and two input neurons (x1 and x2)
- The x1 and x2 neurons each have an input weight: w1 = 1 and w2 = 1
- To get the output (the activity of the O1 neuron), sum each input activity * its input weight and pass the result through a transfer function (a step function)
- The output neuron is active only if both input neurons are active (so the net input reaches the threshold of 1.5) = the model performs the logical ‘AND’ function
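The computation above can be sketched in a few lines of Python. This is a minimal illustration, not Rosenblatt's original formulation; the function name and default arguments are chosen here for clarity:

```python
def perceptron_and(x1, x2, w1=1.0, w2=1.0, threshold=1.5):
    """Single output neuron O1 with a step transfer function."""
    net_input = w1 * x1 + w2 * x2          # weighted sum of input activity
    return 1 if net_input >= threshold else 0  # step function at the threshold

# Checking all four input combinations reproduces the logical AND truth table
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", perceptron_and(x1, x2))
```

With the threshold at 1.5, only the input (1, 1) produces a net input (2.0) that reaches the threshold, so only that combination activates the output neuron.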
Diagram of Simple Perceptron
Diagram of Simple Perceptron Table = performs logical ‘AND’ function (3)
If neither input neuron is active, the output neuron is not active
If only one input neuron is active, the output neuron is not active
Only when both input neurons are active is the output neuron active
Rosenblatt produced a graph of all possible x1 and x2 combinations from the simple perceptron model
(4)
The dashed line is the decision boundary
Left of the line, O1 = 0
Right of the line, O1 = 1
We can write the equation of the line: x1 + x2 = 1.5, or rearranged, x2 = -x1 + 1.5
In Rosenblatt’s Perceptron graph, the line separating O1 = 1 and O1 = 0 is where the net input equals the
threshold: w1x1 + w2x2 = T
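The boundary equation can be checked numerically. Below is a small sketch (names and test points are illustrative assumptions) showing that a point just below the line x2 = -x1 + 1.5 gives O1 = 0 while a point just above it gives O1 = 1:

```python
w1, w2, T = 1.0, 1.0, 1.5  # weights and threshold of the simple perceptron

def output(x1, x2):
    """Step-function neuron: fires when net input w1*x1 + w2*x2 reaches T."""
    return 1 if w1 * x1 + w2 * x2 >= T else 0

def boundary_x2(x1):
    """Decision boundary w1*x1 + w2*x2 = T, rearranged: x2 = (T - w1*x1) / w2."""
    return (T - w1 * x1) / w2

# At x1 = 0.5 the boundary sits at x2 = 1.0
print(output(0.5, boundary_x2(0.5) - 0.1))  # just below the line -> 0
print(output(0.5, boundary_x2(0.5) + 0.1))  # just above the line -> 1
```

The boundary is a straight line because the net input is a linear (weighted-sum) function of x1 and x2, which is why the simple perceptron can only separate classes that are linearly separable.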