Week/Topic 7: Connectionism Flashcards
Connectionist networks
a form of computation that is loosely inspired by the way the brain works
Connectionist networks are (roughly) the same as:
Neural Networks
Parallel Distributed Processing (PDP)
Connectionism
the view that such networks are useful for understanding how information is processed in the mind
Perceptron Convergence Rule
A learning algorithm for perceptrons (single-unit networks). It changes a perceptron's threshold and weights as a function of the difference between the unit's actual and intended output.
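A minimal Python sketch of the rule, not from the course materials: the function names, the learning rate, and the AND dataset are illustrative.

```python
def step(x):
    """Binary threshold activation: fire (1) iff total input reaches 0."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, targets, epochs=10, lr=1):
    """Perceptron convergence rule: nudge weights and threshold by the
    difference between the unit's actual and intended output."""
    w = [0, 0]   # one weight per input
    b = 0        # bias (the negated threshold)
    for _ in range(epochs):
        for (x1, x2), t in zip(samples, targets):
            y = step(w[0] * x1 + w[1] * x2 + b)
            err = t - y              # intended minus actual output
            w[0] += lr * err * x1    # each weight moves in proportion
            w[1] += lr * err * x2    # to its input and to the error
            b += lr * err            # threshold is adjusted the same way

    return w, b

# AND is linearly separable, so the rule is guaranteed to converge on it.
w, b = train_perceptron([(0, 0), (0, 1), (1, 0), (1, 1)], [0, 0, 0, 1])
```

After training, the unit classifies all four AND cases correctly.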
Hebbian Rule
a model of how neurons learn and form connections in the brain: a connection is strengthened when the neurons on both sides of it are active together ("cells that fire together wire together")
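The "fire together, wire together" idea can be sketched as a weight update proportional to the product of pre- and post-synaptic activity; the function name and learning rate below are illustrative, not from the course materials.

```python
def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: the weight grows with correlated activity.
    Note there is no target output; learning is driven by co-activation."""
    return w + lr * pre * post

w = 0.0
for _ in range(5):                    # repeated co-activation of the two units
    w = hebbian_update(w, 1.0, 1.0)   # steadily strengthens the connection

# If either unit is silent, the weight is left unchanged.
unchanged = hebbian_update(1.0, 1.0, 0.0)
```

The contrast with the perceptron convergence rule is visible in the signature: no intended output appears anywhere in the update.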
Difference between Hebbian Rule and Perceptron-Convergence Rule
Hebbian learning rule – unsupervised: a weight is strengthened when the two units it connects are active together; no target output is needed. Perceptron convergence rule – supervised and error-driven: starting from arbitrary (often random) weights, each weight is adjusted by the difference between the unit's actual and intended output.
Who came up with the Perceptron-Convergence rule, and when?
The perceptron algorithm itself was invented by Frank Rosenblatt; the convergence theorem and its proof are due to NYU mathematician Albert B. J. Novikoff (1962).
Name the 4 different activation functions
Linear
Threshold Linear
Sigmoid
Binary Threshold
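The four functions above can be sketched in Python as maps from a unit's total input to its activation; the threshold values chosen here are illustrative.

```python
import math

def linear(x):
    return x                      # output simply proportional to input

def threshold_linear(x):
    return max(0.0, x)            # zero below the threshold (here 0), linear above

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))   # smooth squash of any input into (0, 1)

def binary_threshold(x, theta=0.0):
    return 1 if x >= theta else 0       # all-or-none firing at the threshold
```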
Why can't a single-layer network compute XOR?
With only one weight per input, the unit computes a single weighted sum, so its decision boundary is a straight line, and XOR is not linearly separable: no single line puts (0,1) and (1,0) on one side and (0,0) and (1,1) on the other.
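A hand-wired two-layer sketch shows how a hidden layer gets around this: one hidden unit computes OR, another computes AND, and XOR is "OR but not AND". The weights are hand-picked for illustration, not learned.

```python
def step(x):
    """Binary threshold activation."""
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 1)        # hidden unit: OR (fires if at least one input is on)
    h2 = step(x1 + x2 - 2)        # hidden unit: AND (fires only if both inputs are on)
    return step(h1 - 2 * h2 - 1)  # output unit: h1 AND NOT h2, i.e. XOR
```

Each hidden unit only has to draw one line; the output unit combines the two regions into the non-linearly-separable XOR pattern.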
Hidden layer
a layer of units between the input and output layers of an artificial neural network; its units are neither input units nor output units
Why is the backpropagation algorithm not very biologically plausible?
There is no evidence that error signals are propagated backwards through connections in the brain, and nature rarely provides feedback as detailed as the algorithm requires
What aspect do multilayer networks have?
hidden units that are neither input units nor output units