Neurocognition Flashcards
Which types of networks can you distinguish? Give a brief description of each.
feedforward networks and fully recurrent networks.
In a feedforward network (FFN), information moves in one direction: from the input layer through the hidden layers to the output layer. It is mainly used for classification.
In a fully recurrent network (FRN), information moves in both directions and every neuron is connected to every other neuron. It is known for its ability to memorise. (See the sketch below.)
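A minimal sketch of the difference (my own illustration, not from the course; NumPy, the weight values, and the number of units are arbitrary assumptions):

```python
import numpy as np

# Feedforward: activity flows in one direction, input -> hidden -> output.
x = np.array([0.5, 1.0])                          # input activities (made up)
W_in_hidden = np.array([[0.2, -0.1], [0.4, 0.3]])
W_hidden_out = np.array([[1.0, -0.5]])
hidden = np.tanh(W_in_hidden @ x)
output = np.tanh(W_hidden_out @ hidden)

# Fully recurrent: every neuron connects to every other neuron, so activity is
# fed back and the state can hold (memorise) information over time.
W_rec = np.array([[0.0, 0.6], [0.6, 0.0]])        # no self-connections here
state = x.copy()
for _ in range(5):
    state = np.tanh(W_rec @ state)
```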
What are the arguments (observations) in favor of the suggestion that object recognition and
classification occur in the brain by means of feedforward networks?
ventral-stream object recognition happens very fast, too fast for feedback processing to contribute
shown in a study with primates
What is a receptive field of a neuron?
the specific area (e.g., of the visual field) that the neuron responds to; the neuron is only activated by stimuli falling within that area
Which variations can you see of receptive fields in feedforward networks in the brain that process
visual information for object recognition and classification?
as information travels from V1 to IT, the receptive fields become larger and the neurons become less specific, so they are activated by more objects. This allows recognition regardless of changes in size, location, and viewing angle.
What is a topographic or retinotopic representation?
when the spatial order of what falls on the retina is preserved when it is projected onto V1
eg. things that are next to each other in the visual field are also next to each other in the V1 representation
What does size/ location invariant object recognition mean?
the size and location of the object in the visual field do not interfere with recognition
Which observations in the Quian Quiroga article support the notion of invariant object representation
in the brain?
cells in the medial temporal lobe were activated when participants were shown the same person against different backgrounds
- they respond to the idea of the percept rather than to the exact details falling on the retina (a conceptual representation)
What is the relation between information and uncertainty? Give an example.
information is the reduction of uncertainty; if no uncertainty is reduced, no information is conveyed.
eg. if only one candidate runs for president and he wins, his victory carries no information
(= 0 bits)
What is the amount of information? How can it be expressed?
the amount of information is the minimal number of signals (yes/no decisions) needed for communication
- expressed in ‘bits’ (see the sketch below)
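A small sketch of how bits relate to the number of equally likely alternatives (my own illustration using the standard log2 formulation, not quoted from the lectures):

```python
import math

# With N equally likely alternatives, learning the outcome gives log2(N) bits:
# the minimal number of yes/no signals needed to single out one alternative.
for n_alternatives in [1, 2, 4, 8]:
    bits = math.log2(n_alternatives)
    print(f"{n_alternatives} equally likely outcomes -> {bits:.0f} bits")
# 1 alternative (the one-candidate election) -> 0 bits: no uncertainty is reduced.
```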
What is the relation between information and classification?
classification reduces information.
- eg. in the AND problem, the input consists of 2 bits (x, y) but the output is only 1 bit (0 or 1) (see the sketch below)
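A tiny sketch of the AND example (my own illustration; it just enumerates the patterns):

```python
from itertools import product

# AND takes 2 bits of input (4 possible patterns) but returns only 1 bit,
# so the classification discards information about which input occurred.
inputs = list(product([0, 1], repeat=2))            # 4 patterns = 2 bits
outputs = {(x, y): int(x and y) for x, y in inputs}
print(outputs)   # {(0,0): 0, (0,1): 0, (1,0): 0, (1,1): 1} -> only 1 bit out
```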
What is a perspective or Frame Of Reference (FOR)? Give an example.
- the information a neuron (or layer) has access to
- different for each neuron
- eg. the FOR of the output layer is the layer directly below it
Why is a perspective or Frame Of Reference (FOR) important for understanding how a network
operates?
- because each node in each layer has a different FOR
- we have to understand what information a node has access to before we can understand how the network learns and operates
What are the similarities and (typical) differences between real neurons and artificial neurons?
- both can be activated/inhibited depending on whether the threshold is reached
- real neurons: activation/inhibition depends on action potentials transmitted across the synapses
- artificial neurons: activation/inhibition depends on the activity numbers of the incoming units and the connection weights (see the sketch below)
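A minimal sketch of an artificial neuron (my own illustration; the weights and threshold are arbitrary):

```python
import numpy as np

# An artificial neuron: a weighted sum of the incoming activity numbers,
# compared against a threshold.
def artificial_neuron(activities, weights, threshold):
    net_input = np.dot(weights, activities)        # weighted sum of inputs
    return 1 if net_input >= threshold else 0      # fires (1) or stays silent (0)

print(artificial_neuron([1, 1], [0.6, 0.6], threshold=1.0))  # 1: threshold reached
print(artificial_neuron([1, 0], [0.6, 0.6], threshold=1.0))  # 0: below threshold
```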
What does it mean that a classification problem is linearly separable? Give an illustration (example).
when, in the input space, a line can separate the input patterns that should activate the output from those that should not. eg. AND is linearly separable: the line x + y = 1.5 separates (1,1) from the other three points.
What is the input space of a network such as a perceptron?
input space: the spatial representation of all possible inputs. eg. for two binary inputs it consists of the four points (0,0), (0,1), (1,0), and (1,1).
Describe in global terms a learning procedure for a perceptron. Explain what the error is in the
learning rule. What is the role of the error?
learning occurs through shifting the connection weights,
- aka supervised learning
- according to the learning rule, the new weight equals the old weight plus a change proportional to the error (times the input)
- error: the difference between the desired output and the actual output
- its role: it determines whether, in which direction, and how much the weights are shifted; when the error is zero, the weights stay the same (see the sketch below)
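A minimal sketch of perceptron learning on the AND problem (my own illustration; the learning rate eta, the fixed threshold, and the number of epochs are arbitrary choices, not taken from the lectures):

```python
import numpy as np

# Perceptron learning rule: new weight = old weight + eta * error * input,
# where error = desired output - actual output.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = np.array([0, 0, 0, 1])                   # the AND problem
w = rng.uniform(-1, 1, size=2)                     # start with random weights
theta = 0.5                                        # fixed threshold (assumption)
eta = 0.1                                          # learning rate (assumption)

for epoch in range(100):
    n_errors = 0
    for x, t in zip(X, targets):
        actual = 1 if np.dot(w, x) >= theta else 0
        error = t - actual                         # desired minus actual output
        w = w + eta * error * x                    # no change when error is 0
        n_errors += abs(error)
    if n_errors == 0:                              # classification achieved:
        break                                      # the weights stop changing
```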
Does the learning procedure for a perceptron stop changing the weights of the perceptron when the
perceptron achieves classification? If so, why?
- yes
- the weights stop shifting once the desired output is achieved for every pattern,
- because the error (desired minus actual output) is then zero, so the learning rule produces no further weight change
What is supervised learning? Give an example.
- learning procedures that use a measure of error (the desired output is known)
- the connection weights are shifted on the basis of that error
- eg. the learning of a perceptron:
begin with random weights (Wi) and update them with the learning rule
learning depends on the actual output and the desired output
- eg. if the desired output is 1 and the current output is 0, the weights are increased so that the summed input exceeds the threshold
Can all classification problems be learned by a perceptron? If not, why not?
- no
- some problems, eg. EXOR, cannot be solved because they are not linearly separable
Give an example of a classification problem that cannot be solved by a perceptron. Explain (and
illustrate) why not
- the EXOR problem
- not linearly separable: no single line in the input space can separate the patterns that should give output 1, (0,1) and (1,0), from those that should give output 0, (0,0) and (1,1) (see the sketch below)
- L5 p5
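A brute-force sketch (my own illustration; the grid of candidate weights and thresholds is arbitrary, so it only illustrates the point rather than proves it):

```python
import numpy as np
from itertools import product

# A single threshold unit draws one line in the 2-D input space.
# Searching a grid of weights/thresholds finds a line for AND but not for EXOR.
X = list(product([0, 1], repeat=2))                # (0,0), (0,1), (1,0), (1,1)
grid = np.linspace(-2, 2, 41)                      # candidate values (assumption)

def solvable_by_one_unit(targets):
    for w1, w2, theta in product(grid, grid, grid):
        outputs = [1 if w1 * x + w2 * y >= theta else 0 for x, y in X]
        if outputs == targets:
            return True
    return False

print(solvable_by_one_unit([0, 0, 0, 1]))   # AND:  True (linearly separable)
print(solvable_by_one_unit([0, 1, 1, 0]))   # EXOR: False (not linearly separable)
```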
What is a squashing function? Give examples.
- an activation function that reduces a potentially large net input into a small output range
- eg. the logistic function (0 to 1) and the hyperbolic tangent (-1 to 1) (see the sketch below)
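A small sketch of two common squashing functions (my own illustration; the sample net inputs are arbitrary):

```python
import numpy as np

# Squashing functions map arbitrarily large net inputs into a bounded range:
# the logistic function into (0, 1), the hyperbolic tangent into (-1, 1).
def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

for net_input in [-100, -1, 0, 1, 100]:
    print(net_input, round(logistic(net_input), 3), round(np.tanh(net_input), 3))
```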
Why do you need squashing functions for a feedforward network with hidden layers (so a network
with more than two layers), if the network has to achieve more than a perceptron?
- squashing functions in the hidden layers are needed when the classification problem is not linearly separable (eg. EXOR)
- without squashing (non-linear) functions the hidden layers add nothing, so the problem cannot be solved
Show why a multi-layer feedforward network without squashing functions in the hidden layers is
similar to a two-layer feedforward network.
if the multi-layered network uses a linear function (instead of a squashing function), it still cannot solve the EXOR problem,
- because the layers just compose into a single linear mapping: the output layer still receives (a linear transformation of) the input, exactly as in a two-layer network (see the sketch below)
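A small sketch showing that stacked linear layers collapse into one linear mapping (my own illustration; the layer sizes and random weights are arbitrary):

```python
import numpy as np

# Without a squashing function, a hidden layer computes h = W1 @ x and the
# output layer computes y = W2 @ h = (W2 @ W1) @ x: one linear mapping,
# i.e. equivalent to a two-layer network, so EXOR still cannot be solved.
rng = np.random.default_rng(1)
W1 = rng.normal(size=(3, 2))        # input -> hidden (linear, no squashing)
W2 = rng.normal(size=(1, 3))        # hidden -> output

x = np.array([1.0, 0.0])
deep_output = W2 @ (W1 @ x)         # through the "hidden" layer
shallow_output = (W2 @ W1) @ x      # single equivalent weight matrix
print(np.allclose(deep_output, shallow_output))   # True
```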