Task 4 Flashcards

1
Q

What is meant by connectionism?

A
  • human mental processes (such as learning) can be explained by the computational modelling of neural nets
  • computational modelling is INSPIRED by neuronal information processing
2
Q

How do human neurons generally work?

A
  • one neuron passes information on to the next neuron
  • learning changes the connection strength between neurons
  • neurons work in parallel
  • information processing is distributed over many neurons
3
Q

What are the 5 assumptions of a neural or connectionist network?

A
  • neurons integrate information and need to overcome a threshold
  • neurons convey information about their input level via the firing rate of their output
  • brain structures work in layers (hierarchical transformation of representations)
  • the influence of one neuron on another depends on the connection strength
  • learning is achieved by changing the strength of the connection between two neurons
4
Q

What is meant by a weight?

A
  • the strength of the connection between two neurons
5
Q

What is meant by the activation function?

A
  • a function representing how a neuron's activation is turned into its output
  • common choices: linear, threshold, binary, and sigmoid
6
Q

What is meant by the linear activation function?

A

Whatever I put in is what I get out (1 = 1)

- f(x) = x

7
Q

What is meant by the threshold activation function?

A
  • if the input is larger than the threshold, the output equals the input
  • f(x) = x for x > threshold
  • if the input is smaller than the threshold, the output is zero
8
Q

What is meant by the binary activation function?

A
  • if the input is larger than the threshold, the output is 1; otherwise it is 0
9
Q

What is meant by the sigmoid activation function?

A
  • a smooth, S-shaped function which illustrates that a variety of functions can be used as an activation function
  • it also ensures that the activity cannot go below a fixed minimum or above a fixed maximum
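The four activation functions on these cards can be sketched in a few lines of Python (a hypothetical illustration; `theta` stands for the threshold and the default values are arbitrary):

```python
import math

def linear(x):
    return x  # f(x) = x: whatever goes in comes out

def threshold(x, theta=0.5):
    return x if x > theta else 0.0  # passes the input only above the threshold

def binary(x, theta=0.5):
    return 1.0 if x > theta else 0.0  # all-or-nothing output

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))  # S-shaped, bounded between 0 and 1

print(linear(0.8), threshold(0.8), binary(0.8), sigmoid(0.0))
# → 0.8 0.8 1.0 0.5
```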
10
Q

What is meant by the output function in the equation principle?

A
  • determines the output a neuron actually sends onwards
  • in connectionist models it is usually a linear function (1 = 1)
  • in biological models it is usually not linear
11
Q

What is meant by the bias?

A
  • it is the same as the threshold, but expressed as a negative threshold
  • because it is a learnable weight, the network can learn to identify the optimal threshold
12
Q

What are some properties of connectionist models? (Task 3: Why are neural networks so key for machine learning?)

A
  • they are damage resistant and fault tolerant
  • they allow for content-addressable memory (a cue activates a pattern of memory)
  • they try to satisfy/compromise between as many constraints as possible
13
Q

Why are connectionist models so damage resistant and fault tolerant?

A
  • no individual neuron/connection is crucial

- if one neuron is incorrect, the population makes up for it (graceful degradation)

14
Q

What are the two types of neural networks?

A
  • pattern associator

- Autoassociator

15
Q

What is meant by a pattern associator?

A
  • describes how different stimuli become linked when they are repeatedly presented together (training) during a learning period
  • can generalize from existing input, which means it can respond to novel inputs and is damage resistant
  • has been used to model the function of memory
  • consists of input and output units; each input unit is connected to every output unit
  • it is a non-recurrent network
16
Q

What is meant by an autoassociator?

A
  • ONE form of pattern associator
  • includes the ability to recall a complex memory from a cue (linking)
  • has recurrent connections
  • can reproduce a pattern even when the input is noisy or incomplete
  • each unit serves as both an input unit and an output unit, so that each unit is connected to every other unit
  • every autoassociator is a recurrent network
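As a rough sketch of recurrent pattern completion, the following pure-Python toy (a Hopfield-style network; the cards do not name this model, so treat it as an illustrative assumption) stores one +1/−1 pattern with Hebbian weights and recovers it from a corrupted cue:

```python
# stored pattern of +1/-1 units
p = [1, -1, 1, -1, 1, -1, 1, -1]
n = len(p)

# Hebbian storage: w[i][j] = p[i] * p[j], with no self-connections
w = [[p[i] * p[j] if i != j else 0 for j in range(n)] for i in range(n)]

# cue: a noisy version of the stored pattern (one unit flipped)
state = p[:]
state[3] = 1

# recurrent updates: each unit takes the sign of its weighted input
for _ in range(5):
    net = [sum(w[i][j] * state[j] for j in range(n)) for i in range(n)]
    state = [1 if x >= 0 else -1 for x in net]

print(state == p)  # → True: the cue is completed back to the stored pattern
```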
17
Q

How do the autoassociator network and the pattern associator learn?

A
  • Hebbian learning rule
  • delta rule
  • neo-Hebbian learning rule
  • differential Hebbian learning
  • drive reinforcement theory
18
Q

What is meant by the original Hebbian learning rule?

A
  • what fires together wires together
  • if two neurons on either side of a synapse are activated simultaneously, the strength of that synapse is selectively increased
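The rule is often written as Δw = η · a_pre · a_post; a minimal sketch (η, `a_pre`, `a_post` are illustrative names, not from the cards):

```python
def hebb_update(w, a_pre, a_post, eta=0.1):
    # strengthen the connection when both neurons are active together
    return w + eta * a_pre * a_post

w = 0.0
for _ in range(3):            # both neurons active on three trials
    w = hebb_update(w, 1.0, 1.0)
print(round(w, 2))            # → 0.3  (the weight only ever grows)
```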
19
Q

What is meant by the neo-Hebbian learning rule?

A
  • same as the Hebbian learning rule, but a weight will always forget a certain proportion of its learning
  • weights can therefore also decrease in strength
  • it overcomes the saturation problem of ever-growing weights
  • it is an improvement over the original Hebbian learning rule
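A sketch of the forgetting idea, assuming an update of the form w ← (1 − decay)·w + η·a_pre·a_post (the decay term is the "forgetting" the card describes; all names and constants are illustrative):

```python
def neo_hebb_update(w, a_pre, a_post, eta=0.1, decay=0.05):
    # forget a fixed proportion of the weight, then learn Hebbian-style
    return (1.0 - decay) * w + eta * a_pre * a_post

w = 0.0
for _ in range(200):          # repeated co-activation
    w = neo_hebb_update(w, 1.0, 1.0)
print(round(w, 2))            # → 2.0: the weight saturates at eta/decay instead of growing forever
```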
20
Q

What is meant by the differential Hebbian learning rule?

A
  • same formula as Hebbian learning, but with a delta in front of the activations!
  • delta means change / difference
  • in this rule we are not interested in whether the neurons are active at the same time
  • instead, we are interested in whether the neurons change in the same way (behave in the same way)
  • if they do, the connection between them is strengthened
  • learning only happens when there is an activity change in the units
21
Q

What is meant by drive reinforcement theory?

A
  • an extension of the differential Hebbian learning rule
  • it takes time and time differences into account
  • so one neuron has to change, and the other neuron has to have changed a bit before that
  • the DRT network considers not only the incoming stimulus at the current time, but also the recent history of incoming stimuli
22
Q

Name the limitations of the original Hebbian rule.

A
  • if too many patterns are stored in one network, they interfere -> problem of interference
  • problem of saturation = weights only increase, they never become smaller -> eventually neurons will be active no matter what the input is
23
Q

Why is it good to have a bit of interference?

A
  • the network uses the interference to cancel out certain noise
24
Q

How does the delta rule work?

A
  • different from the Hebbian rule, since here we know what desired output activity we want
25
Q

What is the formula of the delta rule?

A
  • desired activity - actual activity = error (the missing or excess activity)
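A minimal sketch of the delta rule for a single linear unit, assuming the update Δw = η·(desired − actual)·input (η and the training values are illustrative choices):

```python
def train_step(w, b, x, desired, eta=0.1):
    actual = w * x + b                    # linear unit
    error = desired - actual              # missing or excess activity
    return w + eta * error * x, b + eta * error

w, b = 0.0, 0.0
for _ in range(100):                      # learn to map input 2.0 to desired output 1.0
    w, b = train_step(w, b, 2.0, 1.0)
print(round(w * 2.0 + b, 2))              # → 1.0: the actual activity converges to the desired one
```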
26
Q

What is meant by a double-layer network?

A
  • you have an input layer and an output layer
27
Q

What is meant by a multilayer network?

A
  • in addition to an input and an output layer, you also have one or more hidden layers
28
Q

Why do we need hidden layers?

A
  • because of separability
  • a double-layer network can only solve linearly separable problems such as the LOGICAL OR (a straight line)
  • e.g. either it is a cat or a dog
  • -> BUT a multilayer network can also solve the exclusive OR, which needs a curved decision boundary
29
Q

How do we train a neural network?

A
  • backpropagation = same as the delta rule, but applicable to hidden layers
  • part of the error at the output is caused by a failure in the hidden layer
  • based on that assumption we can apply the delta rule layer by layer
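A toy backpropagation run, assuming a small sigmoid network trained on XOR (the exclusive-OR problem from the previous card); the layer sizes, learning rate, and seed are arbitrary choices, so this is a sketch rather than the course's exact method:

```python
import math, random

random.seed(0)
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

H = 3                                     # hidden units
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sig(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(H)]
    y = sig(sum(W2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

before = total_error()
eta = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # delta rule at the output layer ...
        d_out = (t - y) * y * (1 - y)
        # ... then part of the error is attributed to each hidden unit
        d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(H)]
        for j in range(H):
            W2[j] += eta * d_out * h[j]
            W1[j][0] += eta * d_hid[j] * x[0]
            W1[j][1] += eta * d_hid[j] * x[1]
            b1[j] += eta * d_hid[j]
        b2 += eta * d_out

print(total_error() < before)             # the error on XOR goes down during training
```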
30
Q

What is meant by competitive learning?

A
  • neurons compete for the right to respond to a given subset of inputs -> such that only one output neuron is active
  • to be the winning neuron, its activity must be the largest among all competing neurons
  • the output of the winning neuron is set to 1 and the output of all the other neurons is set to 0
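The winner-take-all step described above can be sketched directly (the function name is illustrative):

```python
def compete(activities):
    # the neuron with the largest activity wins; its output is 1, all others 0
    winner = max(range(len(activities)), key=lambda i: activities[i])
    return [1 if i == winner else 0 for i in range(len(activities))]

print(compete([0.2, 0.9, 0.4]))  # → [0, 1, 0]
```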
31
Q

What is the difference between a pattern associator and an autoassociator?

A
  • in an autoassociator, the output line of each unit is connected back to the dendrites of the other units (recurrent)
  • it reproduces the same input as the output pattern
  • -> NOT 100% sure
32
Q

What learns in an autoassociator?

A
  • an autoassociator receives internal and external inputs
  • the internal input undergoes recurrent learning
  • the external input then only activates a few units (the cue), which is enough to retrieve the bigger picture
33
Q

What was Gina's problem with the original Hebbian learning rule?

A
  • it does not specify how connections should increase
34
Q

What is meant by an instar?

A
  • receives incoming stimulation

- neo-Hebbian learning

35
Q

What is meant by an outstar?

A
  • transmits output back to other instars

- neo-Hebbian learning

36
Q

Are neurons instars or outstars?

A
  • neurons are both simultaneously
37
Q

What is the opposite of a recurrent network?

A
  • feed-forward model (capable of classification)
38
Q

Give a definition of a recurrent neural network.

A
  • in recurrent networks, the output of a layer is added to the input of the next layer and also fed back into the same layer
  • capable of forecasting
  • an example would be the hippocampus