Synaptic plasticity in simple systems Flashcards
What is learning and memory?
· Learning: "the process of acquiring new information" – through experience, teaching or study
· Memory: "the persistence of learning in a state that can be revealed at a later time" – information that is learnt is encoded, stored and retrieved at a later time
What are nativism and empiricism?
· Nativism: Plato – "All knowledge is innate", hardwired into our brains at the time of birth
· Empiricism: Aristotle – learning is the process of drawing information into the mind. The mind is a "tabula rasa" on which experience is subsequently recorded; it is moulded by the environment and sensory experience
What is the physical basis of memory?
· Stephen Rose (1995): “Something, somewhere has to change”.
· Engrams are the physical manifestation of memory in the brain – a permanent physical change
· Plasticity is the physical process of experience-dependent change in the brain. Leads to memory formation.
Are neurons static?
Neurons are not static; their dendrites are continuously active.
How do we test memory?
· Simulated memory
· Memory in a petri dish – artificial networks from real neurons
· Biological memory – most realistic, the nervous system of whole organisms
What theories are used for experimentation?
· William James (1890) – when two elementary brain processes are active together or in immediate succession, one of them, on reoccurring, tends to propagate its excitement into the other
· Donald Hebb (1949) – when the axon of cell A is near enough to excite cell B and repeatedly takes part in firing it, the efficiency of A in firing B is increased
· McClelland & Rumelhart (1986) – when unit A and unit B are simultaneously excited, the strength of the connection between them increases
· These views convey very similar ideas: memory can be encoded by experience-dependent changes in excitability.
· Connectionism: networks of neurons – memory is not stored in single neurons, but in dense networks of neurons
What is synaptic plasticity and how can it be measured?
· Synaptic plasticity – a change in how effectively neurons can communicate with each other
· Before learning: the space between the neurons (the synapse) is important; activity from neuron 1 propagates across the synapse to the second neuron
· During learning (trial 1): activity in the postsynaptic neuron begins to increase
· Trial 2: the activity profile in the postsynaptic neuron grows further
· After learning: increased activity in the postsynaptic neuron with less energy needed in the presynaptic neuron; the postsynaptic neuron can excite its neighbours, so neurons communicate more effectively
What is LTP and LTD?
· Long-term potentiation (LTP) is the long-lasting enhancement in efficacy of the synapse between two neurons.
· Long-term depression (LTD) is an alternative form of plasticity in which there is a decrease in efficacy between two neurons. (opposite to LTP)
· Both occur in many areas of the central nervous system
· These forms of Hebbian learning might form the physiological basis of memory in the brain.
· Experimental setup: presynaptic neuron, synapse and postsynaptic neuron
· Slowly stimulate the presynaptic neuron and take a recording of the activity in the postsynaptic neuron
· Tetanic stimulation: a large burst of stimulation fired within one or two seconds, which induces LTP
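The contrast between slow and tetanic stimulation can be sketched with a simple rate-based Hebbian synapse. This is an illustrative toy model, not a description of the real experiment: the function name, rates, threshold and learning rate are all assumed values chosen so that only the high-frequency "tetanus" drives the postsynaptic cell above threshold.

```python
# Hypothetical sketch: a rate-based Hebbian synapse. All rates,
# thresholds and the learning rate are illustrative assumptions.

def stimulate(weight, pre_rate, n_pulses, threshold=5.0, lr=0.1):
    """Return the synaptic weight after n_pulses of presynaptic input.

    The postsynaptic neuron fires only when its drive (weight * pre_rate)
    exceeds the threshold; the weight grows only when pre- and
    postsynaptic activity coincide (Hebb's rule).
    """
    for _ in range(n_pulses):
        drive = weight * pre_rate
        if drive > threshold:          # coincident pre + post activity
            weight += lr * pre_rate    # -> potentiation (LTP)
    return weight

w0 = 1.0
w_slow = stimulate(w0, pre_rate=2.0, n_pulses=100)      # slow stimulation
w_tetanus = stimulate(w0, pre_rate=10.0, n_pulses=100)  # tetanic burst
```

Under these assumptions the slow input never fires the postsynaptic cell, so its weight is unchanged, while the tetanus produces a lasting increase in efficacy.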
What is associative LTP?
· A tetanus will elicit LTP if applied to a strong input.
· It will not do so if applied to a weak input alone.
· However, when a tetanus is applied to both inputs simultaneously, LTP occurs in both. This is known as associative LTP.
What are characteristics of LTP?
· Synapses in LTP behave according to Hebbian rules. These are:
1. The tetanus induces repeated contiguous pre- and post-synaptic activity.
2. This results in increased efficacy between pre- and post-synaptic neurons.
· These synapses have four properties:
1. Rapidity – LTP can be induced quickly, within seconds of tetanic stimulation
2. Cooperativity – induction requires cooperative activity in many afferent fibres; an intensity threshold must be crossed
3. Associativity – weak stimulation of one pathway is insufficient on its own, but if a neighbouring pathway is strongly activated at the same time, both pathways undergo LTP
4. Input specificity – LTP is restricted to the stimulated synapses and does not spread to inactive synapses on the same neuron
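Associativity and input specificity follow naturally from a Hebbian rule when two inputs converge on one postsynaptic cell. The sketch below is a toy illustration with assumed rates, threshold and learning rate (the function name is hypothetical): a synapse is strengthened only when its own input is active while the postsynaptic cell fires.

```python
# Hypothetical sketch of associative LTP: two inputs converge on one
# postsynaptic neuron. Rates, threshold and learning rate are
# illustrative assumptions.

def tetanize(weights, rates, threshold=5.0, lr=0.1, n_pulses=10):
    """Apply a tetanus with the given input rates; return new weights.

    The postsynaptic cell fires if the summed drive crosses threshold;
    only synapses whose input was active during postsynaptic firing are
    strengthened (Hebbian, hence associative and input-specific).
    """
    weights = list(weights)
    for _ in range(n_pulses):
        drive = sum(w * r for w, r in zip(weights, rates))
        if drive > threshold:  # postsynaptic neuron fires
            weights = [w + lr * r if r > 0 else w
                       for w, r in zip(weights, rates)]
    return weights

w = [1.0, 1.0]                         # [weak pathway, strong pathway]
weak_alone = tetanize(w, [2.0, 0.0])   # weak input alone: below threshold
paired = tetanize(w, [2.0, 10.0])      # weak + strong together
```

With these numbers the weak input alone never drives the cell, so no LTP occurs; paired with the strong input, the cell fires and both pathways potentiate.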
What is connectionism?
· Memory is distributed in networks of neurons.
· Parallel distributed processing (PDP), McClelland & Rumelhart (1986) – inspired by the parallel nature of brain networks
· Input and output layers with a dense network in between; activity is propagated across the layers
· The architecture consists of a neural network made up of idealised neurons connected with Hebb-like (artificial) synapses
What is backpropagation?
· Backpropagation: the networks are presented with training sets, containing several examples of inputs with corresponding outputs
· Connections have values (weights) reflecting their strengths
· Actual output − desired output = error
· The error signal is propagated backwards and the initially random weightings are adjusted with learning until errors are minimal
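The steps above can be sketched in plain Python: a tiny feedforward network with random initial weights is shown a training set of input–output pairs, the output error is propagated backwards, and the weights are adjusted until the error shrinks. The network size, training set (XOR), learning rate and epoch count are all illustrative assumptions.

```python
# Minimal sketch of backpropagation on a 2-input, 2-hidden, 1-output
# network. All sizes and hyperparameters are illustrative choices.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# training set: inputs with their corresponding desired outputs
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# initially random weightings (third weight in each row is a bias)
w_hid = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    hid = [sigmoid(w[0]*x[0] + w[1]*x[1] + w[2]) for w in w_hid]
    out = sigmoid(w_out[0]*hid[0] + w_out[1]*hid[1] + w_out[2])
    return hid, out

def total_error():
    # sum of squared (actual - desired) errors over the training set
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()

lr = 0.5
for _ in range(5000):
    for x, target in data:
        hid, out = forward(x)
        # error signal at the output, propagated backwards
        d_out = (out - target) * out * (1 - out)
        for j in range(2):
            d_hid = d_out * w_out[j] * hid[j] * (1 - hid[j])
            w_hid[j][0] -= lr * d_hid * x[0]
            w_hid[j][1] -= lr * d_hid * x[1]
            w_hid[j][2] -= lr * d_hid
        w_out[0] -= lr * d_out * hid[0]
        w_out[1] -= lr * d_out * hid[1]
        w_out[2] -= lr * d_out

err_after = total_error()
```

After training, the total error over the training set is smaller than it was with the initial random weightings, which is exactly the behaviour the flashcard describes.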
What are the advantages of connectionism?
· Biological realism – the networks do not need to be fully biologically realistic, and are easy to build and test
· Networks learn through experience – trial and error, as we operate in the real world, using mistakes to adjust behaviour
· Graceful degradation – lesions to the units degrade memory but do not abolish it, because it is distributed across the network
· Analytical solutions not required – parallel networks converge on solutions through learning, so errors can be resolved without explicit equations
What are the disadvantages of connectionism?
· The notion that they can learn anything given enough training – unlike our real brains; we cannot be good at everything
· Information is sub-symbolic – abstract, connection-based representations
· Training is time consuming – the network has to cycle repeatedly through inputs and outputs
· Networks forget learned material fast – when trained on other data sets they overwrite earlier learning, unlike our real brains
How do simple networks learn?
· Is there any evidence that networks of real neurons use Hebbian principles?
· It is possible to manipulate and observe the activity of real networks.
· LTP and LTD can be shown:
· In artificially grown neurons – in lab conditions
· In slices of brain tissue – taken from animals
· In the nervous system