Week 4: recurrent visual processing Flashcards
What are lateral connections?
Inputs to a unit coming from other units in the same layer
For what kind of networks are temporal inputs important?
Linguistic processing
What is a recurrent convolutional artificial neural network?
The convolutional filter is applied not only to a spatially-limited group of units in the previous layer but also to the surrounding units in the same layer, and possibly to other layers
Why are biological and recurrent convolutional networks time-dependent?
The activity in a layer changes depending on surrounding activity in the same layer and activity in the next layer. There is a dynamic interaction between layers to reach an equilibrium, which makes the response time-dependent
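The dynamics above can be sketched in a few lines. This is a minimal, hypothetical setup (the weights and sizes are illustrative, not from any real network): a layer receives a fixed feedforward drive plus lateral input from units in the same layer, and its activity evolves over time toward an equilibrium.

```python
import numpy as np

# Minimal sketch with hypothetical weights: a layer whose activity depends
# on a fixed feedforward input plus lateral input from units in the same
# layer, iterated over time steps so it settles toward an equilibrium.
rng = np.random.default_rng(0)
n = 5
feedforward = rng.normal(size=n)            # bottom-up drive (fixed)
W_lateral = 0.1 * rng.normal(size=(n, n))   # lateral connections, kept weak
np.fill_diagonal(W_lateral, 0.0)            # no self-connections

activity = np.zeros(n)
history = []
for t in range(50):                         # activity evolves over time
    activity = np.tanh(feedforward + W_lateral @ activity)
    history.append(activity.copy())

# Early and late states differ (the response is time-dependent), but
# successive states converge as the layer approaches equilibrium.
early_change = np.abs(history[1] - history[0]).max()
late_change = np.abs(history[-1] - history[-2]).max()
print(late_change < early_change)
```

Because the lateral weights are weak, the update is a contraction and the activity settles; with stronger recurrence the dynamics can keep moving, which is the time-dependence the card describes.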
How is recurrent activity implemented?
Through lateral and feedback connections
What are dynamic neural oscillations?
The dynamic interactions between excitatory and inhibitory neural populations produce oscillations in neural population activity. At oscillation peaks, excitatory population activity is highest
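A deliberately simplified (linear) sketch of this excitatory-inhibitory loop: excitation drives inhibition, inhibition in turn suppresses excitation, and this push-pull interaction produces oscillations in population activity (damped here; the coefficients are illustrative, not biological measurements).

```python
import numpy as np

# Simplified linear sketch of an excitatory-inhibitory loop: the E
# population drives I, and I suppresses E. The push-pull interaction
# yields (damped) oscillations in population activity.
dt = 0.1
E, I = 1.0, 0.0
E_trace = []
for _ in range(300):
    dE = -0.1 * E - 1.0 * I   # E decays and is suppressed by inhibition
    dI = -0.1 * I + 1.0 * E   # I decays and is driven by excitation
    E, I = E + dt * dE, I + dt * dI
    E_trace.append(E)

# E repeatedly rises and falls; its peaks are the moments of highest
# excitatory population activity. Count the oscillation zero-crossings.
signs = np.sign(E_trace)
zero_crossings = int(np.sum(signs[1:] != signs[:-1]))
print(zero_crossings)
```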
How is recurrent activity implemented in deep networks and how is this different from biological networks?
In a layer with recurrent activity, the activity of each unit at one time point feeds into the same unit at the next time point. In biological networks, neurons never synapse directly with themselves
What is a BLT artificial network?
Bottom-up, lateral and top-down artificial network - type of recurrent network
What happens to the network’s activity in a BLT network?
The network’s activity moves back and forth because the results of each interaction affect the activity of the interacting elements - no fixed state of activity
How do BLT networks change the computation time? Why?
They are much more computationally intensive:
- more extensive filters
- multiple time steps modelled
What developments have there been in recurrent convolutional networks?
Recently developed recurrent convolutional networks incorporate lateral and top-down connections, allowing their contribution to be investigated
What happens to image classification when using BLT networks?
Image classification performance in difficult tasks improves considerably
Why are BLT networks time dependent?
The interactions between different layers make the network time-dependent
What is the difference between a deep feedforward network and a shallow recurrent network?
Recurrent connections effectively make networks deeper
- the same transformation is repeated by recurrent cycles
- each layer performs multiple layer-like operations, but each with the same set of weights, so there are fewer weights to learn
- recurrent networks match neural architecture and activity more closely than deeper networks.
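The weight-sharing point can be made concrete: unrolling a recurrent layer for T time steps applies the same weight matrix T times, which is equivalent to a T-layer feedforward stack whose layers are tied together. A small sketch (sizes and weights are arbitrary):

```python
import numpy as np

# Sketch of weight sharing: a recurrent layer unrolled for T cycles
# equals a T-layer feedforward net whose layers all share one matrix.
rng = np.random.default_rng(1)
n, T = 4, 3
W = 0.5 * rng.normal(size=(n, n))   # one set of weights, reused every cycle
x = rng.normal(size=n)

# Recurrent version: repeat the same transformation for T cycles.
h = x
for _ in range(T):
    h = np.tanh(W @ h)

# "Deep" version: T feedforward layers with tied weights.
deep = x
for layer_weights in [W] * T:       # T layers, all sharing W
    deep = np.tanh(layer_weights @ deep)

print(np.allclose(h, deep))         # identical outputs, T times fewer weights
```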
What happens in attractor/hopfield networks?
Frequently seen patterns of activity produce strong connections between the co-activated group of neurons through Hebbian learning, so the pattern gets built into the connection weights. A new incoming pattern causes a sequence of recurrent activity that activates units even when there is no input to activate them directly: neurons usually activated together become active, so when an incomplete pattern comes in, it is completed based on experience
What does the attractor/hopfield network require?
Multiple recurrent cycles
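Both cards can be illustrated with a toy Hopfield network: a Hebbian outer-product rule stores one pattern in the weights, and repeated recurrent update cycles then complete a degraded version of it. The pattern and number of cycles are arbitrary choices for the sketch.

```python
import numpy as np

# Toy Hopfield sketch: Hebbian learning stores a pattern in the weights;
# recurrent update cycles then complete an incomplete version of it.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])

# Hebbian rule: units active together strengthen their connection.
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)            # no self-connections

# Incomplete input: the first three units receive no input.
probe = pattern.astype(float)
probe[:3] = 0.0

state = probe
for _ in range(5):                  # multiple recurrent cycles
    state = np.sign(W @ state)      # units fill in from lateral input

print(np.array_equal(state, pattern))
```

The missing units are driven by lateral input from the units that were activated, so the stored pattern is recovered; with a single feedforward pass the incomplete probe would stay incomplete.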
How do different networks perform with recognising incomplete or noisy patterns?
Feedforward networks perform poorly, and their performance increases only linearly as more of the object is shown. A recurrent Hopfield/attractor network approaches human performance
How does feedback work in predictive encoding?
- the initial sweep forward extracts image features
- prediction is generated based on previously seen patterns
- the completed prediction is fed back to the lower layer and inhibits activity that fits with this prediction
- the feedback connection carries the prediction
- the feedforward signal becomes only the parts that don’t fit with the prediction, the prediction error
- remaining activity signals prediction error
- Hebbian learning follows this error signal, like back-propagation
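The feedback step in this list can be sketched with made-up numbers: the higher layer sends its prediction down, the lower layer subtracts it from its input, and only the mismatch travels forward.

```python
import numpy as np

# Sketch of one predictive-coding exchange (illustrative numbers):
# feedback carries the prediction down; the lower layer inhibits the
# predicted part of its activity, so only the mismatch travels up.
lower_input = np.array([1.0, 2.0, 3.0, 4.0])   # initial feedforward sweep
prediction  = np.array([1.0, 2.0, 2.5, 4.0])   # generated from experience

# Feedback inhibits activity that fits the prediction...
prediction_error = lower_input - prediction

# ...so the feedforward signal carries only what was not predicted.
print(prediction_error)
```

Only the third unit was mispredicted, so only it continues to signal; learning can then follow this error signal, as the card notes.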
What are PredNets?
A recent type of recurrent network used to implement predictive coding. They predict future activity based on current activity and previous experience, without requiring labels
How can predictive coding happen in networks?
It can emerge spontaneously in a network where each layer has a full set of within-layer connections and feedforward and feedback convolution, without blocks explicitly designed to generate predictions
Why do researchers try to train a network to minimise pre-activation in a predictive network?
If a network is asked to minimise the summed activation of its inputs, it can't just set everything to 0, because the input image will keep producing activation; it has to actively inhibit the input image. This is effective both for reducing energy consumption and for predicting the next input
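A minimal sketch of this idea (hypothetical setup, not any published training scheme): a feedback signal is adjusted to cancel a fixed input drive. Since the image keeps arriving, the only way to shrink the summed activation is to learn to predict and inhibit it.

```python
import numpy as np

# Sketch of activation minimisation: the input can't be set to zero
# because the image keeps driving it, so the network must learn a
# feedback prediction that actively cancels (inhibits) the input.
rng = np.random.default_rng(2)
image = rng.normal(size=8)          # fixed input drive
prediction = np.zeros(8)            # learned inhibitory feedback

for _ in range(200):
    residual = image - prediction   # activation left after inhibition
    prediction += 0.1 * residual    # step that reduces summed activation

before = np.abs(image).sum()
after = np.abs(image - prediction).sum()
print(after < before)               # residual activation has shrunk
```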
Why might the brain use predictive coding?
To reduce energy demands
What is the blue brain project?
A project that has recently finished mapping the full neural circuitry of mice
Why were current processes such as convolution and rectification created?
They were chosen for computational efficiency rather than accurately simulating the underlying process
What has neuroscience revealed as the basic principles of neural responses and their interactions?
- recurrent interactions
- attractor networks
- predictive coding
- Hebbian learning from prediction errors
What are current neural networks good for?
Making predictions about external events, some characteristics of biological understanding and intelligence
What is the software toolbox Neuron used for?
Biophysical simulation of interacting neurons as mapped from mouse brains. It uses the neural parameters and connection patterns actually measured in the mouse brain
What are the two problems with making a similar map of the human brain?
- ethically we can’t make genetically engineered human clones to study
- the computing power needed to run the simulation of even the mouse brain is not available
What is the problem with simulating neural circuitry on a computer?
Most computer processors are designed for linear mathematics
- they can perform any operation accurately, but they are slower and more energy-hungry than the brain, so simulations are currently limited to simpler neural models
How is the computing power problem being overcome?
Running biophysical simulations in real time requires neuromorphic processors, which are designed specifically to run neural simulations. They are developing quickly and may be ready in 10 years