Week 4: recurrent visual processing Flashcards
What are lateral connections?
Inputs to a unit coming from other units in the same layer
For what kind of networks are temporal inputs important?
Linguistic processing
What is a recurrent convolutional artificial neural network?
The convolutional filter is applied not only to a spatially limited group of units in the previous layer, but also to the surrounding units in the same layer, and possibly to units in other layers
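A minimal sketch of this idea (names, shapes, and weights are my own assumptions, not from the lecture): a 1-D layer whose input combines a bottom-up convolution over the previous layer with a lateral convolution over the same layer's activity at the previous time step.

```python
import numpy as np

def recurrent_conv_step(prev_layer, same_layer_prev_t, w_bottom_up, w_lateral):
    # Bottom-up: filter applied to a spatially limited group of units below
    bottom_up = np.convolve(prev_layer, w_bottom_up, mode="same")
    # Lateral: filter applied to surrounding units in the same layer
    lateral = np.convolve(same_layer_prev_t, w_lateral, mode="same")
    return np.maximum(0.0, bottom_up + lateral)  # simple ReLU nonlinearity

x = np.random.rand(16)                 # activity in the previous layer
h = np.zeros(16)                       # this layer starts silent
w_bu = np.ones(3) / 3                  # illustrative bottom-up filter
w_lat = np.array([0.2, 0.0, 0.2])      # illustrative lateral filter
for t in range(5):                     # activity evolves over time steps
    h = recurrent_conv_step(x, h, w_bu, w_lat)
```

Because the lateral term depends on the layer's own earlier activity, the layer's output changes over successive time steps even for a fixed input.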
Why are biological and recurrent convolutional networks time-dependent?
The activity in a layer changes depending on the results of surrounding activity within the layer and of activity in the next layer. There is a dynamic interaction between layers as they settle towards an equilibrium, which makes the activity time-dependent
How is recurrent activity implemented?
Through lateral and feedback connections
What are dynamic neural oscillations?
The dynamic interactions between excitatory and inhibitory neural populations produce oscillations in neural population activity. At oscillation peaks, excitatory population activity is highest
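A toy simulation of this interaction (parameters are arbitrary illustrative assumptions, loosely in the style of Wilson-Cowan rate models): an excitatory population drives an inhibitory one, the inhibitory population suppresses the excitatory one, and the feedback loop shapes the population activity over time.

```python
import numpy as np

def simulate(steps=500, dt=0.1):
    E, I = 0.1, 0.0          # excitatory and inhibitory population rates
    trace = []
    for _ in range(steps):
        # E is driven by itself and external input, suppressed by I
        dE = (-E + np.tanh(1.5 * E - 1.2 * I + 0.5)) * dt
        # I is driven by E (the loop that can generate oscillations)
        dI = (-I + np.tanh(1.4 * E - 0.2 * I)) * dt
        E, I = E + dE, I + dI
        trace.append(E)      # record excitatory population activity
    return np.array(trace)

activity = simulate()
```

With suitable parameters this E-I loop produces rhythmic rises and falls in the excitatory trace; the exact values here are only for illustration.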
How is recurrent activity implemented in deep networks and how is this different from biological networks?
In a layer with recurrent activity, the activity of each unit at one time point feeds into the same unit at the next time point. In biological networks, neurons never synapse directly onto themselves
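This difference can be expressed in the lateral weight matrix (a sketch under my own assumptions about shapes): in the artificial version the diagonal is nonzero, so each unit feeds back to itself; a biologically constrained version zeroes the diagonal.

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
W = rng.random((n, n)) * 0.1        # lateral weights; diagonal = self-connections
W_bio = W * (1 - np.eye(n))         # biological constraint: no self-synapses

h = np.ones(n)
h_ann = np.tanh(W @ h)              # each unit's own activity feeds back to it
h_bio = np.tanh(W_bio @ h)          # units receive input only from other units
```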
What is a BLT artificial network?
Bottom-up, lateral and top-down artificial network - type of recurrent network
What happens to the network’s activity in a BLT network?
The network’s activity moves back and forth because the results of each interaction affect the activity of the interacting elements - no fixed state of activity
How do BLT networks change the computational time? why?
Much more computationally intensive
- more extensive filters
- multiple time steps modelled
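One BLT update step can be sketched like this (names and shapes are my assumptions; for brevity the two layers share weights): each layer's new activity combines bottom-up input from the layer below, lateral input from its own previous state, and top-down input from the layer above. Modelling several such time steps is where the extra computation goes.

```python
import numpy as np

def blt_step(below, same_prev, above, W_b, W_l, W_t):
    # Bottom-up + lateral + top-down contributions, then a nonlinearity
    return np.tanh(W_b @ below + W_l @ same_prev + W_t @ above)

n = 4
rng = np.random.default_rng(0)
W_b, W_l, W_t = (rng.normal(scale=0.1, size=(n, n)) for _ in range(3))

x = np.ones(n)                         # sensory input, constant over time
h1, h2 = np.zeros(n), np.zeros(n)      # two layers, initially silent
for t in range(5):                     # multiple time steps must be modelled
    h1_new = blt_step(x, h1, h2, W_b, W_l, W_t)            # gets top-down from h2
    h2_new = blt_step(h1, h2, np.zeros(n), W_b, W_l, W_t)  # top layer: no top-down
    h1, h2 = h1_new, h2_new
```

Note that h1 at time t depends on h2 at time t-1 and vice versa, so the activity moves back and forth between layers rather than settling in one pass.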
What developments have there been in recent convolutional networks?
Recently developed recurrent convolutional networks incorporate lateral and top-down connections, allowing their effects to be investigated
What happens to image classification when using BLT networks?
Image classification performance in difficult tasks improves considerably
Why are BLT networks time dependent?
The interactions between different layers make the network time-dependent
What is the difference between a deep feedforward network and a shallow recurrent network?
Recurrent connections effectively make networks deeper
- the same transformation is repeated by recurrent cycles
- each layer performs multiple layer operations but each with the same set of weights, fewer weights to learn
- recurrent networks match neural architecture and activity more closely than deeper networks.
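The weight-sharing point can be made concrete (a sketch; names are my own): unrolling one recurrent layer for T time steps applies the same weight matrix repeatedly, like a T-layer feedforward stack with tied weights, so there are far fewer weights to learn.

```python
import numpy as np

def unrolled_recurrent(x, W, T):
    h = x
    for _ in range(T):          # the same transformation is repeated each cycle
        h = np.tanh(W @ h)      # one W reused, vs T distinct matrices feedforward
    return h

W = np.full((3, 3), 0.1)        # a single set of weights
x = np.ones(3)
out = unrolled_recurrent(x, W, T=4)   # effectively a 4-layer computation
```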
What happens in attractor/Hopfield networks?
Frequently seen patterns of activity produce strong connections between the activated group of neurons through Hebbian learning, so the pattern becomes built into the connection weights. A new incoming pattern causes a sequence of recurrent activity that activates units even when there is no input to activate them directly. Neurons usually activated together in the higher layer become active, so when an incomplete pattern comes in, it is completed based on experience
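A toy Hopfield-style network showing this pattern completion (a sketch, not from the lecture): a Hebbian outer-product rule builds the pattern into the weights, and recurrent updates then restore a corrupted version of it.

```python
import numpy as np

def store(patterns):
    # Hebbian learning: co-active units get strong mutual connections
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)       # no self-connections
    return W

def recall(W, state, steps=5):
    for _ in range(steps):       # recurrent activity settles into an attractor
        state = np.sign(W @ state)
    return state

pattern = np.array([1, 1, -1, -1, 1, -1])   # a frequently seen pattern
W = store(pattern[None, :])
cue = pattern.copy()
cue[0] = -1                      # corrupt one element: an incomplete pattern
completed = recall(W, cue)       # recurrent updates restore the stored pattern
```

The flipped unit is reactivated by its neighbours' weights even though its direct input says otherwise, which is the "completion based on experience" described above.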