Cognitive Flashcards
Cognitive machine challenge
Build a machine that reasons with logic: derive new conclusions (inferences) from facts already known
Use symbolic logic (replace words with symbols so the machine can make inferences automatically)
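The inference idea can be sketched as a tiny forward-chaining rule system; the facts and rules below are made-up placeholders, not from any particular logic machine:

```python
# Minimal forward chaining: apply rules (modus ponens) until no new facts appear.
# Facts and rules are illustrative placeholders.

def infer(facts, rules):
    """facts: set of symbols; rules: list of (premises, conclusion) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # new knowledge derived by inference
                changed = True
    return facts

rules = [({"rains"}, "wet_ground"),
         ({"wet_ground"}, "slippery")]
print(infer({"rains"}, rules))  # derives wet_ground, then slippery
```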
Existing logical computers
Blocks and chess
Blocks 1970s- computer solves spatial arrangements by starting with known facts and inferring how to arrange the blocks
Chess 1990s- start with known facts about chess positions, generate new positions and the consequences of moves, choose the best outcome
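The chess-style search described here (generate positions, evaluate consequences, choose the best outcome) is essentially minimax; a toy sketch on a hand-made game tree whose values are invented for illustration:

```python
# Minimax sketch: leaves are position evaluations, inner nodes are move choices.
def minimax(node, maximising):
    """node: either a terminal score (int) or a list of child nodes."""
    if isinstance(node, int):  # known fact: evaluation of a final position
        return node
    scores = [minimax(child, not maximising) for child in node]
    return max(scores) if maximising else min(scores)

# Toy tree: two moves for us, each answered by two opponent replies.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # opponent minimises: min(3,5)=3, min(2,9)=2 -> we pick 3
```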
Learning with neural networks
Strengthen connections
Before- US to UR (unconditioned stimulus drives the response)
After- CS has strengthened connections to US
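The "strengthen connections" idea can be sketched as a Hebbian update, where a weight grows when the two units are active together; a minimal sketch with made-up values, not the lecture's specific model:

```python
# Hebbian sketch: the CS-to-response connection strengthens when CS and
# response co-occur. Learning rate and activations are illustrative.
def hebbian_update(w, pre, post, lr=0.1):
    """Delta w = lr * pre * post: strengthen when both units fire together."""
    return w + lr * pre * post

w = 0.0
for _ in range(20):              # repeated CS + US pairings
    cs, response = 1.0, 1.0      # both active on every pairing
    w = hebbian_update(w, cs, response)
print(w)  # connection is now strong: CS alone can drive the response
```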
3 types of Cognitive psychology models
Data analysis model- data driven and descriptive (line graph)
Box and arrow model- information processing, conceptual. IMPLICIT assumption
Computational model- information processing as simulation, various levels of abstraction. EXPLICIT assumption
Performs cognitive tasks to learn how it is implemented in the brain, without neurobiology
Explicit vs implicit models
Cognitive process- implicit models
Computational models can make assumptions explicit and assumptions can then be tested (specific predictions for outcomes)
Goals of modelling
Must be specific; verbal models often leave assumptions implicit
Can test assumptions
Predicts outcome, help to select which experiment to perform and distinguish between models
Can be exploratory (not predictive): reveal trends difficult to observe in the brain directly (though oversimplified)
Abstraction and idealisation of real concepts
All models are wrong but some are useful; explanation does not imply prediction
Marr's 3 levels of understanding
How to think about info processing systems like the brain
Computation-WHY (problem) the goal e.g. recognise objects
Algorithm- WHAT (rules) represented approach, how carried out e.g. detect edges, outline
Implementation-HOW (solved physically) e.g. visual neurons sensitive to lines
Bottom up approach
Implementation (start from neural circuits)
Algorithm (what algorithms do these circuits implement)
Computation (what problems can we solve with this)
Neuroscience favours this approach
Top down approach
Computational (specific problem)
Algorithm (to solve problem)
Implementation (how these representations can be implemented in neural circuits)
Experimental techniques favour this method (epistemological bias)
Marr's level favoured by neuroscience
Implementation (has epistemological bias towards it)
Bottom up
Moravec's paradox
Computers can do tasks that are hard for us (e.g. logic and chess) but not tasks that are easy for us (e.g. perception)
Multisensory integration and touch/movement study
Seeing and feeling that you touch the object is important
Anaesthetise finger to block all touch sensations (does not affect motor abilities)
Participants are much slower and less precise at picking up objects; touch and vision need to be integrated
Reference frames
(representational schemas) represent different info from senses, transform representations to common representation
Need to know body orientation and position of object, relative positions(body schema) to external space to unify frames
Reference frames of the senses
Vision- eye centred/retinal, location of stimulus on retina
Audition- head centred, location of sound with respect to ears
Touch-body centred, location of tactile stimulus on skin
Need to convert between reference frames and to external space (the world, irrespective of location and orientation of body)
Reference frames and snake game
Need to know information about the snake's body layout to convert from player perspective to snake perspective
See space in top down view as player, but could control actions of snake in different view (first person)
Reference point
Reference frames need to be in same reference point
"Is the object I'm seeing the same object I'm touching?"
Transformations needed for us to convert between reference frames
Eye to head transformations- orientation of eyes needed
Head to body transformations- orientation of head needed
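These chained transformations can be sketched in one dimension, with locations as angles from each frame's midline (a simplified 1-D sketch; real transformations are three-dimensional):

```python
# 1-D sketch of reference-frame transformations (angles in degrees,
# positive = rightward). Numbers are illustrative.
def eye_to_head(retinal_angle, eye_orientation):
    """Head-centred location = retinal location + eye-in-head orientation."""
    return retinal_angle + eye_orientation

def head_to_body(head_angle, head_orientation):
    """Body-centred location = head-centred location + head-on-body orientation."""
    return head_angle + head_orientation

# Stimulus 10 deg right on the retina, eyes rotated 5 deg left, head 20 deg right:
head_centred = eye_to_head(10, -5)             # 5 deg right of the head midline
body_centred = head_to_body(head_centred, 20)  # 25 deg right of the body midline
print(body_centred)
```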
Body schema
Position of body in space, relative position of body parts
7 body schema factors
Spatially coded- body parts in external and relative space
Adaptive-changes over lifetime
Modular- body parts processed in different brain regions
Interpersonal- others movements make sense to you
Supramodal-combines input from proprioception, senses
Coherent- keeps continuity so we don't feel disembodied
Updated with movement- continually tracks posture
Body image
Body percept ('feel' that you have a body), body concept (what a body in general looks like), body affect (how we feel about our body)
Structural description- hand attached to arm
Body semantics- names for body parts
Does body posture affect perception?
Temporal order judgement tasks
Participant stretches fingers on hand they think was stimulated first
Crossed arm condition-
worse at body perception, body schema INTERFERES with perception, confused about where touched
How does body schema develop?
Experiment-temporal order judgement task
4-month-olds: no difference whether feet are crossed or not, no body schema interference BUT
6-month-olds reach to the incorrect foot half the time when crossed; body schema/posture now matters for perception and interferes with tactile localisation
Crossmodal congruency- tactile discrimination task
Determine which finger was vibrated:
Visual light distractors on same finger that was vibrated OR other finger
Delay= incongruent reaction time - congruent reaction time
Longer delays when distractors on same hand vs different hand
Touch and vision, two reference frames are converted between
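The delay measure is just a difference of mean reaction times; a sketch with invented RTs (the function name and numbers are illustrative, not real data):

```python
# Crossmodal congruency effect: delay = mean incongruent RT - mean congruent RT.
def congruency_delay(congruent_rts, incongruent_rts):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)

congruent = [420, 430, 425]    # ms, distractor light on the vibrated finger
incongruent = [470, 480, 475]  # ms, distractor light on the other finger
print(congruency_delay(congruent, incongruent))  # positive delay -> interference
```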
Crossmodal congruency task with crossed arms
Tactile stimulus on same side of body, visual stimulus on different side of body
Effect of visual distractor (delays) moved with hand when arm crossed as it switches visual hemispheres.
Cross modal interactions mediated by body schema
Peripersonal space
Space immediately around the body, objects can be immediately grasped, can contract and expand
Tool use
Tools are incorporated into body schema
Crossed hands-some delay effects
Body schema disorders
Alice in Wonderland syndrome- distorted size perception: body parts appear smaller or bigger; associated with childhood and migraines
Autotopagnosia- inability to locate body parts, loss of spatial unity (e.g. a fused percept of the fingers)
Phantom limbs- the body schema does not adapt; sensory input is felt from the missing limb, often painful, and the phantom can change in size over time
Neural basis of body schema
Cross modal neurons found in brain, respond to inputs from several senses
Neuron fires when body touched or if visual stimulus moves close to hand
Visual receptive area moves with hand, modified by body schema
What happens with integration when posture changes
Crossmodal effects remap
Neural basis of tool use
Neurons incorporate tools, expands space during tool use, encode the peripersonal space
Monkeys: neurons responded to stimuli at far end of tool which would have been too distant from the monkey to be triggered in peripersonal space (without tools)
Crossing tools- connects the right hand to the left visual field; the tool becomes an extension of the hand
Mirror experiments
Tactile stimulation corresponds to visual stimulation seen indirectly in the mirror reflection at a distance
Reflections near participants' hands are re-coded as originating from peripersonal space near those hands. Peripersonal space extends when you see yourself in a mirror, but not when you see the same body parts of another person
The integration problem
Integrate into a common reference frame
Vision and audition in different reference frame, sensory conflict
(Can see and hear the dog so where is it?)
Reducing lens experiment-Testing conflicts between vision and touch
Participants manipulate an object while looking through a reducing lens- the object looks smaller than it feels
Asked to judge size by vision or touch
Vision- trusted more than touch (VISUAL CAPTURE)
also when responding by drawing or by feeling beforehand (touch or motor responses)
Touch- small but consistent influence
Testing for a sensory hierarchy experiment
Report number of auditory beeps played during flashes of light
Number of beeps determined reported number of visual flashes
Audition can influence vision -NO strict hierarchy
Visual capture not universally true
Visual capture
Trusting visual info over other senses
Modality precision hypothesis
Use the modality with highest precision (lowest uncertainty) for task
Spatial task- vision
Temporal task- audition
Where do we get sensory uncertainty (affecting precision) from?
Perceptual limits: spacing of photoreceptors in the fovea
Neural noise in synapses
Cognitive resource limits (attention)
Maximum likelihood estimation (two models)
How to solve a problem
Normative model- establish bounds, is this the best method? Based on theory e.g. Minimise uncertainty
Process model- how a problem is actually solved, based on data
Normative model (sensory integration)
Pick the integration method that minimises sensory uncertainty (the maximum likelihood estimator)
Integration yields the smallest sensory uncertainty (low variance = low sensory uncertainty)
If one modality is more certain, it can be relied on more
Optimal weights of senses
Haptic input is weighted higher due to lower sensory variance
Integrating information from multiple sources always causes uncertainty to decrease
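The maximum-likelihood rule can be written out explicitly: each cue is weighted by its reliability (inverse variance), and the combined variance is smaller than either cue's alone. A sketch with made-up visual and haptic numbers, where the haptic cue has the lower variance as above:

```python
# Maximum-likelihood cue integration: weight each estimate by 1/variance.
def mle_integrate(estimates, variances):
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    combined_estimate = sum(w * e for w, e in zip(weights, estimates))
    combined_variance = 1.0 / total  # always below the smallest input variance
    return combined_estimate, combined_variance

# Visual size estimate 10 cm (variance 4), haptic estimate 12 cm (variance 1):
est, var = mle_integrate([10.0, 12.0], [4.0, 1.0])
print(est, var)  # estimate pulled toward the more reliable haptic cue; variance < 1
```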
Manipulating sensory uncertainty experiment
Virtual bar with sensory conflict between haptic and visual information
Asked to judge the size of the bar
Determine point of subjective equivalence, manipulate uncertainty
Discrepancies between visual and haptic can change uncertainty
Without visual noise- perception biased to visual input
With visual noise- perception determined by both haptic and visual
High visual noise- determined by haptics
Human performance follows optimal sensory integration rules
Do we always integrate info optimally?
Need to know uncertainty for optimal integration- hard to estimate, easier in sensory perception than in cognitive reasoning
Calculations can be intractable or take a long time- heuristics are suboptimal but fast
Estimating sensory noise vs cognitive noise
Estimating tilt of lines with sensory noise (low contrast) or cognitive noise (random tilts to different lines)
Integrate sensory noise OPTIMALLY but cognitive SUBOPTIMALLY
How might probabilities be encoded in the brain
Mean and variance
Full distribution
Samples
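These coding schemes can be related in a small sketch: a distribution approximated by samples still lets the mean and variance (a summary code) be recovered (illustrative numbers, not a neural model):

```python
import random

random.seed(0)
# Sample-based code: represent a distribution by a set of draws;
# the summary code (mean and variance) can be recovered from the samples.
samples = [random.gauss(5.0, 2.0) for _ in range(10000)]
mean = sum(samples) / len(samples)
variance = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 1), round(variance, 1))  # close to the true mean 5 and variance 4
```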
The correspondence problem
In external space: if hear a woof and see a dog how do you know it is one dog or two?
Conflict tasks
Stroop-Word reading (automatic) interferes with controlled colour naming
Flanker- respond to the central arrow while ignoring flanking arrows (e.g. << > <<)
Simon- press a left or right button; the stimulus location is congruent or incongruent with the response side
Go/no go task- assess inhibitory control
Response manipulations: reverse stroop
Point at coloured patch, colour word appears in middle in a different ink
(Reverse stroop test)
Associations between stimulus and response (response compatibility) cause automaticity; interference is not just a sensory process
Colours interfered with word naming, not the other way around
Attentional manipulations
Stroop effect eliminated or reduced when only one letter was coloured
Automatic processes do not operate independent of attention
Stimulus onset asynchrony manipulations
Onset of colour and word (of stroop) presented at different times
Flash up coloured background then written word
No amount of head start for colour information produced interference on word reading
Automatic processes are not simply faster
Training on stroop task
Trained to name shapes as a colour
Shape then presented in a contrasting colour ink to its 'name'
Initially colour ink interferes with naming the shape (as a colour)
Later- colours interfere with shapes and vice versa
Training- shapes interfere with naming colour
Automaticity is a continuum: with practice anything can become automatic; a process is not either (completely) automatic or controlled
Making skills automatic and controlled processes
Skills are overlearnt stimulus-response pairs, triggered by the environment, rapid and stereotyped
People can alter skills, making them less habit-like, through elaborate cognition
Logan and Crump typing skills experiment
Participants type words, feedback on screen
Errors are corrected OR an error the participant didn't make is shown
When errors were inserted, participants believed they made them (illusion of authorship); sensitive to feedback, not to the actions themselves
Slowed when they made a real error (even when not told) and didn't slow when told they made one
Typing skill hierarchical loops
Skill controlled by hierarchical loops. Automate some parts to pay attention to harder parts. Complex behaviours are neither entirely automatic nor controlled
Outer loop- language comprehension and generation, decides on words to type, monitors output on the screen
Inner loop- finger and keyboard interactions sensitive to feedback from fingers