EEG Flashcards
How is EEG recorded?
EEG cap with holes for electrodes
On screen you see the electrical activity recorded for every electrode
Quick look at recording and sampling
Conversion from an analogue signal to a digital one.
Analogue signal –> sampling –> digital signal
What is the sampling rate?
It is how often you sample the signal (samples per second)
- For EEG or MEG often 1000 times per second
STRENGTH: you get the signal at millisecond resolution
What is the Nyquist-Shannon theorem?
It sets a band limit on what you can look at in your data: B < fs/2
fs = sampling frequency
Sample at more than twice the frequency of your fastest signal
fs/2 is the Nyquist frequency: you want to sample at more than double the highest frequency in your signal
What is aliasing?
It occurs when fs is too low: frequency content above fs/2 folds back and appears as spurious lower frequencies (see the sketch below)
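A minimal NumPy sketch (my own illustration, not from the lecture): a 30 Hz sine sampled at only 40 Hz (Nyquist frequency 20 Hz) produces exactly the same samples as a 10 Hz sine, i.e. it is aliased.

```python
import numpy as np

f_signal = 30.0        # true frequency (Hz)
fs_low = 40.0          # too low: Nyquist frequency is only 20 Hz
t = np.arange(0, 1, 1 / fs_low)

sampled = np.sin(2 * np.pi * f_signal * t)
# the 30 Hz content folds back to |fs_low - f_signal| = 10 Hz
alias = np.sin(2 * np.pi * (fs_low - f_signal) * t)

# the samples of the 30 Hz sine are (up to sign) identical to a 10 Hz sine
print(np.allclose(sampled, -alias))   # True
```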
What are the advantages and downsides of MEG and EEG?
MEG and EEG = very good temporal resolution, bad spatial resolution
EEG history
EEG first recording 1920s by Berger
First doubted, then confirmed by Adrian and Matthews
Also found in other animals (water beetles and honey bees): an alpha rhythm is found there as well
Oscillations
Alpha: 8-12 Hz
Clinical example of EEG: epilepsy
- neurological disorder with recurring seizures
- abnormal synchronized electrical activity of neurons
- seizures can be generalized or focal
- medication or surgical treatment (need to know where the seizure is/comes from) –> implanted electrodes
- seizure: the amplitude of the signal goes up and firing in the brain becomes highly synchronized, making communication between brain areas very difficult
Localizing epileptogenic zones
Investigating how much additional information MEG can provide in the identification of epileptogenic zones from data recorded between seizures
Place and grid cells
The coordinate system of the brain -> discovered using single-cell recordings
O’Keefe discovered place cells
- they fire when the rat is at a specific spot
Moser & Moser discovered grid cells
- They fire at particular, regularly recurring locations
grid cells in humans
visual exploration and eye tracking
- grid cells for visual space
- discovered with MEG
–> hexadirectional modulation
Strongest hexadirectional modulation of the signal in the medial temporal lobe
What is MEG?
Magnetoencephalography: not as portable as EEG, requires a shielded room
What are we measuring?
–> something the neurons do in the brain
–> something with electricity
–> something that makes it to the scalp
–> summation of activity is needed
Spatial summation
–> PYRAMIDAL neurons:
- quite big
- parallel
- cortex of the brain
Pyramidal neurons are nicely aligned in parallel.
Their activity can thus sum across space: many simultaneously active cells appear as one larger active patch. ACTIVITY must also ALIGN IN TIME!
Temporal summation
Action potentials are too short to sum well over time. The main contribution to M/EEG comes from postsynaptic potentials.
How can cell currents be modelled?
- we can model cell currents as dipoles.
- return currents flow in the volume outside the dendrites
- We can measure the potential between two measuring points
- this still holds when many cells are aligned and concurrently active
- conductivity/volume conduction play a role in how currents flow (see the sketch below)
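A minimal sketch (my own illustration; it assumes an infinite homogeneous conductor, which a real head is not) of the textbook dipole potential V(r) = p·r̂ / (4πσ|r|²), showing how a potential difference between two measuring points arises from a dipole.

```python
import numpy as np

def dipole_potential(r, p, sigma=0.33):
    """Potential (V) at location r (m) of a current dipole p (A*m)
    in an infinite homogeneous medium with conductivity sigma (S/m)."""
    r = np.asarray(r, dtype=float)
    return np.dot(p, r) / (4 * np.pi * sigma * np.linalg.norm(r) ** 3)

# a 10 nA*m dipole pointing "up", like an aligned patch of pyramidal cells
p = np.array([0.0, 0.0, 10e-9])

# potential difference between two measuring points ("electrodes")
v1 = dipole_potential([0.0, 0.0, 0.05], p)   # 5 cm above the dipole
v2 = dipole_potential([0.05, 0.0, 0.0], p)   # 5 cm to the side
print(f"potential difference: {(v1 - v2) * 1e6:.2f} microvolts")
```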
EEG equipment
electrodes –> connector box –> amplifier –> USB adapter –> computer
EEG electrode conventions
10-20 and related systems
- each electrode has a unique name:
- letter or letter combination: region
What are letter combinations of brain regions
O - occipital
PO - parieto-occipital
Fp - fronto-polar (pre-frontal)
F - frontal
Letter-number combinations on the cap
odd: left
even: right
Z: midline (zero)
EEG measurements: reference and ground
- We can measure potentials between two measuring points
–> every EEG electrode needs a reference
- During recording: usually one reference electrode
- Re-referencing is possible
- Reference electrode: usually flat
Extra electrode: ground electrode for noise
Does referencing matter?
It changes what your data looks like
Data referenced to:
- linked mastoids
- average
- FCz
- PO4
IMPORTANT: when comparing studies, check which reference they used!
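A minimal NumPy sketch (made-up data, my own illustration) of re-referencing: the new reference, here the average of all channels or a single channel, is subtracted from every channel.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 32, 1000
data = rng.standard_normal((n_channels, n_samples))   # fake EEG, channels x samples

# re-reference to the common average: subtract the mean over channels
avg_ref = data - data.mean(axis=0, keepdims=True)

# re-reference to a single channel (here channel 0 stands in for, say, FCz)
single_ref = data - data[0:1, :]

print(avg_ref.sum(axis=0).max())   # ~0: average-referenced channels sum to zero
```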
What do you not want as reference?
You do NOT want a noisy reference
Superposition of activity
electrodes measure a mix of underlying sources
what causes EEG data to be noisy?
- heartbeat
- bad electrode (e.g., poor connection)
- eyeblink
- movement artifacts
Dipoles, currents, and fields: MEG
- current source (arrow) and current lines
- magnetic field
–> the dipoles also produce magnetic fields
–> you use the right-hand rule to understand how they behave
MEG shielding
reduce interfering noise
passive: thick layers of mu-metal and aluminium
active: cancellation coils
MEG history: Cohen and the MSR
- first human MEG was measured by David Cohen in the 1960s
- built a magnetically shielded room (MSR) to get satisfying signal quality
MEG equipment
- the magnetic fields of the brain are so tiny that special sensors are needed
- magnetometers are pick-up coils
- the induced current is tiny, so the coil must have essentially no resistance
- material needs to be superconducting (make it cold: 4 K = -269 °C, submerged in liquid helium)
- SQUID: superconducting quantum interference device
Sensor types of MEG
There are different types of MEG sensors: magnetometers and gradiometers. Gradiometer setups help reduce non-brain noise.
a: magnetometer
b: planar gradiometer
c: axial gradiometer
Understanding difference between EEG and MEG in data generation
- current goes in different directions in EEG vs MEG
EEG vs MEG
- EEG measures potentials that stem from currents
- MEG measures magnetic fields that stem from currents
- EEG can see sources independent of their orientation
- MEG can only see tangential sources and is blind to radial ones
tangential: both with EEG and MEG
radial: only EEG
Orientation of pyramidal cells in gyri and sulci
Averaging
- noise is high in single trials
- averaging N trials decreases the noise by a factor of 1/sqrt(N)
- this increases the signal-to-noise ratio (SNR); see the sketch after this list
- needs precise timing information
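A minimal NumPy sketch (simulated data) showing that averaging N noisy trials shrinks the noise by roughly 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_samples = 100, 500
t = np.linspace(0, 0.5, n_samples)

signal = 2.0 * np.sin(2 * np.pi * 10 * t) * np.exp(-t / 0.1)          # fake ERP
trials = signal + rng.normal(scale=5.0, size=(n_trials, n_samples))   # add noise

average = trials.mean(axis=0)

noise_single = (trials[0] - signal).std()
noise_avg = (average - signal).std()
print(f"single-trial noise: {noise_single:.2f}")
print(f"averaged noise:     {noise_avg:.2f} (expected ~{noise_single / np.sqrt(n_trials):.2f})")
```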
what are triggers
- time codes sent to the amplifier, which are marked in the data so the data can later be cut around them
- data snippets can then be cut around relevant triggers and further analyzed. We call those epochs or trials
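A minimal NumPy sketch (made-up data and hypothetical trigger positions) of cutting epochs around trigger samples.

```python
import numpy as np

fs = 1000                                                        # sampling rate (Hz)
data = np.random.default_rng(1).standard_normal((32, 60 * fs))   # channels x samples
trigger_samples = np.array([5000, 12000, 23000, 41000])          # hypothetical triggers

tmin, tmax = -0.2, 0.8                                           # epoch window (s)
start, stop = int(tmin * fs), int(tmax * fs)

epochs = np.stack([data[:, trig + start:trig + stop] for trig in trigger_samples])
print(epochs.shape)   # (n_trials, n_channels, n_samples_per_epoch)
```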
event-related potentials or fields
visual event-related potential –> average = outcome
topographies
- averaging epochs of similar kind (e.g., picture presentations)
- visual, auditory,… ERFs and ERPs
- also cognitive events: e.g., "surprise" or "making errors"
- averaging usually preceded by pre-processing
Pre-processing
- data cleaning (remove artifacts)
- EEG: re-referencing
- filtering: remove drift and remove noise
- High-pass filter –> lets frequencies higher than the cut-off pass
- Low-pass filter: the opposite; lets frequencies lower than the cut-off pass
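A minimal SciPy sketch (my own illustration; the cut-off values are just examples) of a high-pass filter removing slow drift and a low-pass filter removing fast noise.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                        # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
data = np.sin(2 * np.pi * 10 * t) + 0.5 * t      # 10 Hz signal plus slow drift

# high-pass at 0.5 Hz: lets frequencies above the cut-off pass (removes the drift)
b_hp, a_hp = butter(4, 0.5, btype="highpass", fs=fs)
highpassed = filtfilt(b_hp, a_hp, data)

# low-pass at 40 Hz: lets frequencies below the cut-off pass (removes fast noise)
b_lp, a_lp = butter(4, 40.0, btype="lowpass", fs=fs)
filtered = filtfilt(b_lp, a_lp, highpassed)
```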
ERP labelling standards
visually evoked potential (Oz-Fz)
- Latency labelling: polarity (N = negative, P = positive) plus the latency in ms (e.g., N100)
Ordinal labelling: N1, P1, N2
latencies don’t always fit perfectly… many experimental factors play a role
Error-related negativity
experiment: decide if the middle arrow is pointing to the left
time point 0: the participant's response (response-locked)
incorrect –> negativity (anterior cingulate cortex): ERN (error-related negativity)
Source reconstruction of MEG and EEG data
Goal: estimate the source activity underlying our channel-level measurements
- disentangle measured source activity
- increase the spatial resolution of M/EEG data
forward and inverse solution
forward (easier): going from an active (real or simulated) source in the brain to the topography you would measure
inverse: starting from the measured topography and estimating where it comes from
why is the inverse problem more difficult?
Ill-posed problem:
- many more source points (thousands) than sensors (hundreds)
- infinite number of solutions
- use constraints to make it solvable:
  > biophysical constraints: the forward model
  > additional mathematical constraints in the inverse solution
Biophysical constraints: the forward model
The forward solution describes the relation between known sources and the channel-level activity they produce
- simulation
What does the forward model incorporate?
- source model (where in the brain do you have sources and how do we mathematically model them?)
- head model ( what is between those sources and the scalp? tissue/conductivity)
- channel properties (how do you mathematically model your electrodes or sensors)
The source model
How should we model the source (activity)?
- Temporally and spatially aligned neuronal activity sums up to “big dipoles”
- sources are modelled as equivalent current dipoles
The head model
How should we model how currents/fields propagate through the head?
Sometimes also called volume conductor model
- describes the volume (geometry)
- describes the electric properties (the conductivity)
Why do we need a head model for MEG?
Volume currents also generate magnetic fields, not only the primary current sources!
What are the 3 most-used head models?
- single shell models:
- models only the brain
- can only be used for MEG: skull and scalp relevant for EEG
- boundary element models (BEM)
- models shells (boundaries)
- usually brain, skull, scalp
- homogeneous and isotropic
- finite element models (FEM)
- models volumes
- also usually 3 compartments
- allows inhomogeneous and anisotropic conductivities
How to build a volume conductor model?
Step 1: MRI segmentation
Step 2: create boundaries (example: BEM)
Step 3: add conductivities for each boundary
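A minimal MNE-Python sketch (the toolbox choice and the placeholder names are my assumptions, not from the lecture) of building a three-shell BEM volume conductor from a FreeSurfer-segmented MRI.

```python
import mne

subjects_dir = "/path/to/freesurfer/subjects"    # placeholder path
subject = "my_subject"                           # placeholder subject name

# Step 2: create brain/skull/scalp boundary surfaces from the segmented MRI
model = mne.make_bem_model(
    subject=subject,
    ico=4,
    conductivity=(0.3, 0.006, 0.3),   # Step 3: conductivities (brain, skull, scalp) in S/m
    subjects_dir=subjects_dir,
)
bem = mne.make_bem_solution(model)
```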
Sensor properties: coregistration
Unifying the head model and the channels in one coordinate system
Example (MEG): the volume conductor model is in MRI space, the sensors are in MEG head space
calculate coordinate transform between those coordinate systems
Single dipole models
Idea: Find one dipole that explains the measured data best
Manipulate the following parameters until fit is best:
- location of dipole
- orientation of dipole
- strength of dipole
Solved via gradient descent
Multiple dipole models are also possible
Dipole fit for an auditory evoked field –> use the dipole to create a forward solution and compute the goodness of fit
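A minimal MNE-Python sketch (toolbox choice is my assumption) of a single-dipole fit; evoked, noise_cov, bem, and trans are assumed to exist from earlier steps.

```python
import mne

# assumed inputs from earlier steps:
# evoked    - averaged data (e.g., auditory evoked field)
# noise_cov - noise covariance matrix
# bem       - volume conductor model
# trans     - coregistration transform (head space <-> MRI space)

# fit one equivalent current dipole per time point (location, orientation, strength)
dip, residual = mne.fit_dipole(evoked, noise_cov, bem, trans)

print(dip.gof.max())   # goodness of fit (%) at the best-fitting time point
```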
What are pros and cons of single dipole models
- sparse model with goodness of fit measure
- the assumption of a single active source is probably wrong
- no "brain imaging"
- good if one single dipole explains a high percentage of the data, e.g. in epilepsy
Minimum norm estimation
Idea: estimate source strength at pre-defined positions all across cortex
set up a source space on cortical surface
Constraints:
- strength gets estimated across all dipoles
- distribution of sources with minimum current
- minimizing the residuals (error towards the measured data)
Different flavours:
MNE, dSPM, eLORETA, sLORETA, … (NO NEED TO LEARN BY HEART!)
Minimum norm estimation in practice
MNE solution for auditory evoked field
pros/cons:
- activity gets estimated over whole brain
- all measured activity (+noise) lands in source space
- lower spatial resolution
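A minimal MNE-Python sketch (toolbox assumed) of a distributed minimum-norm-type estimate; evoked, fwd (forward solution), and noise_cov are assumed inputs.

```python
import mne
from mne.minimum_norm import make_inverse_operator, apply_inverse

# assumed inputs from earlier steps: evoked, fwd (forward solution), noise_cov

inv = make_inverse_operator(evoked.info, fwd, noise_cov)

# lambda2 controls regularization; method selects the "flavour" (MNE, dSPM, sLORETA, eLORETA)
stc = apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="dSPM")
print(stc.data.shape)   # (n_source_points, n_time_points)
```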
Beamforming (aka: spatial filtering)
Idea: estimate source activity at pre-defined positions independently
- set up a source space on the surface or throughout the brain
you get a much more focal estimate
constraints:
- for each source point, create a spatial filter that:
  > passes activity of this source point without loss
  > attenuates other sources: minimizes the variance across all sources
different flavours again
Beamforming in practice
Beamformed auditory evoked field (the positive/negative sign does not mean anything, only the amplitude does)
Pros&cons:
- activity gets estimated over whole brain
- selective to activity (noise suppressant)
- needs very precise forward model
- tricky with correlated sources
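A minimal MNE-Python sketch (toolbox assumed) of an LCMV beamformer, one common beamforming flavour; evoked, fwd, data_cov, and noise_cov are assumed inputs.

```python
import mne
from mne.beamformer import make_lcmv, apply_lcmv

# assumed inputs: evoked, fwd (forward solution), data_cov, noise_cov

# one spatial filter per source point: unit gain for that point,
# minimal variance (i.e., attenuation) for everything else
filters = make_lcmv(evoked.info, fwd, data_cov, reg=0.05,
                    noise_cov=noise_cov, pick_ori="max-power")
stc = apply_lcmv(evoked, filters)
```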
What is an oscillation?
An oscillation is a signal that repeats periodically in a very specific way
How can an oscillation be described?
- frequency (cycles per second) - how fast is it?
- amplitude - how big is it?
- phase - where in the cycle are we at a given point in time? - degrees or radians (from 0 to 2pi for one cycle)
Analysis of oscillation
Averaging does not seem like the best solution: it does not help us describe the oscillation. Plus, such signals are often induced, not evoked.
induced: happens each time, but not exactly at the same time
evoked: happens always at the same time
- the signal is sometimes delayed/shifted, so trials might cancel each other out
Analysis of oscillations
frequency: periods per second, unit Hertz (Hz)
power: strength of the signal, unit: e.g. uV^2
Compute power at different frequencies in the signal: Fourier transform
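A minimal NumPy sketch (simulated data) of computing power at different frequencies with the Fourier transform.

```python
import numpy as np

fs = 1000
t = np.arange(0, 2, 1 / fs)
# fake signal: 10 Hz "alpha" plus noise
signal = 3 * np.sin(2 * np.pi * 10 * t) + np.random.default_rng(0).standard_normal(t.size)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
power = np.abs(spectrum) ** 2                # power per frequency bin (a.u.^2)

print(freqs[np.argmax(power[1:]) + 1])       # peak frequency, ~10 Hz (DC bin skipped)
```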
Rhythms of the brain
In Hz
delta: 1-3
theta: 4-7
alpha: 8-12
beta: 13-30
gamma: 30-100
Where do oscillations come from?
Oscillations are a self-organizing phenomenon
- communication and drive
- different mechanisms for different frequencies
- inhibitory and excitatory connections play a role
- sometimes, pacemakers are assumed
- not all mechanisms are fully understood
ING and PING mechanisms
- ING: interneuronal network gamma
- aka: I-I model, inhibitory model
- GABAergic interneurons: gamma-aminobutyric acid
- zero-lag: simultaneous firing
- phase-lag: one neuron fires first (less stable) –> global network synchronization
(figure: spiking rastergram on top, membrane potentials of two cells at the bottom)
- PING: pyramidal-interneuronal network gamma
- aka: E-I model, excitatory-inhibitory model
- AMPAergic pyramidal neurons and GABAergic interneurons
- strength of inhibitory and excitatory input needs to be balanced
Alpha revisited
It started with Berger in the 1920’s…
the amplitude of alpha goes up when you close your eyes, then it disappears when you open them again
A logical hypothesis followed:
- alpha when you “do nothing”
BUT: alpha scales with memory load (more items to retain = more alpha)
many theories:
- inhibition
- routing of information
- attention-related
-eye-movement-related
So, what function do oscillations have?
We do not know if they have any function at all; they might be an epiphenomenon.
The function of oscillations is still one of the big questions of electrophysiology in neuroscience
Is there more activity at slower or faster frequencies? What ratio can describe this?
- 1/f: inherent to the brain - always more activity at slower frequencies than higher frequencies
- frequency bands
Limitations of the Fourier transform
- can only resolve frequencies for which a whole number of full cycles fits the data snippet
- we call this the frequency resolution
- frequencies that do not fit show up blurred: this is called spectral leakage
e.g. 10 Hz: yes! 10 full cycles fit in one second
BUT 10.5 Hz: NO
the lowest frequency you can resolve is at the same time your frequency resolution:
1/T (T = length of the data in seconds)
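A minimal NumPy sketch (simulated data) of the 1/T frequency resolution and spectral leakage: with 1 s of data the resolution is 1 Hz, so 10 Hz lands in a single bin while 10.5 Hz smears across neighbouring bins.

```python
import numpy as np

fs, T = 1000, 1.0                        # 1 s of data -> frequency resolution 1/T = 1 Hz
t = np.arange(0, T, 1 / fs)

for f in (10.0, 10.5):
    power = np.abs(np.fft.rfft(np.sin(2 * np.pi * f * t))) ** 2
    concentration = power.max() / power.sum()   # fraction of power in the strongest bin
    print(f"{f:4.1f} Hz: {concentration:.2f} of the power is in the peak bin")
```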
What about time?
narrow-band (lower gamma spectrum) gamma oscillation –> baseline vs visual stimulation
time-frequency reconstruction
resolving power in time with a sliding window - each window yields a power estimate
larger window - more blurred in time
narrower window - frequency resolution goes down
so: either good resolution in time, or good resolution in frequency space
Recap:
- the estimate of a time window gets represented at its center time point
- time and frequency resolution are now determined by the time window!
- time-frequency trade-off: blurring time vs frequency domain
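A minimal SciPy sketch (simulated data) of a sliding-window time-frequency analysis; the window length nperseg is what sets the time-frequency trade-off.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000
t = np.arange(0, 3, 1 / fs)
# fake data: a 40 Hz gamma burst appears only between 1 s and 2 s
data = np.random.default_rng(0).standard_normal(t.size)
burst = (t > 1) & (t < 2)
data[burst] += 2 * np.sin(2 * np.pi * 40 * t[burst])

# longer window (nperseg) -> finer frequency resolution, more blurring in time
freqs, times, power = spectrogram(data, fs=fs, nperseg=500, noverlap=450)
print(power.shape)   # (n_frequencies, n_time_windows)
```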
cross-frequency interactions
There seems to be an interplay between different frequencies
e.g. faster oscillations are nested in slower oscillations –> the slower one acts as a carrier
Gamma power is modulated by theta phase in human cortex –> i.e., at the peak of theta there is no gamma power, while in the trough gamma power is high
–> there are theories that this coupling is functionally relevant
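A minimal sketch (simulated data; a simple phase-binning measure, not any specific published coupling index) of checking whether gamma amplitude depends on theta phase, using band-pass filters and the Hilbert transform.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)                        # 6 Hz theta
gamma = (1 - theta) * 0.5 * np.sin(2 * np.pi * 40 * t)   # gamma largest in the theta trough
signal = theta + gamma + 0.1 * np.random.default_rng(0).standard_normal(t.size)

def bandpass(x, lo, hi):
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

theta_phase = np.angle(hilbert(bandpass(signal, 4, 8)))
gamma_amp = np.abs(hilbert(bandpass(signal, 30, 50)))

# bin gamma amplitude by theta phase: it should peak near +/-pi (the theta trough)
edges = np.linspace(-np.pi, np.pi, 19)
mean_amp = [gamma_amp[(theta_phase >= lo) & (theta_phase < hi)].mean()
            for lo, hi in zip(edges[:-1], edges[1:])]
print(edges[np.argmax(mean_amp)])   # close to +/-pi
```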
MEG in a cognitive task
background:
a lot of discussion on how visual awareness works
- strong link between visual awareness and neural representation assumed
- is there maintenance of invisible information, and if so, where in the brain?
Experiment:
- target with varying orientation and varying contrast: 0%, 25%, 50%, 75%
- judge tilt (left or right) and visibility (0-3)
- 180 trials, 20 participants
–> participants understood the assignment
–> visibility reports correlate with performance
–> even if participants said they did not see anything, they were better than chance in their estimation
Decoding of target vs no target
Which sensors or brain regions show a difference between target and no-target (visibility 0%) trials?
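A minimal scikit-learn sketch (my own illustration with simulated data; not the authors' pipeline) of time-resolved decoding of target vs no-target from channel data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 180, 50, 100
X = rng.standard_normal((n_trials, n_channels, n_times))   # fake M/EEG epochs
y = rng.integers(0, 2, n_trials)                           # 1 = target, 0 = no target
X[y == 1, :, 40:60] += 0.5                                  # inject a weak "target" effect

# decode separately at every time point; chance level is 0.5
scores = [cross_val_score(LogisticRegression(max_iter=1000),
                          X[:, :, t_idx], y, cv=5).mean()
          for t_idx in range(n_times)]
print(max(scores))   # above chance around the injected effect
```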
Decoding performance: source space
- region-of-interest analysis: choose brain regions a priori
- shows propagation of activity
Parallel encoding of multiple stimulus features
target presence
target contrast
target spatial frequency
target phase
target angle
Decoding of target vs no target
Which sensors or brain regions show a difference between target and no target (visibility 0%) trials?
also decision features are encoded!
visibility 0
at early time points, a non-visible target looks almost like a highly visible one: the information is still recordable in the brain
- visual processing is independent of visibility ratings
- irrelevant to exam
Evaluation of e