PET & SPECT Flashcards

1
Q

planar imaging and tomography

A

radioactive tracer administered to patient
scintillation detector detects gamma-ray emission
image displays in-vivo distribution of activity
tracer distribution and its variation with time represent organ function or an active molecular process

tomography: slices throughout the body

2
Q

PET&SPECT

A

SPECT (Single Photon Emission (Computed) Tomography)
Substance labelled with a radionuclide (photon emission)
Relatively cheap, but relatively low resolution and sensitivity

PET (Positron Emission Tomography)
Substance labelled with a radionuclide (positron emission)
Expensive due to scanner and cyclotron

3
Q

SPECT

A

radioactive decay is a random/stochastic process

planar imaging, but the detector moves around the subject to acquire multiple slices

counting photon by photon

needs a collimator for spatial information

4
Q

SPECT collimator resolution

A

low resolution (large holes):
lots of photons detected, but blurrier image

high resolution (smaller holes):
fewer photons detected, so longer time needed for image acquisition

5
Q

PET

A

radioactive decay is random
detection of photon-pair
no collimator
measures counts

PET is probably the most sensitive in-vivo medical imaging technique

FDG is by far the most common radiotracer used in clinical practice. It is mainly used for oncology

6
Q

PET physics

A

positron emitted by radionuclide travels a short distance (the positron range)
then annihilates with an electron
producing 2 high-energy (gamma) photons travelling in (nearly) opposite directions

7
Q

coincidence detection

A

detect photons and record all "coincidences", i.e. events where the arrival times of 2 photons (almost) coincide. A timing window of ~5 nanoseconds (5×10⁻⁹ s) is used, because there is some error on the detection of the arrival time, and also because the photons travel at the speed of light from wherever the annihilation occurred along the line of response.
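
A quick sanity check on that window size (assuming the ~5 ns figure above): in 5 ns light travels

$$ c\,\Delta t = 3\times10^{8}\ \mathrm{m/s} \times 5\times10^{-9}\ \mathrm{s} = 1.5\ \mathrm{m} $$

so the window comfortably covers the sub-nanosecond arrival-time differences across a patient-sized field of view, plus detector timing error.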

8
Q

noise and characterisation

A

variance
mean
CV (coefficient of variation) = s.d. / mean

9
Q

poisson noise process

A

Poisson distribution gives you the probability of counting 𝑛 events, when the mean is 𝜇

a discrete distribution
is asymmetric
has a single parameter (the mean 𝜇), as opposed to the normal distribution which has 2 (the mean and standard deviation).
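
In formula form (the standard Poisson probability mass function, with the CV from the previous card):

$$ P(n \mid \mu) = \frac{e^{-\mu}\,\mu^{n}}{n!}, \qquad \mathrm{variance} = \mu, \qquad \mathrm{CV} = \frac{\sqrt{\mu}}{\mu} = \frac{1}{\sqrt{\mu}} $$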

10
Q

reducing noise

A

Inject more radioactivity
Limitations: patient dose (max ~10 mSv), scanner counting limitations, purity of injected substance (radio-chemistry)

Scan longer
Limitations: scanning time available, patient movement

Avoid attenuation (e.g. "arms up" for torso)
Limitations: patient comfort

Scanner/acquisition mode changes

Increase voxel size
Limitations: decreases resolution
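
A hedged back-of-envelope for why more counts help, using the Poisson CV from the earlier card:

$$ \mathrm{CV} = \frac{1}{\sqrt{N}} \quad\Rightarrow\quad N \to 2N \ \text{(e.g. doubling scan time)} \ \Rightarrow\ \mathrm{CV} \to \mathrm{CV}/\sqrt{2} \approx 0.71\,\mathrm{CV} $$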

11
Q

image reconstruction algorithms

A

analytic:
FBP
based on geometrical inversion formulas
fast, linear, but lower quality and inflexible

statistical or iterative:
ML
based on statistical estimation theory
use a "measurement model" and a model of the "noise", and possibly other information (e.g. anatomical)
try to find the "most likely" image by repeated adjustments
slow, non-linear, but potentially higher quality and flexible

12
Q

FBP

A

filtered back projection

Based on mathematical inversion of the complete set of "line integrals" (the X-ray transform).

Common to CT, SPECT, PET, even some MRI sequences.

Measured data have to be precorrected (i.e. for attenuation, scatter etc.) to get as close as possible to "line integrals" before they can be handed to FBP.

backprojection: each ray is traced back over the image. The image is updated by adding, to every pixel along the ray path, a constant value proportional to the number of counts in that projection bin.
This process is repeated for all projection lines, so the image contains an accumulation of all backprojected counts.

sharpening filter before back projection = FBP
filter after back projection = backproject and filter (BPF)

ramp filtering

FBP is the inverse of the X-ray transform.

FBP is fast, but inflexible and can lead to noisy images (low-pass filtering is essential).
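
A minimal NumPy sketch of the recipe above (ramp filter in frequency space, then backprojection); the parallel-beam geometry, array shapes and function names are illustrative assumptions, not from these notes:

import numpy as np

def ramp_filter(sinogram):
    # sinogram: (n_angles, n_bins); apply the |f| ramp filter along the bin axis
    ramp = np.abs(np.fft.fftfreq(sinogram.shape[1]))
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

def backproject(sinogram, angles_rad, size):
    # smear each projection back along its rays and accumulate over all angles
    img = np.zeros((size, size))
    y, x = np.mgrid[0:size, 0:size] - (size - 1) / 2.0
    bins = np.arange(sinogram.shape[1]) - (sinogram.shape[1] - 1) / 2.0
    for proj, theta in zip(sinogram, angles_rad):
        # signed distance of each pixel from the rotation centre, along the detector axis
        s = x * np.cos(theta) + y * np.sin(theta)
        img += np.interp(s.ravel(), bins, proj, left=0.0, right=0.0).reshape(size, size)
    return img * np.pi / (2 * len(angles_rad))

def fbp(sinogram, angles_rad, size):
    # FBP = sharpening (ramp) filter, then backprojection
    return backproject(ramp_filter(sinogram), angles_rad, size)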

13
Q

image reconstruction

A

Data are (in first approximation) proportional to “line integrals” (also known as “projections”).
We can store these in “sinograms”, which are a way to order all possible Lines of Response (by angle and distance from origin)
Image reconstruction attempts to “invert” the measurement, i.e. find the image that is consistent with the measured data.
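
In symbols (standard parallel-beam notation, with f the activity image and (s, θ) indexing the line of response):

$$ p(s,\theta) = \int f\big(s\cos\theta - t\sin\theta,\ \ s\sin\theta + t\cos\theta\big)\,\mathrm{d}t $$

Each row of the sinogram holds p(s, θ) for one angle θ.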

14
Q

iterative reconstruction components

A

forward model

goodness-of-fit function

iterative scheme to improve the fit

The process starts off with an initial estimate of the image (e.g. just a blank image). It then estimates what would be measured (forward project + background), compares this to the measured data (here by using a ratio; see later why), and then uses the discrepancies to compute an improvement to the image, and so on.

15
Q

forward model

A

system matrix
mean of measured data = estimated data

estimated data = forward project(image) + background
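
Written out with the system matrix of the next card (standard notation; b is the background, e.g. scatter and randoms):

$$ \bar{y}_i = \sum_j P_{ij}\,\lambda_j + b_i $$

where ȳ_i is the mean of the data in detector bin i and λ_j the activity in voxel j.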

16
Q

system matrix

A

probabilities of detection

attenuation and blurring decrease the probability

P_ij
i = detector (bin)
j = source (voxel)

17
Q

maximum likelihood

A

the forward model for the mean of the estimated data; and the noise model (i.e. the probability of a measurement given the mean data).
The likelihood therefore tells you how likely some measured data are if you know the image (and the forward model).

Maximum Likelihood (ML): no a priori info on the image, i.e. Prob(Image) = constant

noise models:
normal distribution (Gaussian): maximising L (or equivalently log L) then corresponds to a weighted least-squares fit
Poisson distribution: provides a better noise model for counting statistics
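
The corresponding log-likelihoods, in their standard forms (using the forward-model mean ȳ from the earlier card; constants independent of the image are dropped):

Gaussian: $$ \log L = -\tfrac{1}{2}\sum_i \frac{(y_i - \bar{y}_i)^2}{\sigma_i^2} $$ (maximising this is the weighted least-squares fit)

Poisson: $$ \log L = \sum_i \big( y_i \log \bar{y}_i - \bar{y}_i \big) $$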

18
Q

Prob(image)
prob(data)

A

Prob(Image) is "prior" information, i.e. known prior to the scan. It encodes the probability that the image corresponds to the actual tracer distribution. Including this prior leads to Maximum a Posteriori estimation (a constant prior reduces to Maximum Likelihood).

Prob(Data) is possibly even harder to get your head around. It is the probability of measuring that data at all.

Prob(Image | Data): probability that the distribution of the tracer corresponds to a certain image, given the current measured data
(probability of the image given the data)

Find the most probable image -> Maximum a Posteriori (MAP)
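
The three quantities are tied together by Bayes' rule:

$$ P(\mathrm{Image} \mid \mathrm{Data}) = \frac{P(\mathrm{Data} \mid \mathrm{Image})\,P(\mathrm{Image})}{P(\mathrm{Data})} $$

P(Data) does not depend on the image, so it can be ignored when maximising over images.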

19
Q

gradient ascent

A

contour plots are generated for an "objective" function Ψ; gradient ascent repeatedly steps uphill in the direction of the local gradient of Ψ until it reaches a maximum

20
Q

MLEM

A

maximum likelihood via expectation maximisation
commonly used iterative reconstruction algorithm for Poisson data
It "converges" to the ML solution.
It involves forward and back projection, and compares measured and estimated data by division.

initial image
gets sharper over iterations
At later iterations, the image stabilises (without noise).
At later iterations, noise grows and eventually dominates (with noise).
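
A minimal sketch of the MLEM update, with the system matrix stored as a dense NumPy array (a toy assumption; real scanners use on-the-fly projectors):

import numpy as np

def mlem(P, y, b, n_iter=50):
    # P: system matrix (n_bins, n_voxels); y: measured counts (n_bins,)
    # b: mean background per bin (e.g. scatter, randoms)
    x = np.ones(P.shape[1])                  # start from a uniform image
    sens = P.sum(axis=0)                     # sensitivity image: sum_i P_ij
    for _ in range(n_iter):
        ybar = P @ x + b                     # forward model: estimated data
        ratio = y / np.maximum(ybar, 1e-12)  # compare measured/estimated by division
        x *= (P.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
    return x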

21
Q

MLEM issues

A

requires many iterations to get to the ML solution
the algorithm slows down (later updates change the image less)

lots of computation

too few iterations leads to blurring

22
Q

MLEM OSEM
acceleration: ordered subsets

A

MLEM: each update involves backprojection and forward projection for all projection angles
OSEM: each update only uses a subset of projection angles

so OSEM: fewer projections, less computation time per update
but less data per update: more noise

OSEM (using early stopping and with post-filtering) is much faster than MLEM and is currently the most popular, but it is not optimal
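
Relative to the MLEM sketch above, OSEM just adds an inner loop over subsets of projection rows (the subset index arrays are a hypothetical input here):

import numpy as np  # as in the MLEM sketch

def osem(P, y, b, subsets, n_iter=5):
    # subsets: list of index arrays, each selecting the bins of one subset of angles
    x = np.ones(P.shape[1])
    for _ in range(n_iter):
        for rows in subsets:  # one MLEM-style update per subset
            Ps, ys, bs = P[rows], y[rows], b[rows]
            ybar = Ps @ x + bs
            x *= (Ps.T @ (ys / np.maximum(ybar, 1e-12))) / np.maximum(Ps.sum(axis=0), 1e-12)
    return x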

23
Q

early stopping

A

(stop at a fixed number of iterations, which depends on the scanner, tracer, target organ and local practice)

problem: quantification
each lesion has a different quantitative "recovery"
values are typically underestimated

for "cold" objects (i.e. lower activity than the surrounding regions), MLEM usually initially overestimates the true activity

24
Q

MAP and fitting

A

maximum a posteriori
we want to find the image that is most likely for some measured data, and (via Bayes’ rule), this corresponds to maximising the sum of the log-likelihood (which is higher if the data is close to the estimated data) and the log-prior (which is higher if the image is more likely to correspond to a patient).

Fitting takes the opposite (but equivalent) point of view: it minimises the sum of a "distance" between the data and the estimated data and a penalty (which is lower if the image is more desirable).

Note that the sum of the 2 terms is often called the “objective function”: we are trying to find the image that optimises (i.e. maximise for MAP, minimise for fitting) this function.
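
Schematically (β is the penalty weight; the two views differ only in sign conventions):

MAP: maximise $$ \Psi(\lambda) = \log L(\mathrm{data} \mid \lambda) + \log P(\lambda) $$

Fitting: minimise $$ D\big(\mathrm{data}, \bar{y}(\lambda)\big) + \beta\,R(\lambda) $$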

25
Q

penalties

A

reduce noise:
quadratic penalty: for OSEM, prevents grainy images, makes them smoother
edge-preserving penalties: high contrast, low noise

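A common quadratic penalty (a standard form, not spelled out in the card) sums squared differences between neighbouring voxels:

$$ R(\lambda) = \sum_j \sum_{k \in N_j} w_{jk}\,(\lambda_j - \lambda_k)^2 $$

where N_j are the neighbours of voxel j. Edge-preserving penalties replace the quadratic with a function that grows more slowly for large differences, so genuine edges are penalised less.
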
26
Q

MAP advantages over MLEM

A

MAP algorithms promise more reliable quantification and a better noise-contrast trade-off
not dependent on iteration number, so results are easier to predict
the iterative problem is easier to solve
algorithms can be designed to need fewer iterations for MAP than for MLEM

27
Q

MAP disadvantages

A

More choice:
Which penalty are we going to use?
What parameters are we going to use? (how large is the penalty?)

28
Q

contrast-noise relationship

A

higher contrast comes with higher noise

29
Q

survival probability

A

probability that a photon is not scattered

30
Q

attenuation modelling in SPECT

A

Different points on a LOR each have a different attenuation factor (increasing away from the detector). Therefore, in SPECT we cannot "correct" the measured data for attenuation. This means that SPECT data cannot really be precorrected to fit the "line-integral" model (or "X-ray transform") that we used to derive FBP.

31
Q

Approximate attenuation correction in SPECT: "Chang" method

A

Calculate the average attenuation factor for each point in the object/patient
Calculate a correction factor for each point: correction = 1 / mean attenuation
Multiply the reconstructed NAC ("No Attenuation Correction") image by the correction factor at every point

Nowadays, the method is mostly useful to get a rough estimate of the effect of attenuation in a part of the image.

32
Q

edge/middle photons: chance of reaching the detector

A

photons from the edge have a larger chance of reaching the detector than those from the middle of the patient/phantom (less material to traverse)

33
Q

most popular method for scatter correction in SPECT

A

Triple Energy Window (TEW) scatter estimation: relatively effective and simple

34
Q

measuring transmission in SPECT and PET

A

SPECT: measure attenuation factors to estimate μ; attenuation must then be handled inside the reconstruction
PET: the measured attenuation factor applies to the whole LOR, so the data can be immediately precorrected

35
Q

using CT with SPECT/PET

A

Anatomical localisation
Attenuation correction
Quantification (e.g. scatter, partial volume correction)
Complementary diagnostic information

36
Q

attenuation effects

A

attenuation results in apparently reduced uptake (can look like a perfusion defect)
No Attenuation Correction decreases "relative activity" where attenuation is high
breathing artefacts
CT streak artefacts and movement artefacts (& metal implants): CT artefacts cause PET/SPECT errors

37
Q

2D vs 3D PET

A

2D: the volume is obtained as a stack of slices; SEPTA between crystals
3D: detected LORs are not confined to a plane, but have all possible orientations; NO SEPTA
3D needed a step-up in various aspects (faster electronics, faster image reconstruction, better scatter estimation, etc.). These days, all PET scanners are 3D only.
SNR is better in 3D: more counts! This is partly offset by higher randoms and scatter fractions, but balances to an overall gain (at lower injected activity).

38
Q

time of flight PET

A

the difference in arrival time is related to the location along the LOR where the annihilation occurred. If the difference Δt is zero, the annihilation occurred at the mid-point between the 2 detectors. For a non-zero arrival-time difference:
both gammas travel at the speed of light (c)
difference in time of detection is (t2 - t1)
the emission origin is at distance d from the centre of the LOR, where d = (t2 - t1) c / 2

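A worked example (the 500 ps figure is illustrative, not from the card): a coincidence timing resolution of Δt ≈ 500 ps localises the annihilation to within

$$ \Delta d = \frac{c\,\Delta t}{2} = \frac{3\times10^{8}\ \mathrm{m/s}\times 500\times10^{-12}\ \mathrm{s}}{2} \approx 7.5\ \mathrm{cm} $$

so TOF does not pinpoint the source, but it constrains it to a segment of the LOR, which is what reduces noise.
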
39
Q

TOF timing resolution

A

Higher TOF timing resolution:
reduced uncertainty in localisation
reduced noise

40
Q

TOF advantages

A

Enabling TOF increases your certainty about the location of the annihilation.
TOF converges faster and achieves better contrast for a given noise level (for OSEM iterations).

41
Q

PET coincidences

A

true (unscattered)
scattered
random (accidental)
single (only 1 out of 2 photons detected)

Accidental coincidences occur when 2 photons are detected (within the coincidence timing window) that originate from different annihilations.

42
Q

estimating accidental coincidences: delayed time window

A

If you put a time delay (of a few milliseconds) in your coincidence circuitry and it detects a (delayed) coincidence, you know that the 2 photons were from different annihilations. So the mean number of accidental coincidences in the (non-delayed) coincidence window is equal to the mean number of delayed coincidences.

The delayed method has 2 disadvantages:
The (mean) number of delayed coincidences is low, so it is a noisy estimate of the mean of the randoms
It needs extra electronics and can keep your coincidence circuitry busy, so it could increase dead-time (although that is no longer a problem in current systems)

43
Q

estimating accidental coincidences: randoms from singles (RFS)

A

Provides a nearly noiseless estimate of the mean background. If you can count the number of singles in each crystal in a certain time interval, this can be used to estimate your mean randoms rate.

advantages: singles rates are quite high, so measuring the number of singles in a time interval gives an accurate estimate of the singles rate (Poisson statistics again!). The RFS estimate is far less noisy than the delayed estimate.

disadvantages: you now need to count those singles, so you need extra electronics; and the formula ignores dead-time in the coincidence circuitry.

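The formula the card alludes to is presumably the standard randoms-from-singles expression (τ the coincidence window, S_i and S_j the singles rates of the two crystals):

$$ R_{ij} = 2\tau\,S_i\,S_j $$
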
44
Q

PET resolution

A

factors:
crystal size, around 4-5 mm on current clinical systems (smaller crystals are more expensive, may have somewhat lower detection efficiency, and give more inter-crystal scatter)
positron range
photon non-collinearity (the two photons are not emitted at exactly 180°)

45
Q

SPECT resolution

A

mostly determined by the collimator
usually given as a sum of 2 components: the collimator blurring and the intrinsic detector blurring; both are often modelled as a Gaussian blur

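For two Gaussian blurs the widths combine in quadrature (a standard result; the card's "sum of 2 components" is usually meant this way):

$$ \mathrm{FWHM}_{\mathrm{total}} = \sqrt{\mathrm{FWHM}_{\mathrm{coll}}^2 + \mathrm{FWHM}_{\mathrm{int}}^2} $$
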
46
Q

detector efficiency

A

Probability of detection of a photon depends on:
place and direction of incidence of the photon on the detector block
energy of the incoming photon
scintillator
PMT/APD/SiPM efficiency
electronics
incidence rate (singles dead-time)

Probability of detection of a pair of photons depends in addition on:
timing circuit
downstream processing (coincidence dead-time)

47
Q

detector efficiency in block detectors

A

As the crystals in the middle of the block have a larger chance of stopping gamma photons than the crystals at the edge, the detection efficiency of a block detector is usually highest in the middle of the block.

48
Q

determining detector efficiencies

A

2D PET: fan-sum method
The sum of coincidence counts seen by 1 detector is proportional to its efficiency. You add a lot of LORs, which reduces noise, so this method is easy to use for a regular check/re-calibration of the scanner.

SPECT: uniformity measurement
It is easiest to use a plane source on top of the collimator. As long as that plane source is uniform, the detected counts in each bin will be proportional to the detection efficiency.

49
Q

optimal operating range

A

determined by looking at the CV of the corrected counts

50
Q

QA/QC

A

QA sets the rules; QC tests against those rules

51
Q

daily PET QC

A

The aim is to check the response of the PET detectors. This can be done using either a rotating rod source or a solid phantom such as a germanium (Ge-68) phantom. After the acquisition, the sinograms are inspected for problems: a faulty detector shows up as a diagonal line in the sinogram. Some manufacturers perform tests in which each detector is checked against a number of different parameters.

52
Q

weekly PET QC

A

On a weekly basis, minor tune-ups of the system can be recommended, and it is sensible to check the SUV or the activity concentration measured by the system using a cylindrical phantom filled with F-18 or germanium (Ge-68).

53
Q

infrequent PET QC

A

On a less frequent basis, other detector calibrations are performed together with activity-concentration or SUV calibrations. The frequency of these tests depends on the make of the system and the level of expertise available at the site; many of these tests are performed by the manufacturer's field engineers rather than by local staff.

A critical test is a check of the inherent registration between PET and CT images on dedicated PET/CT systems. Calibrations are normally in place to ensure that images acquired on PET/CT systems are overlaid correctly, but it is important to test these calibrations, e.g. quarterly or as suggested by your manufacturer.

54
Q

QC: image quality, accuracy of attenuation and scatter corrections

A

Calculate:
contrast of hot and cold spheres
variability of background activity
Scatter and attenuation correction algorithms checked with ROI analysis on the central lung insert
Activity chosen to mimic a clinical whole-body study
Additional activity distribution placed outside the FOV
Scan length chosen to simulate whole-body scan time

55
Q

SUV

A

standard uptake value
SUV = measured activity concentration (kBq/ml) / (injected activity / patient weight)

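Written out (standard definition; with weight in grams and 1 g ≈ 1 ml of tissue, the SUV is approximately dimensionless):

$$ \mathrm{SUV} = \frac{C_{\mathrm{img}}\ [\mathrm{kBq/ml}]}{A_{\mathrm{inj}}\ [\mathrm{kBq}]\;/\;W\ [\mathrm{g}]} $$
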
56
Q

partial volume effects (PVE)

A

PET and SPECT
Definition: any effects on the reconstructed image due to limited resolution (system, image reconstruction settings, filters, ...)
PVEs degrade the quantitative accuracy of PET and SPECT.

57
Q

recovery factor

A

Recovery factor (RF, also sometimes called the Recovery Coefficient): ratio of the measured ROI mean (or max) over the true value.
The RF allows you to convert a measured value to the true value: just divide by the RF. Note that the RF depends on object shape (and size).

58
Q

types of PVE

A

Between-voxel effects:
spill-over (or "spill-out"): activity from inside the ROI appears outside
spill-in: activity from outside the ROI appears inside

Within-voxel effects:
"tissue fraction effect": a voxel usually contains different tissues (and certainly different cells); we measure (at best) an average activity in the voxel
discretisation effects for voxels at the edge of an organ/lesion

59
Q

correcting for PVE

A

ROI-based PV correction methods: for objects with known size; usually assume uniform uptake in every region
Deconvolution methods: generic, but sensitive to noise
Hybrid methods: the future

60
Q

region-based PVE correction: geometric transfer matrix (GTM)

A

segment anatomical regions for structures or tissues
smooth the regions to match the emission resolution
determine the contribution of the activity in each structure to each region (geometric transfer matrix w_ij)
compute ROI values t_j
solve for the activity in each region (T_i), given the measurements (t_j); see the sketch below for this final solve

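A minimal NumPy sketch of the final GTM solve (the 2-region matrix and values are made-up illustration, not from the card):

import numpy as np

# w[i, j]: fraction of the true activity of structure j that ends up in ROI i
# after smoothing to the emission resolution (the geometric transfer matrix)
w = np.array([[0.85, 0.10],
              [0.15, 0.90]])
t = np.array([12.0, 30.0])   # measured mean ROI values

T = np.linalg.solve(w, t)    # recover the true activity in each structure
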
61
Q

tracer kinetic modelling: advantages and disadvantages

A

Advantages:
extracted parameters are independent of delivery, and therefore more reliable (e.g. for follow-up studies as part of the evaluation of a therapy)
allows incorporation of biological information

Disadvantages:
longer acquisition time
results can depend on the accuracy of the model

62
Q

motion: causes and effects

A

Motion occurs because of:
respiratory motion (breathing)
cardiac motion (heartbeats)
gross patient motion

Patient motion impacts:
artifacts
diagnostic uncertainty
radiation treatment
quantitation
reproducibility
prevents use of anatomical information (e.g. from CT or MR) for regularisation

Effects: degraded image quality, decreased resolution, decreased lesion detectability, reduced accuracy of quantification

63
Q

reducing motion

A

Step 1: minimise motion (patient management)
Step 2: gating and time frames (split data into different "motion states")
Step 3: motion correction (combine data, e.g. through image registration, to reduce noise)

64
Q

reducing motion: gating

A

Gating reduces the amount of motion, but it means each image is reconstructed with fewer counts. In current practice, the scan duration is therefore usually increased. Instead, it is better to combine the data from all the gates somehow.

Use data from a single gate: lower counts -> higher noise!

Combine PET data from all gates:
estimate motion between gates, from either gated PET images or gated CT (or MR) images
combine all PET gates into a single image: either add the registered PET images, or incorporate motion into the image reconstruction and estimate a single motion-free image

i.e.: sort data into multiple "gates" based on motion information; independently reconstruct all gates; register to a reference gate and (weighted) add

65
Q

PET/MR: issues and benefits

A

Problems:
attenuation correction
issues with QA and testing (lack of phantoms that are suitable for both PET and MR QA/QC)

Benefits:
"one stop shop"
possibilities for "joint solutions": motion from MR, PVC (partial volume correction)

66
Q

attenuation correction for PET/MR

A

Basic problem: attenuation is related to density; the MR signal is not

Various solutions:
segmentation-based
estimate attenuation from the PET data
atlas/database methods (machine learning)
combined approaches

67
Q

PET/MR advantages for PET image quality

A

increased scan duration (to accommodate multiple MR sequences)
motion monitoring and correction
anatomical (and other) information
joint kinetics
future opportunities

68
Q

future

A

Radio-chemistry

Algorithms:
better corrections
better scanner modelling
better regularisation
motion correction
machine learning

Instrumentation:
collimator design
scintillators and detectors
higher TOF resolution
multi-modality

69
Q

SUV max

A

very easy, but a single-point measure, sensitive to noise

70
Q

SUV peak

A

average SUV within a 1 cm³ spherical ROI centred on the "hottest focus"
(still easy, and overcomes some main SUVmax limitations)

71
Q

PSF

A

point spread function: the image (or volume) obtained of an object consisting of a single point source with activity 1 kBq (or other units)

72
Q

Regularisation in PET/SPECT

A

Low-pass filtering of the reconstructed image, or of the data before reconstruction (often used with FBP)

Early stopping of MLEM/OSEM:
relies on initialisation with a "smooth" image and on the behaviour of MLEM (first updates low-frequency features and then "adds in" the rest)
often used, but can lead to wrong quantification (as the convergence rate is object/location dependent)

Maximum a Posteriori (MAP) / penalised reconstruction:
incorporate prior information (e.g. expected image appearance/features) into the probability model
penalise undesirable features when maximising
balance between data-fitting and prior/penalty

Use of a different basis for image reconstruction (e.g. not voxels but "blobs")