photon dose algorithm, modeling in RTP Flashcards

1
Q

3D vs IMRT vs SRS vs VMAT

A
  1. 3D- RTP data requirements are vendor specific
  2. IMRT- modeling of the MLC is critical, small MLC-defined fields
  3. SRS- modeling of small circular fields (cones) or small MLC fields
  4. VMAT- no specific modeling requirements
2
Q

Attributes of a good dose algorithm

A
  1. Based on first principles
  2. Accuracy (against a standard, typically beam data)
  3. Speed
  4. Expandable (for advancements in treatment delivery)
3
Q

Why have accurate algorithms?

A

RT is driven by maximizing TCP and minimizing NTCP, which are both very sensitive to absorbed dose: a 5% change in dose can yield a 20% change in NTCP.

We learn from protocols, and they depend on accurate dose reporting

4
Q

the photon beam

primary photons before flattening filter…after?

A

primary photons
scatter photons
scatter electrons

Challenge in combining these

5
Q

factors influencing photon scatter

A

As depth goes up, so does scatter
As field size goes up, so does scatter
As energy increases, scatter decreases

6
Q

Sources of errors/accuracy from 1976

A
Absorbed dose at calibration point...2.0%
Additional uncertainty for other points...1.1%
Monitor stability...1.0%
Beam flatness...1.5%
Patient data uncertainties...1.5%
Beam and patient setup...2.5%
Overall, excluding dose calcs...4.1%
Dose calcs...2%, 3%, or 4%
Overall...4.6%, 5.1%, or 5.7%- driven by dose calcs
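The overall figures follow from combining the independent components in quadrature (root-sum-square). A minimal sketch, assuming the card's percentages, that reproduces the totals:

```python
import math

def combine_in_quadrature(components):
    """Root-sum-square combination of independent uncertainties (in %)."""
    return math.sqrt(sum(c ** 2 for c in components))

# Component uncertainties excluding dose calculation (from the card, in %)
non_calc = [2.0, 1.1, 1.0, 1.5, 1.5, 2.5]
print(round(combine_in_quadrature(non_calc), 1))  # 4.1

# Adding a 2%, 3%, or 4% dose-calculation uncertainty reproduces the totals
for calc in (2.0, 3.0, 4.0):
    print(round(combine_in_quadrature(non_calc + [calc]), 1))  # 4.6, 5.1, 5.7
```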
7
Q

two types of dose algorithms

A

Measurement based…rely on water phantom measurements along with correction factors…Clarkson, ETAR

Model based…use measurement data to derive model parameters, but don't calculate directly with it; the model uses the parameters and physics to calculate dose…convolution, MC

8
Q

effective path length…when does it not work?

A

lung in high energy- electron disequilibrium

best for calcs away from inhomogeneities
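Effective path length scales physical depth by relative (electron) density along the ray. A minimal sketch with hypothetical segment lengths and densities:

```python
def effective_path_length(segments):
    """Radiological depth: sum of (physical length * relative density)
    over the segments a ray crosses on its way to the calculation point."""
    return sum(length_cm * rel_density for length_cm, rel_density in segments)

# Hypothetical ray: 3 cm tissue, 5 cm lung (relative density ~0.25), 2 cm tissue
ray = [(3.0, 1.0), (5.0, 0.25), (2.0, 1.0)]
print(effective_path_length(ray))  # 6.25 cm effective vs 10 cm physical depth
```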

9
Q

Limitations of RTAR

A

does not consider position or size of inhomogeneity

10
Q

Limitations of Batho

A

Works well below less dense materials; otherwise overestimates dose
Assumes CPE (charged particle equilibrium)

11
Q

equivalent TAR (ETAR)

A

First method designed for the computer and to use CT data
Correction factor method
3D in principle

12
Q

FFT convolution, Differential scatter air ratio, delta volume

A

Attempts to improve the scatter calculation by accounting for the dimensions and location of the heterogeneity

too complex at the time…abandoned

13
Q

Premodeling era algorithms

A

Correction based
Cumbersome- cobalt era
But opened the door to convolution, superposition, and Monte Carlo

14
Q

Monte Carlo

A

Not used clinically except for electrons and small fields.

Simulate the random trajectories of individual particles by using machine-generated random numbers to sample the probability distributions governing the physical processes involved

By simulating a large number of histories, info can be obtained, particle by particle, about the average of macroscopic quantities such as energy deposition

Codes are built on foundations of measured and calculated probability distributions and are updated as new discoveries are made about how radiation interacts with matter, typically by high-energy physicists.

Serves as the ultimate cavity theory and inhomogeneity correction
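The core sampling idea can be sketched in a few lines. This toy example only samples photon free paths from an exponential attenuation law by inverting its CDF; `mu` is a made-up coefficient and this is nothing like a full transport code:

```python
import math
import random

def sample_free_path(mu, rng):
    """Sample a photon free path (cm) from p(x) = mu*exp(-mu*x) by
    inverting the exponential CDF with a uniform random number."""
    return -math.log(1.0 - rng.random()) / mu

rng = random.Random(42)   # fixed seed so the sketch is repeatable
mu = 0.05                 # hypothetical linear attenuation coefficient, cm^-1
paths = [sample_free_path(mu, rng) for _ in range(100_000)]
mean_path = sum(paths) / len(paths)
print(round(mean_path, 1))  # approaches the mean free path 1/mu = 20 cm
```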

15
Q

Monte Carlo Advantages

A

Algorithms are relatively simple

If the algorithm is good, accuracy is determined by the cross-section data

Method is microscopic, so boundaries are not a problem

Geometries may be complex, but that is OK

16
Q

Monte Carlo disadvantages

A

algorithms are microscopic, so little theoretical insight derived in terms of the macroscopic characteristics of the beam

consumes computing resources

Electron and photon MC still relies on condensed-history algorithms that employ some assumptions, yielding the possibility of systematic errors

17
Q

MC and noise

A

Too few histories makes more noise (statistical uncertainty falls as 1/√N)…want no more than 2% noise
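The 1/√N behavior can be demonstrated with a toy estimator (uniform random "energy deposits", not real transport physics):

```python
import random
import statistics

def noisy_dose_estimate(n_histories, rng):
    """Toy MC estimator: mean 'energy deposited' per history, where each
    history deposits a uniform random fraction of one unit."""
    return statistics.fmean(rng.random() for _ in range(n_histories))

rng = random.Random(0)
rel_noise = {}
for n in (100, 10_000):
    # Repeat the whole estimate 200 times to measure its statistical noise
    estimates = [noisy_dose_estimate(n, rng) for _ in range(200)]
    rel_noise[n] = statistics.stdev(estimates) / statistics.fmean(estimates)
    print(n, f"{100 * rel_noise[n]:.2f}%")
```

Going from 100 to 10,000 histories (100x) cuts the relative noise by roughly a factor of 10, consistent with 1/√N.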

18
Q

Convolution integration

A

Integrates over a volume the product of three factors…the mass attenuation coefficient, the primary fluence, and a polyenergetic kernel

TERMA (total energy released per unit mass) convolved with the kernel yields dose.
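In one dimension the idea reduces to a discrete convolution of a TERMA array with a deposition kernel. A toy sketch with made-up numbers (a real implementation works in 3D over the CT grid):

```python
def convolve_dose(terma, kernel, center):
    """Dose[i] = sum_j TERMA[j] * kernel[i - j + center] (1D superposition).
    `center` is the kernel index corresponding to zero offset."""
    n = len(terma)
    dose = [0.0] * n
    for i in range(n):
        for j in range(n):
            k = i - j + center
            if 0 <= k < len(kernel):
                dose[i] += terma[j] * kernel[k]
    return dose

# Hypothetical: TERMA falls off with depth; kernel spreads energy downstream
terma = [1.0, 0.9, 0.81, 0.73, 0.66]
kernel = [0.1, 0.6, 0.2, 0.1]  # deposits mostly locally, some up/downstream
print([round(d, 2) for d in convolve_dose(terma, kernel, center=1)])
```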

19
Q

typical model parameters

A
electron contamination (build up region)
incident fluence shape
radiation source size
head scatter
jaw/MLC transmission
resolution of calculation (dose grid)
20
Q

Role of the Convolution Geometry

A

TERMA…all energy released…by particles or photons. May be deposited locally or elsewhere

The convolution kernel will describe how that energy will be deposited.

21
Q

role of CT

A

To get at density. The RTP system will have a CT-number/density lookup table and a mu/rho lookup table
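A CT-number/density lookup is just interpolation in a calibration table. A sketch with a hypothetical calibration curve (real tables come from scanning a density phantom):

```python
def hu_to_density(hu, table):
    """Linearly interpolate relative density from a CT-number table.
    `table` is a sorted list of (HU, density) calibration points."""
    if hu <= table[0][0]:
        return table[0][1]
    if hu >= table[-1][0]:
        return table[-1][1]
    for (h0, d0), (h1, d1) in zip(table, table[1:]):
        if h0 <= hu <= h1:
            return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)

# Hypothetical calibration curve: air, water, dense bone
calibration = [(-1000, 0.0), (0, 1.0), (1000, 1.6)]
print(hu_to_density(-500, calibration))  # 0.5
```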

22
Q

dose deposition kernel

A

Monte Carlo simulation of photons of a single energy interacting at a point in water…the resulting energy released is absorbed in the medium in a drop-like pattern called a dose deposition kernel

The kernel changes as a boundary between materials is involved

23
Q

what if we have a spectrum?

A

Polyenergetic kernel

Monoenergetic kernels summed, weighted to match that spectrum
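Summing monoenergetic kernels with spectrum weights can be sketched directly; the kernels and spectrum below are made-up toy values (real kernels come from MC in water):

```python
def polyenergetic_kernel(mono_kernels, spectrum):
    """Weighted sum of monoenergetic kernels; weights come from the spectrum.
    mono_kernels: {energy_MeV: kernel values}; spectrum: {energy_MeV: weight}."""
    total = sum(spectrum.values())
    size = len(next(iter(mono_kernels.values())))
    poly = [0.0] * size
    for energy, weight in spectrum.items():
        for i, value in enumerate(mono_kernels[energy]):
            poly[i] += (weight / total) * value
    return poly

# Hypothetical two-bin spectrum and toy 3-element kernels
kernels = {2.0: [0.7, 0.2, 0.1], 6.0: [0.5, 0.3, 0.2]}
spectrum = {2.0: 3.0, 6.0: 1.0}  # 75% low-energy, 25% high-energy fluence
print([round(v, 3) for v in polyenergetic_kernel(kernels, spectrum)])  # [0.65, 0.225, 0.125]
```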

24
Q

pencil beam

A

Not scaled laterally to account for changes in radiation transport due to inhomogeneities

breaks down at interfaces and for structures smaller than the beam because of the assumption of uniform field

advantage…fast

suffers in inhomogeneity situations

25
Q

superposition vs PB

A

PB errors higher, up to 7%

26
Q

MC vs superposition

A

MC possible in clinic now, but not that much better than superposition convolution.

27
Q

typical set of data requirements

A
  1. CT scanner characterization
  2. Absolute calibration and the geometry it was done under
  3. CAX depth dose (PDD)
  4. Relative dose profiles- open/wedged
  5. Output factors (Sc, Sc,p)
  6. Wedge and tray factors
  7. Electron applicator/insert factors
  8. VSD