photon dose algorithm, modeling in RTP Flashcards
3D vs IMRT vs SRS vs VMAT
- RTP data requirements are vendor specific
- IMRT: modeling of the MLC is critical (small MLC-defined fields)
- SRS: modeling of small circular fields (cones) or small MLC fields
- VMAT: no specific modeling requirements
Attributes of a good dose algorithm
- Based on first principles
- Accuracy (against a standard, typically beam data)
- Speed
- Expandable (for advancements in treatment delivery)
Why have accurate algorithms?
RT is driven by maximizing TCP and minimizing NTCP, both of which are very sensitive to absorbed dose; a 5% change in dose can yield a 20% change in NTCP.
We learn from protocols, and they depend on accurate dose reporting
the photon beam
primary photons before flattening filter…after?
primary photons
scatter photons
scatter electrons
The challenge is in combining these components
factors influencing photon scatter
as depth goes up, so does scatter
As field size goes up, so does scatter
As energy increases, scatter decreases
Sources of errors/accuracy from 1976
- Absorbed dose at calibration point…2.0%
- Additional uncertainty for other points…1.1%
- Monitor stability…1.0%
- Beam flatness…1.5%
- Patient data uncertainties…1.5%
- Beam and patient setup…2.5%
- Overall, excluding dose calcs…4.1%
- Dose calcs…2, 3, or 4%
- Overall…4.6, 5.1, or 5.7%, driven by the dose calcs
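The overall figures above follow from adding the independent components in quadrature. A minimal sketch, using the percentages from the list (the component names are shortened here):

```python
import math

# Uncertainty components from the 1976 budget, in percent.
components = {
    "absorbed dose at cal point": 2.0,
    "other points": 1.1,
    "monitor stability": 1.0,
    "beam flatness": 1.5,
    "patient data": 1.5,
    "beam and patient setup": 2.5,
}

def quadrature(values):
    """Combine independent uncertainties in quadrature (root-sum-square)."""
    return math.sqrt(sum(v ** 2 for v in values))

overall_minus_calc = quadrature(components.values())  # ~4.1%
for calc in (2.0, 3.0, 4.0):
    overall = quadrature([overall_minus_calc, calc])  # ~4.6, 5.1, 5.7%
    print(f"dose calc {calc:.0f}% -> overall {overall:.1f}%")
```

This reproduces the 4.1% subtotal and the 4.6/5.1/5.7% overall values, and shows why the dose calculation term dominates the growth of the total.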
two types of dose algorithms
measurement based…rely on water phantom measurements along with correction factors…Clarkson, ETAR
Model based…use measurement data to derive model parameters, but do not calculate from the measurements directly; the model uses the parameters and physics to calc dose…convolution, MC
effective path length…when does it not work?
lung in high energy- electron disequilibrium
best for calcs away from inhomogeneities
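The effective path length idea can be sketched in a few lines: the physical path through each segment is scaled by its relative density, giving a water-equivalent depth. The densities and lengths below are illustrative, not measured values:

```python
# Effective (radiological) path length: scale each physical path segment
# by its relative electron density to get a water-equivalent depth.
def effective_path_length(segments):
    """segments: list of (physical_length_cm, relative_density) pairs."""
    return sum(length * density for length, density in segments)

# Example: 3 cm tissue, 5 cm lung (rho ~ 0.25), 2 cm tissue.
d_eff = effective_path_length([(3.0, 1.0), (5.0, 0.25), (2.0, 1.0)])
print(d_eff)  # 6.25 cm water-equivalent vs 10 cm physical depth
```

The scaling corrects the primary attenuation but says nothing about where scattered energy goes, which is why the method fails near inhomogeneities and under electron disequilibrium.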
Limitations of RTAR
does not consider position or size of inhomogeneity
Limitations of Batho
works well below less dense materials; otherwise it overestimates dose
Assumes CPE
equivalent TAR (ETAR)
first method designed for the computer and to use CT data
correction factor method
3D in principle
FFT convolution, Differential scatter air ratio, delta volume
attempted to improve scatter modeling by accounting for the dimensions of the heterogeneity at its location
too complex at the time…abandoned
Premodeling era algorithms
correction based
cumbersome- cobalt era
but opened door to convolution, superposition, and monte carlo
Monte Carlo
not used except for electrons and small fields.
simulate random trajectories of the individual particles by using machine generated random numbers to sample the probability distributions governing the physical processes involved
by simulating a large number of histories, info can be obtained, particle by particle, about the average of macroscopic quantities such as energy deposition
codes are built on foundations of measured and calculated probability distributions and are updated as new discoveries are made about how radiation interacts with matter, typically by high-energy physicists.
Serve as the ultimate cavity theory and inhomogeneity corrections
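A toy sketch of the card's core idea: machine-generated random numbers sample a physical probability distribution (here the exponential attenuation law, with an illustrative attenuation coefficient), and averaging many histories recovers a macroscopic quantity (the mean free path):

```python
import math
import random

def sample_interaction_depth(mu, rng):
    """Sample a photon free path by inverting the CDF 1 - exp(-mu * x)."""
    return -math.log(1.0 - rng.random()) / mu

rng = random.Random(42)
mu = 0.05          # cm^-1, illustrative linear attenuation coefficient
n = 100_000        # number of histories
mean_depth = sum(sample_interaction_depth(mu, rng) for _ in range(n)) / n
print(mean_depth)  # approaches the mean free path 1/mu = 20 cm as n grows
```

Real MC codes sample many coupled distributions (interaction type, scattering angle, energy loss) per particle, but each step is this same pattern: a random number mapped through a physical probability distribution.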
Monte Carlo Advantages
algorithms are relatively simple
if algorithm is good, accuracy determined by the cross section beam data
method is microscopic, boundaries not a problem
geometries may be complex, but that is ok
Monte Carlo disadvantages
algorithms are microscopic, so little theoretical insight derived in terms of the macroscopic characteristics of the beam
consumes computing resources
Electron and photon MC still rely on condensed-history algorithms that employ some assumptions, yielding the possibility of systematic errors
MC and noise
fewer histories means more noise…want no more than 2% noise
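The statistical noise of an MC estimate falls as one over the square root of the number of histories, so halving the noise costs four times the histories. A sketch with an illustrative per-history spread:

```python
# Relative standard error of the mean over N independent histories
# scales as N**-0.5. The per-history relative spread (2.0, i.e. 200%)
# is illustrative, not from a commissioned MC code.
def relative_noise(sigma_per_history, n_histories):
    return sigma_per_history / n_histories ** 0.5

for n in (10_000, 40_000):
    print(n, relative_noise(2.0, n))  # 2% noise, then 1% at 4x histories
```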
Convolution integration
integrates over a volume the product of three factors…the mass attenuation coefficient, the primary fluence, and a polyenergetic kernel
TERMA (total energy released per unit mass) convolved with the kernel yields dose.
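A 1-D sketch of the TERMA-kernel convolution; both the exponentially attenuated TERMA profile and the dose-spread kernel shape are illustrative, not commissioned beam data:

```python
import numpy as np

# dose = TERMA convolved with a dose-spread kernel (1-D toy example).
terma = np.exp(-0.05 * np.arange(40.0))         # attenuated primary fluence x mu/rho
kernel = np.array([0.05, 0.2, 0.5, 0.2, 0.05])  # normalized kernel (sums to 1)
dose = np.convolve(terma, kernel, mode="same")  # spread released energy spatially
print(dose[:5])
```

The kernel smears the released energy over neighboring points, which is exactly what distinguishes convolution dose from a pure effective-path-length calculation.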
typical model parameters
electron contamination (build up region) incident fluence shape radiation source size head scatter jaw/mlc transmission resolution of calculation (dose grid)
Role of the Convolution Geometry
TERMA…Total Energy Released per unit MAss by the primary photons. May be deposited locally or elsewhere.
The convolution kernel describes how that energy is deposited.
role of CT
to get at density. RTP will have a CT/density look up table and a mu/rho lookup table
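A sketch of an RTP-style CT-number-to-density lookup. The calibration points below are illustrative; a real table comes from scanning a density phantom on the commissioned CT scanner:

```python
import numpy as np

# Illustrative CT-number-to-relative-density calibration points.
hu_points      = np.array([-1000.0, -700.0, 0.0, 1000.0, 3000.0])
density_points = np.array([0.0, 0.30, 1.0, 1.60, 2.80])  # g/cm^3

def hu_to_density(hu):
    """Piecewise-linear interpolation of density from CT number."""
    return np.interp(hu, hu_points, density_points)

print(hu_to_density(0.0))     # 1.0, water
print(hu_to_density(-700.0))  # 0.3, lung-like
```

The density grid produced this way is what scales the path lengths and kernels in the heterogeneity correction.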
dose deposition kernel
Monte Carlo simulation of monoenergetic photons interacting at a point in water…the resulting energy released is absorbed in the medium in a drop-like pattern called a dose deposition kernel
Kernel changes as a boundary between materials is involved
what if we have a spectrum?
polyenergetic kernel
monoenergetic kernels summed, weighted by that spectrum
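Building a polyenergetic kernel can be sketched as a spectrum-weighted sum of monoenergetic kernels. Both the spectrum weights and the toy 1-D kernels below are illustrative:

```python
import numpy as np

# Illustrative beam spectrum: energy (MeV) -> fluence weight (sums to 1).
spectrum = {2.0: 0.5, 6.0: 0.3, 10.0: 0.2}

# Toy 1-D monoenergetic kernels, each normalized to 1.
kernels = {
    2.0:  np.array([0.10, 0.60, 0.30]),
    6.0:  np.array([0.05, 0.50, 0.45]),
    10.0: np.array([0.02, 0.40, 0.58]),
}

# Weighted sum; since the weights sum to 1, the result stays normalized.
poly_kernel = sum(w * kernels[e] for e, w in spectrum.items())
print(poly_kernel)
```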
pencil beam
not scaled laterally to account for changes in rad transport due to inhomogeneities
breaks down at interfaces and for structures smaller than the beam because of the assumption of uniform field
advantage…fast
suffers in inhomogeneity situations
superposition vs PB
PB errors higher, up to 7%
MC vs superposition
MC possible in clinic now, but not that much better than superposition convolution.
typical set of data requirements
- CT Scanner characterization
- absolute calibration and the geometry it was done under
- CAX depth dose PDD
- Relative dose profiles- open/wedged
- Output factors (Sc, Scp)
- Wedge and tray factors
- Electron applicator/insert factors
- VSD