Photon dose algorithms and modeling in RTP Flashcards
3D vs IMRT vs SRS vs VMAT
- RTP data requirements are vendor specific
- IMRT: modeling of the MLC is critical (small MLC-defined fields)
- SRS: modeling of small circular fields (cones) or small MLC fields
- VMAT: no specific modeling requirements
Attributes of a good dose algorithm
- Based on first principles
- Accuracy (against a standard, typically beam data)
- Speed
- Expandable (for advancements in treatment delivery)
Why have accurate algorithms?
RT is driven by maximizing TCP and minimizing NTCP, both of which are very sensitive to absorbed dose: a 5% change in dose can yield roughly a 20% change in NTCP.
We learn from protocols, and they depend on accurate dose reporting
the photon beam
primary photons before flattening filter…after?
primary photons
scattered photons
scattered electrons
The challenge is in combining these components
factors influencing photon scatter
As depth increases, scatter increases
As field size increases, scatter increases
As energy increases, scatter decreases
Sources of error and uncertainty in delivered dose (1976 analysis)
Absorbed dose at calibration point…2.0%
Additional uncertainty for other points…1.1%
Monitor stability…1.0%
Beam flatness…1.5%
Patient data uncertainties…1.5%
Beam and patient setup…2.5%
Overall, excluding dose calculation…4.1%
Dose calculation…2%, 3%, or 4%
Overall…4.6%, 5.1%, or 5.7%
The overall uncertainty is driven by the dose calculation.
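The overall figures follow from combining the individual uncertainties in quadrature (root sum of squares). A minimal sketch of that arithmetic in Python, assuming the component list above; the helper name is illustrative:

```python
import math

# Component uncertainties (%) excluding the dose calculation
components = [2.0, 1.1, 1.0, 1.5, 1.5, 2.5]

def combine_in_quadrature(values):
    """Combine independent uncertainties as a root sum of squares."""
    return math.sqrt(sum(v ** 2 for v in values))

print(f"Excluding dose calculation: {combine_in_quadrature(components):.1f}%")  # ~4.1%

# Adding a 2%, 3%, or 4% dose-calculation uncertainty reproduces 4.6%, 5.1%, 5.7%
for calc in (2.0, 3.0, 4.0):
    total = combine_in_quadrature(components + [calc])
    print(f"Dose calc {calc:.0f}% -> overall {total:.1f}%")
```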
two types of dose algorithms
Measurement based…rely on water phantom measurements along with correction factors…examples: Clarkson, ETAR
Model based…use measurement data to derive model parameters but do not calculate with the measurements directly; the model uses the parameters and physics to calculate dose…examples: convolution/superposition, Monte Carlo
effective path length…when does it not work?
Lung with high-energy beams, where electronic disequilibrium occurs
Best for calculation points away from inhomogeneities
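Where it does apply, the effective (radiological) path length scales each segment of the ray by its relative electron density and looks up depth-dose or TAR data at the scaled depth. A minimal sketch; the segment values and function name are illustrative:

```python
def effective_depth(segment_lengths_cm, relative_electron_densities):
    """Radiological depth: physical segment lengths scaled by relative
    electron density (water = 1.0) and summed along the ray."""
    return sum(length * rho for length, rho in
               zip(segment_lengths_cm, relative_electron_densities))

# Example: 5 cm water, 8 cm lung (rho ~0.25), 3 cm water
d_phys = 5 + 8 + 3                                     # 16 cm physical depth
d_eff = effective_depth([5, 8, 3], [1.0, 0.25, 1.0])   # 10 cm radiological depth
```

Depth-dependent data are then evaluated at d_eff rather than at the physical depth.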
Limitations of RTAR
Does not consider the position or size of the inhomogeneity
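The ratio-of-TAR correction uses only the effective and physical depths for the same field size, which is why the size and position of the inhomogeneity drop out. A minimal sketch, with the TAR lookup passed in as a placeholder callable:

```python
def rtar_correction_factor(tar, d_eff_cm, d_phys_cm, field_size_cm):
    """RTAR inhomogeneity correction: CF = TAR(d_eff, r) / TAR(d, r).
    `tar` is any callable returning the tissue-air ratio for (depth, field size)."""
    return tar(d_eff_cm, field_size_cm) / tar(d_phys_cm, field_size_cm)

# corrected_dose = homogeneous_dose * rtar_correction_factor(tar_table, 10.0, 16.0, 10.0)
```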
Limitations of Batho
Works well below less dense materials; otherwise it overestimates dose
Assumes charged particle equilibrium (CPE)
equivalent TAR (ETAR)
First method designed for the computer and to use CT data
correction factor method
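The equivalent TAR idea can be sketched as a correction factor that evaluates the TAR at a density-scaled depth and a density-scaled field size derived from the CT data. This follows one commonly quoted formulation and is only a sketch; the `tar` callable and density arguments are placeholder assumptions:

```python
def etar_correction_factor(tar, d_cm, r_cm, mean_density_along_ray, weighted_volume_density):
    """Equivalent-TAR correction (one common formulation):
    CF = TAR(d', r') / TAR(d, r), where
    d' = d * mean relative density along the ray (scaled depth) and
    r' = r * weighted relative density of the irradiated volume (scaled field size)."""
    d_scaled = d_cm * mean_density_along_ray
    r_scaled = r_cm * weighted_volume_density
    return tar(d_scaled, r_scaled) / tar(d_cm, r_cm)
```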
3D in principle
FFT convolution, differential scatter-air ratio (DSAR), delta volume
Attempts to improve scatter modeling by accounting for the size and location of the heterogeneity
Too computationally complex at the time…abandoned
Premodeling era algorithms
correction based
Cumbersome; from the cobalt era
But opened the door to convolution, superposition, and Monte Carlo
Monte Carlo
Not used clinically except for electrons and small fields
Simulates random trajectories of individual particles by using machine-generated random numbers to sample the probability distributions governing the physical processes involved
By simulating a large number of histories, information can be obtained, particle by particle, about the average values of macroscopic quantities such as energy deposition
Codes are built on foundations of measured and calculated probability distributions and are updated, typically by high-energy physicists, as new discoveries are made about how radiation interacts with matter
Serves as the ultimate cavity theory and inhomogeneity correction
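To make the sampling idea concrete, here is a toy (not clinical) sketch that samples photon interaction depths from an exponential attenuation distribution and scores energy in a 1D water slab. The attenuation coefficient, slab geometry, and the crude "deposit everything at the interaction site" scoring are all illustrative simplifications:

```python
import math
import random

MU = 0.05            # assumed total linear attenuation coefficient (1/cm), illustrative
SLAB_CM = 30.0       # 1D water slab thickness
BIN_CM = 1.0         # scoring bin width
N_HISTORIES = 100_000
ENERGY_MEV = 6.0     # energy carried by each photon (toy value)

energy_bins = [0.0] * int(SLAB_CM / BIN_CM)

for _ in range(N_HISTORIES):
    # Sample the free path length from the exponential attenuation PDF
    depth = -math.log(1.0 - random.random()) / MU
    if depth < SLAB_CM:
        # Toy scoring: deposit all of the photon's energy at the interaction site
        energy_bins[int(depth / BIN_CM)] += ENERGY_MEV

# Average energy deposited per history in each 1 cm bin
per_history = [e / N_HISTORIES for e in energy_bins]
```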
Monte Carlo Advantages
Algorithms are relatively simple
If the algorithm is good, accuracy is determined by the cross-section and beam data
The method is microscopic, so boundaries are not a problem
Geometries may be complex, but that is not a problem