Mid term 2 Flashcards
Reference Dose for Lec 9
RfD = NOAEL (or BMD10) / (UFa × UFh × UFs × UFL × MF × DF)
UFa = animal-to-human extrapolation; UFh = average human to sensitive human; UFs = subchronic-to-chronic exposure; UFL = LOAEL-to-NOAEL extrapolation; MF = modifying factor; DF = data quantity/quality factor
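As a quick sketch, the RfD formula above can be expressed in code; all numeric values below are hypothetical examples, not values from the lecture.

```python
# Sketch of the reference dose calculation; the 10x factors below are
# illustrative defaults, not prescribed values.

def reference_dose(noael, uf_a, uf_h, uf_s, uf_l, mf=1.0, df=1.0):
    """RfD = NOAEL (or BMD10) / (UFa * UFh * UFs * UFL * MF * DF)."""
    return noael / (uf_a * uf_h * uf_s * uf_l * mf * df)

# Example: NOAEL of 50 mg/kg/day, 10x for animal-to-human extrapolation,
# 10x for human variability, remaining factors left at 1
rfd = reference_dose(50.0, uf_a=10, uf_h=10, uf_s=1, uf_l=1)
print(rfd)  # 0.5 mg/kg/day
```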
Stochastic effects
Effects that occur by chance, generally occurring without a threshold level of dose, whose probability is proportional to the dose and whose severity is independent of the dose. In the context of radiation protection, the main stochastic effects are cancer and genetic effects.
Dose response applies to all individuals: higher doses cause a higher random chance of an effect occurring, but not greater severity.
Linear Non-Threshold (LNT) Approach to Assess
(Genotoxic) Cancer Risk
The EPA's approach for roughly 45 years, used for carcinogens known to have a genotoxic mode of action. A linear approach, considered conservative, that involves extrapolation to low doses.
Cancer risk graph
Risk = Exposure (LADD) * CSF
LADD – Lifetime Average Daily Dose
ED10 – effective dose that produces a 10% cancer incidence
LED10 – 95% lower confidence limit for the ED10
Cancer slope factor (CSF), in units of (mg/kg/day)^-1
Equation Estimating Exposure and Cancer risk
estimating exposure:
LADD (mg/kg/day) = (concentration × intake rate × exposure duration) / (body weight × lifespan)
Estimating risk:
Risk = slope factor (per mg/kg/day) × LADD
LADD lifetime average daily dose
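The two equations above can be combined in a short sketch; the concentration, intake, and slope-factor values are hypothetical.

```python
# Hypothetical worked example of the LADD and risk equations above.

def ladd(concentration, intake_rate, exposure_duration, body_weight, lifespan):
    """LADD (mg/kg/day) = (C * IR * ED) / (BW * lifespan)."""
    return (concentration * intake_rate * exposure_duration) / (body_weight * lifespan)

def cancer_risk(slope_factor, ladd_value):
    """Risk = CSF (per mg/kg/day) * LADD."""
    return slope_factor * ladd_value

# e.g. 0.002 mg/L in drinking water, 2 L/day intake, 30-year exposure,
# 70 kg body weight, 70-year lifespan, CSF of 1.5 (mg/kg/day)^-1
dose = ladd(0.002, 2.0, 30.0, 70.0, 70.0)
risk = cancer_risk(1.5, dose)
print(dose, risk)
```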
Polycyclic aromatic hydrocarbons (PAHs):
Sources and Uses
Ubiquitous contaminants that occur naturally (e.g., in crude oil) or are created by incomplete combustion, released from both natural (forest fires) and anthropogenic (burning of fossil fuels) sources
Natural
Forest fires
Oil seeps
Volcanos
Anthropogenic
Wood burning
Internal combustion engines (vehicle exhaust)
Cigarette smoke
Roofing/coal tar products
Electric power generation
Petroleum
Polycyclic aromatic hydrocarbons (PAHs):
Chemical characteristics
Two or more aromatic rings with a pair of carbon atoms shared, highly lipophilic
16 priority EPA PAHs (ATSDR, 2005)
Toxicity
Potential for human exposure
Frequency of occurrence at hazardous waste sites
Available information
Include probable and known human carcinogens
Broader class of polycyclic aromatic compounds
over 1500 chemicals total
diverse structural features
includes both substituted and unsubstituted forms (e.g., O-, N-, S-, and CH3-substituted)
little data on source exposure and toxicity mechanisms
PAH mixtures also complicate things.
Regulation before the relative potency factor
Before 1993, all PAHs were assumed equipotent to benzo[a]pyrene (BaP). The other 6 PAHs evaluated were less potent, so their cancer risk was overestimated. Individual slope factors could not be calculated because of insufficient data, so these PAHs were treated like BaP.
1993 relative potency factor for quantitative assessment of PAHs
Based on tumor studies comparing >1 PAH
Carcinogenic potency of various PAHs can be estimated by comparison to a standard
BaP recommended as the standard
Individual slope factors estimated as a percentage of the slope factor for BaP
Approach applied to Group B2 probable PAH carcinogens
Evaluation of PAHs as complete carcinogens in skin was the most comprehensive and is recommended for use
Unsupervised modeling
The program is given a bunch of data (no labels) and must find patterns and relationships therein.
• Clustering
• Principal components analysis
Supervised modeling
The program is “trained” on a pre‐defined set of “training examples” (with labels), which then facilitate its ability to reach an accurate conclusion when given new data.
• Classification
• Regression analysis
Unsupervised learning methods
The model is not provided with the correct results during the training.
• Can be used to cluster the input data in classes on the basis of their statistical properties only.
• The labeling can be carried out even if the labels are only available for a small number of objects representative of the desired classes.
Supervised learning methods
- Training data includes both the input and the desired results.
- For some examples the correct results (targets) are known and are given in input to the model during the learning process.
- The construction of a proper training, validation and test set is crucial.
- These methods are usually fast and accurate.
- Have to be able to generalize: give the correct results when new data are given in input without knowing a priori the target.
First step in supervised model
Data
Training set data: a set of examples used for learning where the target value is known.
Bad data yields bad models garbage in garbage out.
Second step in supervised model
Features
Feature: an individual measurable property of a phenomenon being observed.
Feature selection
Third step in supervised model
Algorithm
Algorithm: the method or predictive modeling technique used to identify patterns in the data
Support Vector Machines
Neural Networks
Nearest Neighbors
Random Forest
Decision Trees
Fourth step in supervised model
The model
Model: the final prediction or output
Data + algorithm = model
3 Types of data sets required in Supervised model
Training set
validation set
test set
Training set
a set of examples used for learning, where the target value is known. Overfitting is a common problem: when the training set is too small, the model is not generalizable and is hard to apply to other data sets.
Validation set
a set of examples used to tune the architecture of a classifier and estimate the error
Test set
used only to assess the performances of a classifier. It is never used during the training process so that the error on the test set provides an unbiased estimate of the error.
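A minimal sketch of building the three sets; the 60/20/20 split fractions and the toy data are arbitrary assumptions, not from the lecture.

```python
import random

def split(data, train_frac=0.6, val_frac=0.2, seed=0):
    """Shuffle the data and carve it into training, validation, and test sets."""
    shuffled = data[:]
    random.Random(seed).shuffle(shuffled)
    n_train = int(len(shuffled) * train_frac)
    n_val = int(len(shuffled) * val_frac)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]  # held back, never used in training
    return train, val, test

train, val, test = split(list(range(10)))
print(len(train), len(val), len(test))  # 6 2 2
```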
Purpose of cross validation
A model is developed using the training data, with a subset withheld for testing (or several subsets, e.g., 2-fold).
The algorithm optimizes the fitting parameters for the training data. If we test an independent set of data from the same population as the training data, it will generally not fit as well (lower predictive accuracy).
LOOCV
Leave‐One‐Out Cross Validation (LOOCV) – one observation/sample is removed from training data at a time and used for testing
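LOOCV can be illustrated with a toy predict-the-mean model; the data values are made up for illustration.

```python
# Toy LOOCV: hold each observation out in turn, "train" a mean predictor
# on the rest, and score the held-out point.

def loocv_mse(values):
    """Mean squared error of a mean-predictor under leave-one-out CV."""
    errors = []
    for i in range(len(values)):
        train = values[:i] + values[i + 1:]   # remove one observation
        prediction = sum(train) / len(train)  # fit on the remaining data
        errors.append((values[i] - prediction) ** 2)
    return sum(errors) / len(values)

print(loocv_mse([2.0, 4.0, 6.0, 8.0]))
```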
Measures of model performance
- True positives
- True negatives
- False positives
- False negatives
- Sensitivity
- Specificity
- Accuracy
Sensitivity
True positive rate = number of TPs out of all positive observations; avoids false negatives
Sensitivity = TP / (TP + FN)
Sensitivity: the ability of a test to correctly identify patients with a disease.
Specificity
True negative rate = number of TNs out of all negative observations; avoids false positives
Specificity = TN / (TN + FP)
Specificity: the ability of a test to correctly identify people without the disease.
Balanced accuracy
(sensitivity + specificity) / 2
The arithmetic mean of sensitivity and specificity (average accuracy obtained in either class); accounts for imbalanced data sets
Accuracy
percent correct classification
(TP + TN) / all observations
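The performance measures above, computed from hypothetical confusion-matrix counts:

```python
# Confusion-matrix metrics per the formulas above; counts are invented.

def metrics(tp, tn, fp, fn):
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    balanced = (sensitivity + specificity) / 2   # mean of the two rates
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # percent correct
    return sensitivity, specificity, balanced, accuracy

sens, spec, bal, acc = metrics(tp=80, tn=90, fp=10, fn=20)
print(sens, spec, bal, acc)
```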
Reasons for poor model performance
Over‐fitting
• Too small training dataset
• Doesn't account for entire population
• Doesn't account for sources of variance in the system
• Too many parameters
Exposome
Encompasses the totality of human environmental exposures from conception onwards, complementing the genome.
Calls for a description of lifelong exposure history (starting from the prenatal period).
Exposome can be what?
- General external environment (urban/rural life, climate factors, social)
- Specific external environment (diet, infection, specific exposure, tobacco)
- Internal environment (metabolism, microbiome, inflammation, ageing)
Tools to figure out someone’s exposome
Questionnaires, environmental models, biomarkers, passive sampling devices, omics technologies, pictures, mobile devices
Bottom‐up approach (direct): to measure exposures
Assess exposure through environmental measurements
Top‐down approach (indirect): to measure exposures
Assess exposure through internal measures of circulating compounds/metabolites (blood/urine)
GxE gene environment interaction
How exposure biology affects gene biology and vice versa
ExpoCast purpose
Advance the characterization of exposure required to translate advances and findings in computational toxicology to information that can be directly used to support risk assessment for decision making and public health
ExpoCast objective
Develop novel approaches and tools for evaluating and classifying chemicals based on potential for biologically relevant human exposure to inform prioritization and toxicity testing.
What is ExpoCast
Extension of ToxCast
•Goal to prioritize chemicals from HTS based on estimated exposure concentrations
•Model exposures based on available data
Uses far-field and near-field exposure models
Far-field Exposure Models
used to estimate exposure from chemicals released into the environment
Near-field Exposure Models
used to estimate exposure to chemicals found in consumer products and other in‐home sources (emphasis on frequency of use)
Linking exposure to toxicity
Exposure concentrations -> internal dose -> in vitro toxicity concentration
exposure does not equal dose
toxicokinetics
the quantitative time-dependent mathematical description and modeling of ADME processes of xenobiotics in the whole organism.
Reverse dosimetry
Models are 'reversed' to relate blood or tissue concentrations to an exposure concentration.
Measured biological monitoring data (blood or urine) -> external exposures
In vitro concentration data (AC50) -> reverse dosimetry
High Throughput Toxicokinetics
Also known as ‘In Vitro to In Vivo Extrapolation’ (IVIVE) Use of in vitro data to establish target dose necessary for biological activity
Relies on ‘reverse dosimetry’ or ‘reverse toxicokinetics’ to convert in vitro HTS activity results to daily doses needed to produce similar levels in humans for comparison to exposure data
Forward dosimetry (Conventional)
PK/TK models are used to relate exposure concentrations to a blood or tissue concentration
Oral dose equivalent
ToxCast AC50 or LEC (µM) × ((1 mg/kg/day) / (Css, µM))
Oral equivalent dose is:
•Linearly related to in vitro AC50
•Inversely related to Css
Assumes •Plasma concentrations equivalent to AC50 will elicit in vivo response •Chemicals achieve steady state concentrations
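A sketch of the conversion, assuming (as stated above) that steady-state plasma concentration scales linearly with dose; the AC50 and Css values are hypothetical.

```python
# Oral dose equivalent per the formula above; numbers are illustrative.

def oral_equivalent_dose(ac50_um, css_um):
    """Daily dose (mg/kg/day) whose steady-state plasma concentration
    would equal the in vitro AC50, assuming linear kinetics."""
    return ac50_um * (1.0 / css_um)  # Css is per 1 mg/kg/day of dose

# e.g. AC50 of 5 uM and predicted Css of 2 uM per 1 mg/kg/day
print(oral_equivalent_dose(5.0, 2.0))  # 2.5 mg/kg/day
```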
IVIVE
Utilization of in vitro experimental measurement or data to predict phenomena in vivo. Predicted EADx.
Equivalent administration dose (EADx)
External exposure that would lead to an internal concentration equal to in vitro activity concentration.
Assumptions and limitations (sources of uncertainty) for HTTK?
Potential sources of uncertainty (lead to lack of correlation between calculated HTTK values and measured activity in animal models):
• Uncertainty in the calculation of hepatic clearance
• Low oral bioavailability
• Differences in predicted (non-hepatic) metabolism
• Inaccurate assumptions
What is the Aggregate Exposure Pathway and how does it relate to the AOP?
Capturing the complex nature of human and ecological exposure to stressors is a major challenge for environmental health decision making. The Aggregate Exposure Pathway (AEP) concept offers an intuitive framework to organize exposure data, setting the stage for more meaningful collection and use of exposure data.
AOP
Adverse outcome pathway
•Describes how perturbation of normal biology leads to adverse outcome (AO)
•Links molecular initiating event (MIE) for a drug/chemical to an apical endpoint and subsequent population level effects
•Uses a scientifically “proven” causal chain of events
•Provides a mechanistic basis justifying the use of alternative approaches
•Living workflow
MIE
Molecular initiating event. AOPs link MIEs to apical endpoints and subsequent population effects. Uses causal chain of effects.
Toxicant in (AOP)
Chemical properties
Macromolecular interactions (MIE)
Receptor ligand interactions
DNA binding
Protein oxidation
Cellular responses in AOP
KE key events
Gene interaction
Protein reduction
Altered signaling
Organ Responses in AOP
KE key events
Altered Physiology
Disrupted homeostasis
Altered tissue development / function
Organism responses in AOP
AO adverse outcome
lethality
impaired development
impaired reproduction
Which international agency is leading the standardization of AOPs?
The OECD. AOPs were originally proposed by the US EPA for ecological risk assessment (2010).
• OECD AOP Development Programme
• Launched 2012
• Advisory Group on Molecular Screening and Toxicogenomics
• Development of the AOP knowledgebase
AOP vs MOA
An AOP is not a toxicity pathway or the MOA for a chemical
Links molecular initiation event (MIE) to an apical endpoint and subsequent population level events (ecological only)
Uses causal chain of events
MIE → Key Events (KE) → AO
1st principle of AOP development
- AOPs are not chemical specific
• Does not describe what a single chemical does
• Describes the potential outcome for any chemical that perturbs the MIE (with sufficient potency and duration)
• Utilizing AOPs in a predictive context requires understanding of chemical-specific properties (e.g., potency, ADME) that dictate the magnitude and duration of perturbation at the MIE
2nd principle of AOP development
- AOPs are modular
• Minimum information
• Molecular initiation event (MIE) –the initial point of a chemical-biological interaction
• Adverse outcome (AO) –in vivo outcome relevant to risk assessment
2 primary building blocks of AOPs
1 Key events (KEs)
• Functional unit of observation/verification (relevant to outcome)
• Essential to progression of the defined perturbation
• Measurable(experimentally quantifiable)
2 Key event relationships (KERs)
• Functional unit of inference/extrapolation (directed relationship, upstream/downstream)
3rd principle of AOP development
- AOPs are a pragmatic unit of development and evaluation
• AOPs consist of a single sequence of key events connecting MIE to AO (no branches)
• Pragmatic simplification of complex biology
• Can be functional unit of prediction (e.g. for “pure ligand”)
4th principle of AOP development
4 AOP networks are functional unit of prediction for real-world applications
• Chemicals with multiple biological activities
• Exposure to multiple chemicals
• AOPs are not triggered in isolation, they interact.
• Systems approach
5th principle of AOP development
- AOPs are living workflows
• Method of organizing existing knowledge
• As methods for observing biology evolve
• New possibilities for KEs
• More precise/accurate KEs
• As new experiments are published
• WOE for KERs grows (supporting or rejecting)
• New AOPs and branches in networks discovered
Application of AOP to risk assessment
Uses a weight-of-evidence (WOE) approach in risk assessment. The OECD applies evolved Bradford Hill criteria to the weight-of-evidence approach for using AOPs in RA.
Bradford Hill Criteria
Criteria for establishing causation
• Proposed criteria to provide consistency in making WOE decisions
1. Biological plausibility (does the AOP agree with general biology?)
2. Essentiality of key events (KEs are required for the AO)
3. Concordance of empirical evidence
• Dose-response
• Temporality
4. Consistency (across multiple studies)
5. Analogy (consistency across chemicals)
Understand the relationship between AOPs and HTS/in vitro assays and QSAR
AOPs can be used to categorize chemicals with common MIEs and/or AOs
•Therefore, AOPs can be used to derive new QSARs using mechanistic information
AOPs to improve QSAR/Read-Across
• Identify plausible MIEs
• Explore linkages in pathways to downstream effects
• Develop QSARs to predict MIEs from structure
• Characterize other KEs as SARs