Mid term 2 Flashcards

1
Q

Reference Dose (RfD) equation (Lec 9)

A

RfD = (NOEL or BMD10) / (UFa × UFh × UFs × UFL × MF × DF)

UFa = animal-to-human extrapolation
UFh = average human to sensitive human
UFs = subchronic-to-chronic exposure
UFL = LOAEL-to-NOAEL extrapolation
MF = modifying factor
DF = data quantity/quality factor
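A minimal calculation sketch in Python (all input values below are hypothetical, chosen only to show how the uncertainty factors combine in the denominator):

```python
# Hypothetical reference dose (RfD) calculation; all values are illustrative only.
noael = 10.0   # NOEL (or BMD10) from the critical study, mg/kg/day
uf_a = 10      # UFa: animal-to-human extrapolation
uf_h = 10      # UFh: average human to sensitive human
uf_s = 10      # UFs: subchronic-to-chronic exposure
uf_l = 1       # UFL: LOAEL-to-NOAEL (1 because a NO(A)EL was used)
mf = 1         # MF: modifying factor
df = 1         # DF: data quantity/quality factor

rfd = noael / (uf_a * uf_h * uf_s * uf_l * mf * df)
print(f"RfD = {rfd} mg/kg/day")  # 10 / 1000 = 0.01 mg/kg/day
```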
2
Q

Stochastic effects

A

Effects that occur by chance, generally occurring without a threshold level of dose, whose probability is proportional to the dose and whose severity is independent of the dose. In the context of radiation protection, the main stochastic effects are cancer and genetic effects.

Dose response applies to all individuals: higher doses cause a higher random chance of being "hit."

3
Q

Linear Non-Threshold (LNT) Approach to Assess

(Genotoxic) Cancer Risk

A

The EPA's approach for roughly 45 years, used for carcinogens known to have a genotoxic mode of action. A linear approach, considered conservative, that involves extrapolation to low doses.

4
Q

Cancer risk graph

A

Risk = Exposure (LADD) × CSF

LADD – Lifetime Average Daily Dose

ED10 – effective dose producing 10% cancer incidence

LED10 – 95% lower confidence limit on the ED10

CSF – cancer slope factor, in units of (mg/kg/day)⁻¹

5
Q

Equation Estimating Exposure and Cancer risk

A

Estimating exposure:
LADD (mg/kg/day) = (concentration × intake rate × exposure duration) / (body weight × lifespan)

Estimating risk:
Risk = slope factor (per mg/kg/day) × LADD

LADD = lifetime average daily dose
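A worked sketch of both equations (the concentration, intake rate, durations, and slope factor below are hypothetical placeholders, not values from the lecture):

```python
# Hypothetical drinking-water exposure; every input value is a placeholder.
concentration = 0.002            # chemical concentration in water, mg/L
intake_rate = 2.0                # water intake, L/day
exposure_duration = 30 * 365     # days exposed (30 years)
body_weight = 70.0               # kg
lifespan = 70 * 365              # averaging time, days (70-year lifetime)

# LADD = (concentration x intake rate x exposure duration) / (body weight x lifespan)
ladd = (concentration * intake_rate * exposure_duration) / (body_weight * lifespan)

csf = 0.5                        # hypothetical cancer slope factor, (mg/kg/day)^-1
risk = csf * ladd                # dimensionless lifetime excess cancer risk
print(f"LADD = {ladd:.2e} mg/kg/day, risk = {risk:.2e}")
```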

6
Q

Polycyclic aromatic hydrocarbons (PAHs):

Sources and Uses

A

Ubiquitous contaminants occurring naturally (crude oil) or created from incomplete combustion, released from both natural (forest fires) and anthropogenic (burning of fossil fuels) sources.

Natural
• Forest fires
• Oil seeps
• Volcanoes

Anthropogenic
• Wood burning
• Internal combustion engines (vehicle exhaust)
• Cigarette smoke
• Roofing/coal tar products
• Electric power generation
• Petroleum
7
Q

Polycyclic aromatic hydrocarbons (PAHs):

Chemical characteristics

A

Two or more fused aromatic rings sharing a pair of carbon atoms; highly lipophilic.

16 priority EPA PAHs (ATSDR, 2005), selected based on:
• Toxicity
• Potential for human exposure
• Frequency of occurrence at hazardous waste sites
• Available information
• Include probable and known human carcinogens

8
Q

Broader class of polycyclic aromatic compounds

A

• Over 1,500 chemicals total
• Diverse structural features
• Includes both substituted and unsubstituted forms (substituents containing O, N, S, or CH3)
• Little data on sources, exposure, and toxicity mechanisms

PAH mixtures also complicate risk assessment.

9
Q

Regulation before relative potency factor

A

Before 1993, all PAHs were treated as equipotent to benzo[a]pyrene (BaP). The other six PAHs evaluated were not as potent, so their cancer risk was overestimated. Slope factors could not be calculated for them because of insufficient data, so they were treated like BaP.

10
Q

1993 relative potency factor (RPF) for quantitative assessment of PAHs.

A

• Based on tumor studies comparing more than one PAH
• Should be able to estimate carcinogenic potency for various PAHs by comparison to a standard
• BaP recommended as the standard
• Estimates of individual slope factors could be calculated as a percentage of the slope factor for BaP
• Approach applied to Group B2 (probable) PAH carcinogens
• Evaluation of PAHs as complete carcinogens in skin was the most comprehensive and is recommended for use
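A sketch of how relative potency factors are typically applied, converting each PAH to a BaP-equivalent concentration (the RPF values and concentrations here are hypothetical, not the regulatory numbers):

```python
# Hypothetical BaP-equivalent calculation using relative potency factors (RPFs).
# RPF values and soil concentrations are illustrative, not regulatory numbers.
rpf = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1, "chrysene": 0.001}
conc_mg_kg = {"benzo[a]pyrene": 0.5, "benz[a]anthracene": 2.0, "chrysene": 4.0}

# Each PAH is scaled by its potency relative to BaP and summed, so the mixture
# can be evaluated against the BaP slope factor alone.
bap_equiv = sum(conc_mg_kg[pah] * rpf[pah] for pah in rpf)
print(f"BaP-equivalent concentration = {bap_equiv} mg/kg")
```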

11
Q

Unsupervised modeling

A

The program is given a bunch of data (no labels) and must find patterns and relationships therein.
• Clustering
• Principal component analysis
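A minimal unsupervised-learning sketch (assuming scikit-learn and NumPy are available; the random matrix stands in for real descriptor data):

```python
# Minimal unsupervised-learning sketch: k-means clustering and PCA on unlabeled data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

X = np.random.rand(100, 10)   # 100 unlabeled samples, 10 features (placeholder data)

labels = KMeans(n_clusters=3, n_init=10).fit_predict(X)   # group samples by similarity
scores = PCA(n_components=2).fit_transform(X)             # project onto 2 principal components
print(labels[:10], scores.shape)
```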

12
Q

Supervised modeling

A

The program is “trained” on a pre‐defined set of “training examples” (with labels), which then facilitate its ability to reach an accurate conclusion when given new data.
• Classification
• Regression analysis

13
Q

Unsupervised learning methods

A

The model is not provided with the correct results during the training.
• Can be used to cluster the input data in classes on the basis of their statistical properties only.
• The labeling can be carried out even if the labels are only available for a small number of objects representative of the desired
classes.

14
Q

Supervised learning methods

A
  • Training data includes both the input and the desired results.
  • For some examples the correct results (targets) are known and are given in input to the model during the learning process.
  • The construction of a proper training, validation and test set is crucial.
  • These methods are usually fast and accurate.
  • Have to be able to generalize: give the correct results when new data are given in input without knowing a priori the target.
15
Q

First step in supervised model

A

Data
Training set data: a set of examples used for learning where the target value is known.
Bad data yields bad models: garbage in, garbage out.

16
Q

Second step in supervised model

A

Features
Feature: an individual measurable property of a
phenomenon being observed.
Feature selection

17
Q

Third step in supervised model

A

Algorithm
Algorithm: the method or predictive modeling
technique used to identify patterns in the data

Support Vector Machine
Neural Networks
Nearest Neighbors
Random Forest
Decision Trees
18
Q

fourth step in supervised model

A

The model
Model: the final prediction or output
Data + algorithm = model
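A minimal sketch tying the four steps together with scikit-learn (the breast-cancer dataset and random forest are stand-ins, not the course's example):

```python
# Minimal supervised-learning sketch: data + features + algorithm -> model.
from sklearn.datasets import load_breast_cancer      # stand-in labeled dataset
from sklearn.ensemble import RandomForestClassifier  # step 3: the algorithm
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)           # step 1: examples with known targets
# step 2: the features are the measured properties (columns of X); feature selection could go here
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)                           # step 4: data + algorithm = model
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```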

19
Q

3 Types of data sets required in Supervised model

A

Training set
validation set
test set

20
Q

Training set

A

A set of examples used for learning, where the target value is known. Overfitting is a common problem when the training set is small: the resulting model is not very generalizable and is hard to apply to other data sets.

21
Q

Validation set

A

a set of examples used to tune the architecture of a classifier and estimate the error

22
Q

Test set

A

Used only to assess the performance of a classifier. It is never used during the training process, so the error on the test set provides an unbiased estimate of the error.
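A sketch of carving one labeled dataset into the three sets (the 60/20/20 proportions are an arbitrary illustration, assuming scikit-learn):

```python
# Sketch of splitting labeled data into training, validation, and test sets (60/20/20).
import numpy as np
from sklearn.model_selection import train_test_split

X, y = np.random.rand(200, 5), np.random.randint(0, 2, 200)   # toy labeled data

# Split off the test set first; it is never used during training or tuning.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# Split the remainder into training (learning) and validation (tuning the classifier).
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)
print(len(X_train), len(X_val), len(X_test))   # 120 40 40
```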

24
Q

Purpose of cross validation

A

A model is developed using a training set: the training data are used, but a subset is removed and held out for testing (or several subsets, e.g., 2-fold).
The algorithm optimizes the fitting parameters for the training data. If we then test on an independent set of data from the same population as the training data, it will generally not fit as well (lower predictive accuracy).

25
Q

LOOCV

A

Leave‐One‐Out Cross Validation (LOOCV) – one observation/sample is removed from training data at a time and used for testing
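A sketch of 2-fold cross-validation and LOOCV with scikit-learn (the iris data and logistic regression are placeholders):

```python
# Sketch of 2-fold cross-validation and LOOCV with scikit-learn (placeholder data/model).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# 2-fold CV: half the data is held out for testing in each round.
print(cross_val_score(model, X, y, cv=2).mean())

# LOOCV: each observation is held out once and predicted by a model trained on the rest.
print(cross_val_score(model, X, y, cv=LeaveOneOut()).mean())
```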

26
Q

Measures of model performance

A
  • True positives
  • True negatives
  • False positives
  • False negatives
  • Sensitivity
  • Specificity
  • Accuracy
27
Q

Sensitivity

A

True positive rate = number of TPs out of all positive observations (avoids false negatives)

Sensitivity = TP / (TP + FN)

Sensitivity: the ability of a test to correctly identify patients with a disease.

28
Q

Specificity

A

True negative rate = number of TNs out of all negative observations (avoids false positives)

Specificity = TN / (TN + FP)

Specificity: the ability of a test to correctly identify people without the disease.

29
Q

Balanced accuracy

A

Balanced accuracy = (sensitivity + specificity) / 2

The arithmetic mean of sensitivity and specificity (average accuracy obtained in either class); accounts for imbalanced data sets.

30
Q

Accuracy

A

Percent correct classification

Accuracy = (TP + TN) / all observations
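A small worked example computing these measures from hypothetical confusion-matrix counts:

```python
# Worked example with hypothetical confusion-matrix counts.
tp, tn, fp, fn = 40, 30, 10, 20

sensitivity = tp / (tp + fn)                       # true positive rate
specificity = tn / (tn + fp)                       # true negative rate
accuracy = (tp + tn) / (tp + tn + fp + fn)         # percent correct classification
balanced_accuracy = (sensitivity + specificity) / 2

print(sensitivity, specificity, accuracy, balanced_accuracy)
# approximately 0.667, 0.75, 0.70, 0.708
```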

31
Q

Reasons for poor model performance

A
Over‐fitting
• Too small training dataset
• Doesn’t account for entire population
• Doesn’t account for sources of variance in the system
• Too many parameters
32
Q

Exposome

A

Encompasses the totality of human environmental exposures from conception onwards, complementing the genome.
Calls for a description of lifelong exposure history (starting from the prenatal period).

33
Q

Exposome can be what?

A
  • General external environment (urban/rural life, climate factors, social)
  • Specific external environment (diet, infection, specific exposure, tobacco)
  • Internal environment (metabolism, microbiome, inflammation, ageing)
34
Q

Tools to figure out someone’s exposome

A
Questionnaires
Environmental models
Biomarkers
Passive sampling devices
Omics technologies
Pictures
Mobile devices
35
Q

Bottom‐up approach (direct): to measure exposures

A

Assess exposure through
environmental
measurements

36
Q

Top‐down approach (indirect): to measure exposures

A

Assess exposure through internal measure of
circulating compounds/metabolites
(blood/urine)

37
Q

GxE gene environment interaction

A

How exposure biology affects gene biology and vice versa.

38
Q

ExpoCast purpose

A

Advance the characterization of exposure required to translate advances and findings in computational toxicology to information that can be directly used to support risk assessment for decision making and public health

39
Q

ExpoCast objective

A

Develop novel approaches and tools for evaluating and classifying chemicals based on potential for biologically relevant human exposure to inform prioritization and toxicity testing.

40
Q

What is ExpoCast

A

Extension of ToxCast
• Goal: prioritize chemicals from HTS based on estimated exposure concentrations
• Model exposures based on available data
• Uses farfield and nearfield exposure models

41
Q

Farfield Exposure Models

A

used to estimate exposure from chemicals released into the environment

42
Q

Nearfield Exposure Models

A

used to estimate exposure to chemicals found in consumer products and other in‐home sources (emphasis on frequency of use)

43
Q

Linking exposure to toxicity

A

Exposure concentrations -> internal dose -> in vitro toxicity concentration

exposure does not equal dose

44
Q

toxicokinetics

A

the quantitative time-dependent mathematical description and modeling of ADME processes of xenobiotics in the whole organism.

45
Q

Reverse dosimetry

A

Models are ‘reversed’ to relate blood or tissue concentrations back to an exposure concentration.

Measured biological monitoring data (e.g., blood or urine concentrations) → external exposures.

In vitro concentration data (e.g., AC50 values) can likewise be converted to external doses by reverse dosimetry.

46
Q

High Throughput Toxicokinetics

A

Also known as ‘In Vitro to In Vivo Extrapolation’ (IVIVE): use of in vitro data to establish the target dose necessary for biological activity.

Relies on ‘reverse dosimetry’ or ‘reverse toxicokinetics’ to convert in vitro HTS activity results to the daily doses needed to produce similar levels in humans, for comparison to exposure data.

47
Q

Forward dosimetry (Conventional)

A

PK/TK models are used to relate exposure concentrations to a blood or tissue concentration.

48
Q

Oral dose equivalent

A

Oral equivalent dose (mg/kg/day) = ToxCast AC50 or LEC (µM) × (1 mg/kg/day) / Css (µM)

Oral equivalent dose is:
• Linearly related to the in vitro AC50
• Inversely related to Css

Assumes:
• Plasma concentrations equivalent to the AC50 will elicit an in vivo response
• Chemicals achieve steady-state concentrations
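A sketch of the conversion (the AC50 and Css values are hypothetical; Css is assumed to be the steady-state plasma concentration predicted for a 1 mg/kg/day dose):

```python
# Hypothetical oral-equivalent-dose conversion for HTTK/IVIVE reverse dosimetry.
ac50_uM = 5.0   # ToxCast AC50 (or LEC) from the in vitro assay, micromolar (placeholder)
css_uM = 2.0    # predicted steady-state plasma concentration for a 1 mg/kg/day dose, micromolar

# Oral equivalent dose = AC50 x (1 mg/kg/day / Css): the daily dose expected to
# produce a plasma concentration equal to the in vitro activity concentration.
oral_equivalent_dose = ac50_uM * (1.0 / css_uM)   # mg/kg/day
print(f"Oral equivalent dose = {oral_equivalent_dose} mg/kg/day")
```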
49
Q

IVIVE

A

Utilization of in vitro experimental measurement or data to predict phenomena in vivo. Predicted EADx.

50
Q

Equivalent administration dose (EADx)

A

External exposure that would lead to an internal concentration equal to in vitro activity concentration.

51
Q

Assumptions and limitations (sources of uncertainty) for HTTK?

A
Potential sources of uncertainty (leading to lack of correlation between calculated HTTK values and measured activity in animal models):
• Uncertainty in the calculation of hepatic clearance
• Low oral bioavailability
• Differences in predicted metabolism (non-hepatic)
• Inaccurate assumptions
52
Q

What is the Aggregate Exposure Pathway and how does it relate to the AOP?

A

Capturing the complex nature of human and ecological exposure to stressors is a major challenge for environmental health decision making. The Aggregate Exposure Pathway (AEP) concept offers an intuitive framework to organize exposure data, setting the stage for more meaningful collection and use of exposure data. The AEP is intended as the exposure-science counterpart to the AOP: it traces a stressor from its source to the target-site exposure that can trigger an AOP's molecular initiating event.

53
Q

AOP

A

Adverse outcome pathway
•Describes how perturbation of normal biology leads to adverse outcome (AO)
•Links molecular initiating event (MIE) for a drug/chemical to an apical endpoint and subsequent population level effects
•Uses a scientifically “proven” causal chain of events
•Provides a mechanistic basis justifying the use of alternative approaches
•Living workflow

54
Q

MIE

A

Molecular initiating event. AOPs link MIEs to apical endpoints and subsequent population effects. Uses causal chain of effects.

55
Q

Toxicant in (AOP)

A

Chemical properties

56
Q

Macromolecular interactions in AOP

A

(MIE, molecular initiating event)
Receptor-ligand interactions
DNA binding
Protein oxidation

57
Q

Cellular responses in AOP

A

KE key events
Gene interaction
Protein reduction
Altered signaling

58
Q

Organ Responses in AOP

A

KE key events
Altered Physiology
Disrupted homeostasis
Altered tissue development / function

59
Q

Organism responses in AOP

A

AO adverse outcome
lethality
impaired development
impaired reproduction

60
Q

Which international agency is leading the standardization of AOPs

A

AOPs proposed by US EPA for ecological risk assessment (2010)

• OECD AOP Development Program
  • Launched 2012
  • Advisory Group on Molecular Screening and Toxicogenomics
  • Development of the AOP knowledgebase
61
Q

AOP vs MOA

A

An AOP is not a toxicity pathway or the MOA for a specific chemical.
It links a molecular initiating event (MIE) to an apical endpoint and subsequent population-level events (ecological only), using a causal chain of events.

MIE → Key Events (KEs) → AO

62
Q

1st principle of AOP development

A
  1. AOPs are not chemical specific
    • Does not describe what a single chemical does
    • Describes potential outcome for any chemical that perturbs the MIE (with
    sufficient potency and duration)
    • Utilizing AOPs in a predictive context requires understanding of chemical-
    specific properties (e.g. potency, ADME) that dictate the magnitude and
    duration of perturbation at the MIE
63
Q

2nd principle of AOP development

A
  2. AOPs are modular
    • Minimum information:
    • Molecular initiating event (MIE) – the initial point of a chemical-biological interaction
    • Adverse outcome (AO) – an in vivo outcome relevant to risk assessment
64
Q

2 primary building blocks of AOPs

A

1. Key events (KEs)
• Functional unit of observation/verification (relevant to the outcome)
• Essential to progression of the defined perturbation
• Measurable (experimentally quantifiable)

2. Key event relationships (KERs)
• Functional unit of inference/extrapolation (directed relationship – up/down)

65
Q

3rd principle of AOP development

A
  3. AOPs are a pragmatic unit of development and evaluation
    • AOPs consist of a single sequence of key events connecting the MIE to the AO (no branches)
    • Pragmatic simplification of complex biology
    • Can be a functional unit of prediction (e.g. for a "pure ligand")
66
Q

4th principle of AOP development

A

4. AOP networks are the functional unit of prediction for real-world applications
• Chemicals with multiple biological activities
• Exposure to multiple chemicals
• AOPs are not triggered in isolation; they interact
• Systems approach

67
Q

5th principle of AOP development

A
  5. AOPs are living workflows
    • Method of organizing existing knowledge
    • As methods for observing biology evolve:
      • New possibilities for KEs
      • More precise/accurate KEs
    • As new experiments are published:
      • WOE for KERs grows (supporting or rejecting)
      • New AOPs and branches in networks are discovered
68
Q

Application of AOP to risk assessment

A

Uses a weight-of-evidence (WOE) approach in risk assessment (RA). The OECD applies evolved Bradford Hill criteria to the weight-of-evidence approach for using AOPs in RA.

69
Q

Bradford Hill Criteria

A

Criteria for establishing causation
• Proposed criteria to provide consistency in making WOE decisions
1. Biological plausibility (does AOP agree with general biology)
2. Essentiality of Key Events (KE are required for AO)
3. Concordance of empirical evidence
• Dose-response
• Temporality
4. Consistency (across multiple studies)
5. Analogy (consistency across chemicals)

70
Q

Understand the relationship between AOPs and HTS/in vitro assays and QSAR

A

AOPs can be used to categorize chemicals with common MIEs and/or AOs.
• Therefore, AOPs can be used to derive new QSARs using mechanistic information.

71
Q

AOPs to improve QSAR/Read-Across

A
• Identify plausible MIEs
• Explore linkages in pathways to downstream effects
• Develop QSARs to predict MIEs from structure
• Characterize other KEs as SARs