Midterm One Flashcards
Discuss need for systems biological approaches in risk assessment of environmental health and toxicology
Need to recognize that the human body and the environment are biologically complex and therefore interact with chemicals and xenobiotics in multiple complex ways that sometimes cannot be seen in a controlled study, or when looking for only one or a few markers. You therefore need to take this complexity into account when doing risk assessment and be conservative in your approach.
Understand current testing tools and their application to risk assessment
Tools include: high-throughput screening, stem cell biology, functional genomics, bioinformatics, systems biology, computational systems biology, physiologically based pharmacokinetic models, structure-activity relationships, biomarkers, and molecular and genetic epidemiology.
Describe areas of application for systems biological approaches in environmental health
- Hazard and risk prioritization of chemicals
- Identifying mechanistic information to tailor chemical testing programs
- Safety screening of food additives and food-contact products
- Supporting approaches for aggregate and cumulative risk assessment
- Estimating variability in response in a population
- Pharmaceutical lead selection in drug development
- Safety screening of pharmaceutical contaminants and drug metabolites
Key points toxicology
Exposure: any condition that provides an opportunity for an external environmental agent to enter the body; depends on the intensity and frequency of exposure.
Dose: the amount of agent actually deposited within the body, significantly different doses can result from the same exposure
Response: Biological effect of the agent
3 basic concepts in toxicology
•Dose/Response
•Exposure x Hazard = Risk
•Individual Sensitivity (age, size, gender, ethnicity, species, genetics, environment)
Dose‐Response
The effect of the amount of agent on the response of the individual. Many factors affect dose-response; someone sensitive might have a greater response to the same dose (think of different people drinking alcohol). The dose effect can be linear (genotoxic chemicals) or nonlinear; the nonlinear case assumes a threshold, supporting safe reference doses and NOEL/LOEL values.
NOEL (NOAEL)
No Observable (Adverse) Effect Level; Highest point at which there was no (adverse) effect
LOEL (LOAEL)
Lowest Observable (Adverse) Effect Level; Lowest point at which there was an (adverse) effect
BMD10
Benchmark Dose 10: the dose at which 10 percent of subjects show the response. (The response is study-defined; it could be death or an increased heart rate.)
EC50/ED50
Effective Concentration/Dose that results in a 50% response, or a response in 50% of the population.
POD
Point of Departure; the point at which there is an effect (threshold response). It lies between the NOEL and LOEL.
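These dose-response metrics can be illustrated with a simple Hill (sigmoidal) model; the EC50 value, slope, and dose grid below are hypothetical illustrations, not values from the lecture:

```python
def hill_response(dose, ec50, hill=1.0, max_effect=1.0):
    """Fractional response from a simple Hill (sigmoidal) dose-response model."""
    if dose <= 0:
        return 0.0
    return max_effect * dose**hill / (ec50**hill + dose**hill)

# At the EC50, the model returns half the maximal response by definition.
half = hill_response(dose=10.0, ec50=10.0)  # 0.5

def bmd10(ec50, hill=1.0, doses=None):
    """Crude benchmark dose: the lowest tested dose whose predicted
    response meets or exceeds the 10% benchmark response (BMD10)."""
    doses = doses or [d / 10 for d in range(1, 1000)]
    for d in doses:
        if hill_response(d, ec50, hill) >= 0.10:
            return d
    return None
```

In practice a BMD is estimated by fitting a model to experimental data with confidence limits; this sketch only shows the definitional idea of reading a benchmark response off a fitted curve.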
Describe difference between chemical/drug efficacy v. potency
Efficacy: Refers to the potential maximum response that a xenobiotic can produce. Max effect.
Potency: refers to the amount of xenobiotic needed to produce a toxic effect.
You need to look at both when doing risk assessment and figuring out reference doses.
Understand the primary factors for evaluating exposure, including primary routes of exposure
Exposure routes: oral (gut), dermal (skin), inhalation (lungs).
Describe key factors affecting chemical/drug distribution
Toxicokinetics affect a chemical's disposition within the body: absorption, distribution, metabolism, and excretion (ADME).
Understand which factors impact individual sensitivity to chemical/drug response leading to variability in the population
Age, size, developmental stage (children, infants, pregnancy), sex, genetics, individual differences, and species differences.
Understand risk analysis paradigm including difference between risk assessment and risk management
Hazard x Exposure = risk
Risk assessment: the process of estimating the association between exposure to a chemical or physical agent and the incidence of an adverse outcome.
Risk management: the policies and rules put in place to reduce the hazards identified through risk assessment.
Understand concept of reference dose and primary sources of uncertainty in assessing risk
To get a reference dose that accounts for uncertainty, find the NOEL from the animal experimental data and divide by 100 to account for animal-to-human extrapolation, inadequate animal data, and sensitive populations.
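The divide-by-100 arithmetic can be sketched as a tiny helper; the example NOAEL of 50 mg/kg/day and the 10x/10x factor split are hypothetical illustrations:

```python
def reference_dose(noael_mg_per_kg_day, interspecies_uf=10, intraspecies_uf=10):
    """Divide the animal NOAEL by the combined uncertainty factors:
    10x for animal-to-human extrapolation, 10x for sensitive individuals."""
    return noael_mg_per_kg_day / (interspecies_uf * intraspecies_uf)

# e.g. a hypothetical animal NOAEL of 50 mg/kg/day yields an RfD of 0.5 mg/kg/day
rfd = reference_dose(50.0)
```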
Hazard Identification
Structure‐Activity Analysis
Short‐term screening tests
Animal Bioassays
Human Epidemiological Data
Exposure Assessment
Exposure Routes
Duration of Exposure
Amount of Exposure (Dose)
Exposed population(s)
Understand historical events leading up to the current computational toxicology programs in the US and EU
1976: Toxic Substances Control Act (TSCA), USA. EPA now required to regulate the effects of chemicals on the market for human and environmental health.
1984: Benchmark dose methodology proposed for dose-response assessment; adopted by EPA.
1986: First industrial implementation of high-throughput screening.
1996: Food Quality Protection Act (FQPA) requires development of a screening program for testing potential endocrine disruptors (EDSTAC).
2003-04: EPA issues framework for computational toxicology; budget allocated for comp-tox research.
2009: SEURAT, a program to move away from animal testing in the EU.
2016: Tox21.
Describe the 3R’s and how this concept supports/parallels computational toxicology approaches
The three Rs are Replace, Reduce, Refine: replace animal testing with alternatives such as computational methods, reduce the number of animals used, and refine methods to make them better and more accurate.
Describe the source‐to‐outcome continuum
Presents the risk assessment paradigm as a continuum of events leading from the release of a chemical into the environment to an adverse effect. Cascade of events: release into the environment, environmental concentrations, exposure concentrations, target organ dose, early biological effects, adverse outcome.
Understand the goals of EPA ToxCast
To screen and prioritize thousands of chemicals,
predict potential human health effects, and develop models and toxicity signatures.
Use those signatures to prioritize chemicals.
Identify targets or pathways linked to toxicity
Derive safe exposure levels for all the chemicals to which humans are exposed
Describe the Tox21 collaborative effort
A federal collaboration among the EPA, the NIH (including the National Center for Advancing Translational Sciences (NCATS) and the National Toxicology Program (NTP) at the National Institute of Environmental Health Sciences (NIEHS)), and the Food and Drug Administration (FDA), using animal toxicology data, human toxicology data, computational toxicology data, in vitro cell-based assays, quantitative high-throughput screening, and informatics to achieve their goals.
Goals:
- Identify environmental chemicals that lead to biological responses and determine their mechanisms of action on biological systems.
- Prioritize specific compounds for more extensive toxicological evaluation.
- Develop models that predict chemicals' negative health effects in humans.
- Annotate all human biochemical pathways and design assays (tests) that can measure these pathways' responses to chemicals.
Define the primary objectives of the SEURAT‐1 program in the EU
Animal‐free safety assessment of chemical substances
Development of non-animal replacements until alternative data streams can be shown to fulfill the same purpose with comparable confidence.
Uses: cheminformatics, high-throughput screening, high-content screening ('omics), systems biology approaches, pharmacokinetics.
Understand the tiered approach proposed for the next generation of risk assessment
US chemical regulation is still based on the 1976 Toxic Substances Control Act (TSCA), i.e., EPA has to show that a chemical is harmful; a chemical does not have to be proven nontoxic before it enters the market.
Define the proposed applications of toxicogenomics to risk assessment
Looking at early biological effects and adverse outcomes via a chemical's effects on the genome, transcriptome, proteome, and metabolome.
Applications: exposure assessment, hazard screening, variability in susceptibility, mechanism/mode of action, cross-species extrapolation, dose-response relationships, developmental exposures, mixtures, biomarkers.
Understand the relationship between gene expression profiling and apical endpoints typically measured from in vivo studies for risk assessment
Gene expression profiling provides a snapshot of the transcriptional response to a chemical, i.e., of the overall response.
Gene expression profiling can be related to the chemical's MOA.
Expression changes correspond to molecular alterations leading to phenotypic changes, i.e., apical endpoints such as cancer.
Understand the three most common methods for measuring gene expression
Microarray analysis, qPCR, and RNA sequencing (transcriptomics).
Define the experimental and technical considerations for use of toxicogenomic data in risk assessments
Technically, data quality is critical: data must be processed and normalized appropriately; an appropriate statistical method must be used to identify differentially expressed genes; results must be adjusted to account for false positives; and data should be deposited in public databases.
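Adjusting for false positives across thousands of gene-level tests is commonly done with a false discovery rate procedure. A minimal sketch of the Benjamini-Hochberg step-up method, with hypothetical p-values (the source does not name a specific procedure):

```python
def benjamini_hochberg(p_values, fdr=0.05):
    """Return indices of tests judged significant at the given false
    discovery rate using the Benjamini-Hochberg step-up procedure."""
    m = len(p_values)
    ranked = sorted(range(m), key=lambda i: p_values[i])
    cutoff_rank = 0
    for rank, idx in enumerate(ranked, start=1):
        # Compare each ordered p-value to its rank-scaled threshold.
        if p_values[idx] <= rank / m * fdr:
            cutoff_rank = rank
    return sorted(ranked[:cutoff_rank])

# Hypothetical p-values from per-gene differential-expression tests:
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5]
significant = benjamini_hochberg(pvals, fdr=0.05)  # indices [0, 1]
```

Note how several genes with raw p < 0.05 fail the correction, which is exactly the false-positive control the flashcard refers to.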
How is toxicogenomic data used
1) Assessment of treatment effects: differential expression, heat maps and clustering, principal components analysis (PCA).
2) Assessment of functionality and interactions: functional analysis, pathway/network analysis, upstream regulators.
3) Assessments specific for RA: in vivo/in vitro or species comparisons, comparisons to publicly available databases, dose-response assessment.
Define the three most common approaches for assessing gene functionality and interactions using toxicogenomic data
1) Gene functionality analysis: Gene Ontology (GO) terms, the DAVID bioinformatics tool, Gene Set Enrichment Analysis (GSEA).
2) Pathway and network analysis: examine and visualize interactions between genes (pathways or networks) to decipher a chemical's MOA. Tools: KEGG, BioCarta, MetaCore, IPA.
3) Analysis of upstream regulators: transcription factor analysis, commonly used to predict molecular initiating events (AOPs).
Define the three most common approaches for assessing treatment effects using toxicogenomic data
Report differentially expressed genes based on fold changes (using a statistical test), visualize them with heat maps and clustering, and apply principal components analysis (PCA).
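Fold-change-based reporting can be sketched as follows; the gene names, expression values, and two-fold cutoff are hypothetical illustrations:

```python
import math

def log2_fold_change(treated_mean, control_mean):
    """log2 ratio of treated vs. control expression for one gene."""
    return math.log2(treated_mean / control_mean)

# Hypothetical mean expression values (treated, control) for three genes:
expression = {
    "geneA": (200.0, 100.0),  # doubled -> log2FC = 1
    "geneB": (50.0, 100.0),   # halved  -> log2FC = -1
    "geneC": (105.0, 100.0),  # barely changed
}

# Flag genes passing a common |log2FC| >= 1 (two-fold) threshold;
# in practice this is paired with a statistical test, not used alone.
changed = {g for g, (t, c) in expression.items()
           if abs(log2_fold_change(t, c)) >= 1.0}
```

The flagged set would then feed the heat maps, clustering, and PCA mentioned above.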
Define three approaches for the specific application of toxicogenomics to human health risk assessment (described in the lecture)
Comparison of in vivo or in vitro effects: are the affected pathways similar or different between animals and humans? Are the adverse effects the same in animals and people?
Comparison with publicly available data: does the chemical act similarly to other chemical agents (similar transcription factor induction and MOA)?
Dose-response relationships and identification of biologically relevant doses: the dose response of gene expression (transcriptional effect level, NOEL, LOEL, etc.) can sometimes be used for dose-response assessment.
Define approaches collectively considered under systems toxicology umbrella term
Assessment of treatment effects:
heatmaps and clusters of differentially expressed genes
Principal components analysis
Differential expression
Assessment of functionality and interactions
functional analysis
pathway and network MOA analysis
Upstream regulators
Assessments specific for RA
in vivo in vitro comparison
comparison with outside data
dose-response assessment
Understand the overall aim of systems toxicology
Systems toxicology aims to integrate systems biology, chemistry, and toxicology, combining classical models with network models and quantitative measures of molecular change and function across multiple levels of biological organization.
It aims to develop a detailed, mechanistic, and quantitative understanding of toxicological processes, permitting prediction and accurate simulation of adverse outcomes.
This may improve risk assessment.
Define deterministic versus probabilistic risk assessments and understand how systems toxicology data could be used in either approach
Deterministic risk assessment: based on discrete scenarios used to directly calculate a reference dose.
Probabilistic risk assessment: data evaluated from multiple tests, where each test changes to some extent the probability of a hazard and/or its uncertainty. Includes low-probability events (e.g., low dose) missed by deterministic approaches. A comprehensive methodology.
Deterministic assessment might suit a new drug, while probabilistic assessment might suit a chemical's interactions with the environment.
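The probabilistic idea that each test shifts the probability of a hazard can be sketched as a simple Bayesian update; the prior, sensitivity, and specificity values are hypothetical, not from the lecture:

```python
def bayes_update(prior, sensitivity, specificity, positive):
    """Update the probability a chemical is hazardous after one assay result."""
    if positive:
        likelihood_h = sensitivity           # P(positive | hazardous)
        likelihood_not = 1.0 - specificity   # P(positive | not hazardous)
    else:
        likelihood_h = 1.0 - sensitivity
        likelihood_not = specificity
    numerator = likelihood_h * prior
    return numerator / (numerator + likelihood_not * (1.0 - prior))

# Start from a 10% prior and apply two positive assay results:
p = 0.10
for _ in range(2):
    p = bayes_update(p, sensitivity=0.9, specificity=0.8, positive=True)
```

Each additional test nudges the hazard probability up or down, which is the sense in which probabilistic assessment accumulates evidence across multiple data streams.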
Understand the concept of a biological network and what it represents (nodes/edges)
Molecular pathways and interactions among molecules, transcripts, and proteins: the whole complex network, including pathway induction or repression, binding, inhibition, upregulation, downregulation, etc.
A biological network is a method of representing systems as complex sets of binary interactions or relations between various biological entities. In general, networks or graphs are used to capture relationships between entities or objects. A typical graphing representation consists of a set of nodes connected by edges.
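The nodes-and-edges idea can be sketched as an adjacency map; the entity names and interaction labels here are hypothetical illustrations:

```python
# Nodes are biological entities (genes, proteins, chemicals, metabolites);
# edges are labeled pairwise interactions between them.
network = {}  # adjacency map: node -> {neighbor: interaction_type}

def add_edge(graph, a, b, interaction):
    """Record an undirected labeled edge between nodes a and b."""
    graph.setdefault(a, {})[b] = interaction
    graph.setdefault(b, {})[a] = interaction

add_edge(network, "TF_X", "geneY", "upregulates")
add_edge(network, "chemZ", "TF_X", "inhibits")

# Traversing a node's edges recovers its interaction partners:
neighbors_of_tf = set(network["TF_X"])  # {"geneY", "chemZ"}
```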
Understand the current technical limitations associated with interpreting systems toxicology data within a biological network
Genes without annotation; genes involved in multiple pathways; limited information about pathways due to conflicting knowledge or variability; not enough data to be sure of causal relationships between toxicogenomic changes and adverse outcomes (phenotypic anchoring).
No method has yet been accepted for regulatory testing.
Understand the goals of ToxCast/Tox21 high‐throughput screening (HTS) efforts
Identify patterns of compound-induced biological responses to:
- Characterize toxicity/disease pathways, facilitate cross-species extrapolation, model low-dose extrapolation
- Prioritize compounds for more extensive toxicological evaluation, predict potential human health effects, develop predictive models for toxicity signatures, derive safe exposure levels for humans
What are the 3 key components of HTS?
1) Chemicals
- databases
2) Assays
- endpoints in vitro models (in databases)
3) Computational methods
- applications
Describe the EDSP and the recent changes made to utilize HTS assays
EDSP: The Food Quality Protection Act (FQPA) required a screening program for chemicals that disrupt the endocrine system, implemented in 1998 following EDSTAC's recommendations using a two-tiered approach.
Approach
• Tier 1: Prioritization based on exposure and endocrine bioactivity (find them)
• Tier 2: Testing for dose-response and adverse effects (test them a lot)
This took a long time, so in June 2015 EPA moved to using Tox21 HTS; the results were equally reliable as the slower system.
Describe the EDSP and the recent changes made to utilize HTS assays
June 19, 2015: EPA posts a Federal Notice to incorporate computational and high-throughput screening to improve its ability to fulfill the statutory mandate to screen pesticide chemicals and other substances for their ability to cause adverse effects on the endocrine system.
Goals
• Prioritize chemicals for further EDSP screening and testing based on estimated bioactivity and exposure
• Contribute to WOE evaluation of a chemical’s potential bioactivity
• Substitute for specific endpoints in the EDSP Tier 1 battery
• Use estrogen bioactivity as proof‐of‐concept
Understand the limitations associated with the current HTS paradigm
Limited by the solubility and volatility of chemicals in assays; hard to confirm concentrations in a 1536-well format; normalization across plates/assays; evaluation of dose response; need to develop database resources.
Understand the areas for future development of in vitro models and the benefits for HTS
fast
FIFRA
Federal Insecticide, Fungicide, and Rodenticide Act
First administered by USDA, acting as a watchdog over industry and its chemicals; enforcement transferred to US EPA in 1978. A cost-benefit statute intended to protect human health and the environment. For agricultural chemicals, the burden of proof of safety is on the registrant.
FIFRA Modified in 1996 By the FQPA
Modified to protect children from pesticides, with increased use of safety factors. The registrant provides information on the chemical's use: the crop, location, timing, and application rate, plus the chemical's structural class and mode of action.
TSCA
Toxic Substances Control Act, 1976. Provides a watchdog on industry to protect human health and the environment; the burden of data and proof of safety is on the government. Covers industrial chemicals (e.g., lead, asbestos), mixtures, etc. The government has 90 days to respond if a chemical is suspect; if it does not respond, the chemical can go on the market. This changed in 2016 with the Frank R. Lautenberg Chemical Safety for the 21st Century Act.
Frank R. Lautenberg Chemical Safety for the 21st Century Act
Mandatory duty on EPA to evaluate existing chemicals, with clear and enforceable deadlines
• Old TSCA: no duty to review, no deadlines for action
Chemicals assessed against a risk-based safety standard with no consideration of non-risk factors
• Old TSCA: risk-benefit balancing standard
Unreasonable risks identified in the risk evaluation must be eliminated
• Old TSCA: significant risks might not be addressed due to cost/benefit balancing and no mandate to act
Expanded authority to compel development of chemical information when needed, by order, rule, or consent agreement
• Old TSCA: required lengthy rulemaking
Requires EPA to make an affirmative determination on new chemicals before entry into the marketplace
• Old TSCA: new chemicals enter the market in the absence of EPA action
REACH (Registration, Evaluation and Authorisation of Chemicals)
EU
2003: registrants must provide data showing that a chemical is safe, identify hazards and risks, and demonstrate in a safety report that these risks are being controlled.
Benefits of using zebra fish model
Same organ systems as humans; about 80 percent of genes are shared with humans. Fully metabolically competent by 72 hours. They are most sensitive during development, but all signals are present.
Interrogation of multiple levels - high content data
Embryonic development serves as a biological marker
Common endpoints in the Zebrafish model
bent body axis
reduced growth
yolk sac edema
lack of swim bladder inflation
Benefits of using zebrafish for high-throughput screening compared to current in vitro and receptor-based assays
The zebrafish model is a biologically relevant system: it has a metabolism, can be screened for behavioral and developmental effects, and allows rapid testing to prioritize chemicals for further testing.
How does zebrafish screening help provide data on chemical mechanisms of action? What is an example described in lecture?
Signaling pathways and molecular events are well conserved
• ..But fish are not rodents or humans
• Consequences of disrupted signaling often species (tissue or life stage) specific
• In other words, the mechanism by which a “target” is hit is likely conserved, but the consequence of the “hit” may be species or life stage specific
What is a primary challenge for interpreting zebrafish high-throughput screening data?
Genetic differences; zebrafish become metabolically competent at 3 days, so body burden over time is dynamic for some chemicals and cannot yet be predicted.
• Adsorption and uptake may differ for each chemical
Dosimetry