United States
Environmental Protection
Agency
Environmental Criteria and
Assessment Office
Cincinnati OH 45268
EPA-600/9-84-008
March 1984
Research and Development
EPA
Approaches to
Risk Assessment for
Multiple Chemical
Exposures
-------
EPA-600/9-84-008
March 1984
APPROACHES TO RISK ASSESSMENT FOR MULTIPLE CHEMICAL EXPOSURES
Dr. Jerry F. Stara, Director, ECAO
Editor
Environmental Criteria and Assessment Office
Cincinnati, Ohio 45268
Dr. Linda S. Erdreich
Technical Editor
Environmental Criteria and Assessment Office
Cincinnati, Ohio 45268
Contract No. 68-03-3111 by Dynamac Corporation
Contract No. 68-03-3156 by ERCO/Energy Resources Co., Inc.
Environmental Criteria and Assessment Office
Office of Health and Environmental Assessment
Office of Research and Development
U.S. Environmental Protection Agency
Cincinnati, OH 45268
U.S. Environmental Protection Agency
Region 5, Library
77 West Jackson Boulevard, 12th Floor
Chicago, IL 60604-3590
-------
NOTICE
This document has been reviewed in accordance with U.S. Environmental
Protection Agency policy and approved for publication. Mention of trade
names or commercial products does not constitute endorsement or
recommendation for use.
ii
-------
TABLE OF CONTENTS
Page
SYSTEMIC TOXICANTS (September 29, 1982) 1
Acceptable Daily Intake 1
Presentation 2
Critiques 23
Discussion 36
References 40
Interspecies Conversion of Dose and Duration of Exposure 43
Presentation 44
Critiques 47
Discussion 58
References 62
Risk Assessment for Less-Than-Lifetime Exposure 64
Presentation 65
Critiques 69
Discussion 72
References 74
Incidence and/or Severity of Effects 75
Presentation 76
Critiques 78
Discussion 86
References 90
Route-to-Route Extrapolation and the Pharmacokinetic Approach. . . 91
Presentation 92
Critiques 101
Discussion 114
References 116
Multiple Route Exposures 118
Presentation 119
Critiques 124
Discussion 130
References 131
iii
-------
TABLE OF CONTENTS (cont.)
Page
The Impact of Carcinogens in Risk Assessment of Chemical
Mixtures 132
Presentations 133
Discussion 143
References 164
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES
(September 30, 1982) 172
Outline of Issues and Review of Present Approaches 173
Presentations 174
Discussion 175
References 182
Assessment of Exposure 183
Presentations 184
Discussion 186
Subpopulations at Greater Risk 189
Presentation 190
Critiques 204
Discussion 208
References 215
Biological Bases of Toxicant Interactions and Mathematical Models . . 219
Presentation 220
Critiques 240
Discussion 252
References 258
Summation of Meeting 264
Concluding Comments 273
iv
-------
WORKSHOP PARTICIPANTS
ECAO-Cin AUTHORS AND REVIEWERS
Dr. Jerry F. Stara
Toxicologist (Director)
Randall J.F. Bruins
Environmental Scientist
Dr. Michael L. Dourson
Toxicologist
Dr. Linda S. Erdreich
Epidemiologist
Dr. Richard C. Hertzberg
Biomathematician
Mr. Steven D. Lutkenhoff
Acting Deputy Director
Dr. Debdas Mukerjee
Environmental Health Scientist
Ms. Cynthia Sonich Mullin
Epidemiologist
Dr. William E. Pepelko
Toxicologist
Ms. Annette Pressley
Task Manager
EXTERNAL PEER REVIEWERS AND PARTICIPANTS
Dr. Roy Albert
Inst. of Environmental Medicine
Tuxedo, New York
Dr. Julian Andelman
University of Pittsburgh
Graduate School of Public Health
Pittsburgh, Pennsylvania
Dr. Eula Bingham
Kettering Laboratory
College of Medicine
University of Cincinnati
Cincinnati, Ohio
Dr. Daniel M. Byrd, III
Health and Safety Regulation Dept.
American Petroleum Institute
Washington, DC
Dr. Tom Clarkson
Division of Toxicology
School of Medicine and Dentistry
University of Rochester, New York
Dr. Herb Cornish
School of Public Health
University of Michigan
Ann Arbor, Michigan
Dr. Kenny S. Crump
Science Research Systems
Ruston, Louisiana
Dr. Robert Cumming
Biology Division
Oak Ridge National Laboratory
Oak Ridge, Tennessee
Dr. Patrick Durkin
Syracuse Research Corporation
Syracuse, New York
Mr. Kurt Enslein
Health Designs, Inc.
Rochester, New York
Mr. William Gulledge
Chemical Manufacturers Association
Washington, DC
Dr. Rolf Hartung
School of Public Health
University of Michigan
Ann Arbor, Michigan
Dr. Donald Hughes
Procter and Gamble Company
Cincinnati, Ohio
-------
EXTERNAL PEER REVIEWERS AND PARTICIPANTS (cont.)
Dr. Marvin Legator
University of Texas Medical School
Dept. of Preventive Medicine and
Community Health
Galveston, Texas
Dr. Richard Kociba
Dow Chemical
Midland, Michigan
Dr. Myron Mehlman
Mobil Oil Company
Princeton, New Jersey
Dr. Sheldon D. Murphy
Dept. of Environmental Health
University of Washington
Seattle, Washington
Dr. Robert A. Neal
Chemical Industry Institute of
Toxicology
Research Triangle Park, North
Carolina
Dr. William Nicholson
Department of Environmental Health
Mt. Sinai Hospital
New York, New York
Dr. Ian C.T. Nisbet
Clement Associates
Arlington, Virginia
Dr. Ellen O'Flaherty
Kettering Laboratory
University of Cincinnati
Cincinnati, Ohio
Dr. Magnus Piscator
University of Pittsburgh
Center of Excellence
Pittsburgh, Pennsylvania
Dr. Reva Rubenstein
National Solid Waste Management
Organization
Washington, DC
Dr. Marvin Schneiderman
Clement Associates
Arlington, Virginia
Dr. Harry Skalsky
Reynolds Metals Company
Richmond, Virginia
Dr. Robert Tardiff
Life Systems/ICAIR
Arlington, Virginia
Dr. James L. Whittenberger
Occupational Health Center
University of California
Irvine, California
Dr. James Withey
Food Directorate
Bureau of Chemical Safety
Ottawa, Canada
Dr. Ronald Wyzga
Electric Power Research Institute
Palo Alto, California
OTHER GOVERNMENT SCIENTISTS
Dr. Judith Bellin
U.S. EPA
Washington, DC
Dr. James W. Falco
U.S. EPA
Washington, DC
Dr. Les Grant
Director, ECAO
U.S. EPA
Research Triangle Park, North
Carolina
Dr. Arnold Kuzmack
U.S. EPA
Washington, DC
Dr. William Lappenbusch
U.S. EPA
Washington, DC
Dr. Si Duk Lee
U.S. EPA
Research Triangle Park, North
Carolina
vi
-------
OTHER GOVERNMENT SCIENTISTS (cont.)
Dr. Robert McGaughy
U.S. EPA
Washington, DC
Dr. James Melius
DSHEFS, NIOSH
Cincinnati, Ohio
Dr. James Murphy
U.S. EPA
Washington, DC
Dr. Arthur Pallotta
U.S. EPA
Washington, DC
William B. Peirano
U.S. EPA
Cincinnati, Ohio
-------
PREFACE
The Environmental Criteria and Assessment Office (ECAO) in Cincinnati
has prepared methodologies for deriving ambient water quality criteria and
for conducting risk assessments on a specific group of solvents. The
methodology for deriving ambient water quality criteria focused on chronic
exposure to a single chemical from a single route of exposure. The solvent
methodology expanded this approach to consider the effect(s) of a single
chemical by all relevant routes of exposure (oral, dermal and inhalation)
for all durations of exposure (acute, short-term, subchronic and chronic).
In both methodologies, risk assessments for carcinogens associated an expo-
sure level with a particular incidence of cancer using a non-threshold model
which is linear at low doses. For systemic toxicants, the no-observed-
adverse-effect level (NOAEL)/uncertainty factor approach was used to
estimate an acceptable daily intake (ADI).
The current Multichemical Health Risk Assessment methodology which ECAO
is attempting to develop is intended to be used in conducting site-specific
risk assessments on hazardous waste disposal facilities. In developing this
methodology, it will be assumed that other offices in the Agency will be
able to make reasonable estimates of daily doses from oral, dermal and
inhalation routes and will be able to adequately characterize the length of
exposure and the population exposed. Ideally, the methodology developed by
ECAO would be used to estimate from the available exposure data the types of
health effects which might be expected and the incidence of these effects, as
well as an estimate of the relative hazard of each facility. Thus, some of
the major areas for methodologic development include a reasonable approach
for multiple chemical exposures, a system for combining or weighting adverse
effects, and the selection of a reasonable extrapolation model for toxic
effects.
These and other relevant issues were addressed during a 2-day workshop
on "Approaches to Risk Assessment for Multiple Chemical Exposures" held by
the U.S. Environmental Protection Agency in Cincinnati, Ohio on September 29
and 30, 1982. The workshop was attended by 50 scientists from EPA and
private industry. The first day of the workshop focused on the subject of
"Systemic Toxicants". Presentations were made on seven aspects of this
topic. Each presentation was followed by prepared critiques from other
attendees and then a discussion session. Presentations on the second day of
the workshop addressed the subject of "Health Assessment of Exposures to
Chemical Mixtures".
This document presents the results of this workshop, including presenta-
tions, critiques, discussion and references for each of the 11 subtopics
covered. A summary of the workshop is presented at the end of this docu-
ment, as well as concluding comments submitted by participants some time
after the workshop.
Dr. Jerry F. Stara
ECAO, OHEA, U.S. EPA
-------
SYSTEMIC TOXICANTS
Acceptable Daily Intake
Presentation: Dr. Michael Dourson
ECAO, OHEA, U.S. EPA
Critique: Dr. Thomas Clarkson
University of Rochester
Critique: Dr. Harry Skalsky
Reynolds Metals Company
Critique: Dr. Arthur Pallotta
U.S. EPA, Office of Solid
Waste and Emergency Response
-1-
-------
PRESENTATION
DR. MICHAEL DOURSON: TOXICITY-BASED METHODOLOGY, THRESHOLDS AND POSSIBLE
APPROACHES, AND UNCERTAINTY FACTORS
Present Toxicity-Based Methodology
In developing guidelines for deriving acceptable daily intakes (ADIs)
for systemic toxicants, four types of response levels are considered:
NOEL: No-Observed-Effect Level. That exposure level at which there
are no statistically significant increases in frequency or
severity of effects between the exposed population and its
appropriate control.
NOAEL: No-Observed-Adverse-Effect Level. That exposure level at which
there are no statistically significant increases in frequency
or severity of adverse effects between the exposed population
and its appropriate control. Effects are produced at this
level, but they are not considered to be adverse (e.g., the
lowest NOAEL can also be termed a LOEL, that is, a lowest-
observed-effect level).
LOAEL: Lowest-Observed-Adverse-Effect Level. The lowest exposure
level in a study or group of studies which produces statisti-
cally significant increases in frequency or severity of adverse
effects between the exposed population and its appropriate
control.
FEL: Frank-Effect Level. That exposure level which produces unmis-
takable adverse effects, ranging from reversible histopatholog-
ical damage to irreversible functional impairment or mortality,
at a statistically significant increase in frequency or sever-
ity between an exposed population and its appropriate control.
Adverse effects are defined as any effects which result in functional
impairment and/or pathological lesions which may affect the performance of
the whole organism, or which reduce an organism's ability to respond to an
additional challenge. Frank effects are defined as overt or gross adverse
effects (severe convulsions, lethality, etc.).
-2-
-------
These concepts are illustrated in Figure 1. They have received much
attention because they represent landmarks which help to define the thresh-
old region in specific experiments. Thus, if an experiment yields a NOEL, a
NOAEL, a LOAEL, and a clearly defined FEL at relatively closely spaced
doses, the threshold region has been relatively well defined. Such data are
very useful for deriving an ADI. On the other hand, a clearly defined FEL
has little utility in establishing criteria when it stands alone, because
such a level gives no indication of how far removed it is from the threshold
region. Similarly, a free-standing NOEL has little utility, because there
is no indication of its proximity to the threshold region.
Based on the above dose-response classification system, the following
guidelines for deriving criteria from toxicity data have been adopted:
A free-standing FEL is unsuitable for the derivation of an ADI.
A free-standing NOEL is unsuitable for the derivation of an
ADI. If multiple NOELs are available without additional data,
NOAELs or LOAELs, the highest NOEL should be used to derive a
criterion.
A NOAEL or LOAEL can be suitable for ADI derivation. A well-
defined NOAEL from a chronic (at least 90-day) study may be
used directly, applying the appropriate uncertainty factor. In
the case of a LOAEL, an additional uncertainty factor is
applied; the magnitude of the additional uncertainty factor is
judgmental and should lie in the range of 1 to 10. Caution must
be exercised not to substitute "Frank-Effect Levels" for
"Lowest-Observed-Adverse-Effect Levels."
If -- for reasonably closely spaced doses -- only a NOEL and a
LOAEL of equal quality are available, then the appropriate
uncertainty factor is applied to the NOEL.
In using this approach, the selection and justification of uncertainty
factors are critical. The basic definition and guidelines for using uncer-
tainty factors have been given by the National Academy of Sciences (NAS,
1977). "Safety factor" or "uncertainty factor" is defined as a number that
-3-
-------
[Figure 1: response (percent) versus dose (arbitrary units) for three
effects -- A, slight body weight decrease; B, liver necrosis; C, mortality.]
FIGURE 1
Response levels considered in defining threshold regions in toxicity
experiments. Doses associated with these levels are as follows: 3 - NOEL;
4 - LOEL, NOAEL; 5 - NOAEL (Highest); 7 - LOAEL; 10 - FEL; 20 - FEL.
-4-
-------
reflects the degree or amount of uncertainty that must be considered when
experimental data in animals are extrapolated to man. Dourson and Stara
(1983) discuss uncertainty factors in more detail.
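The arithmetic of this approach can be stated compactly. The short Python sketch
below was written for this summary rather than taken from any Agency software; it
derives an ADI by dividing the chosen effect level by the product of the applicable
uncertainty factors, with factor values following the 10-, 100- and 1000-fold
guidelines summarized later in Table 1. The example inputs are hypothetical.

    # Minimal sketch of the NOAEL/uncertainty-factor approach to an ADI.
    # Factor values follow the 10/100/1000-fold guidelines (NAS, 1977);
    # the input numbers below are hypothetical, for illustration only.

    def adi_mg_per_day(effect_level_mg_per_kg_day, body_weight_kg=70.0,
                       human_data=False, chronic_study=True, is_loael=False,
                       loael_factor=10.0):
        """Return an acceptable daily intake (mg/day) for an average adult."""
        uf = 10.0                  # sensitive individuals within the human population
        if not human_data:
            uf *= 10.0             # average animal -> average human
        if not chronic_study:
            uf *= 10.0             # less-than-chronic -> chronic exposure
        if is_loael:
            uf *= loael_factor     # LOAEL -> NOAEL (judgmental, 1 to 10)
        return effect_level_mg_per_kg_day * body_weight_kg / uf

    # Example: a hypothetical NOAEL of 5 mg/kg/day from a chronic animal study
    print(adi_mg_per_day(5.0))     # 5 * 70 / 100 = 3.5 mg/day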
Thresholds and Suggested Approaches
As part of my presentation, I would like to reference the following
discussions by Drs. Clarkson, Crump, Hartung, O'Flaherty, Tardiff and the
ECAO-Cin staff concerning thresholds and suggested approaches. These
comments were made at previous ECAO meetings on risk assessment.
Dr. Thomas Clarkson:
In view of the large differences in toxicity, the nature of the toxic
endpoint and the mode of action of toxicants, it is unreasonable to place
all "non-carcinogens" in one category.
Categories should distinguish between reversible and irreversible
action, between compounds that act rapidly on basic cellular processes (e.g.,
cyanide) and those that have a delayed, complex mode of action (skin sensi-
tizers), and between compounds that have long biologic half-lives versus
those that are rapidly eliminated.
Dr. Kenneth Crump:
The advantages and disadvantages of three options for estimating the
health risk for systemic toxicants are discussed below.
Option 1:
Definition. Determine a NOEL or a NOAEL and apply a safety factor.
Advantages. This approach has been used for setting allowable exposure
levels for many years. It is familiar to regulators, toxicologists and
other scientists, and has been applied effectively to control human expo-
sures to many substances.
-5-
-------
Disadvantages. The NOEL approach does not fully utilize the slope of
the dose-response curve. All other things being equal, a steeper dose-
response should lead to higher safe levels. However, the shape of the dose-
response curve is disregarded in the NOEL approach except for deciding
whether or not effects were observed at individual dose levels.
The size of the experiment is not fully incorporated into the NOEL
approach. No observed adverse effects in larger numbers of animals repre-
sent greater evidence of safety and should lead to higher permitted expo-
sure levels. However, this is not a part of the NOEL approach. Rather than
rewarding good experimentation by the proponents of chemicals, it encourages
them to use as few animals as possible, because with fewer animals adverse
effects are less likely to be observed.
The NOEL approach also does not furnish particularly useful information
for cost-benefit analyses; from a NOEL it is possible to estimate doses for
which no effects are anticipated, but not the possible magnitudes of effects
from exposures to higher doses.
Option 2:
Definition. Fit a mathematical model to dose-response data and, using
statistical confidence limits, extrapolate downward to doses appropriately
safe for humans (e.g., doses corresponding to risks of 10^-5 or less).
Advantages. This approach has been used to a considerable extent in the
past few years, chiefly for carcinogenic risks. It rewards good experimen-
tation in that larger experiments tend to produce narrower confidence limits
and consequently larger lower limits for safe doses. It also explicitly
takes into account the shape of the dose-response curve because a mathemati-
cal model is fit to all of the dose-response data. It provides estimates of
risk corresponding to any dose, along with associated confidence limits, and
-6-
-------
therefore can be conveniently used in cost-benefit analyses. The method
does not, in principle, rule out thresholds, because a model which incorpo-
rates a threshold could be used for the extrapolation.
Disadvantages. The chief disadvantage of an extrapolation approach is
related to the choice of the model; different models that fit the observed
data equally well can yield vastly different results when extrapolated to
doses corresponding to very small risks. With carcinogenic risks, informa-
tion on the nature of the carcinogenic process is used to aid in the selec-
tion of a model. There are theoretical reasons to believe that the shape of
the dose-response curve is apt to be approximately linear at low doses
whenever a chemical initiates cancer through a change in the DNA of a single
cell, or whenever background carcinogenesis is present. (This latter condi-
tion does not require carcinogenesis as the toxic response for its applica-
bility.) Some experimental data from mutagenesis experiments also support
the concept of low-dose linearity for genotoxic effects. The use by EPA and
others of low-dose linear models for carcinogenic risk assessment reflects
the point of view that the true dose-response curve is likely to be linear
in the low-dose region, and that curve shapes which predict appreciably
higher risks than those predicted by a linear model seem very unlikely; thus
it seems reasonable to calculate lower limits on safe doses from a model
which is linear at low doses. The multistage and one-hit models used pre-
viously by EPA are linear at low dose.
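Because such models are linear at low dose, the risk-specific dose follows directly
from an estimated potency. The short sketch below is illustrative only; the potency
value is hypothetical, though of the same order as the hexachlorobutadiene value
quoted later in this presentation.

    import math

    def one_hit_risk(dose, q):
        """Extra lifetime risk under the one-hit model, P(d) = 1 - exp(-q*d)."""
        return 1.0 - math.exp(-q * dose)

    def risk_specific_dose(target_risk, q):
        """Dose giving a specified extra risk; ~ target_risk/q at low dose."""
        return -math.log(1.0 - target_risk) / q

    q_star = 0.0775                    # hypothetical potency, (mg/kg/day)^-1
    d = risk_specific_dose(1e-5, q_star)
    print(d, one_hit_risk(d, q_star))  # ~1.3e-4 mg/kg/day, extra risk 1e-5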
For nongenotoxic events, a linear dose-response seems more unlikely than
for genotoxic events. Although a linear model still would define upper
bounds to low-dose risks, these upper bounds might overestimate the true
risk by exceedingly large amounts in some instances. There are arguments
which suggest that thresholds may exist in some cases. Thus the uncertainty
-7-
-------
as to the true shape of the dose-response curve for non-genotoxic effects
and the fact that different dose-response curves can give vastly different
results constitute a disadvantage to the extrapolation approach for non-
genotoxic effects.
A second disadvantage to the extrapolation approach is that toxicologi-
cal experiments are frequently not designed or reported in a manner which
facilitates the use of model-fitting approaches. Frequently, the doses are
selected too far apart to adequately describe the dose-response curve.
Sometimes the data necessary for fitting a model are not reported in the
open literature. However, the experimental design and reporting of data
could be improved in future toxicological experiments once a model-fitting
approach was adopted.
Option 3:
Definition. Fit a mathematical model to dose-response data and, using
statistical confidence limits, calculate a lower confidence limit on the
dose corresponding to a risk of 0.01; then apply a safety factor to this
dose which reflects the severity of the toxic effect and the thoroughness of
the toxicological study.
Advantages. This approach is intermediate between the first two
options. It shares some of their advantages while avoiding some of their
disadvantages. Like Option 2, but unlike Option 1, it takes the shape of
the dose-response curve explicitly into account. It would reward good
experimentation in that larger, better-designed experiments should yield
higher lower confidence limits and thereby higher allowable human exposures.
However, it would avoid much of the problem associated with Option 2 regard-
ing the choice of the mathematical model for risk assessment. This is
-8-
-------
because there will be far less disagreement among various models if extrapo-
lation is only carried out down to a risk of 10^-2. The size of the safety
factor to be applied could then reflect the severity of the toxic effect,
the thoroughness of the toxicological study, and possibly also information
on mechanisms of action.
Disadvantages. This approach would share the disadvantage of Option 2
with respect to inadequate experimental designs and reporting of data.
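A sketch of how Option 3 might be implemented is given below. It is illustrative
only: the dose-response data are hypothetical, a one-hit model with independent
background stands in for whatever model would actually be chosen, and a parametric
bootstrap approximates the lower confidence limit that would normally come from a
profile-likelihood or similar calculation.

    import numpy as np
    from scipy.optimize import minimize

    doses = np.array([0.0, 1.0, 5.0, 25.0])   # mg/kg/day (hypothetical)
    n     = np.array([50,  50,  50,  50])     # animals per group
    x     = np.array([1,   3,   9,   30])     # responders

    def neg_loglik(params, x, n):
        # One-hit model with independent background: P(d) = c + (1-c)*(1 - exp(-q*d))
        c, q = params
        p = c + (1.0 - c) * (1.0 - np.exp(-q * doses))
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        return -np.sum(x * np.log(p) + (n - x) * np.log(1.0 - p))

    def fit(x, n):
        res = minimize(neg_loglik, x0=[0.01, 0.01], args=(x, n),
                       bounds=[(0.0, 0.999), (1e-8, None)], method="L-BFGS-B")
        return res.x

    def dose_at_extra_risk(q, risk=0.01):
        return -np.log(1.0 - risk) / q        # solve 1 - exp(-q*d) = risk

    c_hat, q_hat = fit(x, n)
    d01 = dose_at_extra_risk(q_hat)           # MLE of the dose at 1% extra risk

    rng = np.random.default_rng(0)
    p_hat = c_hat + (1.0 - c_hat) * (1.0 - np.exp(-q_hat * doses))
    boot = []
    for _ in range(500):                      # parametric bootstrap
        xb = rng.binomial(n, p_hat)
        boot.append(dose_at_extra_risk(fit(xb, n)[1]))
    lower_cl = np.percentile(boot, 5)         # approximate one-sided lower 95% limit

    safety_factor = 100.0                     # judgmental (severity, study quality)
    print(d01, lower_cl, lower_cl / safety_factor)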
Dr. Rolf Hartung:
Thresholds
Discussion of whether or not carcinogens or systemic toxicants elicit no
response at some dose above zero (i.e., a practical threshold) centers pri-
marily on what a threshold means biologically. Several scientists suggest
that when xenobiotics act through a mechanism such as enzyme inhibition,
depletion of required substances, or inhibition of transport mechanisms,
then the production of an effect might be thought to depend on the inter-
actions of: 1) the concentration of the xenobiotic; 2) the reaction with
the receptor; 3) the reserve capacity of the affected system; and 4) the
turnover time of the affected enzyme or the capacity of the repair system.
All of these are mechanistically one step removed from the hypothesized
mechanism of action for genotoxic carcinogens -- direct one-to-one inter-
action with DNA -- and therefore lead to a mechanism having a threshold.
Any reasonable health risk assessment approach should make use of as
much of the available data as possible, and should also make use of the
theoretical knowledge which has been accumulated in toxicology. This is
exactly what enters into the so-called judgmental evaluations of a toxi-
cologist, and in the following paragraphs I will try to outline some of
these thought processes to make them more amenable to quantitative evalua-
tions. The only responses which will be considered in this discussion are
those that can be measured, recognizing that the visibility of a response is
partially dependent on experimental design.
The biological responses to insult from foreign chemicals follow a
series of progressions which can be presented as generalizations as follows:
1. At sufficiently low exposures for full lifetimes, no effect of any kind
will be found in any tested organism, no matter how sophisticated the
experimental design (this statement avoids theoretical considerations of
the presence or absence of thresholds).
2. At higher concentrations for full lifetimes, subtle effects may be
noted in a small proportion of exposed individuals. Such effects may be
adaptive in nature, or may represent responses whose harmfulness is
subject to honest debate. These concentrations are still sufficiently
low that no experimental design can measure severely adverse effects
in the exposed population. Similar circumstances can be produced by
reducing the duration of exposure to less than a lifetime and increasing
the concentration to offset the impact of shortened durations of
exposure.
3. As the concentration-duration of exposure parameters increase, a greater
proportion of the population will show the subtle effects noted under
generalization 2, above, and a small proportion of the population will
demonstrate effects which are slightly adverse. The early subtle
effects have under many circumstances been considered to be "critical
effects", meaning that if one protects the population from these early
subtle effects, then no harmful effects will occur in the exposed
population.
-10-
-------
4. As the exposure parameters intensify, the incidence and severity of
adverse effects in the population will increase. More and more of the
population will become involved, and the occurrence of non-responders
will decrease significantly as the exposure parameters increase. At
some combination of exposure parameters, the entire measurable popula-
tion will respond, some with more severe effects than others. Eventu-
ally even very simple experimental designs will indicate severe adverse
effects in the exposed population in comparison to controls.
Suggested Approach
These known progressions of toxicologic responses form the basis of all
judgmental evaluations of toxicologic risks. Operationally, this approach
might entail the quantitative evaluation of all suitable animal and human
data in terms of dose-response for a given exposure duration. The statisti-
cal model chosen for this evaluation needs to fit the experimental data with
a minimum of parameters to be fitted, using a set of assumptions which are
compatible with at least a portion of the biologic responses observed. For
ease of computation, I would suggest the logistic model proposed by Berkson
(1944). The question of ease of computation may become important, since I
am proposing that all responses found in animals or humans which can be
evaluated quantitatively should be formulated in terms of the logistic
regression equation describing the response. The result would be a large
set of regression equations for each chemical, describing the relationship
of various exposure scenarios to response rates for a wide range of effects
for each chemical. Knowing the exposure scenarios for various dumps, these
regression equations could then be used to evaluate the likelihood that a
specific exposure could result in a specific effect (subtle or otherwise) in
a hypothetical test organism living on or near the dump site. It is likely
-------
that the incidence of subtle effects would be tremendously greater than the
incidence of severe effects. When combinations of chemicals are evaluated,
it may become obvious that a specific mix is likely to have a combined
effect on one organ system, say the liver.
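As a minimal sketch of what one such evaluation might look like in practice --
hypothetical counts, and a logistic model fit by maximum likelihood rather than by
Berkson's original minimum-logit chi-square -- consider:

    import numpy as np
    from scipy.optimize import minimize

    dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # exposure metric (hypothetical)
    n    = np.array([20,  20,  20,  20,  20])
    resp = np.array([1,   3,   8,  14,  19])      # animals showing the effect

    def p_logistic(d, a, b):
        # Logistic (Berkson, 1944) response probability on a log-dose scale
        return 1.0 / (1.0 + np.exp(-(a + b * np.log10(d))))

    def neg_loglik(theta):
        p = np.clip(p_logistic(dose, *theta), 1e-12, 1 - 1e-12)
        return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))

    a_hat, b_hat = minimize(neg_loglik, x0=[0.0, 1.0], method="Nelder-Mead").x

    # Likelihood of the effect at a site-specific exposure of, say, 0.3 units
    print(p_logistic(0.3, a_hat, b_hat))

One such fitted equation per effect and exposure duration would give the large set of
regression equations described above.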
The evaluative scheme outlined above should not be construed as provid-
ing quantitative risk assessments using the logistic model. Rather, the
scheme is intended to be used to uncover which dump site is producing the
higher comparative risk, and what the likely target organ site and effect
are at the exposure scenario which has been postulated as having occurred near
that site. The intent would be to look for those effects in the exposed
human population to uncover what the actual risk of sustaining subtle
effects and untoward effects was. Even in the absence of any measurable
effects, the potential for adverse impact of various dump sites could be
compared. Following such an approach has several advantages:
1. Dump sites, or other exposure sources, could be prioritized
according to toxicologic responses found in animals or in
humans, and policy decisions could be made on the basis of
such data.
2. For the worst sites it may be possible to correlate animal
responses with human responses. Although the occurrence of
human effects would clearly represent a past failure of needed
protective mechanisms, the evaluation of such events could
provide opportunities for any adjustment of present regulatory
approaches and allow evaluation of the scientific basis for
risk assessment.
Dr. Ellen O'Flaherty:
Thresholds
The biological basis for the existence of thresholds for the action of
systemic toxicants is simply that one molecule of a systemic toxicant is
incapable of causing an adverse or even a measurable effect. On the other
hand, one molecule of a genotoxic carcinogen is potentially capable of
causing a transformation that will eventually be manifest as a tumor. This
-12-
-------
distinction is absolute. It is independent of considerations as to the
efficiency of operation of detoxification and other protective mechanisms.
It provides a firm conceptual basis for differentiating between threshold
and non-threshold toxicants.
How, and whether, a threshold may manifest itself in an experimental
study is a separate question. It leads directly to the concept of the
operational or practical threshold which has been used by the U.S. EPA in
developing the existing guidelines based on various no-observed-effect
levels. There are several features of the no-observed-effect level that
could be more fully developed here, however.
1. All these practical thresholds are dependent on the population size.
The larger the study group size, the lower the highest NOEL is likely to
be. This observation should influence the selection of a NOEL from
among multiple available NOELs. As the guidelines are presently
written, it does not.
2. There is a sequence of response levels, as recognized and discussed by
the U.S. EPA. However, this sequence may vary with the organ or organ
system under consideration. For example, an effect occurring early or
at low exposure in the liver may have little relationship to development
of an ultimately fatal nephropathy. The distinction between adverse and
non-adverse effects is useful here, but is not sufficient. The critical
effect should be clearly defined and its relationship to adverse and
non-adverse effects discussed. The critical effect could be non-
adverse, in the sense of the U.S. EPA's current definition of "adverse".
For most compounds, there will be insufficient information to allow the
critical effect or critical organ to be identified, and safety evalua-
tion will have to fall back on classification of effects as adverse and
-------
non-adverse. Nonetheless, the concept of critical effect is important,
particularly since it relates directly to the issues surrounding risk
assessment for lifetime versus partial-lifetime exposure.
3. Irreversibility of effect is less important, from the standpoint of
establishing a threshold level, than magnitude or severity of effect.
4. In spite of the conceptual distinction between threshold and non-thresh-
old toxicants, thresholds observed in experimental studies with non-
carcinogens may not represent "real" thresholds in hypothetical dose-
response curves. At the relatively high doses used in toxicity studies,
a threshold is likely to be observed in the dose range within which at
least one critical protective mechanism is overwhelmed, abruptly alter-
ing the slope of the dose-response curve. In the scheme
D -(1)-> CB -(2)-> CRS -(3)-> E,
where D represents dose, CB the concentration in the blood, CRS the
concentration at the receptor sites, and E the effect, dose-disproportionate
alterations at steps 1 or 2 could generate a practical threshold
independent of the relationship between CRS and E, or between CRS
and the fraction of the population exhibiting a specified effect. The
observation that many practical thresholds are probably caused by shifts
in the balance of absorption, distribution, elimination and detoxifica-
tion mechanisms cannot, however, be used to support the thesis that
"real" thresholds are illusory.
One useful application here would be the identification of pharma-
cokinetic or other endpoints that could be monitored in humans and that
might signal close approach to a threshold exposure range.
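A small numerical illustration of the point about dose-disproportionate alterations
(my own sketch, not part of the workshop material, with entirely hypothetical
numbers): when elimination is saturable, the steady-state blood concentration, and
hence the concentration reaching the receptor sites, climbs steeply once the dose
rate approaches the elimination capacity, which in a finite experiment looks like a
practical threshold.

    import numpy as np

    Vmax, Km = 10.0, 2.0                     # elimination capacity and half-saturation
    doses = np.array([1.0, 3.0, 6.0, 9.0, 9.9])   # absorbed dose rate, same units as Vmax

    # At steady state, dose rate = Vmax*C/(Km + C); solve for blood concentration C.
    C_blood = Km * doses / (Vmax - doses)
    for d, c in zip(doses, C_blood):
        print(f"dose {d:4.1f} -> steady-state concentration {c:7.2f}")
    # Well below Vmax the concentration stays modest; as the dose rate approaches
    # Vmax it rises without bound, producing an apparently abrupt threshold.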
-14-
-------
Suggested Approach
NOELs of various kinds are, and will continue to be, useful, especially
where the mechanism of action and progression of effects are not well under-
stood. The definition of adverse effects given by EPA is basically sound,
and is sufficiently specific to provide guidance while allowing reasonable
scope for scientific judgment. Continued application of uncertainty factors
is justified on the basis that their past use appears to have provided
protection. Table 1 from Dourson and Stara (1983) is a helpful inclusion.
Development of guidelines for estimating dose-associated risk to human
populations on the basis of experimental animal data is not likely to be
productive, in my opinion, because:
1. If an adverse effect (or, better, a critical effect) has been identi-
fied, it should be sufficient to act to prevent, as nearly as possible,
the occurrence of that effect.
2. Any prediction of human response to systemic toxicants based on animal
data and on our present understanding of dose-response relationships
would be questionable at best. For the action of genotoxic carcinogens
there is a model consistent with what is now understood about the
mechanism of carcinogen action which, however imperfect it may prove to
be and however it may require modification in the future, at least can
be used to construct dose-response curves for human populations. Since
the model does not include a threshold, in practical terms this means
that we have a means of adjusting the slope on a species-by-species
basis (by assuming that the mechanism is unchanged and adjusting the
dose on the basis of body weight or surface area). For systemic
toxicants there is no such model. The slope of the dose-response curve
in the region of interest is thought to be determined by the range of
-15-
-------
TABLE 1
Guidelines, Experimental Support and References for the Use of Uncertainty (Safety) Factors(a)

1) Use a 10-fold factor when extrapolating from valid experimental results
from studies on prolonged ingestion by man. This 10-fold factor protects
the sensitive members of the human population estimated from data garnered
on average healthy individuals.
   Experimental support: log-probit analysis; composite human sensitivity.
   References: Mantel and Bryan, 1961; Weil, 1972; Krasovskii, 1976.

2) Use a 100-fold factor when extrapolating from valid results of long-term
feeding studies on experimental animals with results of studies of human
ingestion not available or scanty (e.g., acute exposure only). This
represents an additional 10-fold uncertainty factor in extrapolating data
from the average animal to the average man.
   Experimental support: log-probit analysis; body surface area/dose
   equivalence; toxicity comparison between humans and rats or dogs.
   References: Rall, 1969; Evans et al., 1944; Hayes, 1967; Lehman and
   Fitzhugh, 1954.

3) Use a 1000-fold factor when extrapolating from less than chronic results
on experimental animals with no useful long-term or acute human data. This
represents an additional 10-fold uncertainty factor in extrapolating from
less than chronic to chronic exposures.
   Experimental support: subchronic/chronic NOAEL comparison;
   subchronic/chronic NOAEL or LOAEL comparison.
   References: McNamara, 1976; Weil and McCollister, 1963.

4) Use an additional uncertainty factor of between 1 and 10 depending on the
sensitivity of the adverse effect when deriving an ADI from a LOAEL. This
uncertainty factor drops the LOAEL into the range of a NOAEL.
   Experimental support: LOAEL/NOAEL comparison.
   References: Weil and McCollister, 1963.

(a) These factors are to be applied to the highest valid NOAEL or NOEL which does not have a valid LOAEL equal to or below it, in calculating an ADI when no indication of carcinogenicity of a chemical exists.
(b) Guidelines are in bold print. Guidelines 1 and 2 are supported by the FDA and the WHO/FAO deliberations (Lehman and Fitzhugh, 1954; Bigwood, 1973; Vettorazzi, 1976, 1980); Guidelines 1-3 have been established by the NAS (1977) and are used in a similar form by the FDA (Kokoski, 1976); Guidelines 1-4 are recommended by the U.S. EPA (1980).
(c) Table adapted from Dourson and Stara (1983). See this paper for references.
-------
sensitivities of individual population members; the magnitude of this
range varies with the toxicant. To undertake quantitative risk assess-
ment, it would be necessary to stipulate both a threshold dose and a
dose-response slope for humans. At the present time, lacking actual
human data, we have no means of doing the latter. Data showing how (or
whether) the slopes of dose-response curves in animal and human popula-
tions are related when the toxicant is the same could be very useful,
but to my knowledge have not been tabulated.
Dr. Robert Tardiff:
The present approach to health risk estimation of systemic toxicants
relies on four concepts related to response levels (i.e., NOEL, NOAEL, LOAEL
and FEL) and applies uncertainty factors whose magnitude is determined by
the quality of the data. Several additional aspects to this approach should
be considered. First, there must be a recognition that the dose-response
relationship is a continuum rather than a sequence of separate curves.
Second, the analytical power of the NOEL and NOAEL is quite limited for
three reasons: 1) toxicity studies utilize relatively few animals and,
therefore, have relatively poor statistical sensitivity; 2) toxicity studies
utilize genetically homogeneous individuals whose distribution of response
is likely to be much narrower than that of the much more heterogeneous human
population; and 3) none of the positive dose-response data are taken into
account. Consequently, the NOEL and NOAEL are quite artificial and can only
be considered operational thresholds of the experiment and are not to be
confused with human population thresholds. Third, the entire dose-response
curve for toxicants should be used rather than only a single point. That
can be accomplished by using an approach that fits the data and can even be
extended beyond the data points. For simplicity, the probit or logit models
(which have been used extensively to structure dose-response relationships
-17-
-------
in biology in general and pharmacology in particular) should be utilized
unless toxicologic mechanisms prescribe otherwise. Similarly, thresholds of
acceptability or of risk toleration could be selected on the basis of the
severity of the effects, again unless mechanistic data indicated the bio-
logic threshold region in humans. Provided that there were some flexibility
in selecting risk levels on the basis of severity of effect, this dose-
response modeling approach would be far superior to the use of uncertainty
factors.
ECAO-Cin Staff:
A possible approach to health risk estimation of systemic toxicants is
to use a threshold multistage model to fit a chosen human or animal data set
and to extrapolate this model to a 10^-2 (1%) risk. The choice of one
model over another does not really matter, since the majority of mathemati-
cal models give similar results at a lower 95% confidence limit (CL) on the
dose associated with a 10^-2 risk. A lower 95% CL on the maximum likeli-
hood estimate (MLE) of the dose is then used for further adjustments to
estimate an ADI. Implicit in this calculation is the assumption that
systemic toxicants will elicit no response at some dose above zero. This
may be regarded as a practical threshold.
The adjustments to the lower 95% CL are outlined and justified as
follows:
1. Multiplication by (le/Le) x (Le/L), where le is the length of exposure,
Le is the length of observation and L is the assumed lifespan of the
mammal. These adjustments attempt to estimate an equivalent lifetime
daily intake when exposure and observation are less than lifetime.
These adjustments are used in a similar form when estimating an equiva-
lent lifetime daily dose for genotoxic carcinogens. The justification
for their use has been previously given (45 FR 79351-79352).
-18-
-------
2. Division by the cube root of the body weight ratio, (70/w)^(1/3),
where 70 represents the assumed average adult weight in kg and w the weight of
the animal, accounts for differences in dose as measured in mg/kg body
weight when dose as measured in units of body surface area is assumed to
be equivalent among species (Mantel and Schneiderman, 1975). This
adjustment is also used for genotoxic carcinogens and is more fully
described elsewhere (45 FR 79351).
3. Division by one or more uncertainty factors.* The magnitudes of these
uncertainty factors can be justified categorically; together they can
vary between 10 and 1000. These categories of uncertainty are:
The first area of uncertainty, associated with a value of 10,
is justified by any lingering uncertainties in adjusting the
response from animal data, both because of the wider variabil-
ity in the human population when compared to the experimental
animal, and because of differences in species sensitivity to
adverse effects of a chemical. For example, the lower 95% CL
reflects the sampling error on the MLE and the variability
inherent in the experimental population. It does not represent
the sensitive individuals and should not be misconstrued to be
protective. The cube root of the body weight ratio assumes
that dose, relative to body surface area, is equivalent among
animals and humans. It does not account for any differences in
variability or sensitivity among species to the adverse effects
of the chemical. When human data are used, a dose reduction of
10 would still be advisable because of uncertainties in the
exposure estimate.
A second area of uncertainty associated with a value of between
1 and 10 accounts for extrapolating from a projected 10^-2
risk, which can be considered a LOAEL, to a comparable NOAEL as
per EPA guidelines (45 FR 79353). Although a projected 10^-2
excess risk derived by mathematical extrapolation is suffi-
ciently low as to be undetectable in practical experimentation
and, therefore, might be considered as a NOAEL in the classic
sense of the acronym, such an incidence rate of adverse effects
is clearly unacceptable in the human population and thus must
be considered as an adverse effect level. The dose reduction
*Note: These uncertainty factors were developed solely for use in this
procedure and are not to be confused with the standard uncertainty factors
used for toxics.
-19-
-------
of between 1 and 10, because of this category of uncertainty,
should thus be thought to extrapolate from this projected
effect level to a level which is below threshold, hence a no-
effect level. Furthermore, the incidence extrapolations are
sensitive to minor changes in the incidence data even at the
10^-2 risk levels (although the CL is less sensitive than the
MLE). A misclassified animal could lead to a higher projected
10^-2 level and thus a higher ADI.
However, certain data bases could be used to support the
extrapolated estimate such that this category of uncertainty
would be reduced. For example, if more than one good animal
study in more than one species supports the range of adverse
effect and lack of effect at lower dose levels, one could
assume threshold has been reached and reduce the value of 10
for this category of uncertainty accordingly.
A third area of uncertainty associated with a value of between
1 and 10 reflects the degree of evidence of genotoxicity.
EPA's Reproductive Effects Assessment Group (REAG) has classi-
fied the evidence of genotoxicity for different compounds into
five areas. Below is a scheme that assigns different values of
this uncertainty factor to different degrees of evidence for
genotoxicity. Although the assignment of values is arbitrary,
the approach seems reasonable in light of the uncertainties
involved:
Positive 10
Suggestive 7
Inadequate 5
Inconclusive 3
Negative 1
One interesting aspect of this recommended approach is that if an uncer-
tainty factor of 10 is assigned in this latter category because of positive
evidence of genotoxicity, the end result is similar to the present methodol-
ogy for carcinogens at a 10^-5 risk level. The data of Kociba (1977) can
be used to illustrate this point. During a 2-year hexachlorobutadiene feed-
ing study, Kociba (1977) observed renal tubular adenomas and carcinomas in
male rats with significantly higher incidence in animals fed 20 mg/kg/day
than in controls. Doses of 2.0 and 0.2 mg/kg/day showed no increase in tumor
incidence. The dose of 2.0 mg/kg/day, however, elicited evidence of kidney
toxicity in both male and female rats, whereas the dose of 0.2 mg/kg/day
showed no evidence of toxicity in either sex.
-20-
-------
The Kociba (1977) study served as a basis for estimating the ambient
water quality criterion for hexachlorobutadiene using the linearized multi-
stage model (45 FR 79351-79353) (i.e., the present method). The pertinent
data are listed below:
Dose            Incidence
(mg/kg/day)     (No. responding/No. tested)
0.0             1/90
0.2             0/40
2.0             0/40
20.0            9/39
le (length of chemical exposure) = 669 days
Le (length of observation) = 730 days
L (assumed lifespan of animals) = 730 days
w (animal weight) = 0.610 kg
R (bioconcentration factor) = 2.78 L/kg
With these parameters the carcinogenic potency for humans, q1*, was calcu-
lated to be 0.07752 (mg/kg/day)^-1. As a result, the recommended ambient
water concentration was 4.6 µg/L in order to keep the individual life-
time risk below 10^-5.
The recommended procedure uses this data set and calculates a lower 95%
CL of the dose rate associated with a 10^-2 excess cancer risk†, found
with the threshold multistage model, of 0.69 mg/kg/day. An ADI calculated
from this procedure would be:
ADI = 0.69 mg/kg/day x (669 day/730 day) x (730 day/730 day) x 70 kg
      / [(70 kg/0.61 kg)^(1/3) x 335]
    = 0.027 mg/day
†Excess cancer risk is used here in lieu of other systemic toxicity for
illustrative purposes only. This procedure is not recommended for
carcinogens.
-21-
-------
where 669, 730 and 730 are the le, Le and L values in days as before, 70 and
0.61 are the respective weights in kg for the average man and the rats in this
study, and the 335-fold uncertainty factor represents a 10-fold factor because
of area 1, a 6.7-fold factor due to area 2, and a 5-fold factor based on REAG's
classification of the genotoxicity evidence as inadequate. This value can be
used to determine a criterion by:
C = 0.027 mg/day / [2 L/day + (0.0065 kg/day x 2.78 L/kg)]
  = 0.013 mg/L, or 13 µg/L.
If the evidence for the genotoxicity of hexachlorobutadiene were strong, an
uncertainty factor of 10 instead of 5 in the ADI derivation would put the
result in the range of the recommended ambient water quality criterion at a
10^-5 excess lifetime cancer risk, i.e., (13 µg/L x 5) ÷ 10 = 6.5
µg/L, as compared to 4.6 µg/L. If hexachlorobutadiene were consid-
ered not to be genotoxic, an uncertainty factor of 1 instead of 5 in the ADI
derivation would result in an ADI of 0.136 mg/day and an ambient water qual-
ity criterion of 65 µg/L.
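For readers who wish to check the arithmetic, the short sketch below reproduces the
calculation just described; it is an illustration written for this summary, not
Agency software, and the 0.69 mg/kg/day lower confidence limit is taken from the
text rather than recomputed.

    # Reproduces the hexachlorobutadiene example above (values from the text).
    dose_cl   = 0.69                    # lower 95% CL on dose at 1E-2 extra risk, mg/kg/day
    le, Le, L = 669.0, 730.0, 730.0     # exposure, observation, lifespan (days)
    w, W      = 0.61, 70.0              # rat and reference human body weights (kg)
    R         = 2.78                    # bioconcentration factor, L/kg

    genotoxicity_uf = {"Positive": 10, "Suggestive": 7, "Inadequate": 5,
                       "Inconclusive": 3, "Negative": 1}
    uf = 10 * 6.7 * genotoxicity_uf["Inadequate"]   # areas 1-3; the 335-fold factor

    adi = dose_cl * (le / Le) * (Le / L) * W / ((W / w) ** (1.0 / 3.0) * uf)
    criterion = adi / (2.0 + 0.0065 * R)            # 2 L/day water plus fish intake
    print(adi, criterion * 1000)                    # ~0.027 mg/day and ~13 ug/L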
-22-
-------
CRITIQUES
DR. THOMAS CLARKSON
Introduction
The purpose of this discussion is to seek ways of improving the tradi-
tional methods for calculating acceptable daily intakes. It is recognized
that a new, scientifically impeccable approach is beyond our reach at the
present moment. Thus the emphasis is on improvement of the current methods
and, indeed, on not changing current procedures unless there are solid reasons.
"Pseudo-NOAELs"
Most of our discussion since the ECAO workshop on "Review of Guidelines
for Ambient Water Quality Criteria for Carcinogens" in Washington, DC in
February, 1982, led to agreement that all the positive data should be used.
Crump has summarized the reasons for this: that is to say, a dose-
response model will be used to calculate a NOAEL associated with a risk of
1%. Unfortunately, previous discussions have left the choice of the end-
point somewhat arbitrary -- risk levels of 1%, 4%, or even 10% have been
considered to define this "pseudo-NOAEL." At the last meeting an alterna-
tive approach was suggested -- to use segmented linear regression analysis.
This will determine an "inflection" or "break" point where the effect due to
the agent emerges above the background frequency. The dose associated with
the "break" point is referred to as a "practical threshold" value and is
equivalent to a "pseudo-NOAEL" (Figure 2). A probit or logit analysis that
takes into account a background frequency indicates a risk level at the
break point of about 4%. The segmented linear regression analysis has the
advantage that it does not require an arbitrary choice of risk level.
-23-
-------
[Figure 2: frequency of response (percent) versus estimated body burden
(mg Hg/kg), plotted by the "hockey stick" method (panel A) and by logit
analysis (panel B).]
FIGURE 2
The frequency of signs and symptoms of methylmercury poisoning versus
the maximum estimated body burden of methylmercury in adults. A. Data
plotted according to the "hockey stick" method. B. Data plotted according to
logit analysis.
Source: Data taken from Bakir et al., 1973 (Copyright permission granted)
-24-
-------
Statistical models are well developed for segmented linear regression.
The confidence limit for the break point value can be calculated.
I disagree with the claim of the ECAO-Cin staff that the "pseudo-NOAEL"
is a LOAEL. We are dealing with animal data at this point, and a 4% risk is
below a measurable value.
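To make the segmented-regression idea concrete, the sketch below fits a simple
"hockey stick" -- a flat background segment joined to a rising straight line -- by a
grid search on the break point. It is a minimal illustration with hypothetical
numbers, not the analysis applied to the Bakir et al. (1973) data shown in Figure 2.

    import numpy as np

    burden = np.array([5., 10., 25., 50., 100., 200., 400.])   # mg Hg/kg (hypothetical)
    freq   = np.array([9.,  8.,  10., 11.,  30.,  65.,  95.])  # percent responding

    def rss_for_break(tau):
        rising = burden > tau
        bg = freq[~rising].mean() if (~rising).any() else 0.0   # flat background segment
        if rising.sum() < 2:
            return np.inf, bg, 0.0
        # rising segment: freq = bg + slope*(burden - tau), slope by least squares
        xx = burden[rising] - tau
        slope = np.sum(xx * (freq[rising] - bg)) / np.sum(xx * xx)
        resid = np.concatenate([freq[~rising] - bg,
                                freq[rising] - (bg + slope * xx)])
        return np.sum(resid ** 2), bg, slope

    grid = np.linspace(burden.min(), burden.max(), 400)
    best_tau = min(grid, key=lambda t: rss_for_break(t)[0])
    print("estimated break point (practical threshold):", round(float(best_tau), 1))

A confidence limit on the break point could then be attached by standard methods for
segmented regression, as noted above.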
Extrapolation to Man
The question arises whether the maximum-likelihood estimate of the
"pseudo-NOAEL" or its 95% lower limit should be used as the starting point for
extrapolation to man. The latter would have the merit of taking into
account the quality of the data.
Expression of the dose in units of surface area rather than the tradi-
tional units of body weight does not seem to be well justified. Very little
data exist to indicate that surface area conversions reduce interspecies
differences. In fact, for extrapolations from mice and rats, the use of
surface area units creates a safety factor of 10 over units based on body
weight. For larger animals, this "surface area" safety factor would be less
and could actually be less than unity, even though there is no guarantee
that larger animals are more similar to man than small rodents.
Dourson (1982) has summarized the rationale for the use of safety
factors and has reviewed evidence supporting the magnitude of these safety
factors. For interspecies conversion, i.e., from all animals to man, a
maximum value of 10 would appear to be appropriate. This factor has been
designated 10i by Dourson. In the absence of relevant information, this
factor would normally be used. However, if evidence exists that a certain
species is similar to man for a given chemical, expert judgment should be
used in choosing the actual value.
-25-
-------
A maximum value of 10 also seems appropriate for a second safety factor
to cover differences in human susceptibility. This has been designated
10p by Dourson. He has summarized evidence from the literature concerning
the distribution of LOEL values in human populations to indicate that a
factor of 10 would cover most of the variance in human threshold values.
Again, in the absence of other information, the maximum value of 10 would
normally be used. However, if dose-response data or mechanistic data
indicate a narrowed distribution of thresholds, the actual value could be
less than 10, based on expert judgment.
The possibility of applying a third safety factor of 10 has been
discussed by EPA and others. The idea is to take into account the possibil-
ity of other factors and uncertainties not covered by the first two factors,
such as the quality of the data, the duration of the study, and even the
severity of the effect. However, the need for this safety factor might be
avoided in many cases if the lower 95% confidence limit on the NOAEL is used
as the starting point for extrapolation to man.
DR. HARRY SKALSKY
As Dr. Dourson and Dr. Clarkson have discussed, the quantitation of a
safe dose is a difficult task that requires considerable judgment. As
toxicologists, we are constantly aware of a paucity of proven scientific
facts concerning safety assessment. It is satisfying that Dr. Dourson's
paper has demonstrated a biologic basis for our traditional safety factor
approach.
There are two distinct areas of quantitation being discussed: the
statistical issues surrounding the shape of a particular dose-response curve,
and the precision involved in extrapolating from animal to man.
There are three basic pieces of information that may be gleaned from a
proper dose-response experiment.
-26-
-------
No-Observable-Effect Level (NOEL)
The NOEL may be practically defined as the point at which the measured
response can no longer be distinguished from the controls. This is a
statistically measurable entity that can, as Dr. Dourson has discussed, be
determined.
Margin of Safety
This term may be defined as the magnitude of the range of doses involved
in progressing from a no-effect dose to the maximum effective dose. In
general, the slope of the dose-response curve may be considered an "index"
of the margin of safety. It provides one with a general indication of the
NOEL's "resistance" to change if a particular experiment is repeated.
Comparative Toxicity
Compounds may be ranked or compared by their relative activity within a
uniform biological specimen. As you are aware, the traditional LD50
(Litchfield and Wilcoxon, 1949) approach has proven to be very valuable in
distinguishing the relative toxicities of a great variety of compounds.
Obviously, there are a great many mathematical ways to depict dose-
response data. It is always prudent to remember that dose-response data
originate from a cumulative frequency distribution which may be unique.
Observers sometimes allow these mathematical manipulations to extend their
conclusions beyond the scope of the biological data. There are many biologic
observations intrinsic to the animal bioassay that are not expressed by the
dose-response curve.
The mathematical alternatives (multi-hit/safety factor approach) being
considered by EPA do not appear to offer any clear advantage over the tradi-
tional NOEL approach. Since there are many biologic observations intrinsic
to the animal bioassay that are not expressed by the dose-response curve,
-------
the EPA alternative might obfuscate the professional judgment that
has been an integral part of the NOEL-safety factor process. At present,
there is no theoretical basis for the use of a non-threshold model in calcu-
lating no-effect levels for noncarcinogenic chemicals. Thus, on scientific
grounds, any serious consideration of the multistage alternative is not
warranted. In the practical sense, it would not be prudent to replace the
traditional NOEL-safety factor approach with a "novel" model that offers no
substantial advantages.
If advancements are to be made, we must not dwell on the manipulation of
dose-response data. Instead, we must better address the second area of
quantitation: the extrapolation from animal to man. Success in this quan-
titation can be measured directly by the ability to predict human responses
from animal data. It is in this area that toxicology has obviously lagged
behind the science of pharmacology.
As you can see (Table 2), there are a large number of physiologically
based pharmacokinetic models that have attempted to quantify the inter-
species issue in their prediction of various drug effects. Each of these
models has addressed the animal-to-man issue with various mathematical
assumptions. The accuracy of some of these models can be illustrated by
Figure 3. The solid lines on this graph represent mathematical predictions
of human serum concentrations of cytosine arabinoside (ARA-C) and its
metabolite (ARA-U) based only on animal and in vitro experiments. These
predictions were made prior to the collection of human data. However, as
you can see, when the human experiment was performed (graphically depicted
by dots and triangles), the predictions were very good.
-28-
-------
TABLE 2
Physiologically Based Pharmacokinetic Models*

Drug                   Species               Reference
Thiopental             dog, human            Bischoff, 1968
Methotrexate           mouse, rat, man       Bischoff, 1970, 1971; Dedrick, 1973, 1975, 1978
Cytarabine (ARA-C)     mouse, monkey, man    Dedrick, 1973, 1978; Morrison, 1975
Adriamycin             rabbit, man           Harris, 1975
Cyclocytidine          man                   Himmelstein, 1977
Digoxin                rat, man              Harrison, 1977
Ethanol                man                   Dedrick, 1973
Mercaptopurine         rat, man              Tterlikkis, 1977
Lidocaine              monkey, man           Benowitz, 1977
Sulfobromophthalein    rat, man              Chen, 1978

*Source: Adapted from Himmelstein and Lutz, 1979 (Copyright permission
granted)
-29-
-------
[Figure 3: serum concentration of ARA-C + ARA-U (log scale) versus time
(minutes); model predictions shown as solid lines, observations as points.]
FIGURE 3
Predicted human serum concentrations of ARA-C and its metabolite ARA-U
compared with experimental data. (All kinetic parameters based on in vitro
work; all anatomic and physiologic parameters based on animal data.)
Source: Dedrick, 1973 (Copyright permission granted)
-30-
-------
The type of data that these models utilized (Table 3) has been available for some time, and it is apparent that we toxicologists have not made full use of it. Most of the comparative anatomic and physiologic data have been available since the late 1940s (Adolph, 1949; Guyton, 1947, etc.). The thermodynamic and transport data have been more recent developments, as have the perfusion techniques (Wiersma and Roth, 1980; Rane et al., 1977), which have provided important information on tissue-specific metabolism.
The physiologically based pharmacokinetic models are constructed by compartmentalizing the basic biologic data (Figure 4). Then the mathematical equations are constructed to explain each compartment and their interrelationships. As you can see, some of these models can become quite complex.
The complexity and sophistication of these models appear to be limited only by the available data. For example, Roth and Wiersma (1979) have attempted to predict the comparative clearance of benzo(a)pyrene (Figure 5) from tissues (liver and lung) in the basal and the induced metabolic state. If such complex metabolic relationships can be predicted, the future of these models looks bright.
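To make the compartmental construction concrete, the following is a minimal sketch, not any of the published models in Table 2, of how a flow-limited physiologically based model is written as mass-balance equations and integrated numerically; the organ volumes, blood flows, partition coefficients, dose, and clearance value below are hypothetical placeholders rather than data from the presentation.

import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for a three-tissue, flow-limited model
V = {"liver": 1.7, "kidney": 0.3, "lean": 27.0}    # tissue volumes (L)
Q = {"liver": 1.45, "kidney": 1.2, "lean": 0.9}    # blood flows (L/min)
P = {"liver": 2.0, "kidney": 1.5, "lean": 0.8}     # tissue:blood partition coefficients
V_blood = 5.0                                      # blood volume (L)
CL_liver = 0.5                                     # hepatic clearance (L/min)

def pbpk(t, y):
    c_blood = y[0]
    c_tissue = dict(zip(V, y[1:]))
    # blood compartment: venous return from tissues minus arterial delivery
    dcdt = [sum(Q[k] * (c_tissue[k] / P[k] - c_blood) for k in V) / V_blood]
    # each tissue: flow-limited exchange; the liver also eliminates the compound
    for k in V:
        elim = CL_liver * c_tissue[k] / P[k] if k == "liver" else 0.0
        dcdt.append((Q[k] * (c_blood - c_tissue[k] / P[k]) - elim) / V[k])
    return dcdt

# Intravenous bolus of 100 mg into blood at t = 0, followed for 120 minutes
y0 = [100.0 / V_blood] + [0.0] * len(V)
sol = solve_ivp(pbpk, (0, 120), y0, t_eval=np.linspace(0, 120, 13))
print(sol.y[0])   # predicted blood concentration (mg/L) over time

The published models listed in Table 2 follow the same pattern, with many more compartments and with metabolism, protein binding, and membrane transport terms supplied from the kinds of data summarized in Table 3.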
In summary, I would like to offer the following three comments: First, I believe that any ADI established by EPA can result directly or indirectly in a "regulatory" number. For this reason, a minimum data base for setting an ADI must be defined. For a noncarcinogenic endpoint, no less than a well-designed 90-day study should be acceptable. This concept was not included in any of the meeting materials and is not on the agenda for discussion. I believe, however, that it is necessary for EPA to address this issue and, perhaps, this group of scientists can aid in that decision.
TABLE 3
Types of Data Utilized by Physiologically Based Pharmacokinetic Models
1. Comparative Anatomical Data Between Species:
   a. Organ Sizes
   b. Tissue Volume
2. Comparative Physiologic Data:
   a. Blood Flow, Urine Output, Ventilation Rate, etc.
   b. Basal Metabolism
   c. Compound Metabolism
3. Thermodynamic Data:
   a. Protein Binding
   b. Tissue Storage
4. Transport Data:
   a. Membrane Permeability
   b. Tissue Perfusion
[Figure 4 is a block diagram of a whole-body model: arterial and venous blood connected to lung, liver, spleen, gut, heart, kidney, adipose, marrow, lean, and other tissue compartments.]
FIGURE 4
Physiological Schema for Pharmacokinetic Modeling
Source: Himmelstein and Lutz, 1979 (Copyright permission granted)
[Figure 5 contains two panels plotting predicted benzo(a)pyrene clearance (ml/min) against blood flow (ml/min) for liver and lung, one panel for the basal state and one for the 3-MC-induced state.]
FIGURE 5
Predicted Clearance of Benzo(a)pyrene
Source: Roth and Wiersma, 1979 (Copyright permission granted)
Second, the traditional NOEL/NOAEL-safety factor approach to setting ADIs is clearly the best of the options presently being considered by EPA. Third, it is obvious that the pharmacologists have a head start on us in being able to predict drug effects in man based on animal data. However, there is no reason why these physiologically based pharmacokinetic models would not also be successful at predicting potential toxic effects of environmental chemicals. The need for these predictive models is clear. Perhaps now is the time for the Agency to explore these models so that the goal of prediction can be fulfilled in the future.
DR. ARTHUR PALLOTTA
Most of this presentation's emphasis has been on pesticides and drugs. However, most of the chemicals that the Solid Waste Emergency Response Office must deal with are industrial chemicals, and the data base for these chemicals is poor.
Setting minimum data base requirements is an excellent recommendation. For example, trichloroethylene has been found in 40% of all dump sites, but different offices used different data to calculate an ADI, resulting in a dilemma, i.e., which ADI should be chosen.
DISCUSSION
DR. MYRON MEHLMAN
This terminology (i.e., acceptable daily intake) and some of its underlying assumptions should be reassessed and strengthened. No exposure to foreign or synthetic chemicals is acceptable. It is of no benefit whatsoever to the person being exposed. Thus the term should be changed to "no adverse effect from daily intake."
DR. ROBERT TARDIFF
Development of acceptable daily intakes (ADIs) for substances in waste dumps is implausible for several reasons. First, ADIs deal with compounds individually and do not take into account interactive effects (e.g., additivity and potentiation) from exposures to multiple agents. Second, ADIs imply virtual safety, when in fact some degree of risk is likely to exist, and they do not truly account for differential potency of the various substances. Third, ADIs do not allow for the array of data so that comprehensive decisions can be made, i.e., there could be no organization by quantitative risk estimates such as numbers of substances with specified risk levels, and no organization by qualitative risk such as the assembly of compounds toxic to any individual target organ such as the central nervous system. A more plausible approach for the simultaneous health risk analysis of a variety of substances is the uniform application of quantitative risk assessment methodology (similar in concept but not necessarily in detail to that applied to carcinogens) to obtain risk estimates for individual toxic endpoints for each substance or mixture. Such uniformity of data should allow for a more logical selection of critical adverse effects and of risk levels that have been determined elsewhere to be socially acceptable. By arraying the data according to target organ risk (or even by mechanisms of action if known), there would be at least a hypothetical basis for anticipating the additivity of risks and for selecting classes of acceptable risks for specific waste disposal sites.
MR. WILLIAM GULLEDGE
The presentation by Dr. Harry Skalsky was most illuminating and should be supported. The safety factor approach, properly implemented, is a definite improvement over the use of complex risk assessment models.
DR. MAGNUS PISCATOR
The statement by Crump that the slope should be used is basically sound. However, occasionally a steep slope may depend on additive effects. As an example: if a nephrotoxic agent also causes hemolysis, a steep dose-response curve may be obtained for renal effects within a certain dose interval. If lower doses are used and no hemolysis occurs, the dose-response curve for renal effects may not be so steep, leading to a lower safe level. This also implies that all effects must be taken into account when looking at the dose-response curve for one effect. This is interaction of effects, which was not mentioned at the meeting except in my comment.
DR. HARRY SKALSKY
It is important that a minimum data base be established with which to calculate an ADI. For a noncarcinogenic endpoint, no less than a well-designed 90-day study should be acceptable. The traditional NOEL (NOAEL)-safety factor approach to setting ADIs is clearly the best of the options that EPA is currently considering. Perhaps the physiologically based pharmacokinetic models will provide new ideas with which to improve the process of safety evaluation.
DR. RICHARD KOCIBA
Conceptually, there is considerable merit in the use of various safety factors (uncertainty factors) in estimating acceptable daily intakes for chemicals. While historically this has been most frequently used in dealing with noncarcinogenic endpoints, newer scientific information now supports the use of safety factors in dealing with carcinogenic endpoints, especially endpoints of an epigenetic type. This would allow one to more fully utilize all the data available in setting more realistic levels of control. A paper by Park and Snee (1982) illustrates one option that should be considered.
The definitions of NOEL and NOAEL should be revised to give equal weight to the biological significance in addition to the statistical significance.
It is not always appropriate to categorically assume that man is going to be 13 times more sensitive than the mouse and 5 times more sensitive than the rat as based on body surface area ratio. This concept has been based on alkylating agents, and is not supported by data from other materials. A paper by Reitz et al. (1978) pertains to this issue. It would be more appropriate to deal with each material on a case-by-case basis, and use mg/kg body weight as the basis of interspecies conversion where appropriate.
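For orientation, the 13- and 5-fold figures follow from the usual surface-area argument: if surface area scales as body weight to the two-thirds power, a dose expressed in mg/kg scales across species as the cube root of the body-weight ratio. Assuming representative body weights of 0.03 kg (mouse), 0.4 kg (rat), and 70 kg (man), which are illustrative values rather than figures from the workshop,

    (70 / 0.03)^(1/3) = 13 (approximately)    and    (70 / 0.4)^(1/3) = 5.6 (approximately),

so on a mg/kg basis man would be treated as roughly 13 times more sensitive than the mouse and about 5 to 6 times more sensitive than the rat.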
GENERAL COMMENTS
The PEL could be predicted for untested chemicals from structure-activity models.
The lack of data is the driving force for making the safety factor as low as possible.
There should be a minimum data base requirement before making ADI calculations. In the absence of these data, a more severe adjustment should be made.
Minimum study quality parameters should be formally set.
Since data for multiple exposure to chemicals do not exist, as rigid a standard as possible should be established in view of current technology, in the hope that if the standards are difficult or burdensome to achieve, they will lead to the development of the necessary data.
The degree of exposure should dictate whether minimum data are used.
Physiologic pharmacokinetic models should be explored. Caution should be used with these models until molecules can be measured at the site of toxicity.
Kinetic data on key, commonly occurring chemicals can be used to develop the necessary equations for individual compounds to predict the likelihood of unusual reactions due to interactions.
Monitoring data are needed to establish exposure levels.
An uncertainty factor representing the quality of data could be used for data taken from an uncertain data base.
Significant biologic bases should be evaluated as well as statistical significance in using the ADI approach. Physical and biologic data should be used.
Use of surface area adjustment may be a problem, since pathologists will report severity data and not incidence data.
Surface area adjustment implies that children and infants can tolerate higher doses than adults.
Risk assessments should be validated and updated when additional information is available.
Animal data can't always be considered reliable, e.g., neurobehavioral aspects can't be determined in animal models.
Regulatory agencies should aim at setting standards that will prevent us from getting human effects data.
A discussion of risk assessment should deal with predictive methodology, not protective methodology.
The ADI approach will establish an overly simplistic situation of whether a dump site is safe or unsafe.
REFERENCES
Adolph, E.F. 1949. Quantitative relations in the physiological constitutions of mammals. Science. 109: 579-585.
Andersen, M.E. 1981. Saturable metabolism and its relationship to toxicity. CRC Crit. Rev. Toxicol. p. 105-149, May.
Bakir, F., S.F. Damluji, L. Amin-Zaki, et al. 1973. Methylmercury poisoning in Iraq. Science. 181: 230-241. (Copyright 1973 by the AAAS)
Berkson, J. 1944. Application of the logistic function to bioassay. J. Am. Stat. Assoc. 39: 357-365.
Bischoff, K.B., R.L. Dedrick, D. Zaharko and J.A. Longstreth. 1971. Methotrexate pharmacokinetics. J. Pharm. Sci. 60: 1128-1133.
Dedrick, R.L. 1973. Animal scale-up. J. Pharmacokinet. Biopharm. 1(5): 435-461.
Dedrick, R.L. and K.B. Bischoff. 1980. Species similarities in pharmacokinetics. Fed. Proc. 39: 54-59.
Dourson, M.L. and J.F. Stara. 1983. Regulatory history and experimental support of uncertainty (safety) factors. Reg. Tox. Pharm. (In press)
Guyton, A.C. 1947. Analysis of respiratory patterns in laboratory animals. Am. J. Physiol. 150: 78-83.
Harrison, L.I. and M. Gibaldi. 1977. Physiologically based pharmacokinetic model for digoxin distribution and elimination in the rat. J. Pharm. Sci. 66: 1138-1142.
Himmelstein, K.J. and R.J. Lutz. 1979. A review of the applications of physiologically based pharmacokinetic modeling. J. Pharmacokinet. Biopharm. 7: 127-145.
Kociba, R.J., et al. 1977. Results of a 2-year chronic toxicity study with hexachlorobutadiene in rats. Am. Ind. Hyg. Assoc. J. 38: 589.
Litchfield, J.T. and F. Wilcoxon. 1949. Simplified method of evaluating dose-effect experiments. J. Pharmacol. Exp. Ther. 96: 99-113.
Mantel, N. and M.A. Schneiderman. 1975. Estimating "safe" levels: A hazardous undertaking. Cancer Res. 35: 1379-1386.
NAS (National Academy of Sciences). 1977. Drinking Water and Health. NAS, Washington, DC.
Park, C. and R. Snee. 1982. Quantitative Risk Assessment: State of the Art for Carcinogenesis. (Submitted for publication)
Rane, A., G.R. Wilkinson and D.G. Shand. 1977. Prediction of hepatic extraction ratio from in vitro measurement of intrinsic clearance. J. Pharmacol. Exp. Therap. 200: 420-424.
Reitz, R., P.J. Gehring and C. Park. 1978. Carcinogenic risk estimation for chloroform: An alternative to EPA's procedures. Food Cosmet. Toxicol. 16: 511.
Roth, R.A. and D.A. Wiersma. 1979. Role of the lung in total body clearance of circulating drugs. Clin. Pharmacokinet. 4: 354-367.
Wiersma, D.A. and R.A. Roth. 1980. Clearance of 5-hydroxytryptamine by rat lung and liver: The importance of relative perfusion and intrinsic clearance. J. Pharmacol. Exp. Therap. 212: 97-102.
SYSTEMIC TOXICANTS
Interspecies Conversion of Dose and Duration of Exposure
Presentation: Dr. Rolf Hartung
University of Michigan
Critique: Dr. Robert Tardiff
National Academy of Sciences
Critique: Dr. Ellen O'Flaherty
University of Cincinnati
PRESENTATION
DR. ROLF HARTUNG: INTERSPECIES CONVERSION OF DOSE AND DURATION OF EXPOSURE
Accounting for Species Differences
Simple observation demonstrates that species differ in size, food habits, metabolic patterns, lifespan, and anatomical features. All these factors may influence the relative sensitivity of various species to chemicals. The toxicity of chemicals to various species may be compared on several bases.
The most common is in terms of mg/kg of body weight. This assumes that since the biochemical make-up of various species is very similar, the chemical should interact on the basis of its concentration within the organism. However, most laboratory animals tested have been shown to have a higher metabolic rate than man. Similarly, smaller animals have been shown to have higher rates of food intake, higher water consumption, higher breathing and heart rates, higher rates of excretion, and possibly higher rates of drug metabolism than larger animals.
Since the basal metabolic rate of homeotherms correlates well with body surface area, the comparison between species might be made on the basis of mg/m2 of body surface. This is the currently accepted methodology used by the Agency in establishing water quality criteria. One difficulty in using this approach is that, for accurate predictive purposes, it may be necessary to know whether the metabolic processes predominantly detoxify the chemical or activate it to a toxic metabolite. If one also considers species differences in metabolic patterns, which may have nothing to do with body size, then a very complex pattern emerges. In developing criteria, we may need to pay much more attention than we previously have to the differences or similarities in the metabolic patterns of different species.
If one investigates the problem within one species only, e.g., the differences in drug sensitivities between children and adults (Wagner, 1971), then a number of problems become evident. In general, the child is less sensitive to drugs than the adult on a mg/kg basis, and the best approximation of the appropriate dose for children is:
child's dose = adult dose x (body surface area of child /
body surface area of adult).
In this case the appropriate dose for the smaller organism is predicted from our experience with the larger organism, and this approach is compatible with the general application of the body surface rule currently used by the Agency. However, Wagner also makes the observation that the dosing regimens for newborns cannot be predicted, because of differences in the development of various enzyme systems.
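As a numerical illustration of the formula (the surface areas used here are assumed, typical values, not figures from the presentation): for an adult surface area of 1.73 m2, a child surface area of 0.6 m2, and an adult dose of 100 mg, the child's dose is about 100 x (0.6/1.73) = 35 mg, roughly one-third of the adult dose even though the child may weigh only about one-fifth as much as the adult.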
Other means of comparing effects among species have also been suggested. Thus Harwood (1963) suggested using mg/kg brain weight, presumably for the evaluation of chemicals with CNS activity.
Accounting for Differences in Duration of Experiment and Lifespan
In carcinogenic bioassays, it has usually been assumed that the induction time for cancers over the lifetime of a relatively short-lived rodent is equivalent to the relatively longer induction time over the lifetime of longer-lived animals such as humans. Thus the duration of an exposure has sometimes been represented in terms of the proportion of that exposure relative to lifespan (t/T). The extent to which that concept is valid may require investigation.
Shorter and longer exposures have often been compared on the basis of Haber's rule (straight time-weighted dose averaging). Haber's rule appears to be applicable as an approximation when only slight differences in dose or duration are involved. A more complex relationship, reviewed by Filov (1979), may be more promising.
When one compares relatively acute or subacute phenomena, species conversions on the basis of lifespan may have no applicability whatever. The time to develop liver necrosis or enzyme changes appears to be very similar in man or in mouse. Thus, we need to better review the available knowledge, or to generate more basic data, to enable us to be more certain whether and what kind of temporal relations exist.
CRITIQUES
DR. ROBERT TARDIFF
The previous presentation reviewed data manipulation techniques. However, none of the techniques presented has been sufficiently validated to encourage their broad-based application. An additional aspect, not covered in the narrative, is the expression of dose as moles rather than as weight of a compound. The use of moles would provide a more accurate comparison of potencies of chemicals, for it describes the number of molecules required to induce adverse effects in an organism. Since differential potency is of considerable importance in predicting the health risks from these mixtures, such an approach is far more desirable for chemical waste dump evaluations. With regard to adjustment for duration of exposure for noncarcinogens, the differences in potency between subchronic and chronic exposures are generally negligible -- if Weil's data (Weil and McCollister, 1963; Weil et al., 1968) are to be believed. One exception would be for substances that take longer than 90 days of exposure to reach equilibrium (e.g., methyl mercury); then subchronic data would not be predictive of chronic effects. This would argue strongly for the use of metabolic data to determine whether to adjust for duration and, if so, by what magnitude. By contrast, for initiating carcinogens the influence of duration of exposure on expression of disease must almost of necessity be obtained empirically, because the primary lesion is likely to be acute. For substances that are unequivocally only promoters, less than lifetime exposure would be expected to have a threshold-curvilinear effect on cancer manifestation (i.e., 10% lifetime exposure is likely to have less than 10% risk of cancer assuming a standard dose rate).
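As a simple illustration of the point about moles (the molecular weights here are hypothetical): 10 mg/kg of a compound with a molecular weight of 200 corresponds to 0.05 mmol/kg, while 10 mg/kg of a compound with a molecular weight of 50 corresponds to 0.20 mmol/kg, so the second exposure delivers four times as many molecules per unit of body weight even though the administered weights are identical.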
DR. ELLEN O'FLAHERTY
Interspecies Conversion of Dose
The method of expressing dose on the basis of body surface area has been widely accepted for two reasons: the good correlation of basal metabolic rate with body surface area, and the work of Pinkel et al. (1958) and Freireich et al. (1966) showing that, for chemically different antineoplastic agents, the maximum tolerated dose in several different species was about the same when expressed on a body surface area basis, but varied widely when expressed on a body weight basis. However, with regard to this particular class of drugs, Dixon (1976) has shown, using data from Freireich et al. (1966) and Schein et al. (1970), that the ratios between maximum tolerated doses of more than 30 antineoplastic agents in different species were reasonably constant from drug to drug, regardless of whether they were expressed on the basis of body weight or body surface area.
Wagner (1971) carried out a comparable evaluation of a number of drugs that apparently were not anticancer agents, using blood level parameters rather than toxicity data to assess comparability of exposure. Unfortunately, he did not identify the drugs that were included in his evaluation. However, he concluded that:
Because of the high correlation between calculated body surface area and body weight, and the high correlation between blood level parameters (such as area under the curve or peak blood level) and dose by weight, it would be extremely difficult, if not impossible, with any given drug to prove that a blood level parameter correlates significantly better with dose per unit body surface area or dose per unit body weight. The data indicated that the choice between mg/kg or mg/m2 correlation would be equivocal with any given drug.
It appears probable that this issue is one that is not resolvable on a scientific basis, no matter what the quality of the data base.
Duration of Exposure
It is interesting to note that the maximum tolerated doses used by Dixon (1976) in his interspecies comparisons were calculated by time-weighted averaging. Certainly this is the simplest method by which dose may be adjusted for duration of exposure. There is also a simple variant of time-weighted averaging that takes into account the possibility that an apparent threshold dose exists and/or that there is a minimum time to occurrence of the earliest observable effect.
Straight time-weighted averaging is illustrated in Figures 6 and 7, where X can be either dose rate or time and Y is the other (i.e., the expression is symmetrical with respect to dose rate and time). In an arithmetic coordinate system this expression plots as a family of hyperbolas whose shape depends only on the value of C; that is, on the total dose. Plotting in a log-log coordinate system (see Figure 7) produces parallel straight lines whose slope is -1.
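Written out, the straight averaging rule of Figures 6 and 7 is simply X x Y = C (a fixed total dose), which on logarithmic axes becomes log Y = log C - log X, a straight line of slope -1. The variant discussed below, with an apparent threshold dose A and a minimum time to effect B, presumably takes the shifted form (X - A)(Y - B) = C, which gives the displaced hyperbolas of Figure 8 and the curved log-log plot of Figure 9; the exact algebraic form is inferred here from the figure descriptions rather than stated in the critique.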
If threshold dose and minimum time to effect are incorporated into the expression for total dose (as A and B in Figure 8), the family of hyperbolas is simply shifted with respect to the axes of an arithmetic coordinate system. However, it no longer plots as straight lines in a log-log coordinate system (Figure 9), although the slope is still -1 at the midpoint. The log-log plot is the one recommended by Filov et al. (1979), possibly on the basis that it produces straight lines that can be extrapolated to facilitate interconversions between dose rates. Filov et al. state, "Comparison of concentration-time relationships for various substances has shown the slopes to be different for different substances." As Figure 9 shows, while plots of segments of these lines might appear to be linear in a log-log coordinate system, the slopes of these segments will be determined by how closely the
FIGURE 6
Time-weighted dose averaging using an arithmetic coordinate system. Plotting in an arithmetic coordinate system produces a family of hyperbolas showing the interrelationships of dose rate and time for four different total doses C.

FIGURE 7
Time-weighted dose averaging using a log-log coordinate system. Plotting the data from Figure 6 in a log-log coordinate system produces a family of parallel straight lines.

FIGURE 8
Time-weighted dose averaging using an arithmetic coordinate system and incorporating a threshold dose and a minimum time to effect (A and B, both equal to 1 in this illustration). Plotting in an arithmetic coordinate system produces a family of hyperbolas similar to those shown in Figure 6 but shifted in position.

FIGURE 9
Time-weighted dose averaging using a log-log coordinate system and incorporating threshold dose and minimum time to effect. Plotting the data from Figure 8 in a log-log coordinate system does not produce straight lines. Compare with Figure 7.
threshold dose or minimum time is approached, as well as by the chemical or physical nature of the substance under consideration. Filov et al. (1979) appear to recognize something of this problem:
Furthermore it has been shown that, other factors being equal, the angle of inclination (e.g., the slope) for one and the same substance will be smaller the closer its effect is to the threshold level.
This statement mixes concentrations and effects, but recognizes the practical existence of thresholds and appears to concede that log-log plots of dose rate vs. time-to-occurrence may not be linear throughout. Thus, it is not advisable to plot dose rate vs. time data as a straight-line segment in a log-log coordinate system. This procedure would make it difficult to identify threshold dose and minimum time-to-occurrence, should these exist. Furthermore, if data had been obtained on an "arm" of the curve in Figure 9, the slope assigned would be determined largely by the dose range and would not be characteristic of the compound or of its mechanism of action. Fitting of the equation of a hyperbola with asymptotes not necessarily equal to zero, as in Figure 8, is to be preferred for two reasons: first, the minima (which may be zero) can be calculated; and second, if available data span the midpoint of the curve, then observance of a slope not equal to -1 at this point is meaningful, suggesting that dose-dependent factors (induction of detoxifying enzymes, saturation of metabolic pathways) or time-dependent factors (alterations in host sensitivity) are operating to skew the relationship of dose rate to time.
All these calculations are based on the assumption that the total (integrated) dose presented to the individual is the sole determinant of effect; at exposure times much less than the half-life of the compound of concern, this may be true. In other words, this assumption is reasonable for exposure periods during which processes of metabolism and excretion are relatively less important determinants of body burden than is rate of entry into the body. At exposure times less than the half-life, body burden increases steadily with continuing exposure at a rate roughly proportional to exposure rate. However, at exposure times greater than the half-life, the rate of increase of body burden with continuing exposure declines until eventually the body burden reaches constancy at a steady-state value. The above approaches to calculation of total effective dose would not be appropriate over such long exposure periods.
This is a particularly important consideration when exposure occurs repeatedly (i.e., intermittently). Elimination processes, operating during the intervals between exposures, can have a large impact on total internal exposure. For example, Figure 10 (O'Flaherty, 1981) shows the ratio of body burden after n equal and equally spaced intravenous doses [BB(n)] to body burden after one dose [BB(1)] as a function of keΔt. Clearly, as the elimination rate constant ke is increased with the interval Δt between doses remaining constant, the number of doses required to achieve a stipulated body burden (i.e., that body burden associated with a specified effect) also increases. It is clear that processes of elimination are important determinants of total internal exposure whenever exposure continues for an extended period of time, and that they should be taken into account whenever comparisons between shorter and longer exposures are made.
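A minimal sketch of the quantity plotted in Figure 10, using the standard one-compartment result for n equal intravenous doses given at a fixed interval (the parameter values below are illustrative assumptions, not taken from O'Flaherty):

import numpy as np

def accumulation_ratio(n, ke, dt):
    """Body burden just after the n-th of n equal, equally spaced IV doses,
    divided by the body burden just after a single dose, for a
    one-compartment model with first-order elimination rate constant ke."""
    return (1.0 - np.exp(-n * ke * dt)) / (1.0 - np.exp(-ke * dt))

ke = 0.1   # per hour (illustrative)
dt = 8.0   # hours between doses (illustrative)
for n in (1, 2, 4, 7):
    print(n, round(accumulation_ratio(n, ke, dt), 2))
# As ke*dt grows the ratio falls toward 1; as ke*dt shrinks it approaches n,
# which is the behavior of the family of curves described for Figure 10.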
Conclusions
To obtain data over a limited concentration and time range and then to attempt to linearize them is not justifiable.
The log-log transformation approach is most applicable to chronic exposures to relatively constant concentrations or dose rates. For intermittent exposures, other considerations need to be made. Acute dose data may be
FIGURE 10
The ratio of body burden after n equal and equally spaced doses [BB(n)] to body burden after one dose [BB(1)] as a function of keΔt, for values of n from 1-7, in the one-compartment body model.
useful in assessing the toxicity of the compound if the critical effect appears very early relative to the half-life of the compound. If the critical effect appears later, and exposure is intermittent (as is usually the case), then ideally we should use the following data:
1. Kinetics of the compound
2. Considerations of thresholds
3. Pharmacodynamic considerations (method for dealing with different exposures). It may be that the pharmacodynamic factors (i.e., the concentration at the receptor sites and the relationship between that concentration and effect) rather than pharmacokinetic measurements will be the key determinant of a compound's toxicity.
DISCUSSION
DR. MYRON MEHLMAN
The approach suggested by Rolf Hartung is very interesting. However, I would like to see data or a series of tables on at least several hundred chemicals that can be converted from mg/kg to mg/m2 of body surface. Such calculations might be very useful for estimating proper toxic dose levels for different species.
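As a sketch of what such a conversion would involve (the km factors below, in kg of body weight per m2 of surface area, are commonly tabulated approximations and are assumptions here rather than values from the workshop):

# Convert a dose in mg/kg to mg/m2 of body surface using km = body weight / surface area
km = {"mouse": 3.0, "rat": 6.0, "dog": 20.0, "human": 37.0}   # kg/m2, approximate

def mg_per_m2(dose_mg_per_kg, species):
    return dose_mg_per_kg * km[species]

def equivalent_human_dose(dose_mg_per_kg, species):
    """Dose in mg/kg that gives the same mg/m2 in man as the animal dose."""
    return mg_per_m2(dose_mg_per_kg, species) / km["human"]

print(mg_per_m2(50.0, "rat"))              # 300 mg/m2
print(equivalent_human_dose(50.0, "rat"))  # about 8 mg/kg in man

A table of this kind for several hundred chemicals would amount to applying the same arithmetic with chemical-specific doses and, where warranted, chemical-specific adjustments for metabolism.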
DR. KENNETH CRUMP
I feel that we have not yet arrived at a suitable solution to the dose-duration problem for systemic toxicants. While the dose-duration graphs can be useful in summarizing data, in most cases the data may be too limited to permit full use of this method. Even when the graphs are available, one still has the problem of how to use the graph in setting ADIs. At this point, I have no clear ideas about how to proceed.
MR. WILLIAM GULLEDGE
Given a well-conducted animal study and the consideration of negative data in some manner, the uncertainty of the data base would be kept to a minimum. In this instance, the uncertainty associated with extrapolating laboratory test results to human health effects would be greater. Use of a 10-fold uncertainty factor for each step in the extrapolation process may be inappropriate for universal application. Lower confidence limits associated with the data, as well as determination of the LOAEL on a case-specific basis, could lower the 10-fold value for several steps. The 10-fold uncertainty factor in comparing animal data to human data appears to be extremely suspect in many cases.
DR. RICHARD KOCIBA
The issue of interspecies conversion was again addressed, with several factors indicating one cannot always use mg/cm2 surface area as the proper basis of extrapolation (Reitz et al., 1978). One should extrapolate on the basis indicated by the information regarding whether the metabolic processes lead to activation or detoxification of the material. The comments of Drs. Hartung and O'Flaherty in their presentations should be reviewed in this regard.
The issue of adjusting for differences in duration of experiment and lifespan must be addressed in view of the limitations inherent in any simplistic approach, such as the use of Haber's rule, which appears to be applicable only when dealing with relatively minor differences in dose or duration. For lifetime cancer studies, 24 months should be used as representative of the lifespan of rats, with 18 months for mice.
DR. HARRY SKALSKY
Species-specific metabolism generally presents the most difficult extrapolation problems. However, it appears that in vitro techniques are evolving into useful predictive tools. Note in Table 4 the comparative deaminase activity used by Dedrick and Bischoff (1980) in estimating human serum concentrations of cytosine arabinoside from animal data. It would appear that a compendium of this type of species-specific data would serve as a valuable resource for the Agency, and be well worth the effort of gathering.
GENERAL COMMENTS
It isn't necessary to only use linear relationships; we can handle nonlinear relationships for duration of exposure. For interspecies conversions, introduction of structural parameters may effectively linearize the relationships, increasing the correlation coefficient.
TABLE 4
Model Parameters for Cytosine Arabinoside in Several Species*

Parameter               Mouse     Monkey    Dog       Man
Body weight (g)         22        5000      10,000    70,000
Volume (ml)
  Blood                 1.67      367       670       2,670
  Liver                 1.30      135       230       1,700
  Gut                   1.30      230       400       3,180
  Heart                 0.095     17        54        450
  Kidney                0.34      30        50        1,060
  Lean                  10.0      2000      4,300     27,000
  Marrow                0.60      133       270       2,000
Blood flow (ml/min)
  Blood                 4.38      431       805       4,040
  Liver                 1.80      133       270       1,430
  Gut                   1.50      125       216       1,100
  Heart                 0.28      63        54        240
  Kidney                1.30      123       216       1,240
  Lean                  0.83      67        223       930
  Marrow                0.17      23        40        180

[The table also lists, by species, the Michaelis constant (µg/ml H2O), the deaminase activity (µg/g-min) in blood, liver, gut, heart, kidney, lean and marrow, and the kidney clearance (ml/min); the corresponding entries appear in the source as 283, 4.6, 8.3, 91.5, 0.18, 39, 1.6, 140.2, 37.0, 71.8, 34.3, 14, 115, 39, 119, 6, 20 and 32, but their assignment to species and tissue could not be reconstructed from this copy.]

*Source: Dedrick and Bischoff, 1980 (Copyright permission granted)
Guidelines similar to those available for carcinogens should be established for a data base that could be used to develop measures of risk for systemic toxicants.
Body surface area cannot be categorically assumed to be the only way to make interspecies conversions. The appropriateness of this approach has been looked into by Reitz et al. (1978), who evaluated chloroform in rats and mice on the basis of surface area. They concluded that the mouse data did not accurately predict the effect in rats.
One of the most effective ways to extrapolate may be blood level concentration.
Since exposure is usually intermittent, we need some conceptual way of converting from our knowledge of the effect at a constant dose level to the likely effect of intermittent exposure. We need to advance our knowledge by systematic compilation of the available information on the effect of fractionation of doses for relevant toxics.
Fractions do not always add up to the sum of the total due to such factors as repair and specific repair mechanisms.
When discussing intermittent vs. long-term exposure, one must take target tissue into account, since intermittent exposure to the CNS, where cells are not replaced, is totally different from exposure to an organ which can be repaired, such as the kidneys.
When looking at intermittent exposures, the concentration is more important than the total dose. Different endpoints may be affected differently, making generalizations impossible.
REFERENCES
Dedrick, R.L. and K.B. Bischoff. 1980. Species similarities in pharmacokinetics. Fed. Proc. 39: 54-59. [This article based on Dedrick, R.L., D.B. Forrester, J.N. Cannon, S.H. El Dareer and L.B. Mellett. 1973. Pharmacokinetics of 1-β-D-arabinofuranosylcytosine (Ara-C) deamination in several species. Biochem. Pharmacol. 22: 2405-2417.]
Dixon, R.L. 1976. Problems in extrapolating toxicity data for laboratory animals to man. Environ. Health Perspect. 13: 43-50.
Filov, V.A., A.A. Golubev, E.I. Liublina and N.A. Tolokontsev. 1979. Quantitative Toxicology. John Wiley and Sons, New York, based on the 1973 Russian edition of Kolichestvennaya Toksikologiya, translated by V.E. Tatarchenko. p. 50.
Freireich, E.J., E.A. Gehan, D.P. Rall, L.H. Schmidt and H.E. Skipper. 1966. Quantitative comparison of toxicity of anticancer agents in mouse, rat, hamster, dog, monkey and man. Cancer Chemother. Rep. 50: 219-244.
Harwood, P.D. 1963. Therapeutic dosage in small and large mammals. Science. 139: 684-685.
O'Flaherty, E.J. 1981. Toxicants and Drugs: Kinetics and Dynamics. John Wiley and Sons, New York. p. 370-374.
Pinkel, D. 1958. The use of body surface area as a criterion of drug dosage in cancer chemotherapy. Cancer Res. 18: 853-856.
Reitz, R., P.J. Gehring and C. Park. 1978. Carcinogenic risk estimation for chloroform: An alternative to EPA's procedures. Food Cosmet. Toxicol. 16: 511.
Schein, P.S., R.D. Davis, S. Carter, J. Newman, D.R. Schein and D.P. Rall. 1970. The evaluation of anticancer drugs in dogs and monkeys for the prediction of qualitative toxicities in man. Clin. Pharmacol. Therap. 11: 3-40.
Wagner, J.G. 1971. Dosage of drugs in infants, children and adults. In: Biopharmaceutics and Relevant Pharmacokinetics, Drug Intelligence Publications, Hamilton, IL. p. 21.
Weil, C.S. and D.D. McCollister. 1963. Relationship between short- and long-term feeding studies in designing an effective toxicity test. J. Agric. Food Chem. 11: 486-491.
Weil, C.S., M.D. Woodside, J.R. Bernard and C.P. Carpenter. 1969. Relationship between single per oral, one-week and 90-day rat feeding studies. Toxicol. Appl. Pharmacol. 14: 426-431.
SYSTEMIC TOXICANTS
Risk Assessment for Less-Than-Lifetime Exposure
Presentation: Dr. Richard Hertzberg
ECAO, OHEA, U.S. EPA
Critique: Dr. Sheldon Murphy
University of Texas at Houston
Critique: Dr. William Nicholson
Mt. Sinai Hospital
PRESENTATION
DR. RICHARD HERTZBERG: RISK ASSESSMENT FOR LESS-THAN-LIFETIME EXPOSURE
Exposure durations which are less than lifetime are not clearly defined. Terms used most often are, in order of increasing duration: acute, short-term, subchronic and chronic. These terms have been defined for rodents only. One overriding concern is therefore how to express the duration of an animal study as an equivalent human exposure duration. Percent lifetime has been suggested, but has not yet been supported by actual data. The following sections discuss reasons for considering exposure duration and explore the kinds of data needed for quantitative adjustment of animal data for use in human risk assessment.
Present Approach
There is currently no approach for estimating human health effects from less-than-lifetime exposures, and there is no method for estimating a partial lifetime ADI. Although actual studies are conducted and used for determining acute water criteria for aquatic organisms, no corresponding programs exist for estimating acute criteria for humans. The only human health calculation which considers exposure duration divides the subchronic exposure level (usually a NOAEL) by 10 to estimate the corresponding chronic level. The reverse procedure, to estimate a subchronic level, has not been recommended or used.
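For example, under this calculation a subchronic NOAEL of 50 mg/kg/day would be divided by 10 to give an estimated chronic level of 5 mg/kg/day (the numbers are illustrative only).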
Possible Approaches
One obvious approach, similar to that used for aquatic organisms, is to examine the toxicity data on a chemical to see if any clear trends are evident which relate dose/exposure to duration for a similar type of effect. For example, Figure 11 shows rough relationships between human equivalent dose (based on mg/day/body surface area) and duration for several categories
[Figure 11 plots equivalent human dose (mg/d, log scale from 10 to 100,000) against equivalent human duration (years, roughly 0.07 to 70). Each point represents one study, coded by effect level (FEL, AEL, NOAEL, or NOEL) and by study quality (good, average, or poor); rough trend lines labeled FEL, NOAEL, and ADI are drawn through the points.]
FIGURE 11
Dose-Duration Relations Based on Several Toxicity Studies
of health effect severity. No statistical techniques have yet been developed for these trend lines. A major assumption in constructing such graphs is that percent lifetime is a valid index of duration across species. Since each graph summarizes studies by different investigators, further uncertainties exist, including differences in study protocols and the subjective judgment in determining the severity of each effect.
Some journal articles have examined the relationships among doses for different durations. The relationships are empirical rather than mechanistic since they are not necessarily for equitoxic dose rates. For example, Weil et al. (1969) compared LD50s to subchronic minimum effect levels. Thus generalizations to a wider group of chemicals may not be justified. Several reviewers of our methodology have suggested that, in general, subchronic and chronic studies are likely to show similar effects at similar doses, and that an acute acceptable dose might be no more than 10 times the chronic acceptable dose.
Metabolism and pharmacokinetics data would alter the above general approaches. Rapid accumulation to a steady-state tissue or blood level suggests that short-term through lifetime exposures would depend only on daily dose rate, not duration. The use of uptake and excretion half-times should be cautious, however, since a chemical may not have a single half-time, but one which depends on species, dose rate, target organ, other chemicals or conditions (such as plasma pH), etc. This caution also applies to other quantitative information, such as rate of enzyme activation, extent of disposition in tissues and any general adaptation to a constant dose rate. The extent to which such data might alter the eventual risk assessment can be investigated, since tested models exist for multicompartment kinetics, single oral and multiple oral dosing, and enzyme induction. Another factor
is the actual disease progression. If effects have a latent period, then short but high exposures may be quite damaging to young people but much less so to the elderly. Until more data become available and tested, the above approaches will most likely remain as only qualitative considerations and not the quantitative modifications we would prefer.
One area which might be quantifiable is exposure during a critical period of fetal development. If these "windows" of time can be identified for experimental species and humans, then short-term or acute exposures to mothers could be better evaluated.
CRITIQUES
DR. SHELDON MURPHY
ADI may be estimated without lifetime exposure. MacNamara (1976) showed that data derived from studies at less than 90-day exposure can be useful. With 95% confidence, a dosage which would produce no effects during a lifetime of exposure could be predicted from the 3-month no-effect dose with the application of a safety factor or uncertainty factor of 10. MacNamara's conclusion was that it is possible to predict a NOEL in probably all cases except carcinogenesis and reproductive effects. He also proposed shortened versions of how to obtain sufficient data to make assessments of these long-term effects.
Spyker (1975) summarized effects resulting from exposure of embryos to mercury in utero. Behavioral deviations were only observed in maturing animals. Neurological disorders were only apparent at maturity. Early senescence was only apparent late in life; premature death was only apparent at death. He concluded that a shorter duration study may not have detected effects, since these effects only occurred at certain stages of lifetime. If you have to observe over a significant period of the lifetime to see effects, is there a cost advantage to limiting exposure to 90 days?
In terms of risk assessment and predictability of interactions, if we know enough about mechanisms, we may reduce risk assessment to an integrated series of in vitro and short in vivo tests.
Conclusions
1. With the possible exception of carcinogenesis and reproductive effects, it appears that a good 90-day (or short-term) study will predict qualitatively, and perhaps, in most cases, quantitatively, long-term effects.
2. Quite possibly the period in life in which a short-term exposure occurs will influence the type as well as the dose required for effects over a lifetime observation period.
3. If the objective of short-term tests is to reduce the cost of testing, will this be accomplished if long-term or lifetime observations are still required?
4. With regard to multiple chemical exposures, it seems that the objective should be to determine how the presence of one or more additional chemicals should influence the hazard assessment or the ADI of individual chemicals, not to develop ADIs for mixtures.
5. If we know mechanisms and affinity constants and rate constants for interactions of chemicals with critical macromolecules, we should, through appropriate mathematical models, be able to make predictions of possibly interactive situations.
DR. WILLIAM NICHOLSON
Use of a factor of 10 for extrapolating short-term to chronic toxicity would appear to be reasonable based on the ADI data of Weil and McCollister (1963), in which most subchronic to chronic effect ratios were less than 6; only 5% exceeded the value of 10. Because of the possibility that there could be significant differences between chronic and subchronic ADIs, I would suggest that a factor no less than 10 be utilized. Just to emphasize an obvious point, the inverse extrapolation from chronic to subchronic should utilize no alteration in the level. While the subchronic and acute time periods are somewhat ill defined in the context of the extrapolations to lifetime exposures, I would suggest that a factor of 10 be applied to subchronic exposure circumstances that are no less than 10% of the lifetime of the experimental animal.
In certain cases, for example lung cancer in asbestos workers, age at exposure has a bearing on the health impacts. For older people, the effects of asbestos exposure can be more serious in shorter periods of time than would be predicted by the model for mesothelioma. People exposed at a young age might not contract lung cancer until age 50, but if an older person, already at high risk from cigarette smoking, is exposed at age 45 or so, lung cancer can appear in 5-10 years. In the case of angiosarcoma, older animals are more susceptible.
An understanding of the mechanism of action is necessary before applying mathematical models. Short-term effects seen in animals may not appear after chronic exposure. Human chronic effects may not be observed in animals. Multiplicative safety factors are needed and should continue to be used until we have a sufficient data base to use other models.
DISCUSSION
DR. MYRON MEHLMAN
It is now clear that subchronic studies (90-180 days) have a high degree of predictability for chronic systemic toxicity studies and could be utilized to assess injury from long-term exposure. In addition, based on recent data, carcinogenicity may possibly be predicted from early subchronic studies. This requires extensive pathology experience.
DR. MAGNUS PISCATOR
The influence of plasma pH on half-time is probably insignificant. Even in conditions of acidosis and alkalosis, the plasma pH shows small changes. I suggest that "acid-base status" be used instead.
DR. RICHARD KOCIBA
My comments on this are reflected in the previous chapter (Interspecies Conversion of Dose and Duration of Exposure). Overall, the empirical basis for relating acute to subchronic, and subchronic to chronic, durations via use of appropriate uncertainty factors has been found to be a useful tool and should be continued.
GENERAL COMMENTS
In connection with MacNamara's data (MacNamara, 1976), there is no trend that points out pharmacokinetic relationships of the ratio of LD50 to lifetime exposure. Biological effects need to be considered as well as the concentration.
High doses for a short period may not equal low doses for a long period. Progressive increases in body burden would result from bone-seeking compounds and some lipid seekers.
For unknown compounds, a battery of tests that produce no false negatives is needed.
We must have data to estimate the risk of false negatives.
Short-term tests should be used as indicators for doing long-term testing. Decisions regarding chronic toxicity should be based on chronic tests, not short-term tests.
Testing should deal with the problem of estimating human risk from intermittent exposure.
Occupational exposure data could be used to estimate human exposure.
REFERENCES
MacNamara, B.P. 1976. Concepts in health evaluation of commercial and industrial chemicals, Chapter 4. In: New Concepts in Safety Evaluation, M.A. Mehlman, R.E. Shapiro and H. Blumenthal, Ed. Hemisphere Publishing Corporation, Washington, DC. 455 p.
Spyker, J.M. 1975. Assessing the impact of low-level chemicals on development: Behavioral and latent effects. Fed. Proc. Fed. Am. Soc. Exp. Biol. 34: 1835-1844.
Weil, C.S. and D.D. McCollister. 1963. Relationship between short- and long-term feeding studies in designing an effective toxicity test. J. Agric. Food Chem. 11: 486-491.
Weil, C.S., M.D. Woodside, J.R. Bernard and C.P. Carpenter. 1969. Relationship between single per oral, one-week and 90-day rat feeding studies. Toxicol. Appl. Pharmacol. 14: 426-431.
SYSTEMIC TOXICANTS
Incidence and/or Severity of Effects
Presentation: Dr. Kenneth Crump
Science Research Systems
Critique: Dr. Robert Neal
Chemical Industry Institute of Toxicology
Critique: Dr. Ronald Wyzga
Electric Power Research Institute
PRESENTATION
DR. KENNETH CRUMP: HOW TO UTILIZE INCIDENCE AND/OR SEVERITY-OF-EFFECT DATA
IN SETTING ALLOWABLE EXPOSURES
Subissues
1. How to account for severity of effects (acute lethality, cancer, weight loss, changes in blood pressure or plasma enzyme levels, etc.).
2. How to utilize different types of data including: incidence data (number of animals dead or with tumors, etc.); "continuous" data (average levels with standard errors, etc.); limited or graded data (severe, moderate or no liver necrosis, etc.).
Possible Options
1. (Used previously to set water quality criteria.) If carcinogenic, extrapolate using the linearized multistage model. If not, use the safety factor approach (apply a safety factor to a NOEL, NOAEL or LOAEL).
Pro: Minimal data requirements.
Has been tested and is familiar to most.
Relatively simple to apply.
Con: Safety factor approach doesn't fully utilize the shape of the dose-response curve.
With the safety factor approach, smaller studies tend to yield higher allowable exposures, which is illogical.
Choice of safety factors is largely judgmental.
Inconsistencies may arise from applying different methods to cancer and non-cancer data.
2. Extrapolate both incidence and continuous data to low doses using mathematical models. Continuous data could be extrapolated to a dose corresponding to a certain percent change in normal levels or a certain fraction of the standard deviation within a normal population. Extrapolation to different levels could account for differing severity of
disease (e.g., extrapolate cancer data to a 10^-5 lifetime risk and weight loss data to 10^-2). The smallest allowable exposure obtained from any given health effect could be selected as the standard.
Pro: Accounts for the shape of the dose-response curve and utilizes all the experimental data.
"Rewards" larger experiments and those with better experimental designs (if confidence intervals are used).
More objective than the safety factor approach.
Is not strongly dependent upon the choice of mathematical model.
Con: Choice of extrapolation model is judgmental.
Has greater data requirements than Option 1.
Marginally more costly to implement than Option 1.
3. Use mathematical models to estimate the dose corresponding to 10^-1 or some other level in the "observable range", and apply a safety factor reflecting the severity of the health impairment and possibly the nature and extent of the data (a numerical sketch follows this list).
Pro: Accounts for the shape of the dose-response curve and utilizes all the experimental data.
"Rewards" larger experiments and those with better experimental designs (if confidence intervals are used).
More objective than the safety factor approach.
Is not strongly dependent upon the choice of mathematical model.
Con: Choice of safety factor is largely judgmental.
Has greater data requirements than Option 1.
Marginally more costly to implement than Option 1.
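A minimal numerical sketch of Option 3, assuming a simple one-hit model fitted to hypothetical incidence data (the data, the model choice, and the 100-fold factor are all illustrative assumptions, not part of the presentation):

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical bioassay: dose (mg/kg/day) and fraction of animals affected
dose = np.array([0.0, 5.0, 25.0, 125.0])
frac = np.array([0.02, 0.08, 0.30, 0.78])

def one_hit(d, bg, slope):
    # P(d) = background + (1 - background) * (1 - exp(-slope * d))
    return bg + (1.0 - bg) * (1.0 - np.exp(-slope * d))

(bg_hat, slope_hat), _ = curve_fit(one_hit, dose, frac, p0=[0.01, 0.01])
# Dose in the observable range giving 10% extra risk over background
d10 = -np.log(1.0 - 0.10) / slope_hat
allowable = d10 / 100.0   # apply a severity-dependent safety factor (100 here)
print(round(d10, 1), round(allowable, 3))

Other dose-response models could be substituted for the one-hit form; the point of the option is only that the departure dose is estimated within the observable range before a judgmental safety factor is applied.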
CRITIQUES
DR. ROBERT NEAL
In Dr. Crump's presentation, he indicated he would not discuss extrapolation from animal data to man. However, in my opinion the entire purpose for setting allowable exposures is to protect man. Therefore, my comments on Dr. Crump's presentation will assume that the primary purpose for setting allowable exposure levels is to protect man, and will therefore include considerations of extrapolation from animal data to man.
Dr. Crump has proposed three options for setting allowable exposures to toxic chemicals. Option 1. For cancer-causing compounds he proposed the use of linearized multistage models for extrapolating data obtained in the observable range using experimental animals to low levels of exposure experienced by man. Alternatively, if the compound is not carcinogenic, a safety factor approach should be used, i.e., a safety factor applied to the NOEL, NOAEL or LOAEL.
Option 2. Alternatively, he proposed that the risk estimation for both carcinogenic and noncarcinogenic compounds could be carried out using mathematical models to extrapolate to different levels of risk depending upon the severity of the disease (i.e., a smaller acceptable risk for cancer, a larger one for weight loss).
Option 3. Finally, he proposes using mathematical models to estimate the dose which provides a calculated incidence level for both carcinogenic and noncarcinogenic effects (e.g., 10% or 1% incidence) and then applying a safety factor to the calculated dose. In this case the safety factor could vary depending on the severity of the chemically induced disease.
It is my opinion that the use of safety factors applied to a NOAEL for toxic effects other than cancer is the most appropriate approach. This approach has been used for a number of years in regulating human exposure to potentially toxic compounds. The data to date suggest that this approach has served us well in that there is no evidence of substantial adverse human health effects from exposure to regulated levels of compounds determined using these procedures. The most appropriate safety factor to use in determining the allowable exposure level should range from 10-100 depending upon the severity of the effect.
In a report of their Safe Drinking Water Committee, the National Research Council (NRC, 1977) has proposed that a 1000-fold safety factor be applied to compounds for which the data are incomplete. In my view, the lack of data does not warrant this more conservative safety factor. The choice of an appropriate safety factor should be dictated by the degree to which we understand 1) the mechanism of the observed toxic effect and 2) the applicability to man of the data generated in experimental animals. In those cases where we do understand the mechanism of toxicity and the applicability of the animal data to man, a smaller safety factor should be used. For example, we understand the mechanism of acetylcholinesterase inhibition by organophosphate and carbamate insecticides to a reasonable degree. We also have knowledge of the applicability to man of quantitative acetylcholinesterase inhibition data generated in rats and mice. Therefore, a safety factor of 10 is often applied to the NOEL for cholinesterase inhibition in rats and mice in calculating the allowable exposure of man to compounds inhibiting this enzyme. Thus for compounds where the mechanism of toxicity is known and the applicability of the data from experimental animals to man is also known, a safety factor less than 100 should be considered. In cases where these factors are not well understood, a more conservative safety factor should be applied, particularly when the adverse health effects are essentially irreversible (cancer, neurotoxicity, teratogenicity, reproductive toxicity).
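As a simple numerical illustration of the approach (the values are assumed for illustration): with a NOAEL of 10 mg/kg/day from an adequate animal study and a safety factor of 100, the allowable exposure would be 10/100 = 0.1 mg/kg/day; if the mechanism and its applicability to man were well understood, a factor of 10 would instead give 1 mg/kg/day.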
The key missing element for extrapolating is a sufficient data base for dose/response, namely, the raw data on the numbers of animals that have certain effects. Without this we lack confidence in using mathematical models for extrapolating to low incidence rates. We can use a lower safety factor if we know the toxicity mechanism and have confidence in the predictability of our animal models. We can use a higher safety factor if we don't know the toxicity mechanism and don't have confidence in the predictability of animal models.
DR. RONALD WYZGA
I agree with Dr. Crump's general description of the pros and cons of the three options. I would add or emphasize an additional con for the first option -- the NOAEL-plus-safety-factor approach. The result is dependent upon the number and kind of dose levels used in the available experiments. If, for example, experiments were carried out at only high dose levels, the criteria could differ significantly from experiments in which only low dose levels were used. Hence under Option 1, point X2 in Figure 12 could be defined as a NOAEL and a safety factor could be applied. If, however, the dose corresponding to X2 were not included in the experiment, the NOAEL would be X0, thus demonstrating the dependence of NOAELs upon the dose levels of the underlying experiments.
I agree with Dr. Crump's assessment of Option 2, but would emphasize that we'll never be able to discriminate among models at dose levels corresponding to low risk, and any choice of a model remains arbitrary and potentially controversial, particularly when we are dealing with noncarcinogens.
[Figure 12 shows a sigmoid dose-response data set together with two fitted curves, Model A and Model B, and the individual data points.]
FIGURE 12
Alternative Dose-Response Curves
In the latter case, many different physiological mechanisms are involved, and we don't understand the underlying models for most of them. Moreover, different mathematical models are most likely appropriate for different toxic effects; hence there can be no justification for choosing the same model for all impacts. In Figure 12, for example, Models A and B fit the data reasonably well, yet the differences between them at low dose levels are very significant.
In my opinion, the third option is preferable. It attempts to standardize the first by setting the "NOAEL-equivalent" to the 10% response level before applying a safety factor. I believe, however, that this approach needs some modifications for the four reasons given below.
First of all, I have a problem with the exclusive use of one model to estimate the "dose corresponding to 10^-1 or some other level in the observable range" because of the great uncertainty about the appropriate model to use, and because models can differ considerably at 10^-1.
Secondly, the choice of 10^-1 is somewhat arbitrary. If there are no data near that response level, the use of models to estimate that point could be misleading. Alternatively, if data were present at lower response levels, one might have more confidence at lower dose levels than those corresponding to 10^-1, and this information should be used.
Thirdly, the use of safety factors remains arbitrary, and arbitrariness could therefore be reduced by avoiding their use.
Finally, the method needs to have sufficient flexibility when supplementary information exists, such as evidence of a threshold, contradictory observational (human) data, recovery mechanisms, and detailed pharmacokinetics.
-82-
-------
I'd like to suggest a modification which might be explored as a way to overcome some of the above points. First of all, rather than use 10^-1 as a point of departure before extrapolation, I suggest one use the lowest response point at which major models converge, or the lowest observed point, as the departure point. Hence if there is significant variation among models at levels greater than 10^-1, the higher level of convergence should be used as the departure point, thereby eliminating the uncertainties involved in choosing which model to use to reach 10^-1. On the other hand, if model agreement extends below 10^-1, one can take advantage of this fact. From the prior ECAO workshop held on February 5, 1982 in Washington, DC, we have seen that model agreement can occur at various points, including levels below 10^-1 and levels much higher than 10^-1. This proposed approach considers the consistency of the behavior of a data set with regard to the model structures. If the behavior is consistent, we can make greater use of the models. If the behavior is inconsistent, we then use the lowest non-zero response observation as the departure point.
I would then extrapolate downwards from the departure point to zero. This is a less arbitrary procedure than using safety factors, particularly when the point of departure can vary. (If, however, the point of departure were uniform, a safety factor could be defined which is equivalent to a linear extrapolation.) The linear extrapolation is initially conservative and protective because it can be thought of as an upper bound to the class of sigmoid curves that usually fit biological data. In the data range treated here, these curves are concave and thus bounded by a straight line. The straight-line extrapolation is also consistent with the approach used for carcinogens, since the upper 95% confidence interval of the multistage model, as formulated by Dr. Crump, is essentially a linear extrapolation to zero.
-83-
-------
In the absence of further information, one could then determine the criteria associated with a particular risk level, e.g., 10^-5 or 10^-6. I think it is important, however, that the method be sufficiently flexible to incorporate new or additional information when available, and I suggest that the estimated curve be adjusted when warranted.
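To make the arithmetic of such a criterion concrete, the short sketch below applies a straight-line extrapolation from a hypothetical point of departure; the dose, response level, and target risk are made-up values used only for illustration.

```python
# Illustrative linear extrapolation from a point of departure (all values hypothetical).
d_star = 5.0        # mg/kg/day, dose at the point of departure
p_star = 0.10       # response at the point of departure (10%)
target_risk = 1e-5  # risk level at which a criterion is wanted

# Under a straight line through the origin, dose scales in proportion to risk.
d_target = d_star * (target_risk / p_star)
print(d_target)     # 5e-4 mg/kg/day
```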
For example, if one believed or had evidence that a threshold existed for a toxic substance, one could replace the zero at the lower end of the extrapolation by the threshold point or its lower confidence interval. One could do likewise at those levels where human observational data contradicted the animal model inferences. Other mechanisms could be found for treating recovery mechanisms or pharmacokinetics.
The proposed modification offers several advantages:
It accounts for the shape of the dose-response curve to the extent that the behavior of the available data is consistent with the curve(s).
It accounts for model variability: when models are consistent with the data, they determine the point of departure; if there is wide disagreement among models, then no one model is anointed without evidence.
It uses all the data in trying to find the best convergence point for the models.
It "rewards" larger and better experiments by obtaining better model fits, or more data showing that the results are inconsistent with the model structure. It also can yield a lower departure point when more data are present.
It is more objective than methods that use safety factors.
The procedure is conservative when information is scanty, yet it is flexible when additional information is available.
Because it is flexible, it provides incentives to develop more information, which can lead to better decisions.
The proposed method does, however, have some drawbacks, and these need to be addressed in further detail.
-84-
-------
First, because it is a new approach, several details of implementation need to be worked out:
What models should be considered for convergence? These should be the best candidates that presently would be considered under Options 2 and 3. We probably need more work here in examining alternative models for various sets of data for noncarcinogens.
What should be the criteria for model convergence? One could consider the overlap of some small confidence interval, such as the 50% level for the models, or require that they be within a factor of two or so of each other. Again, more thinking needs to be applied to this area; however, it clearly is an obstacle which can be overcome.
Criteria for adjustment need to be established. Again, reflection about potential adjustments could lead to a reasonable set of criteria.
Adjustment procedures need to be worked out. They need to be tailored to the purpose of the adjustment, but can in principle be designed. See, for example, the suggestions given above for threshold behavior and human data.
The proposed method also has greater data requirements than Option 1, but it also makes use of more data and information when they are available. The method is more costly to implement than the other options, but the implementation costs are minuscule compared to the costs associated with the decisions that need to be made.
The proposed method may be too complex. This concern can probably best be addressed only after attempts have been made to apply it.
-85-
-------
DISCUSSION
DR. WILLIAM NICHOLSON
Safety Factors for Human Exposures
The use of safety factors as discussed in the documents supplied to us appears quite appropriate. I particularly would support Option 3 proposed by Dr. Crump, in which the full set of data is utilized to estimate a risk of 10^-2 for animals, and then safety factors relating to animal-to-human extrapolation, sensitivity, and short-term vs. long-term exposure are applied. I would suggest that the terms NOEL, LOEL, etc. in Figure 1 are somewhat misleading. If one defines a level of 1% for the LOEL, that level also is the maximum NOEL. Figure 13 provides slightly revised definitions of these terms.
I have a strong reservation about what constitutes a NOAEL. In Figure 1 it was suggested that a slight body weight decrease may be considered such a non-adverse effect. While this may appear true for animals, it may not be so at all for human beings. For example, nutritional deficits at an early age leading to slight weight losses have also been shown to significantly affect mental performance; thus what cannot be measured in an animal (e.g., mental performance -- IQ, if you wish -- behavioral performance reflecting the integrated capacity of the nervous system, etc.) is extremely important for humans. Thus, the safety factors discussed should be applied only to NOELs and not to NOAELs. If the latter are utilized, an additional safety factor, taking into account the non-comparability of measured animal functions with human functions, should be applied. Consider what might have been the consequences had thalidomide led to a decrease of 20 units in an IQ test and not to the obvious birth defects seen.
-86-
-------
[Figure: response vs. dose (arbitrary units) for three endpoints -- A, slight body weight decrease; B, liver necrosis; C, mortality -- with the Max NOEL, Min NOAEL, Max NOAEL, Min LOAEL, Max LOAEL, and Min FEL levels indicated.]
FIGURE 13
Response levels considered in defining threshold regions in toxicity experiments.
-87-
-------
DR. SHELDON MURPHY
I think that the question of precisely what extrapolations can or should be made will never be answered to everyone's satisfaction. I liked the approach proposed by Dr. Crump of extrapolating doses to estimate a 10% or 1% risk of an effect and then applying a safety factor that appropriately considers the nature and severity of the effect. As was pointed out by several speakers and discussants, the NOEL/safety factor approach has served us quite well for individual substances. For the most part, harmful chemical exposures have resulted from insufficient toxicological data rather than from failure to apply appropriate safety factors (a possible exception being the male antifertility effect of DBCP).
DR. RICHARD KOCIBA
I endorse the present NOEL/safety factor approach as opposed to mathematical modeling of the incidence data and graded data. In most toxicity studies, the histopathology data are the most sensitive parameters defining the NOEL. However, the histopathology data described in literature references are generally more qualitative than continuous. Thus, any mathematical modeling of these data is not appropriate and would introduce unnecessary complicating factors.
DR. HARRY SKALSKY
Dr. Kociba's comments concerning histopathology were extremely pertinent. His point was that the NOEL or NOAEL is frequently defined by pathological parameters. These parameters are qualitative judgments and cannot be quantified. This fact supports the traditional safety factor approach and indicates that the "mathematical" model approach would unnecessarily complicate the issue.
-88-
-------
DR. IAN NISBET
I support in principle the use of the procedure, proposed by Dr. Crump, involving the use of a model to extrapolate down to the 1% risk level (equivalent to a standardized LOEL), followed by application of safety factors of 100, 1000 or more. As was pointed out, this is operationally equivalent to the use of a low-dose linear model to predict doses associated with risk levels of 10^-4, 10^-5 and lower. Hence, this procedure would have the important advantage of blurring or eliminating the distinction between threshold and nonthreshold effects.
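The operational equivalence noted above is simple arithmetic under a low-dose linear model, as the short sketch below shows; the safety factors are those mentioned in the text, and the code is only an illustration.

```python
# If risk is linear in dose below the 1% level, dividing the 1%-risk dose by a
# safety factor divides the associated risk by the same factor.
risk_at_departure = 0.01                    # the standardized 1% risk level
for safety_factor in (100, 1000):
    print(risk_at_departure / safety_factor)   # 1e-4 and 1e-5
```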
GENERAL COMMENTS
Model fitting to define a "take-off point" may be advantageous to the regulatory process.
The model needs to take into account different kinds of data -- dichotomous or graded.
There is no reason to convert to incidence data; there is no reason why the model can't be applied directly to graded data. Dr. Wyzga's approach is an "interesting extension" of Option 3, equivalent to taking a multistage approach such as has been used for carcinogens. However, this approach might lead to inordinately low levels.
Histopathology will be the limiting factor, not the graded data. There may be a problem applying a mathematical model to qualitative histopathologic data. Although it is not impossible to quantify histopathologic data, detailed histopathologic data are usually not accepted by journals.
Several people have stated that safety factors have served us well and the approach shouldn't be changed. This may not be true. There may have been effects for which we have not seen the association in the human population.
Dose-response models have not been proven to hold across species or for extrapolating within species to lower rates.
-89-
-------
REFERENCES
NRC (National Research Council). 1977. Drinking Water and Health. Vol. 1. Safe Drinking Water Committee, NAS, Washington, DC. 939 p.
U.S. EPA. 1982. Examination of Options for Calculating Daily Intake Levels (DILs). Prepared by Dr. Kenneth Crump under EPA P.O. No. C2171NAST. January 2, 1982.
Weil, C.S. and D.D. McCollister. 1963. Relationship between short- and long-term feeding studies in designing an effective toxicity test. J. Agric. Food Chem. 11: 486-491.
-90-
-------
SYSTEMIC TOXICANTS
Route-to-Route Extrapolation and the Pharmacok1net1c Approach
Presentation: Dr. James Withey
Food Directorate, Bureau of Chemical
Safety
Critique: Dr. Ellen O'Flaherty
University of Cincinnati
Critique: Dr. Julian Andelman
University of Pittsburgh
-91-
-------
PRESENTATION
DR. JAMES WITHEY: ROUTE-TO-ROUTE EXTRAPOLATION AND THE PHARMACOKINETIC
APPROACH
In deriving the criteria for ambient water quality, the U.S. EPA noted that, for some substances, no data were available in which the appropriate oral or intragastric route of administration was used (Federal Register, 1979). In these cases, data from inhalation studies or the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit values (TLVs) were used in the manner suggested by Stokinger and Woodward (1958). As an example of the application of this principle, the ambient water criterion for barium was derived from the value of its TLV (0.5 mg/m3) in the following manner.
Inhalation
TLV = 0.5 mg/m3
Volume inhaled = 10 m3 (per 8-hour day)
Total amount inhaled = 10 x 0.5 mg/day
Absorption factor (inhalation) = 0.75
Amount reaching systemic circulation = 10 x 0.5 x 0.75 = 3.75 mg
Equivalent Oral Intake
Maximum daily water intake = 2.00 ℓ
Absorption factor (drinking) = 0.9
3.75 mg / 0.9, or 4.17 mg/day, may be consumed
4.17 mg in 2 ℓ of water is 2 ppm
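The conversion above is easy to trace in a few lines of code; the sketch below simply restates the numbers already given.

```python
# Stokinger-Woodward conversion for barium, reproducing the figures above.
TLV = 0.5              # mg/m3
volume_inhaled = 10.0  # m3 per 8-hour day
f_inhalation = 0.75    # inhalation absorption factor
f_oral = 0.9           # drinking-water absorption factor
daily_water = 2.0      # litres per day

absorbed = TLV * volume_inhaled * f_inhalation   # 3.75 mg reaching systemic circulation
oral_equivalent = absorbed / f_oral              # 4.17 mg/day that may be consumed
criterion = oral_equivalent / daily_water        # ~2.1 mg/L, i.e. about 2 ppm
print(absorbed, oral_equivalent, criterion)
```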
Limitations of the Stokinger-Woodward method include the following: 1) precisely measured absorption factors are lacking; 2) the TLV may not be based on systemic toxicity; 3) extensive hepatic metabolism (detoxification) may reduce the systemic toxicity by the oral route (first-pass effect) and reduce the toxicological insult compared to an equivalent dose given by another route; and 4) temporal relationships of blood levels, post-administration, are not considered.
-92-
-------
Pharmacokinetic considerations allow some judgment as to the validity of assuming equivalent insult for the same dose administered by different routes. An excellent example of an almost identical effect arising from the administration of valproic acid by the oral and i.v. routes in six different subjects has recently been published (Perucca et al., 1978). The similarity of blood-level/time curves following the administration of vinyl chloride monomer in aqueous solution by the intragastric or intravenous routes has also been pointed out (Withey, 1976).
Temporal relationships of blood concentrations during and following the administration of a dose are useful in the assessment of both the magnitude and duration of effect.
Vapors or gases are usually absorbed into the systemic circulation at a constant rate, described by a zero-order process, since the atmospheric concentration of the toxicant remains constant throughout the exposure. Substances administered orally or intragastrically are absorbed by a first-order process, although the uptake mechanism(s) and interactions with other substances within the gastrointestinal tract may be very complex (Barr, 1968). Elimination, which includes metabolism, will usually involve a series of simultaneous and consecutive first-order processes which may be described by multicompartment mathematical models.
It is instructive to compare the concentration of a toxicant in the central pharmacokinetic compartment after dosing by different routes. In this presentation, the effects of a 10-hour vapor phase exposure will be compared to two oral exposures given as four equal divided doses over 10 hours (dose interval = 2.5 hours) and 20 equal divided doses over 10 hours (dose interval = 0.5 hour). In all three exposures, the total administered dose is equal and absorption is assumed to be 100%. For oral exposures, the elimination rate is assumed to be 10 times slower than the absorption rate.
-93-
-------
In the simplest case, if zero-order uptake is assumed for the 10-hour vapor phase exposure and first-order uptake and elimination for a one-compartment model after oral dosing, the magnitude of the steady state concentrations on repeated dosing will depend upon the magnitude of the dose and the dose interval (Withey, 1983). Figures 14, 15, 16 and 17 show how blood concentration varies with different routes, dosing intervals and the kinetic parameters for absorption and elimination. These parameters are given in Table 5.
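The behavior shown in Figures 14-17 can be reproduced with a minimal one-compartment sketch. The code below uses the Case 1 rate coefficients from Table 5 (zero-order uptake during the 10-hour vapor exposure; first-order absorption and elimination for the oral boluses); the choice of a common total dose equal to the inhalation uptake rate times 10 hours is an assumption made here so that the three regimens are directly comparable.

```python
# Minimal one-compartment sketch of Case 1 (rate coefficients from Table 5).
import numpy as np

k0 = 11.2       # zero-order inhalation uptake rate (mg/L per hr, Table 5)
ka = 46.1       # first-order oral absorption rate coefficient (hr^-1)
ke = 4.61       # first-order elimination rate coefficient (hr^-1)
T = 10.0        # duration of the vapor-phase exposure (hr)
total = k0 * T  # assumed common total dose (concentration units)

t = np.linspace(0.0, 14.0, 2801)

def inhalation(t):
    """Zero-order uptake for T hours, first-order elimination throughout."""
    rise = (k0 / ke) * (1.0 - np.exp(-ke * np.minimum(t, T)))
    return rise * np.exp(-ke * np.clip(t - T, 0.0, None))

def oral(t, n_doses):
    """Superposition of first-order-in, first-order-out (Bateman) curves."""
    dose = total / n_doses
    interval = T / n_doses
    c = np.zeros_like(t)
    for i in range(n_doses):
        dt = t - i * interval
        late = dt > 0
        c[late] += dose * ka / (ka - ke) * (np.exp(-ke * dt[late]) - np.exp(-ka * dt[late]))
    return c

print(inhalation(t).max())  # plateau ~ k0/ke = 2.43, as in Table 5
print(oral(t, 4).max())     # ~21.7, peak of the four-dose regimen
print(oral(t, 20).max())    # ~4.9, peak of the twenty-dose regimen
```

With these inputs the printed values come out close to the limiting concentrations listed for Case 1 in Table 5.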
In the first case (Figure 14), vapor phase exposure results in a rapid achievement of steady state (~30 minutes) and a rapid return to zero levels after termination of the exposure (~90 minutes). The four oral doses yield rapid excursions to high peak concentrations (28 µg/mℓ) and a return to zero levels after administration of each dose. The twenty-dose series gives steady state levels with maxima and minima oscillating about the steady state level for the inhalation dose. Clearly, in the case of the four-dose regimen there could be situations in which thresholds would be exceeded, which might elicit different toxic effects than those seen with the other protocols. There is no evidence of bioaccumulation if these protocols are extended over several days.
Figure 15 (Case 2) shows the blood concentrations that occur when the elimination rate is reduced to about one-tenth of that in Case 1, and the oral absorption rate is kept at 10 times that of the excretion rate. It is evident that steady state conditions do not exist at any time during the exposures and that the twenty-dose regimen very closely approximates the vapor phase exposure. Again, there is no suggestion of a potential for bioaccumulation.
-94-
-------
[Figure: blood concentration vs. time (hr) over 12 hours for the three Case 1 dosing regimens; the dose intervals for the oral regimens are indicated.]
FIGURE 14
Case 1. Temporal blood concentration relationships for uptake by inhalation and gastrointestinal routes. ( ) 10-hour inhalation exposure. ( ) four equal divided doses, orally over 10 hours. ( ) 20 equal doses, orally over 10 hours. The rate coefficients for absorption and elimination for Case 1 are given in Table 5.
Source: Withey, 1983
-------
[Figure: blood concentration (µg/mℓ) vs. time (hr) over 14 hours for the three Case 2 dosing regimens; the dose intervals for the oral regimens are indicated.]
FIGURE 15
Case 2. Temporal blood concentration relationships for uptake by inhalation and gastrointestinal routes. ( ) 10-hour inhalation exposure. ( ) four equal divided doses, orally over 10 hours. ( ) 20 equal doses, orally over 10 hours. The rate coefficients for absorption and elimination for Case 2 are given in Table 5.
Source: Withey, 1983
-------
[Figure: blood concentration (µg/mℓ) vs. time (hr) over 37 hours for the three Case 3 dosing regimens; the dose intervals for the oral regimens are indicated.]
FIGURE 16
Case 3a. Temporal blood concentration relationships over 37 hours for uptake by inhalation and gastrointestinal routes. ( ) 10-hour inhalation exposure. ( ) four equal divided doses, orally over 10 hours. ( ) 20 equal doses, orally over 10 hours. The rate coefficients for absorption and elimination for Case 3 are given in Table 5.
Source: Withey, 1983
-------
[Figure: blood concentration (µg/mℓ) vs. time (days) to steady state for the three Case 3 dosing regimens.]
FIGURE 17
Case 3b. Temporal blood concentration relationships to steady state for uptake by inhalation and gastrointestinal routes. ( ) 10-hour inhalation exposure. ( ) four equal divided doses, orally over 10 hours. ( ) 20 equal doses, orally over 10 hours. The rate coefficients for absorption and elimination for Case 3 are given in Table 5.
Source: Withey, 1983
-------
TABLE 5
Significant Parameters for the Dosing Routes and Regimens Illustrated in Figures 14-17

                                          Inhalation                  Oral, 4 doses              Oral, 20 doses
Parameter                            Case 1  Case 2  Case 3      Case 1  Case 2  Case 3     Case 1  Case 2  Case 3
Absorption rate coefficient (hr-1)   11.2*   11.2*   11.2*       46.1    5.99    0.288      46.1    5.99    0.288
  t1/2 (min)                         --      --      --          0.9     6.9     144        0.9     6.9     144
Elimination rate coefficient (hr-1)  4.61    0.599   0.0288      4.61    0.599   0.0288     4.61    0.599   0.0288
  t1/2 (min)                         9.0     69      1440        9.0     69.0    1440       9.0     69      1440
Total AUC (0-24 hours)               24.3    186.9   3888        24.30   186.9   3888       24.30   187.0   3888
AUC (0-10 hours)**                   23.77   55.8    1642        24.30   172.0   1665       24.30   157.4   1558
Limiting maximum concentration
  (µg/mℓ)                            2.43    18.69   194.3       21.68   28.72   182.1      4.87    19.36   181.7
Limiting minimum concentration
  (µg/mℓ)                            2.43    18.69   129.8       0.0003  8.96    139.1      0.69    17.49   143.1

*The units for the inhalation uptake rate are mg/ℓ/hr.
**The units for AUC are µg·hr/mℓ and the areas were determined for steady state conditions.
-------
In Figures 16 and 17 (Cases 3a and 3b), the elimination rate has been further reduced by a factor of 20 compared to that of Case 2. There is very little difference in the temporal relationship of blood levels after dosing, although steady state levels are not reached for 4 or 5 days of repeated dosing (see Figure 17). There is, in this example, evidence for bioaccumulation.
These illustrations, which could be extended to accommodate specific compounds with known pharmacokinetic parameters for any pharmacokinetic model, suggest that the larger the number of oral doses, the more closely the concentration-time curve will correspond to an inhalation exposure. Route extrapolation will be more applicable for compounds with very slow elimination rates, like some pesticides and heavy metals, than for substances that are rapidly metabolized or excreted. It should be noted that, in every case where the different routes and regimens were compared, the area generated under the blood-level/time curve was the same. The area under the blood-level/time curve may not be a good indicator of equivalent systemic toxicity, especially when elimination is rapid. Finally, where the toxic insult is generated at the site of uptake rather than as a consequence of uptake into the systemic circulation, as in the case of pulmonary exposure to arsenic or manganese, route extrapolation using any kind of modeling approach is precluded.
-100-
-------
CRITIQUES
DR. ELLEN O'FLAHERTY
The Stokinger-Woodward method, which has been used to convert exposure data from inhalation studies to comparable oral or intragastric exposures, has an additional limitation that was not specifically discussed by Dr. Withey, although he has alluded to the paucity of precisely measured absorption factors. It is well known that deposition, and subsequently absorption, are strongly dependent on particle size. Figure 18 is taken from the report of an extensive and carefully integrated series of studies of human absorption and deposition of lead, especially of lead from automobile exhaust (Chamberlain et al., 1978). It shows deposition in the lung of wind tunnel aerosols generated by burning gasoline containing radiolabeled (203Pb) tetraethyllead in an automobile engine. Particle size of the aerosols was varied by varying the rate of dilution and entry of the engine exhaust into the wind tunnel air flow. The diffusion mean equivalent diameter (DMED) is given for each of the three deposition curves. The percentage of the inhaled aerosol deposited in the lung was measured by two different methods, the results of which agreed well: by difference (inhaled 203Pb minus exhaled 203Pb), and by external gamma ray spectrometry of the 203Pb deposited in the lungs. Figure 18 shows the strong dependence of deposition both on breathing cycle and on DMED. At the 4-second breathing cycle of standard man, deposition varied from 24-68% within this size range of very small particles.
Interestingly, further work by Chamberlain et al. (1978) showed that once deposited, lead was absorbed into the systemic circulation at remarkably similar rates irrespective of particle size (in the submicron range),
-101-
-------
[Figure: percent of inhaled aerosol deposited in the lung vs. breathing cycle (seconds) for wind tunnel aerosols of different DMED (0.02 µm, 0.09 µm, and one other size).]
FIGURE 18
Deposition in Lung of Wind Tunnel Aerosols
Source: Chamberlain et al., 1978 (Copyright permission granted)
-102-
-------
chemical nature, or concentration in the lungs. These observations are illustrated in Figures 19-21. Eight subjects participated in these experiments.
Figure 19 is a comparison of the lung clearances of 203Pb-labeled aerosols of different chemical composition and DMED. Clearances are very similar up to 10 hours, with some differences (related to aerosol type) appearing at later times. By 30 hours, less than 10% of all aerosols but the carbonaceous exhaust remained in the lung.
Figure 20 shows the appearance of 203Pb in the blood during clearance of several of the aerosols (see Figure 19) from the lung. These data are compared with the results of an earlier study (the curve labeled "1973/74 exhaust aerosols"), and with the percentage of a single dose of PbCl2 in saline remaining in the blood at different times after intravenous injection. The similarity of the peak percentages, achieved at about 30 hours after either intravenous injection or cessation of inhalation exposure, was used by the authors to support their conclusion that much of the ~30% of 203Pb originally deposited in the lungs but subsequently unaccounted for had in fact been absorbed into the systemic circulation before distribution into peripheral tissues.
Finally, Figure 21 illustrates the lack of dependence of lung clearance on the amount of lead deposited in the lungs. For comparison, the authors noted that continuous human exposure to 1 µg Pb/m3 with 50% retention would lead to daily deposition in the lungs of 8 µg of lead. In this study, the percentage of deposited lead found in the blood 48 hours later was only marginally affected (not significant at the 10% level) by the amount originally deposited.
-103-
-------
[Figure: percent of deposit remaining in the lung vs. hours elapsed for six aerosols: curve 1, lead nitrate; 2, wind tunnel exhaust (0.02 µm); 3, clean exhaust; 4, UV-exposed exhaust (0.5 µm); 5, lead oxide; 6, carbonaceous exhaust.]
FIGURE 19
Comparison of the Lung Clearance of Various 203Pb-Labeled Aerosols
Source: Chamberlain et al., 1978 (Copyright permission granted)
-104-
-------
[Figure: 203Pb in blood (percent of intake) vs. time after intake (1 to 100 hours).]
FIGURE 20
Levels of 203Pb in Blood Following Inhalation of Exhaust, Oxide or Nitrate Aerosols, or Intravenous Injection of PbCl2
Source: Chamberlain et al., 1978 (Copyright permission granted)
-105-
-------
[Figure: percent of deposit remaining in the lung vs. hours elapsed (0.1 to 100) for three levels of stable lead deposition: aggregated (box) exhaust aerosol, PbO, and the mean curve for wind tunnel exhaust aerosol, with the µg of Pb deposited in the lung indicated for each.]
FIGURE 21
Lung Clearance at Different Levels of Stable Lead Deposition
Source: Chamberlain et al., 1978 (Copyright permission granted)
-------
DR. JULIAN ANDELMAN
The principal question addressed by Dr. Withey in his presentation was the extent to which human or animal data on the effects or uptakes via one route of exposure can be extrapolated to another route. One example mentioned was the Stokinger-Woodward approach, in which ACGIH TLVs for air exposures have been used to derive similar limitations for oral route uptake via drinking water. Dr. Withey discussed the impact of dose frequency and of absorption and pharmacokinetic phenomena on body burdens. He discussed a paper by Perucca et al. (1978) to evaluate the equivalency of oral versus intravenous dosing.
The short-term (repeated dose) equivalencies and fluctuations are of interest and importance, although perhaps more from the point of view of simply demonstrating that routes can be compared. There are instances where an exposure of hours or less could be of importance, such as a tank car spill, but a more important need is probably to consider substantially longer exposures, as in Withey's Case 3 example, at least when attempting to assess the impacts at hazardous waste sites. Fluctuations around a constant exposure should, however, not be set aside entirely, as they could, for example, be important for children playing periodically in the vicinity of a chemical dump site. Nevertheless, the discussion below will focus primarily on comparisons of routes in the steady state; that is, when the exposure is sufficiently long that the uptake, metabolism, excretion and other kinetic phenomena generally occur sufficiently fast compared to the exposure period. The steady state is simply defined here as the achievement of a relatively constant body burden (e.g., blood concentration) and an equal rate of uptake and elimination.
-107-
-------
The important absorption, excretion and pharmacokinetic issues relevant to extrapolation from one route of exposure to another, particularly inhalation to oral or vice versa, can be stated as follows:
1. Is a single- or two-compartment model with zero- or first-order absorption kinetics and first-order metabolism and excretion valid? What are the implications of such a model?
2. Can LD50/LC50 ratios be used to estimate other relative effects for air and water at much lower exposure concentrations; that is, do the kinetics change at lower exposures?
3. Can homologous series or other chemical comparisons be used to estimate blood levels (and, hence, effects) by different routes of exposure?
4. Are "first pass" effects important in long-term, steady-state systems?
5. If a metabolite is the active toxic agent, will a proportionality of dose and blood concentration be obtained?
All these issues relate to the proportionality between doses via different routes and the predictability of effects by comparison of routes of uptake. In order to examine these questions it is useful to consider the relevant aspects of the model, with particular emphasis on the single-compartment model. This will be considered for two types of uptake.
Zero-Order Uptake
Here, zero-order uptake will be defined as uptake that is rapid, complete, and independent of concentration, i.e., the total amount of chemical inhaled or ingested is absorbed across the alveolar membrane or GI mucosa. In this case, nevertheless, the daily absorbed amount is still proportional to the pollutant concentration in the air or water, simply because the amount inhaled or ingested is the product of concentration and volume intake, typically about 20 m3 air/day for a 70 kg adult male, or 2 ℓ of drinking
-108-
-------
water per day. These analytical relationships for the single-compartment model with such a zero-order uptake are shown in Table 6. The model assumes that metabolism and excretion are first order with respect to the blood concentration. As shown in Table 6, in the steady state for either air or GI tract absorption, the blood concentration is proportional to the concentration of the chemical in the air or water, respectively. At the same time, if one wishes to compare the steady state blood concentration due to lung absorption versus GI tract absorption, these are proportional to the relative concentrations in air and water. The implication of this model is that for complete absorption by the two routes, the biological effects associated with blood concentrations are proportional to the intake quantities.
First-Order Uptake
The steady state, single-compartment model considered in Table 7 assumes that the uptake across the alveolar membrane or GI mucosa corresponds to first-order kinetics; that is, the rate of uptake is proportional to the concentration in the air or water. It differs, however, from the zero-order situation in that now the proportionality constant is a true rate constant, unknown a priori, and not simply equal to the daily volume of air or water ingested. In addition to absorption, the model has also incorporated a first-order desorption from the alveolar membrane.
For the steady state via the lung route, the blood concentration is proportional to the air concentration. One important point that derives from the model is that the proportionality constant is not simply equal to that predicted by Henry's Law. Rather, it predicts a lower steady state concentration in the blood than would be expected at a true equilibrium. This is a consequence of the fact that there are other pathways (metabolism and excretion) in addition to simple desorption back into the lungs via the alveolar membrane.
-109-
-------
TABLE 6
Single Compartment Model - Zero-Order Uptake

1. VIA LUNG
The daily uptake is proportional to the air concentration:
     Ia = Ca x Vda
The rate equation for the daily change in blood concentration is:
     dCb/dt = Ia/Vb - km x Cb - ke x Cb
In the steady state dCb/dt is zero. Therefore,
     Cb = (Vda/Vb) x Ca/(km + ke)

2. GI TRACT
In the steady state one obtains a similar relationship:
     Cb = (Vdw/Vb) x Cw/(km + ke)

3. COMPARING LUNG AND GI ABSORPTION
To obtain the blood concentration ratio via the air and water routes, one simply divides the final equations in 1 and 2:
     Cb-air/Cb-water = (Vda x Ca)/(Vdw x Cw)

Definitions:
     Cb  = the blood concentration of an absorbed chemical
     Ca  = air concentration
     Cw  = water concentration
     Vda = daily volume of air intake
     Vdw = daily volume of water intake
     Vb  = blood volume
     I   = daily mass intake of a chemical via the air route (Ia) or water (Iw)
     k   = first-order rate constant for absorption (ka), excretion (ke), or metabolism (km). In the case of absorption one can have either absorption by the lung (ka-L) or the GI tract (ka-GI).
     kd  = first-order desorption constant, usually from the lung and relevant to the model for first-order uptake via the lung.
-110-
-------
TABLE 7
Single Compartment Model - First-Order Uptake*

1. VIA LUNG
Following the approach shown in Table 6:
     dCb/dt = ka-L x Ca - kd x Cb - km x Cb - ke x Cb
In the steady state dCb/dt = 0. Therefore,
     Cb = ka-L x Ca/(kd + km + ke)

2. GI TRACT
By direct analogy, one can also show:
     Cb = ka-GI x Cw/(kd + km + ke)

3. COMPARING LUNG AND GI ABSORPTION
To obtain the blood concentration ratio via the air and water routes, one simply divides the final equations in 1 and 2:
     Cb-air/Cb-water = (ka-L/ka-GI) x (Ca/Cw)

*See Table 6 for definitions of terms
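As a quick numerical check of the two steady-state expressions in Tables 6 and 7, the short sketch below evaluates both; every parameter value in it is a hypothetical placeholder chosen for illustration, not a value taken from this report.

```python
# Steady-state blood concentrations under the Table 6 and Table 7 models.
# All numbers and units are illustrative placeholders.
V_da, V_dw, V_b = 20.0, 2.0, 5.0   # daily air intake (m3), daily water intake (L), blood volume (L)
km, ke = 0.3, 0.2                  # first-order metabolism and excretion constants (day^-1)
Ca, Cw = 0.05, 1.0                 # air (mg/m3) and water (mg/L) concentrations

# Table 6: zero-order (complete) uptake
Cb_air = (V_da / V_b) * Ca / (km + ke)
Cb_water = (V_dw / V_b) * Cw / (km + ke)
print(Cb_air / Cb_water)           # (V_da*Ca)/(V_dw*Cw) = 0.5

# Table 7: first-order uptake; ka_L and ka_GI cannot be known a priori
ka_L, ka_GI = 0.8, 0.4             # assumed first-order uptake constants (day^-1)
print((ka_L / ka_GI) * (Ca / Cw))  # blood-concentration ratio, air vs. water route = 0.1
```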
-111-
-------
As in the zero-order model, the GI steady state system has the same form as that for air uptake. This then leads to a comparison of blood concentrations for these two routes, as shown in Table 7, which indicates that their relative blood concentrations are proportional to the ratio of concentrations in the air and water. This is the same result as in the zero-order case. However, the proportionality constant is a true ratio of first-order rate constants and cannot be predicted a priori.
Implications and Conclusions
Both the zero- and first-order absorption kinetics indicate a proportionality between blood concentrations and dose in the steady state. This in turn implies a proportionality between air and water concentrations that should remain constant among effects. However, if the absorption kinetics are first rather than zero order, this proportionality is not predictable a priori. As discussed by Dr. Pepelko in the following section on multiple route exposure, an article by Pozzani et al. (1959) compared inhalation-route LC50 values to oral LD50 values for a strain of rat. These values were not derived from long-term studies, so there is some doubt whether steady state was attained in each case. Nevertheless, about half these chemicals had relatively equal values of LC50/LD50 and four -- all chlorinated chemicals -- had substantially lower values. Such data indicate that there would be some value in compiling and analyzing similar data, so as to be able to test further the proportionality of doses by these two routes, as well as to establish whether there is any systematic impact of chemical type. It would be of particular interest to see if the proportionality of dose would hold for different effects, particularly those that might be manifested at substantially different dose levels.
-112-
-------
Another set of related concerns involves the question of "first-pass" effects and what impact there is on the kinetic relationships if a metabolite is the active toxicant. If the chemical of primary exposure is also the toxicant, it need not follow in chronic exposure that a first-pass metabolic effect eliminates the concentration in the blood completely. That is, there may be recirculation at a constant concentration, and the kinetic treatment in Table 6 deals with such metabolism. Similarly, an active metabolite might also circulate at constant concentration. Thus, in both these instances, the blood concentration and, hence, the toxic effect could be proportional to the concentration of the chemical in the water or air to which there is exposure.
Conclusions
1. Both zero- and first-order absorption imply proportionality between intake concentration and blood levels.
2. For zero-order absorption, dose estimation is "simple."
3. For first-order absorption, relative route dose estimation requires either experimental kinetic data or can be deduced from known toxicological data (e.g., LC50 vs. LD50).
4. A Henry's Law estimate of blood concentrations in equilibrium with air concentrations may be invalid in the steady state (too high a value of Cb).
5. If metabolism follows first-order kinetics, conclusion (1) holds. Thus, "first-pass" effects need only address the location of the target site, not the question of proportionality. In long-term steady state, this issue may be unimportant.
-113-
-------
DISCUSSION
DR. MYRON MEHLMAN
This approach is essential for modeling. However, in the absence of data it will remain very limited for a long time to come.
DR. MAGNUS PISCATOR
Concerning Dr. Withey's discussion of the limitations of the Stokinger-Woodward method at the beginning of his presentation, it should be added that TLVs are not supposed to protect every worker. There is generally no safety factor in a TLV. Susceptible individuals are not protected.
DR. SHELDON MURPHY
Although I view the issue of this conference as one of assessing the potential for toxic interactions resulting from exposure to a mixture of chemicals, I do not believe we will ever determine ADIs of mixtures. Instead we will need to know how, and how much, one or more chemicals will affect the toxicity of the most biologically reactive or predominant-quantity chemicals in the waste dumps and/or the surrounding environment. Therefore, if we know the mechanism of action, the routes and mechanisms of metabolism, and the rates and routes of absorption and excretion of individual chemicals, we may be able to use rate constants, affinity constants, and potency of intrinsic activity at sites of injury for individual chemicals to predict the likelihood of more (or less) hazard in the presence of other chemicals. We should strive to obtain or encourage acquisition of appropriate basic data. In the meantime, EPA will have to work with the data available. For many of the common substances in waste dumps, there may in fact be more data in the published literature on these basic characteristics than on the actual toxicity of interactions, and therefore some theoretical predictions of added hazard from combinations may currently be possible.
-114-
-------
As has been encouraged at this workshop, EPA should strive to use all available information to attempt an integrated assessment of hazard.
DR. RICHARD KOCIBA
The inherent limitations of the Stokinger-Woodward basis for extrapolation were clearly identified in the discussion and should be accommodated. Any regulatory plan should have the flexibility to utilize pharmacokinetic data when they are available, because the most valid estimate for interspecies extrapolation is based on the concentrations at the critical target tissue, rather than on the concentration in ambient air or other bases.
DR. HARRY SKALSKY
The Stokinger-Woodward model has served as a necessary approximation for route-to-route extrapolation, but as Dr. Withey suggests, it is time to become more sophisticated. Indeed, a goal should be set to establish models that can extrapolate on the basis of the concentration of toxicant at the critical target tissue.
GENERAL COMMENTS
Pharmacokinetic models may fit the data for one species, but the variations can be very large among different species.
The number of apparent compartments required for adequate model fitting varies depending upon the dose at which the compound is administered. For low doses, one-compartment models may be appropriate. At the highest tolerable dose, a three-compartment model may be appropriate.
The continued use of the Stokinger-Woodward model to derive limits on uptake from water from TLVs for air exposure should be questioned. TLVs may be too high when applied to the general population, since TLVs are defined as the level which will not cause some degree of harm to the working population.
It should be noted that Dr. Stokinger did not expect his paper to be taken so seriously; the model was only suggested for an emergency situation.
-115-
-------
REFERENCES
Barr, W.H. 1968. Principles of biopharmaceutics. Am. J. Pharm. Educ. 32: 958-981.
Chamberlain, A.C., M.J. Heard, P. Little, D. Newton, A.C. Wells and R.D. Wiffen. 1978. Investigations into Lead from Motor Vehicles. Report No. AERE-9198. Environmental and Medical Sciences Division, AERE, Harwell, England.
Federal Register. 1979. U.S. Environmental Protection Agency, Water Quality Criteria, Requests for comments, Part V. 44(52): 15926-15981.
Perucca, E., G. Gatti, C.M. Frigo and A. Crema. 1978. Pharmacokinetics of valproic acid after oral and intravenous administration. Br. J. Pharmacol. 5: 313-318.
Pozzani, U.C., C.S. Weil and C.P. Carpenter. 1959. The toxicological basis of threshold limit values: 5. The experimental inhalation of vapor mixtures by rats, with notes upon the relationship between single dose inhalation and single dose oral data. Ind. Hyg. J. 20: 364-369.
Stokinger, H.E. and R.L. Woodward. 1958. Toxicologic methods for establishing drinking water standards. J. Am. Water Works Assoc. 50: 515-529.
-116-
-------
Withey, J.R. 1976. Pharmacodynamics and uptake of vinyl chloride monomer administered by various routes to rats. J. Toxicol. Environ. Health. 1: 381-394.
Withey, J.R. 1983. Approaches to route extrapolation, Chapter 15. In: Principles for the Evaluation of Toxic Hazards to Human Health, R.G. Tardiff and J.V. Rodricks, Ed. Plenum Publications Inc., New York, NY. (In press)
-117-
-------
SYSTEMIC TOXICANTS
Multiple Route Exposures
Presentation: Dr. William Pepelko
ECAO, OHEA, U.S. EPA
Critique: Dr. Myron Mehlman
Mobil Oil Corporation
Critique: Dr. James Melius
National Institute for Occupational
Safety and Health
-118-
-------
PRESENTATION
DR. WILLIAM PEPELKO: MULTIROUTE EXPOSURE
Present Approach
Multiroute exposure is considered in the "Guidelines for Deriving Water Quality Criteria for the Protection of Aquatic Life and Its Uses" (45 FR 79354). For noncarcinogens, ADIs and criteria are calculated from total exposure data that include contributions from the diet and air. The criterion [C] for noncarcinogens is calculated using the following equation:

     C = [ADI - (DT + IN)] / [2 ℓ + (0.0065 kg x R)]

where 2 ℓ is the assumed daily water consumption, 0.0065 kg is the assumed daily fish consumption, R is bioconcentration in units of ℓ/kg, DT is the estimated non-fish dietary intake, and IN is the estimated daily intake by inhalation.
If experimental data are not available to estimate IN, then assumptions concerning ambient air concentrations, absorption percentage, etc. are made. The volume of air respired can be estimated using standard equations. In the case of the rat, for example, the number of cubic meters breathed per day is 0.105 [W/0.113]^(2/3), where W = body weight in kg.
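A hedged numerical illustration of the criterion equation follows; the ADI, dietary, inhalation, and bioconcentration values are placeholders chosen only to show the arithmetic, not values for any particular chemical.

```python
# Worked example of the noncarcinogen criterion equation (hypothetical inputs).
ADI = 1.0    # mg/day acceptable daily intake
DT = 0.1     # mg/day estimated non-fish dietary intake
IN = 0.05    # mg/day estimated daily intake by inhalation
R = 10.0     # bioconcentration factor (L/kg)

C = (ADI - (DT + IN)) / (2.0 + 0.0065 * R)   # 2 L water/day + 0.0065 kg fish/day
print(C)                                     # ~0.41 mg/L for these inputs
```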
Possible Approach
Any method used, in order to be valid, must estimate the concentration of the test substance at the critical target organ. Assuming that the site of exposure is not the primary target organ, the relative contributions from each source may be summed. A summation of intake from each route, however, may lead to serious errors, since the effectiveness of a chemical can vary considerably depending upon route of exposure, even with absorption of equivalent amounts. For example, some chemicals such as lidocaine are removed from the circulation on the first passage through the liver, thus rendering them ineffective via the oral route.
-119-
-------
When adverse effects are not essentially at the site of entry and residence time in the body is several hours or more, then route equivalence can be assumed to be based on expected blood concentrations. The net dose (dn) for a given species is then the sum of the absorbed doses for the various routes:

     dn = d1r1 + d2r2 + ... + di ri

where r is the absorption fraction and 1, 2, ..., i refer to the various routes of exposure. Human exposures are generally limited to oral, inhalation and occasionally dermal exposures. Experimental animals, however, may be exposed via intraperitoneal, intravenous, subcutaneous and other routes. An assessment based on an animal ADI, for example, then assumes that acceptable exposure levels are those in which the ADI is greater than dn.
In the case of chemicals that are rapidly removed by the liver or rapidly metabolized by peripheral tissues (cyanide, for example), the concentration at the critical target organ may vary depending on the route of entry. Then the dose for each route must be related to the acceptable intake for that route, and the resulting scaled dose values summed to determine acceptable exposure. For the oral route the dose value would equal the daily dose d1 divided by the oral ADI. For inhalation it would equal d2/(8-hour TLV), where d2 equals the concentration in mg/m3 multiplied by the hours exposed per day and divided by 8. If necessary, a dermal exposure value could be calculated similarly. Acceptable exposure levels would be those in which d1/ADI + d2/TLV is less than one.
Finally, in the case where the target organ differs depending upon route of exposure, the doses are not summed and the treatment is the same as that for two different chemicals.
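The route-specific scaling just described amounts to summing dose-to-limit quotients, as in the short sketch below; the doses and limits shown are hypothetical.

```python
# Sum of route-scaled doses: acceptable when the total is below one.
d1 = 0.3          # oral dose (mg/kg/day), hypothetical
d2 = 2.0          # inhalation concentration scaled to an 8-hour basis (mg/m3), hypothetical
ADI_oral = 1.0    # oral ADI (mg/kg/day), hypothetical
TLV_8h = 10.0     # 8-hour TLV (mg/m3), hypothetical

index = d1 / ADI_oral + d2 / TLV_8h
print(index, index < 1.0)   # 0.5 True -> acceptable under this scheme
```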
-120-
-------
Issues
Some chemicals such as lidocaine are completely removed by the liver in a single passage. Thus, even with 100% absorption from the GI tract, none will reach the peripheral circulation. With inhalation or dermal exposure, peripheral regions will be exposed. Thus the extent of liver removal or alteration must be considered.
A test chemical with a short half-life may reach a steady state with continuous inhalation. Since intake via the oral route is likely to occur at irregular intervals, a steady state is less likely to be achieved. A single oral dose may also result in a greater peak exposure at the target organ than the same total dose via inhalation. In the case of many noncumulative poisons such as cyanide, the peak concentration is more important than the total dose.
In some cases the site of exposure is a primary target organ. This is often true with oral or dermal exposure. Should a separate ADI be estimated for each route? If so, should we sum d1/ADI1 + d2/ADI2, and is it a serious problem if the sum is greater than 1?
A large portion of inhaled particulate matter is deposited in the conducting airways, carried upward via the mucociliary ladder and swallowed. Thus a significant portion of inhaled particles, or of chemicals adsorbed on particles, may enter the body via the digestive tract. Methodology must be modified to account for this.
Relative Effectiveness of Oral vs. Inhalation Exposures
The data from a study by Pozzani et al. (1959) were recalculated in order to show the variation in the effectiveness of oral vs. inhalation exposure for a series of organic chemicals. Since body weights were not given, it was assumed that the rats weighed 0.33 kg. Respiratory minute volume was estimated using the equation V = 0.105 (W/0.113)^(2/3) m3/day. A 0.33 kg rat would thus inhale 72 ℓ in 8 hours. The absorption percentage was also not given and therefore not considered. It is likely, however, that if the absorption rate is high by one route it would also be high by the other. The data are presented in Table 8.
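The 72 ℓ figure quoted above can be checked directly from the allometric equation; the arithmetic below simply re-evaluates it.

```python
# Check of the respiratory-volume estimate for a 0.33 kg rat.
W = 0.33                                      # body weight (kg)
V_day = 0.105 * (W / 0.113) ** (2.0 / 3.0)    # m3 of air breathed per day
V_8h_litres = V_day * (8.0 / 24.0) * 1000.0   # litres breathed in 8 hours
print(V_day, V_8h_litres)                     # ~0.21 m3/day, ~72 L in 8 hours
```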
As can be seen, the mean ratio of LD50/LC50 was 1.8. The largest ratio, however, was about 21 times greater than the smallest ratio. This was most likely due to several factors. A mean ratio of 1.8 suggests that most chemicals were absorbed more effectively from the lung. This is not unexpected, since the human lung has a surface area of about 70 m2. Liver detoxification may have decreased the relative effectiveness of some chemicals via the oral route. Finally, for chemicals with a very short half-life, a steady state may be reached via inhalation but not via the oral route. However, a higher peak level may be achieved via the oral route.
Thus, while the relative effectiveness of the two routes of exposure varied by a factor of two or less for about two-thirds of the chemicals, a better knowledge of pharmacokinetics is required before doses from the two routes can be accurately combined.
-122-
-------
TABLE 8
A Comparison of Lethal Oral Versus Inhalation Doses for a Series of Organic Chemicals*

Chemical                    8-hour LC50 (g/rat)   Oral LD50 (g/rat)   Ratio LD50/LC50
Acetone                            3.61                 4.20               1.16
n-Butyl acetate                    2.93                 4.97               1.69
Butyl alcohol                      2.12                 2.05               0.97
Butyl cellosolve                   0.20                 0.94               4.70
Cellosolve                         0.53                 2.05               3.86
Cellosolve acetate                 0.87                 2.57               2.95
Ethylene dichloride                0.29                 0.21               0.72
Isobutanol                         1.33                 2.05               1.54
Isopropanol                        1.99                 3.57               1.79
Isopropyl acetate                  3.64                 5.73               1.57
Methanol                           4.27                 5.93               1.39
Methyl ethyl ketone                1.69                 2.29               1.39
Methyl isobutyl ketone             0.84                 1.87               2.23
Perchloroethylene                  2.46                 0.54               0.22
Propylene dichloride               1.01                 0.56               0.58
1,1,2-Trichloroethane              0.39                 0.19               0.49
                                                                           1.81**

*Source: Pozzani et al., 1959 (Copyright permission granted)
**Average ratio
-123-
-------
CRITIQUES
DR. MYRON MEHLMAN
The assumptions involved in setting criteria levels are extremely soft. In most cases, the levels are significantly below natural background levels for known, naturally occurring substances such as benzene. These assumptions need to be reexamined.
The availability of sufficient data to estimate multichemical exposure by various routes is questionable. Some of the parameters which must be considered are:
Metabolic fate
Proliferative activity of target tissue at time of exposure
Interaction with DNA or protein
DNA repair
Dose-time effect
Age-associated decline in enzyme and hormone activity
Age-associated decline of proliferative activity of epithelial cells in intestines and kidneys
Age-associated disturbances in ability to repair
For exposure to single chemicals, exposure routes may be used interchangeably. For exposure to mixtures this is not true. Examples of the effects produced by exposure to industrial chemicals by different routes of exposure are shown in Tables 9 and 10. A comparison of the blood concentrations resulting from exposure to complex hydrocarbon mixtures by various routes is shown in Figures 22 and 23.
DR. JAMES MELIUS
It is important to differentiate between occupationally- and environmentally-derived criteria (i.e., TLV vs. ADI). For most chemicals the environmentally-derived criteria will be much lower than the occupational
-------
TABLE 9
Percent Rats with Tumors Following Vinyl Chloride Exposure*

                                            Percent Rats with Tumors in:
Daily Dose (mg/kg)   Duration     Zymbal Gland   Mammary   Liver   Kidney   Thymus   Abdomen   Other
ORAL
50.0                 120 weeks          1            6        20       2        1        1        37
16.6                 120 weeks          2            7        11       4        0        0        25
3.3                  120 weeks          0            5         0       0        0        1        10
0.0                  120 weeks          1            5         0       0        0        0        10
INHALATION
75.0 (25 ppm)         87 weeks          3           12         4       0       --       --         6
30.0 (10 ppm)         87 weeks          4           12         0       0       --       --         4
3.0 (1 ppm)           87 weeks          0            9         0       0       --       --         3
0.0                   87 weeks          0            2         0       0       --       --         3

*Source: Maltoni, 1977 (Copyright permission granted)
-------
TABLE 10
Percent Rodents with Tumors Following Benzene Exposure

                                                                  Percent Rodents with:
Rodent   Daily Dose (mg/kg)   Duration                     Zymbal Gland   Hemolymphoreticular   Reference
                                                           Carcinomas     Neoplasia
INHALATION
Mouse    360 (300 ppm)        6 hours/day, 5 days/week,        --             20.0              Snyder et al., 1980
                              for lifetime
Mouse    0 (control)          --                               --              5.0              Snyder et al., 1980
Rat      360 (300 ppm)        7 hours/day, 5 days/week,        5.8             --               Maltoni, 1982
                              for 85 weeks
Rat      0 (control)          --                               0.3             --               Maltoni, 1982
ORAL
Rat      250                  daily, 4-5 days/week,           12.3             7.7              Maltoni and Scarnato, 1979
                              for 52 weeks
Rat      50                   daily, 4-5 days/week,            6.9             3.4              Maltoni and Scarnato, 1979
                              for 52 weeks
Rat      0 (control)          --                               0.0             1.7              Maltoni and Scarnato, 1979
-------
ABBREVIATIONS: T, toluene; EB, ethylbenzene; PMX, p- and m-xylenes; OX, o-xylene; ET, m-ethyltoluene; TMB, 1,2,4-trimethylbenzene; D, durene; ID, isodurene.
[Figure: four bar-chart panels for the components listed above: (A) liquid aromatic components present in the complex mixture; (B) blood concentrations (µg/mℓ) after intravenous administration (150 mg/kg); (C) after oral administration (1 g/kg); (D) after intraperitoneal administration.]
FIGURE 22
Comparison of Blood Concentrations Resulting from Intravenous, Oral and Intraperitoneal Exposure to a Complex Hydrocarbon Mixture.
-------
[Figure: blood concentration (µg/mℓ) vs. time (hr) in four panels: i.v. dose of 12 mg/kg of p+m-xylenes (Cmax 25 µg/mℓ); oral dose of ~20 mg/kg of p+m-xylenes, assuming 25% oral absorption of an 80 mg/kg dose; static inhalation, 1-hour exposure, ~1.0 mg/kg; dermal (non-occluded), 80 mg/kg applied, no detectable levels found in blood.]
FIGURE 23
Blood concentration vs. time curves of p- and m-xylenes after various routes of administration of a hydrocarbon mixture (p- and m-xylenes constitute 8 wt. % of total hydrocarbon content).
-128-
-------
criteria. For example, for chlorinated solvents, the occupational inhalation TLV allows a 1000-fold greater exposure than the environmental drinking water criteria. These differences reflect the historical and scientific bases for these criteria.
At dump sites and in other hazardous waste situations, skin absorption is an important source of exposure. However, we have very little quantitative information on the amount of skin absorption and almost no criteria to determine if environmental surface levels are excessive. For example, NIOSH has investigated several situations involving electrical equipment failures where surface contamination with PCBs, dibenzofurans and dioxins was found. There are presently no criteria to evaluate the significance of these levels and to determine "safe" levels for cleanup purposes.
At a given site, different routes of exposure may vary in importance depending on the population of concern. For individuals involved in site investigation or response, air exposure will be a primary concern; for the surrounding community, water exposure may be more important. For each chemical at a waste site, a single route of exposure would be most important, depending on the properties of the chemical and the characteristics of the waste site.
-129-
-------
DISCUSSION
DR. RICHARD KOCIBA
Each dump site should be evaluated on a case-by-case basis to address the specific issues regarding multiple routes of exposure, interactions of component materials and the relative roles that inhalation, oral and dermal exposure play in the actual situation. One cannot generalize as to possible synergism, antagonism or additivity of multiple chemicals.
GENERAL COMMENTS
It is unknown whether oral/inhalation dose-response curves are parallel.
Vehicle effects may be important in exposure studies and must be considered.
Nasal irritants need to be considered when discussing inhalation.
If the critical target organ is the point of entry, the intake from two routes of exposure would not be added.
Studies have been done on mixtures and dose-response curves have been developed.
Two questions were not addressed: Can routes of exposure be considered separately and then combined additively or in an equitoxic fashion? Does exposure by one route alter the toxic dose response by another route?
-130-
-------
REFERENCES
Maltoni, C. 1977. Origins of human cancer. In: Proceedings of a Conference on Cell Proliferation. Cold Spring Harbor Laboratory, Cold Spring Harbor, NY. p. 119-146.
Maltoni, C. 1982. Myths and facts in the history of benzene carcinogenicity. Adv. Mod. Env. Toxicol. 4: 1.
Maltoni, C. and C. Scarnato. 1979. First experimental demonstration of the carcinogenic effect of benzene: Long-term bioassay on Sprague-Dawley rats by oral administration. Med. Lav. 70: B-52.
Pozzani, U.C., C.S. Weil and C.P. Carpenter. 1959. The toxicological basis of threshold limit values: 5. The experimental inhalation of vapor mixtures by rats, with notes upon the relationship between single dose inhalation and single dose oral data. Ind. Hyg. J. 20: 364-369.
Snyder, C.A., et al. 1980. The inhalation toxicity of benzene: Incidence of hematopoietic neoplasms and hematotoxicity in AKR/J and C57B1/6J mice. Toxicol. Appl. Pharmacol. 54: 323.
-131-
-------
SYSTEMIC TOXICANTS
The Impact of Carcinogens in Risk Assessment of Chemical Mixtures
Presentation:   Dr. Robert McGaughy
                Cancer Assessment Group, U.S. EPA
Presentation:   Dr. Roy Albert
                New York University Medical Center
                Cancer Assessment Group, U.S. EPA
Presentation:   Dr. Debdas Mukerjee
                ECAO, OHEA, U.S. EPA
-132-
-------
PRESENTATIONS
DR. ROBERT McGAUGHY: HOW DOES THE CAG ASSESS QUANTITATIVE RISKS?
Risk assessments generated by the EPA Carcinogen Assessment Group
(CAG) are intended to provide a rough estimate of the seriousness of an
exposure situation. CAG estimates are not predictions and they do not
provide an estimate of absolute risks. Rather, they serve to indicate which
situations have negligible hazard and they provide a convenient framework
for summarizing relevant facts and identifying research needs. They do not
reflect the strength of the evidence.
In the absence of information, CAG makes the following assumptions in
assessing quantitative risks (a notational sketch of assumptions 2, 4, 6 and
7 follows the list):
1. Lifetime incidence in humans is the same as in animals receiving an
equivalent dose.
2. Dose in mg/surface area is equivalent between species.
3. Humans are as sensitive as the most sensitive animal species.
4. A linear, no-threshold model is the upper-limit response at low doses.
No estimate is made if the mechanism is known to be non-linear.
5. Lifetime incidence is proportional to the total lifetime dose received,
averaged on a daily basis.
6. If an experiment is terminated early, lifetime incidence is estimated
assuming that cumulative incidence increases as the third power of age.
7. The linearized multistage model is appropriate for extrapolation, and the
upper 95% confidence limit of the linear term is appropriate for
expressing the upper bound of potency.
8. The upper-bound risk is less plausible if there is no evidence of
mutagenicity.
9. Human data are preferable to animal data as the basis for risk estimates.
10. Negative human data, if available, can be used to give an upper limit of
risk.
11. For human data, the method of analysis is tailored to the completeness
and quality of the data available. A model which is linear at low dose is
used for extrapolation.
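Several of these assumptions can be stated compactly. The following is a
minimal notational sketch, in our own notation rather than the presenter's,
of assumptions 2, 4, 6 and 7:

    % Linearized multistage model (assumptions 4 and 7)
    P(d) = 1 - \exp\left[-\left(q_0 + q_1 d + q_2 d^2 + \cdots + q_k d^k\right)\right], \qquad q_i \ge 0
    % Extra risk over background is approximately linear at low dose
    A(d) = \frac{P(d) - P(0)}{1 - P(0)} \approx q_1 d \quad (d \text{ small}); \qquad
    \text{potency} \equiv q_1^{*} = \text{upper 95\% confidence limit on } q_1
    % Surface-area equivalence between species (assumption 2), doses in mg/kg/day
    d_{\mathrm{human}} = d_{\mathrm{animal}}\left(\frac{W_{\mathrm{animal}}}{W_{\mathrm{human}}}\right)^{1/3}
    % Early termination at age L_e out of a lifespan L (assumption 6)
    \text{lifetime incidence} \approx \text{observed incidence} \times \left(\frac{L}{L_e}\right)^{3}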
-133-
-------
DR. ROY ALBERT: POLICY INITIATIVE
The purpose of this presentation is to describe a policy initiative
launched by CAG and ECAO-CIN. The policy initiative deals with two matters:
1) the adoption of the International Agency for Research on Cancer (IARC)
scheme for stratification of the weight of evidence for carcinogenicity, and
2) the use of mutagenicity data in the approach to quantitative risk assess-
ment, with special reference to water quality criteria.
The IARC Stratification Scheme
For the first 5 years after the adoption of the EPA guidelines for
carcinogen risk assessment in May 1976, there was a strong impetus toward
the regulation of any agent that showed even relatively modest evidence of
carcinogenicity. The CAG had no formalized nomenclature for describing the
weight of evidence and most frequently used the term "substantial" to
characterize evidence that was more than marginal. However, within recent
years, there has been increasing resistance to regulation, and, consequent-
ly, a greater need for a stratification of the weight of evidence for carci-
nogenicity. In examining the various options it was apparent that, although
a number of schemes for describing the weight of evidence existed, only one
had international acceptance and substantial usage: the approach developed
and used by IARC for the evaluation of close to 200 compounds. For this
reason, it seemed appropriate for EPA to adopt the IARC scheme for
stratifying the weight of evidence.
The proposal to adopt the IARC scheme was circulated within the EPA and
to 22 of the country's outstanding experts in the field of oncology. All
the outside experts who reviewed the IARC scheme were favorable to its
adoption. There was also strong sentiment for retaining the original char-
acterization of the "limited" category of evidence for situations where no
-134-
-------
decision could be made on whether the agent was a carcinogen. One reviewer
commented that the distinction between the "sufficient" and "limited"
categories should be sharpened and their regulatory significance clarified.
Another commenter had reservations about limiting "sufficient" evidence to
malignant neoplasms. Considering the response within and outside the Agency
to the proposed use of the IARC scheme, and also the response of the partici-
pants at other ECAO workshops, it was apparent that adoption of the IARC
stratification scheme would receive wide acceptance in the scientific
community.
The Use of Mutagenicity Data and Quantitative Risk Assessment
The second of the two policy initiatives was to limit the use of the
linear extrapolation model to agents that show evidence of being mutagenic.
In terms of the water quality criteria, the proposal presented a range of
water concentrations: the lower limit was based on the use of the linear
extrapolation model at a risk level of 10^-5, and an upper concentration
limit was based on a modified NOEL approach (a safety factor of 1000 applied
to a 10% response). Guidance for each compound was given as to where in the
concentration range the actual standard should be sought. The response of
the outside experts was mixed. Some had very serious reservations about the
approach, primarily on the grounds that not enough was known about the
differences in the mechanism of action of mutagenic and nonmutagenic
carcinogens. However, a substantial number of the experts supported the
proposed approach.
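To make the structure of the proposed range concrete, the short sketch below
computes the two ends of such a water-concentration range under entirely
hypothetical inputs (the potency slope and the 10% response concentration
are illustrative values, not figures from the presentation):

    # Sketch of the two ends of the proposed water-concentration range.
    # All numerical inputs are hypothetical.

    def linear_lower_limit(potency_per_mg_per_L, target_risk=1e-5):
        """Concentration (mg/L) at which a linear, no-threshold model
        predicts the target lifetime risk: risk = potency x concentration."""
        return target_risk / potency_per_mg_per_L

    def noel_upper_limit(conc_at_10pct_response_mg_per_L, safety_factor=1000.0):
        """Modified NOEL approach: the concentration associated with a ~10%
        response, divided by a 1000-fold safety factor."""
        return conc_at_10pct_response_mg_per_L / safety_factor

    potency = 2.0e-2   # hypothetical lifetime risk per mg/L (linear slope)
    c10 = 50.0         # hypothetical concentration (mg/L) giving ~10% response

    low = linear_lower_limit(potency)    # 5.0e-04 mg/L
    high = noel_upper_limit(c10)         # 5.0e-02 mg/L
    print(f"proposed range: {low:.1e} to {high:.1e} mg/L")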
The central argument in support of the proposed approach is that the
mode of action of mutagenic compounds that is consistent with a linear non-
threshold extrapolation model can be easily visualized as a quantal inter-
action of the carcinogen with DNA leading to a mutagenic event that is
-135-
-------
linked to a carcinogenic transformation. In the case of human bladder
cancer, recent evidence supports this mode of action. By contrast, no mode
of action has been postulated for nongenotoxic carcinogens that would lead
to single-hit non-threshold kinetics. The absence of even a speculative
mode of action that would be consistent with such a pattern weakens the
argument in support of the use of the linear non-threshold extrapolation
model for nonmutagenic carcinogens. It was apparent from the meeting at
Cincinnati that a number of the participants had the same sort of
reservations as had been expressed by other experts in oncology. But a vote
of the assembled group revealed that a majority supported the distinction
between mutagenic and nonmutagenic carcinogens for the purpose of quantita-
tive risk assessment.
DR. DEBDAS MUKERJEE: MECHANISMS OF CARCINOGENESIS IN RISK ASSESSMENT
Environmental factors, which include life style and the environmental
agents to which man is exposed, appear to be the major cause of most human
cancer. The mechanisms by which the normal cell is transformed by environ-
mental carcinogens to a malignant state are complex. Evidence from biochem-
ical molecular analysis, in vitro studies, animal bioassays and epidemio-
logic observations indicates that the problem of cancer lies in the regula-
tory dysfunction of normal gene action. This dysfunction of the genetic
material of the cell results in the production of progeny of abnormally
proliferating cancer cells. The neoplastic cells, in addition to acquiring
a multitude of abnormal characteristics, lack certain essential basic
properties of the normal cell. Since the reduction of exposure to
environmental carcinogens shows great promise of reducing cancer incidence,
it is essential to utilize a scientifically valid approach for regulating
carcinogens. Consequently, the major concern for public health lies in whether
-136-
-------
the current knowledge of the mechanism of carcinogenesis can be utilized to
determine cancer risk from exposure to environmental carcinogens.
One of the initial tasks for this mission is to detect the carcinogenic
potential of environmental agents in order to determine their cancer
risk. Human epidemiologic observations have identified many of these envi-
ronmental carcinogens. Animal bioassays have also aided in determining the
carcinogenicity of chemicals. There are nearly two dozen human carcinogens
which have been found also to be carcinogenic in animals. These assays have
identified the carcinogenicity of many chemicals. In addition, chemicals
once suspected of having carcinogenic potential because of structural
similarities have been found to be noncarcinogenic in animal bioassays. To
avert the massive expense, time and labor, other characteristics of carcino-
genic agents are being studied for identification. Results from short-term
tests, including microbial mutagenesis assays, sister-chromatid exchanges,
etc., have been utilized for selecting chemicals for long-term animal bio-
assays. Since carcinogens are supposed to affect the genetic material and
carcinogenesis is thought to be a process of mutation of the somatic cell,
various short-term tests which can detect the mutagenic characteristics of a
chemical have been utilized. The carcinogens which express mutagenicity in
short-term biological assays are thought to follow a genetic pathway in the
carcinogenesis process (Ames et al., 1975). However, a small group of
carcinogens consistently give negative responses in mutagenesis assays. This
has generated considerable confusion in the scientific community. It has
even been suggested that the carcinogens which consistently fail to give a
positive response indicating mutagenic characteristics in short-term
biological mutagenesis assays follow a nongenetic pathway to transform the
normal cell to a neoplastic state (Weisburger and Williams, 1981).
-137-
-------
Conclusive evidence that carcinogens affect DNA may be derived from
biochemical analysis of the DNA of the transformed cell (Lutz, 1979). Many
carcinogens which respond negatively in short-term mutagenesis assays have
been found to react with DNA. Such reactions can only be observed using the
sensitive biochemical analyses of molecular biology.
That the basis of carcinogenesis lies in the alteration of the DNA of the
cell can also be supported by observations on carcinogen-induced misrepair
or incomplete repair resulting in deletion, addition or translocation of the
bases of the DNA molecule. The damage to the DNA strands by the carcinogen
is very often repaired by enzymatic repair mechanisms to preserve the normal
entity of the cell. When the enzymatic repair is also inhibited, the cell
acquires the neoplastic state (Setlow, 1978; Cleaver and Bootsma, 1975).
The active form of the carcinogen, because of its electrophilic nature,
binds covalently with DNA, forming adducts (Miller, 1970). Carcinogens can
form adducts with all the base residues in DNA. However, the N-7 position
of guanine, because of its location outside the helical axis (especially in
Z-form DNA) (Rajalakshmi et al., 1982), becomes easily available for reaction
with the active form of the carcinogen. A covalently bonded adduct between
the electrophilic active form of the carcinogen and the electron-rich DNA is
an indicator of the mutagenic character of the carcinogenesis. The ability
of the carcinogen to form an adduct with DNA is also an indicator of its
carcinogenic activity. More than 80 carcinogens have been studied for
their effectiveness in forming DNA adducts. Many of these carcinogens are
barely detectable or even negative in the Ames assay (Rajalakshmi et al., 1982).
Furthermore, a measure of DNA adduct formation with the carcinogen is
also a measure of the quantity of the carcinogen involved in transforming the
normal cell to the neoplastic state.
-138-
-------
DNA adduct formation seems to be the most conclusive criterion for
determining the actions of the carcinogen at the genetic level (Perera and
Weinstein, 1982). Extreme cytotoxicity, the extremely unstable nature of the
reactive metabolite of the carcinogen, and the absence of optimum enzymatic
activation of the carcinogen to an active form are thought to be a few of
the many reasons for the lack of expression of mutagenic activity in
short-term biological assays. But for conclusive evidence it is essential
to determine whether the chemical in question binds covalently with the
genetic material and/or mutates any gene of the cell. Covalent binding
of the carcinogen with the DNA is a reflection not only of the degree of
adduct formation, but also of the efficiency of specific DNA excision
mechanisms.
Genetic and nongenetic mechanisms of carcinogenesis do not operate in
a totally autonomous manner, and simplistic separate theories can now be
resolved by a unified concept that seems to fit all the evidence and define
most of the aspects of carcinogenesis. A unified theory of carcinogenesis
has recently been proposed (Ts'o, 1981). According to this concept the
genetic material contains both DNA and the regulatory machinery which
controls the expression and replication of DNA. Carcinogens affecting the
regulators, usually thought of as nonmutagenic in biological short-term
assays, can also affect the DNA and, therefore, may be characterized as
mutagenic in nature. On the other hand, an effect on DNA can be expressed
through interaction with the regulator. Consequently, some genetic
phenomena can apparently look like nonmutagenic events. Such dynamic
interaction of the entire genetic material and the regulator, influenced by
carcinogens irrespective of whether they are positive in biological muta-
genesis assays or not, results in the heritable changes encountered in
neoplastic cells.
-139-
-------
The differentiation process for cancer development, first described as
one of the primary nongenetic mechanisms of cancer development (Pitot and
Heidelberger, 1963), now appears to be consistent with a genetic mechanism.
Some hormones generally considered to be nonmutagenic have been found to
act through a genetic mechanism. In utero exposure to diethylstilbestrol
may result in vaginal cancer in the daughter 15 or more years later. This
potent transplacental carcinogenic hormone and some of its metabolites have
now been shown to bind covalently to DNA both in vivo (Lutz, 1979) and in
vitro (Metzler, 1981). Furthermore, this hormonal carcinogen seems to
induce sister-chromatid exchanges in lymphocytes from pregnant and premeno-
pausal women (Hill and Wolff, 1982). Asbestos fibers, though giving nega-
tive responses in microbial mutagenic test systems, have also been suggested
to interfere with the DNA molecule. Synergistic effects on DNA binding
for chrysotile and benzo(a)pyrene have been observed in human fibro-
blasts in culture when asbestos is added 24 hours prior to the hydrocarbon.
Such DNA binding has been suggested to be due to interaction of the charged
asbestos fibers with the DNA, thereby altering its electronic structure.
This results in an increase in the number of sites available for binding of
the hydrocarbon to the DNA (Hart et al., 1980). In addition, asbestos is
also known to induce sister-chromatid exchanges (Rom et al., 1983). From
these observations it can be inferred that asbestos, a carcinogen negative
in most of the mutagenesis assays, might also act through a genetic
pathway. Carcinogenic metallic salts which are negative in mutagenesis
assays can induce infidelity of DNA polymerase resulting in synthesis of
aberrant DNA (Sirover and Loeb, 1976), DNA strand breaks (Robison and Costa,
1982) and formation of protein-metal-DNA complexes (Lee et al., 1982) in
mammalian cells. Carbon tetrachloride and chloroform are potent animal
-140-
-------
hepatocarcinogens (IARC, 1979; NCI, 1976). Since these have consistently
been found to respond negatively in microbial mutagenesis assays, it has
been suggested, especially in the case of chloroform, that perhaps these
carcinogens form tumors strictly via a nongenetic pathway. Phosgene,
a highly reactive chemical, seems to be an important metabolite of carbon
tetrachloride and chloroform. However, the metabolites of these hepato-
carcinogens have been found to bind covalently with nuclear DNA (Diaz Gomez
and Castro, 1980). Conversely, certain carcinogens like nitrosamines and
3-methylcholanthrene, which consistently give positive responses in biologi-
cal mutagenesis assays, also apparently seem to follow nongenetic pathways
(DiMayorca et al., 1973; Prehn, 1964). These observations clearly indicate
that covalent binding studies of the active form of the carcinogens conclu-
sively demonstrate that carcinogens affect the genetic material to initiate
a cell to the neoplastic state.
The multistage model for quantitative risk assessment from carcinogens
assumes that the tumor develops from a single cell only after it undergoes a
number of changes resulting in the dysfunction of the genetic material of
the cell (Armitage and Doll, 1961). Utilization of this model for estimat-
ing risk from carcinogenic pollutants can be justified by the mechanism of
carcinogenesis as proposed in the unified theory. One of the unique
features of this model is that it gains its credence from epidemiologic data,
animal studies and even in vitro studies of neoplastic transformation of
cells. The most ideal risk assessment for carcinogens requires that human
epidemiologic data and sufficiently valid exposure information be available
for the compounds in question. In the absence of such observations, the
data are analyzed by an alternate procedure to give an estimate of the
linear dependence of cancer rates based upon the calculated lifetime average
-141-
-------
dose. If the epidemiologic data show no carcinogenic effect when positive
animal evidence is available, it is assumed that a risk exists but is
smaller than could have been observed in the epidemiologic study. An upper
limit of the cancer incidence is then calculated, assuming that the true
incidence is just below the level of detection in the cohort studies. With
this approach, the response is measured in terms of the excess risk of the
exposed cohort of individuals compared to the control group. In analyzing
the data, it is assumed that the excess risk is proportional to the lifetime
average exposure and that it is the same for all ages.
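As a rough illustration of this upper-limit procedure, the sketch below
bounds the excess risk compatible with a negative cohort study by using an
exact Poisson upper confidence limit on the observed case count; the cohort
numbers and dose are hypothetical, and the choice of a Poisson bound is ours
rather than a method stated in the text:

    # Illustrative upper-limit calculation for a "negative" epidemiologic study.
    # All inputs are hypothetical; the Poisson bound is one possible
    # detection-limit criterion.
    from scipy.stats import chi2

    def poisson_upper_95(observed):
        """Exact one-sided 95% upper confidence limit on a Poisson mean."""
        return chi2.ppf(0.95, 2 * (observed + 1)) / 2.0

    observed_cases = 8        # cases seen in the exposed cohort
    expected_cases = 8.0      # cases expected from background rates
    cohort_size = 2000        # exposed individuals followed for a lifetime
    lifetime_avg_dose = 0.5   # mg/kg/day

    # Largest excess case count consistent with the null observation
    max_excess_cases = max(poisson_upper_95(observed_cases) - expected_cases, 0.0)
    max_excess_risk = max_excess_cases / cohort_size

    # Excess risk assumed proportional to lifetime average dose
    upper_bound_potency = max_excess_risk / lifetime_avg_dose
    print(f"upper-bound potency: {upper_bound_potency:.2e} per (mg/kg/day)")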
Both cancer epidemiologic data from occupational exposure to
carcinogens and cigarette smoking data are consistent with the multistage
model for carcinogenesis (Day and Brown, 1980).
In view of current knowledge of the mechanism of carcinogenesis and of
human cancer epidemiologic observations, the multistage model appears to be
the most appropriate model to use for the risk assessment of environmental
carcinogens.
-142-
-------
DISCUSSION
DR. MYRON MEHLMAN
Insufficient information was presented about assessing the biological
effects of chemical mixtures. Moreover, the discussion of risk assessment
and levels of exposure was based on too many assumptions. This entire area
is in need of reexamination.
DR. ROBERT NEAL
An important concept in calculating cancer risk at low levels of expo-
sure in man is the dose of the active compound or metabolite at the target
site. The mathematical models currently used assume, in extrapolating from
high-dose effects in experimental animals to potential low-dose effects in
man, that the concentration of the active compound or metabolite at the
target site changes linearly with exposure concentration. This is not
necessarily true. In fact, there is a substantial body of data which
suggests that the dose of the active compound at the target site will not
decrease in a linear fashion with decreasing exposure concentrations. Thus
more work needs to be done to determine the relationship between exposure
dose and dose of the active compound at the target site. When we have
knowledge of parameters such as this, we will be in the best position to
modify our mathematical models to more accurately estimate potential cancer
risk in man from low-dose exposure using data generated in experimental
animals exposed to high levels of these compounds.
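One simple way to picture this non-linearity, offered here as our own
illustration rather than Dr. Neal's data, is saturable (Michaelis-Menten)
formation of the active metabolite:

    \frac{dM}{dt} = \frac{V_{\max}\, C}{K_m + C}
    % For exposure concentrations C << K_m the rate is about (V_max / K_m) C,
    % i.e., linear in exposure; as C approaches or exceeds K_m the rate tends
    % to V_max, so the target-site dose of the metabolite at high experimental
    % exposures is less than proportional to exposure, and a simple linear
    % scale-down from high-dose data misstates the low-dose internal dose.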
DR. WILLIAM NICHOLSON
I do not support any use of a safety factor approach for risk extrapola-
tion of carcinogens, either for lifetime or short-term exposures. Although
such extrapolations may give results similar to the standard carcinogen risk
procedures in some cases, it is not at all convincing that this would be the
-143-
-------
general case. We have reasonable models for extrapolating carcinogenic risk
to lower doses, both for chronic and short-term exposures. Their use should
be continued.
The Use of a Threshold Model for Epigenetic (Nongenotoxic) Carcinogens
I strongly oppose the proposal discussed by Dr. Albert to utilize
different models for genotoxic (as manifested through in vitro mutagenic
test systems) and epigenetic carcinogens. My objections are as follows:
1. There are no data to indicate that thresholds generally exist for
epigenetic carcinogens. In particular, one of the most studied of such
materials, asbestos, demonstrates strong linearity of response over the
entire range of available data. While it may be possible to construe
certain situations in which metabolic deactivation, etc., can apply, one
has no justification for assuming a threshold without complete
knowledge of how such processes operate in all potentially exposed
individuals.
2. The use of mutagenic test systems to determine what is or is not a
genotoxic carcinogen has limitations. Since many genotoxic carcinogens
require activation, the choice of a proper test system is limited and
accurate identification of these substances is difficult. A negative
test may simply be an inadequate test.
3. Even if (as discussed by many authors) a threshold did exist for a
certain carcinogen in the absence of exposure to any other similarly
acting materials, other carcinogenic agents in the environment would
influence health effects in humans. One is not adding an agent to an
unexposed individual but to an individual already assaulted by many
other initiators and promoters. In fact, in such circumstances a linear
dose-response function, as determined by the extrapolation of the dose-
-144-
-------
response relationship to zero dose, may not be conservative. Consider
the dose-response relationship illustrated in Figure 24. Extrapolation
from point A to the origin would produce a slope one-third less than the
slope of a slight change along the curve at point B.
I would suggest that our information on the appropriateness of a linear
dose-response relationship for genotoxic compounds is no better than that
for the epigenetic ones. Until we have explicit information that would
distinguish effects for each of these compounds, Dr. Albert's discussion on
how to treat genotoxic agents would appear to be appropriate for all agents
determined to be carcinogenic.
DR. REVA RUBENSTEIN
The discussion concerning promotion versus initiation highlighted still
another difficulty. There was a definite sense of urgency about changing
the risk assessment procedure; however, sufficient data have not yet been
developed. If we were better able to define epigenetic, we might be capable
of investigating it. I heartily support such research, but I am adamantly
opposed to changing the carcinogenic risk assessment procedures until the
fruits of the research can be examined.
MR. WILLIAM GULLEDGE
Distinction Between Genotoxic and Epigenetic Carcinogens
Discussion by Dr. Albert on this subject seemed to indicate that the
concept has received little consideration by the Carcinogen Assessment
Group. This is unfortunate, as the focus of the two previous meetings and
the informal poll taken by Dr. Albert at this meeting (~70% favor the dis-
tinction) clearly indicate acceptance of, and technical validity for, a
threshold mechanism. Support should be given for proposals already developed
for separate risk assessment approaches for genotoxic and nongenotoxic car-
cinogens.
-145-
-------
FIGURE 24
Hypothetical Dose-Response Relationship for Exposures to Multiple Agents.
(Axes: response vs. equivalent dose; the point at which the general
population sits in terms of risk is marked on the curve.)
-146-
-------
For epigenetic mechanisms, a mathematical model can be used to predict a
10^-2 risk in humans from experimental animal data. Additional safety
factors (taking into account negative data) can be applied as needed to
account for human equivalent dose and various uncertainty factors associated
with the adequacy of the data and with extrapolating from average to sensitive
individuals. The total risk factor would be approximately 10^-5, but would
vary for each specific case. Policy, economic and other considerations
should also be included in the overall decision, and this will be discussed
separately.
DR. ROLF HARTUNG
The basic concepts of genotoxic vs. epigenetic causation of cancers
still require clarification. Both processes produce somatic mutations which
eventually express themselves as malignant tumors. Once a tumor has been
produced, there may be no way of determining the mechanism by which it
originated. The differences in the genotoxic and epigenetic causations of
cancers are purely in their postulated mechanisms of origin. Genotoxic
mechanisms are those which cause a direct alteration of the cellular DNA, by
mechanisms such as direct alkylation or intercalation between adjacent
nucleotides. These interactions are compatible with single-hit mechanisms,
which are then modified by the normal states in tumor growth and/or suppres-
sion toward the final expression of a noticeable tumor by quantitative
processes which can be expressed in terms of multistage dynamics.
In the case of epigenetic causations, a number of other phenomena
predominate. The chemical cannot directly interact with the DNA, but causes
alterations in the cellular DNA only after a series of intervening steps.
One scenario that has been suggested is that the chemical or its metabolite
causes severe cellular damage and necrosis, engendering extensive and
-147-
-------
repetitive repair over the lifetime duration of the exposure. This process
forces an excessive number of cellular replications, many more than would be
expected to occur during the course of a normal lifespan. Such forced
excessive cellular replication should result in a proportionately (or
greater than proportionally) increased number of transcription errors, some of
which could have a carcinogenic outcome. In another scenario, a generalized
enzyme inhibitor (e.g., a sulfhydryl reagent) would inhibit many enzymes
throughout the body, including some of the enzymes involved in the synthesis
and transcription of nucleotides to functional DNA. When some of the
required enzymes are partially inhibited, it is very conceivable that the
transcription of the DNA during cellular replication would become more
error prone. Such a hypothesis may explain, in part, some of the events
leading to heavy metal carcinogenesis. It should be noted that all the
processes that I have invoked for the production of epigenetic effects
require mass action, and some require enzyme kinetics or the overwhelming of
normal functional reserve or repair mechanisms. Such phenomena are, at a
minimum, more compatible with non-linear dynamics, and probably are also
more compatible with threshold dynamics (it takes conceptually more than one
molecule to do the dirty deed).
When viewed in this manner, it becomes apparent that one cannot always
distinguish between genotoxic and epigenetic carcinogens on the basis of
mutagenicity, because an agent may be mutagenic simply because it was
able to inhibit many of the enzymes required for cellular replication.
I would suggest that the evaluation of the probable genotoxic or
epigenetic causation of the carcinogenicity of a chemical be based not on a
check-list of critical tests, but on a coherent evaluation which seeks to
fit a plausible hypothesis to the compendium of experimental results in a
-148-
-------
way that makes sense. Since it is possible that some agents may act through
both genotoxic and epigenetic mechanisms, one must take great care in the
interpretation of the experimental results. Such interpretations cannot be
done by rote, but to avoid them may result in severe and unnecessary harm to
the economic fabric of the nation.
Let us spend more time on the basic issue of what is the meaning of
epigenetic vs. genotoxic carcinogenicity in a regulatory context.
DR. RICHARD KOCIBA
There was strong support among the participants for having EPA and CAG
reevaluate their present method of categorically doing mathematical risk
assessment on all carcinogens by using the linearized multistage model to
calculate an upper bound of carcinogenic potency. There is little justifi-
able scientific basis for continuing to use this previous approach to deal
categorically with all carcinogens.
Extensive research has led to significant advances in the understanding
of the various mechanisms by which a carcinogenic response can be produced
in laboratory animals. This research has also generated a more appropriate
data base for extrapolation of results of cancer studies in laboratory
animals to man. Recent reviews by Weisburger and Williams (1980, 1981) and
Stott et al. (1981) have summarized some of the significant new developments
resulting from this research.
It has become increasingly evident that not all chemical carcinogens
act via the same mechanism. Based on the extent of a chemical's interaction
with DNA, it appears that chemicals that have a greater propensity to
interact directly with DNA are appropriately classified as genotoxic. Those
that do not have this propensity to interact directly with DNA, but lead to
tumors via recurrent tissue injury or other secondary events, are classified
-149-
-------
as nongenetic or epigenetic carcinogens. The carcinogenic risk to man posed
by such epigenetic carcinogens appears to be substantially less than that
posed by purely genotoxic carcinogens.
Certain of the chemicals categorized as carcinogens have been exten-
sively studied with regard to their mechanisms of action and/or their com-
parative metabolism in man and animals. This is typified by the extensive
data published by Schumann et al. (1980) indicating a nongenetic mechanism
of carcinogenesis for tetrachloroethylene in the mouse. The work of Reitz
et al. (1980) described a nongenetic mechanism of carcinogenesis for chloro-
form in the mouse and also concluded that man would be much less sensitive
than the mouse or rat to chloroform. These advances in the understanding of
the mechanisms of animal carcinogenesis and the development of a more appro-
priate basis for extrapolation to man have led to a critical reevaluation of
the previous assumptions that have been used as the basis for regulatory
action on materials categorized as carcinogens. Recent and significant
developments in this respect include the following:
1. Dr. A. Kolbye, who has served as the Associate Bureau Director of
Toxicology for the U.S. Food and Drug Administration (FDA), 1981, has
repeatedly stressed the need to go beyond the previous simplistic
categorization of chemicals into groups that do or do not cause cancer.
Dr. Kolbye recommends that carcinogens be considered on the basis of
the mechanism of action whereby they act (a) as a complete carcinogen
via initiation by interaction with DNA, or (b) via secondary effects
mediated by recurrent tissue injury, decreasing biological resistance,
etc. Dr. Kolbye stated that the latter type "should not be called
carcinogens", even though under some circumstances and at some doses
they can influence cancer induction.
-150-
-------
2. The Administrative Conference of the United States recently published a
notice on carcinogen regulation in the Federal Register (47 FR 11024,
March 15, 1982). This notice pertains to recommendations addressed pri-
marily to U.S. Federal agencies that regulate carcinogens. Recommenda-
tions for the future include the use of peer review and advisory panels
to address specific chemicals on a case-by-case basis in which all the
relevant data and expertise can be used in the evaluation of possible
carcinogenic risk to man.
3. Dr. R. Squire, who has served as the Director of the U.S. National
Cancer Institute's program for the testing in animals of chemicals for
carcinogenic potential, has recently published (Squire, 1981) a series of
recommendations on the appropriate interpretation and extrapolation of
animal cancer studies to man. Dr. Squire noted that the nature and
extent of data indicating carcinogenic effects in laboratory animals
vary widely, yet present regulatory policy does not permit adequate
discrimination among the many animal carcinogens. He pointed out the
need for a case-by-case consideration that would allow a ranking of
animal carcinogens based on the number of animal species affected, the
types of different tumors induced, the spontaneous incidence rate of the
tumors induced, the dose-response relationship, and the mechanism
(genotoxic vs. epigenetic) by which the carcinogenic response occurred.
4. The Technical Committee of the Society of Toxicology (1981) recently
examined the issue of regulation of potential carcinogens and stated
that the assessment of human risk of carcinogenesis approaches credibil-
ity only when the material is examined under conditions "that reasonably
approach human use and metabolic handling." The Technical Committee
-151-
-------
also recommended a return to the use of the pragmatic approach to safety
assessment that has historically proved so useful in the extrapolation
of other animal toxicity data to humans.
5. The Council on Scientific Affairs of the American Medical Association
(AMA, 1981) has recently criticized the basis of the past federal
carcinogen regulation initiative. The Council stated that the previous
underlying premises need to be revised or refuted, especially in view of
"evidence both from experimental animals and man that there is a thresh-
old for many carcinogens; thus, the concept of the threshold level
beneath which exposure is harmless cannot be rejected."
6. On February 17, 1982, the Environmental Criteria and Assessment Office
of the U.S. EPA held a "Workshop on Estimating Ambient Water Quality
Criteria for Epigenetic Carcinogens". The objective of this workshop
was spelled out in the workshop report (U.S. EPA, 1982) as follows:
    Prior to 1981, it was commonly held that somatic gene mutation
    was the principal mechanism by which compounds exerted carci-
    nogenic effects. Somatic cell gene mutation is thought to
    show a one-to-one correspondence between dose and effect and
    is held to have no threshold. Regulatory agencies, espousing
    this belief, adopted a very protective stance toward health
    and therefore decreed that all carcinogens should be treated
    as genotoxins, as though their dose-response relationships
    were linear and without threshold. On this basis, methods
    were evolved for determining ambient water quality criteria.
    In the case of carcinogens, models used to extrapolate from
    the high doses used in animal experiments to the low doses
    thought to be encountered by humans incorporated this linear
    non-threshold concept. Criteria derived using such models in
    some cases were below ambient atmospheric levels, thus result-
    ing in considerable concern, not only because they seemed
    unreasonable, but because the validity of the models in other
    situations might be questioned. More recently, other mecha-
    nisms of cancer have been suggested. These mechanisms are
    thought not to involve direct action of the chemical with the
    DNA and have thus been termed epigenetic. Epigenetic mecha-
    nisms, such as inhibition of DNA synthesis or repair enzymes,
    are thought to have threshold dose-response relationships.
    Therefore, it becomes necessary to develop new methods for
    estimating ambient water quality criteria for carcinogens with
    epigenetic (threshold) mechanisms.
-152-
-------
As indicated by the objectives of that workshop, this is a significant
development that must be pursued by EPA in order to utilize the newer
data becoming available on this issue.
7. The regulatory agencies in the Netherlands (Kroes, 1982) have already
incorporated changes in their carcinogen policy that distinguish between
genotoxic and nongenotoxic (epigenetic) mechanisms of action. For geno-
toxic carcinogens, the more stringent nonthreshold (one-hit) model is
used, but for nongenotoxic carcinogens, a less stringent extrapolation
to man utilizes the application of a safety factor to the animal data.
As indicated by the numerous recent developments within regulatory and
academic circles above, it is obvious that the significant new scientific
information on carcinogenic mechanisms and extrapolation should lead to a
broad-scale reassessment of the methodology used in the establishment of
appropriate limits of human exposure for chemicals identified as carcino-
gens. This is especially true for those carcinogens acting via a nongenetic
mechanism. This will undoubtedly lead to a more scientifically valid and
realistic differentiation between those chemicals that truly warrant a more
stringent degree of control of potential exposure and those chemicals
that do not require the same stringent control of potential exposure.
Another critical decision point in the risk assessment process is the
conversion of animal exposure data to human exposure data. In the past, it
has been assumed that man is more sensitive than laboratory animals on a
mg/kg basis because of his lower basic metabolic rate. This added conver-
sion factor may be appropriate in some instances, but is not appropriate if
the available data show that a metabolite or reactive intermediate is the
proximate carcinogen rather than the parent compound. In these cases the
-153-
-------
body burden or blood concentration of the proximate carcinogen should be
less for man than for laboratory animals because of his lower rate of
metabolic processes.
DR. HARRY SKALSKY
When Dr. Albert asked for a show of hands, over 75% of the scientists
present indicated that they supported the subdivision of carcinogens into
genotoxic and nongenotoxic categories. There was strong support among the
participants for having EPA and CAG reevaluate their present policy of
categorically doing mathematical risk assessment on all carcinogens using
the multistage model to calculate an upper bound of carcinogenic risk.
Human risk assessment is an uncertain science, and it may well remain a
matter of serious dispute for years. Nevertheless, guidelines are needed
for the regulatory decision-making process. Those guidelines must be flexi-
ble to accommodate improvements mandated by an ever-advancing health technol-
ogy. The Cancer Assessment Group's dogmatic use of a single model can in no
way be considered flexible.
I understand that the IARC method of classifying carcinogens is current-
ly being considered by EPA. It would appear that the adoption of the IARC
classification would entail the use of different safety assessment proce-
dures for each category. The "Carcinogen Policy" developed by the Committee
of the Health Council of the Netherlands (1980) for consideration by CAG
contains the flexibility that the CAG approach is missing. The Netherlands
Health Council (1) has established different classes of carcinogens based on
available scientific information (nongenotoxic, etc.); (2) has resisted an
a priori choice of a single extrapolation model, but will review a range of
models; and (3) suggests that a no-effect approach to safety assessment (NOEL-
safety factor) may be appropriate for IARC carcinogen categories II-IV.
-154-
-------
It would appear advantageous for CAG to develop a flexible policy that would
accommodate new scientific concepts of chemical carcinogenesis.
In re-evaluating their policy, the CAG and the EPA should also consider
a definition of what constitutes sufficient data for a risk assessment. As
a scientist, I cannot believe that a two-dose NCI bioassay provides suffi-
cient scientific evidence to predict the potential carcinogenic risk in man
at 10^-5, 10^-6 or 10^-7. It appears that experts have difficulty agree-
ing on what constitutes sufficient data for risk assessment when only human
data are involved. In discussing the recent IARC monograph on benzene (IARC,
1982), Dr. Tomatis (1982) states:
    On pages 395 and 396 (of the monograph), there is a complete and
    objective summary of the available evidence of risks derived from
    exposure to benzene. It is clearly indicated that at 100 ppm the
    estimated relative risk for leukemia is increased more than
    20-fold. Risks of this magnitude should attract attention to the
    possibility of significant risks at much lower levels. The IARC
    felt, however, that the data were insufficient to quantify
    precisely risks at lower levels.
There appears to be a disagreement between the IARC and the Cancer
Assessment Group, for CAG has produced risk estimates for benzene in air and
water and, evidently, believes the data are sufficient. In light of the
ED01 study and the differences in scientific opinions, the Cancer Assess-
ment Group might want to formally review "what constitutes sufficient data
for risk assessment." My ideas on risk assessment models, etc., are
included as the second half of this comment.
The workshop participants emphasized a need to re-evaluate the proce-
dures used to convert exposures in animals to exposures in man. This is
especially true of the parameters utilized with CAG's multistage model. In
the past, it has been assumed that man is more sensitive than laboratory
animals on a mg/kg basis because of his lower basic metabolic rate.
Certainly, there are exceptions to this, as there are to any other generality.
-155-
-------
The success of physiologically-based pharmacokinetic models in pharma-
cology indicates that they may become a useful tool for toxicologists.
Anderson (1981) has stated:
    Some of the predictions of the pharmacokinetic analyses are still
    tentative and require more definitive experimentation. Nonethe-
    less, studies on saturable clearance of a variety of chemicals are
    causing a healthy reevaluation of several basic tenets of toxicol-
    ogy. These fundamental aspects include proper indexing of dose-
    response curves and better definition of what constitutes a bio-
    logically significant dose. This research is providing a much more
    in-depth appreciation of the complex interactions between biochemi-
    cal and physiological variables which together control the rate of
    delivery and the rate of metabolism of ingested or inhaled chemi-
    cals in vivo.
The CAG should formally review these models to determine which internal
parameters might be significant to relate to observed dose-response curves
of carcinogens. Such a review would provide needed information as to what
specific data should be collected in future animal bioassays.
Quantitative Risk Assessments — A Choice of Models
Human risk assessment is an uncertain science, and it may well remain a
matter of serious dispute for years. Nevertheless, guidelines are needed
for the regulatory decision-making process, even in the face of scientific
uncertainties. Those guidelines must be flexible to accommodate improve-
ments mandated by an ever-advancing health technology.
The process of risk extrapolation can be viewed simply as a procedure for
predicting from known data what might be expected to occur in a region where
there are no data. In dealing with potential carcinogens, the evaluation of
risk is often complex. Complications arise due primarily to four factors.
First, the majority of cancer data is acquired through laboratory experi-
ments with rodents, not primates or human populations. Second, the amount
of material administered to experimentally induce cancer is usually high.
-156-
-------
Third, the incidence of cancer has to be high in order to discriminate it
from control populations. Fourth, present scientific data do not clearly
provide a definitive answer to the issue of a threshold for carcinogenesis.
Thus, it is uncertain how to precisely utilize the known data, and the mathe-
matical direction of the extrapolation line becomes a matter of belief
rather than scientific fact. Many theoretical dose-response models for the
prediction of risk of carcinogenesis have been proposed (Mantel and Bryan,
1961; Armitage and Doll, 1961; Druckrey, 1967; Hoel et al., 1975; Guess et
al., 1977; Cornfield, 1977; Albert and Altshuler, 1973; Chand and Hoel,
1974; Hartley and Sielken, 1977; Crump et al., 1977; Reitz et al., 1978;
Gehring and Blau, 1977; Mantel et al., 1975).
Van Ryzin (1980) recommends the use of a variety of models to get answers
in or near the experimental range. Munro and Krewski (1981) provide a more
complete analysis of risk assessment and regulatory decisions. They point
out that:
    "Because of the uncertainties involved in assessing risks of low
    levels of exposure, some regulatory authorities have advocated the
    use of conservative risk assessment procedures [Interagency Regula-
    tory Liaison Group (IRLG), 1979; U.S. EPA, 1980]. While linear
    extrapolation may be appropriate for potent electrophilic carcino-
    gens, the use of such conservative procedures for less potent
    substances, which may induce tumors through perturbation of normal
    physiology, may not be warranted. In the latter case, however, the
    most suitable model for extrapolation is not at all clear."
Munro and Krewski warn that the quantification of human risk on the
basis of the results of laboratory studies in animals should be approached
with great caution. They advise that animal studies serve primarily as a
qualitative surrogate for humans and that any attempts to quantify responses
beyond the realm of biological certainty are open to serious question.
-157-
-------
The wide variety of mathematical models available for risk assessment
and the divergent answers they generate are confusing, and the act of choosing
one model over another will always stir controversy. However, there have
been some unifying developments in this quandary which would be worthy of
consideration.
The ED01 study has demonstrated that all models tend to be equally
predictive down to a risk of 10^-2. Beyond this point, the models begin to
diverge drastically and it is clear that biological relevance is lost.
Since predictions beyond 10^-2 are scientifically suspect, perhaps a combi-
nation of a mathematical model and a safety factor approach should be consid-
ered rather than choosing a traditional risk model.
A model (or models) could be used to predict risk from a biological data
set (or sets) down to 10^-2; then a series of safety factors could be assigned
to allow for consideration of the total data base (particularly negative data)
and to clearly define areas where policy decisions have an overriding
influence. Such an approach would be advantageous because the issues of
science and policy would be clearly separated.
The mathematical model could be used to predict risk from biological data
to the 10^-2 limit. Then, a safety factor, or a combination of safety
factors, could be used to consider the total data base, including negative
data. This mathematical model-safety factor method would also distinguish the
scientific elements of setting permissible exposure levels from the elements
based upon policy decisions. An example of such a quantitative risk model
is described below:
Nongenotoxic carcinogens: These are threshold toxins and tradi-
tional approaches can be used.

    Acceptable Human Exposure (mg/day) = [NOAEL (mg/kg/day) x 70 kg] / 100
-158-
-------
NOAEL = no-observed-adverse-effect level in experimental animals
70 kg = assumed average human weight
100 = 100-fold uncertainty factor, which represents 10 for extrapolating
from the average animal to the average human and 10 for extrapolating
from the average to the sensitive human.
Genotoxic carcinogens: The model is devised to accommodate genotoxic
carcinogens and compounds whose genotoxicity has not been well defined by
experimental evidence.

    Acceptable Human Exposure (mg/day)
        = [Model (10^-2) x Human Equivalent Dose] / (10^1 x 10^2)

Approximate risk level at the model output: 10^-2.

Model: A mathematical model (probit, multihit, multistage, etc.) utilized
to predict risk down to 10^-2 from experimental animal data. (Used to
predict only what would be expected to occur in rats, mice, etc.) The use
of the 95% confidence interval of these models is equivalent to an
additional safety factor of 2- to 5-fold, i.e., for an experiment of 10
animals about 5, for 100 animals about 3, for 1000 animals about 1.8.
(Unless there is compelling evidence to the contrary, the 95% confidence
interval should not be used. Where necessary it should be replaced with
appropriate safety factors.)

Human Equivalent Dose: The model commonly used involves (length of
exposure / length of observation), (length of observation / lifespan of
animal)^3, and the cube root of (70 kg [average man] / average animal
weight). This is a conservative model and should be modified where
pharmacokinetic data are available. When no comparative metabolic data are
available, this model will yield a safety factor of ~5- to 6-fold in the
conversion of rat data to a human equivalent dose and ~13-fold for the
conversion of mouse data.
-159-
-------
10^1 = uncertainty factor for extrapolating from average to sensitive man.
10^2 = uncertainty factor associated with the data base. With good negative
data in other experimental animals, a factor of 1 might be used.
TOTAL RISK FACTOR: approximately 10^-5, with policy and other parameters
considered separately.
This example deals in a preliminary fashion with data from experimental
animals. If thoroughly developed, it would represent a way to separate and
define issues of policy and science, which cannot be accomplished with other
commonly used risk models. It would also provide the flexibility to accom-
modate the growth of scientific evidence and to utilize negative data.
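The following sketch strings the pieces of this illustrative example
together in code. The inputs are hypothetical, and the arrangement of the
duration and body-weight factors reflects our reading of the example (in
particular, the cube-root body-weight term is applied so that it acts as the
quoted ~5- to 6-fold factor for the rat and ~13-fold factor for the mouse):

    # Sketch of the illustrative two-track scheme above; all input values are
    # hypothetical.

    def nongenotoxic_acceptable_exposure(noael_mg_kg_day, human_weight_kg=70.0,
                                         uncertainty_factor=100.0):
        """Threshold (NOAEL / 100) approach for nongenotoxic carcinogens, in mg/day."""
        return noael_mg_kg_day * human_weight_kg / uncertainty_factor

    def human_equivalent_dose(animal_dose_mg_kg_day, animal_weight_kg,
                              exposure_over_observation=1.0,
                              observation_over_lifespan=1.0,
                              human_weight_kg=70.0):
        """Animal dose adjusted by the duration terms and cube-root body-weight
        scaling listed in the example above."""
        duration = exposure_over_observation * observation_over_lifespan ** 3
        species = (human_weight_kg / animal_weight_kg) ** (1.0 / 3.0)
        return animal_dose_mg_kg_day * duration / species

    def genotoxic_acceptable_exposure(dose_at_1e_2_risk_mg_kg_day, animal_weight_kg,
                                      sensitive_factor=10.0, database_factor=100.0,
                                      human_weight_kg=70.0):
        """Model dose at ~10^-2 risk, converted to a human equivalent dose and
        divided by the 10^1 and 10^2 uncertainty factors; result in mg/day."""
        hed = human_equivalent_dose(dose_at_1e_2_risk_mg_kg_day, animal_weight_kg,
                                    human_weight_kg=human_weight_kg)
        return hed * human_weight_kg / (sensitive_factor * database_factor)

    # Species conversion factors quoted in the example:
    print(round((70.0 / 0.35) ** (1.0 / 3.0), 1))  # ~5.8 for a 350 g rat
    print(round((70.0 / 0.03) ** (1.0 / 3.0), 1))  # ~13.3 for a 30 g mouse
    print(nongenotoxic_acceptable_exposure(1.0))   # 0.7 mg/day for a 1 mg/kg/day NOAEL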
DR. MARVIN SCHNEIDERMAN
Figure 25 shows some data relating to thresholds. The data come from
experiments in the 1950s by Walter Heston of NCI (Heston and Schneider-
man, 1953; Mantel et al., 1961). He was interested in the somatic mutation
theory of cancer and was concerned with whether he needed one or two mutations
to produce the lung tumors in Strain A mice. Heston postulated that a single
mutation would lead to a straight-line dose-response; two mutations would
lead to a quadratic dose-response curve. Figure 25 shows what he found in a
series of experiments. The first experiment produced a nice straight line
displaced to the right. This would be consistent with a one-mutation
process with a threshold. Because of previous work by Charles and Luce-
Clausen (1942), Dr. Heston questioned this apparent threshold for the tumor
system and the chemical (1,2,5,6-dibenzanthracene) he was working with. (He
did estimate the "threshold" dose, however.) So he conducted another series
of experiments, with doses down in the "threshold" range, doses lower
than those he had employed before. Result? As shown in Figure 25: a
-160-
-------
FIGURE 25
(Dose-response curves from Heston's Strain A mouse experiments; vertical
axis: response.)
-------
shallow dose-response curve in the low-dose region. Probably no threshold.
Possibly a two- or even three-stage process, with "background" dose contrib-
uting a small amount to the number of lung tumors found in these animals at
these low doses.
These observations bring us back to the concepts of dose additivity and
models of carcinogenesis. Crump et al. (1976) showed that, with the multi-
stage model, an assumption of dose additivity at low doses leads directly to
the linear, non-threshold model. This is true for every stage in the multi-
stage model of carcinogenesis. Thus linearity and the absence of a threshold
depend not on whether a material is genotoxic or not genotoxic, as some
research workers have suggested, but rather on whether there is anything in
the ambient environment that operates in the same manner in the carcinogenic
process as the toxic material in question. The major thing to notice is that
the later the stage at which the material operates, the more likely it is that
reduced (or eliminated) exposure to the material will produce an earlier
response, and with it, more of the appearance of a possible threshold. The
take-away lesson for both research and regulatory priorities would seem to
be: find late-stage carcinogens and reduce exposures to them drastically.
This should produce an early reduction in cancers. For long-term reduction in
cancer, restricting exposure to early-stage (and "complete") carcinogens
should produce the most effect.
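The additivity argument can be written in one line; a minimal sketch in our
own notation:

    % If an added environmental dose d acts through the same mechanism as an
    % effective background dose \delta, then P(d) = F(\delta + d) and, for small d,
    P(d) - P(0) = F(\delta + d) - F(\delta) \approx F'(\delta)\, d ,
    % which is linear in d with no threshold whenever F'(\delta) > 0, whatever
    % the shape of F (in particular, for a chemical acting at any single stage
    % of the multistage model).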
DR. IAN NISBET
With one exception (see the final paragraph below), I am strongly opposed to
proposals to use separate methods of risk assessment for "genotoxic" and
"nongenotoxic" carcinogens, even if these can be operationally defined. The
most important reason for assuming that dose-response relationships are
linear and non-threshold at low doses is the principle of dose additivity.
-162-
-------
The proof that dose additivity leads to linear nonthreshold dose-response
relationships is a very general one (Crump et al., 1976) and applies in all
cases except where the underlying dose-response relationship is of a strict
threshold type and the background incidence is exactly zero. In accordance
with this, the multistage model predicts linear dose-response relationships
for chemicals acting at all stages. In my opinion, the burden of proof
should be on those who believe that late-stage carcinogens have thresholds
to prove that their effects are not additive to background. It should be
noted that the human population has a background incidence of all the
phenomena that are hypothesized to be modes of action of "nongenotoxic"
carcinogens, including metabolic overload and gross tissue damage. This is
perhaps a bias that comes with middle age, but I think that late-stage
carcinogens are of greater concern than early-stage carcinogens.
One real difference between early- and late-stage carcinogens is in the
rate of decline of excess risks after cessation of exposure (Day and Brown,
1980). I think that risk assessments should take into account information
on the stage of action if the exposure assessment indicates that exposure
will cease substantially before the end of life.
DR. JAMES WITHEY
We are still "up in the air" with respect to the question of what to do
with mutagens. My own feeling is that the air has become cloudy as a conse-
quence of the increasing complexity of these tests and the arguments about
genotoxics, promoters, initiators, etc. Right now I think that mutagenicity
might be used as an indicator of potential carcinogenicity and to support
carcinogenicity data.
-163-
-------
REFERENCES
Albert, R.E. and B. Altshuler. 1973. Considerations relating to the formu-
lation of limits for unavoidable population exposures to environmental
carcinogens. AEC Symposium Series. Conference 72050. NTIS. p. 233-235.
AMA (American Medical Association). 1981. Carcinogen regulations. Report
of the Council on Scientific Affairs. J. Am. Med. Assoc. 246: 253.
Ames, B.N., J. McCann and E. Yamasaki. 1975. Methods for detecting
carcinogens and mutagens with the Salmonella/mammalian microsome mutagenicity
test. Mutat. Res. 31: 347.
Anderson, M.E. 1981. Saturable metabolism and its relationship to toxic-
ity. CRC Crit. Rev. Toxicol. p. 105-149, May.
Armitage, P. and R. Doll. 1961. Stochastic models for carcinogenesis. In:
Proc. of the 4th Berkeley Symposium on Mathematical Statistics and Probabil-
ity: Biology and Problems of Health. 4: 19.
Chand, N. and D.G. Hoel. 1974. A comparison of models for determining
"safe" levels of environmental agents. In: Reliability and Biometry. Soc.
Ind. Appl. Math. p. 382-401.
Charles, D.R. and E.M. Luce-Clausen. 1942. Cancer Res. 2: 261.
-164-
-------
Cleaver, J.E. and D. Bootsma. 1975. Xeroderma pigmentosum: Biochemical and
genetic characteristics. Ann. Rev. Genet. 9: 19.
Committee of the Health Council of the Netherlands. 1980. The Evaluation
of the Carcinogenicity of Chemical Substances. Government Printing Office,
The Hague.
Cornfield, J. 1977. Carcinogenic risk assessment. Science. 198: 693-699.
Crump, K.S., D.G. Hoel, C. Langley and R. Peto. 1976. Fundamental carcino-
genic processes and their implications for low-dose risk assessment. Cancer
Res. 36: 2973-2979.
Crump, K.S., H. Guess and K. Deal. 1977. Confidence intervals and tests of
hypotheses concerning dose-response relations inferred from animal carcino-
genicity data. Biometrics. 33: 437-451.
Day, N.E. and C.C. Brown. 1980. Multistage models and primary prevention
of cancer. J. Natl. Cancer Inst. 64: 977-989.
Diaz Gomez, M.I. and J.A. Castro. 1980. Covalent binding of carbon tetra-
chloride metabolites to liver nuclear DNA, proteins and lipids. Toxicol.
Appl. Pharmacol. 56: 199.
DiMayorca, G., M. Greenblatt, T. Trauthen, A. Soller and R. Giordano. 1973.
Malignant transformation of BHK21 clone 13 cells in vitro by nitrosamines --
a conditional state. Proc. Natl. Acad. Sci. USA. 70: 46.
-165-
-------
Druckrey, H. 1967. Quantitative aspects of chemical carcinogenesis. In: Potential Carcinogenic Hazards from Drugs (Evaluation of Risk). U.I.C.C. Monograph Series, Volume 7. Springer-Verlag. p. 60.

Gehring, P.J. and G.E. Blau. 1977. Mechanisms of carcinogenesis: Dose response. J. Environ. Pathol. Toxicol. 1: 163-179.

Guess, H., R. Peto and K.S. Crump. 1977. Uncertainty estimates for low dose rate extrapolation of animal carcinogenicity data. Cancer Res. 37: 3475-3483.

Hart, R.W., F.B. Daniel, O.R. Kindig, C.A. Beach, L.B. Joseph and R.C. Wells. 1980. Elemental modifications and polycyclic aromatic hydrocarbon metabolism in human fibroblasts. Environ. Health Perspect. 34: 59.

Hartley, H.O. and R.L. Sielken. 1977. Estimation of safe doses in carcinogenic experiments. Biometrics. 33: 1-30.

Heston, W.E. and M.A. Schneiderman. 1953. Analysis of dose-response in relation to mechanisms of pulmonary tumor induction in mice. Science. 117: 109-111.

Hill, A. and S. Wolff. 1982. Increased induction of sister chromatid exchange by diethylstilbestrol in lymphocytes from pregnant and premenopausal women. Cancer Res. 42: 893.
-166-
-------
Hoel, D.G., et al. 1975. Estimation of risk of irreversible, delayed toxicity. J. Toxicol. Environ. Health. 1: 133-151.

IARC (International Agency for Research on Cancer). 1979. Monograph on the Evaluation of Carcinogenic Risk of Chemicals to Man. 20: 371.

IARC (International Agency for Research on Cancer). 1982. IARC Monographs on the Evaluation of the Carcinogenic Risk of Chemicals to Humans. Volume 29. Some Industrial Chemicals and Dyestuffs. IARC, WHO Publications Centre, Albany, NY.

IRLG (Interagency Regulatory Liaison Group). 1979. Scientific basis for identification of potential carcinogens and estimation of risk. J. Natl. Cancer Inst. 63: 241.

Kroes, R. 1982. Presentation on Carcinogen Policy in the Netherlands. Toxicology Forum, February 15.

Lee, J.E., R.B. Ciccarelli and K.W. Jennette. 1982. Solubilization of the carcinogen nickel subsulfide and its interaction with deoxyribonucleic acid and protein. Biochem. 21: 771.

Lutz, W.K. 1979. In vivo covalent binding of chemicals to DNA as a quantitative indicator in the process of chemical carcinogenesis. Mutat. Res. 65: 289.
-167-
-------
Mantel, N. and W.R. Bryan. 1961. Safety testing of carcinogenic agents. J. Natl. Cancer Inst. 27: 455-470.

Mantel, N., W.E. Heston and J.M. Gurian. 1961. Thresholds in linear dose-response models for carcinogenesis. J. Natl. Cancer Inst. 27(1): 203-215.

Mantel, N., N. Bohidar, C. Brown, J. Ciminera and J. Tukey. 1975. An improved "Mantel-Bryan" procedure for safety testing of carcinogens. Cancer Res. 35: 865-872.

Metzler, M. 1981. Studies on the mechanism of carcinogenicity of diethylstilbestrol: Role of metabolic activation. Food Cosmet. Toxicol. 19: 611.

Miller, J.A. 1970. Carcinogenesis by chemicals: An overview -- G.H.A. Clowes Memorial Lecture. Cancer Res. 30: 559.

Munro, I.C. and D.R. Krewski. 1981. Risk assessment and regulatory decision-making. Food Cosmet. Toxicol. 19: 549.

NCI (National Cancer Institute). 1976. Report on Carcinogenesis Bioassay of Chloroform. NTIS PB-264018, Springfield, VA.

Perera, F.P. and I.B. Weinstein. 1982. Molecular epidemiology and carcinogen-DNA adduct detection: New approaches to studies of human cancer causation. J. Chron. Dis. 35: 581.
-168-
-------
Pitot, H.C. and C. Heidelberger. 1963. Metabolic regulatory circuits and carcinogenesis. Cancer Res. 23: 1694.

Prehn, R.T. 1964. A clonal selection theory of chemical carcinogenesis. J. Natl. Cancer Inst. 32: 1.

Rajalakshmi, S., P.M. Rao and D.S.R. Sarma. 1982. Chemical carcinogenesis: Interactions of carcinogens with nucleic acids. In: Cancer -- A Comprehensive Treatise, 2nd ed., F.F. Becker, Ed. Plenum Press, NY. p. 335-409.

Reitz, R.H., P.J. Gehring and C.N. Park. 1978. Carcinogenic risk estimation for chloroform: An alternative to EPA's procedures. Food Cosmet. Toxicol. 16: 511-514.

Reitz, R., J. Quast, W. Stott, P. Watanabe and P. Gehring. 1980. Pharmacokinetics and macromolecular effects of chloroform in rats and mice: Implications for carcinogenic risk estimation. In: Water Chlorination: Environmental Impact and Health Effects, Vol. 3, R. Jolley, W. Brungs, R. Cumming and V. Jacobs, Ed. Ann Arbor Science Publishers, Inc., Ann Arbor, MI.

Robison, S.H. and M. Costa. 1982. The induction of DNA strand breakage by nickel compounds in cultured Chinese hamster ovary cells. Cancer Lett. 15: 35.

Rom, W.N., G.K. Livingston, K.R. Casey, et al. 1983. Sister chromatid exchange frequency in asbestos workers. J. Natl. Cancer Inst. 70: 45.
-169-
-------
Schumann, A., J. Quast and P. Watanabe. 1980. The pharmacokinetics and macromolecular interactions of perchloroethylene in mice and rats as related to oncogenicity. Toxicol. Appl. Pharmacol. 55: 207.

Setlow, R.B. 1978. Repair deficient human disorders and cancer. Nature. 271: 713.

Sirover, M.A. and L.A. Loeb. 1976. Infidelity of DNA synthesis in vitro: Screening for potential metal mutagens or carcinogens. Science. 194: 1434.

Society of Toxicology. 1981. Criteria for human risk assessment: With special emphasis on the regulation of potential carcinogens. Fundam. Appl. Toxicol. 1: 2.

Squire, R. 1981. Ranking animal carcinogens: A proposed regulatory approach. Science. 214: 877-880.

Stott, W., R. Reitz, A. Schumann and P. Watanabe. 1981. Genetic and nongenetic events in neoplasia. Food Cosmet. Toxicol. 19: 587.

Tomatis, L. 1982. Letter to the journal. Science. 218: 214.

Ts'o, P.O.P. 1981. Neoplastic transformation, somatic mutation, and differentiation. In: Carcinogenesis, Fundamental Mechanisms and Environmental Effects, B. Pullman, P.O.P. Ts'o and H. Gelboin, Ed. D. Reidel Publ. Co., Boston. p. 297-310.
-170-
-------
U.S. EPA. 1980. Water Quality Criteria Documents. Federal Register. 45: 79318.

U.S. EPA. 1982. Report on Workshop on Estimating Ambient Water Quality Criteria for Epigenetic Carcinogens. Washington, DC. February 17.

Van Ryzin, J. 1980. Quantitative risk assessment. J. Occup. Med. 22: 321.

Weisburger, J. and G. Williams. 1980. Chemical carcinogens. In: Toxicology: The Basic Science of Poisons, 2nd ed., J. Doull, C. Klaassen and M. Amdur, Ed. Macmillan Publishing Company, NY.

Weisburger, J. and G. Williams. 1981. Carcinogen testing: Current problems and new approaches. Science. 214: 401.
-171-
-------
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES
September 30, 1982
-172-
-------
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES
Outline of Issues and Review of Present Approaches
Presentation: Dr. Jerry F. Stara
ECAO, OHEA, U.S. EPA
Presentation: Dr. Richard Hertzberg
ECAO, OHEA, U.S. EPA
-173-
-------
PRESENTATIONS
DR. JERRY STARA: OUTLINE OF ISSUES
Simultaneous exposure to several chemicals is the predominant exposure pattern in our environment. The Agency recognizes the need for specific guidelines to assess the impact of such exposures on human health. However, it also recognizes that it may not be possible to develop a scientifically defensible methodology to resolve all the issues associated with this complex subject. One of the major purposes of the following series of presentations on "Health Assessment of Exposures to Chemical Mixtures" is to identify areas in which reasonable scientific judgments can be made and methodologic modifications can be proposed, as well as to identify those areas in which limitations in our understanding and knowledge preclude methodologic development.
DR. RICHARD HERTZBERG: REVIEW OF PRESENT APPROACHES
To date, about a half dozen Superfund-designated sites have been investigated. One or two marker chemicals have dominated the situation in some easily identified sites. For real-world situations, the American Conference of Governmental Industrial Hygienists (ACGIH) graded-response approach, based on additivity of equitoxic doses, is used:

$$I = \sum_{i=1}^{N} I_i$$

where $I_i$ is the exposure to the i-th component expressed as a fraction of its individual acceptable level; if the index I exceeds unity, there is cause for concern. If this approach is used, the implications of various dose-response curves must be considered, and an estimate of the chance of significant interactions (e.g., synergism) must be made.
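As a minimal numerical sketch of this additive index (the chemical names, intake estimates and acceptable levels below are hypothetical, not drawn from any site investigation):

    # Additive, ACGIH-style index for a mixture of similar-acting chemicals:
    # I = sum over components of (estimated exposure / acceptable level).
    # All values are hypothetical and for illustration only.
    exposures = {          # estimated intakes, mg/kg/day (hypothetical)
        "chemical_A": 0.004,
        "chemical_B": 0.010,
        "chemical_C": 0.0005,
    }
    acceptable = {         # acceptable intake for each component alone (hypothetical)
        "chemical_A": 0.02,
        "chemical_B": 0.03,
        "chemical_C": 0.001,
    }

    index = sum(exposures[c] / acceptable[c] for c in exposures)
    print(f"Additive index I = {index:.2f}")   # I > 1 would be cause for concern

For the numbers above the index is about 1.03, so this hypothetical mixture would just cross the level of concern even though each component is below its own acceptable level.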
-174-
-------
DISCUSSION
DR. MYRON MEHLMAN
Acute data, such as LD50s and LC50s, are not useful for extrapolating to the type of injury that may result from low-level chronic exposure. The acute data are useful for labeling substances for transportation and for developing exposure levels for repeat-dose studies. What is probably needed is a methodology that will yield data on chemical mixtures by inhalation and gavage which will reflect actual human exposure.
The approaches to this could be:

1. Gather information on:
   a. Exposure
      - levels
      - extent
      - frequency
   b. Chemical properties
      - group chemicals according to structure
      - select representative chemicals from structural classes (analogues)
   c. Reports in the literature
   d. Structure-activity relationships
2. Determine additional information needed
3. Set up testing programs to develop this information
   a. Nongenetic
      - reproductive (short-term tests; male, female)
      - teratologic (short-term tests)
      - subchronic (pathology, bioaccumulation)
      - pharmacologic (blood levels, target organs, metabolites, half-life)
   b. Genetic
      - in vitro (with individual components): Ames, mouse lymphoma, DNA repair, cytogenetics
      - in vivo: assay urine for water-soluble compounds, and feces for nonpolar compounds, from a 30-day study in the Ames and mouse lymphoma tests
4. Develop pharmacologic models
5. Using these data, develop a mathematical model to estimate risk
6. Perform a professional biologic assessment to set exposure levels
DR. MARVIN LEGATOR
Risk Estimates: The Non-Science
As to both systemic toxicity and carcinogenicity, I have become increasingly uncomfortable with the derivation of specific numbers based either on safety factors or mathematical models. The scientific basis for adding safety factors is totally lacking, and mathematical modeling does not help in resolving the multiplicity of uncertainty factors when extrapolating animal data to man. In either case, I am afraid we come up with an almost meaningless number. Although, as described in several papers at this meeting, all sorts of caveats are given by scientists when presenting a specific numerical risk figure, the final figure is usually labeled as a definitive indicator of human risk. It may be that for water quality we need to derive a numerical estimate to set upper limits for specific chemicals. In the case of hazardous waste sites, perhaps we should abandon quantitative risk estimates. An alternative to deriving a specific figure may include the following elements:
1. Collect exposure data to indicate the number and quantity of chemicals at a specific dump site (incorporate the specific factors presented by Dr. Nisbet).
2. Determine how many of the chemicals are teratogens, carcinogens, systemic toxicants, etc.
3. For each chemical in each toxic category, use a ranking procedure to establish categories of concern. Possible models for this categorization are to be found in the Food Safety Council discussion on mutagenicity (Food Safety Council, 1978) and the Squire scheme for carcinogens (Squire, 1981). Rather than a specific number, we will use the animal data to reflect relative potency. Three or five categories can be established for each toxic class.

This type of ranking avoids an artifactual specific number and more accurately reflects the limited precision with which data can be translated from one species to another. Taking into consideration the exposure data, the number of chemicals present and their specific potency, the hazard of a particular site can be determined. I would like to expand at a later date on this ranking approach for hazardous waste sites.
MR. WILLIAM GULLEDGE
Alternative Approaches in Risk Assessment
My comments are offered as an observation of human health-based and aquatic life-based water quality criteria development and possible application to hazardous waste disposal site modification. Human health-based criteria development uses compound-specific approaches which rely on quantitative chemical speciation and quantitative risk assessment models. Aquatic life-based criteria development is based on a generic toxicity approach that uses generic bioassays to assess acute and chronic effects in aquatic organisms. It appears that there may be some technical and cost advantages to a generic toxicity approach; however, its applicability to human health criteria may be limited. Generic toxicity may be an important tool for hazardous waste disposal site risk assessment. It could be a very cost-effective technique for preliminary screening of hazard and the degree of hazard.
DR. REVA RUBENSTEIN
It would have been more fruitful to examine how much the outcomes change when the underlying assumptions are changed. We frequently are told that TLVs are established for the "mythical" 70 kg, 21-year-old male worker. How different would the numbers be if they were established for a 55 kg, 21-year-old female worker, and so on?
DR. MAGNUS PISCATOR
There was no discussion at all of some other problems, such as how to conduct studies at a waste site. The exposed people are probably not interested in extrapolated data from animal experiments. They want to know if they are sick or if there is a risk of birth defects, etc. Thus, good methods for studying effects are needed. Is it possible to find good reference populations for an exposed group? This must be discussed at some time.

Reference was made to the ACGIH index for exposure to several chemicals, but it was pointed out that this assumes exposure to a group of chemicals with similar properties (e.g., some solvents).
DR. KURT ENSLEIN
Insofar as the estimation of health effects from multiple chemical exposures is concerned, I discussed this matter at length with my group here in Rochester, and we have come up with what may be an approach that could conceivably be useful. If you recall our structure-activity models, particularly the rat LD50 equation, you will remember that this equation consists of a large number of substructural fragments and molecular weight, each of these being assigned a weight. If one considers a mixture to be made up of nearly identical portions of separate chemicals, say two, then it is conceivable (and this was suggested by Dr. Clarkson) that one could simply add the appropriate substructural fragments to arrive at an estimate of the toxicity of the two chemicals in conjunction with one another, under the assumption that there will be neither synergism nor antagonism. As one now increases the number of chemicals, more and more of the fragments from the equation will be used, so that eventually all the fragments will be used to some extent. Under those circumstances we know that the asymptotic rat oral LD50 will be 1700 mg/kg (this number comes from the constant in our equation and is based on the 1851 chemicals from which the equation was calculated).

At the other end of the spectrum, suppose one chemical was dominant, or for that matter, that one chemical was much more toxic than the others at the particular site. Then one would be able to say that the LD50 for that dump lies somewhere between that of the most toxic chemical and, at worst, 1700 mg/kg. In fact, if there are only a few chemicals in the dump, the estimation problem of course doesn't exist.

It would be interesting to experimentally check these ideas by testing appropriate combinations of chemicals in the rat oral LD50 assay. If such an experimental scheme were properly designed, we would also learn about the single or multiple weighting to be given to those fragments which appear in more than one chemical. It is also not too difficult to imagine various ways in which synergism and antagonism could be accounted for. The same general principles could also be applicable to the other endpoints that we have presently modeled, as well as other endpoints that may be modelable in the future. This is particularly the case for aquatic and inhalation toxicity.
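As a schematic sketch of the fragment-additivity idea (the fragment names, weights, intercept form and molecular-weight coefficient below are invented for illustration and are not the coefficients of the structure-activity equation discussed above):

    import math

    # Additive structure-activity sketch: predicted log10(LD50) = intercept
    # + molecular-weight term + sum of fragment contributions.
    # All coefficients here are hypothetical placeholders.
    INTERCEPT = math.log10(1700.0)          # asymptotic rat oral LD50 cited above, mg/kg
    FRAGMENT_WEIGHTS = {"frag_X": -0.30, "frag_Y": 0.12, "frag_Z": -0.05}
    MW_COEFF = -0.001

    def predicted_ld50(fragment_counts, mol_weight):
        # Predicted rat oral LD50 (mg/kg) for a structure described by fragment counts
        contribution = sum(FRAGMENT_WEIGHTS[f] * n for f, n in fragment_counts.items())
        return 10 ** (INTERCEPT + MW_COEFF * mol_weight + contribution)

    # Treating a mixture of roughly equal parts of two chemicals as the union of
    # their fragments, under the no-synergism / no-antagonism assumption above:
    mixture_fragments = {"frag_X": 1, "frag_Y": 2, "frag_Z": 1}
    print(round(predicted_ld50(mixture_fragments, mol_weight=150.0)), "mg/kg (illustrative)")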
DR. HARRY SKALSKY
Our discussions on the health assessment of exposures to chemical mixtures have underscored the complexities involved in attempting to evaluate multiple chemical exposures as they pertain to dump sites. It does not appear that a single "methodology" can be developed that will cover all situations. Each dump site will entail its own unique problems, whether it be its proximity to populous areas, specific hydrology, or the variety of chemicals it contains. There does not appear to be any substitute for a case-by-case approach. Perhaps the most compelling need surrounding the assessment of a particular site is the gathering of accurate exposure information. It would appear that the Agency would need to devise a procedure to identify and quantitate the individual chemical exposures at a dump site location. Without this exposure information, it is almost impossible to accurately assess the potential hazard to a specific population. I would think a response team could be developed with a priority of gathering exposure data as the first step in reacting to notification of a potential dump site problem. Judgments concerning public safety could then be made on the basis of sound exposure data rather than a "hunch."

The identification of the chemicals involved and a quantification of exposures is a necessary first step. Then judgments as to the effect of these exposures, the presence of a hypersensitive population, etc., can be made on a case-by-case basis.
-180-
-------
GENERAL COMMENTS
• The ACGIH graded-response formula was intended to be applicable only to similar components (e.g., a mixture of solvents), not to a mixture of unrelated compounds. The similarity is assumed to apply to target organ, kinetics, and overall uptake. Application to mixtures of dissimilar compounds, assuming no synergism, would likely result in overestimating the actual toxicity. For carcinogens at low risk levels, the addition of risks is probably suitable (see the note below).
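A one-line justification of that last point, in notation of our own choosing: if the single-chemical excess risks $p_1, \ldots, p_N$ are small and act roughly independently, the combined risk is

$$1 - \prod_{i=1}^{N}(1 - p_i) \approx \sum_{i=1}^{N} p_i,$$

so simply summing the individual risk estimates is a good approximation at the low risk levels (e.g., $10^{-5}$ or below) usually at issue for environmental carcinogens.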
-181-
-------
REFERENCES
Food Safety Council. 1978. The proposed system for food safety assessment. Food Cosmet. Toxicol. 16(Supplement 2): 1-136.
Squire, R.A. 1981. Ranking animal carcinogens: A proposed regulatory
approach. Science. 214: 877-880.
-182-
-------
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES
Assessment of Exposure
Presentation: Dr. James Falco
Exposure Assessment Group, U.S. EPA
Presentation: Dr. Ian Nisbet
Clement Associates
-183-
-------
PRESENTATIONS
DR. JAMES FALCO: ASSESSMENT OF EXPOSURES
Introduction
Exposure assessments are critical to the evaluation of the potential
public health risks due to exposure to toxic chemicals. Several components
are needed to estimate the extent of the exposure:
1. An estimate of the releases into the environment, as well as monitored ambient concentrations.
2. An estimate of the number of people potentially exposed.
3. A description and quantification of characteristics of the
exposed population.
Chemical-specific (e.g., fate and transport) and site-specific (e.g.,
ecological) data are also necessary components of complete exposure assess-
ments. Each component will directly affect the accuracy and the resulting
implications of each assessment.
Present Approach
The present approach used to estimate whole-body dose from ambient environmental concentrations is to treat complex mixtures as a set of independently acting chemical agents. The environmental concentration of each constituent is determined, and then the dose from each constituent is estimated as if it were acting singly.
Possible Approach
For complex mixture exposure assessments, the following estimates could
be made:
1. Concurrent release rates.
2. Temporal and spatial variations in environmental concentrations.
3. Population exposed.
4. Exposures to each chemical and simultaneous exposure to two or
more chemicals.
-184-
-------
At least two major problems specific to complex mixtures must be
overcome to make this approach workable. First, an estimation of the timing
of releases must be made, and second, any modifications of environmental
behavior due to chemical interactions must be understood.
DR. IAN NISBET: ASSESSMENT OF EXPOSURES
To make exposure assessments, one must take into account the behavior of the chemical in the environment and the behavior of the individual exposed. Environmental factors include release rates, environmental transport and ambient concentrations in various media, including considerations of space and time variations. Factors related to the exposed individual include both population characteristics (e.g., numbers, demography, movement patterns and susceptibility factors) and uptake factors (e.g., activity patterns, intake rates, absorption factors and pharmacokinetics). Large fluctuations in exposure rates produce the need to statistically express the frequency of exposure to different concentration levels and the time scale of the fluctuations. The best exposure assessments will use information from all of the following sources: ambient monitoring, models, target monitoring, analogies and surrogates.

Meaningful assessments of exposure must include assessment of temporal variability and duration. In multichemical situations, the constitution of the mixtures to which people are exposed varies in both time and space, so that no single measure of exposure can be adequate. I recommend the use of several indicator (or surrogate) chemicals, and toxicity testing of environmental mixtures to derive a relationship between risk and exposure to the indicator chemicals (including variability of this relationship). Recognizing the practical difficulties, I recommend using target monitoring (human tissues and sentinel animals, not wild animals) as an invaluable tool in exposure assessment.
-185-
-------
DISCUSSION
DR. JULIAN ANDELMAN
In the presentation on assessment of exposure by Dr. Nisbet, as well as the critique by Dr. Piscator on subpopulations at greater risk (see the following chapter), the likelihood of encountering log-normal exposures and uptakes in an exposed population was appropriately raised. Thus, in addition to considering the increased sensitivity of certain segments of the exposed population, particular attention should be directed to the question of the variability of exposures and body burdens among groups that might nominally be expected to have the same exposure. Such variabilities can and do arise as a result of individual behavior, as well as varied physiology. This is graphically shown in geographically discrete populations of children who have log-normally distributed concentrations of lead in their blood. An example of the implications of this phenomenon is the assessment of the likely exceedance of a threshold level (e.g., a NOAEL) via a given route of exposure, such as drinking water. Assuming that the water concentration and a 2-liter daily water intake together correspond to this NOAEL, the fraction of the population likely to exhibit an effect might be considerably less than 100% because of the log-normal distribution. This could be addressed through the uncertainty factor mechanism, but it should be specifically considered as a principle to be incorporated in some fashion.
MR. WILLIAM GULLEDGE
Disadvantages of Transport Modeling Approach
Discussion at the workshop revealed the belief that transport modeling can frequently yield results that are 1-2 orders of magnitude off measured results. This is certainly the case with respect to chemical fate modeling, which is a poor tool for exposure assessment. Very few, if any, models have been properly field validated. Laboratory validation, using microcosms or an alternative experiment, does not provide a true indicator of performance in the environment.

Another disadvantage of modeling is the extensive data requirement for input into the program. A typical model that could be used for water quality assessment and criteria development includes extensive input parameters for the water column and sediment interactions. Measurements of partitioning, hydrolysis, oxidation, biodegradation, photolysis and volatilization must be taken for the water column. Assessments of sedimentation, resuspension, density and solids concentration must be made for the sediment. These data must be obtained, at considerable expense, before most models can be run and their results used for human health exposure assessment.
Use of Surrogates for Multi-Chemical Exposure
Several presentations during the workshop alluded to the use of indicator or surrogate compounds for hazard assessment. Research in this area has shown poor correlation between selected surrogates and actual toxic compounds. In a study conducted by the Chemical Manufacturers Association (CMA), "no statistically reliable correlations were found between conventional and nonconventional pollutants and toxic organic pollutants." The indicator concept was applied to volatile, base/neutral, and acid fractions with mixed results. Very few correlations were observed, one exception being 2,4-dichlorophenol. Additional research is necessary.
DR. ROLF HARTUNG
The relationships of intermittent or fluctuating exposures to steady experimental exposures need further study, using more sophisticated analytical tools than Haber's Rule or Siderenko's modification.
-187-
-------
DR. IAN NISBET
Most cases of injury by environmental chemicals have multiple causes. A consequence of the phenomenon of dose-additivity-with-background is that, at low doses of the environmental agent, all of the observed cases will be strongly associated with background causes, and only weakly associated with the incremental exposure to the agent. (For example, most asbestos-induced lung cancers occur in smokers.) Unless multiple causation is explicitly recognized, we will underestimate the effect of environmental agents. (This is what I would like to be known as the Doll-Peto fallacy.)
GENERAL COMMENTS
There is a problem in obtaining data from dump sites because of the need to sample quickly and to decide what to do for all government bodies and citizen groups involved.

Use of exposure models without real data produces a great deal of uncertainty.

A simple statistical model assuming a log-normal distribution of exposure is one of the most useful models.

Target monitoring is probably the most useful method, but unless there is a surrogate chemical to which there has been a high enough exposure, it will be difficult to plan an exposure assessment around target monitoring alone, due to matrix problems in analyzing very small samples of the various chemicals. However, some guidance for future study design can be obtained from even one tissue sample.

For chemicals that are not unique to the dump site, such as lead, the contribution from other sources must be considered.

There is a problem of double counting volatile compounds (e.g., exposure from water may be measured and then, as the compound volatilizes, the air level is measured as well).

The distribution of exposure has not been characterized for multichemical exposures.

There is a working assumption to use the log-normal distribution, since it fits the data.

The use of tracer chemicals is not being considered, since it would take too long to monitor groundwater movement.

Measurement of the toxicity of the unknown materials and their migration potential should be considered.
-188-
-------
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES
Subpopulations at Greater Risk
Presentation: Dr. Linda Erdreich and Ms. Cynthia Sonich Mullin
              ECAO, OHEA, U.S. EPA
Critique: Dr. James Withey
          Food Directorate, Bureau of Chemical Safety
Critique: Dr. Eula Bingham
          University of Cincinnati
Critique: Dr. Magnus Piscator
          University of Pittsburgh
-189-
-------
PRESENTATION
DR. LINDA ERDREICH AND MS. CYNTHIA SONICH MULLIN: HYPERSUSCEPTIBLE SUB-
GROUPS OF THE POPULATION IN MULTICHEMICAL RISK ASSESSMENT
Introduction
The existence of hypersusceptible individuals has been recognized by the EPA, even in the absence of specific data on the response of hypersusceptible humans or experimental animals. However, even when specific subgroups have been identified, they are often considered only qualitatively in criteria derivation. Little consideration has been given to the methodical protection of specific hypersusceptible subgroups. The goal of this presentation is to critically assess the need to identify and protect such individuals, particularly in the context of risk assessment following exposure to toxic waste sites. In this regard, two main issues will be addressed:

1. To what extent does the present approach protect high-risk, including hypersusceptible, subgroups of the population?

2. Do these hypersusceptible individuals comprise a proportion of the population that is sufficiently large to justify the systematic consideration of high-risk groups in the risk assessment process?

Prior to addressing these issues, specific terminology and background must be provided. Hypersusceptibility and sensitivity have been used interchangeably. Redmond (1981) defines sensitivity as "responsiveness to a pollutant," where sensitivity refers to the rate of change of a response as the dose increases. We prefer the term hypersusceptibility in that it simply implies "more susceptible." A hypersusceptible individual is one who will experience an adverse health effect to one or more pollutants, significantly before the general population, because of one or more factors which predispose the individual to the harmful effects.
These individuals are essentially at higher risk of adverse health effects due to exposure to the pollutants. However, "high risk" is often used to designate specific groups, such as occupational groups, that are at higher risk because they experience higher exposures. Another example would be persons living near hazardous waste sites. The distinction here is that some of the workers or some of the residents may be hypersusceptible, but all workers and all residents are at high risk because they have been exposed to levels higher than those to which the general population is exposed.
In his book, Pollutants and High Risk Groups, Calabrese (1978) classifies hypersusceptible individuals into five main categories based on biological factors which increase human susceptibility to pollutants, as described in Table 11. Mainly the first four categories will be considered at this time. Known physiological mechanisms form the basis of Calabrese's categorizations and include phenomena such as transplacental transfer of certain chemicals, greater gastrointestinal absorption in younger children, and vulnerability of the embryo and fetus. These factors clearly indicate that there is a wide range of susceptibility in the human population.

Few data exist to aid in quantifying the amount of excess risk due to hypersusceptibility in humans. Such quantitation could in principle be inferred from appropriate animal data; however, such data are scarce. Qualitative data, as well as the mechanistic processes, support the existence of such groups. Qualitative support is derived from observations in both animals and humans that adverse effects occur in only a portion of those exposed, despite similar exposures. The effects observed in the exposed population may also vary in degree of severity.

Despite the lack of quantitative dose-response information, hypersusceptible individuals have not been totally neglected in regulatory procedures.
-191-
-------
TABLE 11
Biological Factors Predisposing Individuals to Hypersusceptibility to Pollutants*

Factors and Examples                 Rationale

Developmental Processes
  Pre- and neonatal                  Immature enzyme detoxification systems
  Young children                     Higher rate of GI absorption

Genetic Disorders
  G-6-PD deficiency                  Hemolysis in presence of certain chemicals
  Sickle-cell trait                  Predisposes toward hemolytic anemia

Nutritional Deficiencies
  Vitamin C deficiency               Potentiates effects of pollutants
  Protein deficiency                 Affects metabolism of insecticides

Existing Disease
  Heart                              Increased mortality in pollution episodes
                                     demonstrated in epidemiologic studies
  Lung                               Existing conditions are aggravated by
                                     respiratory irritants

Behavioral Factors
  Smoking                            Increases exposures and therefore risk;
                                     interferes with normal cleansing mechanisms
                                     of the lung
  Alcohol                            Liver damage; synergism with other chemicals
  Dietary patterns                   Increases exposures; deficiency states
                                     increase risk

*Source: Adapted from Calabrese, 1978
-192-
-------
For example, the Clean Air Act of 1970 mandated that the health of hypersusceptible as well as healthy individuals of the population be protected. Two instances for which standards have been set for the hypersusceptible individual are nitrates and lead. For both contaminants, the hypersusceptibles are infants and children. Particularly in the case of nitrates, infants are the only group of the population affected. It appears that standards are set for the hypersusceptible group only if available human data demonstrate the risk to that particular group. If, however, animal models and/or mechanistic information suggest a hypersusceptible group, then this information can only be considered qualitatively in scientific guidance for risk assessment (e.g., the effect of PCBs or chlorinated hydrocarbons on pregnant women).

In general, the neglect of such subgroups is due to the lack of quantitative data and the assumption that those individuals comprise only a small proportion of the population. This latter issue will be discussed and assessed in detail in the following section of this presentation. However, although some of these groups may be quite small in number for a given chemical, the issue must be considered in light of potential exposure to complex mixtures. Since one or more specific hypersusceptible subgroups may be associated with each contaminant, the total number of individuals classified as hypersusceptible could comprise a significant proportion of the population. Thus, multiple chemical exposures could lead to multiple hypersusceptible subgroups. Furthermore, because of the diversity of each hypersusceptible subgroup, it has been postulated that everyone in the population will be a member of a susceptible group at certain times during life (Calabrese, 1978; Redmond, 1981).
-193-
-------
Is the Present Approach Protective of Hypersusceptible Subgroups?

The EPA guidelines suggest that a 100-fold safety, or uncertainty, factor be incorporated when extrapolating from chronic animal studies to humans. This can be perceived as including a 10-fold factor to account for interspecies variability (i.e., extrapolation from animals to humans), and a 10-fold factor to account for intraspecies variability (i.e., the average vs. the sensitive human). To evaluate the extent to which the present approach protects hypersusceptible subgroups, the data source as well as the extrapolation process must be considered.

For the majority of chemicals, the sources of the data are laboratory experiments. Experimental animals are inbred for physiological homogeneity. Further homogeneity is introduced by experimental designs that select animals of similar age and weight.

In an attempt to ascertain the extent to which a 10-fold dose reduction protects sensitive members of the animal population, Dourson (1982) examined the range of log-dose probit slopes of 490 animal studies compiled by Weil (1972). The majority of the slopes, based on acute effects, were greater than 3. At such slopes, a 10-fold reduction in dose drops the average response at least three standard deviations, that is, to a level protective of the sensitive animal (Weil, 1972; Dourson, 1982). For the 8% of chemicals in the example having a probit log-dose slope less than 3, a 10-fold reduction in dose would not achieve a three-standard-deviation reduction in response. Thus, the 10-fold factor may be protective of the sensitive laboratory animal for a majority of chemicals, based on an assumed normal distribution of variability. There are few similar studies of the dose-response relationship in humans.
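Spelling out the arithmetic behind that criterion (in our notation): in a log-probit model the expected probit of response is

$$y = a + b \log_{10}(\mathrm{dose}),$$

so a 10-fold dose reduction lowers the expected probit by $b \log_{10}(10) = b$ units, i.e., by $b$ standard deviations of the tolerance distribution. With a slope $b \ge 3$, a 10-fold reduction therefore moves the expected response at least three standard deviations below the mean; for the roughly 8% of chemicals with $b < 3$, the same reduction achieves less than that.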
-194-
-------
This 10-fold factor may not be adequate to protect humans because the human population is not as homogeneous as the animal population. Contributing to the heterogeneity in human populations is the fact that all members of the population are exposed to environmental pollutants regardless of health status, sex, age, weight or nutritional status. The range of individual variability of physiological values is ill-defined and is often not normally distributed in the healthy population. The variability for those who are diseased is generally much larger than that of the healthy population. The safety factor concept was developed in recognition of this variability as well as of other uncertainties in the extrapolation procedure, but the selection of the number 10 is not based on a quantitative assessment of this variability.

The present approach has not been designed in ignorance of the existence of hypersusceptible individuals, but rather on the assumptions 1) that no quantitative data are available to support a more accurate dose reduction to protect them, and 2) that, for any one chemical, the portion of the population which is hypersusceptible is too small to merit consideration. The issues surrounding the former assumption merit further consideration and study. The issues surrounding the latter assumption are evaluated in the following section.

What Proportion of the Exposed Population Is Hypersusceptible?

The hypothesis is that, for a population exposed to multiple chemicals, the proportion that can be considered hypersusceptible will be substantial. Various subgroups will be hypersusceptible to at least one of the chemicals, and some to more than one. Therefore, while the present approach may be adequate for single chemicals, an alternative is necessary for the case of exposure to multiple chemicals.
-195-
-------
There are two components to the process of quantitatively estimating the number of people hypersusceptible to toxic effects from exposure to multiple chemicals. The first step is the qualitative identification of the hypersusceptible subgroups. The second is to estimate the frequency of these groups.

Qualitative Identification of Hypersusceptible Groups.

The mechanistic approach of Calabrese (1978) is particularly useful for identifying potential hypersusceptible subgroups, given the absence of human and animal data illustrating hypersusceptibility empirically for specific chemicals. Calabrese (1983) recognizes the scarcity of laboratory data and suggests potential animal models, several diseases, and conditions of proposed hypersusceptibility. Cardiovascular disease is the leading cause of death in the United States and a highly prevalent cause of morbidity. Although this disease has been widely studied, McCauley and Bull (1980) contend that the impact of environmental contamination on cardiovascular toxicity is not well known. Toxicological and epidemiological studies have shown an association between cardiovascular disease and environmental contaminants such as certain metal ions and chlorinated solvents. In some of the animal studies, healthy animals are used and exposure levels are quite high. On the other hand, human studies are unable to control for confounding factors such as diet and smoking habits. McCauley and Bull (1980) suggest animal models for various cardiovascular pathologies. Thus, animal studies offer the opportunity to control for the complexity of factors in the human environment and to define the specific steps in the disease process which environmental toxins exacerbate. These authors further suggest animal models appropriate to the study of such environmental contamination.
-196-
-------
The category of hypersusceptibility that has received the most attention in cases of environmental pollution is teratogenicity. The biological basis for embryonic and fetal vulnerability and for transplacental carcinogenicity is strong (Strobino et al., 1978; Rao et al., 1981; Saxena et al., 1981; Rice, 1981; Kurzel and Cetrulo, 1981; Fabricant and Legator, 1981). However, the relationship between animal and human teratogenicity is not as clearly established as the relationship between animal and human carcinogenicity.

Through preliminary investigations we have identified several chemicals that support the hypothesis that several subgroups are hypersusceptible to environmental exposures. Table 12 shows these chemicals. In addition to the embryo, fetus and young children, those with existing pulmonary disease and coronary heart disease are likely candidates for high-prevalence subgroups.
Estimating Prevalence of Hypersusceptible Subgroups.

While statistics on births and mortality are routinely collected, statistics on morbidity are not. The prevalence of chronic conditions is available from epidemiologic studies and national surveys (e.g., Framingham, HANES, HIS), which are not compiled by political division as are vital statistics. Furthermore, the epidemiologic surveys are performed irregularly on a national sample, whereas vital statistics are compiled every year for the entire population.

Table 12 shows the prevalence of several specific hypersusceptible subgroups. Morbidity rates from an interview survey conducted by the National Center for Health Statistics (1974) were: chronic bronchitis, 33 per 1000; hypertensive disease, 60 per 1000; and heart conditions, 50 per 1000. This interview did not include residential institutions such as
-197-
-------
TABLE 12
Prevalence of Subgroups Hypersusceptible to Effects of Common Pollutants

Hypersusceptible subgroup   Prevalence(a)                    Chemicals(b)                          Reference
Embryo, fetus, neonate      pregnant women: 21/1000(c)       Carcinogens, solvents, CO, mercury,   Rice, 1981; Kurzel and Cetrulo, 1981;
                                                             lead, PCBs, pesticides                Saxena et al., 1981
Young children              ages 1-4: 70/1000                Hepatotoxins, PCBs, metals            Calabrese, 1981; Friberg et al., 1979
Lung disease                emphysema, asthma: 37/1000(d)    Ozone, Cd, particulates, SO2, NO2     Holland, 1979; Redmond, 1981
Coronary heart disease      coronary heart disease:          Chlorinated solvents, fluorocarbons   McCauley and Bull, 1980; Aviado, 1978
                            16-27/1000(d)
Liver disease               liver condition: 20/1000(e)      Carbon tetrachloride, PCBs,           Calabrese, 1978
                                                             insecticides, carcinogens

(a) All estimates based on the 1970 census
(b) Representative sample. Some evidence from animal studies only.
(c) Authors' estimate from 1970 census statistics data
(d) Health Interview Survey (NCHS, 1970)
(e) Health Interview Survey (NCHS, 1975)
-------
nursing homes and penitentiaries, and it is believed to underestimate the true prevalence in the general population. Therefore, the potential number at risk may be higher.

The proportion of hypersusceptible individuals in an exposed population may be estimated by multiplying the exposed population by the proportion at risk in the general study population. Although greater precision could be obtained by partitioning the exposed population into demographic subgroups at differing risks and calculating age-, race- and sex-specific risks, this is generally believed to be impractical. Thus, estimates of the prevalence of hypersusceptible subgroups for a given site will be imprecise. For this reason a risk assessment scheme for cases of exposure to multiple chemicals may be limited to an ordinal or ranking scheme rather than one based on continuous data.

When considering exposure to more than one chemical, the number of hypersusceptible subgroups is even greater. Table 13 shows inventories of typical waste sites along with the prevalence of the high-risk subgroups. One method of estimating the proportion hypersusceptible to more than one chemical is to add the rates for each chemical. The proportion at high risk is conservatively estimated to be 10-20% among these sites. The number of higher-risk individuals for a hypothetical population of 5000 is given in the last column of Table 13. Clearly, some of the categories of chronic diseases are not mutually exclusive, and addition would provide an overestimate since some individuals can be the victims of more than one condition. On the other hand, certain individuals may be excessively hypersusceptible to the effects of one chemical due to more than one pre-existing condition.
-199-
-------
TABLE 13
Hypersusceptible Subgroups Associated with a Typical Inventory of Chemicals at Waste Sites to Which People Have Been Exposed

Chemicals found at 4 of 4 sites: chlorinated ethanes (dichloroethanes), dichloroethylenes, 1,1,1-trichloroethane, trichloroethylene, chloroform
  Hypersusceptible subgroups: coronary heart disease (CHD); liver condition; pre-exposure to hepatotoxins; embryo/fetus; lung disease
  Prevalence rate per 1000 (number of hypersusceptibles in a hypothetical population of 5000):
    CHD: 24 (120); liver condition: 20 (100); pregnant women: 21 (105); lung (bronchitis): 33 (165)

Chemicals found at 3 of 4 sites: benzene, ethyl benzene, methylene chloride, tetrachloroethylene
  Hypersusceptible subgroups: thalassemia plus others; CHD; liver condition; lung disease
  Prevalence rate per 1000 (number in a hypothetical population of 5000):
    thalassemia: 0.1-8% of persons of Italian, Greek, Syrian and African origin; CHD: 24 (120); liver condition: 20 (100); lung (bronchitis): 33 (165)

Chemicals found at 2 of 4 sites: carbon tetrachloride, trichlorofluoromethane, vinyl chloride, arsenic, chromium, selenium, silver, mercury
  Hypersusceptible subgroups: liver condition plus others; lung disease; CHD; vitamin C deficiency; young children (to metals); vitamin E deficiency; selenium deficiency; cystinuria, pregnant women plus others
  Prevalence rate per 1000 (number in a hypothetical population of 5000):
    liver condition: 20 (100); lung (bronchitis): 33 (165); CHD: 24 (120); vitamin C deficiency: 10-30% of infants, children and adults of low income (25); ages 1-4: 70 (350); pregnant women plus others (105)
-------
Conclusion
Due to the large number of individuals in high-risk subgroups, the present approach may be adequate only for single chemicals. An alternative is necessary when evaluating the health risk from exposure to multiple chemicals. Some of the hypersusceptible subgroups for single chemicals may comprise a significant proportion of the population while others may not. However, when considering multiple chemical exposures and, therefore, multiple hypersusceptible subgroups, even the very small groups become significant because they are part of a large number of individuals who are hypersusceptible to the chemical mixture. For a typical waste site, 10-20% of the population may be at high risk.

Because the importance of considering such groups is evident, a scheme has been devised to systematically assess the vulnerability of high-risk groups exposed to multiple chemicals at a defined site. The scheme is proposed to derive an index for a hazardous waste site on an ordinal scale (Table 14). The proposed approach involves identification of the contaminants and their respective hypersusceptible subgroups. Once these subgroups have been identified, the proportion of the population they comprise will be estimated by applying prevalence rates for the general population to the exposed population. Methods of merging the affected population with the exposure data will be explored. This overall ranking scheme is envisioned as a component of the multiple chemical risk assessment procedure.
-202-
-------
TABLE 14
Proposed Approach to Evaluate Multiple Chemical Exposures
for Impact on Hypersusceptible Subgroups

1) Obtain monitoring data relevant to the exposure
2) List the hypersusceptible subgroups associated with each contaminant
3) List the prevalence rates for each hypersusceptible subgroup
4) Calculate, from 2 and 3, the estimated percentage and number of hypersusceptible individuals in the exposed population (a sketch of this calculation follows)
5) Incorporate this index into the site-specific risk assessment
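A minimal sketch of steps 2-4, assuming the subgroup assignments and prevalence rates (per 1000) have already been compiled; every value below is a placeholder rather than data from an actual site:

    # Steps 2-4 of the proposed approach: from contaminants to an estimate of the
    # number of hypersusceptible individuals in the exposed population.
    # Subgroup assignments and prevalence rates are placeholders only.
    subgroups_by_contaminant = {
        "trichloroethylene": ["liver condition", "pre-exposure to hepatotoxins"],
        "chloroform": ["liver condition", "coronary heart disease"],
    }
    prevalence_per_1000 = {
        "liver condition": 20,
        "pre-exposure to hepatotoxins": 15,     # placeholder value
        "coronary heart disease": 24,
    }
    exposed_population = 5000

    # Step 2: union of subgroups over all contaminants found at the site
    subgroups = {s for subs in subgroups_by_contaminant.values() for s in subs}

    # Steps 3-4: sum the subgroup counts; overlapping conditions make this an
    # overestimate, as noted in the presentation.
    total = sum(prevalence_per_1000[s] / 1000 * exposed_population for s in subgroups)
    print(f"Estimated hypersusceptible individuals (upper bound): {total:.0f} "
          f"({100 * total / exposed_population:.1f}% of {exposed_population})")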
-203-
-------
CRITIQUES
DR. JAMES WITHEY
The following are philosophical approaches to dealing with subpopulations at greater risk:

1. Ban exposure to the substance. Tell people they can't live near the dump site.
2. Clean up the site. (This may not be cost-effective.)
3. Recommend that smokers and drinkers be looked at specifically.
4. Some groups that are particularly sensitive may be discovered if good epidemiological programs and followup are in place.
5. Consider those people who are exposed to chemicals with long elimination half-lives, like DDT. Such chemicals may accumulate in large amounts in the storage depots of exposed individuals and then become mobilized during sickness or weight loss.
DR. EULA BINGHAM
In her presentation, Dr. Erdreich presented the following definition of hypersusceptible:

   A hypersusceptible individual (or population) is one who will experience an adverse health effect to one or more pollutants, significantly before the general population, because of one or more factors which predispose the individual to the harmful effects.

I agree with this definition -- actually we are asking whether certain groups are at "higher risk." These groups have been identified by various methods and may be characterized as follows:

1. Developmental (i.e., various periods in the life cycle) -- e.g., conceptus, children, reproductively active males and females, aging populations.
2. Dietary influences -- nutritional aspects.
3. Disease states -- e.g., impaired renal function, asthmatics, chronic pulmonary disease, hypertension, etc.
4. Previous exposures (usually occupational) -- e.g., lead burden, initiating doses of carcinogens.
5. Genetic -- e.g., AHH induction vs. human lung cancer; thalassemias (deficiency in production of RBCs, homozygotes vs. heterozygotes; 25% Italian-American and German-American); sickle cell trait; G-6-PD deficiency (frequency may be 15%).

All individuals will be more susceptible at one time or another. I agree with this statement, but I do not agree that, as a hypersusceptible group, the conceptus has received the most attention. Societal responses and the high media value make it a visible area, but from a scientific viewpoint, skin sensitization, enzyme dysfunctions and allergic reactions other than skin have received great attention.

A particular note of caution: when TLVs are used in any way to set acceptable levels, it should be remembered that they are not only based on a limited period of exposure, but are usually set for young, healthy, adult, white males.

The next questions are:

1. What are the sizes of the subgroups, and what constitutes a "significant" number of hypersusceptible individuals?
2. How much more susceptible are they, e.g., 10 times, 100 times, 1000 times? (Risk is 16 and 36 times greater in medium and high inducers of AHH.)

In the present approach by the Agency:

1. Quantitative data to support the thesis have not been presented and cannot be found in the literature.
2. The number of susceptibles may be quite small for any one chemical, but would merit consideration for combinations (as at dump sites).

A review of the literature to determine the magnitude of the increased risks incurred by "susceptible" groups will provide some insight into the appropriate safety factor. It seems likely that for some contaminants the so-called susceptible groups will be so numerous, if a lifecycle viewpoint is used, that the "nonsusceptible" or more resistant become the minority of the population.
DR. MAGNUS PISCATOR
The following are some parameters of interest when a population is exposed to a contaminant in water:

1. The contaminant may occur in one water supply used by the whole community. The concentrations in tap water will be similar in all households. If each household has its own well, the distribution of concentrations of the contaminant will probably be log-normal, and a few wells may have quite high concentrations. If several contaminants are present, they may move at different speeds into the water supply, and peak concentrations may thus occur at different times.
2. The ingested dose will depend on water consumption, which may vary between 0.5 and 3 L/day. The distribution is probably normal.
3. The absorbed dose will depend on age, diet, existence of nutritional deficiencies and diseases, and probably many other factors. Some substances may be expected always to be absorbed to more than 90%, whereas others may show a very large variation in absorption; e.g., lead may be expected to show a log-normal absorption distribution, with infants and women with iron and calcium deficiencies at the extreme end of the distribution. An additional dose may be obtained by inhalation if the substance easily evaporates from water.
4. The biological half-time may vary due to age, existence of liver or kidney disease, interactions, diet, etc. Some compounds are more rapidly metabolized by children and by people already exposed to some chemicals, e.g., drugs. Normal, log-normal and bimodal distributions may all exist.
5. The critical concentration for effects on an organ may vary considerably. This may be due to genetic, nutritional and disease factors. The cardiomyopathy seen in beer drinkers exposed to cobalt may serve as an example. Asthmatics may get lung irritation from concentrations of air pollutants tolerated by healthy people.
6. The final result is a wide variation in effects, the distribution being log-normal, with a majority showing no or mild effects, whereas a few may be expected to show more marked effects. Most populations exposed to waste chemicals are small, and it is unlikely that in such small groups any significant adverse effects will be noted.
7. The decisive factor is obviously the ingested dose. If it is below a certain level, none of the other factors will have enough influence to create an effect.
-207-
-------
DISCUSSION
DR. JULIAN ANDELMAN

In the presentation on assessment of exposure by Dr. Nisbet (see the previous chapter), as well as the critique by Dr. Piscator on subpopulations at greater risk, the likelihood of encountering log-normal exposures and uptakes in an exposed population was appropriately raised.

Thus, in addition to considering the increased sensitivity of certain segments of the exposed population, particular attention should be directed to the question of the variability of exposures and body burdens among groups that might nominally be expected to have the same exposure. Such variabilities can and do arise as a result of individual behavior, as well as varied physiology. This is graphically shown in geographically discrete populations of children who have log-normally distributed concentrations of lead in their blood. An example of the implication of this phenomenon is the assessment of the likely exceedance of a threshold level (e.g., a NOAEL) via a given route of exposure, such as drinking water. Assuming that the water concentration and a 2-liter daily water intake together correspond to this NOAEL, the fraction of the population likely to exhibit an effect might be considerably less than 100% because of the log-normal distribution. This could be addressed through the uncertainty factor mechanism, but it should be specifically considered as a principle to be incorporated in some fashion.
DR. MARVIN LEGATOR
The area of the susceptible individual as discussed at this meeting presented an interesting approach to linking individual chemicals to a possible adverse effect on subjects in the population. It may well be that establishing such linkages could offer a procedure for detecting an adverse outcome before it is more generally apparent.
-208-
-------
DR. REVA RUBENSTEIN
The paper on hypersusceptible individuals was very thought-provoking. It seems to me that examination of clinical case studies of individuals with underlying pathology may be a fruitful way of addressing both "hypersusceptibility" and multichemical exposure. Many of these patients are already receiving multiple drug therapy. An additional, possibly productive line of investigation is to better characterize, or classify, the known drug interactions in "normal" populations.
DR. MAGNUS PISCATOR
The main emphasis of the presentation was on preexisting disease, and
certain groups were Identified. In my critique I pointed out that the final
effect will be dependent on a number of exogenous and endogenous factors,
all of which will be within normal or log-normal distributions, e.g.*,
concentration of contaminant 1n water, Intake of contaminant, absorption,
biological half-time, critical concentration.
I also showed that 1n a subgroup with cardiovascular disease treated
with a phenoxyproplonlc add, additional exposure to a small amount of
phenoxy adds would have little significance, whereas the exposure might be
significant 1n a healthy population.
People with preexisting disease generally are taking drugs, and Inter-
ference with drug metabolism may occur, changing elimination. Thus exposure
to PAH, DDT, etc., may decrease half-times for some drugs and make treatment
less effective.
DR. JAMES WITHEY
Some of the newer topics such as the "Special Groups at Risk" need to be
refined a little and the approach crystallzed. Undoubtedly, the "dumps"
will have to be considered on a case-by-case basis, and "special groups" may
be Identified as a consequence of the retrospective epidemiology.
-209-
-------
DR. RICHARD KOCIBA

The issue of hypersusceptible subpopulations should consider these subpopulations as the extreme ends of the normal distribution curve of the population as a whole. On this basis, the conventional uncertainty factors historically used to accommodate intraspecies variability appear, in most cases, to be adequate.

One must keep in mind that the entire population (and the resultant normal distribution curves) can be considered to be comprised of various subgroups of varying susceptibility. Based on these factors, there does not appear to be any great need to change the basic approach that has been used historically (uncertainty factors) to accommodate intraspecies variability.

In regard to the issue of multichemical carcinogenesis, the unpublished studies discussed by Dr. Schneiderman in his critique of the presentation on biological bases of toxicant interactions should be reviewed. This would allow one to evaluate the likely outcome of multichemical long-term exposure with regard to carcinogenesis.

DR. IAN NISBET

Hypersusceptible groups are the only groups of importance for risk assessment because, by definition, they constitute the low end of the dose-response curve. If a hypersusceptible subpopulation is sufficiently discrete that the distribution of susceptibilities is bimodal [Figure 26(A)], then the dose-response curve may be convex upwards. If so, the assumption that a linear dose-response relationship is an upper bound on risk will be wrong [see Figure 26(B)].
[Figure 26 appears here: (A) frequency of susceptible individuals versus dose (d), showing a distinct hypersusceptible subpopulation; (B) response versus dose (d), showing a linear model that underestimates risk at low doses.]

FIGURE 26

Dose-response characteristics of a hypothetical population that includes a hypersusceptible subpopulation. (A) Frequency of individuals susceptible to dose d (i.e., for whom d is a threshold). (B) Comparison of linear dose-response model with dose-response curve for hypothetical population.
DR. MARVIN SCHNEIDERMAN

With respect to the issue of sensitivity, two pieces of data that I had difficulty interpreting or understanding in the past now look as if they might be interpreted to show either the effects of greater sensitivity in a portion of a human population, or greater responsiveness in a portion of the population perhaps induced by other exposures. That is, these data may provide evidence of an effect of mixed exposures in a human population. The two sets of data derive from industrial exposures. The attached Figure 27 shows these data schematically.

Both parts of the figure show responses (or relative risk) in an industrial population as a function of duration of exposure. In each of the two parts the points representing the persons with the lowest exposures lie above the dose-response line fitted to the data. One possible interpretation of this is that these "excessive responses" include persons who had been previously exposed to other materials, thus making them more sensitive to the subsequent exposure (to asbestos, or radon gas). There are, of course, several other "explanations": the sick worker effect (what made them "sick"?); poor data for the "controls" or zero-exposed groups; underestimates of dose for low exposures, etc.

GENERAL COMMENTS

The intent should not only be to estimate the number of individuals in each subgroup but also to determine the health impact on the overall population. Both frequency and severity of effects must be addressed.

Susceptibility can be differences in kind as well as degree.

Susceptibility may be bimodal, which would put into question the slope of the usual dose-response curve.

Multiple causation must be considered. People with preexisting health problems may be the first ones affected. An additional insult (e.g., chemical exposure) may be overlooked, and their disease will be blamed on the preexisting condition (e.g., alcoholism, problems associated with smoking).
[Figure 27 appears here: two panels showing response (or relative risk) plotted against working level months (dose measure) for uranium and against duration of exposure for asbestos.]

FIGURE 27

Response (or Risk) in an Industrial Population as a Function of Duration of Exposure to Uranium and Asbestos

Source: Adapted from Lundin et al., 1971 and Nicholson et al., 1982
Subgroups should be identified and used to indicate adverse effects only when it is appropriate.

Adding all hypersusceptible populations together should not be considered.

The most sensitive animal model could be used.
REFERENCES

Aviado, D.M. 1978. Effects of fluorocarbons, chlorinated solvents, and inosine on the cardiopulmonary system. Environ. Health Perspect. 26: 207-215.

Calabrese, E.J. 1978. Pollutants and High-Risk Groups. The Biological Basis of Increased Human Susceptibility to Environmental and Occupational Pollutants. John Wiley and Sons, NY.

Calabrese, E.J. 1981. Nutrition and Environmental Health: The Influence of Nutritional Status on Pollutant Toxicity and Carcinogenicity. John Wiley and Sons, NY.

Calabrese, E.J. 1983. Principles of Animal Extrapolation. John Wiley and Sons, NY.

Dourson, M.L. 1982. Regulatory and experimental support of safety factors. (Submitted for publication)

Fabricant, J.D. and M.S. Legator. 1981. Etiology, role and detection of chromosomal aberrations in man. J. Occup. Med. 23: 617-625.

Friberg, L., et al., Ed. 1979. Handbook on the Toxicology of Metals. Elsevier/North-Holland Biomedical Press.

Holland, W.W., A.E. Bennett, I.R. Cameron, et al. 1979. Health effects of particulate pollutants: Reappraising the evidence. Am. J. Epidemiol. 110: 525-659.

Kline, J.K., Z.A. Stein, B.R. Strobino, M.W. Susser and D. Warburton. 1977. Surveillance of spontaneous abortions: Power in environmental monitoring. Am. J. Epidemiol. 106: 345-350.

Kurzel, R.B. and C.L. Cetrulo. 1981. The effect of environmental pollutants on human reproduction, including birth defects. Environ. Sci. Technol. 15: 626-640.

Lundin, F.E., J.K. Wagoner and V.E. Archer. 1971. Radon daughter exposure and respiratory cancer. Quantitative and temporal aspects. NIOSH/NIEHS Joint Monograph No. 1. NTIS, Springfield, VA.

Maclure, K.M. and B. MacMahon. 1980. An epidemiologic perspective of environmental carcinogenesis. Epidemiol. Rev. 2: 49-70.

Mantel, N. and M. Schneiderman. 1975. Estimating "safe" levels, a hazardous undertaking. Cancer Res. 35: 1379-1386.

McCauley, P.T. and R.J. Bull. 1980. Experimental approaches to evaluating the role of environmental factors in the development of cardiovascular disease. J. Environ. Pathol. Toxicol. 4: 27-50.

NCHS (National Center for Health Statistics). 1970. Natality Statistics Analysis, United States, 1965-1967. Vital and Health Statistics. PHS Publ. No. 1000, Series 21 - No. 19. NCHS, PHS, Washington, U.S. GPO, May.

NCHS (National Center for Health Statistics). 1974. Prevalence of Chronic Circulatory Conditions, United States, 1972. Vital and Health Statistics Series 10 - No. 93. NCHS, PHS, Washington, U.S. GPO, September.

NCHS (National Center for Health Statistics). 1975. Selected Vital and Health Statistics in Poverty and Nonpoverty Areas of 19 Large Cities, United States, 1969-1971. Vital and Health Statistics Series 21 - No. 26. NCHS, PHS, Washington, U.S. GPO, November.

Nicholson, W.J., G. Perkel and I.J. Selikoff. 1982. Occupational exposure to asbestos: Population at risk and projected mortality 1980-2030. Am. J. Ind. Med. 3: 259-311.

Ohio Department of Health. 1975. Report of Vital Statistics for Ohio, 1975. Columbus, OH.

Rao, K.S., B.A. Schwetz and C.N. Park. 1981. Reproductive toxicity risk assessment of chemicals. Vet. Human Toxicol. 23: 167-175.

Redmond, C.K. 1981. Sensitive population subsets in relation to effects of low doses. Environ. Health Perspect. 42: 137-140.

Rice, J.M. 1981. Prenatal susceptibility to carcinogenesis by xenobiotic substances including vinyl chloride. Environ. Health Perspect. 41: 179-188.

Saxena, M.C., J.K.J. Siddiqui, A.K. Bhargava, C.R. Krishna Murti and D. Kutty. 1981. Placental transfer of pesticides in humans. Arch. Toxicol. 48: 127-134.

Strobino, B.R., J. Kline and Z. Stein. 1978. Chemical and physical exposures of parents: Effects on human reproduction and offspring. Early Human Development. 1: 371-399.

Weil, C.S. 1972. Statistics vs. safety factors and scientific judgement in the evaluation of safety for man. Toxicol. Appl. Pharmacol. 21: 454-463.
HEALTH ASSESSMENT OF EXPOSURES TO CHEMICAL MIXTURES

Biological Bases of Toxicant Interactions and Mathematic Models

Presentation: Dr. Patrick Durkin
              Syracuse Research Corporation

Critique: Dr. Thomas Clarkson
          University of Rochester

Critique: Dr. Herbert Cornish
          University of Michigan

Critique: Dr. Kenneth Crump
          Science Research Systems

Critique: Dr. Marvin Schneiderman
          Clement Associates
PRESENTATION

DR. PATRICK DURKIN: MULTIPLE CHEMICAL EXPOSURES

Introduction

Having addressed the issues of single chemical risk assessments from multiple routes of exposure, the next and last step is to determine a reasonable approach or set of approaches for dealing with multiple chemical exposures. While some hazardous waste disposal facilities may involve significant exposure to only a single chemical, most hazardous waste disposal facilities will involve exposures to a variety of compounds that may induce similar or dissimilar effects. For the purposes of this discussion, it will be assumed that the compounds at the site have been identified, single compound risk assessments have been conducted as described in the previous chapters, exposure levels for the population at risk have been determined, and the available data on toxicant interactions have been analyzed. This section will discuss the biological and chemical bases for assuming that toxicant interactions may occur, describe mathematic models which can be used to assess the effects of multiple compound exposure, give examples and indices for quantifying toxicant interactions, and recommend an approach for hazardous waste disposal facilities.

Biological and Chemical Bases of Toxicant Interactions

The ability to predict how specific mixtures of toxicants will interact must be based on an understanding of the mechanisms of such interactions. Most reviews and texts which discuss toxicant interactions make some attempt to discuss the biological or chemical bases of the interactions (e.g., Klaassen and Doull, 1980; Levine, 1973; Goldstein et al., 1974; NRC, 1980; Veldstra, 1956; Withey, 1981). Although different authors use somewhat different classification schemes for discussing the ways in which toxicants
interact, it is generally recognized that toxicant interactions may be based on any of the processes that are significant to the toxicologic expression of a single compound: absorption, distribution, metabolism, excretion, and activity at the receptor site(s). In addition, compounds may interact chemically, causing a change in the biological effect, or they may interact by causing different effects at different receptor sites. Using a modification of the basic scheme proposed by Veldstra (1956), Table 15 summarizes these general modes of interaction along with some examples. As indicated in the discussion below, there is some overlap among the different categories.

Most cases of direct chemical-chemical interactions lead to a decrease in toxicologic activity, and this is one of the common principles of antidotal treatment. Examples include the use of chelating agents to complex with metal ions, the inactivation of heparin by protamine, and the use of ammonia as an antidote to the ingestion of formaldehyde through the formation of hexamethylenetetramine (Goldstein et al., 1974). This class of reactions has been referred to as chemical antagonism by Klaassen and Doull (1980). Chemical reactions which lead to greater than additive effects appear to be less common and are certainly much less documented. One example which has recently received considerable attention is the formation of nitrosamines from nitrites and amines, which results in an increase in both toxic and carcinogenic effects (Weisburger and Williams, 1980). Thus, while antagonism may be predominant in this type of toxicant interaction, synergism or potentiation cannot be ruled out.

Many examples of toxicant interactions are based on alterations in patterns of absorption, distribution, excretion, or metabolism of one or more compounds in the mixture. A recent review of these factors in the assessment of multiple chemical exposures has been presented by Withey (1981).
TABLE 15

Chemical and Biological Bases of Toxicant Interactions*

Basis of            Example of Synergism            Example of
Interaction         or Potentiation                 Antagonism

Chemical            Formation of nitrosamines       Chelating agents and
                    from nitrites and amines        metals

Biological

  Absorption        Increased dermal absorption     Decreased absorption of
                    of many toxicants when          tetracycline when admin-
                    administered in dimethyl        istered with calcium
                    sulfoxide                       carbonate

  Distribution      Displacement of anticoagu-      --
                    lants from plasma proteins
                    by phenylbutazone

  Excretion         Decreased renal excretion       Increased renal elimina-
                    of penicillin when co-          tion of phenobarbital
                    administered with               when co-administered
                    probenecid                      with sodium bicarbonate

  Metabolism        Increased toxicity of           Decreased toxicity of
                    parathion by stimulation        parathion by inhibition
                    of microsomal enzyme            of microsomal enzyme
                    activity with phenobarbital     activity with piperonyl
                                                    butoxide

  Interaction at    --                              Blocking of acetylcholine
  Receptor Sites                                    receptor sites by
  (Receptor                                         atropine after poisoning
  Antagonism)                                       with organophosphates

  Interaction       --                              Interaction of histamine
  Among Receptor                                    and norepinephrine on
  Sites (Functional                                 vasodilation and blood
  Antagonism)                                       pressure

*See text for discussion, additional examples, and references.
All of these types of interactions essentially alter the bioavailability of the toxic agent(s) at the receptor site(s) without qualitatively affecting the toxicant-receptor site interaction.

Most types of interactions based on alterations in absorption involve vehicle effects, the chemical formation of poorly absorbed conjugates, or decreases in gastrointestinal motility. For instance, dimethyl sulfoxide, a commonly used vehicle in dermal toxicity studies, is known to facilitate the absorption of many organic compounds across the skin, thus causing apparent potentiation (Goldstein et al., 1974). Similarly, the acute oral toxicity of many compounds is substantially affected by the vehicle used, and a large number of these effects are probably due to differences in rate of absorption. Examples of compounds that form poorly absorbed complexes after oral administration include tetracycline and calcium carbonate, as well as cholestyramine and cholesterol (Goldstein et al., 1974). Some compounds, such as codeine, morphine, atropine, and chloroquine, decrease the rate of gastric emptying and thus decrease the rate of absorption of orally administered compounds. For the most part, such interactions usually lead to decreases in effects due to the slower rate of absorption, rather than increases in effects due to more complete absorption (Levine, 1973). As discussed by Withey (1981), there are relatively few examples of toxicologically significant changes in absorption associated with the inhalation of mixtures.

Distribution can play a role in compound interactions if a more active agent is displaced from an inactive site to a primary receptor site by a less active or inactive agent. One of the best documented examples of this type of activity is the displacement of anticoagulants from plasma proteins by compounds such as barbiturates, analgesics, antibiotics, or diuretics
(Goldstein et al., 1974). Since body fat represents a major inactive storage site for many lipophilic xenobiotics, it may be anticipated that compounds which cause fat mobilization could result in similar potentiating effects (Withey, 1981). It should be noted that both of the above types of examples result in greater than additive effects -- synergism or potentiation. A distributional mechanism for antagonism does not seem probable and has not been encountered in the literature reviewed.

Excretion as a basis for toxicant interaction usually involves compounds which are eliminated via the kidneys. For instance, probenecid and carinamide both competitively inhibit the elimination of penicillin, thus prolonging or potentiating its desirable therapeutic effect. Similarly, phenylbutazone inhibits the renal excretion of hydroxyhexamide, which can cause undesirably prolonged hypoglycemia (Goldstein et al., 1974). If a toxicant is eliminated via the kidneys, a stimulation of renal elimination can cause an antagonistic effect, as is seen with the coadministration of phenobarbital and sodium bicarbonate, in which the increased urine alkalinity induced by the bicarbonate ion increases the excretion of phenobarbital.

Altered patterns of compound metabolism have been shown to be the bases of many toxicant interactions. The major enzyme system involved in such interactions is liver microsomal mixed-function oxidase, which is involved in the activation or detoxication of a wide variety of compounds and can be induced by agents such as phenobarbital and inhibited by agents such as piperonyl butoxide (Goldstein et al., 1974). Thus, depending on whether the toxicant is activated or detoxified, inducers or inhibitors of this enzyme system can cause synergistic/potentiating effects or antagonistic effects. However, toxicant interactions involving this enzyme system can be very complex and are dependent on both dose and duration of exposure, with
some compounds causing an initial inhibition of enzyme activity followed by a marked induction of activity (NRC, 1980). Although liver microsomal mixed-function oxidase is the most commonly studied enzyme system involved in toxicant interactions, mixed-function oxidases in other tissues may also play an important role in toxicant interactions, as may other enzyme systems such as alcohol and aldehyde dehydrogenases, monoamine and diamine oxidases, dehydrochlorinases, azo and nitro reductases, and hydrolases, as well as enzyme systems involved in conjugation reactions. For instance, ethanol is a useful antagonist for the toxic effects of methanol by competitive substrate inhibition of alcohol dehydrogenase, suppressing the formation of formaldehyde and formic acid from methanol (Goldstein et al., 1974).

As indicated above, all of these biological modes of toxicant interactions -- absorption, distribution, excretion, and metabolism -- are essentially dispositional, affecting the amount(s) of toxicant(s) reaching the primary receptor(s), and most of these types of interactions can involve either synergism/potentiation or antagonism. The other basic type of biological basis for toxicant interactions involves events that occur at the receptor sites or among the receptor sites, which are usually thought to result solely in antagonistic interactions. The antagonistic nature of interactions that occur at the same receptor site has been discussed by Veldstra (1956):

     ...we may say that the effect of a combined action of two compounds
     at the same site of primary action will not result in a synergism,
     but will, generally, even be unfavorable. The competition for the
     receptor will usually decrease the frequency of the best inter-
     actions, and with decreasing intrinsic activity of one of the
     components the combined action will more and more take the form of
     a competitive antagonism.
Examples of such interaction include the antagonistic effects of oxygen on carbon monoxide, atropine on cholinesterase inhibitors, and naloxone on morphine (Goldstein et al., 1974). The antagonistic consequences of this type of toxicant interaction are so consistent that it has been termed receptor antagonism by Klaassen and Doull (1980) and pharmacologic antagonism by Levine (1973). While it does not seem inconceivable that one compound could increase the intrinsic activity of another compound by modifying the receptor site -- analogous to the effect of modulators on regulatory enzymes -- such interactions have not been demonstrated.

Interaction among receptor sites is also thought to result primarily in antagonistic effects and has been referred to as functional antagonism by both Klaassen and Doull (1980) and Levine (1973). This type of interaction is most commonly defined as two or more compounds acting on different receptor sites and causing opposite effects on the same physiologic function. Examples include the opposite effects of histamine and norepinephrine on vasodilation and blood pressure, and the anticonvulsive effects of barbiturates on many compounds that cause convulsions. However, it is not certain that interactions among receptor sites uniformly result in an antagonistic response, particularly when the receptor sites act on different physiological systems. The rationale for this statement has been presented by Veldstra (1956):

     The sites of action for two compounds having the same type of
     activity may be different. This is the case when the effect can be
     caused either by a direct stimulation or by the annihilation of an
     inhibition.

     Competitive antagonists for different intermediates in a bio-
     synthetic chain fall in the same category, if the inhibition of the
     synthesis of the end-product is taken as their effect.

     In both cases, the combination of two compounds, linked in
     parallel or in series, as it were, may well result in a synergistic
     effect.
     When the components of a combination possess different sites
     of action and different types of activity, no plausible prediction
     about the possibility of synergism can be made, unless their mode
     of action is well known.

A possible illustration of Veldstra's argument is presented in the work of Alstott et al. (1973), who examined the acute lethal effects of combinations of 1-methylxanthine and ethanol on mice, and noted two basic types of effects: kidney dysfunction and increased respiratory rate and depth. In organisms exposed to mixtures in which the ratio of 1-methylxanthine to ethanol was relatively high, antagonism of acute lethal toxicity was observed. However, in mixtures in which the same ratio was relatively low, a synergism of acute lethal toxicity was observed. This indicates that in cases where toxicants interact at more than one receptor site, the nature of the interaction can be either antagonistic or synergistic. The complicating factor of the "asymmetric" pattern of interaction observed by Alstott et al. (1973) is discussed in greater detail in the following section.
Mathematic Models for Joint Action

The simplest mathematic models for joint action describe either dose addition or response addition. Dose addition, referred to as simple similar action by Finney (1971) and simple joint action by Bliss (1939), assumes that the toxicants in a mixture behave as if they were dilutions or concentrations of each other; thus the slopes of the dose-response curves for the individual compounds are identical, and the response elicited by the mixture can be predicted by summing the individual doses after adjusting for differences in potency, the ratio of equitoxic doses. Although this assumption can be applied to any model (e.g., the one-hit model in NRC, 1980), it has been most often used in toxicology with the log-dose probit-response model, which will be used to illustrate the assumption of dose additivity. Assume that two toxicants show the following log-dose probit-response equations:

     Y1 = 0.3 + 3 log Z1                                     (1)

     Y2 = 1.2 + 3 log Z2                                     (2)

where Yi is the probit response associated with a dose Zi. The potency, p, of toxicant-2 with respect to toxicant-1 is, by definition, Z1/Z2 when Y1 = Y2 (i.e., equitoxic doses). In this example, the potency, p, is ~2. Dose addition assumes that the response, Y, to any mixture of the two toxicants can be predicted by:

     Y = 0.3 + 3 log (Z1 + pZ2)                              (3)

It should be noted that since p is defined as Z1/Z2, Equation 3 essentially converts Z2 into an equivalent dose of Z1 by adjusting for the difference in potency. A more generalized form of this equation for any number of toxicants is:

     Y = a1 + b log (Σ fi pi) + b log Z                      (4)

where a1 is the y-intercept of the dose-response equation for toxicant-1, b is the slope of the dose-response line for each toxicant, fi is the proportion of the i-th toxicant in the mixture, pi is the potency of the i-th toxicant with respect to toxicant-1 (Z1/Zi), and Z is the sum of the individual doses in the mixture. A more detailed discussion of the derivation of the equations for dose addition is presented by Finney (1971).
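For illustration only, the dose-addition calculation in Equations 1 through 4 can be sketched numerically as follows (Python). The intercepts, common slope, and doses are the hypothetical values used above; the function names are illustrative and not part of any established procedure.

     import math

     # Hypothetical log-dose probit-response lines from Equations 1 and 2:
     #   Y1 = 0.3 + 3 log10(Z1)     Y2 = 1.2 + 3 log10(Z2)
     a1, a2, b = 0.3, 1.2, 3.0

     # Potency of toxicant 2 relative to toxicant 1 (ratio of equitoxic doses,
     # Z1/Z2).  With a common slope b, a1 + b log Z1 = a2 + b log Z2 gives
     # p = 10 ** ((a2 - a1) / b), which is ~2 for these coefficients.
     p = 10 ** ((a2 - a1) / b)

     def mixture_response(z1, z2):
         """Probit response to a mixture predicted by dose addition (Equation 3)."""
         return a1 + b * math.log10(z1 + p * z2)

     def mixture_response_by_proportion(total_dose, f1, f2):
         """Equation 4 written for two toxicants: total dose Z in proportions f1, f2."""
         return a1 + b * math.log10((f1 + p * f2) * total_dose)

     # The two forms agree for the same mixture (here 10 dose units of toxicant 1
     # plus 5 dose units of toxicant 2, i.e., Z = 15 with f1 = 2/3 and f2 = 1/3).
     print(round(p, 2))
     print(round(mixture_response(10.0, 5.0), 2))
     print(round(mixture_response_by_proportion(15.0, 2.0 / 3.0, 1.0 / 3.0), 2))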
The other form of additivity is referred to as response addition. As detailed by Bliss (1939), this type of joint action assumes that the two toxicants act on different receptor systems and that the correlation of individual tolerances may range from completely negative (r = -1) to completely positive (r = +1). Analogous to the concept of dose addition, response addition assumes that the response to a given concentration of a mixture of toxicants is completely determined by the responses to the components and the correlation coefficient. Taking P3 as the proportion of organisms responding to a mixture of two toxicants which evoke individual responses of P1 and P2:

     P3 = P1                  if r = 1 and P1 > P2           (5)

     P3 = P2                  if r = 1 and P1 < P2           (6)

     P3 = P1 + P2(1 - P1)     if r = 0                       (7)
However, as illustrated in the following section of this presentation, most of the available data on toxicant interactions are not adequate to test the hypothesis of additivity and cannot be used to estimate the necessary parameters in interactive models.

Measurements of Toxicant Interactions

Approaches to the analysis of toxicant interactions used by most toxicologists have been based on the assumption of dose addition. One common measurement, referred to here as the ratio of interaction (R.I.), is the ratio of the observed EC50 of a mixture to the EC50 predicted by Equation 3 for dose addition. Most applications of this ratio are based on a single mixture and use questionable methods to determine significance. Keplinger and Deichmann (1967) used the ratio of interaction to measure the joint action of various pesticides in mice. In this study, only one mixture of each combination was used, and significant interaction was arbitrarily defined as ratios of 0.57 and less for synergism and 1.75 and greater for antagonism. Smyth et al. (1969, 1970) used a slightly modified expression of the ratio of interaction, which resulted in estimates that approximated a normal distribution. Significant interaction was then defined as those ratios which were beyond 1.96 standard deviations from the mean ratio. In studies on the joint action of pesticides in houseflies, Sun and Johnson (1960) defined the cotoxicity coefficient as the ratio of interaction multiplied by 100. Again, the investigators used only a single mixture. Significant interaction was estimated by taking repeated measurements and determining if the 95% confidence interval of the cotoxicity coefficients included zero. More recently, Wolfenbarger (1973) used cotoxicity coefficients to estimate the joint action of toxaphene-DDT mixtures in insects. Although different mixtures of this combination were used, no attempt was made to integrate the results into a clear pattern of interaction. In addition, Wolfenbarger (1973) used 95% confidence intervals of the LC50 to determine if the mixtures were significantly more or less toxic than either of the components.
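As a purely illustrative sketch of the ratio of interaction, the following computation (hypothetical EC50 values and mixture proportions) predicts the mixture EC50 that dose addition with parallel dose-response lines would imply, and compares it with an observed value. The 0.57/1.75 cutoffs cited above for Keplinger and Deichmann (1967) are used only to show how such a ratio might be read.

     def predicted_ec50(ec50_1, ec50_2, f1, f2):
         """Mixture EC50 expected under dose addition with parallel probit lines.

         For a mixture containing fractions f1 and f2 (f1 + f2 = 1) of the two
         toxicants, Equation 3 reduces to a harmonic combination of the EC50s.
         """
         return 1.0 / (f1 / ec50_1 + f2 / ec50_2)

     def ratio_of_interaction(observed_ec50, ec50_1, ec50_2, f1, f2):
         """Observed mixture EC50 divided by the dose-addition prediction."""
         return observed_ec50 / predicted_ec50(ec50_1, ec50_2, f1, f2)

     # Hypothetical example: component EC50s of 10 and 40 (same units), a 1:1
     # mixture, and an observed mixture EC50 of 8.
     ri = ratio_of_interaction(8.0, 10.0, 40.0, 0.5, 0.5)
     cotoxicity_coefficient = 100.0 * ri      # ratio of interaction x 100
     print(round(predicted_ec50(10.0, 40.0, 0.5, 0.5), 1))   # 16.0
     print(round(ri, 2), round(cotoxicity_coefficient, 1))   # 0.5  50.0
     # Read against the cutoffs quoted above (<= 0.57 synergism, >= 1.75
     # antagonism), this hypothetical ratio would be called synergistic.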
Along with these uses of the ratio of interaction, Ohsawa et al. (1975) used dose addition in an attempt to account for the toxicity of technical grade toxaphene based on the toxicity of various toxaphene fractions to houseflies. This concept has enjoyed widespread use in aquatic toxicology (Esvelt et al., 1971). Marking and Dawson (1975) have recently proposed a modified approach in a test for interaction. Like most of the approaches using the ratio of interaction, this method utilizes only a single mixture of each combination. In this method, significance is determined by using the 95% confidence intervals of the LC50 of the mixture and the two components to estimate the confidence intervals of the additive index. Intervals of the additive index which do not include zero are considered indicative of significant interaction.

All of the above approaches are severely limited by their reliance on a single interactive ratio. As discussed by Hewlett (1969), the ratio of interaction is characteristic only of a particular mixture of a combination. In other words, the estimated value of the ratio of interaction will vary depending on the proportions of the toxicants present in the mixture. This concept is explicitly defined in Equation 4 by the terms f1 and f2. Another limitation in the use of ratios of interaction is encountered in attempts to demonstrate statistical significance. The method used by Sun and Johnson (1960), based on repeated measurements of the ratio of interaction, may be the least objectionable. However, because of the dependence
of the ratio of interaction on f1 and f2, the estimate of interaction is valid only for the particular mixture tested and has no merit in assessing the overall interaction characteristic of the combination being tested. This limitation may be particularly misleading for those compounds which evidence asymmetric interaction. The approach adopted by Keplinger and Deichmann (1967) is totally arbitrary and makes no attempt to establish a criterion for statistical significance. The method of Smyth et al. (1969, 1970) is based on an arbitrary selection of test chemicals which influence the criteria for interaction. The other methods, which use 95% confidence intervals of the LC50 of the mixture and individual components (Marking and Dawson, 1975; Wolfenbarger, 1973), are overly sensitive to both endogenous and exogenous variance. Marking and Dawson (1975) recognized the difficulty with exogenous variance in stating that "well-planned toxicity tests which result in narrow confidence intervals are most useful in the assignment of the effects of chemical mixtures." However, if endogenous variation is high (i.e., the slope of the log-dose probit-response line is low), even well-designed toxicity tests may yield 95% confidence intervals which preclude the detection of interaction.

The difficulty in demonstrating significant interaction with any of these tests using single ratios of interaction is primarily one of experimental design. Since the ratio of interaction is dependent on the proportions of the components in the mixture, a test has the best chance of demonstrating significant interaction if the mixture giving maximum interaction is selected. If the combination of toxicants being tested is assumed to evidence a pattern of symmetric interaction, a mixture of equitoxic doses would be the best selection. Even with this simplifying and not necessarily valid assumption, however, tests based on single ratios of interaction will
not yield significant results unless the magnitude of the interaction is substantial and the experimental variability is minimal.

Possible Approaches Based on Additivity

Two types of approaches may be used by the Agency, depending on whether ADIs or practical thresholds for the different toxicants have been established, or whether dose-response estimates have been made.

In the former case, one approach would be to use a modification of the equivalent exposure index defined by OSHA (37 FR 23502-23505) and recommended by De Rosa (1981) and ECAO (U.S. EPA, 1981). Using this method, a hazard index (HI) for a single toxicant to which individuals are exposed by oral (O), inhalation (I), and dermal (D) routes can be defined as:

     HI = EO/ThO + EI/ThI + ED/ThD                           (9)

where EO, EI, and ED are the daily exposures to the toxicant from the oral, inhalation, and dermal routes, respectively, and ThO, ThI, and ThD are the corresponding route-specific practical thresholds. If the hazard index for the compound is less than unity, no hazard is assumed to exist. If the hazard index is greater than unity, a hazard is assumed, but the magnitude of the hazard is defined only in relative terms with respect to the practical thresholds. Although this approach does not define dose-response relationships, it would be possible, if sufficient data were available, to derive practical thresholds for a spectrum of effects (e.g., MFO induction, minimal effects on several organs, severe effects on several organs, reproductive dysfunction, behavioral effects, and mortality). If practical thresholds could be derived for such a spectrum of effects, the hazard assessment could suggest not only if effects were likely to be seen, but also what types of effect, if any, might be expected.
For hazardous waste sites, which would probably involve exposures to more than one toxicant, the total hazard index (HIT) for the site could be calculated as the sum of the hazard indices for the n toxicants of concern:

     HIT = HI1 + HI2 + . . . + HIn                           (10)

Again, if practical thresholds for a spectrum of effects could be defined, total site-specific hazard indices could be calculated for each effect.
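A minimal sketch of Equations 9 and 10 follows; the exposures, practical thresholds, and route names are hypothetical and are intended only to show the bookkeeping.

     def hazard_index(exposures, thresholds):
         """Equation 9: sum of route-specific exposure/threshold ratios for one toxicant.

         Both arguments map route names (e.g., 'oral', 'inhalation', 'dermal') to a
         daily exposure and the corresponding practical threshold in the same units.
         """
         return sum(exposures[route] / thresholds[route] for route in exposures)

     def total_hazard_index(hazard_indices):
         """Equation 10: total hazard index for a site, summed over the n toxicants."""
         return sum(hazard_indices)

     # Hypothetical two-toxicant site.
     hi_1 = hazard_index({"oral": 0.5, "inhalation": 0.1, "dermal": 0.02},
                         {"oral": 2.0, "inhalation": 1.0, "dermal": 0.5})
     hi_2 = hazard_index({"oral": 0.2}, {"oral": 1.0})
     hi_t = total_hazard_index([hi_1, hi_2])
     print(round(hi_1, 2), round(hi_2, 2), round(hi_t, 2))   # 0.39  0.2  0.59
     print("hazard assumed" if hi_t > 1.0 else "no hazard assumed")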
A multiplication factor for the total hazard index could be recommended if data suggested that several of the toxicants at a site evidenced synergistic effects when applied in combination. For instance, for a site with 10 toxicants, 5 of which were reported to evidence synergistic interaction on liver toxicity, the base total hazard index for liver toxicity could be multiplied by 1.5 (i.e., 0.1 for each of the interacting toxicants); however, such an approach would have only a pretense of predictability. The approach might have merit for "protection," but its use would be a matter of policy, not science, and would ignore the realities and complexities of toxicant interactions. Furthermore, much of the literature reporting "synergism" or "antagonism" makes no meaningful attempt to determine if the observed responses reflect true interaction or simply additivity. Consequently, the decision to use such correction factors for interaction would have to be carefully monitored.

If dose-response relationships have been defined for the individual compounds at each site, information will be available on the expected incidence of response, P, for each effect of concern caused by each chemical for a single route of exposure. If more than one route of exposure is involved, route-to-route extrapolations, combined with the monitoring data/exposure estimates for each route, will be used to calculate the cumulative (i.e., from all routes) expected incidence of response, P, for each effect of concern caused by each chemical. An example of such a data set is given in Table 16, in which five hypothetical chemicals (I to V) are associated with a total of six effects of concern (A to F). The problem is to estimate the expected incidence of response for each effect and the cumulative incidence of adverse response in the population.
Accepting the premise that some form of additivity must be used, the most reasonable approach would seem to be response addition, in which the correlation of individual responses within the population is assumed to be zero. As indicated in Equation 7, the formula for predicting the total expected response (PT) for exposure to two chemicals, using this assumption, can be expressed as: PT = P1 + P2(1 - P1). This equation can be generalized, for any number of chemicals, as:

     PT = 1 - Π(1 - Pi)                                      (11)

Using this equation, the cumulative incidence of all adverse responses from each chemical (PCi) is given in the last column of Table 16, and the cumulative incidence of each adverse effect caused by the combination of chemicals (PEj) is given in the last row of Table 16.

The calculation of PEj is a straightforward use of the above equation. The calculation of PCi, the total incidence of adverse responses caused by each chemical, is somewhat different in that the assumption is that the effects induced by a given chemical are independent of one another. For some combinations of effects [e.g., increased liver weight, MFO induction, proliferation of smooth endoplasmic reticulum (sER) in liver cells] this assumption obviously will be invalid. For such cases, it may be more reasonable to assume that the correlation of tolerances is unity. The implications of this assumption are discussed below.
TABLE 16

Example of Risk Assessment for Multiple Toxicant Effects*

[Matrix of hypothetical expected incidences of response for chemicals I through V (rows) and effects of concern A through F (columns), with the cumulative incidence of all adverse responses from each chemical (PCi) in the last column and the cumulative incidence of each adverse effect from the combination of chemicals (PEj) in the last row:

PEj: 2.49x10^-2   3.60x10^-3   4x10^-2   9.79x10^-3   1x10^-3   1.30x10^-2      PT = 7.8x10^-2]

*See text for explanation of terms.
Accepting for the moment that the responses of concern have been selected so that the assumption of independence among responses is reasonable, the total cumulative incidence of all adverse effects from all chemicals (PT) can be calculated from Equation 11, substituting PEj for Pi.
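The calculations behind Table 16 and Equation 11 can be sketched as follows. The incidence matrix below is hypothetical (it is not the matrix underlying Table 16) and is used only to show how PEj, PCi, and PT are assembled under the assumption r = 0, with the more conservative r = -1 case (simple summation) shown for comparison.

     def response_addition(incidences):
         """Equation 11 with r = 0: P_T = 1 - product of (1 - P_i)."""
         p_none = 1.0
         for p in incidences:
             p_none *= (1.0 - p)
         return 1.0 - p_none

     def complete_negative_correlation(incidences):
         """r = -1 case: incidences simply add (capped at 1)."""
         return min(1.0, sum(incidences))

     # Hypothetical incidence matrix: rows are chemicals, columns are effects of
     # concern (cumulative over all routes for each chemical).
     p = [
         [2e-2, 0.0,  0.0,  3e-3, 0.0,  0.0],    # chemical I
         [0.0,  1e-3, 0.0,  0.0,  0.0,  3e-3],   # chemical II
         [5e-3, 0.0,  4e-2, 0.0,  1e-3, 0.0],    # chemical III
         [0.0,  2e-3, 0.0,  7e-3, 0.0,  5e-3],   # chemical IV
         [0.0,  0.0,  0.0,  0.0,  0.0,  6e-3],   # chemical V
     ]
     pe = [response_addition(column) for column in zip(*p)]   # PEj, one per effect
     pc = [response_addition(row) for row in p]               # PCi, one per chemical
     p_t = response_addition(pe)                              # overall PT, r = 0
     p_t_conservative = complete_negative_correlation(pe)     # comparison, r = -1
     print([round(x, 4) for x in pe])
     print([round(x, 4) for x in pc])
     print(round(p_t, 4), round(p_t_conservative, 4))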
At least two major concerns can be expressed about the application of this method: the assumption of tolerance correlation and the combining of responses. As previously discussed in the section "Mathematic Models for Joint Action," several types of response addition are possible, depending on the correlation of individual tolerances (r) within the population. However, the true correlation of individual tolerances to toxicants within the human population is not known. Some evidence suggests that cancer susceptibility in humans may be partially genetic. Furthermore, strain differences within a species in the susceptibility to chemical carcinogens also suggest a genetic component. Thus, a case probably could be made for assuming that r is positive for carcinogens. Nonetheless, the degree of the correlation cannot be estimated, and r probably varies for different carcinogens and systemic toxicants. Consequently, it seems reasonable to assume that r equals zero. This can be criticized as being somewhat conservative, but it is certainly less conservative than assuming that r equals -1. Assuming that r equals +1 would probably underestimate the risk. Consequently, Equation 11 is recommended for calculating the total expected response for exposure to multiple carcinogens or systemic toxicants.

It may be of some use to examine the practical significance of assuming r = 0 compared to the more conservative assumption of r = -1. For the hypothetical data given in Table 16, PT is 7.8x10^-2, assuming r = 0. If the assumption was made that r = -1, PT would equal ΣPEj, or 9.24x10^-2, which is ~18% greater than the estimated response assuming r = 0
(7.8x10^-2). It can be stated that the higher the expected incidence of response, the greater the difference will be between the estimates of response based on the assumptions of r = 0 and r = -1. For instance, if only effects A, B, and C were considered, the predicted incidences of response would be 6.72x10^-2 and 6.82x10^-2 for r = 0 and r = -1, respectively, a difference of only ~2%. Conversely, if all of the P's in Table 16 were increased by a factor of 10, the PT's would be 6.57x10^-1 and 9.24x10^-1 for r = 0 and r = -1, respectively, a difference of ~41%. This is substantially higher than the 18% difference noted at the original response rates given in Table 16. For most hazardous waste disposal facilities, the PT probably will not exceed 1x10^-2, and differences between the two assumptions should be small. Thus, since the increased conservatism of r = -1 will in most cases not be substantial, and since r = 0 is a more reasonable biological assumption than r = -1, the use of r = 0 rather than r = -1 seems justified.

However, as indicated previously, it may sometimes be more reasonable to assume that r = 1, or at least that r is positive, for some sets of responses. By analogy to the approach used for r = 0, PT could be set at the maximum value of Pi if the assumption was made that r = 1. This could, however, lead to some substantial errors in the estimate of risk. For instance, take the following series of responses in the liver for which r could be assumed to be positive and possibly equal to unity:

                                    Case 1     Case 2

     MFO induction                   0.50       0.5
     Proliferation of sER            0.30       0.5
     Increased liver weight          0.20       0.4
     Liver necrosis                  0.05       0.3
Assuming that r = 1, the PT for these effects in combination would be 0.5 for both Case 1 and Case 2, although Case 2 would be of greater concern than Case 1.

The most obvious, and perhaps the most defensible, approach would be not to combine effects. Thus, in Table 16, the PEj's could be used to estimate the expected incidence for each effect, but no estimate of PT would be made. If it is necessary to estimate PT, it will be necessary to segregate the effects into categories in which the assumption of r = 0 is reasonable. For each of these categories, a PT would then be estimated so that the severity of the effect is comparable to the other effects or categories of effects being combined.

Note: Throughout the above discussion, it has been assumed that the effects are quantal or can be treated as quantal responses. It is unlikely that sufficient data will be available on graded responses to allow for a quantitative analysis.
CRITIQUES

DR. THOMAS CLARKSON: BIOLOGICAL BASES

At low levels, the kinds of interactions will possibly be nothing more than the independent action of the individual chemicals, because the receptor sites, transport systems, etc., will not be saturated.

The mathematical approach is very complicated and possibly too expensive to use. Maybe the model could be used for statistical prediction as a first screen.

Nearly all our examples are from pharmacology and deal with rapidly acting drugs that cause acute effects. It is important to study examples of interactions among environmentally important agents:

     Tumor promoters -- cigarette smoking

     Dioxin receptors -- dioxins, dibenzofurans, PCBs, and aryl
          hydrocarbons: toxicity, induction of AHH, and genetic control

     Antioxidants -- selenium/CCl4 toxicity; vitamin E/CCl4 toxicity;
          GSH/carcinogens

The area of lipid peroxidation involving oxidizing free radicals contains many examples of interactions.

     Intestinal microflora -- toxification of agents, e.g., cycasin;
          detoxification, e.g., methylmercury; diet -- T1/2 of
          methylmercury

     Enterohepatic circulation -- Kepone and exchange resins; diet --
          methylmercury

     Ethanol -- CCl4; Hg°

     SOx, NOx -- clearance of deposited aerosols from the lung

     Gastrointestinal absorption -- Ca++ and Fe++ versus Pb++

     Complexing agents, natural and man-made (e.g., F-) -- metals such
          as aluminum and zinc

     Kidney interactions -- loss of epithelial cells leads to increased
          resistance (F- versus Pb++)

     Induction of metal-binding protein -- thionein (cadmium); nuclear
          inclusion bodies (lead)
DR. HERBERT CORNISH: BIOLOGICAL BASES

Toxicant interactions may involve the interaction of a chemical toxicant with a biological component. A typical example is the nonenzymatic reaction of unsym-dimethylhydrazine (UDMH) with vitamin B6 (pyridoxal phosphate) to form a hydrazone. The resulting acute B6 deficiency results in hyperexcitability and convulsions in experimental animals. Treatment with pyridoxine results in prompt alleviation of the CNS symptoms.

An interesting phenomenon is where the action of a second toxicant can alter the organ specificity of the first. 1-Nitronaphthalene produces both lung and liver toxicity in normal rats. Pretreatment with phenobarbital prevents the lung damage and enhances hepatotoxicity. This is accompanied by altered rates of excretion of metabolites and altered patterns of covalent binding in the lungs and liver.

Ethanol has long been known as a potentiator of halogenated solvent hepatotoxicity. Several other alcohols, such as methanol, isopropanol, or secondary and tertiary butanols, are also excellent potentiators of toxicity. Recent literature reports show that not only isopropanol, but also its metabolite acetone, potentiates halogenated solvent toxicity. Further studies with diabetic rats demonstrate that the uncontrolled diabetic rat was at greater risk from carbon tetrachloride exposure than was the normal rat. Thus diabetics may be a susceptible subgroup in the human population.
Of major concern is how to make the best use of known information on synergism or antagonism in the risk assessment of a specific mixed exposure situation. It appears relatively easy to add another safety factor when synergistic effects of two chemicals have been demonstrated. Are we equally comfortable in making a similar type of judgment when antagonistic effects of two chemicals have been amply demonstrated?

DR. KENNETH CRUMP: MATHEMATIC MODELS

The proposal to use a mathematical model to estimate a "benchmark intake" for setting ADIs seemed to me to meet with a generally favorable reaction. The principal objection was the lack of the necessary quantitative data for some toxic endpoints. This would mean that the methodology could not be applied universally, although it could be used in many instances. I believe that adoption of such an approach could have the useful side benefit of promoting a greater degree of quantification and improved data-reporting procedures in toxicological studies. To achieve the greatest impact in this area, the methodology should be presented in the scientific literature. Details of the method that need to be worked out and agreed upon include:

     Definition of the benchmark (e.g., the dose corresponding to a
     10^-1 risk).

     The mathematical model to be used in setting the benchmark.

     Whether confidence limits should be used in setting the benchmark
     (my own answer to this is definitely yes).

     How to combine data for different toxic effects (e.g., whether to
     use different safety factors to reflect severity of effects).
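As a purely illustrative sketch of the benchmark idea in the first item above, a one-hit model, P(d) = 1 - exp(-qd), can be inverted to give the dose corresponding to a specified benchmark risk. The potency value below is hypothetical, and neither this particular model nor these numbers are prescribed by the discussion above.

     import math

     def one_hit_risk(dose, q):
         """One-hit model: probability of response at a given dose."""
         return 1.0 - math.exp(-q * dose)

     def benchmark_dose(benchmark_risk, q):
         """Dose at which the one-hit model predicts the benchmark risk."""
         return -math.log(1.0 - benchmark_risk) / q

     q = 0.02                                # hypothetical potency (per mg/kg/day)
     bmd = benchmark_dose(0.10, q)           # dose corresponding to a 10^-1 risk
     print(round(bmd, 2))                    # ~5.27 mg/kg/day
     print(round(one_hit_risk(bmd, q), 3))   # 0.1, by construction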
The suggestion I made in my discussion for setting ADIs for mixtures is as follows. Let fi be the fraction of the mixture made up of chemical i and let ADIi be the ADI for chemical i. Assume the risk from each chemical is linear in dose, i.e., that the risk from exposure to chemical i alone is given by

     Ri = qi x dosei

If we assume all ADIs are comparable in the sense that they all correspond to the same risk R, then it follows that

     qi = R/ADIi

Now assume further that the risk from exposure to a mixture is the sum of the risks from the component chemicals. Then if ADImixture is also to correspond to risk R, it must satisfy

     R1 + . . . + Rk = R

where Ri is the risk from the i-th chemical. But this can be written as

     q1 f1 ADImixture + . . . + qk fk ADImixture = R

or

     1/ADImixture = f1/ADI1 + . . . + fk/ADIk

This approach is consistent with additivity in risk and linearity in dose. It could be used with mixtures of carcinogens and systemic toxicants. It seems inevitable that some form of additivity must be the basis for estimating allowable exposures to mixtures. A reasonable argument can be made for such an approach, and the data needed for more complicated methods will almost never be available.
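A minimal sketch of the mixture ADI formula derived above, with hypothetical component ADIs and mixture fractions:

     def adi_mixture(fractions, adis):
         """Mixture ADI under risk additivity and linearity in dose:
         1/ADI_mixture = sum over i of (f_i / ADI_i)."""
         if abs(sum(fractions) - 1.0) > 1e-9:
             raise ValueError("mixture fractions must sum to 1")
         return 1.0 / sum(f / adi for f, adi in zip(fractions, adis))

     # Hypothetical three-component mixture (fractions by weight) with
     # single-chemical ADIs in common units (e.g., mg/kg/day).
     print(round(adi_mixture([0.5, 0.3, 0.2], [1.0, 0.1, 0.01]), 4))   # ~0.0426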
DR. MARVIN SCHNEIDERMAN: MATHEMATIC MODELS

There is a thoughtful (and useful) publication by the NAS/NRC (1980) on the problems of joint toxicity. The work was done under the chairmanship of Dr. Sheldon Murphy of the Department of Pharmacology of the University of Texas on behalf of the Coast Guard. Why the Coast Guard? Coast Guard officers inspect ships, including the tanks and holds in which chemicals are shipped, and are thus exposed to many toxic materials, sometimes in combination, more often in sequence. The Coast Guard was concerned with long-range effects. Chapter 9 and Appendix B of the report discuss some of the mathematical models of joint or combination actions. Dr. John Gart of the National Cancer Institute wrote Appendix B.

An operational conclusion of the Coast Guard report is that for relatively low exposures, a good first approximation to overall toxicity can be made by (effectively) adding the toxicities. For materials treated as if they were linear in dose-response and without threshold, additivity in response is equivalent to additivity in dose, and there is no need to make distinctions between which of the additive models will be used.

For materials for which a threshold is assumed (or presumed), additivity in response can be in error, and additivity in dose is the only appropriate approach. An example should make this obvious. Say we have a "threshold" material, and the threshold is at 5 units of dose. At any lower dose, response is zero. Say this material exists in the ambient environment (Source A) at a level of 3 units. There will be no toxic responses. It also exists in the elutions from a toxic waste dump (Source B) at a level of 3 units. Persons exposed only to the toxic dump product will show no toxic responses. When we have persons exposed to both sources, we could say (using additivity in response) that persons exposed to Source A should show no response; persons exposed to B show no response; hence persons exposed to A and B should obviously show no response. Zero + zero = 0. But Source A plus Source B gives 6 units of dose, and the threshold was 5. Hence, there should be a response. Adding doses leads to the conclusion that there should be a response after exposure to both A and B.
The major problem with trying to do something with adding doses is that we usually need to know the dose of the active material at the site of the action. Nominal dose (of different materials) is very hard to convert to "real" dose. This problem is exacerbated by the fact that standards usually need to be set in terms of nominal doses.

Whether combinations of materials act additively (in any sense) or more than additively (combining results of response to Source A and Source B, if we didn't know these came from the same material, would lead us to say -- using additivity in response, as is usually done -- that A and B were synergistic) is usually rather hard to determine; and once we determine it in the mouse or the rat, we still cannot be certain that mouse synergism (or antagonism) will also be human synergism (or antagonism).

It may be helpful to look at some past attempts at uncovering possible synergism to see if these provide useful suggestions for possible experimental evaluation of the toxicity of combinations of materials that find, or might find, their way out of toxic waste dumps. Figure 28 shows schematically one of the approaches used by Abraham Goldin and Nathan Mantel (Goldin et al., 1958, 1974) in their (successful) attempts to find combinations of chemotherapeutic drugs to be used against human leukemia. Mantel speaks of this as attempting to find "therapeutic synergism." The model system they used was the L-1210 mouse leukemia -- a transplanted leukemia that kills rather quickly. The figure is an attempt to show in three dimensions the separate effects of two materials in increasing the lifespan of L-1210 bearing mice.

At zero dose the mice have the survival of the control animals. At increasing dose (assuming the material is effective) there will be increasing survival until the toxicity of the drug begins to intervene and survival
[Figure 28 appears here: (A) a three-dimensional surface with response (number of "excess" survivors over controls) plotted over increasing doses of materials A and B; (B) a vertical plane running from 100% B to 100% A.]

FIGURE 28

Conceptual dose-response relationships for two chemotherapeutic drugs. (A) Three-dimensional representation of separate effects of materials A and B. (B) Vertical plane through doses of A and B that elicited maximum response. Possible synergistic, antagonistic and simple additive dose-response curves for combined chemotherapy are shown.

Source: Adapted from Goldin et al., 1958, 1974
is then reduced from the peak. The Goldin-Mantel procedure was to take the optimal dose of one material (say A) and the optimal dose of another material (say B), and create an array of pseudomaterials (AB) consisting of different proportions of A and B. The trace of these materials on the base plane (in the figure) can be thought of as rays in the horizontal plane starting at the (0,0) point. Each ray might be considered as one combination of A and B (e.g., say 30% A and 70% B). The farther one moves on the ray from the (0,0) point, the higher the dose.

The bottom part of Figure 28 shows a plane taken out of the three-dimensional representation of the top part of the figure where the best responses were shown for the individual materials. At the left end of this plane we have 100% B and, on the Y axis, the response (survival) at this dose. At the right end we have 100% A and, on the Y axis, the survival at this dose of A. The next stage of experimentation would be to set up an array of different pseudomaterials, each material corresponding to different proportions of A and B, and conduct survival (dose-response) experiments at several dose levels. The responses will create a "mountain" rising out of the base plane. The bottom of Figure 28 shows a slice taken through such a mountain, with three possible contours. The top contour shows a combination of A and B that will yield a higher response than the best of A and B alone. The optimum, as the figure is drawn, looks as if it comes at about 60% B and 40% A.

The lowest line in the figure shows that the combination of A and B is deleterious, and that the best response would come at 100% B and 0% A. (These two materials are antagonistic!) The middle line shows that A and B just dilute each other, and the best response would also come at 100% B.
If a large experiment had been conducted at several different dose
levels of the combination of A and B, 1t 1s possible (likely?) that the peak
of the mountain would have come somewhere other than along the sample plane
drawn here.
All of this looks very time consuming and complicated, and 1t was.
There are some mountain-peak search strategies that have been developed
("maximum-seeking" strategies) that could shorten the process, but all of
them required lots of work and rather prompt response. In the chemotherapy
problem, very few materials needed to be looked at, and the experiments
usually took no more than 30-45 days from beginning to burning (of the dead
mice 1n the Incinerator), so that this search process was possible and did
prove effective.
Figure 29 shows how a Goldin-Mantel scheme might look if one were looking for
joint toxicities. The bottom part of the figure shows the plane that includes
the ED50s or LD50s of the two materials. If some dosage combination of A and
B produces more toxicity (at equivalent total dose), then the toxicities
would be more than additive, as in the Goldin-Mantel model. There are many
added complications, however. How do we combine different toxicities? If
there are several effects, what do we do with them? If it takes a long time
to get an answer (i.e., chronic illness; lifespan measures; long,
latent-period illnesses like cancer), is it even possible to go through a
Goldin-Mantel process? And, finally, if there are a lot of materials to
consider jointly, is it possible to do even a pairwise Goldin-Mantel
procedure?

Let us determine the magnitude of the number of combinations of materials
that might come about when there are several materials that might leak out
of a toxic waste dump, either one at a time or in any combination with each
other.
[Figure: response surface with axes for increasing dose of A, % B, % A, and
% response (0-100).]
FIGURE 29
Conceptual dose-response relationships for two toxic substances. (A)
Three-dimensional representation of separate effects of materials A and B.
(B) Vertical plane through doses of A and B that elicited 50% response.
Possible synergistic, antagonistic and simple additive dose-response
relationships are shown.
Source: Adapted from Goldin et al., 1958, 1974
Rather than consider different proportions of each of the materials, we will
calculate the combinations possible for just the presence or absence of each
material. Thus, two materials can be present in an effluent in three ways
(e.g., A alone, B alone, or A and B together); three materials in seven ways
(e.g., A or B or C, AB, AC, BC or ABC). The general form is 2^n - 1. Ten
materials could be present in 2^10 - 1 = 1023 combinations; and this large
number does not consider the materials present in different proportions,
only whether they are there or not there.
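The counting argument can be sketched directly (the material names below are
placeholders):

    # Count and enumerate presence/absence combinations of n materials. With n
    # materials there are 2**n - 1 nonempty subsets, so 10 materials give 1023
    # possible combinations before differing proportions are even considered.
    from itertools import combinations

    materials = ["A", "B", "C"]                  # placeholder names
    subsets = [combo
               for size in range(1, len(materials) + 1)
               for combo in combinations(materials, size)]

    print(len(subsets), "combinations:", subsets)  # 7 combinations for 3 materials
    print("10 materials ->", 2**10 - 1, "combinations")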
Thus, it seems clear that it is most unlikely that direct measures of the
joint toxicities of the multitude of materials found in toxic waste dumps can
or will be made. In addition, it seems reasonable to consider further
possible interactions of toxic waste dump materials with other materials in
common use or to which people are commonly exposed, such as ethanol or
cigarette smoke.

Clearly, approaches other than detailed, definitive testing need to be
developed. It has been suggested, for example, that as a first approximation,
1,1,1-trichloroethylene, the most commonly found material in toxic waste
dumps, be tested in combination with other commonly prevalent materials, or
that commonly prevalent toxic waste dump materials be tested in combination
with ethanol. None of these suggestions, of course, is completely
satisfactory.
Not much combination testing for carcinogenesis (which seems to be the major
basis for standard setting for single materials) has been done in the past.
The testing in the large NCI-Stanford Research Institute combination
experiment (24 pairwise contrasts) has been completed for some time, but
analysis is not complete. So far as I know, no substantive publications have
come from this study, but some are in process. Several years ago,
when I had an opportunity to look at some preliminary data, I could make no
generalizations. I found some combinations that appeared additive, or perhaps
more than additive, if the cancers they produced had the same target tissues.
When the carcinogens affected different target tissues, their actions often
appeared to be less than additive, largely because the animals often died
early from the first cancer and did not have an opportunity to develop tumors
at the other, later-appearing site. The analysis of these data is likely to
be very difficult, requiring consideration of competing causes of death,
growth and weight gain, and allowance for time to tumor appearance.
DISCUSSION
DR. PATRICK DURKIN
As Drs. Clarkson and Cornish recommended, a major effort must be made to
identify and explicate examples of environmentally significant multiple
toxicant exposures and their related effects.

Another major area to explore, as I indicated in my presentation, is the
potential low-dose significance of the interactions. I suggested, based on my
understanding of the biological modes by which chemicals interact, that in
the low-dose region the interactions may be quantitatively less significant
or may not occur. Dr. Crump reinforced this view in his statistical analysis.
I think Dr. Crump's approach should be explored in greater depth. In
addition, I am examining multiple-order models along the same lines as an
extension of the earlier work by Hewlett (Hewlett, 1969; Hewlett and
Plackett, 1950, 1959; Plackett and Hewlett, 1948, 1952). Also along these
lines, I am examining examples of subchronic studies that involved multiple
toxicant exposures at relatively low doses.

An additional point that was not extensively discussed at the meeting
involves the quantitative significance of multiple toxicant exposures. Over
the next month or so, I will be addressing this issue for Dr. Stara's office.
Although most of the data will probably come from acute studies, I will make
every effort to examine the quantitative significance of interactions in the
low-dose region.
MR. WILLIAM GULLEDGE
The definition of the term "synergism-potentiation" is unclear as it was
presented at the workshop. If synergism is defined as something less than
additive effects, this definition is favorable. Very little evidence exists
to indicate that chemicals act by additive effects. Dose-addition experiments
have shown that chemicals act independently of one another in terms of
observable effect. Any argument for toxic interaction stands on weak ground
unless compounds are naturally reactive in some manner.
DR. MAGNUS PISCATOR
The whole section on interactions, though well written and theoretically
sound, applies to large doses of toxic agents and may have little relevance
to what happens at low-level exposure.
DR. ROLF HARTUNG
The assumption has often been made that the toxicity of chemicals is likely
to be additive, and it is hoped that potentiating responses will be offset on
the average by antagonistic responses. Whether any of these interactions
actually occur at low doses under chronic conditions has not been
satisfactorily established. The frequently cited examples of the interactions
between asbestos and cigarette smoke are probably not typical, since they
probably involve initiation-promotion processes and also tend to involve
relatively high doses. It is important that interaction studies, such as the
16-compound NCI/SRI study, be published for evaluation. It is also important
to study the effects of ongoing multiple exposure (e.g., diet), and to set up
experiments on multicomponent mixtures in order to test interactions in
general principle.
DR. RICHARD KOCIBA
These discussions underscored the premise that one cannot categorically
assume that all chemical interactions should be treated as synergistic or
additive phenomena. Numerous examples were cited wherein direct
chemical-chemical interactions usually led to a decrease in toxicologic
activity rather than a synergistic or additive effect.
It appears that there is little scientific justification for categorically
assuming that multichemical exposure warrants an assumption of synergism or
additivity unless indicated by the available data. A case-by-case approach
would appear to be the most appropriate course of action in dealing with
these multichemical exposures, because it would make the best use of all the
data available on the chemicals comprising the multichemical exposure.
DR. HERBERT CORNISH
At the moment there is little basis for assuming anything other than an
additive effect of chemicals acting at any one site.
DR. THOMAS CLARKSON
The two models described by Dr. Durkin refer only to interaction with
receptors; no models are available to deal with pharmacokinetic interactions.
If one chemical produces serious tissue damage, effects on the
pharmacokinetics and/or toxicity of a second chemical might be expected.
However, the addition of responses is reasonable for low exposure levels,
where chemicals would be expected to act independently and not to interfere
with either their respective pharmacokinetics or the reactions with their
respective receptors.

Dose addition is also reasonable for chemicals acting on the same receptor,
since saturation of the receptor is unlikely at low exposure levels.
DR. ROBERT NEAL
In calculating the allowable exposure of man to compounds that produce cancer
in experimental animals, mathematical extrapolation is the best technique
currently available. However, we should not delude ourselves that we have a
biological basis for extrapolating from the observable range in experimental
animals to the low levels to which man is normally exposed. At this time
there is clearly no justification for sophisticated manipulations of
mathematical models to estimate what the incidence rate may be in man, based
on incidence rates at high doses in experimental animals. At best, the
estimates generated using the mathematical models are a guess.

Since there is no biologic basis for choosing among the various mathematical
models used in extrapolating from high-dose animal experiments to low-dose
human exposure, there is some merit in standardizing the mathematical model
used by regulatory agencies in estimating cancer risk. However, in applying a
standardized model, consideration should be given to different levels of
allowable risk depending upon the applicability to man of the cancer data
("weight of evidence") generated in experimental animals. For example, a
compound that causes an increased tumor incidence in only one sex of one
species, and not in the other sex of that species or in other species
examined, should perhaps be given less weight than a compound that produces
tumors at multiple sites and in multiple species.
An approach to considering the weight of evidence in risk assessment for
carcinogenicity has recently been proposed by Dr. Robert Squire in an article
in Science (Squire, 1981). However, the method proposed by Dr. Squire does
not provide a numerical estimate of risk that can be used by the regulator in
setting an allowable level of exposure. A modification of the Squire proposal
might be to accept a higher level of risk, determined by mathematical
extrapolation, for those compounds for which the weight of the evidence
raises some question as to the applicability of the data generated in
experimental animals to man. For example, an allowable risk of 1 in 10,000
might be accepted for a compound that increases the incidence of liver tumors
only in male mice, but not in female mice or in rats, whereas an allowable
risk of 1 in 1,000,000 might be required for a compound that causes tumors in
multiple organs of multiple species. Option 3 proposed by Dr. Crump is also a
modification of that same concept.
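A minimal numerical sketch of such a tiered scheme (the risk tiers, slope
factor, and linear low-dose assumption are illustrative placeholders, not
values proposed at the workshop):

    # Back-calculate an allowable dose from a tiered allowable risk, assuming a
    # linear low-dose model: risk ~ slope_factor * dose. All values hypothetical.
    ALLOWABLE_RISK = {
        "limited evidence (one sex, one species)":  1e-4,
        "strong evidence (multiple sites/species)": 1e-6,
    }
    slope_factor = 0.05   # hypothetical risk per (mg/kg/day)

    for evidence, risk in ALLOWABLE_RISK.items():
        allowable_dose = risk / slope_factor
        print(f"{evidence}: allowable dose ~ {allowable_dose:.1e} mg/kg/day")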
DR. WILLIAM NICHOLSON
For combinations of exposures at low doses, each carrying a low risk (10^-4,
etc.), additivity of effects would appear to be justified. This would apply
both for carcinogenic agents and for those demonstrating only systemic
effects. However, additivity may not apply when exposure to an agent at a
dump site or in water combines with a personal exposure to an agent that can
be significant. Such personal exposures include 1) cigarette smoking,
2) alcohol consumption, 3) medicinal or other drug use, and 4) special unique
exposure circumstances. Here the effects may be directly multiplicative,
especially for lung carcinogens. Thus, it would be important to establish
some of the combined effects for the dozen or fewer chemicals of concern in
the environment and the agents to which some humans could have extremely high
exposures. Where a review of the literature indicates that such data are
lacking, appropriate research should be undertaken.
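For small risks the additive and multiplicative cases diverge sharply for the
exposed subgroup; the sketch below uses arbitrary placeholder numbers (not
workshop estimates) to show the difference when an environmental excess risk
is combined with a personal exposure such as smoking:

    # Compare additive vs. multiplicative joint action for a hypothetical case.
    background_risk = 5e-3    # assumed baseline lifetime risk of the disease
    env_excess_risk = 1e-4    # assumed excess risk from the environmental agent
    personal_rr = 10.0        # assumed relative risk from the personal exposure

    # Additive model: the agent's excess simply adds to the smoker's baseline.
    additive_total = background_risk * personal_rr + env_excess_risk

    # Multiplicative model: the personal exposure scales the agent's excess too.
    multiplicative_total = (background_risk + env_excess_risk) * personal_rr

    print(f"smoker, additive model:       {additive_total:.4f}")
    print(f"smoker, multiplicative model: {multiplicative_total:.4f}")
    print(f"extra risk from interaction:  {multiplicative_total - additive_total:.1e}")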
DR. MYRON MEHLMAN
Insufficient data are available to develop any meaningful model without
further consideration of biologic processes.
MR. WILLIAM GULLEDGE
Due to the uncertainty in predicting low-dose risks and the various options
associated with the multistage model, a recommendation for uniform use in
establishing water quality criteria cannot be given at this time. It would
seem that most models are about equally predictive down to a risk level of
10^-2, and a combination of a mathematical model and safety factors would
lessen the necessity for choosing one model over another.
Hazard assessment indices tend to magnify problems of uncertainty. Most
carcinogenesis indices take into account only positive test data. Data
indicating that a material is probably not carcinogenic also need to be
considered. Frequently, technical studies are contradictory as to the
carcinogenic potential of a substance. Negative data should be considered as
well, along with a differentiation between poor-quality studies and very
definitive studies.
DR. IAN NISBET
I was disappointed that the second day of the workshop was devoted almost
entirely to two-chemical interactions, since we live in an N-chemical world.
I liked Dr. Crump's derivation of the conclusion that interactions are
unimportant to first order, although I think it only applies when the effects
of all N chemicals are small. My quick extension of his result to second
order was incorrect, I am afraid, but I will try to develop a correct version.
GENERAL COMMENTS
The interactions of the different compounds in the dump sites may not be as
significant as the interactions of these compounds with smoking or alcohol
use.
Dose additivity for first-order interactions should hold for second order as
well.
Biologic aspects should not be oversimplified with hypothetical examples.
A dose-response relationship cannot be determined for the mixture, due to
problems with extrapolation and because the composition of the mixture would
vary from place to place and from time to time.
Interactions are species-specific.
The RCRA weighting scheme should be considered.
If we have a sufficiently good data set over a wide range of chemicals, we
may be able to predict risk from multichemical exposure.
REFERENCES
Alstott, R.L., M.E. Tarrant and R.B. Forney. 1973. The acute toxicities of
1-methylxanthine, ethanol, and 1-methylxanthine/ethanol combinations in the
mouse. Toxicol. Appl. Pharmacol. 24: 393-404.

Bliss, C.I. 1939. The toxicity of poisons applied jointly. Ann. Appl.
Biol. 26: 585-615.

De Rosa, C. 1981. U.S. EPA, Cincinnati. Memorandum to Jerry Stara, U.S.
EPA, Cincinnati, July 23.

Durkin, P.R. 1981. An approach to the analysis of toxicant interactions in
the aquatic environment. In: Proc. Fourth Ann. Symp. on Aquatic Toxicology.
Am. Soc. Test. Mater.

Esvelt, L.A., W.J. Kaufman and R.E. Selleck. 1971. Toxicity Removal for
Municipal Wastewaters, Vol. IV of A Study of Toxicity and Biostimulation in
San Francisco Bay-Delta Waters. SERL Report No. 71-7, Univ. of California,
Berkeley, CA. p. 224.

Finney, D.J. 1971. Probit Analysis, 3rd ed. Cambridge Univ. Press,
Cambridge, Great Britain. 333 p.

Goldin, A., S.R. Humphries, J.M. Venditti and N. Mantel. 1958. Factors
influencing antitumor synergism: Relation to screening methodology. Ann. NY
Acad. Sci. 76: 932-938.
Goldin, A., J.M. Venditti and N. Mantel. 1974. Combination chemotherapy:
Basic considerations. In: Handbook of Experimental Pharmacology, A.C.
Sartorelli and D.G. Johns, Ed. New Series XXXVIII/1. Springer-Verlag,
Berlin. p. 411-448.

Goldstein, A., L. Aronow and S.M. Kalman. 1974. Principles of Drug Action:
The Basis of Pharmacology, 2nd ed. John Wiley and Sons, Inc., NY. 854 p.

Hewlett, P.S. 1969. Measurement of the potencies of drug mixtures.
Biometrics. 25: 477-487.

Hewlett, P.S. and R.L. Plackett. 1950. Statistical aspects of the
independent joint action of poisons, particularly insecticides. II.
Examination of data for agreement with the hypothesis. Ann. Appl. Biol.
37: 527-552.

Hewlett, P.S. and R.L. Plackett. 1959. A unified theory for quantal
response to mixtures of drugs: Non-interactive action. Biometrics. 15(4):
591-610.

Keplinger, M.L. and W.B. Deichmann. 1967. Acute toxicity of combinations
of pesticides. Toxicol. Appl. Pharmacol. 10(3): 586-595.

Klaassen, C.D. and J. Doull. 1980. Evaluation of safety: Toxicologic
evaluation. In: Toxicology: The Basic Science of Poisons, J. Doull, C.D.
Klaassen and M.O. Amdur, Ed. Macmillan Publishing Co., Inc., NY. p. 11-27.
Levine, R.E. 1973. Pharmacology: Drug Actions and Reactions. Little,
Brown and Company, Boston, MA. 412 p.

Marking, L.L. and V.K. Dawson. 1975. Method for Assessment of Toxicity or
Efficacy of Mixtures of Chemicals. USDI, Fish Wildl. Serv., Bur. Sport
Fish. Wildl., Washington, DC. Investigations in Fish Control, No. 67.
p. 1-8.

Muska, C.F. and L.J. Weber. 1977. An approach for studying the effects of
mixtures of environmental toxicants on whole organism performance. In:
Recent Advances in Fish Toxicology, R.A. Taub, Ed. EPA 600/3-77-085.
p. 71-87.

NAS/NRC (National Academy of Sciences/National Research Council). 1980.
Principles of Toxicological Interactions Associated with Multiple Chemical
Exposures. Prepared by the Panel on Evaluation of Hazards Associated with
Maritime Personnel Exposed to Multiple Cargo Vapors. National Academy
Press, Washington, DC.

NRC (National Research Council). 1980. Principles of Toxicological
Interactions Associated with Multiple Chemical Exposures. Natl. Academy
Press, Washington, DC.

Ohsawa, T., J.R. Knox, S. Khalifa and J.E. Casida. 1975. Metabolic
dechlorination of toxaphene in rats. J. Agric. Food Chem. 23: 98-106.

Plackett, R.L. and P.S. Hewlett. 1948. Statistical aspects of the
independent joint action of poisons. Ann. Appl. Biol. 35: 347-358.
Plackett, R.L. and P.S. Hewlett. 1952. Quantal responses to mixtures of
poisons. J. Roy. Stat. Soc. B14(2): 141-163.

Smyth, H.F., C.S. Weil, J.S. West and C.P. Carpenter. 1969. An exploration
of joint toxic action. I. Twenty-seven industrial chemicals intubated in
rats in all possible pairs. Toxicol. Appl. Pharmacol. 14: 340-347.

Smyth, H.F., C.S. Weil, J.S. West and C.P. Carpenter. 1970. An exploration
of joint toxic action. II. Equitoxic versus equivolume mixtures. Toxicol.
Appl. Pharmacol. 17: 498-503.

Squire, R.A. 1981. Ranking animal carcinogens: A proposed regulatory
approach. Science. 214: 877-880.

Sun, Y-P. and E.R. Johnson. 1960. Analysis of joint action of insecticides
against houseflies. J. Econ. Entomol. 53: 887-892.

U.S. EPA. 1981. Guidelines and Methodology for Quantitative Risk/Hazard
Assessment of TSPC Solvents. Environmental Criteria and Assessment Office,
Cincinnati, OH. (Unpublished report)

Veldstra, H. 1956. Synergism and potentiation with special reference to
the combination of structural analogues. Pharmacol. Rev. 8: 339-387.

Weisburger, J.H. and G.M. Williams. 1980. Chemical carcinogens. In:
Toxicology: The Basic Science of Poisons, J. Doull, C.D. Klaassen and M.O.
Amdur, Ed. Macmillan Publishing Co., Inc., NY. p. 84-138.
Withey, J.R. 1981. Toxicodynamics and biotransformation. In: International
Workshop on the Assessment of Multichemical Contamination. Milan, Italy.
(Draft)

Wolfenbarger, D.A. 1973. Synergism of toxaphene-DDT mixtures applied
topically to the bollworm and the tobacco budworm. J. Econ. Entomol. 66(2):
523-524.
SUMMATION OF MEETING
AND
CONCLUDING COMMENTS
SUMMATION OF MEETING
DR. ROLF HARTUNG
The purpose of this meeting was to examine approaches to risk assessment for
multiple chemical exposures. The first day was spent largely looking at the
effects of single chemicals on various species. Initially, we discussed the
present methodology for derivation of an ADI and reviewed the NOEL and other
approaches. We looked at derivations of uncertainty factors, saw what some of
the derivations were and how they were utilized, taking into account
bioconcentration factors. Then we heard some of the newer methodologies
described that included, for instance, adjustments based on body surface area
or various uncertainty factors. What was not settled, even though it was
pointed out, was to what extent a body surface area adjustment would take the
place of some of the uncertainty factors that now substitute for it, or that
have previously substituted for it. It was also pointed out that there is
probably no scientifically verifiable means of setting an ADI. As a matter of
fact, it is only when we fail in our risk assessments and our attempts to
protect that we can see what may have gone wrong and analyze the problem.
There were also important discussions on approaches to differentiations
between safety factors; the numbers that should be used, and when and what
uncertainty factors should be added to these numbers; and the consideration
of uncertainty in the data versus uncertainty in the extrapolation process.

Throughout the early part of the meeting, a number of important points were
brought up involving the increased utilization of pharmacokinetics. However,
the exact means of doing this needs further clarification. Most chemicals
found at dump sites have significantly poorer data bases than do the
pesticides and food additives for which a large segment of our present
NOAEL-based risk assessments were originally developed. So, the question of
missing data for some of the industrial chemicals may need to be addressed.
It also became quite clear that we need to verify or increase the data base
for quantitative structure-activity relationships. One observation that
appeared to be somewhat comforting was that most of the dumps, at least at
first sight, contained only several key chemicals, which tended to repeat
from dump site to dump site. However, it was pointed out that some of this
information may be an artifact of the chemical priority testing schemes used
for ease of analysis, and that there may be large groups of chemicals that
are missed. A good candidate group for these possibly missed chemicals (which
I would like to add) is that of the aromatic amines, which are very
recalcitrant as far as analysis is concerned.
One area for which we had relatively little resolution was species
differences. The extent to which a mouse is a man, or a man is a mouse, has
not really been resolved except to point out specific differences, some of
which are almost anecdotal. It is clear that there are differences in size
and metabolic rate, and occasionally differences in pharmacokinetics,
response, cell turnover, etc. It was pointed out that we probably could use
this type of data, especially for some of the better known solvents, more
effectively than we have in the past. In trying to make species-to-species
conversions, Dr. O'Flaherty pointed out a number of interesting relationships
that have been used in aquatic toxicology based on log-log transforms, but
she also pointed out that there needs to be a better mechanistic
understanding of some of the phenomena that were observed by Sideranko. It
was pointed out that there are instances in which the large animal is less
sensitive than the small animal, although the rule tends to be that, at least
on a mg/kg or mg/m3 basis, the small animal tends to be less sensitive than
the large animal. For instance, the assumption that a mouse should be less
sensitive to chloroform than a human did not hold up. In this particular
case, the mouse is much more sensitive, but this seems to be at least in part
an exception to the rule. There may be enough exceptions that such a rule
cannot be readily applied.
The next areas discussed involved route-to-route extrapolations. These
particular types of extrapolations are still difficult because they depend
greatly on pharmacokinetics and our understanding of them. A point was made
that a need exists for test systems that have no false negatives; however, a
test system having false negatives produces a consumer risk, while one having
false positives produces a producer risk. The problem may have to be
investigated for specific tests, especially with respect to the number or the
degree of false negatives that different tests produce in relation to the
false positives they produce. Personally, I know of no test system that has
only false positives or only false negatives.
There was significant discussion on incidence and severity of effect. The
present NOEL approach works with relatively limited data and ignores the
slope of dose-response curves. It was mentioned that the NOEL approach looks
at only one small portion of the data, but this statement is probably
incorrect because, in reality, when a NOEL is derived one looks at the entire
data set, considers the extent to which it represents a coherent whole, and
then selects the threshold level. One does not arbitrarily just pick the
lowest number that exists. It is correct, however, that the NOEL ends up
being a point based largely on scientific judgment. It is a single point, and
one cannot extrapolate downwards from it; therefore it cannot be used for
assessing lower risk levels. Its applicability has been blessed by tradition.
It may have served us well, but there have not been many studies verifying
how well it has served in the past. It was pointed out that it may be useful
to have the option of using dose-response data and extrapolation models for
systemic toxic phenomena; but it was also pointed out that some of the most
sensitive types of effects, such as histopathology, as presently reported in
the open literature, may not readily lend themselves to that kind of
approach, while the data as they exist in the gray literature, consultant
reports, etc., indeed would allow such an analysis. It was indicated that it
might be possible to present incidence data in some of the reports, but this
would require changes in the habits of pathologists and editors. It was also
pointed out that a great deal of literature exists for drugs, which looks at
different physiologic phenomena in relation to pharmacokinetics and at
metabolic processes, and which has tied them into an operating approach to
pharmaceutical research. These approaches have not been fully utilized for
the analysis of environmental toxicants and probably could be utilized to a
greater extent.
In our discussion of the extrapolation of responses to systemic toxicants or
carcinogens, it was pointed out that, outside the experimental range, one is
increasingly dependent on the assumptions of the model that has been
selected, and that model selection, especially for systemic toxicology,
represents a judgmental process. It is necessary to try to understand whether
the best approach might be that of using a linear model, starting from the
lowest point, that would define the upper bounds for all concave curves, and
thereby introduce a system of conservatism. In my opinion, the question of
how far systemic effects should be modeled, handled with safety factors, or
handled with extrapolations to various levels like 10^-2, 10^-3, etc., was
left in the air, and such extrapolation was probably proposed at this time
only as a comparative type of tool.
Route-to-route extrapolation was discussed. The problems with route-to-route
extrapolation, especially as related to the Stokinger-Woodward equation, were
reviewed. In addition, the importance of the time course of the dose, as well
as the resulting blood levels, was indicated, especially when ingested doses
were to be compared with inhaled doses. The effect of the half-life of a
chemical was considered particularly important, and appears, in some cases,
to alter some of the influences of the delivery rate of individual doses. It
was also pointed out that even though one may be able to calculate an
equivalent absorbed dose by various routes, one would still need to take into
account site-specific effects at the portal of entry, which may indeed
greatly influence the overall toxicity.

The question of simultaneous multiple-route exposures is a difficult one. The
way this has been treated in the past is to observe the proportional
absorption from each particular route in order to calculate total body dose.
A number of semipolitical problems were pointed out, such as that of trying
to add partial doses related to an ADI to partial doses related to a TLV,
which was judged not to be appropriate. Problems were also discussed with the
present approach of subtracting a dietary intake and an inhalation dose from
the ADI so as to apportion the controllable levels due to water intake alone.
Although it is true that the doses should be apportioned, this may require a
more complex or more integrated scheme of cooperation among different units
within the Agency.
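A minimal sketch of the apportionment arithmetic described above (the ADI,
intake estimates, body weight, and water consumption are placeholders, not
figures from the workshop):

    # Subtract dietary and inhalation doses from the ADI and convert the
    # remainder into an allowable drinking-water concentration. All values
    # are hypothetical placeholders.
    adi = 0.01             # acceptable daily intake, mg/kg/day
    dietary_dose = 0.002   # estimated dose from food, mg/kg/day
    inhaled_dose = 0.001   # estimated dose from air, mg/kg/day
    body_weight = 70.0     # assumed adult body weight, kg
    water_intake = 2.0     # assumed daily water consumption, L/day

    remaining = max(0.0, adi - dietary_dose - inhaled_dose)    # mg/kg/day left for water
    water_criterion = remaining * body_weight / water_intake   # mg/L

    print(f"allowable water concentration ~ {water_criterion:.3f} mg/L")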
We then shifted to a discussion of risk assessments for carcinogens. Dr.
McGaughy pointed out a two-stage type of evaluation: 1) a qualitative
evaluation of carcinogenicity, estimating the likelihood that a material
might be a human carcinogen, and 2) a quantitative evaluation. In his
presentation and subsequent discussions, we were told that the Agency is
contemplating, or at least evaluating, the possibility of adopting the IARC
criteria and possibly treating genotoxic and nongenotoxic materials somewhat
differently. This was not entirely clear, however. The Agency currently uses
a multistage model, the lower range of which amounts to a linear
extrapolation on the upper 95% confidence interval. Its use has a number of
justifications, e.g., when an agent is clearly a genotoxic carcinogen or as a
default case (when its carcinogenicity is not known). There was some
discussion regarding the possibility of differentiating genotoxic materials
from those that do not have a direct interaction with DNA. The latter might
be treated differently, as epigenetic carcinogens for which the mechanism of
carcinogenicity is less well understood but which might involve a NOEL or a
different type of extrapolation than that of the regular Global 79 model.
Some problems were pointed out in CAG's present averaging of dose by time
weighting. The CAG indicated that they are looking for possibly better
methods of accomplishing dose averaging, since the present method, in some
ways, appears to violate some of the basic concepts of the multistage model.
CAG also indicated that human data are fitted linearly on a case-by-case
basis.
When it came to the discussion of chemical mixtures for which carcinogenicity
had been studied, we essentially had only one example of a complex mixture,
i.e., diesel exhaust. For this, Dr. Albert indicated the possibility of using
a comparative-potency type of approach, since it was not possible to
administer a sufficiently high dose of diesel exhaust to produce cancer
without having other toxic adverse effects due to noncarcinogenic exhaust
constituents. An exposure assessment could not be clearly performed.
Apparently it was felt that an exposure assessment was necessary because of
the finding that the extracts of diesel exhaust were positive in skin
painting and in a number of mutagenicity assays. A fairly lively discussion
then ensued regarding interactions of carcinogens and noncarcinogens, and
questions involving direct versus indirect pathways in carcinogenicity.
Most of the second day of the meeting was devoted to a discussion of mixtures
per se. Initially, the discussion involved the rather simple approach of the
summation of fractional doses and effects as currently used by the ACGIH.
This is somewhat similar to Dr. Crump's third proposed approach; it is not
entirely clear how Dr. Crump's numerical treatment differs significantly from
the ACGIH approach, and it appears to be essentially the same. The ACGIH
approach is used as an indication that a TLV, accumulated from a number of
different toxicants that are related, has been exceeded. A similar approach
has apparently been used by the Agency in cases of mixtures to compare
systemic toxicity from exposure to a number of unrelated chemicals.
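The fractional summation can be sketched as follows (the exposure
concentrations and limits are placeholder values; the mixture limit is
treated as exceeded when the sum of fractions exceeds 1):

    # Hazard-index style summation of fractional exposures, in the spirit of
    # the ACGIH mixture rule: sum(C_i / limit_i) > 1 flags an exceeded limit.
    # Concentrations and limits (ppm) are hypothetical placeholders.
    exposures = {"solvent_1": (25.0, 100.0),   # (measured concentration, limit)
                 "solvent_2": (40.0, 200.0),
                 "solvent_3": (5.0, 10.0)}

    hazard_index = sum(conc / limit for conc, limit in exposures.values())

    print(f"sum of fractions = {hazard_index:.2f}")
    print("mixture limit exceeded" if hazard_index > 1.0 else "within mixture limit")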
In an extensive discussion of exposure assessment, the subject of target
populations was extended to include subpopulations at greater risk. The
degree of susceptibility was discussed in relation to developmental changes,
genetic differences, nutritional deficits, existing disease, behavioral
effects, and possibly concomitant or previous exposures. It was questioned
whether a 10-fold safety factor, as is currently used in conventional
methodology, would be adequate to protect these populations. It was pointed
out that if one considers such groups as pregnant women, children from 1-4
years of age, or people with lung, heart or liver disease as hypersensitive
groups, rather than as just a portion of the general dose-response curve,
then these groups could separately represent a sizable proportion of the
population. These groups might need to be treated separately and considered
as responding through separate mechanisms. An initial scheme was suggested
for how the presence of hypersensitive individuals in the population might be
incorporated into site-specific risk assessments. It was also pointed out
that a few of the hypersensitivities might contribute a sizable variation,
extending close to two orders of magnitude; however, this appeared to be
relatively small compared with other variables. One of the difficulties in
utilizing, for instance, the TLVs has been that they are based on young,
healthy white males, and it may be that quite a few of the people who live
around a particular release site of many chemicals may exhibit significantly
greater sensitivity than the base population from which TLVs were originally
derived. Dr. Piscator indicated an interesting way in which statistics, just
by the right combination of circumstances at the extremes of various
distributions, could give us apparently hypersensitive populations. Without
having to invoke the presence of genetic predisposition, it would be possible
to have a small fraction of the population affected and definitely further
out than a 10-fold factor would include. Some of the major questions were
whether some of the hypersensitivities were differences in kind rather than
in degree, and whether some hypersensitive individuals might respond entirely
differently than the population contributing to the main dose-response curve.
The last subject discussed was that of biological bases for toxicant
interactions and mathematical models. Dr. Durkin discussed some biological
bases on which chemicals might interact, as well as simple chemical
interactions that might result in a new chemical, which then produces an
effect. Following this, he discussed a series of mathematical approaches to
try to help uncover the quantitative relationships that might develop in such
interactions. It was pointed out that experimental designs for such studies
are extremely complex, require very large numbers of animals, and are not
very likely to be carried out. A number of additional and unusual types of
interactions were discussed, as well as some generalities. The question of
absorption of divalent cations as influenced by calcium, phosphate, vitamin
K, and iron was discussed. The mathematical modeling that needs to be done on
multichemical interactions is quite complicated and was described as being,
at this point, in a default kind of condition. The modeling must work with
available data and should be consistent with what we currently use for the
single-chemical approach. On that basis, Dr. Crump has developed a series of
additive types of mathematical treatments for the development of ADIs, or
virtually safe doses. There was some question as to whether additivity really
occurs at low doses. This was answered to some degree for carcinogens, where
at least for several examples it does appear to occur. To what extent this
occurs with systemic toxicants is still unknown. A suggestion was made that,
in light of the great complexities involved, studies must be conducted to
better predict possible antagonism or potentiation in chemical interactions,
and that actual testing of various dump effluents should be performed. It was
pointed out that this would probably present significant technical
difficulties, and that for the time being one would probably have to rely on
modeling data, incorporating those phenomena that may be readily explained.
CONCLUDING COMMENTS
DR. THOMAS CLARKSON
Due to deficiencies in mathematical modeling, it would be useful to compile a
listing of chemicals known or expected to be in dump sites that might exhibit
different types of interactions: dose addition, response addition,
potentiating or synergistic action.

The statistical models now developed to predict the toxicity of individual
chemicals should be extended to predict the interactions of mixtures.

For acute effects, a mechanism should be established to maintain a data base
and a communication system to inform Poison Control Centers.
DR. ROBERT NEAL
In my opinion, we need to spend more time on understanding the biological
mechanisms of toxicity of chemicals and the validity of our animal models for
estimating risk in man. Entirely too much time is currently being spent on
trying to refine the mathematical models used to predict low-dose effects in
man. Dr. Skalsky pointed out that toxicologists should take more advantage of
drug testing data to validate our animal models. I strongly support this
proposal. This is the only substantial body of data for which we have
dose-response information in both experimental animals and man exposed to the
same compound. It is a valuable resource and should be explored more fully
for the purpose of validating our animal models for predicting chemical risk
in man.
DR. KENNETH CRUMP
It will be more difficult to develop generic methodology for systemic
toxicants than for carcinogens because of the much greater diversity of
experimental protocols and methods of reporting data in the case of systemic
toxicants. EPA should attempt to foster more uniformity in testing protocols
and data reporting.
DR. REVA RUBENSTEIN
There was an unspoken assumption, which I found troublesome: that it is
possible to unravel the effects of multiple exposures given the same
physiologic endpoints. There are so many single chemical exposures for which
there is not yet an established risk level that it is questionable whether we
should (or can) proceed to assessment of multiple chemicals. I suggest that
any future workshops focus on ways to quantitatively evaluate the basic
assumptions.

In general, the participants spent most of the discussion time restating
basic assumptions in toxicology. Many, if not all, of these assumptions are
in place because one cannot either measure phenomena more accurately or frame
the appropriate questions to sufficiently restructure the toxicology
paradigm. Nevertheless, it is important for the regulatory agencies to
understand the absolute limitations of the science, i.e., the areas where
uncertainty will always remain.
DR. JULIAN ANDELMAN
The prepared document and the discussion at the meeting did not sufficiently
distinguish between criteria being considered for protective versus
predictive purposes, as discussed at previous meetings. Thus, for example,
there was considerable discussion of the ADI, which is clearly not
appropriate for assessing risk.

Also, the concepts of safety factors and uncertainty factors should be
carefully distinguished. It is clear that a safety factor should not be used
in assessing risk.
DR. SHELDON MURPHY
I felt the conference was worthwhile and that it was an effective mechanism
for the exchange of ideas and positions on the conference topic. I was
somewhat disappointed, however, that so much time was dedicated to reviewing
principles of the toxicological approaches to risk assessment. It seems to me
that essentially all the topics discussed on the first day are standard to
all chemical risk assessments and have been discussed and debated often and
at length for individual chemical exposures. Although our current methods of
risk assessment are imperfect, it seems to me that, if we cannot make
decisions as to what needs to be done and how to apply scientific principles
to assessing the hazards of exposure to individual chemicals, we will never
be able to assess the health risks of multichemical exposures.
DR. IAN NISBET
Although the comments on the first day focused on the complexity of
toxicological phenomena, and many exceptions to simple generalizations were
pointed out, I do not think that ECAO's current procedures were called
seriously into question. There seems to be general support for the basic
procedures, provided that the possibility of exceptions is borne in mind.
DR. HERBERT CORNISH
It might have been useful if we had had some introductory information on
specific chemicals at dump sites. I believe it was pointed out that
trichloroethylene occurred in 40% of the sites. Similar data on other
commonly occurring chemicals might have helped to focus the discussion.

It is also evident that considerable variability will occur in assessing
hazards, depending on the nature and concentrations of the toxicants at the
site.
There is obviously a need for further studies on the effects of multiple
exposures in animals, to provide the data needed to assess human risk at dump
sites.
DR. MAGNUS PISCATOR
A large number of interesting estimates and formulas were presented.
However, there was an obvious lack of hard data.
MR. WILLIAM GULLEDGE
Development of water quality criteria and subsequent water quality standards
should not be based solely on quantitative risk assessment. Other factors
must be considered in developing compound-specific criteria on a national
level. Policy issues must be addressed and can include the feasibility of
enforcing national standards, the cost-effectiveness of complying with
proposed regulations, and the political climate for implementing a given
regulation. Other technical factors should also be included in the
decisionmaking process; these may include the effect of existing waste
treatment technology in achieving water quality criteria, and alternative
regulatory options.
DR. ROLF HARTUNG
There appear to be uneven requirements for the amount of evidence that must
be presented before various criteria are set, or before various phenomena
that are part of the evaluation of the evidence for setting criteria are
accepted. It is clear that criteria must be developed in the face of
uncertainty, i.e., before all the required evidence is in. Similarly,
decisions on species differences, epigenetic versus genotoxic causation, and
the use of negative data may have to proceed before all of the data are in,
but when the data present a coherent picture.
The relationships of findings in laboratory animals to likely effects in
humans continue to be poorly resolved issues for both carcinogens and
noncarcinogens. This problem area is especially important for quantitative
comparisons. The present approaches of using a safety factor of 10 or a
surface area adjustment can only be considered first-order approximations.
The use of pharmacokinetic data, metabolic information, and comparative
physiologic responses reported for drugs may form the basis for better,
though more complex, comparisons. A great deal of information on species
differences is available, but it is presently utilized only rarely.
DR. MARVIN LEGATOR
Although the meeting was convened to discuss multichemical exposure, few
definitive issues and almost no solutions to problems of multichemical
exposure were presented. This may be not so much a reflection on the sponsors
of the meeting or the participants as a reflection of our ignorance in this
area. It may be fruitful to have a specific meeting devoted to specific
research needs in this area.

It may well be that we should consider a new approach to risk assessment at
hazardous waste sites, i.e., a ranking method for toxic chemicals, rather
than overquantifying existing data.