Fax-On-Demand
Telephone: (202)401-0527
Item: 6021
6023 (Appendix)
Guidance for Submission of
Probabilistic Human Health Exposure
Assessments to the
Office of Pesticide Programs
Office of Pesticide Programs,
U.S. Environmental Protection Agency
[Draft, 11/4/98]

1.	PURPOSE
2.	AGENCY POLICY FOR PROBABILISTIC ASSESSMENTS
3.	GENERAL CONSIDERATIONS FOR PROBABILISTIC ASSESSMENTS
3.1	Data Distributions
3.2	Data Sources
3.3	Toxicity
3.4	Exposed Populations
4. ACUTE DIETARY ASSESSMENTS
4.1	Population of Concern
4.2	Reportable Percentile Exposure Values
4.3	Food Consumption
4.4	Magnitude of the Residue
4.4.1	Crop Field Trial Data
4.4.2	Monitoring Data
4.5	Adjustments For "Typical" Use Patterns
4.6	Exposure Assessment for Livestock Commodities
5. OCCUPATIONAL AND RESIDENTIAL EXPOSURE ASSESSMENTS
5.1	Mixer/loader/applicator (M/L/A) exposure
5.2	Residential exposure
5.3	Postapplication exposure
Attachment 1: Glossary
Attachment 2: Probabilistic Risk Assessment and Monte-Carlo Methods: A Brief Introduction
Attachment 3: Distribution Selection
Attachment 4: June 13, 1996 Final Office Policy on Performing Acute Dietary Exposure Assessment
1. PURPOSE
U.S. EPA ("the Agency") recently established a policy and a series of guiding principles for the
use of probabilistic risk assessment techniques. The Agency determined that probabilistic analysis
techniques, "given adequate supporting data and credible assumptions, can be viable statistical
tools for analyzing variability and uncertainty in risk assessments." (Memo1, Fred Hansen, Deputy
Administrator, May 15, 1997) The Agency also established a baseline set of conditions for
acceptance of a probabilistic analysis for review and evaluation; these conditions relate to the
good scientific practices of transparency, reproducibility, and the use of sound methods (Policy
for Use of Probabilistic Analysis in Risk Assessment1, U.S. EPA Office of Research and
Development, May 15, 1997). This Agency Policy Document noted that Monte Carlo analysis is
the probabilistic technique most frequently encountered, but other probabilistic techniques will not
necessarily be excluded from consideration.
The purpose of this Guidance document is to establish guidance for submission and review of
probabilistic human health exposure assessments in the Agency's Office of Pesticide Programs
(OPP). This document is intended to be used chiefly by persons conducting probabilistic human
health exposure assessments for purposes of registration or reregistration of pesticides. This
guidance will help assure that probabilistic assessments accurately portray exposures and risks to
the U.S. population and subpopulations of special concern such as infants and children. Such
assessments will play an increasingly important role in the evaluation of risks posed by pesticides
and will improve the Agency's ability to make regulatory decisions that fully protect public health
and sensitive subpopulations, including infants and children. A more general introduction to
OPP's risk assessment procedures and overview of probabilistic exposure techniques (including
Monte-Carlo) is provided in Attachment 2 to this document entitled Probabilistic Risk
Assessments and Monte-Carlo Methods: A Brief Introduction. The reader desiring a more general
overview of probabilistic risk assessment is referred to this attachment.
1 Reproduced as Appendix A in "Guiding Principles for Monte Carlo Analysis," March 1997, EPA/630/R-97/001.
OPP currently considers risk and exposure assessment submissions using probabilistic techniques
for acute dietary risk as well for occupational or residential exposures. This document is designed
to provide interim guidance for the preparation and review of probabilistic assessments. This
document is not intended to provide step-by-step instructions on conducting probabilistic
assessments; as OPP gains greater experience in reviewing such assessments, more detailed
guidance will be provided. In addition, the Agency encourages the continued development of
probabilistic risk assessment techniques. This Guidance is not intended to inhibit the advancement
of the science, but rather to provide interim guidance as the science of probabilistic risk
assessment develops.
This document is divided into five sections. Section 1, this section, describes the historical
background and purpose of this document. Section 2 provides an overview of Agency policy for
probabilistic assessments, reiterating the Agency's eight conditions for acceptance of probabilistic
analysis techniques. Section 3 covers general HED considerations for probabilistic assessments as
they apply to both acute dietary and occupational/residential exposure assessments. Section 4
provides more specific information regarding the conduct of acute probabilistic dietary
assessments while Section 5 covers this specific information for occupational and residential
exposure assessments. There are four attachments associated with this guidance document. They
include a glossary (Attachment 1), an introduction to risk assessment and probabilistic methods
(Attachment 2), a guide to distribution selection (Attachment 3), and, for historical reference and
perspective, a 1996 Office policy document on performing acute dietary exposure assessments.
2. AGENCY POLICY FOR PROBABILISTIC ASSESSMENTS
Agency policy is that risk assessment should be conducted in a tiered approach, proceeding from
simple to more complex analyses as the risk management situation requires (Agency Policy
Document, 5/15/97). More complex analyses require greater resources both to prepare and
review, and probabilistic assessments can represent high levels of complexity. In a deterministic
assessment, exposure is expressed as a single value, which could represent an upper-bound
scenario (for example, tolerance levels on foods), or a statistical tendency (for example, average
values from appropriate field trial data). If a deterministic analysis based on conservative
assumptions leads to risk estimates that are below levels of regulatory concern, then there is no
need to refine risk assessments with more complex techniques.
In contrast to deterministic techniques, probabilistic risk assessments more fully consider ranges
of values regarding potential exposure, and then weight possible values by their probability of
occurrence. Individual input values used to generate a point estimate are replaced by a
distribution reflecting a range of potential values; a computer simulation then repeatedly selects
individual values from each distribution to generate a range and frequency of potential exposures.
The output is a probability distribution of estimated exposures, from which exposures at any given
percentile can be determined. Consideration of probabilistic analyses will be limited to exposure
(and not toxicity) assessments at the current time.
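The simulation loop described in the preceding paragraph can be sketched in code. The sketch below is a minimal illustration only, not an Agency-endorsed implementation: the input values, the equal-probability sampling of discrete data sets, and the function and parameter names are all hypothetical.

```python
import random

def simulate_exposures(residues, consumptions, body_weight, n_iterations=10000, seed=1):
    """Monte Carlo sketch: repeatedly draw one residue value (mg/kg) and one
    consumption value (kg/day) and combine them into an exposure estimate
    (mg/kg body weight/day). Each discrete input data set is sampled with
    equal probability per data point."""
    rng = random.Random(seed)
    exposures = []
    for _ in range(n_iterations):
        residue = rng.choice(residues)          # mg residue per kg commodity
        consumption = rng.choice(consumptions)  # kg commodity consumed per day
        exposures.append(residue * consumption / body_weight)
    return sorted(exposures)

def percentile(sorted_values, p):
    """Exposure at a given percentile of the simulated output distribution."""
    idx = min(int(p / 100.0 * len(sorted_values)), len(sorted_values) - 1)
    return sorted_values[idx]
```

The output is the probability distribution of estimated exposures described above; any percentile of interest (for example, the 99.9th used in acute dietary assessment) can then be read from the sorted results.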
The Agency Policy Document (5/15/97) designated eight conditions for acceptance of
probabilistic analysis techniques. The Policy Document also identifies guiding principles specific
to Monte Carlo analyses, and these should be considered in exposure assessments, as applicable.
The eight general conditions are summarized below, amplified with some comments relevant to
consideration of probabilistic assessments by OPP:
1.	The purpose and scope of the assessment should be clearly articulated in a
"problem formulation" section that includes a full discussion of any highly exposed
or highly susceptible subpopulations evaluated (e.g., children, the elderly). The
questions the assessment attempts to answer are to be discussed and the
assessment endpoints are to be well defined.
2.	The methods used for the analysis, including models used, data upon which the
assessment is based, and assumptions that have a significant impact upon the
results, are to be documented and easily located in the report. This documentation
should include a discussion of the degree to which the data are representative of
the population under study. Also, documentation should include names of models
and software used to generate the analysis. Routes of exposure should be clearly
defined. Sufficient information is to be provided to allow the results of the analysis
to be independently reproduced.
3.	The assessment should include a sensitivity analysis, which is an evaluation of how
the overall exposure distribution changes as individual inputs, expressed as
distributions, are varied. The results of sensitivity analyses are to be presented and
discussed in the report, in order to better determine which inputs drive the
predicted risk. Probabilistic techniques should be applied to chemicals, pathways,
and factors of importance to the assessment as determined by sensitivity analyses
or other basic requirements for the assessment.
4.	The presence or absence of moderate to strong correlations or dependencies
between the input variables is to be discussed and accounted for in the analysis,
along with the effects these have on the output distribution.
5.	Information for each input and output distribution is to be provided in the report.
This includes tabular and graphical representations of the distributions (e.g.,
probability density function and cumulative distribution function plots) that
indicate the location of any point estimates of interest (e.g., mean, median, 95th
percentile). The selection of distributions is to be explained and justified. For both
the input and output distributions, variability and uncertainty are to be
differentiated where possible.
6.	The numerical stability of the central tendency and the higher end (tail) of the
output distributions is to be presented and discussed.
7.	Calculations of exposures and risks using deterministic (e.g., point estimate)
methods are to be reported if possible. Providing these values will allow
comparisons between the probabilistic analysis and past or screening level risk
assessments. Further, deterministic estimates may be used to answer scenario
specific questions and to facilitate risk communication. When comparisons are
made, it is important to explain the similarities and differences in the underlying
data, assumptions, and models.
Point estimates, even if they represent a high-end exposure assessment, can
provide important information to risk managers. If a particular exposure pathway
is responsible for a significant risk, then mitigation could focus on that pathway.
8.	Since fixed exposure assumptions (e.g., exposure duration, body weight) are
sometimes embedded in toxicity metrics (e.g., Reference Doses, cancer potency
factors), the exposure estimates from the probabilistic output distribution should
be consistent with the toxicity metric.
3. GENERAL CONSIDERATIONS FOR PROBABILISTIC ASSESSMENTS
In addition to the conditions established by Agency Policy in the previous section, additional
general guidance concerning data distributions, data sources, toxicity, and exposed populations is
appropriate for probabilistic assessments submitted to OPP. These considerations are more fully
described below.
3.1 Data Distributions
In probabilistic analyses submitted to OPP to date, inputs in most cases have represented
collections of discrete data, rather than continuous distributions. An example of such discrete
data are residue values from a set of composite samples from field trials. Assessments submitted
to date have typically used the data values themselves directly, assigning equal probability of
selection, as justified, to each data point. Such discrete data distributions are entirely acceptable.
In fact, if adequate data are available (in terms of both quality and quantity), this technique may be
preferred; there is no requirement to fit a collection of samples to a mathematical model.
Moreover, the Agency Policy Document (5/15/97) advises that there may be no single criterion
for demonstrating "best fit" of a collection of data points to mathematical models for particular
types of distributions.
However, if a person desires to conduct probabilistic assessments where data inputs are
mathematical models of distributions, he or she should provide appropriate justification. Once
input data are collected for an exposure variable of interest, a number of techniques are available
for representing the variable as a continuous function. Two of these techniques are summarized
below, and criteria for justifying the expression of distributions as specific mathematical models
are described in more detail in Attachment 3 concerning distributions:
•	An assessment may use the data to define a linear interpolated empirical
distribution function (EDF). In this case, the data values themselves are used to
specify a continuous cumulative distribution and the entire range of values
(including intermediate points) is used as the input. With this technique, any value
between the minimum and maximum observed values can be selected and model
input is not limited to the specific values present in the measured data.
•	An assessment may attempt to fit a mathematical expression to the data using
standard statistical techniques, and input values can be selected from this fitted
distribution.
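The first technique, a linear interpolated EDF, can be sketched as follows. This is an illustrative sketch under assumed conventions; the function name and data values are hypothetical, and a real submission would document its interpolation scheme explicitly.

```python
import random

def sample_linear_edf(data, rng):
    """Draw one value from a linear-interpolated empirical distribution
    function: any value between the observed minimum and maximum can be
    returned, not only the measured data points themselves."""
    xs = sorted(data)
    n = len(xs)
    u = rng.random() * (n - 1)  # continuous position along the ordered sample
    i = int(u)
    frac = u - i
    if i >= n - 1:              # guard the upper endpoint
        return xs[-1]
    return xs[i] + frac * (xs[i + 1] - xs[i])
```

Every draw falls between the observed minimum and maximum, with intermediate (unmeasured) values possible, which is the defining property of this technique.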
The choice of input distribution should always be based on all relevant information (both
qualitative and quantitative) available for an input. The selection of a distributional form should
consider the quality and quantity of the information in the database, and should address broad
questions such as the mechanistic basis for choosing a distributional family, the discrete or
continuous nature of the variable, and whether the variable is bounded or unbounded. In all cases,
input values expressed as a distribution should be fully described.
We note, however, that not all input values need, or should, be expressed as a mathematically-
modeled distribution, and probabilistic techniques should be used only on those pathways and
exposure patterns which may significantly influence the final risk estimate. If an input variable
does not significantly affect an exposure estimate regardless of its distribution, then its use in a
probability distribution represents marginal value added. A sensitivity analysis should be
performed to identify variables with significant effects on an assessment.
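One crude screening form of such a sensitivity analysis, sketched below purely for illustration, is a one-at-a-time comparison: hold every input at its median and swing each variable in turn across its observed range, recording the change in the model output. The function names and the example model are hypothetical, and this is not the only acceptable approach.

```python
import statistics

def one_at_a_time_sensitivity(model, inputs):
    """Crude one-at-a-time sensitivity screen. `inputs` maps each variable
    name to its observed values; each variable is swung from its minimum to
    its maximum while the others are held at their medians, and the absolute
    change in the model output is recorded."""
    baseline = {name: statistics.median(values) for name, values in inputs.items()}
    effects = {}
    for name, values in inputs.items():
        low = dict(baseline, **{name: min(values)})
        high = dict(baseline, **{name: max(values)})
        effects[name] = abs(model(**high) - model(**low))
    return effects
```

Variables with small effects on the output can then be left as point estimates, reserving probabilistic treatment for the inputs that drive the predicted exposure.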
3.2 Data Sources
The sources for data used in an assessment should be clearly identified. Where these are studies
that have previously been submitted to OPP, and/or reviewed by the Agency, identifying
information such as petition number, reregistration submission, document number (MRID), or
Agency review number should be provided, so the data points used may be readily confirmed.
Where available data points have been excluded from the probabilistic analysis, the exclusion
should be identified and justified. In general, HED does not believe it is appropriate to exclude
data points as "outliers" based on statistical tests only; the decision to discard an outlier should be
based on a scientific or quality assurance basis, and should only be done with extreme caution,
particularly for environmental data sets which often contain legitimate extreme values. We believe
that statistical tests can be used to identify suspect data points which require further investigation,
but that it is inappropriate to eliminate outliers from analysis unless further review of the suspect
points reveals a significant mistake in protocol which renders a generated data point irrelevant to
label conditions (e.g., wrong tank-mix concentration, mistaken application rate, too early a PHI,
too many applications, etc.). This is particularly true in cases where the data points in question
were used by the Agency in establishing a tolerance or other regulatory limit.
Persons submitting probabilistic analyses should also distinguish between data points based on
independent individual samples, and replicate determinations or analyses of the same sample.
Where replicates are reported, probability weighting of data should be equal (absent other
justification), regardless of the number of replicate analyses of any one sample.
Studies from which data are obtained should contain sufficient quality assurance/quality control of
data to assure sample integrity during treatment, collection, transportation, storage, and analysis.
Examples of such assurance include validation of the analytical method used or supporting storage
stability data. Supporting data should be provided for summaries.
3.3	Toxicity
Endpoints used in assessments should be consistent with the exposure of concern (acute,
subchronic, chronic), and should be those selected by the HED Hazard Identification Assessment
Review Committee, or selected in accordance with the Toxicology Endpoint Selection Process: A
Guidance Document, presented to the Science Advisory Panel in February 1997.
The population(s) of concern should be consistent with the appropriate toxicity endpoint. Where
an effect is likely to be specific to a defined subpopulation, assessment should be targeted to that
group. For example, since developmental effects are only likely to be expressed following
exposure of a pregnant female, the appropriate subpopulation for assessment would be females,
13+ years old.
3.4	Exposed Populations
The complexity of probabilistic exposure assessments is demonstrated with the following two
scenarios:
•	Exposed individual, single chemical, single use, single route;
•	Potentially exposed individual, multiple chemicals, multiple uses, multiple routes.
As the complexity of the exposure assessment increases, so does the potential for obscuring a
highly exposed subpopulation; readily definable subpopulations of this kind must be analyzed
in a risk assessment. Consistent with the Agency Policy section above, probabilistic assessments
should include bounding estimates for high exposure scenarios. Examples of highly exposed
subpopulations include early reentry of individuals in treated areas, frequent users, and high
volume users for occupational and residential assessments. For acute dietary assessments,
examples include the small number of individuals who consume rare or less-popular agricultural
commodities or subpopulations who preferentially consume certain commodities (e.g., locally
grown produce) whose treatment history differs substantially from national patterns (e.g., see
Section 4.4.1 of this document for a discussion of this subgroup and use of percent crop treated
data). Specifically, if there are concerns about certain subpopulations in small geographic areas
receiving increased exposures as a consequence of their consumption habits and/or local
agricultural practices, then separate probabilistic analyses by geographic region and/or season
might be necessary for those subpopulations if they can be identified.
It is important that the risk assessment highlight risk to any heavily exposed subpopulations, in
addition to the overall population, particularly since risk mitigation may become more achievable
when risk applies to a small population.
4. ACUTE DIETARY ASSESSMENTS
At present, OPP will consider for review submissions of probabilistic assessments on acute
dietary exposure only; probabilistic analyses of chronic dietary exposure are discouraged. This
policy stems from the limits of available data on chronic dietary patterns of the U.S. population.
The surveys currently accepted by the Agency as sources for estimating food consumption by
individuals (U.S. Department of Agriculture, National Food Consumption Survey (NFCS) 1977-
78, Continuing Survey of Food Intakes by Individuals (CSFII) 1989-91, and CSFII 1994-96)
consist of data obtained over two or three days based on questionnaires completed by consumers.
OPP does not consider these data adequate to model chronic consumption patterns as
distributions across the population. Frequently, foods are not consumed repeatedly by an
individual sampled during the survey, and seasonality, personal preference, and demographics
make resampling of the data to generate surrogate chronic consumption of questionable validity.
Chronic dietary risk assessments are conducted by OPP using a tiered approach, beginning with
conservative assumptions and then proceeding through refinements to more closely reflect residue
levels that might be eaten by the population of U.S. consumers. By the later iterations of the
assessment, estimates of dietary risk are based on average consumption of foods (which may be
categorized by population sub-groups), and a statistical evaluation of residues in specific foods
(averages). While available food consumption data will not capture the full range of chronic
consumption, the statistical central tendencies should be sufficient to estimate chronic dietary risk.
Even if probabilistic analyses were appropriate, they might provide additional information on the
nature of a risk assessment distribution, but would not be expected to significantly alter
conclusions based on central statistical tendencies. Probabilistic analyses therefore would not be
expected to add significant value to chronic dietary risk assessments.
Acute dietary risk assessments are also conducted in a tiered approach (see Attachment 4: Final
Office Policy for Performing Acute Dietary Exposure Assessment, June 13, 1996). The HED
Policy document defines input data for Tiers 1 and 2, representing deterministic assessments by
the Agency, and for Tiers 3 and 4, where submission of probabilistic analyses by registrants is
authorized. Tiers 1 through 4 are discussed briefly below:
•	Tier 1 uses a single high-end residue estimate and a distribution of consumption
data. It provides only an upper-bound (worst-case) estimate of acute exposure.
•	Tier 2 is the same as Tier 1, except that it uses a single average residue data point
for commodities which are typically mixed or blended. It provides a more realistic
estimation of exposure by considering average anticipated residues for food forms
that are typically mixed prior to consumption.
•	Tier 3 uses a distribution of residue data points as well as a distribution of
consumption data points. This provides a more realistic estimation of acute
exposure than Tier 2.
•	Tier 4 incorporates more extensive data (e.g., single-serving market
basket surveys, cooking studies, etc.) and provides the most representative
exposure picture. However, it may not provide a lower exposure estimate than
Tier 3.
Acceptable data sources, and/or data considerations, for acute dietary exposure and risk
assessments are described below. The aforementioned policy document should be consulted for
further details.
4.1 Population of Concern
Acute dietary risk assessment as currently implemented by OPP focuses on population risk, not
risk to the most highly exposed individual. These risk assessments assume that treated
commodities are uniformly distributed in the food supply and that it is appropriate to accumulate a
single day's exposure (24 hours) for comparison to the acute toxicity endpoint (which is based on
a single dose).
Acute dietary assessment focuses on population risk, and not risk to "eaters only". The use of the
total population rather than "eaters only" allows consistent comparisons between different
crop/residue combinations because the population of "eaters only" differs with every analysis.
We recognize that the assessment of population-based exposure (also referred to as "per capita
exposure") increases the potential for obscuring highly exposed consumers of certain
infrequently-eaten commodities through "probabilistic dilution" among the entire population.
Nevertheless, since there may be concern over commodities consumed by small numbers of
people, exposures should also be expressed on a "per user day" basis; this latter metric provides
exposure information on "eaters only" and can be used to assess exposures to individuals
consuming infrequently consumed items. In these cases the percentile level of regulatory concern
can be raised to incorporate the high-risk eater population. This approach was endorsed by the
OPP Scientific Advisory Panel in September 1995.
4.2	Reportable Percentile Exposure Values
OPP's interim policy is to base its regulatory decisions on acute dietary risk on the 99.9th
percentile of the exposure distribution when a probabilistic analysis is conducted for acute dietary
exposure. This percentile of regulatory concern applies to distributions established on a total
population basis ("eaters" and "non-eaters") which include both consumers and non-consumers of
specific food items. A draft issue paper addressing the rationale for selection of this percentile of
regulatory concern will be made available for public comment for 60 days via an announcement in
the Federal Register in the near future. EPA is seeking public comment on a series of draft
documents concerning nine science policy issues in order to make the development of its FQPA-
related science policies transparent and participatory for all interested parties.
It is necessary for acute dietary assessments that probabilistic analyses demonstrate stability of
higher tail values (Number 6 under the Agency Policy section above). Changes in the 99.9th
percentile value as the number of simulations increases should be clearly described. In keeping
with Agency policy (5/15/97), information for such output and input distributions must include a
tabular and/or graphic representation of various percentile exposures up to and including at least
the 99.9th percentile.
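The required demonstration of tail stability can be sketched as follows: recompute the 99.9th-percentile exposure at increasing iteration counts and report how it changes. The exposure-draw function, seeds, and iteration counts below are hypothetical illustrations, not prescribed values.

```python
import random

def tail_stability(sample_exposure, iteration_counts, seed=7):
    """Track how the simulated 99.9th-percentile exposure changes as the
    number of Monte Carlo iterations grows. Reusing the same seed makes the
    shorter runs prefixes of the longer ones, so differences reflect sample
    size rather than a different random stream."""
    results = {}
    for n in iteration_counts:
        rng = random.Random(seed)
        draws = sorted(sample_exposure(rng) for _ in range(n))
        results[n] = draws[min(int(0.999 * n), n - 1)]
    return results
```

A tabulation of these values (e.g., at 10,000 vs. 100,000 iterations) is the kind of clear description of changes in the 99.9th percentile that this section calls for.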
4.3	Food Consumption
The accepted sources (NFCS 1977-78, CSFII 1989-1991, and CSFII 1994-1996) are considered
suitable to represent population variation of consumption patterns on a daily basis. CSFII 1987-88
is not considered a suitable data source because of an unacceptably low response rate.
Submissions should include: 1) identification of the consumption survey used and a submission of
all survey data if those survey data have not previously been reviewed by EPA; 2) a list of all
individual food items (e.g., pizza, apple pie) used in the assessment (these items are often referred
to by "food codes"); and 3) the mapping that translates individual food items into the commodities
on which tolerances are established. The submission should provide all food codes associated
with each of the commodities that currently have a tolerance for the pesticide in question and, if
applicable, each commodity or commodities for which a tolerance is being sought.
4.4 Magnitude of the Residue
Residue data guidelines specific to acute dietary assessment have also been established, and details
are contained in the Office Policy document referenced above (Final Office Policy for Performing
Acute Dietary Exposure Assessment, Debra Edwards, HED, June 13, 1996). More detailed
guidance on the acceptability of field trial data for exposure assessment is contained in the
applicable policy document (Draft OPP Policy for the Use of Anticipated Residues of Pesticides in
Foods for Use in Chronic Dietary Exposure Assessments, June 1997). Requirements for field
trials are specified in OPPTS Test Guidelines, Residue Chemistry, 860.1500, August 1996.
We note that any data files used in the analysis should be submitted in electronic format (i.e., on
diskette) so that the analyses can be duplicated (and modified, if necessary) by the Agency. For
dietary assessments, these data files would include residue data files (RDF files) which should
include residue levels from either crop field trials or monitoring data (each of which are discussed
below). For crop field trials, the RDF files would generally consist of all the crop field data points
that represent the maximum (label) application scenario (e.g., maximum rate, minimum PHI).2 It
is important to note that we will require submission of complete RDF files for all crops currently
registered, even if the registrant intends to remove crops from the label, cancel a registration as
part of a risk mitigation measure, or believes certain points to be outliers. These crops or specific
data points can be specifically noted in the data submission and OPP itself will adjust the data
files, if warranted.
To summarize the guidelines for Tier 3 of acute dietary assessment, the first tier at which
probabilistic analysis is used, residue values for non-blended single-serving commodities (e.g.,
2 In principle, data generated from field trials conducted at the maximum label rate is
preferable to data from field trials conducted at other rates (unless bridging or other data is available
; see Section 4.5) and preference should be given to inclusion of these data points. However, on a
case-by-case basis, data from field trials conducted at 25% of the labeled maximum rate and 25%
of the minimum labeled PHI may be used provided there is no apparent impact on residue values and
no appreciable bias is introduced.
apples, bananas, oranges) may be based on a distribution of field trial data points. For
commodities that are typically mixed or "blended" prior to consumption (e.g., oils, grains and
grain products, sugars, certain tomato products (e.g., paste, puree, and juice), most juice
products, soybeans, peanuts, dried potatoes, mint oils, wine, sherry, dried beans), either the
average field trial value corrected for percent crop treated or the entire distribution of monitoring
data (not further adjusted for percent crop treated) may be used. In each case, V2 LOD/LOQ
would be substituted for all non-detected (ND) values. A draft issue paper addressing the use of
V2 LOD/LOQ as a default value will be made available for public comment for 60 days via an
announcement in the Federal Register in the near future. EPA is seeking public comment on a
series of draft documents concerning nine science policy issues in order to make the development
of its FQPA-related science policies transparent and participatory for all interested parties.
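The ½ LOD/LOQ substitution described above amounts to a simple preprocessing step on the residue data file. The sketch below assumes non-detects are flagged with the string "ND", which is a hypothetical file convention, not one specified by this guidance.

```python
def substitute_nondetects(residues, lod):
    """Replace non-detect (ND) entries with half the limit of detection,
    the default substitution described above; detected values pass through
    unchanged. `residues` mixes numeric values and the "ND" flag."""
    return [lod / 2.0 if r == "ND" else float(r) for r in residues]
```

The same substitution would apply with the limit of quantitation (LOQ) in place of the LOD where the LOQ is the reported threshold.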
4.4.1 Crop Field Trial Data
Office Policy has been previously issued for Tier 3 analyses. Under this policy for non-
blended (single-serving type) commodities, the distribution of residue data points from
field trials should be based on individual field trials conducted at the least restrictive
conditions allowed by the label (maximum rate for individual applications, maximum
seasonal rate, shortest preharvest interval (PHI)). Preferred data sources are field trials
that have been reviewed and accepted by the Agency (for example, for registration or
reregistration actions). Registrants should clearly indicate sources of residue data
(petition number, reregistration submission, MRID, Agency review number) so the data
points used in acute assessment can be readily confirmed. Where data points from
appropriate field trials have been excluded in the probabilistic analysis, such exclusion
should be identified and justified.
In general, the distribution of residue data points from field trials can be adjusted by
percent crop treated values, reflecting the untreated part of a crop by introducing "zero"
residues as a weighted proportion of the total number of residue entries. Imported crops
are assumed to be 100% treated unless data are provided.3 OPP has chosen to incorporate
percent crop treated into its (acute) probabilistic exposure estimates since we believe that
it provides a better estimate of real exposure probabilities. When local agricultural
practices differ markedly from national patterns, the use of a single (national) estimate of
percent crop treated may be less appropriate for certain geographic subpopulations of
consumers with specific fresh-produce buying patterns. If there are concerns about certain
subpopulations in defined geographic areas receiving increased exposures as a
consequence of their consumption habits and/or local agricultural practices, then separate
³ All percent crop treated values must be confirmed by OPP's Biological and Economic
Analysis Division (BEAD). In general, OPP will use BEAD's "maximum percent crop treated"
estimate in a Tier 3 probabilistic analysis.
probabilistic analyses by geographic region and/or season might be necessary for those
subpopulations.
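The zero-weighting adjustment described above can be sketched in a few lines. This is a minimal illustration only; the four residue values and the 40% crop treated figure are hypothetical, and any real percent crop treated value must be confirmed by BEAD:

```python
def adjust_for_percent_crop_treated(residues, pct_treated):
    """Expand a field-trial residue distribution so that treated-crop values
    make up only `pct_treated` of the entries; the remainder are "zero"
    residues representing the untreated portion of the crop."""
    if not 0 < pct_treated <= 1:
        raise ValueError("pct_treated must be in (0, 1]")
    n_total = round(len(residues) / pct_treated)
    return list(residues) + [0.0] * (n_total - len(residues))

# hypothetical example: 4 field-trial points, 40% of the crop treated
dist = adjust_for_percent_crop_treated([0.12, 0.30, 0.05, 0.22], 0.40)
# 10 entries total, 6 of them "zero" residues
```

A probabilistic (Monte-Carlo) analysis would then sample residues from this expanded distribution rather than from the field-trial points alone.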
4.4.2 Monitoring Data
Exposure assessments using monitoring data will typically use data from surveillance
monitoring by the Food and Drug Administration (FDA) and the U.S. Department of
Agriculture (USDA), or data from the USDA's Pesticide Data Program (PDP). As noted
previously for Tier 3 assessment of blended commodities, the entire distribution of residue
values from (blended commodity) monitoring data should be used (with no correction for
percent crop treated since this is already incorporated into the monitoring data), with ½
LOD or LOQ substituted for all non-detects. PDP is designed to provide data relevant to
dietary risk assessment, but it is limited to specific commodities and pesticides. Generally,
at least 100 sample points for each (blended) commodity are required for monitoring data
to be used in exposure assessment. Further details on monitoring programs are contained
elsewhere (Draft OPP Policy for the Use of Anticipated Residues of Pesticides in Foods
for Use in Chronic Dietary Exposure Assessments, June 1997).
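Assembling a monitoring-data residue distribution under the rules above (½ LOD substituted for non-detects, no percent crop treated correction, at least 100 samples) can be sketched as follows; the sample counts and the LOD are invented for illustration:

```python
def residue_distribution_from_monitoring(samples, lod):
    """Build a residue distribution from monitoring data: measured values are
    used as-is, and 1/2 LOD is substituted for every non-detect (None).
    No percent crop treated correction is applied, since market-level
    monitoring data already reflect the untreated portion of the crop."""
    if len(samples) < 100:
        raise ValueError("at least 100 sample points are generally required")
    return [v if v is not None else lod / 2.0 for v in samples]

# hypothetical PDP-style data set: 97 non-detects and 3 detections, LOD = 0.01 ppm
data = [None] * 97 + [0.02, 0.05, 0.11]
dist = residue_distribution_from_monitoring(data, lod=0.01)
```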
For Tier 3, the Agency discourages use of monitoring data as a distribution of residues for
most unblended commodities (e.g., apples, bananas, oranges) because data from
composite samples do not adequately represent the range of residues in a single serving
size sample and the relationship between the residues measured in a composite sample and
the range of residues in the individual samples that make up the composite is not
established for most chemical/commodity combinations. From limited data that are
available, OPP has observed that residues in single serving samples can be higher, by as
much as an order of magnitude or more, than residues in the corresponding composite
sample. While field trial data are also based on composite samples, they represent
residues under maximum application scenarios (maximum rate and number of applications
and minimum PHI) and are generally measured at the "farm gate." Because residues may
decline during shipping, handling, and/or processing before food consumption at the
"dinner plate," field trial data collected as composite samples are considered sufficiently
conservative for use in an acute dietary risk assessment. Monitoring data collected as
composite samples, on the other hand, would consistently underestimate residue levels in
the upper percentiles of agricultural commodities if the samples were measured on a
single-serving basis.
For Tier 4 analysis, assessment may include a distribution of monitoring data points,
provided the residue data reflect a market basket survey based on single serving size
samples. Such market basket surveys should be well-designed and statistically valid; study
protocols should generally be approved by the Agency before initiation. As the Agency
has already noted, a Tier 4 analysis may not necessarily result in lower acute risk estimates
than Tier 3 (Final Office Policy for Performing Acute Dietary Exposure Assessment,
Debra Edwards, HED, June 13, 1996).
4.5 Adjustments For "Typical" Use Patterns
As a further refinement, probabilistic assessments may include residue data based on the range of
"typical" application rates that may be more restrictive (lower rates, fewer applications, longer
PHIs) than the maximum label conditions, provided certain conditions are met. Because the
necessary conditions may require the generation of additional data, it may prove to the advantage
of registrants and/or growers to conduct an informal sensitivity analysis to determine to what
extent lower residue levels in given crops may influence overall acute dietary exposure. OPP's
BEAD will verify that the typical usage data are reasonable and reflect actual practice. For
example, if an assessment has been conducted under the assumptions that 40% of the crop is not
treated, 40% is treated at a "typical" rate of 0.5x the label maximum, and 20% is treated at the
label maximum, all of these assumptions should be confirmed by BEAD. For regions of production
for which use/usage data are not available, or not confirmed by BEAD, residue levels should
continue to be based on use rates allowed by product labeling for incorporation in the
assessment.
Residue data to support the use of typical use/usage data may be derived from a number of
sources. One option would be to perform additional field trials at the reduced rates of interest in
accordance with the Series 860 Residue Chemistry guidelines (OPPTS 860.1500 Crop Field
Trials). In this situation, new residue data would be generated under reduced use conditions and
would be (probabilistically) added to the data originally generated under maximum use scenario
for tolerance assessment. These additional field trials would, in general, be expected to be in
accord with the OPPTS 860.1500 guidelines with respect to the number of field trials and
geographic distribution.
Alternatively, the registrant may desire to instead generate limited "bridging data" to permit a
statistical adjustment to the maximum application rate/minimum PHI residue data originally
generated for tolerance establishment purposes. This option may be particularly desirable for
those crops in which more extensive (more than 5) residue trials are recommended per OPPTS
crop field trial guidelines (e.g., lettuce, potatoes, corn). Under such an alternative, at least two
side-by-side field trials comparing residues from the maximum label conditions v. applicable
reduced conditions should be conducted at locations previously used to support the label rate.
The sites chosen should reflect different geographical regions, with one being the region where
the highest average field trial (HAFT) value occurred for a given crop, and the second being the
region representing the highest proportion of production for the given crop (see Tables in OPPTS
Test Guidelines, Residue Chemistry, 860.1500, August 1996). If both these conditions occur in
the same region, then the second field trial should be conducted in the region representing the
second highest proportion of crop grown. The bridging data concept can be extended to include
residue decline information; that is, crops from side-by-side field trials can be harvested at various
pre-harvest intervals (e.g., 0, 1, 2, 3, 5, and 7 days) and residue measurements can be made to
develop a residue decline curve from which a residue decay rate can be inferred. With
probabilistic information on the range of actual PHIs observed by growers (i.e., application
timing), residue data reflecting these declines can be generated. We note that with both bridging
and residue decline studies, extrapolation of data between similar crops may be allowed on a case
by case basis, considering similar cultural practices and application patterns.
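Deriving a decay rate from side-by-side residue decline data amounts to a curve fit. The sketch below fits a log-linear (exponential) decline by least squares; note that, per the guidance below, the exponential (first-order) form is itself an assumption that must be supported by the data, and the PHI schedule and residue values here are invented:

```python
import math

def fit_residue_decline(phis, residues):
    """Least-squares fit of ln(residue) vs. PHI (days) from controlled
    side-by-side trial data, returning the decay rate k (per day) and the
    fitted day-0 residue. Assumes the data themselves support an
    exponential decline; this only illustrates the arithmetic."""
    n = len(phis)
    logs = [math.log(r) for r in residues]
    mx, my = sum(phis) / n, sum(logs) / n
    sxx = sum((x - mx) ** 2 for x in phis)
    sxy = sum((x - mx) * (y - my) for x, y in zip(phis, logs))
    slope = sxy / sxx
    return -slope, math.exp(my - slope * mx)

# hypothetical residues (ppm) measured at PHIs of 0, 1, 2, 3, 5, and 7 days
phis = [0, 1, 2, 3, 5, 7]
residues = [1.00, 0.82, 0.67, 0.55, 0.37, 0.25]
k, r0 = fit_residue_decline(phis, residues)
```

The fitted curve, combined with data on the range of PHIs actually observed by growers, could then be used to generate residue values reflecting typical application timing.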
We emphasize that residue data (or residue decline data) should be available to support the
inclusion of the range of "typical" use patterns. Assumptions of a linear or other relationship
between application rates and resulting residue levels may not be made automatically; data
correlating measured residue values to application rates should be provided. Likewise, an
automatic assumption that residue decline is first order (for example) with respect to time is not
acceptable; data should be provided to demonstrate a relationship between pre-harvest interval
(PHI) and residue level. In particular, we note that data establishing reduced application rate or
residue decline relationships cannot be derived from field trials conducted at different sites or at
different times because of the potential impact of environmental conditions and variability in study
conduct on results. Therefore, only data from controlled field trials specifically designed and
collected to monitor the effects of PHI on residues should be used for modeling these
relationships. Data provided should include weather and precipitation records to enhance
evaluation of a study and its results.
It is the Agency's belief that it would be inappropriate to attempt to derive a quantitative
relationship between application rate and/or timing and resulting measured residue levels if these
residue levels lie below a limit of reliable quantitation (i.e., the LOQ). That is, if a registrant
intends to use bridging or residue decline data in an attempt to establish a quantitative (i.e.,
mathematical) relationship between application rate and resulting residue level, it is important
that residue level measurements be sufficiently precise and accurate such that a valid mathematical
relationship can be obtained. Thus, all residue measurements used to establish such a
mathematical relationship should be at levels at least equal to the limit of reliable quantitation (the
LOQ). In no case may measured residue levels below the method LOQ be used when
establishing the relationship between residue level and application rate or application timing.
If data from side-by-side field trials are not available, data produced by commercial growers may
also be useful in establishing the relationship between application rate and resulting residues.
Such data may be used in probabilistic assessments, provided that: 1) they are collected using
procedures documented in standard operating procedures; 2) samples can be clearly
correlated with given application rates and/or conditions; 3) the PHI, time from harvest to
analysis, commodity analyzed, and demonstrated storage stability of residues under the storage
conditions used are clearly defined; and 4) the analytical method used is well defined (including
its limit of quantitation (LOQ)) and detects all residues of current toxicological concern for a
given pesticide.
As noted above, use of field trial data based on composite samples has been considered acceptably
conservative for acute dietary assessment. Use of residue data from field trials under application
conditions less stringent than the allowed label rate has the effect of diminishing this conservatism.
Ideally, field trials employing typical use patterns and used for dietary risk assessment should be
based on single serving size samples. Recognizing that such data are rarely available in quantities
sufficient for probabilistic analysis, OPP will evaluate chemical-specific considerations to
determine whether the use of data from composite samples is acceptable for use in risk
assessments. The most important considerations would be the systemic nature of the pesticide,
application type and timing (e.g., short PHI or postharvest fruit dip), and the stability of the
pesticide (especially postharvest and during processing or cooking, as applicable), as these factors
influence the likelihood that data on composited samples at harvest may underestimate residues in
single serving size samples at the time of consumption. If examination of these and other factors
leads the Agency to determine that use of composite samples may underestimate risk to one or
more population subgroups, then other options would be pursued. These other options could
include, but would not be limited to: offering the opportunity to conduct a market basket survey
under Tier 4; reverting to an exposure assessment based only on label maxima conditions; or
calculation of worst-case residues in a single serving size component by assuming all residues of
the composite sample can be attributed to a component single serving size sample.
It should be noted that data needed to support typical use patterns under this section could be
reduced if a registrant changes a label directly to incorporate more stringent conditions, and then
conducts a probabilistic assessment using residue data based on the new label conditions only. In
that case, the revised label would represent the new maximum conditions, and the assessment
would represent a simple Tier 3 approach. Data (which could include limited bridging data)
would still be needed to justify residue levels under the new maximum conditions, but verification
by BEAD, and data comparing residues in composite versus single serving size samples would not
be required.
4.6 Exposure Assessment for Livestock Commodities
In assessments submitted to OPP, the following approach for estimating residues in livestock
commodities has been accepted for Tier 3 analysis. This approach is not meant to be prescriptive;
OPP would consider other approaches to modeling livestock diets, provided they can be justified
scientifically and remain protective of public health from the perspective of acute dietary risk:
The dietary burden for the appropriate animals should first be calculated in a manner similar to
that for determining tolerances in meat, milk, poultry, and eggs (MMPE). That is, all feed items
(as specified in OPPTS Test Guidelines, Residue Chemistry, 860.1000, August 1996, Table 1)
that could be treated with the pesticide of interest should be assumed to have residues with 100%
crop treated.
A reasonable worst case dietary burden should then be calculated taking into account the residue
levels for individual feeds and their percentages of the diet with dry matter correction where
appropriate. With respect to residue levels, for dairy cattle feed items the average residue from
field trials reflecting the maximum use pattern may be used for all commodities to account for the
extensive blending of milk. For beef cattle the average field trial residue may be used for blended
feeds (e.g., grains), but the highest average field trial (HAFT) should be employed for non-
blended feeds such as forage and hay. "Reasonable worst case" means that the diet should take
into account regional practices (e.g., local milksheds), but should not be a mixture of feeds on
which the animal would have difficulty surviving. For example, it should not consist primarily of
similar feeds that are high in protein (e.g., combination of meals from soybeans, peanuts and
cottonseed), or that are low in nutritional value (e.g., cotton gin byproducts plus rice straw plus
sorghum stover).
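The dietary burden calculation described above can be sketched as a weighted sum over feed items, with residues corrected to a dry-matter basis where appropriate. The feed items, residue levels, diet fractions, and dry-matter percentages below are invented for illustration; real values come from the field-trial data and OPPTS 860.1000 Table 1:

```python
def dietary_burden_ppm(feed_items):
    """Reasonable worst-case dietary burden (ppm in the diet) from a list of
    (residue_ppm, fraction_of_diet, pct_dry_matter) tuples, with each
    residue corrected to a dry-matter basis. Diet fractions must sum to 1."""
    if abs(sum(frac for _, frac, _ in feed_items) - 1.0) > 1e-9:
        raise ValueError("diet fractions must sum to 1")
    return sum(residue / (dm / 100.0) * frac
               for residue, frac, dm in feed_items)

# hypothetical beef-cattle diet: HAFT residues for non-blended feeds
# (forage, hay), average field-trial residue for the blended grain
diet = [
    (2.0, 0.50, 25.0),   # forage: 2 ppm residue, 50% of diet, 25% dry matter
    (1.2, 0.30, 88.0),   # hay
    (0.1, 0.20, 89.0),   # grain
]
burden = dietary_burden_ppm(diet)
```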
If reliable percent crop treated (%CT) data are available for the feed items, this information may
be used in the following manner. The worst case dietary burden described above may be assumed
to apply to only that % of meat, milk, poultry and egg (MMPE) samples corresponding to the
highest %CT for any one feed item. (This highest %CT should account for all feed items for the
pesticide, not just those used to construct the worst case diet.) All other samples of animal
products will be assumed to have zero residues. For example, if the %CT for alfalfa, soybeans
and corn are 20%, 10%, and 5%, respectively, 20% of livestock commodities may be assumed to
have residues and the remaining 80% assumed to have no residues. On a probabilistic basis, a
much smaller percentage would receive the maximum dietary burden.
With respect to calculating the residue levels in animal tissues and eggs, the maximum ratio of
tissue or egg residues to dose level from the feeding study should be used since these items may
be consumed as a single commodity. That maximum ratio should normally reflect the samples
from the feeding level closest to the worst case dietary burden provided there are quantifiable
residues to calculate a ratio. A linear regression may also be used to determine residue levels
provided the regression is forced through zero (i.e., shows no residues in animal commodities
when dietary burden is zero). In the case of milk, it is acceptable to use the average ratio of
residues to dose level due to the high degree of blending that occurs for this commodity. That
average ratio should be calculated using only those samples collected after residues have reached
a plateau in the milk.
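The zero-intercept regression described above has a closed form, b = Σ(x·y)/Σ(x²), which forces the fitted line through the origin. The feeding-study dose levels and tissue residues below are invented for illustration:

```python
def zero_intercept_slope(doses, residues):
    """Slope of a linear regression forced through the origin, reflecting
    the requirement that animal commodities show no residues when the
    dietary burden is zero: b = sum(x*y) / sum(x*x)."""
    return (sum(x * y for x, y in zip(doses, residues))
            / sum(x * x for x in doses))

# hypothetical feeding-study dose levels (ppm in diet) vs. tissue residues (ppm)
doses = [5.0, 15.0, 50.0]
tissue = [0.04, 0.13, 0.41]
slope = zero_intercept_slope(doses, tissue)
tissue_estimate = slope * 12.0   # predicted residue at a 12 ppm dietary burden
```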
5. OCCUPATIONAL AND RESIDENTIAL EXPOSURE ASSESSMENTS
Prior to issuance of this guidance document, OPP assessed occupational and residential exposure
to pesticides using a deterministic approach, and as such discouraged submissions of probabilistic
assessments on occupational and residential exposure. This policy stemmed from a lack of
adequate criteria to assess the validity of probabilistic exposure assessments and a concern that
highly exposed subpopulations might be obscured in population based assessments.
Consistent with current Agency policy that probabilistic analysis techniques are viable statistical
tools for analyzing variability and uncertainty in risk assessments, OPP has now developed
guidance for the preparation and review of probabilistic exposure assessments from occupational
and residential use of pesticides. Accordingly, OPP will now accept for review probabilistic
exposure assessments in support of occupational and residential use of pesticides.
However, registrants and reviewers should be aware that some regulatory restrictions are based
on incidents of pesticide poisoning. These restrictions may include certain articles of personal
protective equipment (PPE) (e.g., protective eyewear), or minimum restricted-entry intervals
which are based on the acute toxicity of the active ingredient. The Agency is unlikely to reduce
or eliminate these types of protections regardless of the results of probabilistic analyses based
upon other than acute toxicological endpoints.
For example, protective gloves may be required on product labeling because of a high incidence
of skin irritation effects to workers handling the product. If a probabilistic risk assessment for
this chemical concluded that workers are sufficiently protected from other effects in baseline attire
(i.e., no gloves), the Agency would not be inclined to eliminate a glove requirement based on
reported skin incidents. Or, a probabilistic risk assessment based on a short- or intermediate-
term toxicological endpoint may indicate that by 12 hours following application, short- or
intermediate-term risks to workers would not exceed the Agency's level of concern. If the
product, however, has a longer REI based on poisoning incidents, the Agency would not be
inclined to reduce the REI to 12 hours, despite the results of the probabilistic assessment.
5.1 Mixer/loader/applicator (M/L/A) exposure
Occupational exposures to pesticides from handling (mixer/loader/applicator) are assessed in a
tiered manner consistent with HED policy described in Series 875 - Occupational and Residential
Exposure Test Guidelines, Group A-Applicator Exposure Monitoring Test Guidelines
(Previously designated Subdivision U). The Applicator Exposure Monitoring Test Guidelines
describe studies and data required to determine dermal and inhalation exposure following
agricultural use of a pesticide. These studies/data are: Dermal Exposure - Outdoors, Guideline
875.1100; Dermal Exposure - Indoors, Guideline 875.1200; Inhalation Exposure-Outdoors,
Guideline 875.1300; Inhalation Exposure-Indoors, Guideline 875.1400; Biological Monitoring,
Guideline 875.1500; Applicator Exposure Monitoring Data Reporting, Guideline 875.1600; and
Detailed Product Use Information, Guideline 875.1700.
The Pesticide Assessment Guidelines, Subdivision U - Applicator Exposure Monitoring, states in
part, "...respiratory exposure monitoring is required if the material in question has been
demonstrated to cause an adverse biological effect that is associated with or accentuated by
respiratory exposure; if the formulation or application method is expected to result in significant
respiratory exposure; or if the formulation or application method has an unknown potential for
respiratory exposure." Since respiratory exposure to pesticide handlers is generally much less
significant than dermal exposure, this document will describe dermal input parameters
only. It should be noted that input parameters for probabilistic inhalation exposure assessments
could be derived in a similar manner.
Dermal exposure for an agricultural pesticide handler is initially determined as a baseline dermal
unit exposure (long pants, long sleeve shirt, no gloves, open mixing/loading, and open cab
tractor), followed, if necessary, by an assessment which takes into account additional personal
protective equipment (PPE; double layer of clothing and chemical resistant gloves) and finally an
assessment which takes into account engineering controls (closed mixing, single layer of clothing,
no gloves, enclosed cockpit, enclosed cab).
Input parameters for these assessments historically have been point estimates reflecting typical,
maximum, or minimum values. Parameters generally entered into deterministic calculations for
dermal exposure from handling pesticides in an agricultural setting are: the subpopulation of
interest (e.g., mixers/loaders, re-entry workers, applicators, etc.); the unit exposure value for the
subpopulation of interest (derived from the Pesticide Handler Exposure Database (PHED)
V1.1); the application rate (from product labels); lbs active ingredient handled and the area treated
in a typical workday (estimates based on available usage information) and the worker's body
weight (taken from the Agency's draft Exposure Factors Handbook).
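The deterministic parameters listed above combine in a simple calculation: unit exposure times pounds of active ingredient handled per day, divided by body weight. The numeric inputs below are illustrative only; real unit exposures come from PHED and chemical-specific studies, and rates and acreage from labels and usage data:

```python
def dermal_dose_mg_per_kg_day(unit_exposure_mg_per_lb_ai,
                              app_rate_lb_ai_per_acre,
                              acres_per_day,
                              body_weight_kg=70.0):
    """Deterministic daily dermal dose for a pesticide handler:
    unit exposure (mg per lb a.i. handled) x lb a.i. handled per day,
    divided by body weight. All inputs here are illustrative."""
    lb_ai_per_day = app_rate_lb_ai_per_acre * acres_per_day
    return unit_exposure_mg_per_lb_ai * lb_ai_per_day / body_weight_kg

# e.g., 0.5 mg per lb a.i., 1.0 lb a.i./acre, 80 acres treated per day
dose = dermal_dose_mg_per_kg_day(0.5, 1.0, 80.0)
```

In a probabilistic assessment, each of these point estimates would instead be drawn from a fully described distribution.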
It should be noted that frequency of exposure is not explicitly considered as an input parameter
for short- or intermediate- term exposures because this assumption is implicitly accounted for in
the duration of the toxicology endpoint. Additional details regarding these parameters both as
currently used in deterministic assessments and as might be used in a probabilistic assessment are
described below:
 Populations of Interest: Under the deterministic approach, HED considers the
MOEs for each individual subpopulation of exposed users (e.g., mixer/loaders,
reentry workers, pilots). Under a probabilistic approach, the policy of assessing
the risk to each individual subpopulation of interest will continue. Several
exposure assessments can be required for a given subpopulation depending upon
the number of toxicological endpoints that have been identified (e.g., cancer, short-
term, intermediate-term, and chronic). The completion of assessments for short-
and intermediate-term endpoints generally does not require an extensive knowledge
of the population dynamic. For example, OPP would not consider market-share
data reflecting national use of a pesticide (e.g., percent crop treated) if a short- or
intermediate-term endpoint exists, because any single exposure event may pose an
unacceptable risk. The completion of chronic and cancer-based assessments does,
however, require a more extensive knowledge of the population and other
dynamics affecting frequency and duration of exposure (how frequently do
agricultural workers pick a crop in a season, in a lifetime, in treated fields, in
untreated fields). In a probabilistic chronic exposure assessment, exposures must
be amortized over a lifetime and information on population dynamics provides a
valuable insight into the subpopulation over a lifetime. For example, market share
information may provide a basis for assessing risks to migrant worker populations
over a lifetime by delineating the use of chemicals on specific crops.
PHED values: Under the deterministic approach, unit exposure values are
derived from chemical-specific studies, and surrogate studies in PHED. If
chemical-specific data are available, these data should be used in conjunction with
PHED data, reflecting similar exposure scenarios, to derive unit exposure
estimates (see PHED: The Pesticide Handler Exposure Database, Reference
Manual, Version 1.1, February 1995, and The Pesticide Handler Exposure
Database (PHED), Evaluation Guidance, Version 1.1, March 1995). PHED
composite point estimates (unit exposure) are assumptions of central tendency
values for each body part from replicate data. It should be noted that there is
typically high variability among replicates in exposure studies and that most of the
studies in PHED do not have exposure data for all body parts. Unit exposure
values are derived from actual exposure studies where the same formulation types,
equipment, and methods were employed. About half of workers doing the same
activity would be expected to have higher unit exposures, and half would be
expected to have lower unit exposures. Under a probabilistic approach,
handler exposures may be estimated probabilistically. That is, the entire range of
exposure values from PHED and chemical-specific studies can be considered and
incorporated into the exposure assessment. However, exposure values derived
from PHED V1.1 and chemical-specific studies should be fully described with
respect to the source and quality of the data, the subsetting of the data base, and
the process by which data on individual body parts are aggregated to estimate total
dermal exposure. Potential correlations between "body parts" and exposure values
should be considered. If a high correlation exists, then "body-parts" from different
PHED studies should not be randomly mixed to reflect an exposed individual.
Application rates: Under the deterministic approach, application rates are
selected from available data. HED recognizes that varying rates can be applied
depending upon a number of factors (including the degree of the pest problem and
environmental considerations). Typical application rates, when available, are used
to estimate intermediate and long term exposure. Maximum application rates are
used to estimate acute and short term exposure. Under a probabilistic approach,
application rates may be expressed as distributions, provided appropriate data are
available and the distribution is fully described. Multiple values may be used in lieu
of a fixed value when the data have been shown to be appropriate for the scenario
being assessed. OPP's Biological and Economic Analysis Division (BEAD) must
verify that application rates less than the maximum label rate reflect actual practice.
	Treated area: Under the deterministic approach, treated area per day for the
various application method/crop combinations are standard values used by OPP.
These values were developed after much internal discussion, and are considered to
represent typical to reasonable high-end acreage. Under a probabilistic
approach, this information can be broken down into its individual components.
That is, acres treated per day is the product of workday length (in hours/day) and
treatment rate (in acres/hour). This latter factor can be considered a function of
tractor (or aircraft) speed, tank capacity, length of run, swath width, finished spray
treatment rate, tank refill time, and distance to tank refill station. Workday length
and treatment rate can be expressed as distributions, provided appropriate data are
available and the distribution is fully described. Multiple values may be used in lieu
of a fixed value when the data have been shown to be appropriate for the scenario
being assessed. Potential correlations need to be considered. OPP's BEAD will
verify that multiple values for treatment rate and acres treated reflect actual
practice.
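The decomposition of treated area into workday length times treatment rate lends itself to a Monte-Carlo sketch. The triangular distributions and their parameters below are placeholders only; actual inputs must be data-supported, fully described, verified by BEAD, and checked for correlation between the two factors:

```python
import random

def simulate_acres_per_day(n, rng=None):
    """Monte-Carlo sketch: acres treated per day as the product of workday
    length (hours/day) and treatment rate (acres/hour), each drawn from an
    assumed triangular distribution. The parameters are illustrative
    placeholders, and the two draws are treated as independent here."""
    rng = rng or random.Random(0)
    draws = []
    for _ in range(n):
        hours = rng.triangular(6.0, 12.0, 8.0)    # workday length (h/day)
        rate = rng.triangular(5.0, 15.0, 10.0)    # treatment rate (acres/h)
        draws.append(hours * rate)
    return draws

acres = simulate_acres_per_day(10_000)
mean_acres = sum(acres) / len(acres)
```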
	Body weight: Under the deterministic approach, a 70 kg value for adults (or
60 kg for females) is routinely used by the Agency. The 70 kg value is identified in the
Agency's draft Exposure Factors Handbook as the mean body weight for both
sexes of adults in all age groups combined, rounded to one significant figure.
Under a probabilistic approach, body weights may be expressed as distributions,
provided the distribution is fully described.
5.2 Residential exposure
Residential exposure assessments generally reflect baseline exposure (i.e., short pants, short-
sleeved shirts, no protective gloves, and no respiratory protection are baseline) and do not
consider additional personal protective equipment or engineering controls. Input parameters may
reflect typical, maximum, or minimum values. Exposure to pesticides from residential uses are
assessed in a deterministic manner consistent with HED policy described in: Standard
Operating Procedures for Residential Exposure Assessments (presented to the SAP 9/97, and
3/98). This document describes algorithms for calculating potential dose rate (PDR) for typical
residential exposure scenarios and summarizes (Appendix A) assumptions, input descriptors,
references, and output descriptors for each scenario. PHED data are used to estimate unit
exposure for only a limited number of the residential scenarios described.
Under a probabilistic approach, all input parameters for residential exposure assessments can
be expressed as distributions, provided appropriate data are available and each distribution is fully
described. Multiple values may be used in lieu of a fixed value when the data have been shown to
be appropriate for the scenario being assessed. When PHED is used to determine baseline unit
exposure, the entire range of unit exposures derived from a PHED analysis can be considered and
incorporated into the exposure assessment. Unit exposure values (derived from PHED V1.1)
must be fully described with respect to aggregating body part exposures, subsetting the data base,
and quality of data (PHED: The Pesticide Handler Exposure Database, Reference Manual,
Version 1.1, February 1995). Potential correlations between "body parts" and exposure values
should be considered. Several exposure assessments may be required for a given subpopulation
depending upon the number of toxicological endpoints that have been identified (e.g., cancer,
short-term, intermediate-term, and chronic). The completion of assessments for short- and
intermediate-term endpoints generally does not require an extensive knowledge of the population
dynamic. For example, HED would not consider market-share data reflecting national use of a
pesticide (e.g., percent households using the product) if a short- or intermediate-term endpoint
exists, because any single exposure event may pose an unacceptable risk. The completion of
chronic and cancer-based assessments does, however, require a more extensive knowledge of the
population dynamic (i.e., market share and demographic data) as these exposures must be
amortized over a lifetime and these data provide a valuable insight into the subpopulation over a
lifetime. For example, market share information may provide a basis for assessing risks to
residential populations over a lifetime by delineating the use of chemicals in specific geographical
regions and across demographic groups. OPP's BEAD will verify market share data used to
amortize long term exposure from residential use of pesticides.
5.3 Postapplication exposure
Post application exposure to pesticides from agricultural use generally reflects baseline exposure
which for this scenario assumes clothing of long pants and long-sleeved shirts. Postapplication
exposure to pesticides from residential use generally reflects baseline exposure which for this
scenario assumes clothing of short pants and short-sleeved shirts. Additional PPE (gloves and
respirators) and engineering controls are generally not considered effective options by HED
except under very specialized circumstances. Input parameters may reflect typical, maximum, or
minimum values. Exposure to pesticides from entering treated areas is assessed in a
deterministic manner consistent with HED policy described in: Series 875 - Occupational and
Residential Exposure Test Guidelines, Group B-Postapplication Exposure Monitoring Test
Guidelines (previously designated Subpart K); and Standard Operating Procedures for
Residential Exposure Assessments. The Postapplication Exposure Monitoring Test Guidelines
describe the studies and data required to determine reentry intervals following use of a pesticide.
These studies/data are:
Dislodgeable Foliar Residue (DFR) Dissipation Study, Guideline 875.2100;
Soil Residue Dissipation (SRD) Study, Guideline 875.2200;
Indoor Surface Residue (ISR) Dissipation Study, Guideline 875.2300;
Dermal Exposure, Guideline 875.2400;
Inhalation Exposure, Guideline 875.2500;
Biological Monitoring, Guideline 875.2600;
Product Use Information, Guideline 875.2700; and
Description of Human Activity, Guideline 875.2800.
The Standard Operating Procedures for Residential Exposure Assessments describes algorithms
for calculating potential dose rates (PDRs) for typical residential postapplication exposure
scenarios and summarizes (Appendix A) the assumptions, input descriptors, references, and
output descriptors for each scenario.
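As one illustration of the kind of algorithm the SOPs describe, a dermal potential dose rate can be computed as the product of a dislodgeable residue, a transfer coefficient, and an exposure time. This is a hedged sketch only; the function and its input values are illustrative assumptions, and assessors should use the scenario-specific algorithms and inputs given in the SOPs.

```python
# Hedged sketch of a dermal potential dose rate (PDR) calculation of the
# general form described in the Residential SOPs. Values are illustrative
# assumptions, not scenario-specific inputs from the SOPs.

def dermal_pdr_mg_per_day(dfr_ug_cm2, transfer_coeff_cm2_hr, hours_per_day):
    """PDR (mg/day) = DFR (ug/cm2) x transfer coefficient (cm2/hr)
    x exposure time (hr/day), converted from ug to mg."""
    return dfr_ug_cm2 * transfer_coeff_cm2_hr * hours_per_day / 1000.0

# Example: 1 ug/cm2 dislodgeable residue, 5,000 cm2/hr, 2 hr/day
pdr = dermal_pdr_mg_per_day(1.0, 5000.0, 2.0)
print(pdr)  # 10.0 mg/day
```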
Under a probabilistic approach, all input parameters (i.e., for both occupational and
residential scenarios) for assessing exposure from entering treated areas can be expressed as
distributions, provided appropriate data are available and each distribution is fully described.
Multiple values may be used in lieu of a fixed value when the data have been shown to be
appropriate for the scenario being assessed. Several exposure assessments may be required for a
given subpopulation depending upon the number of toxicological endpoints that have been
identified (e.g., cancer, short-term, intermediate-term, and chronic). The completion of
assessments for short- and intermediate-term endpoints generally does not require extensive
knowledge of the population dynamics. For example, OPP would not consider market-share data
reflecting national use of a pesticide (e.g., percent of crop treated or percent of households using
the product) when a short- or intermediate-term endpoint exists, because any single exposure
event may pose an unacceptable risk. The completion of chronic and cancer-based assessments
does, however, require more extensive knowledge of the population dynamics (i.e., market-share
and demographic data), because these exposures must be amortized over a lifetime and such data
provide valuable insight into the subpopulation over that period. For example, market-share
information may provide a basis for assessing risks to occupational populations over a lifetime by
delineating the use of chemicals in specific geographical regions and across crops, since the same
population could provide hand-labor services for various crops within a region. OPP's BEAD will
verify market-share data used to amortize long-term exposure from entering pesticide-treated areas.
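Under the probabilistic approach just described, point inputs are replaced with fully described distributions and the resulting exposure estimate is itself a distribution from which percentiles can be reported. The sketch below, using only Python's standard library, samples three inputs to a simple dermal PDR calculation from assumed distributions; the distribution shapes and parameters are illustrative assumptions only, not Agency-endorsed values.

```python
# Hedged Monte Carlo sketch: inputs to a simple dermal PDR calculation
# expressed as distributions. All distribution choices and parameters are
# illustrative assumptions; real assessments must justify and fully
# describe each distribution from appropriate data.

import random

random.seed(1)     # fixed seed so the sketch is reproducible
N = 10_000
pdrs = []
for _ in range(N):
    dfr = random.lognormvariate(0.0, 0.5)      # ug/cm2 (assumed lognormal)
    tc = random.triangular(1000, 10000, 5000)  # cm2/hr (assumed triangular)
    et = random.uniform(0.5, 4.0)              # hr/day (assumed uniform)
    pdrs.append(dfr * tc * et / 1000.0)        # mg/day

pdrs.sort()
median = pdrs[N // 2]
p99 = pdrs[int(0.99 * N) - 1]
print(f"median ~ {median:.1f} mg/day, 99th percentile ~ {p99:.1f} mg/day")
```

The output distribution, rather than a single deterministic value, is what would be examined at the reportable percentiles.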
Attachment 1: Glossary

Attachment 2: Probabilistic Risk Assessment and Monte-Carlo Methods: A Brief Introduction

Attachment 3: Distribution Selection

Attachment 4: June 13, 1996 Final Office Policy on Performing Acute Dietary Exposure Assessment