DATA QUALITY
OBJECTIVES
WORKSHOP

[Cover graphic: the DQO balance -- more time and money vs. more uncertainty; less data quality vs. less uncertainty]

U.S. ENVIRONMENTAL PROTECTION AGENCY
QUALITY ASSURANCE MANAGEMENT STAFF

-------
DATE: 1.15.87
SUBJECT: DQO Workshop Handouts
FROM: Kevin Hull
Quality Assurance Management Staff (RD-680)
TO: DQO Workshop Participants
Thank you for your decision to participate in the April 17
presentation of the Data Quality Objectives (DQO’s) workshop. In
order to prepare you for this session, I am providing you with:
1. the workshop agenda, and
2. a document summarizing key points of the DQO concept and
process.
A brief review of these materials should help to focus your
initial impressions of DQO’s and make your participation in the
workshop a more rewarding experience.
As the attachment indicates, QAMS conceives of DQO development
as a three-stage process. The April 17 workshop will deal primarily
with Stages I and II of the process. Our primary aim is to enhance
your understanding of and sensitivity to the management issues
associated with these stages.
I look forward to working with you beginning at 9:00 a.m. on
Friday, April 17.
Attachments

-------
DQO WORKSHOP AGENDA
9:00 - Initial Discussion K. Hull
- Introduction of course participants
- Initial impressions of DQO's
9:15 - Course introduction K. Hull
- Background/purpose of DQO training efforts
— Preview of course content, format, logistics
9:25 — Presentation on DQO’s as management and D. Neptune
communications tool (LECTURE 1)
— Purpose/value of the DQO concept
— Overview of the DQO process
- Status of QAMS' support efforts
10:00 - BREAK
10:15 - Open discussion -- do participants face real-world K. Hull
situations where the DQO process could help?
10:35 - Presentation on DQO’s as a quantitative tool G. Brantly
(LECTURE 2)
— What are performance criteria?
- How are they developed?
11:30 - LUNCH
12:30 - DQO exercise #1 (WORK SHEETS) D. Michael
1:10 - DQO exercise #2 (GROUP EXERCISE) D. Michael
2:20 - BREAK
2:35 - DQO exercise #2 (comparison of small group results) D. Michael
3:15 — Presentation on how performance criteria are used G. Brantly
in the design of data collection programs
3:30 - Open discussion -- have the participants’ K. Hull
perspectives on DQO's changed?
3:45 - Complete workshop evaluation form
4:00 - ADJOURN

-------
INTRODUCTION TO THE DATA QUALITY OBJECTIVES CONCEPT
The critical role of environmental data in the EPA decision-
making process has long been recognized. Despite this recognition,
many Agency data collection programs and monitoring requirements have
not adequately emphasized such key factors as the decision to be made
with the collected data and the possible consequences of an incorrect
decision. The historical approach used by the Agency has often been
to collect the “best data possible,” with the responsibility of
defining the “best data possible” usually assumed by technical
experts, rather than by EPA decision makers. Typically these
technical experts, presented with a pre-established budget, have
first identified the best available sampling and analytical methods,
and then determined the number of samples and measurements that were
affordable using these methods. To ward off the possibility of
lawsuits, extensive negotiations with industry and environmental
groups have often been conducted in order to assure that data
collection requirements are defensible and appropriate.
While this traditional approach may have ensured that the best
possible measurements were obtained, it has not always guaranteed
that the resulting information is adequate for making a decision.
Although Agency accomplishments have shown that this general approach
to designing data collection activities can be successful, it can
also be expensive and time-consuming, and will not necessarily lead
to the selection of a data collection design likely to provide data
adequate for defensible decision-making.
The Quality Assurance Management Staff (QAMS), in response to a
requirement established by the Deputy Administrator in May 1984, has
proposed a different approach to designing environmental data
collection programs, based on the concept of Data Quality Objectives
(DQO's). The DQO process does not use a pre-established budget as
the sole constraint on the design of the data collection program.
Rather, it also considers the quality of data needed to achieve an
acceptable level of confidence in the data dependent elements that
will play a role in the decision-making process. The DQO process
provides a logical, objective, and quantitative framework for finding
an appropriate balance between the time and resources that will be
used to collect data and the quality of the data that will be needed
to make the decision.
One of the important aspects of the DQO process is that decision
makers must be involved. DQO's are developed using a top-down/
iterative approach. The initial input and perspective of the
decision maker, which can be expressed in tentative and qualitative
terms, is crucial to the successful development of DQO's. Up to now,
the absence of a well—defined framework for obtaining the decision
maker’s input and for focusing the activities of senior program staff
has been a significant obstacle to the design of effective data
collection programs. QAMS recognizes that the role of the decision
maker may vary to some degree, from one of directly providing the
information and participating in planning, to one of reacting
to/concurring with options presented by key staff to the decision
maker. Through their personal involvement, the decision makers can
ensure that the DQO process will become a “way of life” at EPA.
Systematic implementation of the DQO process will improve the
probability that the quality of EPA data is compatible with the
requirements of the decision-making process.

-------
OVERVIEW OF THE DQO PROCESS
Simply stated, DQO's are statements of the level of uncertainty
that a decision maker is willing to accept in results derived from
environmental data, when the results are going to be used in a
regulatory or programmatic decision (e.g., deciding that a new
regulation is needed, setting or revising a standard, taking an
enforcement action). To be complete, these statements must be
accompanied by clear statements of:
o the decision to be made;
o why environmental data are needed and how they will be used;
o time and dollar constraints;
o descriptions of the environmental data to be collected;
o specifications regarding the domain of the decision; and
o the calculations, statistical or otherwise, that will be
performed on the data in order to arrive at a result.
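The component statements above can be pictured as fields of one record. A minimal Python sketch follows; the field names and sample values are illustrative, not an EPA-prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class DataQualityObjective:
    """One way to record the statements that make a DQO complete."""
    decision: str             # the decision to be made
    data_use: str             # why data are needed and how they will be used
    time_constraint: str      # schedule constraint
    dollar_constraint: float  # budget constraint
    data_description: str     # environmental data to be collected
    domain: str               # population/space/time the decision covers
    calculations: str         # how results will be derived from the data
    acceptable_error: dict = field(default_factory=dict)

dqo = DataQualityObjective(
    decision="Is the facility in compliance with the permit limit?",
    data_use="Compare measured concentrations to the control level",
    time_constraint="Results needed before permit renewal",
    dollar_constraint=50000.0,
    data_description="Grab samples analyzed for compound H (ppm)",
    domain="Effluent discharged during the current permit year",
    calculations="Mean of n measurements compared to 95 ppm",
    acceptable_error={"false_positive": 0.05, "false_negative": 0.15},
)
```

Holding all six statements together in one object is the point: a DQO is incomplete if any field is missing.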
Developing DQO's should be the first step in initiating any
significant environmental data collection program to be conducted by
or for EPA. The DQO process helps data users and data generators to
communicate clearly with each other about the purposes for which
environmental data will be used and the design of the data collection
program that will meet the decision maker’s requirements. Once the
qualitative and quantitative data performance requirements have been
developed, a suitable design option can be selected for the data
collection activity.
DQO's are used to define quality assurance (QA) and quality
control (QC) programs specifically tailored to the data collection
activity. Once DQO's have been established, a “QA Project Plan” is
prepared, documenting all of the activities needed to assure that the
data collection program will produce environmental data of the type
and quality required to satisfy the DQO’s. Without prior development
of DQO’s, a QA program can be used merely to document the quality of
data obtained, rather than to assure that the quality of data
obtained will be sufficient to support an Agency decision.
As envisioned by QAMS, the DQO process consists of three stages
with several steps in each stage. The process described in the first
two stages results in proposed DQO’s with accompanying specifications
(constraints). In the third stage, an evaluation of potential
designs is performed, leading to the selection of a design which is
compatible with the constraints associated with the DQO’s. The
process is meant to be iterative among all stages (and among steps
within a stage) if the proposed DQO’s and corresponding constraints
are found to be incompatible.
QAMS recognizes that its approach to DQO's is not the only one
for all circumstances. We encourage the development of alternative
approaches designed to achieve the same goals, and will be happy to
provide support to EPA organizations attempting to apply the DQO
concept to their particular situations.

-------
SUMMARY OF THE THREE STAGES OF THE DQO PROCESS
STAGE I: DEFINE THE QUESTION OR DECISION
In this stage, the decision maker states his/her initial
perceptions of what question should be addressed or decision should
be made, what information is needed, why it is needed, how it will be
used, and what the consequences will be if information of adequate
quality is not available. It is expected that the decision maker’s
input at this point will be tentative, and expressed in
non-quantitative terms. Initial estimates of the available time and
resources for the data collection activity are stated.
STAGE II: CLARIFY AND THEN STATE PRECISELY THE INFORMATION NEEDED
FOR THE QUESTION OR DECISION
In this stage, the senior staff (management and technical), with
periodic involvement of the decision maker, carefully examine the
decision maker’s Stage I statements. Senior staff then ask whether
new environmental data are really needed to answer the question. If
so, then the technical staff define precisely the domain or universe
of inference (physical, chemical, temporal, and spatial elements and
factors) for collecting the necessary environmental data. The staff
then help the decision maker to understand and state in quantitative
terms how good (certain) the decision maker requires the data to be.
Quantitative statements, to the extent possible, of the data quality
required (most frequently in terms of false positive and false
negative error rates) are the important outputs of Stage II. The
senior staff develop these quantitative statements for and with the
decision maker after they have provided the decision maker with an
intuitive feel for their implications.
STAGE III: DESIGN THE DATA COLLECTION PROGRAM
This stage is primarily the responsibility of the technical
staff, but involves both the senior management and the decision maker
to assure that the outputs of Stages I and II are understood by the
technical design staff. The objective of Stage III is to develop
data collection plans (numbers of samples, where to sample, type of
laboratory analysis, type of data analysis, etc.) that will meet the
quantitative criteria and constraints defined by the important
outputs of Stage II. In Stage III, we evaluate all steps in the data
collection process, with their associated errors, and select an optimal
design that achieves the overall control of error as defined by the
DQO, at the minimum cost. It is the prerogative of the decision
maker to select the final design that provides the best balance
between time and resources available for data collection on the one
hand, and the level of uncertainty expected in the final results on
the other.
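The Stage III trade-off -- meeting the DQO's uncertainty limit at minimum cost -- can be sketched for the simplest case, estimating a mean from independent measurements. The cost model and all numbers below are hypothetical:

```python
import math

def cheapest_design(sigma, target_halfwidth, z=1.96, cost_per_sample=100.0):
    """Smallest number of samples whose 95% confidence-interval half-width
    (z * sigma / sqrt(n)) meets the uncertainty limit set by the DQO,
    together with the resulting cost."""
    n = math.ceil((z * sigma / target_halfwidth) ** 2)
    return n, n * cost_per_sample

# e.g. measurement sigma of 10 ppm, required half-width of 5 ppm
n, cost = cheapest_design(sigma=10.0, target_halfwidth=5.0)
```

Tightening the required half-width raises n (and cost) quadratically, which is why the process iterates between the DQO's constraints and the affordable design.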



-------
WORKGROUP DISTRIBUTION
GROUP A GROUP B
B. Haneker C. Strouc
J. Blake P. Robinson
D. Kelly C. Koch
B. Wood—Thomas F. Prizner
B. Moody R. Trovato
P. Powers D. Cook
M. Otto M. H ftov
B. Coakley
GROUP C
E. Hanlon
N. Frankenberry
E. Sterrett
N. McCall
R. Ramsey
A. Jover
C. Gaurn




-------
LECTURE 1


-------
QAMS QUALITY ASSURANCE PROGRAM
[Flowchart: DQO -> QA Program Plans -> QA Project Plans -> Data Collection (with Reviews and Audits) -> Decision]

-------
DQO'S STRIKE A BALANCE
[Graphic: a balance -- increasing time and money (increasing data quality) on one side, decreasing uncertainty on the other]

-------
DATA QUALITY OBJECTIVES
STATEMENTS OF THE LEVEL OF UNCERTAINTY A
DECISION MAKER IS WILLING TO ACCEPT IN
RESULTS DERIVED FROM ENVIRONMENTAL DATA

-------
DATA QUALITY OBJECTIVES
I WOULD LIKE TO LIMIT THE CHANCE OF
DATA LEADING TO AN INCORRECT CONCLUSION:
a) that a facility is out of compliance (when it’s in)
or
b) that a facility is in compliance (when it’s out)

-------
IMPORTANCE TO MANAGERS
* SAVINGS
* CLARIFICATION
* MANAGEMENT TOOL
* COMMUNICATION
* STRUCTURE

-------
DQO PROCESS

STAGE       I                  II                    III

PURPOSE     DEFINE             ESTABLISH             DESIGN DATA
            DECISION           QUALITATIVE           COLLECTION
                               AND                   PROGRAM TO
                               QUANTITATIVE          MEET
                               CONSTRAINTS           CONSTRAINTS

LEAD        DECISION           PROGRAM AND           TECHNICAL
ROLE        MAKER              TECHNICAL             STAFF
                               STAFF

-------
STAGE I

STEPS:
1. DEFINE THE DECISION
2. DESCRIBE INFORMATION
3. DEFINE USE OF DATA
4. ASSESS CONSEQUENCES OF ERROR
5. STATE RESOURCES

-------
STAGE II

STEPS:
1. DECISION ELEMENTS
2. DATA NEEDED
3. DOMAIN
4. RESULT
5. DESIRED PERFORMANCE
6. CONFIRM NEED FOR NEW DATA
7. PROPOSE DQO'S

-------
STAGE II
1. DECISION ELEMENTS
[Diagram: the DECISION is broken into DATA-DEPENDENT and NON-DATA-DEPENDENT ELEMENTS]

-------
STAGE II
2. SPECIFY DATA
[Diagram: each DATA-DEPENDENT element of the DECISION is traced to the DATA that support it]

-------
STAGE II
3. DEFINE DOMAIN
[Diagram: for the data-dependent elements -- to what population will the decision apply? from what population should samples be taken?]

-------
STAGE II
4. DEFINE RESULT
[Diagram: each data-dependent element of the DECISION is tied to a RESULT derived from the data]

-------
STAGE II
5. STATE DESIRED PERFORMANCE
[Diagram: truth table comparing TRUTH with the data-dependent RESULT; place constraints on error in the RESULT used for the DECISION]

-------
STAGE II
6. DETERMINE NEED FOR NEW DATA
DOES ERROR IN RESULTS DERIVED
FROM EXISTING DATA MEET CONSTRAINTS?

-------
STAGE II
7. PROPOSE DQO’S
* DECISION
* RESOURCE CONSTRAINTS
* ELEMENTS OF DECISION
Data Needed
Domain
Results
Limits on error in result

-------
LECTURE 2

-------
      OBJECTIVES
1.  WHAT ARE PERFORMANCE CRITERIA?
2.  HOW ARE THEY DEVELOPED?

-------
 WHAT ARE PERFORMANCE
          CRITERIA?
* SPECIFICATIONS FOR A MONITORING PROGRAM

* DECISION ERRORS ATTRIBUTABLE TO DATA

* DESIRED LIMITS ON UNCERTAINTY

-------
HOW ARE PERFORMANCE
CRITERIA STATED?
* FALSE POSITIVES AND FALSE NEGATIVES
* CONFIDENCE INTERVALS
* POWER

-------
COMPLIANCE DECISION

                        TRUTH
                   IN          OUT
DECISION   IN    CORRECT
           OUT               CORRECT

-------
COMPLIANCE DECISION

                        TRUTH
                   IN          OUT
DECISION   IN    CORRECT
           OUT   FALSE       CORRECT
                 POSITIVE

FALSE POSITIVE: Declaring non-compliance when
permittee is in compliance

-------
COMPLIANCE DECISION

                        TRUTH
                   IN          OUT
DECISION   IN    CORRECT     FALSE
                             NEGATIVE
           OUT               CORRECT

FALSE NEGATIVE: Declaring compliance when
permittee is in non-compliance

-------
COMPLIANCE MONITORING FOR COMPOUND H
CONTROL LEVEL: 95 ppm
[Figure: measurement distribution around 95 ppm dividing IN from OUT]

-------
COMPLIANCE MONITORING FOR COMPOUND H
CONTROL LEVEL: 95 ppm
[Figure: 5% false positive tail above 95 ppm]

-------
COMPLIANCE MONITORING FOR COMPOUND H
CONTROL LEVEL: 95 ppm
[Figure: 15% false negative tail at 125 ppm]

-------
COMPLIANCE MONITORING
FOR COMPOUND H
PERFORMANCE CRITERIA:
P[FALSE POSITIVE] < .05 at 95 ppm

P[FALSE NEGATIVE] < .01 at 2,000 ppm

P[FALSE NEGATIVE] < .15 at 125 ppm
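Criteria like these can be checked against a candidate design. The sketch below assumes the rule "declare OUT if the mean of n independent, normally distributed measurements exceeds a cut-point" -- an assumption of this sketch, not a rule stated in the slides -- with hypothetical cut-point, noise, and sample size:

```python
import math

def phi(x):
    # standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_declare_out(cut_point, true_ppm, sigma, n):
    """P[mean of n measurements > cut_point] when the true level is true_ppm,
    assuming independent normal measurement error with std. dev. sigma."""
    se = sigma / math.sqrt(n)
    return 1.0 - phi((cut_point - true_ppm) / se)

# Hypothetical design: cut-point 110 ppm, sigma 20 ppm, n = 9 samples
p_false_positive = p_declare_out(110.0, 95.0, 20.0, 9)         # truly IN at 95 ppm
p_false_negative = 1.0 - p_declare_out(110.0, 125.0, 20.0, 9)  # truly OUT at 125 ppm
```

If either rate exceeds its limit, the design (cut-point, method precision, or number of samples) must change -- which is exactly the Stage III iteration.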

-------
HOW ARE PERFORMANCE
CRITERIA DEVELOPED ?
PROCESS - DQO STAGES I AND II
EXAMPLE - MOBILE SOURCE I/M PROGRAM

-------
 TOTAL AUTO EMISSIONS
PURPOSE OF I/M PROGRAM IS TO REDUCE
EXCESS EXHAUST EMISSIONS.

-------
STAGE I INFORMATION
DECISION:      ARE HC AND CO EMISSIONS EXCESSIVE?
INPUTS:        EMISSION DATA
CONSEQUENCES:
               Corrective Maintenance
               Retesting

-------
STAGE I INFORMATION
CONSEQUENCES OF AN INCORRECT DECISION:
- Unnecessary Maintenance and Retesting
- Missing an Auto with Excessive
Exhaust Emissions

-------
STAGE II: STEP 1
IDENTIFY DECISION ELEMENTS:
- Are Exhaust Emissions Excessive for HC?
- Are Exhaust Emissions Excessive for CO?

-------
STAGE II: STEP 2
SPECIFY THE ENVIRONMENTAL DATA NEEDED:
- Level of HC Emitted in Exhaust (ppm)

-------
STAGE II: STEP 3
SPECIFY THE “DOMAIN” -
SPATIAL AND TEMPORAL BOUNDS THAT
DEFINE THE POPULATION OF INTEREST.
- HC Emissions at Idle Speed
- All Cars in the Region
- Annual or Semi-annual Tests

-------
STAGE II: STEP 4
DEFINE THE RESULT -
A DATA SUMMARY FOR USE IN MAKING
THE DECISION.
- Concentration of HC (ppm)
- Stable average or instantaneous concentration
- Result will be compared to a “Cut-point”

-------
STAGE II: STEP 5
STATEMENT OF DESIRED PERFORMANCE
DEFINITIONS:
FALSE POSITIVE:
- Finding that exhaust emissions are excessive,
when they are not.
FALSE NEGATIVE:
- Finding that exhaust emissions are acceptable,
when they are not.

-------
LIKELIHOOD OF PASSING EMISSIONS TEST
[Figure: axis from NEVER to ALWAYS vs. HC concentration at 150, 300, 1500, 3000 ppm]

-------
LIKELIHOOD OF PASSING EMISSIONS TEST
[Figure: points plotted at 150, 300, 1500, and 3000 ppm]

-------
LIKELIHOOD OF PASSING EMISSIONS TEST
[Figure: a probability curve drawn through the plotted points]

-------
LIKELIHOOD OF PASSING EMISSIONS TEST
[Figure: probability of passing -- .99 at 150 ppm, .90 at 300 ppm, .20 at 1500 ppm, .01 at 3000 ppm]

-------
STAGE II: STEP 5
STATEMENT OF DESIRED PERFORMANCE.
Desired Power of Emissions Test
(Probability of Failing):
.01 at 150 ppm
.10 at 300 ppm
.80 at 1500 ppm
.99 at 3000 ppm

-------
STAGE II: STEP 5
STATEMENT OF DESIRED PERFORMANCE.
P [FALSE POSITIVE] < .01 at 150 ppm
P [FALSE NEGATIVE] < .90 at 300 ppm
P [FALSE NEGATIVE] < .20 at 1500 ppm
P [FALSE NEGATIVE] < .01 at 3000 ppm
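A performance curve like this is produced by the test's cut-point and measurement noise. The sketch below assumes a single normally distributed measurement, with a cut-point and sigma that are hypothetical, chosen only to roughly satisfy the stated criteria:

```python
import math

def p_fail(true_ppm, cut_point, sigma):
    """Probability a car fails (measured HC > cut_point), assuming one
    normally distributed measurement with std. dev. sigma."""
    z = (cut_point - true_ppm) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Hypothetical test: cut-point 900 ppm, measurement sigma 300 ppm
curve = {ppm: p_fail(ppm, 900.0, 300.0) for ppm in (150, 300, 1500, 3000)}
```

Sliding the cut-point trades false positives (failing clean cars) against false negatives (passing dirty ones); reducing sigma improves both at once.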

-------
WORK SHEETS

-------
DQO QUANTITATIVE WORKSHEET
AMBIENT AIR EXAMPLE
(A) A given area (usually defined by the political boundaries of a
city) is classified by EPA as non—attainment with short term ambient air
quality standards for Ozone if:
— on 2 or more days per calendar year,
— the maximum one hour average Ozone concentration measured on a given day
is found to be greater than 0.12 ppm (40 CFR Part 50).
Continuous monitoring data are collected at fixed stations to determine
hourly average ambient Ozone concentrations for each area. The results
of this data collection activity are used to determine compliance
(attainment) with ambient standards.
For this situation:
1) State what a false positive result would be:
2) State what a false negative result would be:
3) Why should EPA be concerned with false positive errors?
4) Why should EPA be concerned with false negative errors?
EXTRA: Which type of error would cause you greater concern?
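The attainment rule described above reduces to a simple exceedance count. A sketch, with a made-up series of daily maxima:

```python
def is_nonattainment(daily_max_ppm, standard=0.12, allowed_exceedance_days=1):
    """Non-attainment if the maximum one-hour average ozone concentration
    exceeds the standard on 2 or more days in the calendar year
    (the 40 CFR Part 50 short-term rule as summarized in this worksheet)."""
    exceedances = sum(1 for x in daily_max_ppm if x > standard)
    return exceedances > allowed_exceedance_days

# Hypothetical daily maximum one-hour averages (ppm)
assessment = is_nonattainment([0.08, 0.11, 0.13, 0.10, 0.14])
```

Note the decision hinges on measured maxima, so measurement error feeds directly into false positive (wrongly declaring non-attainment) and false negative (missing a real violation) rates.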

-------
SULFUR REDUCTION EXAMPLE
(B) A hypothetical group of coal—fired power plant companies in the
Ohio River Valley have just agreed to install state—of—the—art scrubbers
designed to significantly reduce sulfur emissions resulting from the use
of locally mined high-sulfur coal. Congressmen and Governors in the New
England area have asked the EPA Regional Administrator to collect data
in the region so that, if a reduction occurs as a result of this action,
it can be detected. The Governors from this region reached an unprece-
dented agreement: they agreed to split costs borne by Ohio residents and
utility companies, if a reduction of greater than 20% in sulfur compounds
associated with rain is detected in the New England area during the
first year following installation of these scrubbers (which are currently
scheduled to go on—line in Jan., 1988). EPA agreed to monitor sulfur
deposition in the region in both 1987 (pre—scrubbers) and 1988 (post—
scrubbers). These data will be available for EPA to determine if a > 20%
reduction can be documented when 1987 and 1988 sulfur deposition data are
compared.
For this problem:
1) State what a false positive result would be:
2) State what a false negative result would be:
3) What type of error would Ohio taxpayers and utilities be most
concerned with? Why?
4) Is this the same type of error that would be of concern to the
New England residents? Why or why not?
5) How about environmental interest groups?
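The comparison described in the worksheet can be written out directly. The deposition values below are hypothetical, and sampling error is deliberately ignored -- bounding the false positive and false negative rates of this rule is exactly what the DQO process would add:

```python
def reduction_detected(dep_1987, dep_1988, threshold=0.20):
    """Was a greater-than-20% drop in sulfur deposition observed between
    the pre-scrubber (1987) and post-scrubber (1988) monitoring results?"""
    reduction = (dep_1987 - dep_1988) / dep_1987
    return reduction > threshold

# Hypothetical regional deposition totals (arbitrary units)
detected = reduction_detected(dep_1987=100.0, dep_1988=75.0)
```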

-------
(C) NON-POINT SOURCE MITIGATION EXAMPLE
As part of the Chesapeake Bay Program effort to control non-point source
(NPS) runoff of phosphorus (P) from farms into the Bay (including runoff into
all major tributaries leading into the Bay), EPA Region III has decided to
conduct an evaluation of the relative efficacy of two potential NPS mitigation
alternatives. Advocates of each method (M-1 and M-2) claim that their
method should yield a substantial reduction in P loading from non-point agricultural
sources into the Bay and its tributaries, based on limited data collected from
the Great Lakes region and elsewhere. M-1 involves planting a 50' buffer strip
with an effective scavenger crop. M-2 depends on use of low-till farming
practices that require a much higher use of pesticides for weed and pest control.
To determine whether M-1 will in fact result in a greater reduction in P runoff
than M-2, Region III is planning a field test of both methods. The decision
to be made from these field studies is whether M-1 is more effective than M-2,
or whether the methods are equally effective. The study will produce data that will
be used to calculate the percent reduction in P for both methods under varying
conditions. If the difference between the % reduction resulting from M-1 versus
M-2 is greater than 10% [i.e., is (% red. M-1) - (% red. M-2) > 10% or < 10%?] (10%
is the smallest difference considered by Regional Scientists to be meaningful),
then EPA will conclude that M-1 should be adopted for use in the Chesapeake Bay
Program. Otherwise EPA will recommend M-2 for use in this program.
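The decision rule stated above can be sketched in a few lines. A minimal illustration (the function name and the percent-reduction values are hypothetical; only the 10% minimum meaningful difference comes from the document):

```python
def choose_method(red_m1: float, red_m2: float, min_diff: float = 10.0) -> str:
    """Decision rule from the field study: adopt M-1 only if its percent
    reduction in P loading exceeds M-2's by more than the 10% minimum
    difference Regional Scientists consider meaningful; otherwise
    recommend M-2."""
    if red_m1 - red_m2 > min_diff:
        return "M-1"
    return "M-2"

# Hypothetical field-study results (% reduction in P):
print(choose_method(red_m1=45.0, red_m2=30.0))  # difference 15% -> "M-1"
print(choose_method(red_m1=38.0, red_m2=33.0))  # difference 5%  -> "M-2"
```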
1. State what a false positive would be in this case.
2. State what a false negative would be in this case.
3. Which type of error would be of greater concern, and why? (Hint: what are
the consequences of each error situation?)
4. List three false negative scenarios where the consequences of error are
of increasing magnitude due to the magnitude of difference missed:
(Hint: what % change in P, if it occurred, would you want the study
to be able to detect: always, most of the time, sometimes?)

-------
GROUP EXERCISE

-------
COUNTY’S SOLE DRINKING
WATER SOURCE CONTAMINATED
WITH PERCHLOROETHYLENE (PCE)
PROBLEM
DRY TANK
COUNTY
(POP. 60,000)

-------
HISTORY OF CONTAMINATION PROBLEM
[Timeline graphic, April through December. Events shown:
- PCE in well water discovered
- Contamination confirmed; 5 wells removed from service
- Water conservation measures
- PCE rose to 20 ppb in 1 well in 2 weeks
- Search for contaminated ground water; county monitoring; EPA assistance
- Charcoal filters for contaminated wells]

-------
DRYTANK COUNTY WATER SUPPLY
SYSTEM
- 31 WELLS
- INDIVIDUAL PUMPS/TREATMENT
- TOTAL CAPACITY: 72 MILLION GALLONS/DAY
- NO CENTRAL TREATMENT
- MINIMAL ABOVE-GROUND STORAGE

-------
DECISIONS THAT REQUIRE MONITORING DATA
• WHETHER TO TAKE A GIVEN WELL OUT OF SERVICE.
• WHETHER A GIVEN WELL CAN BE RETURNED TO
SERVICE AFTER INSTALLATION OF GAC COLUMNS.
• WHETHER A GIVEN WELL WITH GAC TREATMENT
CAN REMAIN IN SERVICE.

-------
CONSEQUENCES OF DECISIONS
• COST OF GAC TREATMENT AT EACH WELL HEAD
• MITIGATE LONG-TERM HEALTH EFFECTS FROM
PCE EXPOSURE
• INABILITY TO MEET PEAK WATER DEMANDS IF
MORE WELLS ARE REMOVED FROM SERVICE

-------
DATA NEEDED FOR DECISION
• PCE CONCENTRATION IN WATER - NO GAC FILTRATION
• PCE CONCENTRATION IN WATER - GAC FILTRATION

-------
PCE STANDARDS
• NO EXISTING FEDERAL STANDARDS
• PROPOSED EPA HEALTH-BASED STANDARD
IS ZERO (SUSPECTED CARCINOGEN)
• STATE DRINKING WATER STANDARD IS 3 ppb

-------
HEALTH EFFECTS - PCE
MOUSE - LIVER CANCER: ORAL (GAVAGE) EXPOSURE
- RISK ASSESSMENT:
  1 ppb  - 1.5 x 10^-6
  10 ppb - 15.0 x 10^-6
  50 ppb - 75.0 x 10^-6
RAT - LIVER CANCER: INHALATION EXPOSURE
HUMAN - TRANSIENT LIVER DAMAGE,
SHORT-TERM EXPOSURE (100 ppm)
EPIDEMIOLOGIC DATA INCONCLUSIVE
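The three risk figures on this slide scale linearly with concentration, at 1.5 x 10^-6 excess lifetime cancer risk per ppb. A minimal sketch of that implied dose-response line (the function name is ours, and the linear no-threshold model is an assumption built into the risk assessment, not an established fact):

```python
# Slope taken from the slide: 1 ppb corresponds to 1.5e-6 excess risk.
RISK_PER_PPB = 1.5e-6

def excess_risk(concentration_ppb: float) -> float:
    """Excess lifetime cancer risk under the linear model implied
    by the slide's risk-assessment figures."""
    return RISK_PER_PPB * concentration_ppb

for c in (1, 10, 50):
    print(c, "ppb ->", excess_risk(c))
```

Under this model the 3 ppb Florida standard would correspond to roughly 4.5 x 10^-6 excess risk.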

-------
PCE CONTAMINATION PROBLEM: SUMMARY FACT SHEET
PROBLEM:
o PCE (Tetrachloroethylene) contamination has been found in "Drytank Co.",
Florida ground water that supplies 100% of the drinking water in
this area.
PCE HEALTH EFFECTS:
o Health effects studies correlate PCE with liver damage and predict cancer
at concentrations as low as 1 ppb (EPA Risk Assessment: 1 ppb = 1.5 x 10^-6)
COMMUNITY WATER SUPPLY:
o 31 wells in system, each provides about 2000 gal/min
o No central treatment, minimal above-ground storage
o System serves 60 K people
o Aquifer is unconfined loose gravel and sand; rapid movement of ground
water evidenced by sudden appearance of PCE in one well
ACTIONS TO DATE:
o Wells with PCE > 3 ppb found and removed from service
o Mandatory water conservation measures in summer (no watering gardens)
o Search for source(s) of PCE underway by State DER; results not yet
available
REQUEST TO USEPA REG IV: INITIATION OF DQO PROCESS:
o Region to assist in design for monitoring program that will give County and
State officials data adequate for application of the 3 ppb Florida PCE
standard.
STAGE I OUTPUT
DECISION: Data will be used to decide:
- Whether to take a given well out of service
- Whether a given well can be returned to service after installation
of GAC columns
- Whether a given well with GAC treatment can remain in service
DATA NEEDS
o Concentration of PCE in water from each well prior to being pumped into
main system (with or without GAC filtration)

-------
-2-
POSSIBLE CONSEQUENCES OF DATA LEADING TO AN INCORRECT DECISION
o If a well is removed from service when it really had < 3 ppb, due to
a false positive result:
- unnecessary installation and maintenance of GAC filters
(initial cost: approx. $80K/filter; 4 filters/well)
- possible water shortages during peak demand periods
o If a well is left in service when it was really > 3 ppb, due to a false
negative result:
- unchecked potential health hazard,
- community concern
o Dry Tank County and State decision makers are more concerned with falsely
concluding a well is clean (especially as levels of PCE are increasingly
greater than 3 ppb).
RESOURCE AND TIME CONSTRAINTS
o EPA informed that $1 mil/year can be made available through State and County funds
o Need EPA design within 2 months
ADDITIONAL COST INFORMATION
o Average cost per water sample for PCE analysis: $75.00/sample using
EPA Method 601 (Purge and Trap GC)
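Taken together, the resource constraint and the per-sample analytic cost above bound the sampling effort. A rough, illustrative calculation (it ignores GAC-filter and administrative costs, which the fact sheet does not itemize and which would substantially reduce this ceiling):

```python
# Constraints stated in the fact sheet:
ANNUAL_BUDGET = 1_000_000   # dollars/year available through State and County funds
COST_PER_SAMPLE = 75        # dollars per PCE analysis, EPA Method 601 (purge and trap GC)
WELLS = 31                  # wells in the Drytank County system

# Upper bound on sampling effort if the whole budget went to analysis:
max_samples = ANNUAL_BUDGET // COST_PER_SAMPLE
print(max_samples)           # 13333 samples/year at most
print(max_samples // WELLS)  # 430 samples per well per year at most
```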

-------
-3-
Workshop Tasks
The rest is up to you. Your task is to complete the following steps:
1) Define the domain of the decision. From what portion of the environment
will data be collected?
[handwritten notes illegible]
What are the spatial and temporal boundaries associated with this
portion of the environment (over what period of time and boundaries
in space do you want to obtain estimates for use in the decision)?
What portion of the environment is your decision going to be made
for?
[handwritten notes illegible]
2) Define the result to be derived from environmental data. This result
should indicate the way in which environmental data will be used to
draw the conclusions of interest. This amounts to answering the follow-
ing questions:
What summary statistic will be calculated?
[handwritten notes illegible]
How will this be used? Will you compare it to some standard or other
reference value? How will this comparison be made?
[handwritten notes illegible]

-------
—4—
3) Go through the steps leading to specification of quantitative performance
criteria:
— Scenarios should be anticipated in which the new environmental data
might lead to an incorrect result and thus cause the final decision
about a well to be incorrect or questionable. To do this, look at
the questions and identify what a false positive and a false negative
result would be in relation to the Florida 3 ppb limit. Then list
at least one additional false positive scenario and three additional
false negative scenarios where the consequences of error are of
increasing magnitude due to the magnitude of PCE concentrations missed
or misrepresented (Hint: what level of PCE, if it occurred, would you
want the monitoring program to detect accurately always, most of the
time and sometimes).
— Rank the above error situations based on the amount of concern that
being wrong in different ways and by varying degrees would cause
you if you were the decision maker. For example, concern over
incorrect compliance decisions increases as the magnitude and
frequency of non-compliance increases.
— Assign a probability of occurrence that would be acceptable for
each scenario, with the values corresponding to the level of concern
associated with each. This statement should indicate the level of
uncertainty that can be tolerated and the results still used in the
decision making process. This statement represents a policy decision
(by the decision maker) on the acceptable risk of being wrong in
different ways. For your convenience, the following “desired
power” curve might be used to develop performance statements.
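One way to turn such a "desired power" curve into a check on a candidate design is to compute, for an assumed measurement-error model, the probability that a well would be declared OK at each true concentration. A sketch under stated assumptions (the normal-error model, sigma, and n are all illustrative; none come from the workshop materials):

```python
import math

def prob_declared_ok(true_ppb: float, sigma: float, n: int, limit: float = 3.0) -> float:
    """P(sample mean <= limit) when the true concentration is true_ppb,
    assuming n unbiased measurements with normal error sigma.
    For true_ppb above 3 ppb this is the false negative probability."""
    se = sigma / math.sqrt(n)
    z = (limit - true_ppb) / se
    # Normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# With the true PCE level well above 3 ppb, the chance of wrongly calling
# the well OK should be small for an acceptable design:
print(prob_declared_ok(true_ppb=4.0, sigma=1.5, n=4))
print(prob_declared_ok(true_ppb=6.0, sigma=1.5, n=4))
```

Plotting this probability against the true concentration produces a curve that can be compared directly with the decision maker's desired power statements.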

-------
-5-
[Blank "desired power" curve worksheet: left axis, qualitative feeling
about believing the well is OK (AWFUL, BAD, OK, GOOD); right axis,
probability of believing the well is OK; horizontal axis, True
Concentration of PCE in Drinking Water]

-------
ANSWER SHEET
[handwritten note illegible]
DQO QUANTITATIVE WORKSHEET
AMBIENT AIR EXAMPLE
(A) A given area (usually defined by the political boundaries of a
city) is classified by EPA as non-attainment with short-term ambient air
quality standards for Ozone if:
- on 2 or more days per calendar year,
- the maximum one-hour average Ozone concentration measured on a given day
is found to be greater than 0.12 ppm (40 CFR Part 50).
Continuous monitoring data are collected at fixed stations to determine
hourly average ambient Ozone concentrations for each area. The results
of this data collection activity are used to determine compliance (attain-
ment) with ambient standards.
For this situation:
1) State what a false positive result would be:
[handwritten answer illegible]
2) State what a false negative result would be:
[handwritten answer illegible]
3) Why should EPA be concerned with false positive errors?
[handwritten answer illegible]
4) Why should EPA be concerned with false negative errors?
[handwritten answer illegible]
EXTRA: Which type of error would cause you greater concern?
[handwritten answer illegible]

-------
SULFUR REDUCTION EXAMPLE
(B) A hypothetical group of coal-fired power plant companies in the
Ohio River Valley have just agreed to install state-of-the-art scrubbers
designed to significantly reduce sulfur emissions resulting from the use
of locally mined high-sulfur coal. Congressmen and Governors in the New
England area have asked the EPA Regional Administrator to collect data
in the region so that, if a reduction occurs as a result of this action,
it can be detected. The Governors from this region reached an unprece-
dented agreement: they agreed to split costs borne by Ohio residents and
utility companies if a reduction of greater than 20% in sulfur compounds
associated with rain is detected in the New England area during the
first year following installation of these scrubbers (which are currently
scheduled to go on-line in Jan. 1988). EPA agreed to monitor sulfur
deposition in the region in both 1987 (pre-scrubbers) and 1988 (post-
scrubbers). These data will be available for EPA to determine whether a >20%
reduction can be documented when the 1987 and 1988 sulfur deposition data are
compared.
For this problem:
1) State what a false positive result would be:
[handwritten answer illegible]
2) State what a false negative result would be:
[handwritten answer illegible]
3) What type of error would Ohio taxpayers and utilities be most
concerned with? Why?
[handwritten answer illegible]
4) Is this the same type of error that would be of concern to the
New England residents? Why or why not?
[handwritten answer illegible]
5) How about environmental interest groups?
[handwritten answer illegible]

-------
(C) NON-POINT SOURCE MITIGATION EXAMPLE
As part of the Chesapeake Bay Program effort to control non-point source
(NPS) runoff of phosphorus (P) from farms into the Bay (including runoff into
all major tributaries leading into the Bay), EPA Region III has decided to
conduct an evaluation of the relative efficacy of two potential NPS mitigation
alternatives. Advocates of each method (M-1 and M-2) claim that their
method should yield a substantial reduction in P loading from non-point agricultural
sources into the Bay and its tributaries, based on limited data collected from
the Great Lakes region and elsewhere. M-1 involves planting a 50' buffer strip
with an effective scavenger crop. M-2 depends on use of low-till farming
practices that require a much higher use of pesticides for weed and pest control.
To determine whether M-1 will in fact result in a greater reduction in P runoff
than M-2, Region III is planning a field test of both methods. The decision
to be made from these field studies is whether M-1 is more effective than M-2,
or whether the methods are equally effective. The study will produce data that will
be used to calculate the percent reduction in P for both methods under varying
conditions. If the difference between the % reduction resulting from M-1 versus
M-2 is greater than 10% [i.e., is (% red. M-1) - (% red. M-2) > 10% or < 10%?] (10%
is the smallest difference considered by Regional Scientists to be meaningful),
then EPA will conclude that M-1 should be adopted for use in the Chesapeake Bay
Program. Otherwise EPA will recommend M-2 for use in this program.
1. State what a false positive would be in this case.
[handwritten answer illegible]
2. State what a false negative would be in this case.
[handwritten answer illegible]
3. Which type of error would be of greater concern, and why? (Hint: what are
the consequences of each error situation?)
[handwritten answer illegible]
4. List three false negative scenarios where the consequences of error are
of increasing magnitude due to the magnitude of difference missed:
(Hint: what % change in P, if it occurred, would you want the study
to be able to detect: always, most of the time, sometimes?)
[handwritten table largely illegible; legible entries include differences
of >10% and >50%]

-------
ANSWER SHEET
-3-
3) Transient liver damage in short-term human exposures (100 ppm), and
Acute Central Nervous System effects at 100 ppm pulmonary exposure,
4) Inconclusive epidemiological study correlating Dry Cleaning Workers
with increased mortality due to colon cancer.
The rest is up to you. Your task is to complete the following steps:
1) Define the domain of the decision. From what portion of the environment
will data be collected?
[handwritten answer largely illegible]
What are the spatial and temporal boundaries associated with this
portion of the environment (over what period of time and boundaries
in space do you want to obtain estimates for use in the decision)?
What portion of the environment is your decision going to be made
for?
[handwritten answer largely illegible; references representative PCE
levels at each well]
2) Define the result to be derived from environmental data. This result
should indicate the way in which environmental data will be used to
draw the conclusions of interest. This amounts to answering the follow-
ing questions:
What summary statistic will be calculated?
[handwritten answer largely illegible; references the mean PCE
concentration in samples]
How will this be used? Will you compare it to some standard or other
reference value? How will this comparison be made?
[handwritten answer largely illegible; involves comparing the mean to a
reference value]
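One plausible reading of this worksheet step is to take each well's mean PCE concentration as the summary statistic and compare it to the 3 ppb Florida standard. A minimal sketch (the choice of the mean and the sample values are illustrative assumptions, not the document's prescribed answer):

```python
def well_exceeds_standard(samples_ppb: list, std: float = 3.0) -> bool:
    """Compare a well's mean PCE concentration to the 3 ppb
    Florida drinking water standard."""
    mean = sum(samples_ppb) / len(samples_ppb)
    return mean > std

# Hypothetical sample results (ppb) for two wells:
print(well_exceeds_standard([2.1, 2.8, 2.5]))   # mean ~2.47 -> False
print(well_exceeds_standard([3.5, 4.2, 2.9]))   # mean ~3.53 -> True
```

A real design would also account for sampling and measurement uncertainty in the mean, which is exactly what the performance criteria in the next step address.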

-------
-4-
3) Go through the steps leading to specification of quantitative performance
criteria:
- Scenarios should be anticipated in which the new environmental data
might lead to an incorrect result and thus cause the final decision
about a well to be incorrect or questionable. To do this, look at
the questions and identify what a false positive and a false negative
result would be in relation to the Florida 3 ppb limit. Then list
at least one additional false positive scenario and three additional
false negative scenarios where the consequences of error are of
increasing magnitude due to the magnitude of PCE concentrations missed
or misrepresented (Hint: what level of PCE, if it occurred, would you
want the monitoring program to detect accurately always, most of the
time and sometimes).
[handwritten answers largely illegible]
- Rank the above error situations based on the amount of concern that
being wrong in different ways and by varying degrees would cause
you if you were the decision maker. For example, concern over
incorrect compliance decisions increases as the magnitude and
frequency of non-compliance increases.

-------
-5-
- Assign a probability of occurrence that would be acceptable for
each scenario, with the values corresponding to the level of concern
associated with each. This statement should indicate the level of
uncertainty that can be tolerated and the results still used in the
decision making process. This statement represents a policy decision
(by the decision maker) on the acceptable risk of being wrong in
different ways. For your convenience, the following "desired
power" curve might be used to develop performance statements.
[hand-annotated "desired power" curve, largely illegible: qualitative
feeling (GOOD, OK, BAD, AWFUL) and probability of believing the well is
OK, plotted against True Concentration of PCE in Drinking Water]

-------

-------
QUALITATIVE FEEL ABOUT THE PCE PROBLEM vs. PROBABILITY OF FINDING WELL IS OK
[Graph: vertical axis, qualitative feeling about finding the well OK
(NEVER, RARELY, OFTEN, ALWAYS) alongside the corresponding probability;
horizontal axis, PCE concentration (ppb), 2 through 28; plotted points
trace the decision maker's desired probabilities]

-------
HOW ARE PERFORMANCE
CRITERIA USED?

-------
PCE EXAMPLE
[slide graphic, largely illegible: performance criteria shown on a
concentration scale; legible values include 1, 3, 5, 7, 9 (ppb) and
15%, 30%]
-------
PCE EXAMPLE
[slide graphic, largely illegible: concentration scale (ppb) with values
1 through 5; legible percentages include 15%, 25%, 30%]
-------
PCE EXAMPLE
[slide graphic, largely illegible: relates cost ($) to performance
criteria; legible values include 1 and 5 (ppb) and 30%]

-------
COMPOUND X EXAMPLE
[slide graphic, largely illegible: legible values include 95, 125, 135
and 15%, 30%]

-------
COMPOUND X EXAMPLE
[slide graphic, largely illegible: legible values include 95, 125 (ppb)
and ±15%, ±25%]

-------
COMPOUND X EXAMPLE
[slide graphic, largely illegible: relates cost ($) to the values 95,
125 and 30%]

-------
APPENDIXES

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON. D.C. 20460
OFFICE OF
RESEARCH AND DEVELOPMENT
[date stamp, 1986]
SUBJECT: Draft Information Guide on Data Quality Objectives
FROM: Dean Neptune, Environmental
Quality Assurance Management Staff (RD-680)
TO: QA Management Meeting Participants
The attached draft Information Guide is an effort to summarize the essential
elements of Stages I and II of the Data Quality Objectives (DQO) process. It is
offered as a logical framework for addressing important issues that require
attention in designing an effective data collection activity. This is
not the only framework--others may be equally effective. QAMS is continuing to
improve on this draft, as we gain more experience in working with the Agency on
how best to present our ideas on DQO’s.
Please contact me at 8—382—5763 with any questions or comments on this
document.
Attachment

-------
DRAFT
DEVELOPMENT OF DATA QUALITY OBJECTIVES
Description of Stages I and II
Quality Assurance Management Staff
July 16, 1986

-------
TABLE OF CONTENTS
INTRODUCTION ....................................................... ii
OVERVIEW OF DQO'S AND THE DQO PROCESS ............................... 1
DESCRIPTION OF STAGES I AND II
Initial Assumptions ............................................. 3
STAGE I
Step 1. Define the Decision ..................................... 3
Step 2. Describe the Information Needed For the Decision ........ 4
Step 3. Define the Use of Environmental Data .................... 5
Step 4. Define the Consequences of an Incorrect Decision
Attributable to Inadequate Environmental Data ........... 5
Step 5. Statement of Available Resources ........................ 6
STAGE II
Step 1. Break Down the Decision Into Decision Elements .......... 6
Step 2. Specify the Environmental Data Needed ................... 7
Step 3. Define the Domain of the Decision ....................... 7
Step 4. Define the Result to be Derived From the
Environmental Data ...................................... 8
Step 5. Statement of the Desired Performance .................... 9
Step 6. Determine the Need For New Environmental Data .......... 11
Step 7. Summary of Stage II Outputs: Statement of the DQO's .... 12

-------
-ii-
INTRODUCTION
Environmental data play a critical role in many EPA decisions. Because
of the importance of environmental data to EPA, the process used to design
data collection programs should place substantial emphasis on defining the
regulatory objectives of the program, the decision that will be made with the
data collected, and the possible consequences of the decision being incorrect.
A design process that fails to explore these issues and focuses only on collect-
ing the “best data possible” can result in serious problems, especially when
the final responsibility for defining “best data possible” is assumed by
technical experts rather than EPA decision makers. Technical experts, present-
ed with a pre—established budget, may identify the best sampling and analyt-
ical methods available and then determine the number of samples and measure-
ments that can be afforded using these methods. While this approach may
ensure that each individual measurement obtained is the best possible, it
does not always ensure that adequate information is obtained for making a
decision .
Before a data collection program can be initiated, EPA must frequently
demonstrate (to regulated industries, the environmental community and to 0MB)
that its requirements for data are justified. Negotiations are often required
to satisfy industry representatives that the data are in fact needed and to
satisfy environmental groups that the monitoring requirements are sufficiently
stringent. Although Agency accomplishments have shown that the process of
designing programs to collect the “best data possible” can be successful,
this approach is not scientifically rigorous and may fail to produce a scientific
record that will support EPA’S position. Furthermore, such a non—quantitative
approach cannot be expected to uniformly result in data collection designs
that will generate data of adequate quality for defensible decision making.
The Quality Assurance Management Staff (QAMS), in response to a require-
ment established by the Deputy Administrator in May, 1984, has proposed an
approach to designing environmental data collection programs based on the
development of Data Quality Objectives (DQO’s). The DQO process does not use
a pre—established budget as the sole constraint on the design of a data
collection program. Rather, equal consideration is given to defining the
quality of the product needed, i.e., the degree to which total error in the
results derived from data must be controlled to achieve an acceptable level of
confidence in a decision that will be made with the data. The DQO process
provides a logical, objective, and quantitative framework for finding an
appropriate balance between the time and resources that will be used to collect
data and the quality of the data needed to make the decision. Therefore,
data collection programs based on DQO’s will be more likely to meet the needs
of EPA decision makers in a cost effective manner.
One of the most important aspects of the DQO process is the involvement
of decision makers. DQO’s are developed using a top—down approach; the
initial input and perspective of the decision maker is critical to the success-

-------
-iii-
ful development of DQO’s. QAMS recognizes that the role of the decision
maker may vary to some degree among programs, from directly providing input
and direction throughout the process, to reacting to or concurring with
options presented by key senior staff. However, through their personal
involvement, decision makers can ensure that the DQO process is used to
properly design all significant data collection efforts. As the DQO process
becomes a “way of life” in the Agency, it will provide a more effective system
than currently available for ensuring that the quality of EPA data is compat-
ible with the requirements of the decision making process.
The absence of a well—defined framework for obtaining the decision maker’s
input and for focusing the activities of senior program staff presents a signi-
ficant obstacle to implementing the development of DQO’s. QAMS has prepared
the following discussion of the DQO process by building on the October, 1984
DQO guidance and experience gained in subsequent efforts to develop DQO’s.
This document presents a more detailed description of the DQO process than
the initial guidance, focusing on the role and activities of the decision maker
and the senior program staff. The discussion defines the stages and steps of
the process and describes the information that is developed in each step.
This document should be used to help familiarize decision makers and their
senior staff with the DQO process and, more importantly, with their specific
roles and responsibilities in that process.

-------
—1—
OVERVIEW OF DQO’S AND THE DQO PROCESS
Data quality objectives (DQO’s) are statements of the level of uncertainty
that a decision maker is willing to accept in results derived from environmental
data, when the results are going to be used in a regulatory or programmatic
decision (e.g., deciding that a new regulation is needed, setting or revising
a standard, or determining compliance). To be complete, these quantitative
DQO’s must be accompanied by clear statements of:
o the decision to be made;
o why environmental data are needed and how they will be used;
o time and resource constraints on data collection;
o descriptions of the environmental data to be collected;
o specifications regarding the domain of the decision; and
o the calculations, statistical or otherwise, that will be performed
on the data in order to arrive at a result.
This document explains the information needed for each of the items above and
suggests a step—by—step process by which all of the items may be prepared.
Developing DQO’s should be the first step in initiating any significant
environmental data collection program that will be conducted by or for the
EPA. The DQO process helps to define the purposes for which environmental
data will be used and sets guidelines for designing a data collection program
that will meet the Agency’s regulatory objectives. Once DQO’s have been
developed, and a design for the data collection activity expected to achieve
these objectives has been selected, DQO’s are used to define quality assur-
ance (QA) and quality control (QC) programs that are specifically tailored to
the data collection program being initiated. A “QA Project Plan” is prepared,
documenting all of the activities needed to ensure that the data collection
program will produce environmental data of the type and quality required to
satisfy the DQO’s. Without first developing DQO’s, a QA program can only be
used to document the quality of data obtained, rather than to ensure that the
quality of data obtained will be sufficient to support an Agency decision.
The DQO process consists of three stages with several steps in each stage.
The first two stages result in proposed DQO’s with accompanying specifications
and constraints for designing the data collection program. In the third stage,
potential designs for the data collection program are evaluated. Stage III
results in the selection of a design that is compatible with the constraints
and is expected to meet the DQO’s. The process is meant to be iterative
between stages, if the proposed constraints from Stage I, the proposed DQO’s
from Stage II and the design alternatives analyzed in Stage III are found to
be incompatible.
STAGE I: Define the Decision
This stage is the responsibility of the decision maker. The decision
maker states an initial perception of what decision must be made,

-------
—2—
what information is needed, why and when it is needed, how it will be
used, and what the consequences will be if the information of adequate
quality is not available. Initial estimates of the time and resources
that can reasonably be made available for the data collection activity
are presented.
STAGE II: Clarify the Information Needed for the Decision
This stage is primarily the responsibility of the senior program staff
with guidance and oversight from the decision maker and input from
technical staff. The information from Stage I is carefully examined
and discussed with the decision maker to ensure that senior program
staff understand as many of the nuances of the program as possible.
After this interactive process, senior program staff discuss each
aspect of the initial problem, exercising their prerogative to recon-
sider key elements from a technical or policy standpoint. The outcome
of their work, once explained and concurred upon by the decision
maker, leads to the generation of specific guidance for designing the
data collection program. The products of Stage II include proposed
statements of the type and quality of environmental data required to
support the decision, along with other technical constraints on the
data collection activity that will place bounds on the search for an
acceptable design in Stage III. These outputs are the proposed DQO’s.
STAGE III: Design the Data Collection Program
This stage is primarily the responsibility of the technical staff but
involves both the senior program staff and the decision maker to assure
the outputs from Stages I and II are understood. The objective of
Stage III is to develop data collection plans that will meet the cri-
teria and constraints established in Stages I and II. All viable
options should be presented to the decision maker. It is the preroga-
tive of the decision maker to select the final design that provides
the best balance between time and resources available for data collec-
tion and the level of uncertainty expected in the final results.
The following text lays out the steps that are performed in the first two
stages of the DQO process. It is during these stages that proposed DQO’s for a
data collection activity are developed and stated in such a way that they can
be used in Stage III. On close examination, the reader will discover that
several of the steps can occur simultaneously, especially in Stage I. Further-
more, there are some situations in which the process does not have to include
all steps. For example, when enforcement or compliance monitoring programs are
being developed for regulations already in place, many of the steps described
in Stage I may have already been completed. Also, when activities in either
stage reveal that new environmental data are not needed to make the decision,
the process can be stopped.

-------
—3—
The process described in this document is not the only way to develop
DQO’s. However, QAMS is convinced that offices will find the DQO process
described in the following pages to be a logical and efficient approach to
initiating the design of an environmental data collection program and its
associated QA/QC program. Programs that implement the steps addressed in
Stages I and II will find that their data collection programs are able to
satisfy the needs of decision makers in a cost effective manner.
DESCRIPTION OF STAGES I AND II
Initial Assumptions
Two assumptions are needed to justify initiating the DQO process. These
assumptions will be tested at several points during the process:
1) There is a regulatory or program decision to be made, and environmental
data will be required for the decision.
“Regulatory and program decisions” are decisions to take an
action, such as to:
o determine whether a regulation is needed;
o develop or revise a standard or regulation;
o issue or revise a permit;
o find a permittee in or out of compliance;
o take enforcement action;
o study a problem further;
o determine program policy, direction or priorities;
o implement a corrective action program.
This decision will involve many different inputs, or “elements”
(e.g., information on the environment, public health, process
and control technology, economy, and social and legal issues).
2) The DQO process is being initiated because there is an expectation
that existing environmental data will not provide the information
required and that new environmental data will be needed for the
decision.
STAGE I
The steps involved in Stage I are listed below. The adequate completion
of Step 1 is essential to the success of all subsequent steps. The remaining
steps of Stage I can be completed to a greater or lesser degree. However, the
extent to which the decision maker can provide the information required in
Steps 2—5 will directly affect the efficiency of the DQO process and the number
of iterations required to complete Stage II.

STAGE I: Step 1. Describe the Decision
— The decision maker gives a preliminary description of the decision
for which environmental data are thought to be needed.
— This step provides an initial explanation of why environmental infor-
mation is needed.
— It is important for the decision maker to provide as much background
as possible on the regulatory or programmatic context of the problem.
STAGE I: Step 2. Describe the Information Needed for the Decision
— The decision maker describes his or her initial thoughts on all of
the inputs that will be considered in making the decision. This
step is the first opportunity for defining the “elements” of the
decision, and provides an initial description of what information
the decision maker feels will be needed for the decision.
— The initial description of the information needed for the decision
does not need to be technical; it may simply be an identification of
some characteristics of the environment, geographic scope, economy,
industrial technology, and other social and legal concerns that are
related to the decision.
— This step allows the decision maker to address general questions that
will guide the data collection activity. Examples of such questions
are:
o Do we need data from the entire U.S. or only densely populated
cities?
o Do we need information on the health effects of a pollutant, or
only data on ambient levels?
o During what time period (season, time of day, etc.) must data
be collected?
o Do we need to monitor sources or ambient concentrations?
o What regulations would provide EPA the broadest authority?
The level of detail in answers to questions such as these will increase
later in the process. The purpose here is to place initial bounds on
the problem, as seen by the decision maker.

STAGE I: Step 3. Define the Use of the Environmental Data
— After describing all of the inputs to the decision, the decision
maker should explain how environmental information will be used in
the decision.
— The explanation may be effectively phrased as a series of “if, then”
statements. For example, “If data indicate that the pollutant of inter-
est is present in the environment at levels potentially harmful to
human health, then a decision will be made to regulate its use, set
ambient or source standards, or ban the use of the substance entirely.”
— The decision maker should also state his or her initial impression of
the importance of the environmental data for making the decision,
relative to the other inputs (not dependent on environmental data).
— This step is the first opportunity for testing the initial assumption
that environmental data will be needed for the decision.
o To the extent that it is possible to define how environmental
data will be used in the decision, and that such data seem
to be a significant input to the decision making
process, the assumption that environmental data are needed
has been tentatively confirmed and the DQO process should
continue.
o If it proves difficult to define how environmental data
would be used in the decision, and such data seem not to be
a significant input, then it may be appropriate to conclude
that new environmental data are not needed and the DQO
process should be terminated at this point.
STAGE I: Step 4. Define the Consequences of an Incorrect Decision Attributable
to Inadequate Environmental Data
— The decision maker should try to imagine how environmental data might
lead to an incorrect decision, and what the consequences of making an
incorrect decision might be.
— The possible environmental, public health, and economic consequences
of the following two situations should be considered:
o deciding to take an action when environmental data have
incorrectly indicated that a problem exists (a “false
positive”);
o deciding not to take an action when environmental data have
incorrectly indicated that a problem does not exist (a
“false negative”).
If it is clear at this point that false negatives would be of more
concern than false positives, or vice versa, this should be stated;
otherwise, simply stating the possible consequences of each is
sufficient.
STAGE I: Step 5. Statement of Available Resources
— The decision maker should provide an initial estimate of the amount
of time, number of FTE’s, and level of extramural funds that can
reasonably be made available for the data collection program. This
estimate should be based upon the decision maker’s experience with
data collection activities of the general type under consideration
and knowledge of budgetary constraints.
At this early point in the process, the purpose of these resource
estimates is to provide gross guidance and to propose some initial
constraints on the resources available for the data collection activity.
The decision maker will have an opportunity to make more specific time
and resource decisions during Stage III of the DQO process, when
design/cost alternatives are available. The estimates generated during Stage I
should be considered subject to modification pending the results of
Stage III, when the balance between desired data quality, time and
resources is quantitatively assessed and the decision maker can readily
grasp the trade—offs inherent in different proposed options.
STAGE II
To enter Stage II, the information generated by the decision maker during
Stage I must now pass to the senior program staff. The Stage I outputs, at
minimum, should include statements of: the decision to be made, the information
needed for the decision, why data are needed, how data will be used, and what
the constraints are on time and resources. After making sure they understand
the input of the decision maker, the senior program staff do a more rigorous
evaluation of all aspects of the problem, present their findings to the decision
maker and then work with the decision maker to specify constraints and to
develop proposed DQO’s for the data collection activity.
In those situations where regulations are already in place and the need
for compliance and enforcement monitoring is being defined, Stage I activities
may have been completed and the DQO process might be entered at Stage II. The
decision maker’s involvement is still important, and will focus on interpreting
the data needs specified implicitly or explicitly by the regulations and deter-
mining the level of uncertainty tolerable in enforcement and compliance data.
STAGE II: Step 1. Break Down the Decision into Decision Elements
— The senior program staff should identify all of the questions that
need to be answered to make the decision. The list will include, but
should go beyond questions identified by the decision maker in Stage I.

Answers to these questions will be referred to as “elements” of the
decision in the remainder of the document.
— Each of the elements of the decision should be classified in one of two
categories:
o elements that are dependent on environmental data;
o elements that are not dependent on environmental
data.
The senior program staff should now identify the “significant” elements
among those that are dependent on environmental data. The activities
above may reveal that certain of the data—dependent elements will
have a negligible contribution to the decision as compared to other
data— or non—data—dependent elements. A data—dependent element is
“significant” if it appears to be required for the decision.
— This step provides another opportunity for testing the assumption that
environmental data will be needed for the decision; it is the first
opportunity for the program staff to formally and rigorously address
the issue.
o If it is possible to identify significant elements that
depend on environmental data, then the assumption has been
tentatively confirmed and the DQO process should continue.
o If it proves difficult to identify significant elements that
depend on environmental data, then it may be appropriate to
conclude that environmental data are not needed and to consider
terminating the DQO process.
— From this point on, the DQO process will only address significant
elements of the decision that are dependent on environmental data.
STAGE II: Step 2. Specify the Environmental Data Needed
— Specify the data that will be needed for each significant data—depend-
ent element (i.e., the data needed to answer each question that requires
data). The data should be specified in terms of the variables (e.g.,
specific pollutant(s)) for which quantitative estimates are desired
and the matrix or medium in which they will be measured.
STAGE II: Step 3. Define the Domain of the Decision
— Define the “domain” to which the decision will apply. The domain is
that portion of the environment or physical system, delineated by
spatial and temporal boundaries, from which samples will be collected
and to which the decision will apply. The domain typically consists
of, and is restricted to, a particular medium (soil, air, etc.) or
group of objects (people, factories, storage tanks, etc.) about which
information is collected in order to arrive at some decision.
If the decision will apply to a different or larger domain than that
being considered in the study design, the decision will have to be
qualified, since the degree to which the results will be representative
of the larger domain is not quantifiable.
— Although it is recognized that this will be a first attempt at defining
the domain of the decision, the definition should be as detailed as
possible. The definition of the domain should contain the following
information:
o the definition must incorporate all of the important charac-
teristics that distinguish the areas, time periods, or
groups of objects that are part of the domain from those
that are not;
o the definition must specify the largest unit (area, time
period, or group of objects) that the data will be used to
represent and to which the decision will apply;
o the definition must specify the smallest unit (area, time
period, or object/group of objects) for which a separate
decision might be made;
o the definition must specify parts of the domain (e.g.,
distinct sub-areas, time intervals, or subgroups of objects)
that are of special interest or importance in making the
decision.
STAGE II: Step 4. Define the Result to Be Derived from the Environmental Data
— Define the result that will be derived through calculations or
operations performed with the environmental data:
o the result consists of analyzed environmental data used
in making the decision;
o the result will constitute an answer to the environmental
question first posed in Stage I, Step 3 and formally stated
in Stage II, Step 1.
— The definition of the result should include the following items:
o the statistic(s) that will be used to summarize the data
(e.g., mean, range, maximum);
o for compliance and enforcement programs, the standard,
“action level”, or other value to which the summary statistic
will be compared;

o for trends monitoring and research programs, the reference
value (if any) to which the summary statistic will be
compared (e.g., baseline values for detecting trends,
background or control values for detecting environmental
effects, or ambient concentrations of a pollutant that
might be of concern);
o if possible, a statement of rationale for the mathematical
or statistical procedures that will be used to derive the
result.
— Increasing the level of detail with which the result is defined will
improve the efficiency of later work in Stage III.
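As an illustration only, the derivation of a result as described in Step 4 can be sketched in a few lines of code. The pollutant concentrations, the choice of the mean as the summary statistic, and the action level below are all invented for this example; the document does not prescribe any particular statistic or software.

```python
# Hypothetical sketch of deriving a "result" (Stage II, Step 4): summarize
# raw measurements with a statistic and compare it to an action level.
# All values below are invented for illustration.

def derive_result(measurements, action_level):
    """Summarize the data with a mean and compare it to the action level."""
    mean = sum(measurements) / len(measurements)
    exceeds = mean > action_level
    return mean, exceeds

# Illustrative concentrations (e.g., mg/L) from a monitoring program.
samples = [4.2, 5.1, 3.8, 6.0, 4.9]
mean, exceeds = derive_result(samples, action_level=5.0)
print(f"mean = {mean:.2f}, exceeds action level: {exceeds}")
# prints: mean = 4.80, exceeds action level: False
```

The same comparison could just as well use a maximum or a percentile; the point is that the statistic and the reference value are fixed before data collection begins.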
STAGE II: Step 5. Statement of the Desired Performance
— The collection of environmental data always involves some error.
Error is an inherent characteristic of any sampling design, methods
used for sample collection or sample analysis and statistics employed
for raw data interpretation. With these potential sources of error
in mind, the senior program staff works with the decision maker to
establish limits on the total error that can be accepted in the
results of the data collection program, in order to be able to use
these results in the decision making process. This effort will build
on the Stage I, Step 4 description of potential effects of data
errors on the decision by establishing the acceptable probability of
such effects and the level of concern they would cause the decision
maker in making the decision.
— This step is referred to as establishing the desired performance of
the data collection program. Performance, as the term is used here,
refers to the likelihood (probability) that the data collected will
correctly and accurately reflect the environmental characteristics
being measured. Each design will have its own level of performance.
The measure of performance appropriate to most monitoring programs
is the probability of false positives (finding that a problem exists
when in fact it does not) and false negatives (finding that a problem
does not exist when in fact it does). The frequency with which false
positives and false negatives occur will be a function of the total
uncertainty associated with the result, based on the error contributed
by both the analytical and sampling portions of the design.
For example, if data are collected to determine compliance with some
discharge standard, the combined error associated with sampling of
the discharge from the facility (which varies in concentration in
time and space), and analyses of the substance in the laboratory (using
a method with imprecision and bias), may contribute to an incorrect
compliance determination. The data may indicate that the facility is
in compliance when it is really out of compliance, or that the facility
is out of compliance when it is really in compliance. If enough is known
about the variability of the discharge, a monitoring program can be
designed that will limit the likelihood of both false positives and
false negatives to acceptable levels.
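The discharge example above can be made concrete with a small Monte Carlo sketch. Everything here is assumed rather than taken from the document: the normal error model, the standard deviations, the true discharge levels, and the standard are all hypothetical, chosen only to show how combined sampling and analytical error produce false positive and false negative rates.

```python
# Hypothetical Monte Carlo sketch: how combined sampling and analytical
# error drive false positive/negative rates in a compliance determination.
# The error model, standard deviations, and levels are all invented.
import random

def simulate_error_rates(true_mean, standard, n_samples=4,
                         sampling_sd=1.0, analytical_sd=0.5,
                         trials=20_000, seed=1):
    """Estimate how often the sample mean lands on the wrong side of the standard."""
    rng = random.Random(seed)
    truly_out = true_mean > standard
    wrong = 0
    for _ in range(trials):
        # Each measurement = true level + sampling variability + analytical error.
        measurements = [true_mean + rng.gauss(0, sampling_sd)
                        + rng.gauss(0, analytical_sd) for _ in range(n_samples)]
        observed_out = sum(measurements) / n_samples > standard
        if observed_out != truly_out:
            wrong += 1
    kind = "false negative" if truly_out else "false positive"
    return kind, wrong / trials

# Facility truly in compliance: any "out" finding is a false positive.
print(simulate_error_rates(true_mean=9.0, standard=10.0))
# Facility truly out of compliance: any "in" finding is a false negative.
print(simulate_error_rates(true_mean=11.0, standard=10.0))
```

Increasing `n_samples` or reducing either error term shrinks both error rates, which is exactly the trade-off between design cost and uncertainty weighed in Stage III.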
— The following activities will lead to the specification of desired
performance.
o The program staff should describe situations for each data—
dependent element in which the error associated with the
environmental data collected might result in a false positive
or a false negative. The same degree of control on uncertainty
may not be required (or achievable) for all of the elements.
o Working with the decision maker, the senior program staff
should rank these situations according to the relative level
of concern that the actual occurrence of each would cause.
o The magnitude of concern over each false positive and false
negative situation should be considered in ranking how
important each situation is to the decision maker in making
the decision. While subjective to a certain degree, concern
is related to the potential effects of being in error. For
example, if data are collected to determine if a standard
has been exceeded and both the seriousness of the health
effect and size of the population affected go up with
increasing concentration of the regulated substance, the
amount of concern over false negatives would increase as
the magnitude of the potential exceedence increases.
— After the relative rankings have been established, the probability of
occurrence that would be acceptable for each situation should be
expressed quantitatively. Stating the desired performance in quanti-
tative probability terms is important in order to establish a goal
for designing the data collection program and to compare the perfor-
mance of potential alternative designs. It should be stressed that
values assigned at this time may need to be adjusted if no design can
be found that controls false positives and false negatives to the
desired levels under the existing constraints. If this happens, the
decision maker may consider the option of increasing resources, or
accepting greater chances of being in error than originally desired.
o It will be helpful to start with the two situations that would
cause the greatest and the least concern, and then “work to
the middle.” In other words, start by identifying the
situation that you feel should never occur (such as not detecting
a life-threatening level of a substance if it is there), and
assign an acceptable probability which reflects your level
of concern over this situation, such as 1 in 10^6 (one
chance in a million). Next, identify situations that would
cause very little concern if you miss them (minor, infrequent
exceedences of low magnitude) and assign acceptable
probabilities to these situations (e.g., 1 chance in 10 might be
acceptable, depending on the consequences of missing a
level just above the standard). Then proceed to assign
acceptable probabilities to all of the other situations,
checking to make sure that the assigned probabilities are
consistent with the order in which the situations were
ranked.
o The probability statements for each situation should then be
combined into a formal statement of the levels of uncertainty
that can be tolerated in the result (a separate statement
should be developed for each significant data—dependent
element). This formal statement might take the form of a
table which contains a listing of possible situations (false
positives and false negatives), along with the acceptable
probability associated with each. Whatever form the formal
statement takes, the information should be sufficient to
allow design experts in Stage III to understand the level
of uncertainty that can be tolerated in the result to be
derived from data. Results with a higher level of uncer-
tainty may not be of sufficient quality to support the
decision to be made.
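One hypothetical form of the tabular "formal statement" described above can be sketched as follows. The situations, error types, and acceptable probabilities are invented for illustration; a real statement would be developed per data-dependent element with the decision maker.

```python
# Hypothetical sketch of a formal performance statement (Stage II, Step 5):
# false positive / false negative situations with acceptable probabilities,
# ordered from greatest to least concern. All entries are invented.
performance_statement = [
    # (situation, error type, acceptable probability)
    ("miss a life-threatening exceedence", "false negative", 1e-6),
    ("miss a large exceedence of the standard", "false negative", 0.01),
    ("find an exceedence where none exists", "false positive", 0.05),
    ("miss a minor, infrequent exceedence", "false negative", 0.10),
]

# Consistency check: more serious situations (listed first) must carry
# smaller acceptable probabilities, matching the concern ranking.
probs = [p for _, _, p in performance_statement]
assert probs == sorted(probs), "probabilities out of order with ranking"

for situation, kind, p in performance_statement:
    print(f"{situation:45s} {kind:15s} acceptable P = {p}")
```

A table in this form gives the Stage III design experts an unambiguous uncertainty target for each situation.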
STAGE II: Step 6. Determine the Need for NEW Environmental Data
— Determine whether there are any existing environmental data that
could provide some of the information needed for the data—dependent
element. For example, the following questions could help begin the
search for existing data: Are data available on the appropriate
variables in the media of interest (identified in Stage II, Step 2)?
Were the available data taken from the locations of interest? Were
the available data taken during the season, or time period of interest?
In summary, do the available data adequately represent the domain of
the decision?
— Using the performance criteria developed in Step 5 above, determine
quantitatively whether the existing environmental data will be sufficient
for the decision.
o This should be done if QC data are available to assess
the level of uncertainty associated with the existing data.
If QC data do exist, one must determine whether the level
of uncertainty in the results derived from existing data is
within acceptable bounds. If an extensive quantitative
analysis is required to determine adequacy, then this
should be an early step performed by technical staff in
Stage III.
— Identify all of the new environmental data that will be needed in
order to provide information not provided by the existing data.

STAGE II: Step 7. Summary of Stage II Outputs: Statement of the Proposed DQO’s
— The purpose of this step is to ensure that sufficient documentation
of the results of Stages I and II is produced and provided to the
technical design experts who will be responsible for the bulk of Stage
III activities. This summary is, in the most complete sense, the
proposed Data Quality Objectives for the data collection activity.
It should contain all of the quantitative information required by
technical design experts to unambiguously proceed with developing and
evaluating alternative designs. Each of these designs should reflect
the quality of data specified by the decision maker to answer each
data dependent question.
— The summary should include final statements by the senior program
staff, approved by the decision maker, which include:
o A clear statement of the decision to be made.
o Initial estimates of the time, FTE’s and dollar resources
that can reasonably be made available for the data collection
effort.
o A description of the new environmental data to be collected
for each significant data dependent element stated in terms
of the variables that will be measured and the medium from
which samples will be collected.
o The domain of the decision including: parts of the domain
of particular interest to the decision maker and data
user(s); the smallest part of the domain for which a separate
result will be calculated; and the largest unit (area, time
period, or group of objects) which the data will be used to
represent and to which the decision will apply.
o The way in which the result for each data dependent element
will be stated, including the statistic(s) that will be
used to summarize the data, any standards, reference values
or action levels to which the statistic will be compared,
and a statement of the rationale for and specifics of the
mathematical and/or statistical procedures that will be used
to derive the result.
o A statement of the desired performance for each data depend-
ent element, which specifies in as much detail as possible
acceptable probabilities associated with false positive and
false negative situations of varying degrees of magnitude.
This statement should indicate the level of uncertainty
which can be tolerated to use the result in the decision
making process.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
RESEARCH AND DEVELOPMENT
SUBJECT: Individual Office Institutionalization of Data Quality
Objectives
FROM: Vaun A. Newill, Assistant Administrator
for Research and Development (RD-672)
Milton Russell, Assistant Administrator
for Policy, Planning, and Evaluation (PM-219)
TO: See Addressees Below
The Deputy Administrator’s memorandum states his intention that
individual Offices begin to institutionalize Data Quality Objectives
(DQO’s) for all significant environmental data collection activities.
Recently, we met with your program office Assistant Administrators, the
Administrator and Deputy Administrator to discuss the Deputy
Administrator’s memorandum. In this meeting, we concluded that we
should discuss with each of you the next steps in implementing the DQO
process.
We see these steps as follows:
o Introduce you and your senior program staff to:
— the DQO process,
— DQO’s as an important management tool,
— the value of DQO’s to your program, and
— the importance of your senior management’s direct
involvement and attention in developing your DQO’s,
and answer questions that arise regarding the DQO process
and its application.
o Exchange views on how we may best implement the Deputy
Administrator’s request for DQO institutionalization.
o Review candidate environmental data collection selections for
your office so that we may agree on a final selection.

o Identify lead individuals within each of your offices to:
- be the decision maker in DQO development, and
- coordinate information flow to you.
This individual should be a senior line manager responsible
for recommending to you alternative policies and designs for
carrying out the data collection effort.
o Identify our respective expectations of what a successful DQO
application will be and how success may be assessed.
We would like these lead individuals to define a set of interim
outputs and to establish scheduled dates for completing your DQO. A
key element in achieving the Deputy Administrator’s request is that we
agree to a mechanism for following DQO implementation progress and
surfacing problems as they arise for resolution.
We look forward to working with you on this ambitious and valuable
effort. We will be working with your office to arrange a date for this
important meeting.
Attachment (Barnes Memo, 11/14/86)
Addressees
Rebecca Hanmer, DAA, OW (WH-556)
Jack W. McGraw, DAA, OSWER (WH-562A)
Don R. Clay, DAA, OAR (ANR-443)
Victor J. Kimm, DAA, OPTS (TS-788)
Stanley L. Laskowski, DRA, Region III

cc: Thomas L. Adams, AA, OECM (LE-133)
Robert S. Cahill, AA, ORO (A-101)
David P. Ryan, Acting Comptroller (PM-225)
Sheldon Meyers, Director, ORP (ANR-458C)
Lewis Battist, QAO, ORP (ANR-461)
Gerald Emison, Director, OAQPS (MD-10)
Richard Rhoads, QAO, OAQPS (MD-14)
Henry L. Longest, II, Director, OERR (WH-548)
Duane Geuder, QAO, OERR (WH-548A)
Marcia Williams, Director, OSW (WH-562)
Florence Richardson, QAO, OSW (WH-562B)
Ronald Brand, Director, OUST (WH-562A)
Joseph Italiano, QAO, OUST (WH-562A)
Douglas Campt, Director, OPP (TS-766C)
Elizabeth Leovey, QAO, OPP (TS-769C)
Michael Cook, Director, ODW (WH-550)
Irwin Pomerantz, QAO, ODW (WH-550)
James Elder, Director, OWEP (EN-338)
Samuel To, QAO, OWEP (EN-338)
William Whittington, Director, OWRS (WH-551)
Martin Brossman, QAO, OWRS (WH-553)
Charles Spooner, Director, Chesapeake Bay Program
Charles Jones, Jr., QAO, Region III
Mary Blakeslee, QAR, OW (WH-556M)
William Houck, QAR, OAR (ANR-445)
Marylouise Uhlig, QAR, OPTS (TS-788)
Tom Pheiffer, QAR, OSWER (WH-562A)
Stanley Blacker, Director, QAMS (RD-680)
John Warren, OPPE-SPB (PM-223)

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
NOV 14 1986
OFFICE OF
THE ADMINISTRATOR
MEMORANDUM
SUBJECT: Agency Institutionalization of Data Quality Objectives
TO: See Below
The process of developing and implementing Data Quality
Objectives (DQO’s) has been underway in EPA since the Deputy
Administrator’s DQO policy memorandum to the AAs was issued in
May 1984. The DQO’s, one of the key elements of the Agency’s
quality assurance program, are explicit statements of the quality
of data needed to support a regulatory decision. As you know,
the DQO process is a key tool in making our extensive data
collection activities cost effective, but I believe that progress
in the institutionalization of the DQO process can be improved.
The value and benefits that can result from following the
three—stage DQO process are real and worth extra attention from
senior managers. Accordingly, I am:
o asking the Assistant Administrator for Research and
Development (the Agency lead for quality assurance)
and the Assistant Administrator for Policy, Planning
and Evaluation to work with each program office
Assistant Administrator to:
— assure that each program office understands
what the DQO process is and its utility,
— reach agreement with each appropriate
National Program Office and Region
prior to data collection on developing
DQO’s for their proposed major data col-
lection activities (see attachment), and
— develop a reasonable schedule for preparing
DQO’s for each major data collection activ-
ity;
o requesting that an appropriate staff person in each
AA’s (or RA’s) office be assigned the responsibility
for working with ORD, OPPE, and the program to assure
that DQO’s are prepared; and

o requesting that the AAs for ORD and OPPE periodically
discuss the status with each program AA on developing
their individual DQO’s, and for ORD to report each
quarter to me in writing on individual DQO progress.
The staffs of ORD and OPPE will be available to provide
technical assistance in understanding what should go on in
each step of the DQO process, but the development of the re-
quired DQO’s will remain the responsibility of the program
office. From time to time, I will ask that each program
office report on progress or problems in the biweekly ATS
meetings. Prior to the actual commitment of resources for
establishing major field data collection operations, I feel
it would be valuable if the DQO team (ORD, OPPE and the re-
sponsible program office) brief the AAs for ORD and OPPE on
the developed DQO’s.
The DQO concept is not new. It is simply the institution-
alization of sound management planning. I am convinced that
DQO’s should provide benefits both in cost—savings and improved
data quality. The steps that I have outlined here for these
selected environmental data collection activities should
facilitate the institutionalization of DQO’s for all ongoing
and future significant data collection activities. I am
looking forward to following your progress on this ambitious
and valuable effort.
A. James Barnes
Deputy Administrator
Attachment
Addressees:
J. Craig Potter, AA, OAR (ANR—443)
J. Winston Porter, AA, OSWER (WH-562A)
Lawrence J. Jensen, AA, OW (WH-556)
John A. Moore, AA, OPTS (TS—788)
James M. Seif, RA, Region III
Howard M. Messner, AA, OARM (PM—208)
Vaun A. Newill, AA, ORD (RD—672)
Milton Russell, AA, OPPE (PM—219)
cc: Francis S. Blake, General Counsel (LE-130)
Thomas L. Adams, AA, OECM (LE-133)
Robert S. Cahill, AA, ORO (A-101)
David P. Ryan, Acting Comptroller (PM-225)
Regional Administrators, Regions I-II & IV-X
Sheldon Meyers, Director, ORP (ANR-458C)
Lewis Battist, QAO, ORP (ANR-461)
Gerald Emison, Director, OAQPS (MD-10)
Richard Rhoads, QAO, OAQPS (MD-14)
Henry L. Longest, II, Director, OERR (WH-548)
Duane Geuder, QAO, OERR (WH-548A)
Marcia Williams, Director, OSW (WH-562)
Florence Richardson, QAO, OSW (WH-562B)
Ronald Brand, Director, OUST (WH-562A)
Joseph Italiano, QAO, OUST (WH-562A)
Douglas Campt, Director, OPP (TS-766C)
Elizabeth Leovey, QAO, OPP (TS-769C)
Michael Cook, Director, ODW (WH-550)
Irwin Pomerantz, QAO, ODW (WH-550)
James Elder, Director, OWEP (EN-338)
Samuel To, QAO, OWEP (EN-338)
William Whittington, Director, OWRS (WH-551)
Martin Brossman, QAO, OWRS (WH-553)
Charles Spooner, Dir., Chesapeake Bay Program
Charles Jones, Jr., QAO, Region III
Mary Blakeslee, QAR, OW (WH-556M)
William Houck, QAR, OAR (ANR-445)
Marylouise Uhlig, QAR, OPTS (TS-788)
Tom Pheiffer, QAR, OSWER (WH-562A)
Stanley Blacker, Director, QAMS (RD-680)

-------
Partial Listing of Major Agency Environmental Data Collection Programs
o Next Candidates for DQO’s
— Office of Air and Radiation
* Radon (Indoors)† [ORP]
* Air Toxics [OAQPS]
— Office of Solid Waste and Emergency Response
* RI/FS (Superfund) [OERR]
* Ground Water Monitoring at Owner/Operator
Facilities [OSW]
* Leaking Underground Storage (Exempt) Tanks† [OUST]
— Office of Pesticides and Toxic Substances
* Pesticides in Drinking Water [OPP/ODW/ORD]
— Office of Water
* Pesticides in Drinking Water [ODW/OPP/ORD]
* NPDES Organic Chemical Industries [OWEP]
* Basic Surface Water Monitoring [OWRS]
— Regions
* Chesapeake Bay Study [Region III]
— Office of Research and Development††
o How the above candidate projects were selected:
— Major Agency data collection activity ($, time, FTE’s)
— At or near the beginning of the project planning phase
— Represent each of the program offices and significant
environmental data collection projects
†DQO implementation underway through joint agreement among
Program, ORD, and OPPE.
††Because of the research nature of data collection activities
conducted by ORD, ORD is on an active parallel track in
implementing DQO’s.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
MAY 24 1984
OFFICE OF
THE ADMINISTRATOR
MEMORANDUM
SUBJECT: Data Quality Objectives
FROM: Alvin L. Alm
Deputy Administrator
TO: Assistant Administrators
On April 3, 1984, EPA Order 5360.1, “Policy and Program
Requirements to Implement the Quality Assurance Program” was
issued. In my accompanying memorandum of April 17, I identi-
fied two major steps that have to be taken in order to assure
the reliability of environmental measurements. One of these
steps requires a data user to specify the quality of data he
or she needs. In the Order, each National Program Manager is
responsible for establishing data quality acceptance criteria
(i.e., data quality objectives) for all of their projects
[4b(2)].
Defining data quality objectives will not be easy. They
need to be defined during the planning stage of any data collec-
tion activity. Otherwise, too little quality assurance will
cause data to be of insufficient quality, or too much quality
assurance will result in unnecessary costs. Your personal
involvement in defining data quality objectives is needed in
order to obtain the important policy perspective on how that
data needs to be used in the regulatory process. I request that
you be an active participant during the stages where policy
guidance will be crucial. I have attached a brief statement
describing data quality objectives in more detail.
The Quality Assurance Management Staff (QAMS) has been work-
ing with key individuals from the Air, Water, Toxics, Superfund,
and Research Offices to develop example data quality objectives.
These examples should be completed by July 1984 and will provide
a “blueprint” for developing data quality objectives for each
new and existing data collection activity. As a result of this
effort, several of the offices will have individuals quite knowl-
edgeable about how to define data quality objectives.

-------
While we are awaiting completion of the example data quality
objectives, I request that your staff work with QAMS to develop a
listing in priority order of those significant ongoing data collec-
tion activities for which data quality objectives need to be
defined. One basis for establishing priorities should be the
priority list in “Agency Operating Guidance——FY 1985—1986.” Many
of these items depend on analysis of environmental data. Once
the examples are completed and all of us have a clear understand-
ing of what is required, I request that your staff work with
QAMS to prepare by September 1984 a schedule for developing data
quality objectives for each of your significant ongoing environ-
mental data collection activities. Beginning in September 1984,
data quality objectives should be an integral part of each signif-
icant new data collection activity.
Quality assurance is an important mechanism for ensuring
that all EPA decisions can be supported by a sound data base.
Data quality objectives are a key element in that mechanism.
Your personal assistance in this undertaking is desired.
Attachment
bcc: AA/ORD RF
ORD PENDING FILE
OMSQA CHRON FILE
QAMS CHRON FILE
AUTHOR FILE
AX (3 COPIES)
Prepared by: RD-680/S.Blacker/rar/3100WSM/3825763
S.Blacker Disk #2

-------
QAMS STATEMENT ON DATA QUALITY OBJECTIVES
Using EPA Order 5360.1, “Policy and Program Requirements to
Implement the Mandatory Quality Assurance Program,” the Quality
Assurance Management Staff (QAMS) of ORD will require, as a
necessary element in preparing quality assurance (QA) project
plans for each major data collection effort, that data quality
objectives (DQO) be established. The responsibility for develop-
ing data quality objectives will reside with primary data users;
these are, in most cases, the headquarters offices responsible
for administering EPA programs. QAMS will depend on data quality
objectives as the tool to ensure that programs have clearly
defined before the fact the level of QA that must be included in
data collection efforts.
What are data quality objectives?
Data quality objectives are descriptors of the quality of
data needed to support a specific environmental decision or
action. Quantitative and qualitative descriptors of data quality
must be considered in order to determine whether data are appro-
priate for a particular application. The person or organization
that will use the data must decide what quality of data is needed
for the specific application intended.
Data quality objectives are target values for data quality
and are not necessarily criteria for the acceptance or rejection
of data. If data quality objectives are not met, it is still the
responsibility of the data user to consider the limitations of
the data and determine whether they may be used for the intended
purpose.
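The distinction drawn here, targets rather than automatic accept/reject gates, can be sketched in code. The Python fragment below is an illustrative sketch only, not an EPA-prescribed procedure; the function names, the duplicate-pair precision indicator (relative percent difference), and the example target values are assumptions chosen for demonstration.

```python
# Illustrative sketch only: hypothetical indicator names and targets,
# not an Agency-prescribed assessment procedure.
from statistics import mean

def relative_percent_difference(a, b):
    """Precision indicator for one pair of duplicate measurements (percent)."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def assess_against_dqos(duplicate_pairs, n_planned, n_valid,
                        target_rpd=20.0, target_completeness=90.0):
    """Compare achieved data quality with DQO targets.

    Reports the achieved indicators and whether each target was met.
    It deliberately does not accept or reject the data: that judgment
    remains with the data user.
    """
    achieved = {
        "mean_rpd": mean(relative_percent_difference(a, b)
                         for a, b in duplicate_pairs),
        "completeness_pct": 100.0 * n_valid / n_planned,
    }
    achieved["rpd_target_met"] = achieved["mean_rpd"] <= target_rpd
    achieved["completeness_target_met"] = (
        achieved["completeness_pct"] >= target_completeness)
    return achieved

# Example: two field-duplicate pairs; 19 of 20 planned samples were valid.
result = assess_against_dqos([(10.0, 10.5), (8.0, 8.4)],
                             n_planned=20, n_valid=19)
```

Even when a target is not met, the output only flags the shortfall; consistent with the statement above, the data user still decides whether the data may serve the intended purpose.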
What if data quality objectives are not developed?
Methods and procedures may be selected for a project without
consideration of data quality objectives. In some cases when
DQO’s are not used, data quality will exceed that required, causing
more resources to be spent than necessary. In other cases, data
quality may be less than that required and may be useless if the
information needed to characterize data quality was not obtained.
DQO’s are a starting point for cost—effective project design.
What are the responsibilities of QAMS and the Program Offices?
QAMS will develop guidance on data quality objectives with
the help of the Program and Regional offices. The guidance will
explain what data quality objectives are and will discuss the
types of descriptors available for the various data applications.
The guidance will include examples showing how DQO’s are developed

-------
for different media and programs. QAMS will also develop guidance
that specifies the technical materials that Program Offices need
to provide to the organizations responsible for data collection.
Employing this guidance, Program Offices will be responsible
for establishing data quality objectives for each of their major
monitoring programs. This will require a careful consideration
of what data are needed for each major program, why the data are
needed, and how the data will be used. Program Offices will also
be responsible for preparing the technical guidance required by
data collectors to produce data meeting the established data
quality objectives.
QAMS will perform program plan reviews and management audits
to assure that Program Offices establish and use data quality
objectives. QAMS will not evaluate the intended use of the data
or the appropriateness of the established data quality objectives.
QAMS will determine only whether data quality objectives have
been established, whether data collection programs have been
designed so the necessary descriptors of data quality will be
collected, and whether the data and associated descriptors of
quality have been collected in a manner consistent with the data
quality objectives.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
APR 17 1984
OFFICE OF
THE ADMINISTRATOR
MEMORANDUM
SUBJECT: EPA Order 5360.1, “Policy and Program Requirements to
Implement the Quality Assurance Program”
FROM: Alvin L. Alm
Deputy Administrator
TO: Addressees
One of my major goals is to ensure that all decisions by EPA can be
supported by a sound data base. An important step toward achieving this
objective is to require that quality assurance become an integral part of
all data collection activities. Quality assurance is the total integrated
program for assuring the reliability of environmental measurements and
consists of multiple steps undertaken to ensure that all acquired data
are suitable for the user’s intended purpose. Two of the major steps are:
the user must first specify the quality of data he needs; then the degree
of quality control necessary to assure that the resultant data satisfies his
specifications must be determined. Central to this process is assuring that
the data is of known quality. The quality of data is known when all components
associated with its derivation are thoroughly documented, such
documentation being verifiable and defensible.
In order to establish quality assurance solidly in all data collection
activities, the important step of issuing this order on quality assurance
is being taken. The implementation of the elements in this order will
require dedication and hard work by the Quality Assurance Management and
Special Studies Staff, by quality assurance officers throughout the Agency,
and by senior management. This order identifies the goals, objectives, and
general responsibilities of each program area. To carry out the order,
specific policy and technical guidance materials need to be prepared. I
will be following that progress.
The attached order reflects my commitment to the Agency’s QA program
and to the promotion of good science in all EPA monitoring and measurement
activities. Therefore, I expect that each of you work cooperatively to
ensure that the appropriate level of quality assurance is embedded in all
data collection undertaken by or for the Agency.
Attachment

-------
EPA ORDER 5360.1
APR 3 1984
POLICY AND PROGRAM REQUIREMENTS
TO IMPLEMENT THE MANDATORY QUALITY ASSURANCE PROGRAM
1. PURPOSE. This Order establishes policy and program requirements for the
conduct of quality assurance (QA) for all environmentally related measurements
performed by or for this Agency.
2. BACKGROUND. Agency policy requires participation in a centrally managed
QA program by all EPA organizational units supporting environmentally related
measurements. Under Delegation of Authority 1-41, “Mandatory Quality Assurance
Program” (dated 4/1/81), the Office of Research and Development (ORD) is the
focal point in the Agency for quality assurance policy and is responsible for
developing QA requirements and overseeing Agencywide implementation of the QA
program. ORD established the Quality Assurance Management and Special Studies
Staff (QAMSS) to serve as the central management authority for this program.
The QAMSS activities involve the development of policies and procedures; co-
ordination for and direction of the implementation of the Agency QA program;
and review, evaluation, and audit of program activities involving environmental
monitoring and other types of data generation.
The Agency QA program embraces many functions including: establishing QA policy
and guidelines for development of program and project operational plans; establishing
criteria and guidelines for assessing data quality; serving as a QA information focal
point; auditing to ascertain effectiveness of QA implementation; and identifying and
developing QA training programs.
3. GOALS AND POLICY. The primary goal of the QA program is to ensure that
all environmentally related measurements supported by the EPA produce data of
known quality. The quality of data is known when all components associated
with its derivation are thoroughly documented, such documentation being verifi-
able and defensible. It shall be the policy of all EPA organizational units to
ensure that data representing environmentally related measurements are of known
quality. Decisions by management rest on the quality of environmental data;
therefore, program managers shall be responsible for: 1) specifying the quality
of the data required from environmentally related measurements and 2) providing
sufficient resources to assure that an adequate level of QA is performed.
All routine or planned projects or tasks involving environmentally related
measurements shall be undertaken with an adequate QA project plan that specifies
data quality goals acceptable to the data user and assigns responsibility for
achieving these goals.
In discharging its responsibility for implementing the Agency-mandated Quality
Assurance Program, the ORD/QAMSS will strive for consensus by submitting proposed
policies and procedures for review to affected program offices and regions.
Responsibility for adjudication of unresolved issues, with respect to the above
and QAMSS-conducted audits, will be at the lowest level of authority consistent
with the scope of the issues. The QAMSS will refer issues which remain un-
resolved at lower levels of authority to the AA/ORD for decision, after con-
sultation with the appropriate AA or RA.

-------
The following activities are basic to the implementation of the QA program:
a. Preparation and annual update of a program plan based on guidelines
established by QAMSS.
b. Development of a QA project plan for all projects and tasks involving
environmentally related measurements in accordance with guidelines established
by QAMSS.
c. Assuring implementation of QA for all contracts and financial assistance
involving environmentally related measurements, as specified in applicable EPA
regulations, including subcontracts and subagreements.
d. Conducting audits (system, performance evaluations, data quality, bench,
etc.) on a scheduled basis of organizational units and projects involving environ-
mentally related measurements.
e. Developing and adopting technical guidelines for estimating data quality
in terms of precision (variability), bias (accuracy), representativeness,
completeness and comparability, as appropriate, and incorporating data quality
requirements in all projects and tasks involving environmentally related
measurements.
f. Establishing achievable data quality limits for methods cited in
regulations based on results of methods evaluations arising from the methods
standardization process, e.g., ASTM Standard D2777-77.
g. Implementation of corrective actions, based on audit results, and for
incorporating this process into the management accountability system.
h. Provision for appropriate training based on perceived needs, for all
levels of QA management, to assure that QA responsibilities and requirements are
understood at every stage of project implementation.
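The data quality descriptors named above (precision, bias, completeness) are commonly estimated from routine quality-control samples. The sketch below is a hypothetical illustration, not an Agency-defined method: the formulas (relative standard deviation for precision, mean percent recovery for bias) are conventional QA estimators, and all function names are assumptions of this sketch.

```python
# Hypothetical illustration of conventional QA estimators;
# not an Agency-prescribed method.
from statistics import mean, stdev

def precision_rsd(replicates):
    """Precision (variability) as relative standard deviation, in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

def bias_percent_recovery(measured_spikes, true_spike_value):
    """Bias (accuracy) as mean percent recovery of spiked samples."""
    return 100.0 * mean(measured_spikes) / true_spike_value

def completeness_pct(n_valid, n_planned):
    """Completeness: share of planned measurements that yielded valid data."""
    return 100.0 * n_valid / n_planned

# Example: triplicate analyses and three spike recoveries against a
# known spike concentration of 10.0.
rsd = precision_rsd([9.8, 10.2, 10.0])                    # about 2.0 percent
recovery = bias_percent_recovery([9.0, 9.5, 9.7], 10.0)   # about 94.0 percent
```

Such estimates supply the quantitative terms in which the DQO targets of item e can then be stated and audited.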
4. RESPONSIBILITIES.
a. In conformity with the oversight responsibility for the mandatory QA
program, the AA/ORD shall:
(1) Establish Agency policies and procedures for implementing the
mandatory QA program.
(2) Provide guidance for determining precision, bias, representativeness,
completeness, and comparability of data.
(3) Review QA Program Plans from Agency components involved in
environmentally related measurements.
(4) Conduct QA audits of all organizational units supporting environ-
mentally related measurements based on established audit criteria and procedures.

-------
(5) Recommend corrective actions, based on audit results, for inclusion
in the management accountability system.
(6) Establish achievable data quality limits for methods provided by ORD
for citation in regulations, based on results of methods evaluations arising from
the methods standardization process, e.g., ASTM Standard D2777-77, to help project
officers define data quality goals.
(7) Serve as the Agency QA information focal point.
(8) Develop generic training programs, based on perceived needs, for all
levels of management to assure that QA responsibilities and requirements are
understood at every stage of project implementation.
(9) Ensure that all ORD investigations involving data collection are
covered by an acceptable QA plan with resources adequate to accomplish program
objectives.
(10) Ensure that deficiencies highlighted in review of ORD program plans
or in audits of ORD components are appropriately addressed.
b. In accordance with policies and procedures established by AA/ORD,
National Program Managers shall:
(1) Ensure that QA is an identifiable activity with associated resources
adequate to accomplish program goals in the development and execution of all pro-
jects and tasks, both intramural and extramural, involving environmentally related
measurements.
(2) Ensure that appropriate QA criteria are included in operating guidance.
(3) Establish data quality acceptance criteria for all projects and tasks
conducted by the program office.
(4) Ensure that an adequate degree of auditing is performed to determine
compliance with QA requirements.
(5) Ensure that deficiencies highlighted in audits are appropriately
addressed.
(6) Ensure that all projects and tasks involving environmentally related
measurements are covered by an acceptable QA project plan and that the plan is
implemented.
(7) Identify program-specific QA training needs and provide for the
required QA training.
c. In accordance with policies and procedures established by AA/ORD, Regional
Administrators shall:

-------
(1) Ensure that QA is an identifiable activity with associated resources
adequate to accomplish program and regional goals in the development and execution
of all projects and tasks involving environmentally related measurements, both
intramural and extramural.
(2) Ensure that QA guidelines are specified for estimating data quality
in terms of precision, bias, representativeness, completeness, and comparability,
for all environmentally related measurements which meet the operating guidance
established by the program offices.
(3) Establish data quality acceptance criteria for all projects and tasks
initiated by the Region.
(4) Ensure that all projects and tasks involving environmentally
related measurements are covered by an acceptable QA project plan and that the
plan is implemented.
(5) Ensure that an adequate degree of auditing is performed to determine
compliance with QA requirements.
(6) Ensure that deficiencies highlighted in audits are corrected
expeditiously.
(7) Identify program-specific QA training needs and provide for the
required QA training.
d. The AA for Administration shall establish a mechanism for incorporating
QA in the Agency’s planning and budgeting cycle.
5. DEFINITIONS. The following terms have special meanings in relation to this
Order.
(a) Documentation. The use of documentary evidence; a written record
furnishing information that a procedure has been performed. When applied to
environmentally related measurements it includes all calculations related to
sampling design; all steps in the chain of custody, where appropriate; and
all notes and raw data generated in the sampling, analysis, or data validation
process.
(b) Defensible. The ability to withstand any reasonable challenge related
to veracity or truthfulness.
(c) Environmentally Related Measurement. Any laboratory or field data
gathering activity or investigation involving the determination of chemical,
physical, or biological factors related to the environment.
The following are representative examples of environmentally related
measurements. Data collection or investigation of chemical, physical, or bio-
logical factors for determination of:

-------
(1) pollutant concentrations from sources, in the ambient environment,
or pollutant transport and fate;
(2) response of organisms to pollutants;
(3) the effects of pollutants on human health and on the environment;
(4) risk/benefit analysis;
(5) environmental or economic impact;
(6) the environmental impact of cultural and natural processes;
(7) pollutant levels, exposure levels, etc., used in modeling.
(d) Organizational Unit. Any administrative entity (national program
office, regional office, ORD or NEIC laboratory) which engages in environmentally
related measurements.
(e) Project. An organized undertaking or specified unit of investigation
involving environmentally related measurements.
(f) Quality Assurance. The total integrated program for assuring the
reliability of monitoring and measurement data.
(g) Verifiable. The ability to prove or substantiate any claim or result
related to the documented record.
6. ADDITIONAL REFERENCE. This Order will be amplified by a detailed implemen-
tation plan.

Howard M. Messner
Assistant Administrator
Office of Administration and Resources Management

-------