Atmospheric Sciences Research Laboratory
  CHEMISTRY, MATHEMATICS, METEOROLOGY, MODELING, PHYSICS
             GUIDANCE FOR PREPARING

     STATEMENT OF PROJECT QUALITY OBJECTIVES
                  June 2, 1987
Office of Acid Deposition, Environmental Monitoring and Quality Assurance
            Office of Research and Development
            U.S. Environmental Protection Agency
           Research Triangle Park, North Carolina 27711

-------
ATMOSPHERIC SCIENCES RESEARCH LABORATORY
GUIDANCE FOR PREPARING
STATEMENT OF PROJECT QUALITY OBJECTIVES
Contract Number 68-02-4174
June 2, 1987
Submitted to:
Dr. Jack Durham, Project Officer
Atmospheric Sciences Research Laboratory (ASRL)
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
Submitted by:
Research and Evaluation Associates, Inc.
1030 15th Street, N.W., Suite 750
Washington, D.C. 20005
(202) 842-2200
727 Eastowne Drive, Suite 200A
Chapel Hill, N.C. 27514
(919) 493-1661

-------
NOTICE
THIS DOCUMENT IS A PRELIMINARY DRAFT. It has not
been formally released by the U.S. EPA/ASRL
Project Officer. It is being circulated for
comment on its technical accuracy and policy
implications.

-------
DISCLAIMER
"Although the procedure described in this report
has been funded by the United States Environmental
Protection Agency through Contract Number
68-02-4174 to Research and Evaluation Associates,
Inc., it has not been subjected to Agency review
and therefore does not necessarily reflect the
views of the Agency and no official endorsement
should be inferred."
Atmospheric Sciences Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
ii

-------
ACKNOWLEDGEMENT
The work on this project was performed by the QA/QC staff of
Research and Evaluation Associates, Inc. Mr. Don Cox, Project Leader,
and Ms. Sharron Rogers contributed to the project. As the external
reviewers of DQOs, we are fortunate to have had the support and advice
of numerous individuals who know the problems of research project
management, data collection, analyses, reporting, and quality control.
Dr. Jack Durham and Mr. Ron Patterson of U.S. EPA's Atmospheric
Sciences Research Laboratory, Research Triangle Park, NC, directed our
efforts and provided technical advice. Ms. Brenda White graciously
typed this document and was willing to make our numerous revisions.
iii

-------
ABSTRACT
This document presents a procedure for preparing a Statement of
Project Quality Objectives (SPQO). The procedure involves a dialogue
among clients, program/project managers, and technical implementers.
The SPQO evolves from the three stages of the Data Quality Objective
(DQO) Development Process. This guidance document is intended to
guide ASRL Project Managers and Project Officers in developing a
personalized process for gathering and conveying this information.
Supporting checklists are provided in appendices.
iv

-------
LIST OF ABBREVIATIONS
ADOM    - Acid Deposition and Oxidant Model
AES     - Atmospheric Environment Service
AMS     - American Meteorological Society
APIOS   - Acidic Precipitation in Ontario Study
ASRL    - Atmospheric Sciences Research Laboratory
CAPMoN  - Canadian Air and Precipitation Monitoring Network
CV      - Coefficient of Variation
DQO     - Data Quality Objectives
EPA     - Environmental Protection Agency
EPRI    - Electric Power Research Institute
Hi-Vol  - High-Volume
HQ      - Headquarters (EPA)
ME      - Model Evaluation
MOI     - Memorandum of Intent
NAPAP   - National Acid Precipitation Assessment Program
NCAR    - National Center for Atmospheric Research
NTN     - National Trends Network
OEN     - Operational Evaluation Network
OME     - Ontario Ministry of the Environment
PCA     - Principal Component Analysis
PPA     - Planned Program Accomplishments
PI      - Principal Investigator
PM      - Program Manager
PO      - Project Officer
PQO     - Project Quality Objectives
QA      - Quality Assurance
QAMP    - Quality Assurance Management Plan
QAMS    - Quality Assurance Management Staff
QAO     - Quality Assurance Officer
QAPjP   - Quality Assurance Project Plan
QC      - Quality Control
v

-------
RADM    - Regional Acid Deposition Model
RFP     - Request for Proposal
SOW     - Statement of Work
SPQO    - Statement of Project Quality Objectives
SD      - Standard Deviation
TFR     - Transition-Flow Reactor
TO      - Task Officer
vi

-------
TABLE OF CONTENTS
                                                                 Page
Notice . . . . . . . . . . . . . . . . . . . . . . . . . . . .      i
Disclaimer . . . . . . . . . . . . . . . . . . . . . . . . . .     ii
Acknowledgement  . . . . . . . . . . . . . . . . . . . . . . .    iii
Abstract . . . . . . . . . . . . . . . . . . . . . . . . . . .     iv
List of Abbreviations  . . . . . . . . . . . . . . . . . . . .      v
Introduction . . . . . . . . . . . . . . . . . . . . . . . . .      1
The SPQO Development Process . . . . . . . . . . . . . . . . .      3
Steps in Preparing a SPQO  . . . . . . . . . . . . . . . . . .      6
  General Background . . . . . . . . . . . . . . . . . . . . .      6
  Statement of the Problem . . . . . . . . . . . . . . . . . .      8
  Potential Application of the Product . . . . . . . . . . . .      9
  Constraints of Time and Deliverables . . . . . . . . . . . .      9
  Constraints of Budget  . . . . . . . . . . . . . . . . . . .     10
  Alternatives and Selection of Approach . . . . . . . . . . .     11
  Data Quality Statement . . . . . . . . . . . . . . . . . . .     11
  Concurrences . . . . . . . . . . . . . . . . . . . . . . . .     12
Appendices A - C
vii

-------
LIST OF TABLES
Table                                                            Page
1. SPQO Responsibilities in Relation to DQO Process  . . . . .      3
2. Initial SPQO Checks . . . . . . . . . . . . . . . . . . . .      5
3. Options Available to ASRL Director  . . . . . . . . . . . .      5
4. Sample SPQO Format and Corresponding Roles  . . . . . . . .      7
viii

-------
LIST OF FIGURES
Figure                                                           Page
1. Interaction of DQO Stages and SPQO Development Process  . .      2
ix

-------
GUIDANCE FOR PREPARING
A STATEMENT OF PROJECT QUALITY OBJECTIVES
INTRODUCTION
The primary purpose of the ASRL Statement of Project Quality
Objectives (SPQO) is to provide a logical framework for defining the
quality of data and the uncertainty inherent in the products to be
produced by a research project needed to support policy and regulatory
decision making. Development of the ASRL SPQO supports the Quality
Assurance Management Staff's development process for Data Quality
Objectives (DQO). Figure 1 illustrates the three-stage DQO process in
relation to ASRL's SPQO development process. A checklist is provided
in Appendix A to facilitate preparation of the SPQO.
A Statement of Project Quality Objectives will provide:
. General background about the national policy need for the
  research project
. Statement of the problem that the research project is to
  solve, as seen by the client
. Description of the product uses/users
. Client's constraints of time, deliverables, and budget
. Discussion of alternatives and selection of research approach
. Data quality objectives
. Recommendations and approvals.
The SPQO provides the ASRL Program Manager with measurement
standards to assess the quality of data produced during the conduct of
the resulting research program. In addition to agency management uses,
the completed SPQO, subsequent statement of work, and other quality
assurance support materials can be used by organizations responding to
a subsequent Request for Proposal (RFP) or proposing for a Cooperative
Agreement. The SPQO can provide critical information for development
of a work plan and quality assurance plan for the research project.
1

-------
[Figure 1: two parallel columns relate the QAMS DQO Development Process
stages to the ASRL SPQO Development Process.]
QAMS DQO Development Process stages:
  I.   Initial input by Decision Maker
  II.  Clarification of the problem
  III. Development of alternatives; selection of the approach to be used
       Complete DQO
ASRL SPQO Development Process:
  Gathering information; characterize project quality
  Formulate the problem; define project objectives; identify use and users
  Develop alternatives, approaches, and initial DQOs
  Define application of product, limitations, opportunities, goals,
    and deliverables
  Develop project quality objectives
  Define constraints on budget and deliverables
  Prepare draft of SPQO
  Selection of alternatives, approach, and DQOs
  Complete SPQO
Figure 1. Interaction of DQO Stages and SPQO Development Process
2

-------
THE SPQO DEVELOPMENT PROCESS
The SPQO is developed in an interactive process involving the
client, HQ PM, and ASRL Program Manager (ASRL PM), with support from the
Project Manager, Project Officer (PO), Quality Assurance Officer
(QAO), and technical staff members (Table 1). The ASRL Program
Manager is responsible for the execution of technical and management
functions for development of the SPQO with the approval of the ASRL
Laboratory Director and Division Director. Performance of these
functions depends on 1) the status of planning at the
client/decision maker levels and 2) the completeness of information
given in the early stages of the DQO development process. The quality
and quantity of the background information provided in Stages I and II
of the DQO process by the client/decision maker influence the selection
of applicable project quality objectives and the approach to the
project.
TABLE 1. SPQO RESPONSIBILITIES IN RELATION TO DQO PROCESS

DQO    Stage               Principal                    Interactions
Stage  Description         Responsibility
I      Initial input by    HQ PM                        Client, HQ PM, ASRL
       Decision Maker                                   Lab Director and PM
II     Clarification of    ASRL PM                      HQ PM, ASRL PM,
       the problem                                      ASRL Division Director
III    Development of      ASRL Lab Director,           ASRL Project Manager,
       alternatives and    Division Director,           Principal Investigator
       selection of        ASRL PM, and
       approach to be      Project Manager
       used
3

-------
To ensure an acceptable SPQO, the ASRL Program Manager must
perform the following five tasks during development of the SPQO:
. Establish the client's quantitative project objectives
. Identify the users and uses of the product to be produced
. Identify the resources and/or technical requirements needed
  to produce the product
. Identify the duration of the project
. Identify the approach suggested by the client.
To assist the ASRL PM and supporting technical and planning
staff, questions relating to the collection of initial information for
the SPQO process are given in Appendix A. The objective of these
questions is to aid the information-gathering process and identify
where further information is needed.
The ASRL PM initially characterizes the overall project quality
specifications and identifies alternative research approaches that could
be taken to meet client or decision maker needs. The recommended
approach should be the most cost-effective one that can ensure an
acceptable level of performance by an as yet unspecified Principal
Investigator. A descriptive statement of the research problem to be
addressed and a draft statement of work are written into an incomplete
draft of the SPQO and approved by the ASRL Director and HQ PM.
Questions relating to this process are found in Appendix A, Part 2,
Sections I and II, and a checklist is given in Table 2.
The ASRL PM next develops specific project quality objectives
in terms of accuracy, precision, representativeness, and completeness.
After an evaluation of the project quality objectives and alternative
approaches that can be taken, the ASRL Laboratory Director selects one
of the following options (Table 3) to be used by the PM.
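As an illustration only (not part of the ASRL procedure), the sketch below
shows one way the four indicators named above might be recorded as
quantitative targets and checked against values reported for a project;
the threshold numbers, variable names, and function name are hypothetical.

    # Illustrative sketch only (hypothetical thresholds): recording project
    # quality objectives as quantitative targets and checking reported values.
    TARGETS = {
        "accuracy_pct": 10.0,       # max percent difference from a reference value
        "precision_pct": 10.0,      # max relative standard deviation of replicates
        "completeness_pct": 90.0,   # min percent of planned valid observations obtained
        # Representativeness is usually stated qualitatively (e.g., siting criteria met).
    }

    def meets_quantitative_targets(accuracy_pct, precision_pct, completeness_pct):
        """Return True if reported indicator values satisfy the hypothetical targets."""
        return (abs(accuracy_pct) <= TARGETS["accuracy_pct"]
                and precision_pct <= TARGETS["precision_pct"]
                and completeness_pct >= TARGETS["completeness_pct"])

    # Example: 8% accuracy difference, 6% precision, 93% completeness -> True
    print(meets_quantitative_targets(8.0, 6.0, 93.0))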
4

-------
TABLE 2. INITIAL SPQO CHECKS
I. Client's descriptions of:
   A. General background
   B. Project background
   C. Product uses
   D. Product users
   E. Qualitative product quality.
II. Client information included:
   A. Statement of the problem
   B. Sources of error
   C. Quantitative project quality objectives
   D. Project restrictions
   E. Linkages to other projects.
III. Is it apparent that:
   A. Planning staff participated in the development of the above
   B. Project objectives can be implemented
   C. Project use and/or needs are clearly identified by the client?
TABLE 3. OPTIONS AVAILABLE TO ASRL DIRECTOR
Option I:   The Program Manager defines the approach that will
            be taken and the project quality objectives that
            the PI must achieve during the project.
Option II:  The PM provides the basic project quality
            objectives and alternative research approaches
            from which the PI must select. The PI develops
            the data quality objectives, which are
            incorporated into the ASRL SPQO.
Option III: The PM provides the SPQO and requires the PI to
            respond with the approaches and project
            quality objectives.
If Option III is followed, the ASRL PM and Division Director
review the PI's alternatives to the research approach, when received,
and select the research approach that best meets the client's or
decision maker's needs. No matter which option is selected, the
product of this process is the SPQO.
5

-------
STEPS IN PREPARING A SPQO
A step-by-step process is described to provide guidance for
writing a statement of project quality objectives, with emphasis on the
respective roles of the client and the proposed implementer of the
anticipated research project. The SPQO format and the corresponding
client and implementer roles are given in Table 4. This suggested
format is designed to produce a document that will set project
performance measurements and provide a planning tool for the PI to use
in developing the Quality Assurance Project Plan (QAPjP). In Appendix
B, a discussion of the relationship between project quality objectives
and the QC criteria necessary for project implementation is provided.
Preparation of the SPQO for a new project/task is a highly
individualized process. The format and examples given are presented
to assist the writer of the statement in providing the detail and
content necessary for production of an acceptable statement.
Throughout the following narrative, reference will be made to a Data
Quality Objective Statement for an existing project, "Deploy and
Operate a Daily Surface Monitoring Network," written following the
SPQO format. This example is presented in its entirety in Appendix C.
General Background
Typically, this section will describe to the project implementer
the background leading to the need for the product. This description
is written from the client's perspective on the issues and policies
requiring the product. This section presents the objectives for the
entire project in the broadest terms, as seen by the client or
decision maker. Usually the client's written description, the NAPAP
objectives, or EPA's work plan provides a good starting point. In some
cases, EPA's PPAs, peer review, or workshop documentation are
excellent sources of material for this background. A written
statement by the decision maker or client should be incorporated into
this section.
6

-------
TABLE 4. SAMPLE SPQO FORMAT AND CORRESPONDING ROLES
. GENERAL BACKGROUND about the source of the need for the products.
  Client: Provides a perspective on the need for the product.
  Written by: HQ PM.
. STATEMENT OF THE PROBLEM and quantitative objectives that will
  lead to a product or products needed by the client.
  Client: Provides an overview of the problem to be solved; lists
  uses and users of the product.
  Written by: HQ PM, but may be modified or compiled from
  interviews by the ASRL PM.
. POTENTIAL APPLICATION OF THE PRODUCT and potential users of
  results of the project.
  Client: Provides a description of the user's anticipated
  product application and conclusions to be based on uses
  of the product, and states deliverables.
  Written by: HQ Program Office, HQ PM, and ASRL PM.
. CONSTRAINTS OF TIME AND DELIVERABLES to serve as an indicator of
  satisfactory performance to the client.
  Client: Provides milestones and deliverables.
  HQ PM: Provides HQ milestones, deliverables, and decision points.
  ASRL PM: Provides information on time constraints and
  deliverables to the client and HQ for approval.
  Written by: ASRL PM.
. CONSTRAINTS OF BUDGET set by the client to determine and assign
  the resources required to reach the user's objectives.
  Client: Provides funding information and limitations.
  HQ PM: Provides allocation of funds.
  ASRL PM: Provides information on budget constraints and specific
  budget restrictions.
  Written by: ASRL PM.
. DISCUSSION OF ALTERNATIVES AND SELECTION OF APPROACH describes
  the alternatives to the client's approach and the selection of
  the approach to be used.
  ASRL PM: Provides the alternatives to the client's approach and
  recommends the approach to be taken.
  ASRL Lab Director: Selects which approach meets client needs.
  Written by: ASRL PM.
. DATA QUALITY STATEMENT by the implementer of the project that
  describes the project quality objectives required to meet the
  client's, users', and implementer's needs.
  ASRL PM: Provides specific data quality requirements for the project.
  PI: Provides detailed quality specifications for the project.
  Written by: ASRL PM.
7

-------
Appendix C illustrates how both a general and a specific background
statement can be made. Most of the information was obtained from the
client, EPA's PPAs, and workshop documentation by the HQ PM. Using
this initial input from the client, the ASRL PM will be able to identify
and write material answering these questions: Why is the product
needed? Who will be the users? What does the client need as a
product?
Statement of the Problem
This section is designed to give the reader of the statement,
i.e., the implementer of the project, sufficient information about
what is needed by the client. The HQ PM and staff provide a
"Statement of the Problem" and the quantitative project objectives.
The level of detail regarding the product that can be discussed at any
given phase of project definition will depend on how well the
information is gathered and how well the problem has been addressed by
the client. Specifically, the client should identify the following
points.
. Sources of error or the acceptable uncertainty
. Primary and secondary uses of the product
. Primary and secondary users of the product
. Problem(s) to be solved by the project
. Quantitative objectives
. Restrictions that apply
. Linkages to other projects or products.
In Appendix A (Part 2, Section II), a list of questions is
provided to assist in defining the content and writing of the
statement of the problem. Using the questions as a checklist, a
simple or complex statement of the problem can be drafted. Detailed
points addressed in this section include:
. Project objectives in specific terms
. Product quality
. Reference to documents containing the scope of work to be
  performed.
8

-------
The background statement identifies the basic parameters of
interest to the users and the client. In the example presented in
Appendix C, the HQ PM provided this level of detail based upon
documents produced by two workshops held by the client. At this
stage, the HQ PM is writing from the client's perspective to identify:
1) what type of information is needed, 2) why this information is
needed, 3) the criteria and specifications for the product, and 4)
what type of products are being produced from the project outputs.
Identification by the HQ PM of the fitness of the data, the intended
use of the products, and the quantitative objectives/goals of the
project will lead to project quality objectives that can be
quantitatively measured and/or assessed. Fuzzy or qualitative
objectives/goals often lead to non-quantitative project quality
objectives. In stating the client's and users' goals, remember that
it is the HQ PM's responsibility to state the level of quality needed
for the intended use of the data and to lead the Principal
Investigator to the expected precision and accuracy of each goal.
Potential Application of the Product
The HQ PM and HQ technical staff next provide a Statement of the
Potential Application of the Product(s) of the project. One or more
statements can be given to define the conclusions and/or uses that
will be made from the product of the research. If this information
has been adequately stated, the following can be more fully detailed
by the ASRL PM.
. Anticipated application of by-products from the project.
. Conclusions to be based on the product.
. Milestones or project deliverables for other projects
  depending on this project.
. Impact of project deliverables on the project products.
9

-------
Constraints of Time and Deliverables
An example of a Constraints of Time and Deliverables section is provided
in Appendix C. The HQ PM provides the information that identifies,
clarifies, and/or defines the critical constraints on the products
needed by the client. Questions 2 and 3 in Part 2, Section IV of
Appendix A provide the basic framework for this section. At this
point, the ASRL Program Manager, with other ASRL staff, should be able
to prepare a complete statement of work for the anticipated research
effort. Major points addressed in this section are:
. Points for project "go" or "no go" decisions.
. Constraints imposed on scientific effort, data collection,
  time frame, and deliverables.
. Trade-offs that are acceptable and the resources available.
. Basic project quality requirements.
Constraints of Budget
As in every project, the Constraints of Budget for the project
must be clearly specified. These constraints can take the form of
detailing the allocation of funds to the eventual Principal
Investigator for specific components of the project effort, e.g.,
start-up costs and QA/QC program costs. In some cases, the ASRL PM
may base financial decisions on the lifetime of the project and on
the quality of the work to be produced. Specific points to be
addressed are:
Planned budget for:
  Lifetime of the project
  First year start-up cost
  First year operational cost
  First year QA/QC cost
10

-------
Budget breakdowns for:
  Project work plan
  Peer review
  Training
  QA/QC activities
  Financial report.
The following format is an example which could be used:

Constraints of Budget
FY:                 87   88   89   90   91   92   93   94
IN-HOUSE
  Manpower
  Travel, $K
  (Fill in as required)
EX-HOUSE
  Modeling, $K
  Fld Meas., $K

Likewise, the following statement could be used as appropriate:
NOTE: Planning budgets are confidential. This section will be
completed upon award and negotiation of revisions to this SPQO.
Alternatives and Selection of Approach
This section, written by the ASRL PM, describes in detail the
alternatives to the client's approach and justifies selection of the
specific approach to be used. Appendix C (pp 18-22) provides an
example of this section. Appendix A (Part 2, Section IV, question 2)
may provide assistance through a list of points to be checked by the
writer of the statement. Because of the detail of these points, they
will not be repeated here.
11

-------
Data Quality Statement
This section is based upon negotiations between the client, HQ
staff, ASRL staff, and the PI. The statement is usually written by
the ASRL PM, Project Manager, Project Officer, and/or the PI and is
based upon the work plan for the project. If Option III has been
selected by the ASRL Laboratory Director (Table 3), the section on
Project Quality will be completed upon approval of the quality
specifications submitted by the Principal Investigator. Appendix A
(Part 2, Section VII, question 2) provides a list that can be used as
guidance to write this section. In Appendix C (pp 22-24), a detailed
example of a specific Data Quality Statement is given.
Concurrences
The complexity and dynamic nature of ASRL research projects
demand the following concurrences on the content of the statement.

Recommended by ASRL                 Concurrences
                                    (Signature)          (Date)
Project Officer:                    Project Manager:
Branch Chief:                       QA Officer:
Division Director:                  Laboratory Director:
12

-------
A

-------
Appendix A
Checklist for Statement of Project Quality Objectives
THIS CHECKLIST IS DESIGNED TO AID IN COMPILATION AND PRESENTATION OF
INFORMATION. COMPLETION OF THIS CHECKLIST PERMITS ASRL MANAGERS TO
ASSESS THE STATUS AND PROBLEMS IN DEVELOPING THE SPQO. IN PART 2,
SECTIONS OF THIS CHECKLIST CORRESPOND TO SECTIONS OF A SUGGESTED
FORMAT FOR A WRITTEN STATEMENT OF PROJECT QUALITY OBJECTIVES.
Questions in the following materials are assigned one of two ratings,
"G" or "R". A question rated "G" for guidance is provided to
assist or to provide guidance to the Project Officer during develop-
ment of the DQO. It will not necessarily be used in evaluation of the
completed process. A question rated "R" for recommended strongly
addresses an issue or process that the Project Officer should complete
during this stage of developing the DQO, inasmuch as such information
will be necessary to demonstrate the quality of the project.
IDENTIFICATION
Program name:
Project name:
Provide names for:
a) Client
b) Decision Maker
c) Program Manager (HQ)
d) Project Manager
e) Project Officer
f) Quality Assurance Officer
g) Principal Investigator (PI)
Title of Statement:
Date:
Revision 0
Revision 1
Revision 2
13

-------
Part 1 - DQO Development Process
I. INFORMATION PROVIDED BY HQ PROGRAM MANAGER
A. Information about the Client
   1. Has the client been identified?
      a) Name
      b) Agency
      c) Office
   2. Has the decision maker been identified?
      a) Name
      b) Agency
      c) Office
B. Information from the Client provided by HQs
   1. Have the following been given or identified by the
      client or decision maker?
      a) Planned Program Accomplishments (PPAs)?
         1) Goals?
         2) Rationale?
         3) Resources?
         4) Description?
         5) Deliverables?
         6) Qualitative statements of use?
         7) Program Quality Objectives?
      b) The decision to be made or product needed?
      c) The level of uncertainty that is acceptable to the
         decision maker?
      d) The amount of time available?
      e) The level of resources available?
      f) Intended use of the product?
      g) Users of the product?
      h) Why the product is needed?
      i) Background information?
   2. Has a statement of the quality of the product been
      given?
      a) In qualitative form?
      b) In quantitative form?
      c) As a hypothesis?
   3. Has a written statement (qualitative) of the problem
      been:
      a) Described in sufficient detail to implement?
      b) Approved by the:
         1) Client?
         2) Decision Maker?
   4. Has the decision maker or client background information
      on the context of the problem been given?
II. Clearances
A. Clearances other than ASRL
   1. Has the decision maker or client reviewed and commented
      on:
      a) Statement of problem and decisions to be made?
      b) Level of uncertainty or quality goals?
      c) Use and users of product?
      d) Background information?
14

-------
   2. Has decision maker or client provided program quality
      objectives?
   3. Has the HQs' technical staff reviewed and approved the
      QA Project Plan?
   4. Has the client approved both the work plan and
      Quality Assurance Project Plan?
15

-------
Part 2. Statement of Project Quality Objectives
I. CLIENT'S DESCRIPTION OF THE GENERAL BACKGROUND
   1. Has information been provided/acquired and a statement
      written to address:
      a) Key policy questions?                                     R( )
      b) National goals that will be impacted by
         the product of this project?                              R( )
   2. Has a description of the general background been provided:
      a) Project background (including source of
         the need for the products)?                               R( )
      b) Product uses clearly identified?                          R( )
      c) Summary of project needs and objectives
         clear to reader?                                          G( )
      d) Basic qualitative statement of quality
         given?                                                    G( )
      e) Interrelated projects given as reference?                 G( )
      f) Description written by decision maker
         (client)?                                                 G( )
II. CLIENT'S STATEMENT OF THE PROBLEM AND PROJECT OBJECTIVES
   1. Has the client provided or has project management staff
      written:
      a) General statement of the problem?                         R( )
      b) Statement on sources of error or
         acceptable uncertainty?                                   R( )
      c) Clear indication of who are the decision
         maker and potential data users?                           R( )
      d) Clear identification of secondary user(s)
         and use(s)?                                               G( )
      e) Sufficient information on what is needed
         from the Project Manager/Project Officer?                 R( )
      f) Statement of the problem to be solved
         by the project given?                                     R( )
   2. Has the client or project management staff provided
      a statement of the project objectives in terms
      that the Principal Investigator can utilize to
      provide for a quality product:
      a) Level of detail sufficient to determine
         quantitative objectives?                                  R( )
      b) Critical objectives identified by the
         writer (client)?                                          R( )
      c) Intended use(s) of product given?                         R( )
   3. Have the restrictions that apply to this project
      been identified and/or described by the:
      a) Client?                                                   R( )
      b) Program Manager?                                          R( )
      c) Laboratory Director?                                      R( )
   4. Has an identification of primary and/or secondary
      user(s) needs been made and given in sufficient detail
      to address:
      a) List of primary users' needs?                             G( )
      b) List of secondary users' needs?                           G( )
16

-------
   5. Are the following clearly understood:
      a) Linkages to other projects?                               G( )
      b) Dependency of other projects on this
         project's products?                                       G( )
   6. Is a statement of product quality provided to which
      the Principal Investigator can/must respond in:
      a) Project Work Plan?                                        R( )
      b) Project Quality Assurance Plan?                           R( )
   7. Is specific reference provided to support documents
      containing product quality requirements or
      criteria?                                                    G( )
   8. Has a specific statement of the problem that the
      Principal Investigator must respond to in
      the work plan been provided/written?                         R( )
   9. Has specific reference been made to documents
      containing the scope of work to be completed?                G( )
   10. Is it apparent that the decision maker and
       planning staff participated in the development
       of this section of the project quality statement?           R( )
III. CLIENT'S DESCRIPTION OF THE APPLICATION (USE) OF THE PRODUCT
   1. Is the description of the application of
      the product:
      a) Clear?                                                    R( )
      b) In sufficient detail to be implemented?                   R( )
   2. Are statements defining the conclusions of the
      product:
      a) Clear?                                                    R( )
      b) In sufficient detail to be implemented?                   R( )
   3. Are limitations, opportunities, or options for any
      by-products of this project specified?                       G( )
   4. Are goals for application given in quantitative
      terms?                                                       G( )
   5. Have milestones or deliverables for other projects/
      tasks depending on this project been identified?             R( )
   6. Has a clear statement of how the project deliverables
      will impact on the project products been given?              R( )
      a) Are there critical inputs?                                R( )
   7. Is it apparent that the decision maker, planning
      staff, and user(s) participated in the development
      of this section of the project quality statement?            R( )
IV. CLIENT'S CONSTRAINTS OF TIME AND DELIVERABLES
   1. Are decision points clearly identified?                      R( )
   2. Do the statements of deliverables support the
      initial client/users' needs?                                 R( )
   3. Are the constraints imposed reasonable for:
      a) Scientific effort?                                        R( )
      b) Data collected?                                           R( )
      c) Products to be delivered?                                 R( )
      d) Time frame to be completed within?                        R( )
17

-------
   4. Is there sufficient information to draw
      conclusions about what is needed, what
      resources are available, and what tradeoffs
      are acceptable?                                              R( )
   5. Can a reader of the project quality statement
      at this level of detail determine what can be
      provided as a quality product from the information
      provided?                                                    G( )
   6. Will the statements derived from Stages I and
      II (QAMS DQO Development) give the Principal
      Investigator enough information in Stage III to
      write an approach to the client's needs/problems?            G( )
   7. Is it apparent that the decision maker and planning
      staff participated in the development of this
      section of the project quality statement?                    R( )
V. CLIENT'S CONSTRAINTS OF BUDGET
   1. Are the constraints of the project budget clearly
      stated?                                                      R( )
   2. Is there sufficient information to draw
      conclusions about what tradeoffs are acceptable?             R( )
   3. Are financial decision points clearly identified?            R( )
   4. Has a planned budget been given for:
      a) Lifetime of project?                                      R( )
      b) First year start-up cost?                                 R( )
      c) First year operational cost?                              R( )
   5. Does the budget breakdown include:
      a) Project work plan costs?                                  G( )
      b) Peer review costs?                                        G( )
      c) Training costs?                                           G( )
      d) Financial report costs?                                   G( )
      e) Quality assurance and control costs?                      G( )
   6. Is it apparent that the decision maker and planning
      staff participated in the development of this
      section of the project quality statement?                    R( )
VI. STAGE II IMPLEMENTER'S DISCUSSION OF ALTERNATIVES AND SELECTION
    OF APPROACH
   1. Identify who is serving as the implementer at
      this stage (Stage II) of the development of
      the project quality objectives:
      a) Is it the Project Officer?                                R( )
      b) Is it the Principal Investigator?                         R( )
   2. Is it apparent that the Stage II implementer
      understands and has responded to the:
      a) Objectives of the project?                                R( )
      b) Stated product needs?                                     R( )
      c) Stated user needs?                                        R( )
      d) Stated constraints?                                       R( )
      e) Stated deliverables?                                      R( )
      f) Stated data analysis procedures?                          R( )
      g) Stated database requirements?                             R( )
      h) Stated QC database requirements?                          R( )
18

-------
   3. Is it apparent that the Stage II implementer and the
      decision maker have agreed on the approach taken?
   4. Does the approach given best balance the stated
      objectives, resources, needs, and constraints given?         R( )
   5. Is it clear to a reader of this section of the project
      quality statement how the needs of the decision maker
      and user will be met?
   6. Are there statements in this section concerning
      what products will be provided?
      Are these products:
      a) Related to client/user needs?
      b) Related to the secondary user needs?
      c) Sufficient to meet the stated project
         objectives?
   7. Is the information provided sufficient in detail to
      determine requirements for project quality objectives
      to be given in the next stage?
   8. Is it apparent that the planning staff was involved
      in the development of the selected approach?
VII. PROJECT QUALITY OBJECTIVE STATEMENT FOR SELECTED APPROACH
   1. Is a qualitative statement of project quality
      objectives given? If so, is there a clear statement
      of the way in which each conclusion was reached?
   2. Is a quantitative statement of the project quality
      objectives given?
      If so, does the statement:
      a) Relate to the approach given?
      b) Relate to the stated product need?
      c) Relate to the intended use of the
         data/project?
      d) Presented in terms of:
         1) Precision and accuracy?
         2) Frequency of measurement?
         3) Representativeness?
         4) Completeness?
   3. Will the project quality objectives, as represented
      by the DQOs, be effective? Do they:                          R( )
      a) Create passive quality assurance procedures?              R( )
      b) Create interactive quality assurance procedures?          R( )
      c) Characterize the data quality as needed
         by the client and user(s)?
      d) Characterize the data quality for a
         data collection/database effort?
      e) Consider the use of EPA's databases?
      f) Support the approach selected by the
         Stage II implementer?
      g) Require quality control data reports?
   4. Are critical project objectives supported by the
      DQOs given?
   5. Are the products needed by the project's secondary
      users supported by the DQOs given?
20
R( )
R( )
R( )
R( )
R( )
R( )
G( )
G( )
R( )
R( )
R( )
R( )
R( )
R( )
R( )
R( )
R( )
R( )
R( )
G( )
R( )
R( )
G( )

-------
   6. Are the DQOs presented in sufficient detail to
      support definition of the accuracy and precision
      values for the Quality Assurance Project Plan?               G( )
   7. Are the DQOs given in sufficient detail to support
      definition of Quality Assurance reporting require-
      ments and evaluation?                                        G( )
VIII. SUMMARY
   Provide as needed.
IX. PROJECT QUALITY STATEMENT RECOMMENDED BY:
   Provide names.
X. PROJECT QUALITY STATEMENT APPROVED BY:
   1. Client
      a) Name
      b) Title
      c) Signature                                Date
   2. Decision Maker
      a) Name
      b) Title
      c) Signature                                Date
   3. Program Manager (HQ)
      a) Name
      b) Title
      c) Signature                                Date
   4. Laboratory Director
      a) Name
      b) Title
      c) Signature                                Date
   5. Project Manager
      a) Name
      b) Title
      c) Signature                                Date
   6. Quality Assurance Officer
      a) Name
      b) Title
      c) Signature                                Date
   7. Senior Program Staff
      a) Name
      b) Title
      c) Signature                                Date
   8. Senior Technical Staff
      a) Name
      b) Title
      c) Signature                                Date
   9. Principal Investigator (if known)
      a) Name
      b) Title
      c) Signature                                Date
21

-------
B

-------
APPENDIX B
RELATIONSHIP OF THE DQO PROCESS AND QC CRITERIA
The Data Quality Objective (DQO) Development Process, the
Statement of Project Quality Objectives (SPQO), and the Quality
Control (QC) criteria given in the QA Project Plan (QAPjP) differ
only in the amount of detail given in each. The SPQO draws
upon the initial Program Quality Objectives and enhances these
objectives to a higher level of specification for use in the Scope of
Work (SOW). The result is the Project Quality Objectives (PQOs).
Depending on the project implementation process selected by the
Project Officer (PO), the Principal Investigator (PI) will either
implement the PQOs specified by the PO or provide his or her own
PQOs to the PO. Upon acceptance of the PI's work plan, the PQOs agreed
upon become the DQOs for the project and represent completion of the
DQO development process. Table 1 illustrates this process.
TABLE 1. LEVELS OF QUALITY OBJECTIVES IN DQO PROCESS

DQO Process Stage   Responsibility              Quality Level
I                   Client and HQ               Program Quality Objectives
                    Program Manager
II                  Project Manager and PO      Statement of Project Quality
                                                Objectives
From II to III      PO                          SOW
III                 PI and PO                   PI's Work Plan
End of III          PI                          QAPjP
The DQOs provide a simple and straightforward statement of the data
quality to be achieved by the project data collection, analysis, and
reporting activities. The relationship between the DQOs and the
development of QC limits or criteria incorporates a series of specific
23

-------
steps from the given DQOs to the QAPjP. How this is achieved by the
PI will be an iterative and somewhat complex process. It is the role
of the PI's technical staff to refine the DQOs to the appropriate level
of detail, enabling a specific quality control process to be
implemented, documented, and audited.
During the detailed preparation of the QAPjP sections,
quantitative statements of the types of errors/biases that will be
controlled, the level of this control, and the supporting data that are
needed must be given. These statements will characterize known
sources of error and bias that must be controlled and the QC checks
that will be performed to ensure their control. To illustrate this
process, an example is provided.
PROGRAMMATIC EXAMPLE
A model for air quality acidic deposition is needed to perform an
assessment of the relative importance of local versus regional sources
to mesoscale acid deposition. The specific requirements of the model
are:
a. To provide a means of predicting acidic deposition loading
   within ±100% of observed field test values to economically
   important surfaces within urban areas;
b. To determine the relative importance of local, as opposed to
   long-range, transport to deposition loading at sensitive
   receptors within 200 to 300 km of large point or area
   sources;
c. To determine the relative importance of the deposition of
   primary, as opposed to secondary, sulfate near large point
   and area sources.
The client's requirements for field and laboratory measurements
state that only standard analytical procedures can be used for the
determination of rainfall and the ionic concentrations related to
24

-------
acidity. The objectives for these measurements are given below in
Table 2.

TABLE 2. PROGRAM DQOs GIVEN BY CLIENT
Precision:   ±10% as a standard deviation of duplicate analysis.
Accuracy:    ±10% of standard reference values as differences.
Now that the overall measurement quality objectives for the
program (Table 2) have been defined, the next step in our example is
to establish the data quality indicators for each. During the
detailed planning and preparation of the PI's QAPjP, the SPQOs and
DQOs are used as the starting point for developing explicit,
quantitative statements of the types of error that will be accepted
and controlled, and the QC information that will be collected in order
to characterize the quality of the data and identify errors or biases.
These indicators are needed to select the appropriate collection
method(s), field and/or laboratory analytical techniques, and data
processing and management approaches for the results. The indicators
also serve as the basis for selection of the PI's QA activities and QC
criteria and procedures for the project to be given in the QAPjP. For
example, the data quality indicator for sulfates is given in Table 3.
TABLE 3. EXAMPLE DATA QUALITY INDICATOR FOR SULFATE ANALYSIS
Precision:   ±10% difference as a mean/RSD
Accuracy:    ±10% difference from a known standard.
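A minimal sketch, assuming replicate analyses and a known reference
standard are available, of how the Table 2 and Table 3 indicators might be
computed: precision as the relative standard deviation of duplicate
analyses and accuracy as the percent difference from a known standard.
The sample values and function names are illustrative only.

    import statistics

    def precision_rsd_pct(replicates):
        """Precision as the relative standard deviation (%) of replicate analyses."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    def accuracy_diff_pct(measured, known_standard):
        """Accuracy as the percent difference of a measurement from a known standard."""
        return 100.0 * (measured - known_standard) / known_standard

    # Hypothetical duplicate sulfate analyses and a reference-standard check.
    duplicates = [4.1, 4.3]
    print(precision_rsd_pct(duplicates) <= 10.0)        # within the 10% precision objective?
    print(abs(accuracy_diff_pct(9.4, 10.0)) <= 10.0)    # within the 10% accuracy objective?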
Both high-volume and dichotomous samplers are required for
collecting sulfate samples. The PI has identified that the total mass of
the filter and an SO2 gaseous sampler are also required for the
25

-------
evaluation of the accuracy for sulfate flux. At this point, a system
of project quality control objectives is required to provide an
assessment of the data being collected. The sampler flows, transport
of the filter, QC checks, collocated samplers, transport to the
laboratory, extraction analysis by the ion chromatography method, and
data processing will be part of the total data base for documenting
achievement of project data quality objectives. In addition, the use
of the SO2 monitor provides the co-analysis relationship needed by the
modelers for the model accuracy estimates of sulfate prediction.
Based on recommendations of the PI's technical staff for the data
collection activities identified above, the Project Quality Objectives
for sulfate measurements (Table 4) are used in preparing the QAPjP.
Technical staff guidance in preparing the QAPjP quality objectives is
a critical source of information, necessary for the PI and PO to
support development and implementation of specific plans for model
evaluation and use.
TABLE 4. EXAMPLE PROJECT QUALITY OBJECTIVES FOR SULFATE DATA FROM FIELD STUDY

Measurement       Experimental
Parameter 1       Conditions        Precision     Accuracy 2
SO2               Atmos. Samples    ± 10%         ± 10%
SO4=              Atmos. Samples    ± 10% 3       ± 10%
Mass (DICH        Atmos. Samples    ± 4% 3        ± 0.02 mg/100 mg
  Balance)                                        ± 0.04 mg/zero
Mass (HI-VOL      Atmos. Samples    ± 4% 3        ± 0.5 mg/1,2,5 g
  Balance)                                        ± 1.0 mg/zero

1 Reference methods document is EPA standard method of collection
  and analysis (EPA 600/4 series).
2 Difference from a known concentration.
3 Standard deviation of (1) replicate analysis or (2) the percent
  differences between collocated samplers.
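The following sketch illustrates footnote 3 of Table 4: estimating
precision as the standard deviation of the percent differences between
paired primary and collocated sampler results. The paired values and the
10% comparison are hypothetical examples, not project data.

    import statistics

    def collocated_precision_pct(primary, collocated):
        """Precision estimated as the standard deviation of the percent differences
        between paired primary and collocated sampler results (Table 4, footnote 3)."""
        pct_diffs = [100.0 * (c - p) / ((c + p) / 2.0)
                     for p, c in zip(primary, collocated)]
        return statistics.stdev(pct_diffs)

    # Hypothetical paired 24-h sulfate results (ug/m3) from one collocated site.
    primary    = [3.9, 5.2, 2.8, 6.1]
    collocated = [4.1, 5.0, 2.9, 6.4]
    print(collocated_precision_pct(primary, collocated) <= 10.0)  # meets the SO4= objective?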
26

-------
The next step for the PI's technical staff is to define the data
collection and analytical quality control checks to support the
quality objectives for sulfates. Table 5 provides a basic list of QC
criteria related to the Project Quality Objective parameters.
Obviously, a system of quality control checks is needed to determine
the quality of the field data being collected. The QAPjP and related
RPMs must provide the procedures, evaluation techniques, and
mechanisms for any corrective action needed to ensure that data of
good quality are being collected. A functioning QA plan or program
will minimize or eliminate surprise nonconformance problems. The goal
is prevention of the generation of bad data, rather than correction
after collection.
AUDIT QUESTIONNAIRE PROCESS
It is the role of the PI's technical staff to refine the data
quality objectives to the level of detail shown in Tables 4 and 5 of
the example. The goals of the staff members are to select specific
quality control and auditing approaches that will control the bias and
error of the data being collected in the field and to define the
criteria for the audit evaluation. The audit and QC approach selected
should be the one that best balances the time requirements,
deliverables, and cost constraints required by those directing the
overall program.
The PI's technical staff also is responsible for
initiating/implementing the more technical phases of the audit
process. Their responsibility extends to the preparation of detailed
guidance for internal audits in the form of audit questionnaires, which
provide the assessment and performance criteria for the data
collection, analysis, and precision assessment activities. During
detailed planning and preparation of the project, the technical staff's
guidance for project internal audits, the DQOs, and the QC limits/criteria
are used as the starting point for developing: 1) explicit,
quantitative statements of the types of errors/biases that will be
controlled, 2) the level of this control, and 3) the support data that
will be collected during the internal audits or control checks to
characterize all the known sources of error and bias.
27

-------
TABLE 5. EXAMPLE OF QC CRITERIA

Related     Collection/Analytical   Control               Control
Parameter   Method                  Check                 Limit
SO4=        Dichotomous             Flow rate             ± 10% set point 1
            Hi-Vol                  Flow rate             ± 10% set point 1
SO2         Con. Monitor            Zero                  zero ± 0.025 ppm
                                    Precision             ± 10% std. dev.
                                    Span                  ± 15% difference 2
Mass        Hi-Vol/Balance          Zero                  ± 1.0 mg of zero 3
                                    1,2,5 gram wt.        ± 0.5 mg of wts.
                                    Tare replicates       ± 2.8 mg difference
                                    Gross replicates      ± 5.0 mg difference
            Dichotomous/Balance     Zero                  ± 0.04 mg of zero 2
                                    100 mg weight         ± 0.02 mg of wts. 3
                                    Tare replicates       ± 0.01 mg difference
                                    Gross replicates      ± 0.02 mg difference
SO4=        Ion Chrom.              Calibration slope     > ± 15% slope diff. 4
                                    Calibration slope     ± 2.0% difference 5
                                    Standard sample       ± 10% difference 6
                                    Percent recovery      90%
                                    Replicates            ± 4% as std. dev. 6
Data        Strip chart             Baseline, hr. avg.    < 1.0% incorrect 7
Handling    Data Acquisition        Baseline & time       < 1.0% incorrect 8
            Input                   Raw vs printout       < 2.0% incorrect 9

1 Collocated sampler for precision
2 Every 5th filter
3 Every 10th filter
4 Compared to perfect curve
5 Also within ± 2 S.D.
6 Also within ± 2.8 CV limit
7 10% of samples
8 5 to 8% of samples
9 100% of data
28
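As an illustrative sketch only, the functions below show how individual QC
check results might be screened against control limits of the kind listed
in Table 5 (a relative limit around a set point and an absolute limit
around a target); the numeric inputs and function names are hypothetical.

    def within_pct_limit(measured, set_point, limit_pct):
        """True if a QC check result is within +/- limit_pct of its set point,
        e.g., a sampler flow-rate check against a 10% control limit."""
        return abs(measured - set_point) <= (limit_pct / 100.0) * abs(set_point)

    def within_abs_limit(measured, target, limit):
        """True if a QC check result is within an absolute limit of its target,
        e.g., a balance zero check within 0.04 mg of zero."""
        return abs(measured - target) <= limit

    # Hypothetical QC check results.
    print(within_pct_limit(1.08, 1.13, 10.0))   # flow rate vs. set point (m3/min)
    print(within_abs_limit(0.03, 0.0, 0.04))    # dichotomous balance zero check (mg)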

-------
C

-------
APPENDIX C
DATA QUALITY OBJECTIVE

DEPLOY AND OPERATE A DAILY SURFACE
MONITORING NETWORK
Revision: 3
Date: 31 July 1986
1. Client's Description of the General Background
1.1 General Background
Over the past several years, various proposals have called for amelioration
of the adverse effects of acid deposition by controlling the emissions
of acid precursors. Each of these proposals has raised a number of
international and regional issues. The National Acid Precipitation
Assessment Program (NAPAP) has decided that numerical models provide the
most scientifically defensible tool for describing existing source-receptor
relationships, for predicting the effect that emission changes might
have on these relationships and, thus, for resolving the various issues
in question. Specifically, the key policy questions that must be answered
for a given distribution of emissions include:
a. Deposition Loadings. What are the seasonal and annual averages of
   the wet and dry deposition of acidic species (esp., sulfate, nitrate,
   hydrogen, and ammonium ions) and oxidants to specific states and
   Canadian provinces or to portions of states and provinces?

b. Source Attribution. What is the net (i.e., wet and dry) deposition
   of sulfur oxides, nitrogen oxides, and volatile organic compounds
   (VOCs) to each receptor area from each geopolitical source area?

c. Chemical Nonlinearity. How effective are various strategies for
   controlling sulfur oxides, nitrogen oxides, and VOCs in reducing
   acid deposition to sensitive receptor areas? For example, is the
   regional source-receptor response relationship "linear" for sulfur
   oxides? Do nitrogen oxides and VOC play an important role in
   governing the chemical response of the atmospheric system?
In order to produce scientifically defensible tools for analyzing the
consequences of proposed acid-deposition control policies, the
Environmental Protection Agency (EPA), the Ontario Ministry of the
Environment (OME), the Atmospheric Environment Service (AES) of Environment
Canada, and the Electric Power Research Institute (EPRI) have supported
the development of regional-scale acid-deposition models. EPA has been
supporting the development of a Regional Acid Deposition Model (RADM) at
the National Center for Atmospheric Research (NCAR). OME, AES and the
Umweltbundesamt (West Germany) have been supporting the development of
the Acid Deposition and Oxidant Model (ADOM) at ERT, Inc. The RADM and
ADOM are sophisticated state-of-the-science models specifically designed
1

-------
for application to the complex technical issues implied by the policy
issues. Although the RADM and ADOM have already been shown to be capable
of addressing the policy issues, their predictions will not gain wide
acceptance unless the models have been subjected to a credible model
evaluation program. Therefore, an important element of the modeling
effort must be to establish model credibility through comparison with
observed conditions. Without evaluation of the models, large uncertainty
on the reliability of model predictions limits the use of these predictions
in making policy decisions.

The decision to design a program for the evaluation of these models
initiated a sequence of activities that is expected to culminate in
evaluated models for use in making policy decisions. A series of
workshops, aimed at assisting EPA, OME, AES, and EPRI in this design
process, have been convened. The first workshop, sponsored by the EPA in
Raleigh, North Carolina on February 11-13, 1986, focused on identifying
various protocols for evaluating models against field observations.
Several field projects were identified to obtain data for comparison with
model predictions (Pennell 1986). The second workshop, which was sponsored
by EPRI, was held in Seattle, Washington on March 11-13, 1986. This
workshop focused on refining the data requirements and data quality
objectives for the field projects identified in the first workshop (Sarchet
1986). A third workshop, sponsored by OME, was conducted in Toronto,
Ontario on June 11-13, 1986, to consider quality auditing for the field
measurements and model exercises for the model evaluation program. Data
archival and coordination for all of the projects within the program, as
well as other related monitoring and research projects supported by NAPAP,
EPRI, and the Canadian agencies, were also considered at this workshop
(Olsen 1986).
Model evaluation has two functions:
. Assess the ability of models to predict deposition and air
  concentration patterns and amounts for effects applications and
  policy analysis decisions.
. Provide scientific credibility to assure that model predictions are
  correct for the right reasons.
These functions are manifested on different time and space scales and
result in evaluations with different functions. Integrated evaluation
(also referred to as operational evaluation) addresses primarily the policy
function. Diagnostic evaluation, on the other hand, addresses primarily
the credibility function and hence are also an important adjunct to the
integrated evaluation. These types of evaluations do not establish
confidence limits on model predictions. Rather, they focus on determining
whether the models simulate the salient features of atmospheric transport,
transformation, diffusion and deposition processes reasonably well and
whether the model is giving the correct predictions for the right reasons.

Integrated evaluation considers how well the model is able to predict
seasonal or annual mean values of deposition and air concentrations.
Field
2

-------
studies to support integrated evaluation must span a sufficient time for
these averages to be established over the simulation period. Frequently
this type of evaluation has been referred to as an operational evaluation
because it examines the model predictions that would be operationally
used in making policy decisions or effects assessments. The variables
predicted by the models are the surface air concentration and deposition
patterns of the major species: SO2, NOx, O3, and perhaps a few other
oxidants in air, and H+, SO42-, NO3-, and NH4+ in precipitation. Model-
derived seasonal or annual means are to be compared to similar averages
of the field observations of these quantities.
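As a simplified illustration of such a comparison (not a prescribed
evaluation statistic), the sketch below computes the relative bias of a
model-derived seasonal mean against the seasonal mean of the corresponding
surface-network observations; the daily values and function name are
hypothetical.

    import statistics

    def seasonal_mean_bias_pct(observed_daily, predicted_daily):
        """Relative bias (%) of the model-derived seasonal mean with respect to
        the seasonal mean of the surface-network observations."""
        obs_mean = statistics.mean(observed_daily)
        mod_mean = statistics.mean(predicted_daily)
        return 100.0 * (mod_mean - obs_mean) / obs_mean

    # Hypothetical daily wet sulfate deposition (kg/ha) at one grid cell for a season.
    observed  = [0.0, 0.4, 0.1, 0.0, 0.7, 0.2, 0.3]
    predicted = [0.1, 0.3, 0.2, 0.0, 0.6, 0.2, 0.4]
    print(seasonal_mean_bias_pct(observed, predicted))   # positive means overprediction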

Diagnostic evaluation has as its principal objective to determine if the
model, as a complete entity, and its various modules are functioning
properly. Model complexity requires that the evaluation treat each
individually simulated episode of 1- to 6-day duration as a separate case
study. Variables studied in a diagnostic evaluation include those obtained
for integrated evaluation, as well as those variables which are sensitive
to the performance of specific modules. Furthermore, diagnostic evaluation
requires information on the three-dimensional nature of the observed
fields to determine if the models are performing correctly.
In the most detailed evaluation, at the sub-modular or mechanistic level,
the workings of individual modules within the complex regional-scale
acid deposition models are evaluated. Such evaluations require extremely
detailed observations of variables unique to the processes being simulated
within each module. High temporal and spatial resolution observations
in three dimensions are needed to determine if these modules satisfactorily
parameterize subgrid scale features of the observed fields.

Diagnostic evaluation spans the widest range of time scales. Sub-event
information, i.e., high temporal resolution, helps to establish that
modules interact correctly. At the other extreme, observed seasonal and
annual averages are needed to demonstrate that individual episodes can be
combined to form longer term means. All of these factors were taken
into consideration by the workshop participants in the identification of
field studies that were necessary for the various aspects of model
evaluation.
Field projects needed for model evaluation that were identified by the
workshop participants include:
INTEGRATED EVALUATION
1. Deploy and Operate a Daily Surface Monitoring Network (Operational)
2. Conduct Vertical Profiles over Modeling Domain (Diagnostic)
3. Determine Subgrid Deposition Variability (Diagnostic)
4. Evaluate 1985 NAPAP Emission Inventory (Input, Diagnostic)
5. Evaluate Inflow Boundary Conditions (Input, Diagnostic)
3

-------
MECHANISTIC (MODULE) EVALUATION
6. Evaluate Wet Deposition Module
7. Evaluate Dry Deposition Module
8. Evaluate Atmospheric Transport Module
9. Evaluate Gas Phase Chemistry Module
The principal purpose of these field projects is to provide a data base
for RADM and ADOM evaluation (Pennell 1986; Sarchet 1986). The workshop
participants further recommended that the model evaluation field studies
need to span at least two full calendar years in which the daily surface
network (Project 1) is producing quality data. Embedded within these two
years would be four periods in which more intensive (higher sampling
frequency) and extensive (additional variables) measurements would be
taken from aircraft and at special (enhanced) surface sites. The surface
network is capable of providing field observations at all spatial scales.
Its national coverage, with greater site density in the northeastern United
States, will yield national patterns of the geographic distribution of the
observed fields. In the area of higher site density, fields associated
with individual wet deposition episodes can be resolved for diagnostic
evaluations. Enhanced sites within the surface network serve as focal
points for diagnostic and subgrid scale studies. Special sites distributed
in clusters about selected surface network sites can support diagnostic
evaluation at a modular level and are needed for interpreting the spatial
variability associated with surface network observations. Evaluation of
certain modules may require clusters of closely spaced surface sites.

But the surface networks are not capable of obtaining vertical profile
data needed in the diagnostic evaluations. Sampling and measurement
systems carried aloft by free or tethered balloons, or on aircraft are
needed to acquire these data. Such observations mainly support the
diagnostic evaluations, but also contribute to modular evaluations, and
in special situations, to integrated evaluation. These measurements can
only be performed during intensive periods.
The intensive periods were suggested primarily to collect data for process
module evaluation (Projects 6-9); these periods would also be used for the
vertical profiles (Project 2) and subgrid variability (Project 3) studies.
The intensive periods would be of two-month duration each and scheduled
to sample the important seasonal contrasts. At most, two intensives
would be scheduled in any year. A two-month intensive period was
considered superior to a one-month period for several reasons:
- the longer duration increases the probability of capturing appropriate
  deposition episodes for evaluation,

- fewer intensives during a year means less stress on the personnel
  running the field activities, and

- a longer period between intensives means that the data from the
  previous intensive can be more thoroughly analyzed to support the
  final planning of the next intensive.
The only disadvantages to the longer and fewer intensives were that a
given season may be sampled only once or that some seasons may not be
sampled at all. In any case, seasonal differences in synoptic patterns
must be considered in scheduling intensives. Spring and autumn seasons
offer the greatest opportunity for frontal precipitation and cyclonic
storm systems. Summer affords opportunities to sample convective episodes
and the warmest temperatures. Winter presents the lowest temperatures
and solar irradiances, and a high probability for solid phase
precipitation.

Because of resource limitations, the process module evaluation field
studies may not be funded. Therefore, field study design for the vertical
profiles and subgrid deposition variability studies should be developed
independently of the module evaluation field studies.
1.2 Specific Background for This Project

The daily surface network described in this DQO is essential for the
model evaluation program. The surface network is actually composed of
several networks sponsored by different organizations: the EPRI OEN
(Operational Evaluation Network), the OME APIOS network, the AES CAPMoN,
and the EPA ME (Model Evaluation)-35 Network (co-located with the NTN
Dry Deposition Network). Each should follow a compatible protocol for
measuring deposition, air quality, and meteorology. The combined surface
network should be of national scale. Most of the regional-scale models
focus on the eastern half of the United States and southeastern Canada
because this is where there are known impacts and because deposition
patterns in the northeast are important to an assessment of effects. As
a result, a higher spatial density of sampling sites in this region is
warranted. Subgrid information, obtained at clusters of sites spaced
less than about 80 km apart, cannot be compared directly with model predictions.
However, such measurements are essential to assessing the representativeness
of standard network sites and are discussed in the DQO for Project 3.
1.3 MesoSTEM Evaluation
The model evaluation field study program offers the opportunity to evaluate
models other than RADM and ADOM. EPA plans to adapt its efforts to also
field evaluate the Sulfur Transport Eulerian Model with a predictive
mesoscale transport driver; this version is named "MesoSTEM."
All of the projects and DQOs relate to MesoSTEM evaluation except this
one, because the spatial scale of its observations is too coarse.
2. Clients' Statement of the Problem

2.1 General Statement of the Problem

The clients (EPA, EPRI, OME, and AES) desire an operational evaluation of
the acidic deposition models, RADM and ADOM. There are presently no
acceptable monitoring data bases for an operational evaluation of RADM
and ADOM. Also, there is no agreement among evaluators and modelers on
the methods and procedures for comparing model predictions of acidic
deposition to future network monitoring results. Furthermore, there are
no acceptable data bases for diagnostic and mechanistic evaluation of
these models and their process modules (see Section 1.1 and Barchet 1986
for definitions of the terms operational, integrated, diagnostic, and
mechanistic evaluations). To address these problems, the clients sponsored
workshops to develop designs for a field measurement program to provide
the necessary information for evaluation. Workshop participants were
asked to provide answers to the following questions for each of the field
projects:
- What are the model/field variables to be compared?
- What are the spatial and temporal scale averages to be used in the
  comparison?
- How will the comparisons be performed?
- How will model performance be judged?

Specific information on the inputs and outputs of the models was provided
to the participants by the modelers as a basis for developing answers to
these questions. Table 1 summarizes the temporal and spatial character-
istics of the input and output parameters for the various modules of RADM.
There was a consensus among the participants at both the Model Evaluation
Protocols Workshop and the Field Study Design Workshop that the surface
monitoring network is required for an operational evaluation of the RADM
and ADOM. Field data on the chemical species in Table 1 that are involved
in wet and dry deposition must be obtained for a minimum period of two
years to perform operational evaluations of the models.

2.1.1 Sources of Error
The operational evaluation of the models must recognize and manage the
following sources of error:
2.1.1.1 Representation of Fundamental Processes in the Model

These sources of error will be identified through four field study projects
(Projects 6-9) identified in Section 1.1 of this document.
The errors due to numerical solutions and computation will not be treated
by any of the nine projects identified in Section 1.1 of this document.
------------------------------------------------------------------------
TABLE 1. INPUT AND OUTPUT VARIABLES OF RADM
------------------------------------------------------------------------
                                                   Temporal   Spatial

Transport/Dispersion
  Inputs
    3-Dimensional wind field                       1 hour     80 km
    Temperature                                    1 hour     80 km
    Pressure                                       1 hour     80 km
    Pollutant or tracer concentrations             1 hour     80 km
  Outputs
    Inter-grid fluxes                              1 hour     80 km
    Layer-average concentrations (15)              1 hour     80 km

Gas Phase Chemistry
  Inputs
    Pollutant concentrations                       1 hour     80 km
    Photolysis rates                               1 hour     80 km
    Meteorology                                    1 hour     80 km
  Outputs
    Layer-average concentrations and chemical
    conversion rates for: SO2, O3, HOOH, HNO3,
    HCHO, NO, NO2, VOC, PAN, NH3, NH4+, SO42-,
    and NO3-                                       1 hour     80 km

Wet Deposition
  Inputs
    Temperature                                    1 hour     80 km
    Precipitation                                  1 hour     80 km
    Pressure                                       1 hour     80 km
    Pollutant concentrations                       1 hour     80 km
  Outputs
    Vertical redistribution, chemical conversion,
    and wet deposition fluxes of: SO2, HNO3,
    HOOH, O3, NH3, H+, NH4+, SO42-, and NO3-       1 hour     80 km

Dry Deposition
  Inputs
    Surface characteristics                        1 hour     80 km
    Surface meteorology                            1 hour     80 km
    Pollutant concentrations                       1 hour     80 km
  Outputs
    Surface fluxes of: SO2, HNO3, O3, HOOH, NH3,
    NO, NO2, and particles                         1 hour     80 km
------------------------------------------------------------------------
2.1.1.2 Input Data

These sources of error are to be treated through field investigations.
The errors arise in estimating emissions (Project 4) and in estimating
fluxes of pollutants from outside of the modeling domain (Project 5).
Input or emissions data for NH3, VOCs, H2CO, and organic acids are
particularly unreliable or unavailable. In order to model H+, an ion balance
is required; this means accounting for emissions of cations not specified
in Table 1. There are also input errors in the meteorological variables.
Many of these factors will be treated through model sensitivity studies.
2.1.1.3 Monitoring Data

Sources of error in the monitoring data are to be treated by field
investigations (Projects 1 and 2). The errors which arise in network
monitoring are mostly traceable to non-ideal siting, sampler and instrument
performance, sample transport, and analytical laboratory performance.
The errors that occur in the surface network are a concern to this project
and DQO. Errors also arise in the spatial averaging (interpolation)
from a few stations; this is discussed in Section 2.1.2.2. Another
important source of error in the surface network is subgrid variability
(i.e., spatial variability within a grid cell); it is treated in Project 3.
2.1.2 Model Operational Evaluation Problem

The following discussion is presented to provide information to the
implementor about the types of comparisons that may be made. The evaluation
of the models against the monitoring data base is not part of this project.
At the clients' workshops, no clear decisions were reached on the methods
that should be used in making the comparisons or on the criteria that
should be used in judging model performance. Specification of the
evaluation methods and comparison criteria will be presented in a future
task. However, the methods described below for comparing modeled and
measured variables are expected to be used in this model evaluation.

2.1.2.1 Conventional Point to Grid Comparisons

This approach to model evaluation is based on the suggestions of the AMS
Workshop on Dispersion Model Performance (Fox 1981). This approach has
been applied to the evaluation of regional-scale models in the Regional
Air Quality Model Assessment and Evaluation Project sponsored by EPRI
(Ruff et al. 1984) and in the model evaluation work of the MOI (MOI 1982).
In this approach, the difference, di, between an observed variable, Coi,
and its modeled counterpart, Cmi, is the basic quantity used in assessing
the performance of a model.

Assuming that all processes in the atmosphere are completely deterministic,
a perfect model driven by perfect initial and boundary condition data
would yield di = 0 at every point of comparison in the model domain. In
reality, however, this level of agreement is impossible. Real models
entail numerous compromises in terms of their spatial and temporal
resolution and in the degree of realism with which the various physical
and chemical processes occurring within the atmosphere can be represented.
Additional errors are introduced by the data used to drive the model;
these data are never sufficient to describe initial and boundary
conditions exactly. Given these considerations, both the MOI and EPRI model
evaluation projects used the following criteria for model "perfection"
(Ruff et al. 1984):
- <d> = 0, where d is the bias.

- σd, the root mean square error, is small compared to the standard
  deviation of Cm, σCm.

- di is not a function of the model input parameters. <di Cmi> = 0 is
  a necessary, but not sufficient, condition for this state.

The definitions of these properties are:

  <d>      = (1/N) Σi di
  di       = Coi - Cmi
  σd       = [ Σi (di - <d>)^2 / (N - 1) ]^(1/2)
  σCm      = [ Σi (Cmi - <Cm>)^2 / (N - 1) ]^(1/2)
  <di Cmi> = (1/N) Σi di Cmi

where Cmi and Coi are determined for some averaging time of interest.
In addition to these quantities, the EPRI study prepared scatterplots of
Coi versus Cmi showing the perfect-fit line, the correlation coefficient

  r = Σi (Coi - <Co>)(Cmi - <Cm>) /
      { Σi (Coi - <Co>)^2  Σi (Cmi - <Cm>)^2 }^(1/2),

and the mean bias, <d>. Graphical and statistical comparisons can also
be made of the frequency distributions of Coi and Cmi using the
Kolmogorov-Smirnov test for agreement.
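For illustration only, the following sketch (in Python; it is not part of
this DQO, and the data and function names are hypothetical) shows how these
conventional statistics might be computed for paired observed and modeled
values.

import numpy as np
from scipy import stats

def point_to_grid_stats(c_obs, c_mod):
    # Conventional point-to-grid performance statistics for paired
    # observed (c_obs) and modeled (c_mod) values at N comparison points.
    c_obs = np.asarray(c_obs, dtype=float)
    c_mod = np.asarray(c_mod, dtype=float)
    d = c_obs - c_mod                              # differences di = Coi - Cmi
    bias = d.mean()                                # mean bias <d>
    sigma_d = d.std(ddof=1)                        # spread of the differences about <d>
    sigma_cm = c_mod.std(ddof=1)                   # standard deviation of modeled values
    corr = np.corrcoef(c_obs, c_mod)[0, 1]         # correlation coefficient r
    d_cm = (d * c_mod).mean()                      # <di Cmi>, dependence on modeled values
    ks_stat, ks_p = stats.ks_2samp(c_obs, c_mod)   # Kolmogorov-Smirnov comparison of distributions
    return {"bias": bias, "sigma_d": sigma_d, "sigma_cm": sigma_cm,
            "corr": corr, "d_cm": d_cm, "ks_stat": ks_stat, "ks_p": ks_p}

# Hypothetical 24-hour average values at 100 comparison points.
rng = np.random.default_rng(0)
modeled = rng.gamma(2.0, 2.0, size=100)
observed = modeled + rng.normal(0.0, 1.0, size=100)
print(point_to_grid_stats(observed, modeled))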
The principal advantage of this approach to model evaluation lies in its
conventionality. Assuming that the differences, di, are normally
distributed, the various statistical tests that can be applied to the
data are well established, and their meanings are fairly clear. The
principal disadvantage of the approach lies in the problem of comparing
grid-averaged model data with observations made at specific points.
Usually, there will be at most only one observation within any grid and
there is no reason, a priori, to expect a single-point measurement to
correspond to the grid-averaged value. One method of dealing with this
problem is to restrict comparisons to those stations that are thought to
be representative of regional-scale phenomena. In the EPRI evaluations,
these stations were identified by autocorrelation analysis on hourly
averaged concentration data. The hypothesis was that high correlations
between 1-hour or 3-hour measurements indicated sites primarily influenced
by large-scale processes, whereas low correlations indicated sites strongly
influenced by local sources.
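As an illustrative sketch only (the cutoff value and the synthetic record
below are assumptions, not part of the EPRI procedure as documented here),
such an autocorrelation screen could be implemented as follows.

import numpy as np

def lag_autocorrelation(series, lag_hours=3):
    # Autocorrelation of an hourly concentration record at a given lag.
    # High short-lag correlation suggests regional-scale influence; low
    # correlation suggests strong local-source influence.
    x = np.asarray(series, dtype=float)
    a, b = x[:-lag_hours], x[lag_hours:]
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Hypothetical hourly SO2 record for one site; 0.7 is an assumed cutoff.
rng = np.random.default_rng(1)
hourly_so2 = 20.0 + np.cumsum(rng.normal(size=24 * 30))
if lag_autocorrelation(hourly_so2, lag_hours=3) > 0.7:
    print("site retained as regionally representative")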

Another problem with this method of model evaluation is that the
statistical measures of model performance are sensitive to slight spatial
misalignments between the observed and modeled fields. For example, in
areas with large gradients, large differences (di's) can result from slight
discrepancies in the shape and placement of the modeled and observed
fields. These discrepancies could, in reality, be inconsequential either
in terms of the intended use of the model or in terms of the actual spatial
resolution of the observational network. This problem can be addressed
through subjective comparisons of objectively analyzed and model-derived
fields, but this approach is subject to the weaknesses inherent to
subjective judgments.
2.1.2.2 Pattern Comparisons

Some of the problems of point to grid comparisons can be overcome by
pattern comparisons. This method of comparison is also more consistent
with the way models will be applied in policy analysis. As indicated in
Section 1 of this DQO, EPA is especially interested in the ability of
the models to compute mean annual and seasonal deposition to areas on
the order of the size of states, or portions of states, and not necessarily
to specific receptor sites. In order to compare observed and modeled
deposition fields or patterns, however, a method must be found for
converting individual point measurements of deposition or ambient
concentration into an estimated field. One possible technique is kriging
(Barnes 1980). Kriging is a data interpolation technique that uses a
weighted moving average to estimate the value of a function between points
where it is actually known. The method can be applied to both model-
generated data and observational data. Kriging is expected to be applied
to both types of data for intercomparison of results.

The basic kriging method, called "simple kriging", is based on two
assumptions:

- The expected value of the function is constant everywhere in the
  domain of interest (i.e., there are no trends in the data).

- The difference between data values at two points is a function only
  of the distance between them (i.e., the variance is isotropic).
The procedure requires estimating the variogram function, which describes
how the variance of the differences between values at two points changes
with the distance between them. If the simple kriging assumptions hold,
the variogram is not difficult to estimate, provided that there are
sufficient data.
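As a minimal sketch (assuming isotropy, point coordinates in km, and
illustrative bin widths and names; this is not the project's analysis
software), the empirical variogram could be estimated as follows. A fitted
model variogram (e.g., spherical or exponential) would then supply the
weights used in the kriging system.

import numpy as np

def empirical_variogram(x_km, y_km, values, bin_width=40.0, max_lag=400.0):
    # Isotropic empirical variogram: gamma(h) = 0.5 * mean[(Z(s) - Z(s+h))^2],
    # estimated by binning pairwise separation distances.
    x = np.asarray(x_km, dtype=float)
    y = np.asarray(y_km, dtype=float)
    z = np.asarray(values, dtype=float)
    dists, half_sq = [], []
    n = len(z)
    for i in range(n):
        for j in range(i + 1, n):
            h = np.hypot(x[i] - x[j], y[i] - y[j])
            if h <= max_lag:
                dists.append(h)
                half_sq.append(0.5 * (z[i] - z[j]) ** 2)
    dists, half_sq = np.asarray(dists), np.asarray(half_sq)
    edges = np.arange(0.0, max_lag + bin_width, bin_width)
    lags, gammas = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (dists >= lo) & (dists < hi)
        if in_bin.any():
            lags.append(dists[in_bin].mean())      # mean separation in the bin
            gammas.append(half_sq[in_bin].mean())  # semivariance estimate for the bin
    return np.asarray(lags), np.asarray(gammas)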
Kriging has several advantages that make it attractive for analyzing
deposition and concentration fields. One of these is the ability to
estimate the variance or the "kriging error" at each point for which an
estimate is made. The kriging error is defined as the difference between
the estimate and the true, unknown value. It can be used to select optimal
locations for additional measurements, or it can be used to put a kind
of "error band" or "confidence band" on the location of an interpolated
isopleth. This latter ability is particularly attractive for model
evaluation purposes. It enables the data analyst to make an objective
decision as to whether or not differences between observed and modeled
deposition or concentration patterns are significant compared to
uncertainties in the ability of the observations to define these fields.

An additional advantage of kriging is that the system of equations used
to derive the optimal weights for the weighted moving average does not
depend on the data values. Instead, the system depends on the variogram
and the relative geometry of the data points. This lack of dependence
on data values means that single, large values do not overly influence the
estimates. If the data contain information on measurement variance or
uncertainty, this information can be included in the kriging system so
that less reliable data are given less weight.
Finally, kriging enables estimates, as well as the uncertainties in these
estimates, to be averaged over grid blocks of varying sizes and over the
entire field.
Kriging is certainly no panacea. It has limitations. The most serious
of these are obviously the fundamental assumptions of constant expected
value and isotropy. Although these assumptions can be relaxed somewhat
in practice, it is clear that deposition data from the eastern United
States, with their strong spatial gradients, could violate at least one of
them. If the data do not satisfy even the relaxed conditions of simple
kriging, the technique of universal kriging might be applicable. Universal
kriging can account for trends in the data as long as these trends are
gentle enough to be represented by a low-order polynomial. Data drift
can then be accounted for by various procedures. As pointed out by Barnes
(1980), it is not valid to remove data drift by fitting a least squares
surface to the data and calculating the variogram on the residuals.

Even though a data set may not meet all of the criteria for the proper
application of kriging, it can still be used to produce reasonable-looking
fields. The problem lies in interpreting the meaning of the estimates
of kriging error. If the kriging assumptions are not met, the validity
of the estimates of interpolation error is questionable. This situation
would call into question the assumption that the estimates of kriging
error can be used as an objective test for agreement between observations
and model results. Work is currently underway to assess the significance
of this problem; at the present time, not all individuals who have
investigated the use of kriging to analyze deposition and concentration
fields are convinced that the technique is completely valid.
Kriging does not place many restrictions on data collection
protocols. The technique can be used to analyze individual events or
long-term averages. Kriging is not affected by data gaps at individual
measurement locations, as long as the gap is not sufficiently long to
affect the sample average. The main demand that the technique places on
data collection is that the number of data points (i.e., the spatial
coverage) must be sufficient to adequately define the variogram (in the
case of simple kriging) or the spatial co-variance structure (for universal
kriging). The spatial density of the data must also be sufficient to
meet the accuracy requirements for a given evaluation objective since
kriging error, in a particular region of the domain of application, will
decrease with increased density of data in that region.

2.1.2.3 Principal Components Analysis

Principal components analysis (PCA) is another technique that has been
suggested for use in evaluating model performance. It is essentially a
technique for multivariate analysis of complex systems that are
characterized by a large number of interdependent variables. It can be
used for both temporal and spatial data analysis. The basic idea of PCA
is to find a linear transformation that will change the original set of
correlated variables into a set of independent, uncorrelated ones. The
key to finding this transformation is to diagonalize the correlation
matrix formed from the original data set, for if a set of variables are
independent, their correlation matrix will be diagonal; namely,
the identity matrix. This transformation is done by finding the eigenvalues
and eigenvectors of the correlation matrix. The ordering of the
principal components (first, second, third, etc.) is given by the magnitude
of the associated eigenvalues. It can be shown that the first principal
component is the linear combination of variables that explains the
greatest amount of variability in the original data. The second principal
component explains the next largest amount of variability, and so forth.
Usually, most of the information contained in the original data can be
explained by a small number of principal components.
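A minimal sketch of this procedure (the arrays below are hypothetical, and
this is not the evaluators' software) is:

import numpy as np

def principal_components(data):
    # PCA by diagonalizing the correlation matrix of a (time x variables)
    # data array.  Returns eigenvalues and eigenvectors ordered from the
    # largest eigenvalue (first principal component) downward.
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)        # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order]

# Hypothetical records: 200 time steps of 6 species, modeled and observed.
rng = np.random.default_rng(2)
observed = rng.normal(size=(200, 6))
modeled = observed + 0.3 * rng.normal(size=(200, 6))
ev_obs, pc_obs = principal_components(observed)
ev_mod, pc_mod = principal_components(modeled)
print(ev_obs[:3], ev_mod[:3])   # variance explained by the first three components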
It is the significance of the ordering of the principal components that
makes PCA attractive for model evaluation. In such an evaluation, PCA
would be applied to modeled and observed data at a series of points. If
the model is an accurate representation of the physical and chemical
system, then the principal components, or at least the first two or three
derived from the model results, should be the same as those determined
from the measurements.
A principal concern about PCA as a model evaluation tool is uncertainty
about the robustness of individual principal components. The technique
is definitely sensitive to outliers. Thus, differences between the model
and the observations that might not have any practical significance,
could completely change the definitions of the principal components. An
investigation of the sensitivity of the technique to these factors is
required before it could be applied in practice. A question related to
robustness is how to interpret the results. It is clear that if the
significant principal components for a model-produced data set and for
the actual observations are identical, then the model is probably doing
something right. But what does it mean if the definitions of the principal
components are different? What is an objective measure of the amount of
disagreement?

The demands that PCA places on a data set are somewhat more severe than
those made by kriging. PCA attempts to model the temporal response of
the key variables in a system at specific points in space. Thus, the
technique requires simultaneous time-resolved measurements of each variable
to be included in the analysis at every point of interest. Excessive
data gaps or frequent periods in which the measurements are close to the
detection limit of the instrument, a situation which results in little
variation or signal being introduced into the data record, will result
in increased instability in the makeup of the derived components. Such
instability, of course, reduces the confidence that one might have in
the significance of the principal components. In judging the sufficiency
of a given data set for PCA, one rule of thumb states that the number of
degrees of freedom per variable should be more than 30 and, if possible,
equal to 100 or more. In addition, the temporal resolution of the data
must resolve the temporal behavior of the major processes affecting the
response of the system. In terms of data completeness, PCA requires at
least a 90% valid data capture rate.
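A small screening function along these lines (the reading of "degrees of
freedom per variable" as the number of time records per variable is an
assumption) might look like:

def pca_data_sufficient(n_time_records, n_variables, valid_fraction):
    # Rule-of-thumb screen described above: more than 30 (ideally 100 or
    # more) records per variable and at least 90% valid data capture.
    records_per_variable = n_time_records / n_variables
    return records_per_variable > 30 and valid_fraction >= 0.90

# Two years of daily samples of six species with 93% capture:
print(pca_data_sufficient(730, 6, 0.93))   # True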
2.2 Specific Statement of the Problem

In the preceding section, a general discussion of problems that are
expected to be encountered in the regional-scale model evaluation program
was presented. In this section, specific discussions pertaining to the
monitoring network(s), as described in this DQO, are presented. In the
Workshop on Model Evaluation Protocols (Pennell 1986), participants called
for a model evaluation network with 24-hour sampling of precipitation
chemistry and 12-hour sampling of aerometric variables. However, the
Canadian networks (CAPMoN and APIOS) are presently monitoring with a
24-hour average sampling protocol. Although the RFP for EPRI's Operational
Evaluation Network (OEN) (RP2434-4) contains an option for 12-hour
sampling, it is likely that EPRI will not exercise it. Also, EPA does
not have the resources to fund 12-hour sampling and still maintain the
spatial coverage of the planned network. To assist in attaining uniform
protocols, the EPA ME-35 network will perform 24-hour average sampling
for model evaluation.
In eastern North America, wet deposition and air concentrations will be
monitored by NAPAP's National Trends Network (NTN), EPRI's Operational
Evaluation Network (OEN), and the Canadian networks (CAPMoN and APIOS).
The major difference in the operation of the networks is the sampling
period. The sample averaging period for the NTN is one week, whereas
the other networks will be following a daily sampling protocol, although
presently there are differences. Networks with a common or highly
compatible protocol are required to provide information on the wet
deposition and surface concentrations of key acidifying materials.

The purpose of this project is to monitor and to produce a data base of
24-hour average wet deposition and air concentrations at 35 of the EPA
NTN Dry Deposition sites (including 7 MAP3S Precipitation Chemistry Network
sites) using sampling methods, analyses, and QA/QC that are highly
compatible with those of EPRI, OME, and AES (A workshop will be convened
in September 1986 to negotiate common samplers and protocols or to
establish procedures for intercomparison experiments). The data base
will be used to evaluate the performance of the models. The recommended
variables to be compared in the evaluation were identified in both the
field studies (Barchet 1986) and model evaluation protocols (Pennell
1986) workshops. These 24-hour average variables are specified in
Table 2.
The "second priority" variables indicated in Table 2 reflect the clients'
recognition that dependable, inexpensive samplers and analyzers have not
been demonstrated. If suitable methods become available, those species
will be moved into the "first priority" category. Some "first priority"
variables, such as the metal cations, are needed along with H+ for ion balance
calculations.
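For illustration, an ion balance check on a precipitation sample can be
sketched as follows (the sample values are hypothetical and any acceptance
criterion is left to the QA plan):

# Convert mg/L to microequivalents per liter:
#   ueq/L = (mg/L) / molar_mass * |charge| * 1000
EQUIV = {            # species: (molar mass in g/mol, charge)
    "H":   (1.008, +1), "NH4": (18.04, +1), "Na": (22.99, +1), "K":  (39.10, +1),
    "Ca":  (40.08, +2), "Mg":  (24.31, +2),
    "SO4": (96.06, -2), "NO3": (62.00, -1), "Cl": (35.45, -1),
}

def ion_balance(conc_mg_per_l):
    # Returns (sum of cations, sum of anions) in ueq/L for one sample.
    cations = anions = 0.0
    for species, conc in conc_mg_per_l.items():
        mw, charge = EQUIV[species]
        ueq = conc / mw * abs(charge) * 1000.0
        if charge > 0:
            cations += ueq
        else:
            anions += ueq
    return cations, anions

# Hypothetical sample (mg/L); the H+ value corresponds to a pH of about 4.3.
sample = {"H": 0.05, "NH4": 0.4, "Na": 0.2, "K": 0.05, "Ca": 0.2, "Mg": 0.05,
          "SO4": 2.5, "NO3": 1.6, "Cl": 0.3}
cations, anions = ion_balance(sample)
print(cations, anions, abs(cations - anions) / (cations + anions))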
Those variables in Table 2, and additional ones, will be monitored with
an averaging time of about 3-6 hours at a subset of about 3-6 of the ME-35
stations during the intensive periods. However, those experiments are
not included in this project and DQO (e.g., see the DQO for Project 9,
"Evaluate Gas Phase Chemistry Module").

The NTN will consist of 150 wet deposition monitoring sites operated by
the USGS and up to 100 selected air concentration monitoring sites (NTN-
Dry) operated by EPA/EMSL. These two groups of sites will provide estimates
of weekly wet and dry deposition, respectively. The first 35 sites of the
100-station NTN-Dry that EMSL will install are in the northeastern United
States. The number and location of these sites have been selected, with
consideration of the existing Canadian CAPMoN and APIOS and EPRI OEN
sites, in order to optimize the capability of the combined networks to
resolve significant spatial patterns and to achieve an acceptable level
of uncertainty (e.g., a goal of ±30% on the seasonal averages). Kriging
techniques described in Section 2.1.2.2 were used in the optimization
analysis.

------------------------------------------------------------------------
TABLE 2. VARIABLES SPECIFIED BY THE CLIENT TO BE MONITORED FOR
         MODEL EVALUATION
------------------------------------------------------------------------

Gases (24-hour average):

  First priority:  SO2, HNO3, O3, NH3, NO2, and HOOH.
  Second priority: NO, PAN, and H2CO.

Particles (24-hour average):

  First priority (<2 µm):    H+, NH4+, SO42-, NO3-, and Cl-.
  Second priority (<2 µm):   metals (V, Mn, Fe, As, Se, Sb, Hg, Pb).
  Second priority (2-10 µm): metals (Na, Mg, K, Ca), SO42-, Cl-, NH4+,
                             NO3-, and CO32-.

Precipitation (24-hour average):

  First priority:  H+ (free), conductivity, SO42-, NO3-, NH4+, Cl-,
                   Na+, K+, Ca2+, Mg2+, HOOH, and S(IV).
  Second priority: H2CO.
  Third priority:  metals (V, Mn, Fe, As, Se, Sb, Hg, Pb).

Meteorology (3-hour average):

  Surface wind speed and direction, temperature, pressure,
  precipitation, relative humidity, and insolation.
------------------------------------------------------------------------

This project DQO, for 35 EPA Model Evaluation (ME-35) stations (in addition
to the 35 CAPMoN, APIOS, and OEN stations), relates only to the 35 EPA/EMSL
NTN Dry Deposition (NTN-35 Dry) sites located in the northeastern United
States. EMSL will establish its 35 weekly air concentration monitoring
sites through a separate DQO and contract (see Appendix A and Figure 1
for their approximate locations). The purpose of this task is to establish
at the NTN-35 Dry sites the capability to obtain the variables identified in
Table 2 without perturbing the 1-week NTN wet and dry protocol. The
NTN-35 Dry will measure and record continuously the meteorological variables
and O3, and will provide to the ME-35 quality assured magnetic tape
copies of these variables for each station. However, the ME-35 will
operate separate gas samplers, particle samplers, and precipitation
samplers. Each model evaluation case is expected to be 2-4 days. This
evaluation plan provides for integration through the diurnal cycle
while preserving the major features of the air concentration and chemical
deposition patterns.
FIGURE 1. Model Evaluation Network Sites. (Map legend: EPA Var sites,
EPA ME-35 sites, EPRI OEN sites, NY DEC sites, Canadian sites, proposed
NTN sites, and proposed Canadian sites.)
The clients recognize that research monitoring coordinated among several
networks and funding agencies has not been previously attempted.
Therefore, an intensive QA/QC program is required. The Toronto QA workshop
participants recommended that one site in each of the four networks be
designated as an inter-network comparison site, where samplers from each
of the networks using their respective protocols would be collocated
(Olsen 1986). Thus, for the EPA, three additional sites are needed.
Duplicate samples must be obtained at each of the inter-network sites,
as well as at 10% of the other sites. Interlaboratory comparisons will
also be required. Revision of the DQO is expected after the review of
the initial data base and sampler performance records. Common start and
stop times for the daily sampling are also needed.

With respect to microscale siting decisions, various site selection
criteria have been developed for selecting deposition and aerometric
monitoring sites that are reasonably free of influence from local sources.
It is presumed that sites chosen according to these criteria will be
representative of grid-scale averages; however, the truth of this assertion
can never be proven a priori. In order to examine the question of how
representative a single measurement might be of a grid-scale average, a
subgrid variability study will be conducted as part of the model evaluation
program. This study will be summarized in a separate DQO (see Project 3,
"Determine Subgrid Variability").
3. Clients' Description of the Application of the Product
The product of this project is a quality assured regional acid deposition
monitoring data base of 24-hour average wet deposition and air concen-
trations. The EPA, EPRI, OME, and AES evaluators will use that data
base to compute monitoring station time series, deposition patterns, and
principal components. The evaluators will compare those results with
RADM and ADOM predictions.
For the first year's operation, all data will be released to the model
developers for use in testing and improving model performance as soon as
they have been quality audited and after a blind evaluation of the models
for selected events during that year has been performed. A comparison
of the revised models' predictions against the first year's data will
also be conducted to establish the level of performance improvement, if
any. The second year's data, however, will be quality audited and
sequestered for use in a hands-off model evaluation exercise.
Major milestones and related reports that depend on this project are:

Date    Title

03/89   Report on blind operational and diagnostic evaluations of RADM,
        ADOM, and MesoSTEM against first 6-months FY88 benchmark data
        base and results.

09/89   Report on blind operational and diagnostic evaluations of RADM,
        ADOM, and MesoSTEM against second 6-months FY88 benchmark data
        base and results.

06/90   Report on operational and diagnostic evaluation of revised
        RADM, revised ADOM, and revised MesoSTEM against all FY88 data.

12/90   Report on the blind operational and diagnostic evaluation of
        revised RADM, revised ADOM, and revised MesoSTEM against all
        FY89 data.

12/91   Final report on operational and diagnostic evaluations of RADM,
        ADOM, and MesoSTEM.
4. Clients' Constraints of Time and Deliverables

For the clients to deliver the products in Section 3 on schedule, this
project must provide the deliverables according to the following schedule:

Date    Title

01/87   Approved work plan with SOPs and QA/QC plan.
04/87   Begin installation of model evaluation (ME-35) monitoring network.
08/87   Pre-monitoring QA assessment report.
10/87   OEN and ME-35 Network operational.
03/89   QC report for FY88 data.
06/89   Quality assured data base for FY88. (Released to modelers.)
09/89   Quality audit report for FY88 data base.
03/90   QC report for FY89 data.
04/90   Network dismantled.
06/90   Quality assured data base for FY89. (Sequestered from modelers.)
09/90   Final quality audit report.

5. Constraints on Budget

Planning budgets are confidential. This section will be completed upon
award and negotiation of revisions to this DQO.
6. Implementor's Discussion of Alternatives and Selection of Approach
6.1 General Approach

None. There are no alternatives that will equal the cost effectiveness
of surface monitoring.
6.2 Spatial Alternatives

None. The clients have specified the site locations to be the EPA NTN-35
Dry. Specifications for the OEN, CAPMoN, and APIOS site locations have
also been established. The candidate sites are shown in Figure 1 and
listed in Appendix A.
6.3 Temporal Alternatives

None. The clients are specific (Table 2) in stating the requirements; they
are cost effective for the clients' goals. In order to assure uniformity
in the sampling, the clients need to agree on a common start and stop
time (common GMT) for the daily sample collections, e.g., start after
sunrise but before the photochemical cycle is established.
6.4 First-Priority Gases and Fine Particles (<2 µm)

Filter-pack and diffusion-denuder type samplers are the most cost-effective
approach to monitoring the gases and fine particles identified in
Table 2. The status of those samplers is:
6.4.1 Transition-flow reactor (TFR) sampler (adaptation of the Canadian
filter pack)

- Successfully field-demonstrated in 1985 by EMSL and ASRL for HNO3,
  NH3, SO2, NO2, and fine particulate NO3- and SO42- (Knapp et al. 1986).

- Successfully demonstrated in the California Nitric Acid Inter-
  comparison Study (9/85) for HNO3, NO2, NH3, fine particulate NO3-,
  and NH4+ (Ellestad et al. 1986).

- The TFR sampler is highly compatible with the Canadian filter pack;
  it consists of a cyclone (D50 = 2 µm), followed by a tube (the TFR)
  lined with films reactive for HNO3 and NH3, followed by a sampler
  similar to the Canadian filter pack. The TFR permits HNO3 and NH3
  to be monitored without evaporation biases.

- In the TFR, HNO3 is collected on a nylon film liner and extracted and
  analyzed as NO3- by IC. NH3 is collected on a Nafion film liner and
  extracted and analyzed as NH4+ by IC.

- In the filter pack, SO42-, NO3-, and NH4+ are collected on a Teflon
  filter, extracted, and analyzed by IC. HNO3 is collected on a nylon
  filter, etc. NH3 is collected on oxalic acid, extracted, and analyzed
  as NH4+ by IC. SO2 is collected on a K2CO3-coated filter, extracted,
  oxidized, and analyzed as SO42- by IC. NO2 is collected on a TEA-coated
  filter, extracted, and analyzed as NO2- by IC.
- The TFR presently operates at three EPA prototype sites.
6.4.2 Canadian filter pack

- Successfully field-demonstrated in 1985 by EMSL for HNO3 plus fine
  particulate NO3-, NH3 plus fine particulate NH4+, SO2, and SO42-.

- Successfully demonstrated in the California Nitric Acid
  Intercomparison Study (9/85) for HNO3 plus fine particulate NO3-,
  NH3 plus fine particulate NH4+, SO2, and SO42-. However, estimates
  of HNO3 were 20-60% greater than spectroscopic measurements.

- Presently operating in the Canadian networks and at six EPA prototype
  sites. Designated by EPRI for its Operational Evaluation Network.

6.4.3 Denuder-difference sampler

- Not successfully field-demonstrated by EMSL or ASRL for HNO3, NH3,
  SO2, NO2, or fine particulate NO3- and SO42-.

- Successfully demonstrated by four independent laboratories in the
  California Nitric Acid Intercomparison Study (9/85) for HNO3 and fine
  particulate NO3-.
6.4.4 Annular-denuder sampler

- Not successfully field-demonstrated by EMSL or ASRL for HNO3, NH3,
  SO2, NO2, or fine particulate NO3- and SO42-.

- Not successfully demonstrated in the California Nitric Acid
  Intercomparison Study (9/85) for HNO3 and fine particulate NO3-.
  Inter-laboratory variance was large; average values for HNO3 and
  particulate NO3- were about 20-30% below those of other methods that
  exhibited good agreement with spectroscopic measurements of HNO3.

Best sampling approach: either the TFR sampler, the revised TFR, or the
Canadian filter pack. The clients propose to convene a workshop in
September 1986 to adopt common samplers and protocols or intercomparison
experiments. The method of choice of EPA is the TFR sampler. It is
proven for all of the first-priority species except HOOH; O3 will be
provided by the NTN-35 Dry. The TFR sampler is derived from the Canadian
filter pack and is highly compatible with the Canadian and EPRI networks.
The TFR sampler is not being commercially produced; however, descriptions
of theory and design are available.
6.5 Second-Priority Gases
6.5.1 H2CO

Formaldehyde will be collected on Waters Associates Sep-Pak C18 cartridges
coated with 2,4-DNPH, followed by HPLC analysis of the derivatives
(Kuwata et al. 1983).

6.5.2 PAN and HOOH

PAN and HOOH are not planned for routine sampling; no routine cumulative
sampling is yet demonstrated. However, HOOH sampling should be implemented
at as many sites as possible.

6.6 Second-Priority Particles

Particles are collected with a manual dichotomous sampler on Teflon filters.
Particles are extracted and analyzed for Na+, K+, Ca2+, and Mg2+ by AA
or IC. Other metals are analyzed by Inductively Coupled Plasma Emission
Spectroscopy (ICPES) or x-ray fluorescence; other soluble cations and
anions are analyzed by IC.

6.7 Precipitation

Precipitation shall be collected with the Aerochem sampler using the NTN
sampling and analysis protocol. Changes in the sampling and analysis
protocol may be anticipated to achieve compatibility with EPRI, OME, and
AES (see the end of Section 6.4). A separate collector is required for
each of the species S(IV) and HOOH because of sample preservation
techniques. In areas where more than 20% of the annual precipitation is
snow, Nipher gauges shall also be used. QC diagnostic information (rain
gauge readings, sampler lid position, etc.) must also be recorded. The
following analysis methods apply:

- First priority precipitation species: H+ by pH electrode; acidity
  by strong base titration; cations by AA and anions by IC; S(IV)
  collected with TCM and analyzed by the pararosaniline method.

- Second priority precipitation species: no method yet demonstrated
  for preserving the sample.

- Third priority precipitation species: by ICPES.
Only precipitation samples with volumes greater than 10 ml will be
analyzed.

6.8 Instrument Shelter

The following requirements apply: outdoor, weather-proof; temperature
controlled (±5 C); volume approximately 2-4 m3. Alternative: use the
NTN-35 Dry stations and shelters.
6.9 Data Acquisition System

No data acquisition system is specified beyond a chart recorder of the
sampler flow rate.
6.10 Partitioning of Variance

Variance will be partitioned into sampler preparation, sampler operation,
sampler handling, transport, storage, laboratory sample treatment, and
analysis. Techniques will include collocated samples, field replicates,
and field splits. The analytical laboratory will perform laboratory
replicates, laboratory splits, analysis replicates, and analysis splits.
The variance will be managed to load the major contributions into sampler
operation and analysis.
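As a minimal illustration of one way such a partitioning could be computed
from paired measurements (the pairing scheme and the numbers below are
assumptions, not the project SOP):

import numpy as np

def paired_variance(pairs):
    # Variance of a single measurement estimated from duplicate pairs:
    # var = mean of (difference^2) / 2.
    a, b = np.asarray(pairs, dtype=float).T
    return float(np.mean((a - b) ** 2) / 2.0)

# Hypothetical duplicate data (ug/m3).
collocated_pairs = [(4.1, 4.4), (2.8, 2.6), (5.0, 5.3)]        # whole-system duplicates
analysis_pairs   = [(4.20, 4.25), (2.70, 2.72), (5.10, 5.05)]  # analysis-only replicates

var_total = paired_variance(collocated_pairs)
var_analysis = paired_variance(analysis_pairs)
var_sampling = max(var_total - var_analysis, 0.0)   # remainder attributed to sampling/handling
print(var_total, var_analysis, var_sampling)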
6.11 Partitioning of Bias

Based on recovery, bias will be partitioned into inter-network differences
and matrix effects associated with sample preservation, shipping,
preparation, and analysis. Techniques will include exchange of reference
samples, field matrix spikes, laboratory matrix spikes, and analysis matrix
spikes.
6.12 Data base: ADS.
6.13 Relationship to MesoSTEM

The spatial scale of the ME-35 (and OEN, CAPMoN, and APIOS) network is
about 250 km; this is too coarse to provide an evaluation of MesoSTEM,
which requires a scale finer than about 25 km.
7. Data Quality Objective Statement for Implementor's Selected Approach
7.1 Precision and Accuracy

7.1.1 Model: not applicable to this project.

7.1.2 Surface measurements

Standard and research analytical procedures will be used for the
determination of the concentrations of species related to acidity.
Precision and accuracy goals for this task are summarized in Table 3.
7.2 Representativeness

Spatial: locations are specified by the client(s). Approximate locations
of the NTN-35 Dry sites are listed in Appendix A and shown in Figure 1.

Temporal: 24-hour average samples for 2 years. This sampling period
does not capture the diurnal cycle, which modelers have requested.
However, the problem is diminished by constructing synoptic cases of 2-4
days length (by adding daily records) for the comparison with the models.
------------------------------------------------------------------------
TABLE 3. PRECISION AND ACCURACY GOALS FOR EPA ME-35
         (24-HOUR AVERAGE)
------------------------------------------------------------------------
                                 Analytical(a)  Analytical(b)  Estimate of
Species              Range       Accuracy       Precision      Overall Precision

Gases (µg/m3)
  SO2                1-200       ±10%           ±10%           ±15%
  HNO3               1-20        ±10%           ±10%           ±15%
  NH3                1-20        ±10%           ±10%           ±15%
  NOx (NO, NO2)      1-20        ±10%           ±10%           ±15%
  H2CO               1-20        ±10%           ±10%           ±15%
  HOOH               1-20        ±10%           ±10%           ±15%
  PAN                0-10        ±10%           ±10%           ±15%
  VOC                ?           ±10%           ±10%           ±15%
  Ozone              1-250       ±10%           ±10%           ±15%

Fine Particles (µg/m3)
  SO42-              1-50        ±10%           ±10%           ±15%
  NO3-               1-20        ±10%           ±10%           ±15%
  NH4+               1-20        ±10%           ±10%           ±15%
  Cl-                1-20        ±10%           ±10%           ±15%
  H+ (will not be measured)

Coarse Particles (µg/m3)
  SO42-              1-50        ±10%           ±10%           ±15%
  NO3-               1-20        ±10%           ±10%           ±15%
  NH4+               1-20        ±10%           ±10%           ±15%
  Cl-                1-20        ±10%           ±10%           ±15%
  CO32-              1-20        ±10%           ±10%           ±15%
  Metals             0.001-10    ±10%           ±10%           ±15%

Precipitation(c)
  H+ (as pH)         2-8         ±5%            ±0.1 unit      ±15%
  Conductivity       1-200       ±5%            ±5%            ±15%
  SO42-              0.2-10      ±10%           ±10%           ±15%
  NO3-               0.2-10      ±10%           ±10%           ±15%
  NH4+               0.2-10      ±10%           ±10%           ±15%
  S(IV)              0.2-10      ±10%           ±10%           ±15%
  Cl-                0.1-10      ±10%           ±10%           ±15%
  Na+                0.1-10      ±10%           ±10%           ±15%
  K+                 0.1-10      ±10%           ±10%           ±15%
  Ca2+               0.2-20      ±10%           ±10%           ±15%
  Mg2+               0.2-20      ±10%           ±10%           ±15%
  Other Metals       0.001-10    ±10%           ±10%           ±15%

(a) The difference as a percentage of the reference or true value, or
    the percent recovery of a spike.
(b) Expressed as a percent relative standard deviation of replicates of
    ambient samples (or laboratory references if insufficient ambient
    samples).
(c) For sample volumes greater than 10 ml.
------------------------------------------------------------------------
7.3 Completeness

Completeness is defined as the number of valid data points acquired divided
by the total number planned. Required data capture rate (after quality
auditing) at each ME-35 station is 90% for each variable specified in
Table 3. Since the sampling frequency is once per day, at least 330
precipitation and air concentration filter samples must be obtained per
year to meet this completeness requirement.
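As a simple check of this requirement (the numbers match the text; the
function is illustrative only):

def completeness(valid_samples, planned_samples=365):
    # Completeness = valid data points acquired / data points planned.
    return valid_samples / planned_samples

# 330 valid daily samples out of 365 planned is about 90.4%, which just
# meets the 90% completeness requirement.
print(f"{completeness(330):.1%}")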
REFERENCES:

Barnes, M. G. 1980. "The Use of Kriging for Estimating the Spatial
Distribution of Radionuclides and Other Spatial Phenomena." TRAN-STAT:
Statistics for Environmental Studies, No. 13, PNL-SA-9051, Pacific
Northwest Laboratory, P.O. Box 999, Richland, Washington.

Ellestad, T. E. 1986.

Fox, D. G. 1981. "Judging Air Quality Model Performance." Bull. Amer.
Meteorol. Soc., 62, No. 5, pp. 599-609.

Knapp, K. T. 1986.

Kuwata, K., M. Uebori, H. Yamasaki, Y. Kuge, and Y. Kiso. 1983.
"Determination of Aliphatic Aldehydes in Air by Liquid Chromatography."
Anal. Chem., 55:2013.
Ruff, R. E., K. C. Nitz, F. L. Ludwig, and C. M. Bhumralkar. 1984.
Regional Air Quality Model Assessment and Evaluation. EA-3671, Electric
Power Research Institute, Palo Alto, California.

United States-Canada Memorandum of Intent (MOI) Work Group 2. 1982.
Final Report on Atmospheric Sciences and Analysis. Washington, D.C., and
Toronto, Canada.
8. Summary

This DQO, with exceptions clearly indicated in Sections 6 and 7, meets
the goals expressed by the clients for the agreed upon approach.
9. DQO Recommended by

10. DQO Approved by
APPENDIX A. CLIENTS' NETWORK SITE LOCATIONS

      Lat.      Long.            I.D.

EPA NTN-35 Dry Network Sites
1. 35.900 -78.867 A $ 101 2
2. 35.950 -84.283 A $ 102 5
3. 41.350 -74.033 A $ 103 27
4. 44.383 -73.850 A $ 104 29
5. 40.783 -77.933 A S 105 34
6. 37.300 -78.000 CIS 106 10
7. 39.083 -79.567 81$ 107 14
8. 43.800 -72.000 83S 109 26
9. 42.733 -76.650 A $ 110 31
10. 41.100 -80.000 83$ 112 35
11. 40.500 -83.500 0 $ 114 39
12. 42.050 -84.033 C2S 115 40
13. 39.000 -76.900 0 $ 116 9
14. 40.300 -79.700 0 $ 117 12
15. 38.033 -78.533 82S 118 13
16. 38.750 -81.000 0 $ 119 15
17. 37.038 -81.033 C2S 120 16
18. 37.067 -82.983 81S 121 18
19. 39.517 -84.717 A $ 122 19
20. 41.000 -82.100 CIS 123 37
21. 43.000 -83.000 0 $ 124 38
22. 36.000 -81.000 Cl$ 126 3
23. 36.300 -86.500 0 S 127 6
24. 40.100 -76.800 CIS 128 8
25. 37.667 -84.967 81$ 129 20
26. 40.050 -88.367 A $ 130 23
27. 39.800 -86.500 0 $ 133 44
28. 44.200 -89.900 0 $ 134 36
29. 45.483 -69.650 CIS 135 47
30. 35.050 -83.417 81S 137 4
31. 38.733 -87.483 81$ 140 21
32. 40.000 -75.000 CIS 144 30
33. 41.700 -87.983 A $ 146 43
34. 40.067 -81.133 C2S 113 36
35. 43.000 -85.500 0 S 149 *
      Lat.      Long.            I.D.
EPRI OEN Sites     
36. 44.617 -68.967 U $ U01 * 
37. 44.517 -72.867 U $ U02 * 
38. 42.583 -72.533 U $ U03 * 
39. 43.817 -74.900 U $ U04 * 
40. 41.567 -75.983 U $ U05 * 
41. 39.983 -82.017 U $ U06 * 
42. 39.233 -82.467 U S U07 * 
43. 41.033 -85.317 U S U08 *
44. 37.867 -87.117 U $ U09 * 
45. 44.933 -84.633 U $ U10 * 
46. 38.133 -83.450 U $ U11 * 
47. 35.783 -89.133 U $ U12 * 
48. 35.717 -78.667 U $ U13 * 
49. 32.050 -82.467 U $ U14 * 
50. 32.467 -87.083 U $ U15 *
51. 32.350 -90.283 U S U16 * 
52. 44.708 -88.624 U S U24 * 
53. 46.233 -91.936 U S U25 * 
New York Department of Environmental Conservation Sites (if supported)
54. 43.300 -74.100 N $ N01 * 
55. 42.200 -74.800 N $ N02 * 
56. 44.600 -75.400 N $ N03 * 
57. 42.100 -77.000 N S N04 * 
58. 42.100 -79.400 N S N05 * 
Canadian Sites (OME Recommended)   
59. 44.1 -77.8 0 $ 002 *
60. 44.2 -81.0 0 $ 005 * 
61. 42.8 -81.5 0 S 001 * 
62. 44.2 -65.8 0 $ 007 * 
63. 46.8 -71.6 0 $ 008 * 
64. 45.2 -72.5 0 S 006 * 
65. 49.0 -74.6 0 $ 011 * 
66. 45.8 -77.3 0 S 004 * 
67. 45.3 -79.5 0 S 003 * 
68. 49.5 -82.6 0 $ 010 * 
69. 46.7 -84.2 0 $ 009 * 
      Lat.      Long.            I.D.

Top Priority Sites for Any Additional NTN Sites in FY87 (Network=100)

70. 34.500 -78.500 F1S 136 1
71. 38.500 -76.000 F2S F02 *
72. 41.500 -72.200 F3S F03 *
73. 39.900 -78.800 F4S F04 *

Suggested Priority Sites for Additional Canadian Sites

74. 46.300 -74.800 FC1S FC1 *
75. 44.400 -79.000 FC2S FC2 *
76. 43.500 -81.400 FC3S FC3 *
77. 43.100 -79.300 FC4S FC4 *