ATMOSPHERIC
ENVIRONMENT
SERVICE
U.S. ENVIRONMENTAL
PROTECTION
AGENCY
PROJECT PLAN
for the
ACID DEPOSITION
EULERIAN MODEL
EVALUATION
and
FIELD STUDY
ELECTRIC POWER
RESEARCH INSTITUTE
FLORIDA
ELECTRIC POWER
COORDINATING GROUP
Prepared by
D. Alan Hansen
Prepared for
THE PROJECT MANAGEMENT GROUP
MINISTRY OF THE
ENVIRONMENT
February 1989
-------
100R89109
PROJECT PLAN
for the
ACID DEPOSITION EULERIAN MODEL EVALUATION
AND FIELD STUDY
February 1989
Prepared by
D. Alan Hansen
Electric Power Research Institute
Palo Alto, California
Prepared for the
PROJECT MANAGEMENT GROUP:
Keith J. Puckett
Atmospheric Environment Service
Environment Canada
Downsview, Ontario
D. Alan Hansen
Electric Power Research Institute
Palo Alto, California
H. Michael Barnes, Francis A. Schiermeier
Atmospheric Research and Exposure Assessment Laboratory
U.S. Environmental Protection Agency
John J. Jansen
Florida Electric Power Coordinating Group
Tampa, Florida
Maris Lusis
Ministry of the Environment
Toronto, Ontario
-------
PMG PROJECT
Ver. 4, 2/89
This report has not been reviewed to determine
whether it contains patentable subject matter, nor
has the accuracy of its information or conclusions
been evaluated. Accordingly, the report is not
available to the public and its distribution is
limited to advisors and participants in the Eulerian
Model Evaluation Field Study for the sole purpose of
evaluating its progress and future course. The
Electric Power Research Institute assumes no
liability for the accuracy of the report's contents.
ACKNOWLEDGMENTS
The efforts of the other members of the Project Management
Group (Messrs. Barnes, Jansen, Lusis, and Puckett) in supplying
information and in reviewing various draft manuscripts of
this plan are gratefully acknowledged. Without their support
and the timely response of their staffs and contractors to
information requests, completion of this plan would not have
been possible.
PREFACE
The purpose of this project plan is twofold. The first is to serve
as a general source of guidance to the Project Management Group (PMG)
and its technical oversight Teams in their quest for a successful
evaluation, an outcome that depends critically on the development or
acquisition of well-defined evaluation methods, observational data of
known uncertainty, and the ability to interpret the results in a
meaningful way. The
second is to provide a framework for consolidating the activities
of the individual participants in the bilateral acid deposition
model evaluation study into a cohesive whole.
The field study components, in particular those relating to the
surface network, of the overall model evaluation effort are more
thoroughly described in this plan than are the procedures for
evaluating the Eulerian models. This is a consequence of the fact
that the evaluation procedures were still evolving from concepts
to detailed implementation plans over the period this document was
produced.
Representatives of the participating organizations* have agreed
that the following principles should guide the PMG:
o Each measurement activity will be operated according to a
comprehensive quality assurance plan.
* Atmospheric Environment Service of Environment Canada, Electric
Power Research Institute, U.S. Environmental Protection Agency,
Florida Electric Power Coordinating Group, Ontario Ministry of
the Environment
o Procedures will be developed and adopted by the
participants that will ensure, to the extent practicable,
the comparability of measurement methods.
o All activities related to model evaluation will be
coordinated among participants.
The framework will be assembled by describing the genesis of the
model evaluation study, what data each of the participants is
collecting to support the model evaluation, what the quality
objectives are for the data, how those objectives will be
achieved, where the data will reside, and how the model evaluation
will be carried out.
It is hoped that implementation of this plan will contribute to
achieving a scientifically credible and technically defensible
model evaluation.
TABLE OF CONTENTS
Page
PREFACE ii
LIST OF TABLES vi
LIST OF FIGURES vii
1. BACKGROUND 1-1
1.1 Development of ADOM and RADM 1-1
1.2 Commitment to Model Evaluation 1-3
1.3 Types of Model Evaluation 1-4
1.4 Field Study Planning 1-5
2. ORGANIZATION 2-1
2.1 Overall Study Organization 2-1
2.2 Model Evaluation Team Support Organization 2-1
3. OBJECTIVES 3-1
3.1 Project Management Group 3-1
3.2 Technical Oversight Teams 3-2
3.2.1 Operational and diagnostic measurements 3-2
3.2.2 Emissions inventories 3-3
3.2.3 Model evaluation 3-3
4. DATA QUALITY OBJECTIVES 4-1
5. DELIVERABLES AND SCHEDULE 5-1
5.1 PMG 5-1
5.2 Technical Oversight Teams 5-1
5.2.1 Operational measurements 5-1
5.2.2 Diagnostic measurements 5-1
5.2.3 Emissions inventories 5-5
5.2.4 Model evaluation 5-5
6. AEROMETRIC AND PRECIPITATION MEASUREMENTS 6-1
6.1 Field Measurements 6-1
6.1.1 EPA: ACID-MODES 6-10
6.1.2 OME: APIOS 6-12
6.1.3 AES: CAPMoN, enhanced chemistry, aircraft 6-17
6.1.4 EPRI: OEN 6-21
6.1.5 FCG: FADMP 6-21
6.1.6 Complementary programs 6-21
6.2 Emission Inventories 6-27
6.3 Data Base Management 6-28
6.4 Methods Characterization 6-29
6.5 Quality Assurance Auditing and Corrective Action 6-29
6.6 Inter-network Comparisons 6-36
6.6.1 Colocation of field measurement systems 6-37
6.6.2 NWRI QC comparison on precipitation samples 6-38
6.6.3 Filter pack testing 6-39
6.6.4 AES/EPA airborne measurements comparisons 6-39
-------
PMG PROJECT PLAN
Ver. 4, 2/89
TABLE OF CONTENTS (Continued)
Page
6.7 Intra-network Colocation 6-39
6.8 Common Filter and TFR Supplier 6-40
6.9 Composite Data Archive 6-40
6.10 Individual Network Data Archives 6-42
7. EMISSIONS 7-1
8. MODEL EVALUATION PROTOCOLS 8-1
8.1 Operational Evaluation 8-2
8.2 Diagnostic Evaluation 8-4
8.3 How Models Will be Run to Obtain Averages 8-5
9. REFERENCES 9-1
APPENDICES A-1
A. PMG Charter A-2
B. Pertinent Quality Assurance Plans A-6
LIST OF TABLES
Table Page
1-1 Planning and Design Meetings 1-8
2-1 External Review Panel 2-6
4-1 Data Quality Objectives 4-2
Air Quality 4-2
Precipitation Chemistry 4-4
Meteorology 4-5
5-1 Schedule 5-2
6-1 Model Evaluation Field Study Site Locations 6-3
APIOS (OME) 6-3
CAPMoN (AES) 6-5
OEN (EPRI) 6-6
ME-35 (EPA) 6-7
EPA Optional and Supplementary and TVA Sites 6-8
EPA Gradient Resolution Network (GRAD) 6-8
EPA Sub-grid Variability Network (VAR) 6-9
FADMP (FCG) 6-9
6-2 ME-35 Measurement Techniques 6-11
6-3 Measurement Techniques During Intensives 6-13
EPA 6-14
AES Ground-based Measurements at Egbert 6-14
Additional AES Measurements at Egbert 6-15
OME Ground-based Measurements at Dorset 6-16
6-4 APIOS Measurement Techniques 6-18
6-5 CAPMoN Measurement Techniques 6-19
6-6 Airborne Measurements to be Taken by AES 6-20
6-7 OEN Measurement Techniques 6-22
6-8 FADMP Measurement Techniques 6-24
6-9 Georgia Tech Intensive Measurements 6-26
6-10 Methods Performance Characterization 6-30
Laboratory Tests 6-30
Field Tests 6-32
References 6-33
6-11 Filter Specifications 6-41
6-12 Data Archive Contents 6-43
LIST OF FIGURES
Figure Page
2-1 Model Evaluation Organization 2-2
2-2 Model Evaluation Team Support Organization 2-3
6-1 Surface Network Sites 6-2
Section 1
BACKGROUND
This section provides a brief history of the events that have
culminated in the regional Eulerian model evaluation study
described in this document. It begins with a description of why
comprehensive acid deposition models have been developed. This is
followed by a statement of the rationale underlying our conviction
that it is necessary to thoroughly evaluate the performance of
these models. Different approaches to model evaluation are then
described. The section ends with a chronology of the more
significant steps that have been taken in planning the study.
1.1 Development of ADOM and RADM
The atmospheric deposition of acidic materials in precipitation,
gases and particles can damage sensitive components of terrestrial
and aquatic ecosystems. The processes involved in converting
gaseous emissions to acids and their salts, and in transporting
and depositing them are so complex as to defy simple
interpretation based on field measurements, no matter how
carefully made. What it takes, in principle, to predict reliably
how much emitted material will be deposited and where, is a
thorough understanding of the relevant processes and their
embodiment in computer simulation models. This predictive ability
is necessary if cost effective measures are to be taken to protect
sensitive ecosystems by selectively controlling the emission of
acid precursors.
Mathematical models that incorporate our present understanding of
the governing processes (e.g., horizontal and vertical transport,
gas phase chemistry, scavenging and subsequent chemical reactions
in clouds, and wet and dry deposition) have been, and continue to
be, developed to fill this need. However, some of these models do
not capture the higher order complexity of the chemical processes
involved. Rather, they treat all processes in a simple first-
order way. This type of model has been rejected by many acid
deposition researchers as being an unreliable tool for predicting
deposition fields from arbitrary emission fields because it does
not capture the nonlinearities inherent in the natural system that
can give deposition responses that are not proportional to
emissions changes. Although it may do a reasonable job of
reproducing present deposition patterns given present emissions,
there is concern as to whether this type of model can produce
realistic deposition patterns given different emissions.
What is needed are models that represent the higher order science
in as complete a fashion as is practicable within the constraints
of present knowledge and modeling resources. Two of these higher
order, comprehensive models that are under development in North
America are the Regional Acid Deposition Model (RADM) and the Acid
Deposition and Oxidant Model (ADOM), respectively designated by
the U.S. and Canadian governments as potential emission control
policy assessment tools. RADM has been developed under the aegis
of the American National Acid Precipitation Assessment Program
(NAPAP). ADOM development was begun by the Ontario Ministry of
the Environment and the Atmospheric Environment Service,
Environment Canada, with supplementary support subsequently
provided by the Federal Republic of Germany's Umweltbundesamt and
the Electric Power Research Institute.
These models are intended to provide a surrogate reality of such
fidelity that legislators, regulators, and those whose discharges
to the atmosphere are regulated will endorse their use for this
purpose. Such acceptance by the community at large will make them
credible tools for exploring emissions change scenarios and
assessing source-receptor relationships.
1.2 Commitment to Model Evaluation
Although the RADM and the ADOM are the focus of the model
evaluation effort described here, other models will almost
certainly be evaluated once the proper tools (data and methods)
are available. Model evaluation is viewed by the participants as
an essential element in the process that begins with model
development and ends with its application, because it is the step
that demonstrates how well the model mirrors the natural system.
Further, the economic and scientific motivations underlying this
demonstration are substantial.
Managerial and technical approaches for the regional Eulerian
model evaluation and field study have been proposed earlier
(Durham et al., 1986) and serve as the basis for much of this
plan.
1.3 Types of Model Evaluation
Although the lines of distinction are not always clearly drawn,
four broad categories of model evaluation can be defined:
mechanistic, diagnostic, operational, and comparative.
Mechanistic evaluations can be conducted by examining in detail
the fidelity of process representations in the model code with
respect to the best understanding available of the governing
mechanisms. They can also involve an analysis of how well
specific parameterizations represent more mathematically exact
process representations. They answer the question, "Is the
science correctly represented?"
Diagnostic evaluations would not normally involve the same level
of detail as mechanistic ones. Rather, they examine the response
of model outputs to a wide range of model inputs to see how well
the model mimics perceived reality as represented by theory and
careful observation. One subset of this type of evaluation is the
familiar sensitivity analysis, wherein the relative response of a
specific output to changes in different inputs, or combinations of
inputs, is studied. Another would be comparison of the serial
changes in species' compositions predicted by the gas phase
chemistry module with those observed in smog chamber experiments.
As used in the present context, diagnostic evaluations rely in
large part on time-resolved (less than 24 hours), three-
dimensional observational data. They answer the question, "Do the
parts of the model appear to be working correctly?"
A model's performance is operationally evaluated on the basis of
its ability to simulate observations of target variables (such as
sulfate or nitrate deposition in precipitation) averaged over a
given period, generally several days to a year. (Because the
models are not intended to capture the fine-scale spatial and
temporal variability of rainfall and meteorological variables,
there is little point in operationally evaluating the models on a
shorter term). Measurement data from the monitoring networks
described in this plan will be largely used for this type of
evaluation. Over the range of conditions tested, operational
evaluation answers the question, "Is the model giving the right
answers?"
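The statistics to be used are defined by the evaluation protocols of Section 8. Purely as an illustrative sketch of the kind of comparison an operational evaluation makes, with hypothetical numbers standing in for real network and model output, period-averaged predictions can be compared against observations as follows:

```python
# Illustrative sketch only; the evaluation protocols (Section 8) define
# the actual statistics. Numbers are hypothetical seasonal sulfate wet
# deposition values (kg/ha) at five monitoring sites.
def mean_bias(pred, obs):
    """Average of (prediction - observation) over sites."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def fractional_bias(pred, obs):
    """Normalized difference of the mean prediction and mean observation."""
    mp = sum(pred) / len(pred)
    mo = sum(obs) / len(obs)
    return 2 * (mp - mo) / (mp + mo)

observed = [8.2, 6.5, 9.1, 5.4, 7.8]
predicted = [7.0, 6.9, 8.0, 6.1, 7.2]

print(f"mean bias: {mean_bias(predicted, observed):+.2f} kg/ha")
print(f"fractional bias: {fractional_bias(predicted, observed):+.3f}")
```

A near-zero bias over the range of conditions tested is the quantitative counterpart of "the model is giving the right answers."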
In a comparative evaluation the performance of a model or its
parts is compared with that of another model for an identical set
of inputs (to the degree allowable by the models' formulations).
It answers the question, "If I use this model, will I get the same
results as if I had used that model?"
The data expected from the field study covered by this project plan
are intended primarily for operational and diagnostic evaluations.
1.4 Field Study Planning
A series of planning meetings and workshops, many of them jointly
sponsored, has been conducted to define goals and methods for the
model evaluation. They are listed in Table 1-1, together with
subsequent pertinent meetings.
At the Quality Assurance Workshop, held 11-13 June 1986 in
Toronto, the attendees recommended the establishment of a Quality
Assurance Management Committee, composed of a representative from
each of the sponsoring organizations. This recommendation was
implemented and a charter for the committee was subsequently
drawn up and endorsed by each of the organizations.
After several meetings had been convened to coordinate
preparations for the field study, it became apparent to the QAMC
members that activities other than field measurements, in
particular emission inventories and model evaluation protocols,
were equally essential to the model evaluation process, but were
not receiving the same level of coordinated attention. The QAMC
asked the Eulerian Modeling Bilateral Steering Committee (EMBSC)
to consider this problem and to make a recommendation for
addressing it. Its recommendation was to rename the QAMC the
Project Management Group (PMG), to reflect a broader set of
responsibilities, and to set up three subsidiary teams to oversee
activities on the topics of measurements, emissions, and model
evaluation.
The recommendations of the EMBSC were adopted with slight
modification: the Measurements Team was split into two, one each
for operational measurements and diagnostic measurements. The PMG
felt that the distinction between routine, surface-based
(operational) and research-grade, airborne and ground (diagnostic)
measurements was sufficiently great to warrant separate teams.
The initial meetings of these groups are listed in Table 1-1. The
meetings will continue at approximately quarterly intervals until
each group's component of the model evaluation effort is completed.
Table 1-1
PLANNING AND DESIGN MEETINGS FOR THE
MODEL EVALUATION EFFORT

DATE       TOPIC                               LOCATION
30 OCT 84  EMBSC                               Washington, D.C.
19 FEB 85  EMBSC                               Toronto, Ont.
MAY 85     Technical Committee Workshop        RTP, NC
           on Field Study Plan
NOV 85     EPRI OEN Workshop                   Seattle, WA
FEB 86     Workshop on Model Evaluation        Raleigh, NC
           Protocol
19 FEB 86  EMBSC                               Toronto, Ont.
MAR 86     Workshop on Field Study Design      Seattle, WA
JUN 86     Workshop on Quality Assurance       Toronto, Ont.
25 AUG 86  EMBSC                               Toronto, Ont.
OCT 86     Methods Reconciliation Workshop     Toronto, Ont.
MAY 87     RADM Peer Review                    Raleigh, NC
JUL 87     QAMC                                Chicago, IL
22 JUL 87  EMBSC                               Washington, D.C.
AUG 87     Workshop on Diagnostic Evaluation   Raleigh, NC
OCT 87     PMG                                 Chicago, IL
NOV 87     PMG and Team Conveners              Chicago, IL
FEB 88     PMG and Teams                       RTP, NC

Subsequent meetings of the PMG and teams have been convened
approximately quarterly.
Section 2
ORGANIZATION
2.1 Overall Study Organization
The organizational structure of the binational acid deposition
model evaluation effort is shown in Figure 2-1. Top level guidance
and liaison among high-level managers of the participating
organizations is provided by the EMBSC. Reporting to the EMBSC,
the members of the PMG are managers responsible within their
organizations for the measurement networks and/or for their model
evaluation efforts. The Team members, in turn, are managers
within their organizations of, or individuals with expertise in,
the appropriate program component.
The evolution of this organizational structure has been described
in Section 1. The structure reflects the breadth and scope of the
agencies and technical disciplines involved in planning,
implementing, and completing this very complex undertaking.
The responsibilities of the PMG and the Teams are described in
Sections 3 (objectives) and 5 (deliverables).
2.2 Model Evaluation Team Support Organization
The Model Evaluation Team has set up an organizational structure
involving checks, feedbacks, high level oversight, and extensive
interactive peer review for conducting the performance evaluations
of the models. The structure is illustrated in Figure 2-2 and is
EULERIAN MODELING BILATERAL STEERING COMMITTEE
J. Durham, EPA
G. Foley, EPA
R. Perhac, EPRI
E.W. Piche, OME
J.W.S. Young, AES

PROJECT MANAGEMENT GROUP
D.A. Hansen, EPRI**
J.J. Jansen, FCG
M. Lusis, OME
K.J. Puckett, AES
F.A. Schiermeier, EPA

OPERATIONAL MEASUREMENTS TEAM
N. Bowne, ME-35 PI
W. Chan, OME
D. Daly, ADS
E. Edgerton, ESE
J. Kruse, OEN PI
S. McNair, CAPMoN
F. Pooler, EPA
N. Reid, OME
R. Vet, AES**
A. Olsen, ADS DBM

EMISSIONS TEAM
S. Heisler, ENSR
M. Hodges, ESE
N. Kaplan, EPA**
J. McManus, AEP
J. Novak, EPA
D. Pahl, EPA
F. Vena, Env. Can.
D. Yap, OME

DIAGNOSTIC MEASUREMENTS TEAM
J. Boatman, NOAA
J. Bottenheim, AES
N. Bowne, ENSR
J. Ching, EPA**
K. Demerjian, SUNYA
J. Hales, PNL
G. Isaac, AES
L. Lindsey, PNL
A. Olsen, ADS DBM
W. Seiler, FRG
C. Spicer, Battelle

MODEL EVALUATION TEAM
R. Barchet, PNL*
J. Chang, SUNYA*
R. Dennis, EPA
T. Lavery, ESE
D.A. Hansen, EPRI
P.K. Misra, OME**
J. Novak, EPA
A. Olsen, ADS DBM
K. Puckett, AES
A. Venkatram, ERT*

* Ex officio
** Chairman

Figure 2-1. Model Evaluation Organization
[Diagram: the Model Evaluation Team, advised by the External Review
Panel, passes the evaluation protocol to the Protocol Implementation
Group. The Protocol Implementation Group draws on the archived field
data, requests new runs from the model developers, and passes results
to the Assessment and Interpretation Group.]

Figure 2-2. Model Evaluation Team Support Organization
designed to provide a highly visible, scientifically credible
evaluation process, in which all sponsors can participate.
Once the model evaluation protocol(s) has been completed under the
aegis of the Model Evaluation Team (MET), it will be implemented
by the Protocol Implementation Group (PIG), which will most likely
be made up of computationally oriented staff from a contractor.
The PIG will treat the protocol as a set of instructions that will
be carried out as written. It will draw on the field data archive
as needed to meet the data requirements of the protocol. It will
interact with the modelers to exercise the models as specified in
the protocol. As the protocol itself is exercised, the results
of the observations-predictions comparisons, sensitivity analyses
and other possible activities specified by it will be fed by the
PIG to the Assessment and Interpretation Group (AIG). The PIG is
viewed as something of a buffer between the AIG and the model
developers, reducing their interaction and the perception of those
outside the process that the modelers are overly influencing any
conclusions drawn by the AIG.
The AIG will have the responsibility for interpreting the results
and producing evaluation reports: initially a preliminary report to
NAPAP, in time for incorporation in the 1990 final assessment
report, and finally a report on the completed
operational and diagnostic evaluations. The composition of the
AIG is not settled, but will probably be made up of contractor
staff supported by external expert consultants. The AIG will
likely be funded and managed in large part by the U.S. EPA, the
staff of which will frequently consult with the MET.
An international group of highly respected scientists (see Table
2-1), expert in various aspects of model evaluation, makes up
External Review Panel (ERP). They have been invited by the EMBSC
on behalf of the MET to serve on this panel. They will work
closely with the MET, not only reviewing the model evaluation
protocol before its implementation but also reviewing the interim
and final reports passed to them by the Team from the AIG. It is
anticipated that the ERP will make recommendations from time to
time for course corrections that may involve protocol
modifications or additional model runs. These recommendations
will be channelled through the MET.
Table 2-1
External Review Panel
Dr. Peter Bloomfield, North Carolina State University
Dr. William Chameides, Georgia Institute of Technology
Dr. Anton Eliassen, the Norwegian Meteorological Institute
Dr. Fred Fehsenfeld, NOAA Aeronomy Laboratory
Dr. Bernard Fisher, Central Electricity Research Laboratories, UK
Dr. Dean Hegg, University of Washington
Dr. Dieter Kley, Institut fur Chemie, Julich, FRG
Dr. Harold Schiff, York University, Canada
Dr. Ted Yamada, Los Alamos National Laboratory
Section 3
OBJECTIVES
3.1 Project Management Group
The objective of the PMG is to ensure that the Eulerian acid
deposition models are evaluated:
o according to a well-defined protocol,
o using input and evaluative data of defined precision,
accuracy, representativeness, and comparability,
o in such a way that uncertainties in model outputs can be
distinguished from those in the input and evaluative data,
and
o in terms of established (to the degree possible)
performance criteria.
The PMG will pursue this objective by:
o coordinating activities of the member organizations
related to model evaluation, partly through
approximately quarterly meetings;
o soliciting suggestions from the Eulerian Model Bilateral
Steering Committee (EMBSC) when problems arise which are
of interest to the PMG and require resolution at a higher
management level;
o establishing four teams to assist the PMG by providing
technical oversight of Study-related activities on the
topics of operational (routine monitoring) measurements,
diagnostic (airborne and enhanced chemistry site)
measurements, emission inventories, and model evaluation;
o meeting approximately quarterly with the team chairs to be
briefed on team activities;
o providing for the review and approval of the project
quality assurance plans for each of the sponsors'
networks;
o encouraging standardization of methods and protocols;
o encouraging member agencies to practice active quality
control; and
o specifying common data base characteristics and reporting
protocols.
3.2 Technical Oversight Teams
The Teams will provide a broad base of technical expertise and
management skill to assist in meeting the PMG's objective. They
will also be guided by specific objectives developed by the PMG in
consultation with each team.
3.2.1 Operational and diagnostic measurements teams. The
objective of both of these teams will be to produce a standardized
data set of defined precision, accuracy, representativeness, and
comparability* for the model evaluation program through the
coordination and oversight of the measurements, data management
and quality assurance programs of the individual participating
organizations.
*The terms defining data quality are discussed in Section 4.
The Operational Measurements Team will pursue its objective by:
o ensuring that the results of quality control studies are
assessed and that recommended corrective actions are
taken;
o reviewing and recommending for diagnostic studies the
methods of establishing estimates of bias and variance;
o reviewing and recommending quality assurance and quality
control methods for model development and evaluation; and
o designing inter-network and inter-laboratory studies
of uncertainties.
3.2.2 Emissions inventories. The objective of this team will be
to produce a standardized data set of defined uncertainty through
the coordination and oversight of NAPAP, EPRI, and Canadian
emission inventory acquisition, data management and quality
assurance programs.
3.2.3 Model evaluation. This team's objective is to ensure that
the model evaluation methods are consistent with the model design
characteristics and appropriate in the context of their
application, and that they can be used objectively to produce
results that are scientifically defensible.
Section 4
DATA QUALITY OBJECTIVES
The data quality objectives stem directly from the PMG's
objective. They will be achieved through implementation and
execution of this plan and the QA plans of the participating
organizations listed in the Appendix. These plans should be
consulted for details.
Quantitative objectives may be stated for the precision, accuracy,
lower quantifiable limits and completeness of each measured
observable. Ideally these would be specified in advance by the
model evaluators, based on their perception of the data quality
required for them to do an adequate job. However, since no
comparable specifications have ever been formulated, such an
expectation is unrealistic. Therefore, these data quality
objectives will be based instead on what are reasonable
expectations for the selected measurement methods under carefully
controlled field and laboratory conditions and on less
quantitative judgements of the methods' ability to provide data
with quality commensurate with that required by the evaluation
protocol. They are given for precision, accuracy, lower
quantifiable limit, and completeness in Table 4-1.
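For concreteness, the conventions used in Table 4-1 can be sketched as follows. The replicate blank values and the specific +/-15%/0.4 ug/m3 precision pair below are hypothetical illustrations, not network data:

```python
# Sketch of the Table 4-1 conventions: lower quantifiable limits are
# taken as 3x and 10x a method's standard deviation (SD), and a
# precision objective is "the larger of" a relative and an absolute
# term. The replicate blank values below are hypothetical.
import statistics

def lower_quantifiable_limits(replicate_blanks):
    """Return (3 x SD, 10 x SD) from replicate low-level measurements."""
    sd = statistics.stdev(replicate_blanks)
    return 3 * sd, 10 * sd

def precision_objective(concentration, relative=0.15, absolute=0.4):
    """Allowed imprecision at a given concentration (ug/m3): the larger
    of +/-15% or 0.4 ug/m3, the pattern used for several of the air
    quality observables in Table 4-1."""
    return max(relative * concentration, absolute)

blanks = [0.05, 0.12, 0.08, 0.10, 0.06, 0.09]  # ug/m3, hypothetical
lql3, lql10 = lower_quantifiable_limits(blanks)
print(f"lower quantifiable limits: {lql3:.2f} and {lql10:.2f} ug/m3")
print(f"precision objective at 10 ug/m3: +/-{precision_objective(10.0):.1f} ug/m3")
```

The "larger of" form keeps the objective relative at high concentrations while recognizing an absolute noise floor near the quantifiable limit.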
Although numerical measures of data comparability and
representativeness may, in principle, be developed, to do so a
priori appears to be impractical at this juncture. They will
Table 4-1
DATA QUALITY OBJECTIVES FOR
PRECISION, ACCURACY, LOWER QUANTIFIABLE LIMIT
AND COMPLETENESS

AIR QUALITY

Observable (Interval)         Method(s) (Units)              Expec. Upper Range
Particulate Mass (24 Hr)      FP/G, FPC/G (ug/m3)            50, 100
Particulate Sulfate (24 Hr)   FP/AC, FP/IC (ug/m3)           50
Particulate Nitrate (24 Hr)   FP/AC, FP/IC (ug/m3)           20
Particulate Ammonium (24 Hr)  FP/AC (ug/m3)                  20
Sulfur Dioxide (24 Hr)        FP/AC, FP/IC, TFR/IC,          200 (FP/AC),
                              TFR/AC (ug/m3)                 20 (TFR/IC)
Nitric Acid (24 Hr)           FP/AC, FP/IC, TFR/AC (ug/m3)   20 (TFR/AC)
Ammonia (24 Hr)               FP/AC (ug/m3)
Ozone (1 Hr)                  Photometry (ug/m3)             1000
Nitrogen Dioxide (1 Hr)       Luminol CL (ug/m3)             100
Nitrogen Dioxide (24 Hrs)     TEA FP/IC (ug/m3)              20

The completeness objective is 90% for each observable. Legible
precision objectives (stated as the larger of a relative and an
absolute term) include 3 ug/m3 for particulate mass, +/-15% or
0.03-0.4 ug/m3 for the other particulate species and trace gases, and
+/-10% or 10 ug/m3 for ozone; legible accuracy objectives are +/-10%.
For ozone, the lower quantifiable limits (3 x SD and 10 x SD) are 8
and 25 ug/m3. [The remaining precision, accuracy, and lower
quantifiable limit entries are not legible in the source.]
Table 4-1 (Continued)
DATA QUALITY OBJECTIVES FOR
PRECISION, ACCURACY, LOWER QUANTIFIABLE LIMIT
AND COMPLETENESS

AIR QUALITY

Observable       Method     Expec.  Precision                Lower Quantifiable
(Interval)       (Units)    Upper   (The Larger   Accuracy   Limits             Complete-
                            Range   of)                      3 x SD   10 x SD   ness
PAN (24 Hr)      FS/IC      40      +/-15% or     +/-10%     4        14        90%
                 (ug/m3)            4 ug/m3                  (ug/m3)  (ug/m3)
Hydrogen         E/F
Peroxide (1 Hr)  (ppb)
Hydrocarbons     GC/FID
                 (ppbc)
Aldehydes        Der/HPLC
                 (ppb)

FP/AC = Filter pack, automated colorimetric analysis
FP/IC = Filter pack, ion chromatographic analysis
Luminol CL = Luminol chemiluminescence
TFR/IC = Transition flow reactor, ion chromatographic analysis
TFR/AC = Transition flow reactor, automated colorimetric analysis
FS/IC = Filter sampler, ion chromatographic analysis
FP/G = Filter pack, gravimetry
FPC/G = Fine particle collector, gravimetry
PAN = Peroxyacetyl nitrate
GC/FID = Gas chromatography analysis with flame ionization detection
Der/HPLC = Derivatization with high performance liquid chromatographic
analysis
Table 4-1 (Continued)
DATA QUALITY OBJECTIVES FOR
PRECISION, ACCURACY, LOWER QUANTIFIABLE LIMIT
AND COMPLETENESS

PRECIPITATION CHEMISTRY (24 Hrs)

Observable     Method        Expec.  Precision   Accuracy  Lower Quantifiable  Complete-
               (Units)       Upper   (The                  Limits              ness
                             Range   Larger of)            3 x SD    10 x SD
Precipitation  Rain          10,000  +/-10% or   +/-10%    8 gm      24 gm     90%
Amount         Collector             8 gm
               (grams)
Field pH       pH Meter      14      +/-0.04     +/-0.05   NA        NA        90%
               (pH units)            pH units    pH units
Field          Cond. Mtr.    NA      0.2                   0.3       1         90%
Conductance    (umho/cm)             umho/cm               umho/cm   umho/cm
Lab pH         pH Meter      14      +/-0.04     +/-0.05   NA        NA        90%
               (pH units)            pH units    pH units
Lab            Cond. Mtr.    NA      0.2                   0.3       1         90%
Conductance    (umho/cm)             umho/cm               umho/cm   umho/cm
Sulfate        IC (umol/l)   100     0.2                   0.1       0.4       90%
Nitrate        IC (umol/l)   50      0.2                   0.1       0.4       90%
Chloride       IC (umol/l)   8       0.1                   0.1       0.3       90%
Ammonium       AC (umol/l)   10      0.6                   0.5       1.4       90%
Sodium         AA (umol/l)   8       0.5                   0.4       1.1       90%
Potassium      AA (umol/l)   4       0.3                   0.2       0.6       90%
Calcium        ICAPES        9       0.3                   0.2       0.6       90%
               (umol/l)
Magnesium      ICAPES        4       0.08                  0.07      0.2       90%
               (umol/l)

(Ion precision and lower quantifiable limit entries are in umol/l.)

IC = Ion chromatography
AC = Automated colorimetry
AA = Atomic absorption spectroscopy
ICAPES = Inductively coupled argon plasma emission spectroscopy
-------
Section 4
Ver. 4, 2/89
Table 4-1 (Continued)
DATA QUALITY OBJECTIVES FOR
PRECISION, ACCURACY, LOWER QUANTIFIABLE LIMIT
AND COMPLETENESS

METEOROLOGY

                                     Range          Precision                Lower Quantifiable
Observable            Method         Upper   Lower  (The Larger              Limits            Complete-
(Interval)            (Units)                       of)         Accuracy     3 x SD   10 x SD  ness

Precipitation         Rain gauge     NA             +/-0.13     0.025 cm     0.025 cm 0.076 cm 90%
  amount (1 Hr)       (cm)
Wind speed (1 Hr)     Anemometer     50             1 mph       1 mph        NA       NA       90%
                      (mph)
Wind direction        Wind vane      540            NA          +/-10 deg    NA       NA       90%
  (1 Hr)              (deg)
Temperature (1 Hr)    Thermistor     122     -40    NA          NA           NA       NA       90%
                      (deg F)
Dew point (1 Hr)      LiCl           104     -22    NA          NA           NA       NA       90%
                      (deg F)
Barometric pressure   Capacit.       31      22     NA          +/-0.05 in   NA       NA       90%
  (1 Hr)              (in Hg)
probably be developed a posteriori based on analysis of field
and laboratory measurement and quality control data. In the
meantime, the PMG will attempt to ensure that the data are as
comparable and representative as possible by taking the steps
discussed below.
Representativeness will be judged both temporally and spatially.
With only two years of data expected from the model evaluation
field program, a rigorous determination of temporal
representativeness will probably not be possible for all measured
observables. However, inferential determinations can be made by
comparison with those observables for which longer term records
exist, in particular meteorological variables. The actual
comparison methods remain to be defined by the measurements teams.
Spatial representativeness can be assessed in at least two ways.
One will be based on the data collected in the VAR network and
will give insight into sub-grid cell variance. The other will be
based on an analysis of paired-station covariance, using data from
the combined networks. Higher covariance associated with stations
having smaller separations would indicate lack of an overriding
local source or topographical influence and therefore a higher
probability of the stations' spatial representativeness.
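The paired-station covariance analysis described above could be sketched as follows. This is an illustrative outline only, not the project's prescribed procedure: the concentration series are hypothetical, and the correlation and great-circle-distance functions are generic implementations used here just to show how correlation could be paired with station separation (the coordinates are those of the Egbert and State College colocation sites).

```python
import math

def correlation(x, y):
    """Pearson correlation of two equal-length observation series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def separation_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two stations, in km (haversine)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical daily sulfate series (umol/l) at a pair of stations:
site_a = [2.1, 3.4, 1.8, 4.0, 2.6]
site_b = [2.0, 3.1, 1.9, 3.7, 2.8]
r = correlation(site_a, site_b)
d = separation_km(44.23, 79.78, 40.78, 77.93)  # Egbert vs. State College
```

Plotted over many station pairs, high correlation at small separations would support the spatial-representativeness argument made above.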
Comparability will be established in several ways: by comparison
of quality control data among networks, by inter-laboratory
comparison studies involving the interchange of samples or the
4-6
-------
Section 4
Ver. 4, 2/89
challenging of samplers with common test atmospheres, by
comparison of measurement data from the Egbert and Penn State
inter-network colocation stations, and by comparison of standard
operating procedures among networks. Development of procedures
for implementing these comparisons will be the responsibility of
the measurements teams.
Precision will be a measure of the reproducibility of
measurements. Data from colocated samplers, replicate analyses,
duplicate samples and repeated span checks can be used to measure
reproducibility.
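As a sketch of how colocated-sampler or duplicate-sample data could yield a precision estimate, the pooled standard deviation over measurement pairs is one common choice; the pairing formula and sample values below are illustrative assumptions, not the plan's prescribed method.

```python
import math

def colocated_precision(pairs):
    """Pooled standard deviation from colocated-sampler pairs.

    Each duplicate measurement pair (x1, x2) of the same quantity
    contributes a variance estimate (x1 - x2)**2 / 2; pooling over
    n pairs gives a precision estimate with n degrees of freedom.
    """
    n = len(pairs)
    return math.sqrt(sum((x1 - x2) ** 2 / 2 for x1, x2 in pairs) / n)

# Hypothetical colocated 24-hr sulfate samples (umol/l):
pairs = [(2.1, 2.0), (3.4, 3.2), (1.8, 1.9), (4.0, 4.1)]
sd = colocated_precision(pairs)
```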
Accuracy will be determined by comparison of measurements against
authoritative standards or, in their absence, against arbitrary
standards. In the latter case, the determination will be referred
to explicitly as "relative accuracy."
Lower quantifiable limit will be determined as the minimum
concentration that a measurement process can distinguish at a
specified confidence level above a background value. The
procedure for determining an LQL may differ from observable to
observable. Its value may vary with time, as the variables
involved in its determination may not be constant.
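A minimal sketch of the 3 x SD and 10 x SD convention used in the Table 4-1 LQL columns, with hypothetical replicate blank values; the sample-standard-deviation form shown here is an assumption, since the plan leaves the exact procedure to each measurements team.

```python
import math

def lql(blank_values, k=3):
    """Lower quantifiable limit as k times the standard deviation of
    low-level (blank or background) replicate measurements; the plan's
    tables report both k = 3 and k = 10 columns."""
    n = len(blank_values)
    mean = sum(blank_values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in blank_values) / (n - 1))
    return k * sd

# Hypothetical replicate blank analyses (umol/l):
blanks = [0.02, 0.05, 0.03, 0.06, 0.04]
three_sd = lql(blanks, k=3)
ten_sd = lql(blanks, k=10)
```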
Completeness will be determined as the percentage of the possible
reported values that are actually validated and entered into the
evaluation data sets. A common set of data validation criteria
will be established by the measurements teams.
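The completeness computation defined above reduces to a single ratio; the station counts below are hypothetical, and the 90% threshold is the objective stated in Table 4-1.

```python
def completeness(n_valid, n_possible):
    """Percentage of possible reported values that pass validation and
    enter the evaluation data set."""
    return 100.0 * n_valid / n_possible

# Hypothetical: a station reporting 24-hr samples over a 30-day month,
# with 28 samples surviving validation:
pct = completeness(28, 30)
meets_dqo = pct >= 90.0  # the 90% completeness objective in Table 4-1
```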
Section 5
DELIVERABLES AND SCHEDULE
5.1 PMG
Deliverables from the PMG include the Project Plan and semi-annual
(or as requested) briefings to the EMBSC on the project status.
The project schedule is shown in Table 5-1.
5.2 Technical Oversight Teams
5.2.1 Operational measurements. The operational measurements
team will be responsible for producing:
1. a standardized data set from the surface networks for use
in the model evaluation;
2. evidence of the comparability of data sets from the
contributing networks;
3. quality-assured data on a schedule commensurate with the
needs of model evaluators and preliminary, screened data
from intercomparison sites within 3 months so that the
comparability of data among networks may be assessed;
4. a QA Plan for the operational networks and evidence of
its application;
5. quarterly reports to the PMG until approximately August
1988 and semi-annual reports thereafter.
5.2.2 Diagnostic measurements. The diagnostic measurements team
will be responsible for producing:
1. a standardized data set from the airborne measurements,
Table 5-1
SCHEDULE
1988

Field Operations
  Snow sampling study
  OEN precipitation chemistry
  OEN pilot study
  ME-35 pilot study
  FADMP pilot study
  Full network operations and U.S. continuous emissions
  Summer intensive (Canada and U.S.)
  Draft report: hourly emissions data base
  Canadian continuous emissions

Quality Assurance
  NWRI sample distribution
  Filter sample exchange (TO BE DETERMINED)
  Colocated measurements
  Field audits (TO BE DETERMINED)

Data Delivery
  FADMP data to ADS
  NAPAP '85 emissions inventory
  Enhanced surface data (TO BE DETERMINED)
  Airborne data (TO BE DETERMINED)

Model Evaluation
  Workshop on measures of model performance
  Draft protocol
  Protocol review by External Rev. Panel
Table 5-1 (Continued)
SCHEDULE
1989

Field Operations
  Full network operations and continuous U.S. emissions
  Workshop: data collectors and modelers

Quality Assurance
  NWRI sample distribution
  Filter sample exchange (TO BE DETERMINED)
  Colocated measurements
  Field audits (TO BE DETERMINED)

Data Delivery to ADS
  Six months network data plus one intensive (U.S., Canada)
  One year network data plus two intensives
  FADMP data
  EPA and EPRI data
  Hourly emissions, first six months

Model Evaluation
  Final protocol document
  Preliminary model evaluations
  Continued model evaluations
Table 5-1 (Continued)
SCHEDULE
1990

Field Operations
  Full network operations and continuous U.S. emissions
  FADMP network operations

Quality Assurance
  NWRI sample distribution
  Filter sample exchange (TO BE DETERMINED)
  Colocated measurements
  Field audits (TO BE DETERMINED)

Data Delivery to ADS
  Routine surface data
  FADMP data
  Enhanced surface data
  Day-specific emissions data set

Model Evaluation
  Continued model evaluations
VAR network, and enhanced chemistry stations;
2. evidence of the comparability of data sets from the
contributing programs;
3. quality-assured data no longer than 6 months after
completion of the measurements;
4. QA plan for the diagnostic measurements and evidence of
its application; and
5. semi-annual reports to the PMG.
5.2.3 Emissions inventories. The emissions inventories team will
be responsible for producing:
1. a standardized emissions data base for use in model
evaluation;
2. evidence of the comparability of the constituent data
sets;
3. quality assured data on a schedule that meets the needs
of the model evaluation team;
4. QA plan for the emissions inventory and evidence of its
application; and
5. semi-annual reports to the PMG.
5.2.4 Model evaluation. The model evaluation team will be
responsible for producing:
1. scientifically defensible model evaluation protocols;
2. establishment of a model evaluation advisory
committee;
3. QA plan for model evaluation process and evidence of its
application as part of the final report on model
evaluation;
4. statement of requirements and schedules for data delivery
for model evaluation; and
5. semi-annual reports to the PMG.
Section 6
AEROMETRIC AND PRECIPITATION MEASUREMENTS
This section describes what and where measurements will be made,
what tests have been conducted to characterize their performance,
what steps will be taken to achieve the data quality objectives,
and how the data will be archived.
6.1 Field Measurements
Observational data are to be collected over a two-year period
beginning in mid-1988 in at least five surface-based, cooper-
atively coordinated, measurement networks (see Figure 6-1). In
the U.S.A., the Environmental Protection Agency (EPA), EPRI, and
the Florida Electric Power Coordinating Group (FCG) will operate
networks, while in Canada the Atmospheric Environment Service
(AES) and the Ontario Ministry of the Environment (OME) will do
likewise. (The door is being left open for participation by
other organizations, provided they meet the standards specified
for ensuring comparability of their measurements with those of
the existing participants.) Participating sites and their
locations are listed in Table 6-1. Sites have been selected with
regard to their freedom from the influence of local emission
sources, their placement with respect to one another to ensure
that important spatial gradients in deposition predicted by the
models can be resolved, and other criteria as enumerated in
planning documents. (See, for example, Operational Evaluation
Network Work Plan, ERT, 1987.)
[Figure 6-1. Locations of the surface measurement networks: OEN (EPRI),
ME-35 (EPA), GRAD (EPA), VAR (EPA), CAPMoN (AES), APIOS (OME), FADMP (FCG).]
Table 6-1
MODEL EVALUATION FIELD STUDY
SITE LOCATIONS

APIOS (OME)

SITE NAME                       NO.   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Longwoods (with AES)            01    42 53      81 29       PC, SO2, SO4, tNO3, RG
Wellesley                       02    43 28      80 46       PC, SO2, SO4, tNO3, RG
Balsam Lake                     03    44 38      78 51       PC, SO2, SO4, tNO3, RG
Dorset                          04    45 13      78 56       PC, SO2, SO4, tNO3, RG,
                                                               O3, NOX, PAN
Charlston Lake                  05    44 30      76 03       PC, SO2, SO4, tNO3, RG
Fernberg                        06    47 50      91 52       PC, SO2, SO4, tNO3, RG
Gowganda                        07    47 39      80 47       PC, SO2, SO4, tNO3, RG
High Falls                      08    46 20      81 33       PC, SO2, SO4, tNO3, RG
Egbert (with AES, EPA, EPRI)    09    44 14      79 47       PC, SO2, SO4, tNO3, RG
State College, PA (with AES,    10    40 47      77 56       PC, SO2, SO4, tNO3, RG
  EPA, EPRI)

Rural Ozone Only

SITE NAME       NO.   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Hawkeye Lake    11    48 40      89 26       O3
Tiverton        12    44 18      81 35       O3
Huron Park      13    43 18      81 30       O3
Thedford        14    43 10      81 51       O3
Parkhill        15    43 10      81 41       O3
Mendaumin       16    42 57      82 12       O3
Merlin          17    42 15      82 13       O3
Long Point      18    42 35      80 23       O3
Simcoe          19    42 51      80 16       O3
Stouffville     20    43 57      78 36       O3
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY
SITE LOCATIONS

APIOS (OME), Continued
Precipitation Chemistry Only*

SITE NAME        NO.   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Melbourne        21    42 47      81 33       PC
N. Easthope      22    43 24      80 53       PC
Raven Lake       23    44 37      78 54       PC
Nithgrove        24    45 12      79 04       PC
Wilmer           25    44 27      76 32       PC
Railton          26    44 23      76 36       PC
Dawson           27    48 38      89 37       PC
Quetico Centre   28    48 45      91 12       PC

PC: Precipitation chemistry: pH, conductivity, sulfate, nitrate,
chloride, ammonium, sodium, potassium, calcium, magnesium
* Data delivery on slower schedule than from sites 1-10.
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY
SITE LOCATIONS

CAPMoN (AES)

SITE NAME                      NO.   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
ELA                            01    49 39      93 43       PC, SO2, SO4, tNO3, O3, RG
Algoma                         02    47 06      84 06       PC, SO2, SO4, tNO3, O3, RG
Bonner Lake                    03    49 23      82 07       PC, SO2, SO4, tNO3, O3, RG
Chalk River                    04    46 04      77 24       PC, SO2, SO4, tNO3, O3, RG
Sutton                         05    45 05      72 42       PC, SO2, SO4, tNO3, O3, RG
Montmorency                    06    47 19      71 09       PC, SO2, SO4, tNO3, O3, PAN, RG
Kejimkujik                     07    44 26      65 12       PC, SO2, SO4, tNO3, O3, RG
Chapais                        08    49 49      74 49       PC, SO2, SO4, tNO3, O3, RG
Egbert (with EPA, EPRI, OME)   09    44 14      79 47       PC, SO2, SO4, tNO3, O3, RG
State College, PA (with EPA,   10    40 47      77 56       PC, SO2, SO4, tNO3, O3, RG
  EPRI, OME)
Longwoods (with OME)           11    42 53      81 29       PC, SO2, SO4, tNO3, O3, RG

PC: Precipitation chemistry: pH, conductivity, sulfate,
nitrate, chloride, ammonium, sodium, potassium, calcium,
magnesium
SO2: Gaseous sulfur dioxide
SO4: Particulate sulfate
tNO3: Gaseous nitric acid plus particulate nitrate
O3: Gaseous ozone
PAN: Gaseous peroxyacetyl nitrate
RG: Rain gauge
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY
SITE LOCATIONS

OEN (EPRI)

SITE NAME           NO.   LATITUDE    LONGITUDE   OBSERVABLES MEASURED
Tunkhannock, PA     02a   41 34 30    75 59 40    PC, APC, gases, met
Ft. Wayne, IN       07    41 02 39    85 19 08    PC, APC, gases, met
Gaylord, MI         10    44 56 58    84 38 30    PC, APC, gases, met
Winterport, ME      13    44 37 05    68 58 30    PC, APC, gases, met
Uvalda, GA          14    32 01 59    82 29 24    PC, APC, gases, met
Marshall, TX        17    32 39 58    94 25 06    PC, APC, gases, met
Lancaster, KS       18    39 34 10    95 18 17    PC, APC, gases, met
Underhill, VT       20a   44 31 42    72 52 08    PC, APC, gases, met
Big Moose, NY       21    43 49 03    74 54 08    PC, APC, gases, met
Yampa, CO           23    40 09 54    106 54 49   PC, APC, gases, met
Shawano, WI         24a   44 42 30    88 37 28    PC, APC, gases, met
Round Lake, WI      25    46 14 09    91 55 40    PC, APC, gases, met
Warwick, MA         26    42 39 00    72 18 10    PC, APC, gases, met
Zanesville, OH      27    40 01 52    82 04 04    PC, APC, gases, met
Leitchfield, KY     28    37 25 30    86 21 10    PC, APC, gases, met
Pittsboro, NC       29    35 47 30    79 15 20    PC, APC, gases, met
Moorhead, KY        30    38 12 10    83 31 20    PC, APC, gases, met
Bells, TN           31    35 44 30    89 07 30    PC, APC, gases, met
Marion, AL          32    32 36 45    87 21 30    PC, APC, gases, met
Morton, MS          33    32 17 30    89 38 00    PC, APC, gases, met
Due West, SC        34    34 19 30    82 23 10    PC, APC, gases, met
State College, PA   36b   40 46 59    77 55 59    PC, APC, gases, met
Brookings, SD       37    44 14 50    96 49 50    PC, APC, gases, met
Jerome, MO          38    37 55 10    91 58 55    PC, APC, gases, met
Egbert, Ont.        39b   44 14 00    79 47 00    PC, APC, SO2, HNO3, NH3

a With ME-35
b With APIOS, CAPMoN, ME-35
PC = Precipitation chemistry: pH, conductivity, sulfate, nitrate,
chloride, ammonium, sodium, potassium, calcium, magnesium; APC =
Aerosol particle chemistry: mass, sulfate, nitrate, ammonium;
Gases: Ozone, nitrogen dioxide, sulfur dioxide, nitric acid,
ammonia; Met: Precipitation amount, wind speed, wind direction,
dew point, temperature, barometric pressure
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY SITE LOCATIONS

ME-35 (EPA)

SITE NAME             NUMBER   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Pittsboro, NC         301      35.67      79.23       PC*, PA, APC, FPM, gases
Wartburg, TN          302      36.08      84.54       PC, PA, APC, gases
West Pt, NY           303      41.35      74.05       PC, PA, APC, FPM, gases
Whiteface Mtn, NY     305      44.38      73.85       PC*, PA, APC, FPM, gases
State College, PA     306a     40.78      77.93       PC, PA, APC, FPM, gases
Parsons, WV           307      39.10      79.66       PC*, PA, APC, gases
Prince Ed. SF, VA     308      37.17      78.31       PC, PA, APC, FPM, gases
Hubbard Brook, NH     309      43.80      72.00       PC, PA, APC, FPM, gases
Ithaca/Danby, NY      310      42.35      76.49       PC*, PA, APC, FPM, gases
Kane Forest, PA       312      41.60      78.77       PC, PA, APC, FPM, gases
Goddard SP, PA        313      41.35      80.17       PC, PA, APC, FPM, gases
Deer Cr. Park, OH     314      39.64      83.22       PC, PA, APC, FPM, gases
Newcomb Tract, MI     315      42.42      83.90       PC*, PA, APC, gases
Beltsville, MD        316      39.03      76.82       PC*, PA, APC, FPM, gases
Laurel Hill SP, PA    317      40.01      79.23       PC, PA, APC, FPM, gases
Tanners Ridge, VA     318      38.52      78.48       PC*, PA, APC, gases
Cedar Creek SP, WV    319      38.88      80.85       PC, PA, APC, FPM, gases
Mountain Lake, VA     320      37.37      80.52       PC, PA, APC, FPM, gases
Lilley Cornett, KY    321      37.09      82.99       PC, PA, APC, FPM, gases
Oxford, OH            322      39.53      84.72       PC, PA, APC, gases
Brokensword, OH       323      40.92      83.00       PC, PA, APC, FPM, gases
Unionville, MI        324      43.63      83.38       PC, PA, APC, FPM, gases
Roaring Creek, NC     326      36.11      82.05       PC, PA, APC, gases
Edgar Evins SP, TN    327      36.04      85.73       PC, PA, APC, FPM, gases
Arendtsville, PA      328      39.92      77.31       PC, PA, APC, FPM, gases
Perryville, KY        329      37.68      84.97       PC*, PA, APC, FPM, gases
Bondville, IL         330      40.05      88.37       PC*, PA, APC, gases
Salimonie Lake, IN    333      40.80      85.60       PC, PA, APC, FPM, gases
Perkinstown, WI       334      45.20      90.60       PC*, PA, APC, FPM, gases
Ashland, ME           335      46.62      68.41       PC*, PA, APC, FPM, gases
Coweeta Forest, NC    337      35.05      83.43       PC*, PA, APC, FPM, gases
Vincennes, IN         340      38.78      87.49       PC, PA, APC, FPM, gases
Washington Cr., NJ    344      40.32      74.87       PC, PA, APC, FPM, gases
University Park, IL   346      41.45      87.72       PC, PA, APC, FPM, gases
Cadillac, MI          349      44.13      85.42       PC, PA, APC, gases
Underhill, VT         395b     44.53      72.87       PC, PA, APC, gases
Tunkhannock, PA       396b     41.58      75.99       PC, PA, APC, gases
Shawano, WI           397b     44.71      88.62       PC, PA, APC, gases
Egbert, Ont.          398a     44.14      79.47       PC, PA, APC, gases

PC = Precipitation chemistry: pH, conductivity, sulfate, nitrate,
chloride, ammonium, sodium, potassium, calcium, magnesium; PA =
Precipitation amount; APC = Aerosol particle chemistry: sulfate,
nitrate, ammonium; FPM = Fine particle mass; Gases = Sulfur
dioxide, nitric acid, nitrogen dioxide, ammonia
a With APIOS, CAPMoN, OEN
b With OEN
* Includes S(IV)
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY SITE LOCATIONS

EPA Optional (O), Supplementary (S), and TVA (T) Sites

NAME                  SITE NUMBER   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Grant Fork, IL        356 (O)       38.92      89.73       PC, PA, APC, gases
Piseco, NY            357 (O)       43.45      74.52       PC, PA, APC, gases
Belleayre, NY         358 (O)       42.14      74.52       PC*, PA, APC, gases, FPM
Plainview, IL         359 (S)       39.08      89.95       PC, PA, APC, gases, FPM
Breese, IL            360 (S)       38.67      89.73       PC, PA, APC, gases
Quabbin Res., MA      393 (S)       42.30      72.34       PC, PA, APC, gases
Land Bet. Lakes, KY   394 (T)       36.79      88.07       PC, PA, APC, gases
  (LBL)
EPA Gradient Resolution Network (GRAD)

NAME                SITE NUMBER   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Ford City, PA       361           40.75      79.51       PC*, PA, APC, gases@
Hawthorne, PA       362           41.03      79.27       PC, PA, APC, FPM, gases@
Pr. Gallitzin, PA   363           40.63      78.56       PC*, PA, APC, gases@
Shawnee SF, PA      364           40.03      78.64       PC, PA, APC, gases@
Decatur, PA         365           40.71      77.40       PC*, PA, APC, gases@
Emporium, PA        366           41.50      78.15       PC, PA, APC, gases@
Renovo, PA          367           41.37      77.53       PC*, PA, APC, FPM, gases@
Williamsport, PA    368           41.16      76.92       PC, PA, APC, FPM, gases@
Wirt, NY            369           42.15      78.11       PC, PA, APC, gases@
Little Marsh, PA    370           41.90      77.44       PC*, PA, APC, FPM, gases@
E. Smithfield, PA   371           41.95      76.66       PC, PA, APC, gases@
Wayland, NY         372           42.56      77.60       PC*, PA, APC, gases@
Brackney, PA        373           41.94      75.92       PC*, PA, APC, gases@
North Orwell, PA    374           41.90      76.28       PC, PA, APC, gases@

PC: Precipitation chemistry: pH, conductivity, sulfate, nitrate,
chloride, sodium, potassium, calcium, magnesium
* Includes S(IV)
PA: Precipitation amount
APC: Aerosol particle chemistry: sulfate, nitrate
FPM: Fine particle mass
Gases: Sulfur dioxide, nitric acid, nitrogen dioxide, ammonia
@ Includes ozone
Table 6-1 (Continued)
MODEL EVALUATION FIELD STUDY SITE LOCATIONS

EPA Sub-grid Variability Network (VAR)

NAME              SITE NUMBER   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
Eddyville, KY     381           37.07      88.03       PC, PA, APC, gases@
Cadiz, KY         382           36.77      87.73       PC, PA, APC, gases@
New Concord, KY   383           36.53      88.09       PC, PA, APC, gases@
Benton, KY        384           36.82      88.405      PC, PA, APC, gases@
FADMP (FCG)

SITE NUMBER   LATITUDE   LONGITUDE   OBSERVABLES MEASURED
2             30 47 30   85 48 29    PC, PA, APC, gases@
5             29 38 40   82 28 34    PC, PA, APC*, gases*@
9**           27 10 41   81 21 30    PC, PA, APC, gases@
13            25 45 38   80 49 40    PC, PA, APC*, gases*@

PC: Precipitation chemistry: pH, conductivity, sulfate, nitrate,
chloride, sodium, potassium, calcium, magnesium
PA: Precipitation amount
APC: Aerosol particle chemistry: sulfate, nitrate, ammonium
Gases: Sulfur dioxide, nitric acid, nitrogen dioxide,
ammonia
* Samples collected as 3-day averages
** Present location; may be relocated within 1 km.
@ Includes ozone
TBD: To be determined
Embedded within these two years would be four periods in which
more intensive (higher sampling frequency) and extensive
(additional variables) measurements would be taken from aircraft
and at special (enhanced) surface sites. These intensive
measurement periods are planned to collect data for diagnostic
evaluations since the surface network does not provide the
relevant information. The intensive periods will be scheduled to
sample important seasonal contrasts.
6.1.1 EPA: ACID-MODES. The EPA field measurement programs are
collectively referred to as the ACID Model Operational/
Diagnostic Evaluation Study. Data for the operational aspect
will come from a 35-station network called the ME-35, located in
the eastern U.S. The variables that will be measured in this
network, the measurement techniques, and the data averaging
intervals are listed in Table 6-2.
How representative the measurements made at one station are of the
total area within a modeled grid cell will be explored using three
to five additional measurement stations clustered around three
geographically dispersed ME-35 or TVA stations. As shown in Table
6-1, the exact locations of the stations comprising this sub-grid
variability network (VAR) have yet to be determined. An additional
set of 14 stations arrayed
in three parallel linear chains in a southwest-northeast
direction across Pennsylvania into New York will be operated by
EPA in an effort to resolve the steep depositional gradient
expected in that region. This set is called the GRAD network,
Table 6-2
ME-35 MEASUREMENT TECHNIQUES

                                                           AVERAGING
OBSERVABLE                          TECHNIQUE              PERIOD (hrs)
Air Quality
  Particulate sulfate, nitrate,     FP/AC                  24
    ammonium
  Nitric acid, ammonia              TFR/AC                 24
  Ammonia, sulfur dioxide,          FP/AC                  24
    nitrogen dioxide
Precipitation Chemistry
  Amount                            WOC                    24
  pH                                pH meter               24
  Conductivity                      Conductivity meter     24
  Sulfate, nitrate, ammonium,       AC                     24
    chloride
  Sodium, potassium                 AA                     24
  Calcium, magnesium                ICAPES                 24
  Dissolved sulfur dioxide          AC                     24

FP/AC = Filter pack, automated colorimetric analysis
TFR/AC = Transition flow reactor, automated colorimetry
TEA FP/AC = Triethanolamine-impregnated filter in filter pack,
automated colorimetric analysis
WOC = Wet-only collector
AA = Atomic absorption spectroscopy
ICAPES = Inductively coupled argon plasma emission spectroscopy
referring to its role in resolving depositional and concentration
gradients. The locations of its stations are also given in Table
6-1. In addition to the same measurements made at ME-35
stations, ozone will be measured at GRAD network stations.
Consideration is being given to the possibility of expanding the
number of GRAD stations at a later date.
EPA is also funding the operation of additional stations
cooperatively with state agencies in Illinois, New York, and
Massachusetts, and with the Tennessee Valley Authority in
Tennessee. These sites are also listed in Table 6-1.
Plans call for collection of the data for diagnostic evaluations
primarily during 6-week-long intensive measurement periods at
least during the summer of 1988 and possibly spring of 1990. The
emphasis during these "intensives" will be on the collection of
airborne measurement data to yield vertical profiles and
horizontal transects at a higher spatial and temporal resolution
than obtainable from the surface networks. These data will be
supplemented with those from measurements taken at the less
numerous enhanced chemistry stations (see Sections 6.1.2, 6.1.3
and 6.1.6) of a larger suite of variables at higher temporal
resolution than those taken at the majority of surface stations.
Descriptive information on the broad suite of variables to be
measured during intensives appears in Table 6-3.
6.1.2 OME: APIOS. Eight stations of the existing Acid
Precipitation in Ontario Study daily sampling network have been
Table 6-3
MEASUREMENT TECHNIQUES TO BE USED
DURING INTENSIVES

                                                            AVERAGING
OBSERVABLE                  TECHNIQUE                       PERIOD
Sulfur dioxide              Flame photometry                1 min (5 sec)
Sulfur dioxide,             FP/IC                           30 min
  nitric acid
Ozone                       Chemiluminescence               1 min (5 sec)
Ammonia                     FP/AC                           30 min
Nitrogen dioxide            Luminol chemiluminescence       1 min (5 sec)
Hydrogen peroxide           E/F                             1 min
Hydrocarbons (speciated)    Capillary column GC             Integrated
                                                              (5 to 30 min)
Light scattering            Nephelometer                    1 min (5 sec)
  coefficient
Dew point                   Chilled mirror                  1 min
Broad band radiation        Pyranometer                     1 min
Ultraviolet radiation       Photocell                       1 min
Altitude                    Absolute pressure transducer    Continuous
Position                    Loran-C                         Continuous
Particulate sulfate,        FP/IC                           30 min
  nitrate, chloride
Particulate ammonium        FP/AC                           30 min
Table 6-3 (Continued)
MEASUREMENT TECHNIQUES TO BE USED
DURING INTENSIVES

AES Ground-based Measurements at Egbert
Same measurements shown in Table 6-5 plus:

                                                      AVERAGING
OBSERVABLE            TECHNIQUE                       PERIOD
Sulfur dioxide        Pulsed fluorescence             Continuous
                      Filter pack                     6 hrs
Ozone                 UV photometry                   Continuous
NOy                   Catalytic reduction,            Continuous
                        chemiluminescence
Nitrogen dioxide      Luminol chemiluminescence       Semi-continuous
Nitric oxide          Chemiluminescence               Continuous
Ammonia               Filter pack                     6 hrs
                      Denuder                         1 hr
PAN                   GC/ECD                          (48/day)
Nitric acid           Filter pack                     6 hrs
                      TOLAS                           5 min
Hydrogen peroxide     Coulometric peroxidase          Continuous
                      Luminol chemiluminescence       Continuous
Formaldehyde          TOLAS                           5 min
Aldehydes             GC                              1 hr
Hydrocarbons          GC                              (3/day)
  (speciated)
Carbon monoxide       NDIR                            Continuous
Aerosol particles     Filter pack                     6 hrs

GC/ECD = Gas chromatography with electron capture detection
TOLAS = Tunable diode laser absorption spectroscopy
NDIR = Non-dispersive infrared
Table 6-3 (Continued)
MEASUREMENT TECHNIQUES TO BE USED
DURING INTENSIVES

Additional AES Measurements at Egbert

                                             FREQUENCY OF
OBSERVABLE             TECHNIQUE             MEASUREMENT
Ozone profile          Tethersonde           Periodically
Ozone/SO2 profile      DIAL                  Periodically
Ozone profile          Beukersonde           2/day as appropriate
Temp., RH profile      Beukersonde           4/day as appropriate
Mixing depth           Acoustic sounder      Continuous
                       Mie lidar             Continuous
Micrometeorological    Standard met tower    Continuous
  variables
Table 6-3 (Continued)
MEASUREMENT TECHNIQUES TO BE USED
DURING INTENSIVES
OME Ground-based Measurements at Dorset
Same measurements shown in Table 6-4 plus:
AVERAGING
OBSERVABLE TECHNIQUE PERIOD
Ammonia Filter pack 24 hrs
NOy Catalytic reduction with Continuous
chemiluminescence
NO/NO2 Luminol chemiluminescence Continuous
PAN GC/ECD (48/day)
Hydrogen TOLAS Continuous
peroxide
Formaldehyde TOLAS Continuous
Aldehydes TAGA 6000 Continuous
Hydrocarbons GC (3/day)
(speciated)
GC/ECD = Gas chromatography with electron capture detection
TOLAS = Tunable diode laser absorption spectroscopy
TAGA 6000 = A system based on mass spectrometry
adapted for operational model evaluation data collection. OME
will also support colocated measurements with AES, EPA, and EPRI
at Egbert, Ontario and State College, PA. Measuring capabilities
at these sites are summarized in Table 6-4. One OME site,
Dorset, is being instrumented for intensive atmospheric
chemistry measurements. The measurements to be made there are
listed in Table 6-3.
6.1.3 AES: CAPMoN, enhanced chemistry sites and aircraft. A 10-
station subset of the existing Canadian Air and Precipitation
Monitoring Network has been designated for operational evaluation
data collection. AES will also support colocated measurements
with EPA, EPRI, and OME at the State College, PA site.
Measurement attributes are shown in Table 6-5. The site at
Egbert, Ontario, will not only serve as another location for
colocating one sampling system each from AES, EPA, EPRI, and OME,
but will also have enhanced measurement capabilities (listed in
Table 6-3) to provide data for diagnostic evaluation.
AES also is planning an airborne measurement campaign to collect
data for diagnostic evaluation as summarized in Table 6-6. To
the extent possible, the AES measurement campaign will overlap
with that of EPA.
Table 6-4
APIOS MEASUREMENT TECHNIQUES

                                                                  AVERAGING
OBSERVABLE             TECHNIQUE OR PROCEDURE                     PERIOD (hrs)
Air Measurements
  Sulfate and          Teflon filter, extract in DDW,             24
    nitrate              ion chromatography
  Ammonium             Teflon filter, extract in DDW,             24
                         automated colorimetry
  Nitric acid and      Nylon filter, extract in 0.003N            24
    sulfur dioxide       NaOH, ion chromatography
  Sulfur dioxide       Whatman 41 impregnated with K2CO3,         24
                         extract with H2O2, ion chromatography

N.B. Sulfur dioxide is obtained as the sum of the nylon and
Whatman 41 values.

Precipitation Measurements
  pH                   pH meter with low conductance              24
                         combination electrode
  Total acidity        Gran titration                             24
  Conductivity         Conductivity cell and meter                24
  Sulfate, nitrate     Ion chromatography                         24
    and chloride
  Ammonium             Automated colorimetry                      24
  Sodium, potassium,   Flame atomic absorption                    24
    calcium and
    magnesium
Table 6-5
CAPMoN MEASUREMENT TECHNIQUES

                                                     AVERAGING
OBSERVABLE                  TECHNIQUE                PERIOD (hrs)
Air Measurements
  Sulfate and nitrate       FP/IC                    24
  Sulfur dioxide and        FP/IC                    24
    nitric acid
  Ozone                     UV photometry            Continuous
Precipitation Chemistry
  pH                        pH meter                 24
  Sulfate, nitrate,         Ion chromatography       24
    chloride
  Ammonium                  Automated colorimetry    24
  Sodium, potassium         Flame photometry         24
  Calcium, magnesium        Atomic absorption        24

FP/IC = Collection with filter pack, ion chromatographic
analysis
Table 6-6
AIRBORNE MEASUREMENTS TO BE TAKEN BY AES

                                                              AVERAGING
OBSERVABLE                 TECHNIQUE                          PERIOD
Sulfur dioxide             Pulsed fluorescence                30 sec
Nitric oxide               Luminol chemiluminescence          20 sec
Nitrogen dioxide, ozone    Luminol chemiluminescence          1 sec
PAN                        GC, luminol chemiluminescence      5 min
Hydrogen peroxide          Enzymatic fluorimetric             10 sec
Hydrocarbons (speciated)   Canister samples analyzed by GC    5 x 2 min
Sulfate, nitrate,          Filter pack                        50 min
  nitric acid, ammonia
Aldehydes                  DNPH cartridges                    50 min
Solar radiation            UV radiometer                      30 sec
Cloud/precipitation        ASRC collector                     <20 min
  water
Aerosol size               PMS ASASP                          <1 sec
  distribution
Cloud droplet size         PMS FSSP                           <1 sec
  distribution
Precipitation particle     2-D grey scale, 2-D-P              <1 sec
  size distribution
Cloud liquid water         PMS FSSP, King probe               <1 sec
  content

GC = Gas chromatography
6.1.4 EPRI: OEN. The Operational Evaluation Network will
include 23 independent sites (exclusive of the 2 colocated with
the other networks). These are largely at or near former sites
in the Utility Acid Precipitation Study Program (UAPSP). A
summary of OEN measurements is given in Table 6-7.
6.1.5 FCG: FADMP. Four sites will be operated in Florida using
methods virtually identical to those used in the OEN (see Table
6-8). 24-hour precipitation samples will be collected at all
four sites. 24-hour air quality samples will be collected every
day at two of the sites and 72-hour samples every third day at
the remaining two sites (see Table 6-1).
6.1.6 Complementary programs. Several studies of various
aspects of the acidic deposition phenomenon will be taking place
concurrently with the model evaluation field study. Results from
some of these will be useful supplements for model evaluation. In
addition, opportunities for collaboration with other
organizations are being investigated.
Table 6-7
OEN MEASUREMENT TECHNIQUES

OBSERVABLE                    TECHNIQUE                   AVERAGING PERIOD (hrs)
Air Quality
Particulate mass, sulfate,    FP/AC                       24
  nitrate, ammonium
Sulfur dioxide                FP/AC                       24
Nitric acid, ammonia          TFR/FP/AC                   24
Ozone                         UV Photometry               1
Nitrogen dioxide              Luminol chemiluminescence   1
Peroxyacetyl nitrate          Alkaline filter/IC          24
Hydrocarbons, speciated       Canister/CCGC               24
Carbonyls                     DNPH/HPLC                   24
Meteorology
Wind speed                    Cup anemometer              1
Wind direction                Wind vane                   1
Temperature                   Thermistor                  1
Dew point                     LiCl cell                   1
Barometric pressure           Capacitance                 1
Precipitation amount          Weighing bucket             1
Precipitation Chemistry
pH, field and lab             pH meter                    24
Conductivity, field and lab   Conductivity meter          24
Table 6-7 (continued)
OEN MEASUREMENT TECHNIQUES

OBSERVABLE                    TECHNIQUE            AVERAGING PERIOD (hrs)
Precipitation Chemistry (continued)
Sulfate, nitrate, chloride    IC                   24
Ammonium                      AC                   24
Sodium, potassium             AA                   24
Calcium, magnesium            ICAPES               24
Precipitation amount          WOC                  24

FP/AC = Filter pack collection, automated colorimetric analysis
TFR = Transition flow reactor
IC = Ion chromatographic analysis
Canister/CCGC = Collection in passivated canister, capillary
                column gas chromatographic analysis
DNPH/HPLC = Collection on dinitrophenylhydrazine cartridge,
            analysis by high performance liquid chromatography
AA = Atomic absorption spectroscopic analysis
ICAPES = Inductively coupled argon plasma emission spectroscopic
         analysis
WOC = Wet-only collector
Table 6-8
FADMP MEASUREMENT TECHNIQUES

OBSERVABLE                    TECHNIQUE            AVERAGING PERIOD (hrs)
Air Quality
Particulate sulfate,          FP/AC                24
  nitrate, ammonium
Nitric acid                   TFR/FP/AC            24
Ammonia, sulfur dioxide,      FP/AC                24
  nitrogen dioxide
Precipitation Chemistry
Amount                        WOC                  24
pH                            pH meter             24
Conductivity                  Conductivity meter   24
Sulfate, nitrate, chloride    IC                   24
Sodium, calcium, magnesium    ICAPES               24
Ammonium                      AC                   24
Potassium                     AE                   24

FP = Filter pack
AC = Automated colorimetric analysis
TFR = Transition flow reactor
WOC = Wet-only collector
AE = Atomic emission spectroscopy
IC = Ion chromatography
ICAPES = Inductively coupled argon plasma emission spectroscopy
The Department of Energy's acid deposition research program is
termed Processing of Emissions by Clouds and Precipitation
(PRECP). Many of its researchers will have participated in a
multi-agency field investigation of convective storms in the
vicinity of Champaign, IL. Dubbed 3CPO (for Cloud Chemistry and
Cloud Physics Organization), it was planned for May through
July 1988, coincident with the beginning of the model evaluation
field study. The dynamics of convective storms and how they
process atmospheric constituents were to be studied with an eye
toward refining the parameterizations in the RADM scavenging
module. In late Fall 1989, PRECP researchers plan to study
stratiform cloud systems in a similar manner. Although the
results will be most useful to model developers, they may also
find application in model evaluation.
NOAA at the Scotia Range at Penn State; SUNY (Albany) at
Whiteface Mountain, NY; TVA at Whitetop Mountain; and Georgia
Tech at Brasstown Bald in north Georgia operated specially
equipped ground stations, and NOAA an aircraft, during the
summer 1988 intensive measurement period. As an example of the
types of measurements to be made at these locations, those
planned for the Georgia Tech site are shown in Table 6-9. The
data will be used for diagnostic model evaluation and for
refining estimates of inflow boundary conditions for the
modeling domain.
American Electric Power Service Corporation is sponsoring the
collection of several hundred canister and sorbent samples at
Table 6-9
MEASUREMENTS PLANNED FOR GEORGIA TECH
SITE AT BRASSTOWN BALD DURING SUMMER 1988 INTENSIVE

OBSERVABLE            TECHNIQUE                        SAMPLE PERIOD
O3                    UV Photometry                    12 sec
SO2                   Pulsed Fluorescence              continuous
NO                    Chemiluminescence                2 min
NO2                   Photolysis/chemiluminescence     2 min
NOy                   Au converter/chemiluminescence   2 min
CO                    GC/HgO detection                 4-5/hr
NMHC (speciated)      GC/FID                           2/hr (max)
HNO3                  Nylon filter in filter pack,     30 min to 2 hrs
                        IC analysis
SO4= (particulate)    Teflon filter in filter pack,    30 min to 2 hrs
                        IC analysis
NO3- (particulate)    Teflon filter in filter pack,    30 min to 2 hrs
                        IC analysis

NMHC = Non-methane hydrocarbons
GC = Gas chromatography
FID = Flame ionization detector
IC = Ion chromatography
five OEN stations during the Autumn 1988 and possibly the Spring
1990 U.S. intensives. Plans call for the canister samples to be
collected over 24-hour periods and analyzed by capillary column
gas chromatography for C2 through C12 hydrocarbons. The sorbent
samples are to be collected over 12-hour periods and analyzed
for C1 through C5 carbonyl compounds. The data will be used for
diagnostic model evaluation and for checks on the hydrocarbon
emissions estimates.
6.2 Emission Inventories
A necessary input for exercising the models is the gridded
emissions distribution. Inventories for the U.S. and Canada have
been compiled for sulfur dioxide, nitrogen oxides, volatile
organic compounds, soil dust, and ammonia separately by EPA and
EPRI with assistance from AES and OME. EPRI's inventories are
for the year 1982. EPA has compiled one set of inventories for
1980 and is in the process of developing another for 1985.
In addition, EPA plans to estimate the real-time SO2 and NOx
emissions from over 200 of the largest stationary sulfur dioxide
sources (comprising about 100 power plants) over the course of
the field study to make this particular input to the model
evaluation data set as realistic as possible. A similar activity
is underway in Canada for the largest 15 sources east of
Saskatchewan, but only during the intensive measurement periods.
6.3 Data Base Management
Each participating organization will maintain the data from its
own network in its own data base. To facilitate easy access to
the data for model evaluation, a composite archive of commonly
formatted data will also be established within the Acid
Deposition System (ADS), maintained at the Battelle Pacific
Northwest Laboratory. Realizing the data's unique value to the
model development community (because of their geographical
coverage, number of measured variables, duration, and quality
definition), the participants have agreed that data collected
during the first year of the field study (June 1988 through May
1989) will be available for model development following their
validation.
However, there may be some restrictions on the data's
availability for the following reasons:
1. Much of the first six months' data will be used to conduct a
preliminary evaluation of the RADM in time for the results to
be included in the final assessment report from NAPAP (Fall
1990).
2. Some of the data generators would like to have the initial
opportunity to analyze the data in preparing reports of
findings for publication in the technical literature.
Therefore, potential data users should be aware that it may be
necessary to gain approval from the data generators before the
data can be released.
The second year's data are to be sequestered and used initially
solely for a comprehensive model evaluation, the conduct of which
will probably extend beyond the lifetime of NAPAP.
6.4 Methods Characterization
Measurement methods used in the model evaluation field study
must be fully characterized in terms of their sensitivity (LQL),
precision, and accuracy, commensurate with estimated model
evaluation requirements, and in terms of the influence of
potential interferences. Many of the planned methods had not
been standardized at the time of their selection because no
standard methods with the requisite characteristics
(sensitivity, selectivity, simplicity, reliability, economy,
etc.) existed for the observables of interest. It was therefore
necessary to conduct characterization tests prior to each
method's adoption for use in the field study.
The sample collection or measurement systems that have been
subjected to laboratory characterization tests specifically for
the model evaluation field study are the filter packs, transition
flow reactors (TFR), PAN filter sampler, Luminox LMA-3 NO2
analyzer, and an automated colorimetry system. Filter packs,
TFRs, the PAN filter sampler, and precipitation collectors have
been tested under field conditions as well. The specific tests
and pertinent references to them are listed in Table 6-10.
6.5 Quality Assurance Auditing and Corrective Action
Performance and systems audits of field, laboratory, and data
management operations will be handled by a combination of
Table 6-10
METHODS PERFORMANCE CHARACTERIZATION

LABORATORY TESTS

System           Test                                              Reference
Filter Pack      Filter absorption capacity for impregnating       i
                 solution
                 SO2 collection efficiency of carbonate            i
                 impregnated filters as function of temperature,
                 relative humidity, and concentration
                 NH3 collection efficiency of citric acid          i
                 impregnated filters as function of temperature,
                 concentration, and citric acid loading
                 NO2 collection efficiency of triethanolamine      i
                 impregnated filters as function of filter type
                 and face velocity
                 SO2 collection efficiency of triethanolamine      i
                 impregnated filters as function of concentration
                 HNO3 collection efficiency of nylon filters       i
                 Flow resistance of various 47-mm filter discs,    i
                 wet and dry
Integrated PAN   Efficiency of chilled water scrubbers for         i
                 acetic acid removal
                 Chilled scrubber temperature dependence on        i
                 flow rate
                 Determining analytical conditions for acetate     i
                 analysis on ion chromatograph
Transition       HNO3 collection efficiency by nylon inserts       i
Flow Reactor     during dynamic sampling, dry air and 50% RH
                 HNO3 collection efficiency by nylon inserts       i
                 during passive sampling
                 HNO3 collection efficiency, blank levels          i
Table 6-10 (Continued)
LABORATORY TESTS (Continued)

System           Test                                              Reference
Automated        Phosphoric acid interference with indophenol      i
Colorimetry      blue method
                 Comparison with ion chromatographic nitrate       i
                 analyses
                 Sample processing rate for nitrate, ammonium,     i
                 and sulfate analyses
                 Analysis of TEA impregnated filter extracts       i
                 Optimization of analytical conditions for         i
                 sulfate, nitrate, and ammonium analyses
Luminox          Linearity, range, lower detection limit, zero     ii
(LMA-3)          and span drift, interferences, RH and
                 temperature response
                 Linearity, duplicate sampling, zero and           ii
                 calibration drift, interferences
Table 6-10 (Continued)
FIELD TESTS

System           Test                                              Reference
TFR/Filter Pack  Check prototype performance and compare with
                 other methods during SCAQS
                 Duplicate sampling                                vi
Filter Packs     Machined TFE vs injection molded PFA filter       iii, iv
                 holders
                 2-year comparison of AES and OME data at          ix
                 Longwoods
                 Methods characterization                          x
                 Comparison of HNO3 nylon filter method with       xi, xiii, xiv
                 spectroscopic and other methods
                 Comparison of NH3 impregnated filter method       xv
                 with spectroscopic and other methods
                 Comparison of HNO3, NO3- and NH4+ methods         xii
Precipitation    Snow sampling efficiency of different types of    vii
Collectors       precipitation gauges and samplers; influence
                 on composition
Precipitation    Precision using Aerochem Metrics and MIC          viii
Chemistry and    collectors. Examination of sources of error
Deposition
Table 6-10
References

i.    Operational Evaluation Network Semi-Annual Progress
      Report, 1 January - 1 August 1987, ERT Doc. No.
      P-E292-710, Concord, MA. October 1987.
ii.   D.W. Joseph, C.W. Spicer and G.M. Sverdrup. Evaluation of
      Luminox LMA-3 NO2 Monitor for Acid Deposition Network
      Applications, Battelle Draft Topical Report, Columbus,
      Ohio. July 1986.
iii.  W.J. Mitchell. Comparative Testing of Machined and Molded
      Teflon Filter Holders for Dry Deposition - Preliminary
      Analysis. EPA Memorandum dated 13 January 1987.
iv.   W.J. Mitchell. Further Comparative Testing of Machined
      (Canadian) and Molded (American) Teflon Filter Holders.
      EPA Memorandum dated 20 February 1987.
v.    T.G. Ellestad. ASRL Concentration Monitor. Unpublished
      manuscript dated 6 February 1986.
vi.   K.T. Knapp, J.L. Durham, and T.G. Ellestad. Pollutant
      Sampler for Measurements of Atmospheric Acidic Dry
      Deposition. Environ. Sci. Technol. 20:633-637 (1986).
vii.  L. Topol et al. Investigation to be completed April 1988.
viii. A.J.S. Tang, W.H. Chan, D.B. Orr, W.S. Bardswick and M.A.
      Lusis. An Evaluation of the Precision, and Various Sources
      of Error, in Daily and Cumulative Precipitation Chemistry
      Sampling. Water, Air and Soil Pollution 36:91 (1987).
ix.   W. Fricke. A Preliminary Comparison of APN and APIOS Data
      at Longwoods/Ont. Internal AES memorandum, 23 December
      1986.
x.    K.G. Anlauf, H.A. Wiebe, and P. Fellin. Characterization
      of Several Integrative Sampling Methods for Nitric Acid,
      Sulphur Dioxide and Atmospheric Particles. J. Air Pollut.
      Control Assoc. 36:715 (1986).
xi.   K.G. Anlauf et al. Measurement of Atmospheric Nitric Acid
      and Ammonia by the Filter Method and a Comparison to the
      Tunable Diode Laser Method. Proceedings of the EPA/APCA
      Symposium on Measurement of Toxic and Related Air
      Pollutants, pp. 373-378. May 1987.
xii.  K.G. Anlauf et al. A Comparison of Three Methods for the
      Measurement of Atmospheric Nitric Acid and Aerosol Nitrate
      and Ammonium. Atmos. Environ. 19:325 (1985).
Table 6-10
References (continued)

xiii. K.G. Anlauf, D.C. MacTavish, H.A. Wiebe, H.I. Schiff, and
      G.I. MacKay. Measurement of Atmospheric Nitric Acid by the
      Filter Method and Comparison with the Tunable Diode Laser
      and Other Methods. Accepted for publication, Atmospheric
      Environment, 1988.
xiv.  K.G. Anlauf et al. A Comparison of the Measurement of
      Atmospheric HNO3 at High Ambient Concentrations by Nylon
      Filter, Tunable Diode Laser, Transition Flow Reactor, and
      Fourier Transform Infrared Spectroscopy. In preparation,
      1988.
xv.   H.A. Wiebe et al. A Comparison of Atmospheric Ammonia by
      Filters, Transition Flow Reactor Tubes, Denuder Tubes, and
      Fourier Transform Infrared Spectroscopy. In preparation,
      1988.
contractual and organizational arrangements. AES and OME will
use their own staff members (not directly involved in operations)
to conduct audits.
Desert Research Institute (DRI), a subcontractor to EPA's prime
contractor (ENSR), will conduct systems and performance audits
of ENSR's and Combustion Engineering Environmental's activities
in support of ME-35. DRI will also audit the airborne measurement
systems operated by Battelle Columbus Laboratories during the
intensives.
Within the OEN, the initial plan called for quality assurance
staff from each of the two measurement contractors (ENSR as
prime, CE Environmental as subcontractor) to audit the operations
of the other. This has been superseded by the use of internal
audits of each contractor's operations by members of its own
staff, not directly involved in the the operations, and external
systems audits by a QA contractor common to all participants.
This use of a single contractor (REA) to audit all networks stems
from an awareness that establishing and maintaining comparability
of measurements among the networks over the course of the field
study would be simplified if the quality assurace audit planning
and execution were centralized. The nature of the external
audit is described below.
EPA was the first to contract with Research & Evaluation
Associates to perform management systems audits (MSA) and data
traceability audits on the prime contractor's activities. The
MSAs will involve reviews of facilities, equipment, record
keeping, data validation, data management and reporting for the
entire QA system. Traceability audits involve reviews of
operational, computational and recording activities of the
measurements. Data points will be selected at random to trace
back from the central data base through the laboratory to their
origins in either the aircraft or field sampling sites.
The Diagnostic Measurements Team will assist in determining the
type and extent of quality assurance applied to the aircraft and
enhanced chemistry measurements.
Descriptions of the audit procedures are given in the respective
network QA Plans (see Appendix).
The results of all audits will be reported through the
responsible technical oversight team to the PMG. Deviations from
standard operating procedures, results outside control limits,
and other indications of procedural weaknesses or circumstances
that could detract from measurement comparability among the
various activities will be dealt with at the appropriate level
required for corrective action at the earliest opportunity.
6.6 Inter-network Comparisons
These will be conducted by the participating organizations
through the operation of colocated measurement systems and by
interlaboratory comparisons. The FADMP will not participate in
the field comparisons, but will participate in the other
activities designed to demonstrate or assess comparability of
measurements. Having selected methods identical to those used in
the OEN, the FCG decided that colocating FADMP equipment with
the other networks at State College and Egbert would be
redundant (see below).
6.6.1 Colocation of field measurement systems. Two sites
(Egbert, Ontario and State College, PA) will be equipped with
measurement systems from AES, EPA, EPRI, and OME. At Egbert,
each of these organizations will install one air quality sampler
(filter pack or filter pack/TFR combination), one precipitation
collector, and one rain gauge.
At State College, the complete suite of samplers and analyzers
used by each of these organizations at its network sites will be
installed in duplicate, exclusive of those instruments used by
only one of the participants (such as analyzers for ozone, by
EPRI, and, possibly, hydrogen peroxide, by EPA), of which only
one will be installed. The colocation of duplicate measurement
systems will allow the inter-network deviations to be
distinguished from the intra-network measurement precision.
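The distinction rests on a standard calculation: with duplicate
samplers, the within-network measurement standard deviation can be
estimated from the paired differences. The following is a minimal
sketch of that calculation; the concentration values are
hypothetical, not study data.

```python
import numpy as np

def precision_from_duplicates(a, b):
    """Within-network measurement standard deviation estimated from
    colocated duplicate samplers: s = sqrt(sum((a - b)^2) / (2n))."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return np.sqrt(np.sum(d ** 2) / (2 * len(d)))

# Hypothetical 24-hour sulfate concentrations (ug/m3) from one
# duplicate sampler pair at a colocated station
a = [4.1, 2.8, 5.0, 3.3, 6.2]
b = [4.3, 2.6, 4.7, 3.4, 6.0]
s = precision_from_duplicates(a, b)   # intra-network precision estimate
```

Differences between networks at the colocated sites that exceed
this within-network scatter would then point to genuine
inter-network deviations.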
At Longwoods, Ontario, OME and AES will operate colocated
sampling systems to provide a third site to allow possible bias
between their air quality measurements to be assessed.
6.6.2 NWRI QC comparison on precipitation samples. The National
Water Research Institute (NWRI), Environment Canada, has been
contracted to provide external quality assurance services,
supplying 10 certified precipitation test samples per month to
each of the participating laboratories and to approximately six
other laboratories shown to have performed reliably in previous
inter-laboratory comparisons. NWRI will monitor the stability of
the test samples.
The analytical results will be used to assess inter-laboratory
bias. Inclusion of the other six high-performance laboratories
is expected to provide a stable and reliable median for bias
assessment. Two or three artificially prepared standard mixtures
of known stability would also be distributed monthly so that
analytical accuracy can be assessed as well.
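The bias assessment can be sketched as follows: the per-sample
median across all laboratories serves as the consensus value, and
a laboratory's bias is its typical deviation from that consensus.
The results matrix below is hypothetical, not NWRI data, and the
use of medians throughout is one robust choice, not a prescribed
procedure.

```python
import numpy as np

def lab_biases(results):
    """results[i, j] = laboratory i's value for test sample j.
    Returns each lab's median deviation from the per-sample
    median across labs (the consensus value)."""
    per_sample_median = np.median(results, axis=0)
    return np.median(results - per_sample_median, axis=1)

# Hypothetical sulfate results (mg/L) from 5 labs on 4 test samples
results = np.array([
    [2.00, 3.10, 1.52, 4.05],
    [1.98, 3.00, 1.50, 4.00],
    [2.10, 3.20, 1.60, 4.20],   # this lab reads consistently high
    [1.95, 2.95, 1.48, 3.96],
    [2.01, 3.02, 1.51, 4.02],
])
bias = lab_biases(results)      # the third lab stands out
```

With roughly six additional high-performance laboratories in the
pool, the per-sample median is insensitive to any single lab's
errors, which is what makes it a stable basis for the comparison.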
Criteria will be established to define very good, average, and
poor performance. Verified instances of poor performance by a
participating laboratory will be communicated as soon as
practical to the laboratory so that corrective action may be
taken. Concurrently, the measurements team representative
responsible for the laboratory will be notified so that he can
ensure that corrective action has been taken. Such instances
will be brought to the attention of the measurements team and the
PMG so that further assurance is gained that measurement
discrepancies are resolved.
Reports on the inter-comparison procedures and results will be
issued annually by the NWRI and at the end of the study.
6.6.3 Filter pack testing on common test atmospheres. In
addition to the comparisons conducted under field conditions at
the colocated sites, the filter packs used by the participating
networks are to be challenged under controlled conditions with
test atmospheres containing nitric acid, sulfur dioxide, and
ammonia (either in combination or individually) as a further test
of their relative performance.
The protocol for testing the filter packs will be developed by
ENSR in consultation with the Operational Measurements Team and
the actual tests will be performed using the test atmosphere
generation and exposure system at ENSR's Camarillo, CA
laboratory. ENSR will provide a report of the test results to
the Team through the Team's OEN representative.
6.6.4 AES/EPA airborne measurements comparisons.
The airborne measurement systems used by the AES and EPA will be
subjected to intercomparison testing according to a protocol to
be developed under the auspices of the Diagnostic Measurements
Team.
6.7 Intra-network Colocation
In addition to the data from the duplicate samplers at the State
College inter-network comparison site, intra-network precision
assessments will rely on data from 4 APIOS, 2 OEN, 6 EPA, and 1
FADMP colocated stations. The stations will be geographically
dispersed and will be changed in the OEN after the first year and
in the ME-35 every six months.
6.8 Common Filter and TFR Supplier
By agreement among participants, all Teflon, nylon, and
impregnated filters used in the field study will be supplied by a
common vendor. Following a competitive procurement, ENSR was
selected as the filter supplier. Each participating organization
will contract separately with ENSR for its supply of filters.
Filter specifications are given in Table 6-11.
The Teflon and nylon filters will be shipped in yearly batches
to each sponsor. Impregnated filters will be supplied in monthly
batches because their greater propensity for contamination limits
their shelf life. Nylon and Naphion filter-material inserts for
the transition flow reactors will also be provided by ENSR to the
ME-35, OEN, and FADMP. As the surface area of the inserts is 70%
of that of the 47-mm filters, the blank levels for nitric acid
(nylon inserts) and ammonia (Naphion inserts) will be
proportionately smaller than the values shown in Table 6-11.
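The proportionality is simple area scaling; using the nitric acid
blank from Table 6-11 as an illustration:

```python
# Blank mass scales with collection-surface area. The TFR inserts
# have 70% of the area of a 47-mm filter, so the expected insert
# blank (illustrated with the nylon/HNO3 value from Table 6-11) is:
filter_blank_ug = 1.0      # 47-mm nylon filter HNO3 blank, ug/filter
area_fraction = 0.70       # insert area relative to 47-mm filter
insert_blank_ug = filter_blank_ug * area_fraction
```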
6.9 Composite Data Archive
Site descriptions, all measurement data taken during the model
evaluation field study, and quality control data and sample
status codes that support data quality estimates will be archived
together in the Acid Deposition System (ADS) data base at
Battelle Pacific Northwest Laboratory. This archive, compositing
Table 6-11
FILTER SPECIFICATIONS
(All 47-mm diameter)

Filter Type          Target Species   Blank Levels   Recipe
                                      (ug/filter)
Teflon membrane,     Sulfate          1.1            NA
  1 µm Zefluor       Nitrate          1.3            NA
                     Ammonium         1.0            NA
Nylon membrane,      Nitric acid      1.0            NA
  S&S 1 µm Nylon 66
Whatman 41           Sulfur dioxide   2.1            15% K2CO3,
                                                     5% Glycerol
Whatman 41           Ammonia          1.0            25% Citric acid,
                                                     5% Glycerol
Whatman 41           PAN              1.0            10% KOH,
                                                     2% Glycerol
data from all participating networks and laboratories, will
ensure common data formats for like variables, irrespective of
source, and facilitate access by prospective data users. Data
will be transmitted to ADS by each participating organization on
differing schedules, but at least quarterly, covering the
preceding quarter. Thus, the longest time interval between
collection of a sample and transmittal of its measurement data
to ADS should be about 6 months.
The contents of the data archive are summarized in Table 6-12.
Functional specifications for the data archive have been
developed under contract from OME and are given by Daly and Olsen
(1988) along with a detailed description of its contents. The
archive will be established under contract from EPA and will be
maintained for two years after the completion of the study.
Thereafter, users may still obtain copies of the data on tape,
but will probably have to sort it themselves to access specific
subsets.
6.10 Individual Network Data Archives
Each of the data-generating organizations will maintain an
archive of its own data. The archive will contain not only all
the original validated data that the organization transfers to
the ADS composite archive but also the quality control data (such
as from analysis of blanks, replicates, spikes, and standards)
and field logs and zero, span and calibration data that are used
for the data quality assessments and for data validation. Also
Table 6-12
DATA ARCHIVE CONTENTS
o Support Documentation
- Program overview
- Sampling platform descriptions
- Data processing manual
- Data transfer description
- Quality control procedures manuals
- Quality control reports
- Quality Assurance reports
o Site Data Base
31 variables
o Precipitation Chemistry Record Variables
147 variables
o Filter/Transition Flow Reactor Chemistry Record Variables
97 variables
o Continuous Gas Phase Chemistry Record Variables
22 variables
o Hourly Precipitation Record Variables
12 variables
o Hourly Meteorology Record Variables
52 variables
o Aircraft Filter Chemistry Record Variables
53 variables
o Aircraft Continuous Sampling Record Variables
124 variables
archived will be the data from quality auditing of lab and field
performance.
Section 7
EMISSIONS
Comprehensive emissions inventories have been, and are being,
compiled under programs distinct from the model evaluation
program. As such, they are not strictly under the aegis of the
PMG. Nonetheless, these inventories will serve as the major
basis for emissions data inputs to the models during their
evaluation. For this reason, the PMG plans for the Emissions
Inventory Team to ascertain, to the extent possible, the
uncertainties associated with these emissions data and to work
with the Model Evaluation Team to determine how the emissions
uncertainties propagate through the models to influence the
output uncertainties. Of course, these considerations also apply
to the real-time emissions estimates, gathered over the duration
of the field study by EPA in the U.S. and during the intensives
by AES and OME in Canada (see Section 1.5).
The Team has been asked to determine to what extent quality
control has been exercised in the compilation of the inventories
in terms of checking for consistent application of emissions
calculation procedures, for data entry errors, and for
reasonableness of the values.
With respect to the volatile organic compounds, ammonia, and
soil dust inventories, there are few independent data available with
which to gauge uncertainties. At a minimum, the relative
magnitudes of the values in inventories of the same species,
compiled by different organizations, should be compared. When
discrepancies judged to be significant are noted, their causes
should be investigated and the discrepancies resolved, when
possible. When a discrepancy cannot be resolved, the model
evaluators should ascertain how using the different values as
model inputs affects the output uncertainty.
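Such a comparison could proceed grid cell by grid cell, flagging
only discrepancies large enough to matter. The sketch below is
purely illustrative: the grid values, the 2:1 ratio criterion,
and the emissions floor are assumptions for the example, not
values adopted by the Emissions Inventory Team.

```python
import numpy as np

def flag_discrepancies(inv_a, inv_b, ratio_limit=2.0, floor=1.0):
    """Flag grid cells where two inventories of the same species
    disagree by more than ratio_limit, ignoring cells whose larger
    value falls below an emissions floor (negligible sources)."""
    a, b = np.asarray(inv_a, float), np.asarray(inv_b, float)
    significant = np.maximum(a, b) >= floor
    ratio = np.maximum(a, b) / np.maximum(np.minimum(a, b), 1e-12)
    return significant & (ratio > ratio_limit)

# Hypothetical NH3 emissions (kt/yr) on a 2x2 grid from two compilations
a = np.array([[10.0, 0.2], [5.0, 8.0]])
b = np.array([[11.0, 0.9], [1.5, 7.5]])
flags = flag_discrepancies(a, b)   # only the lower-left cell is flagged
```

Flagged cells would then be candidates for the cause-finding
investigation described above.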
Section 8
MODEL EVALUATION PROTOCOLS
RADM and ADOM may be evaluated in a number of ways, as outlined in
Section 1.3. Their comparative evaluation is underway at Battelle
Pacific Northwest Laboratory, with subcontracts to the model
developers, SUNY Albany and ENSR. Protocols for evaluation of the
gas phase chemistry, scavenging (including cloud physics and
aqueous phase chemistry), and atmospheric transport modules are
being developed.
The observational data collected in the model evaluation field
study are to be used to operationally and diagnostically evaluate
the models. These evaluations will involve in one way or another
the comparison of model output with observational data.
Model evaluation is an important component of the NAPAP
assessments. For model evaluation results to be incorporated into
the 1990 NAPAP final assessment report, they should be received by
NAPAP in October 1989, although some schedule slippage is possible.
This schedule necessitates a "preliminary" evaluation of RADM and
ADOM. Over the period April through June 1989 both models will
undergo the same evaluation process, which will use data from the
first six months of the field study, including those from the
summer 1988 intensive measurement campaigns in Canada and the U.S.
The nature of the preliminary evaluation will be specified in a
model evaluation protocol document, which is scheduled for
completion in April 1989.
The protocol for the more comprehensive evaluation that motivated
the field study in the first place is only at a conceptual stage
of development. Its completion will probably take place after
gaining experience with the "preliminary" NAPAP evaluation.
It is the responsibility of the Model Evaluation Team to propose
these protocols and then to expedite their implementation. In the
meantime, the general aspects of the model evaluations, as
described in this section, are sufficiently understood to help
guide the design of the field study.
8.1 Operational Evaluation
Several approaches to operational evaluation have been
considered: geographical pattern comparison, point-to-grid-cell
comparison, and multivariate analysis. Condensed descriptions of
these are provided below.
The first, geographical pattern comparison, involves use of an
interpolation/extrapolation scheme to construct gridded data maps
based on the time-averaged field measurements and then comparison
of these gridded values with those calculated from the Eulerian
model output. A presumed advantage of this approach is that the
spatially interpolated patterns are better able to represent the
actual deposition and air quality distributions than the discrete
data from which they are derived. Seasonal or longer averages of
observed and predicted precipitation constituents such as sulfate,
nitrate, and ammonium, and air quality variables such as sulfur
dioxide, nitric acid, nitrogen dioxide, ammonia and particulate
sulfate, nitrate, and ammonium would be compared.
An interpolation method under serious consideration for this
application is kriging. (See, for example, Seilkop and
Finkelstein, 1987, for a brief explanation of simple kriging and
its application to precipitation data.) Although simple kriging
has some restrictive assumptions (e.g., Philip and Watson, 1986)
that detract from its utility for model evaluation, it has the
advantage that it yields estimates of interpolation uncertainty
for each interpolated value. This is an important attribute
because, in principle, it allows this source of variance to be
distinguished from others such as measurement uncertainty,
"subgrid" variability, and meteorological stochasticity. An
attempt to identify and use elaborations of the method that avoid
the restrictive assumptions of simple kriging will be made. The
Model Evaluation Team will decide what the preferred interpolation
method or methods will be. It must also resolve the question of
what statistical measures will be used for assessing spatial and
temporal comparability between the observational and model output
fields.
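As an illustration of the interpolation-uncertainty property, the
following sketch implements ordinary kriging with an assumed
exponential variogram. The sill, range, site coordinates, and
deposition values are all hypothetical, chosen only to make the
example self-contained; they are not study parameters.

```python
import numpy as np

def variogram(h, sill=1.0, rng=200.0):
    # Exponential variogram model; sill and range are illustrative.
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_krige(xy, z, p):
    """Estimate the field at point p from observations (xy, z).
    Returns the estimate and its kriging (interpolation) variance."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # site-to-site distances
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, :n] = A[:n, n] = 1.0                # unbiasedness constraint
    b = np.append(variogram(np.linalg.norm(xy - p, axis=1)), 1.0)
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]                  # weights, Lagrange multiplier
    return w @ z, w @ b[:n] + mu             # estimate, kriging variance

# Hypothetical seasonal wet sulfate deposition (kg/ha) at four sites
sites = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
vals = np.array([20.0, 24.0, 18.0, 22.0])
est, var = ordinary_krige(sites, vals, np.array([50.0, 50.0]))
```

The returned variance is the per-point interpolation uncertainty
noted above, which is what allows this source of variance to be
separated from measurement uncertainty and subgrid variability.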
A more traditional approach to operational model evaluation is the
so-called point-to-grid-cell (or "point-to-node") comparison in
which averaged observational data at the measurement locations are
compared with the averaged model predictions for the grid cell
containing each location. Several performance measures based on
this approach were recommended by an American Meteorological
Society workshop in 1980 and are described by Fox (1981).
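Measures of the kind recommended by the workshop can be computed
directly from the paired values. The sketch below shows a few
common choices; the observed and predicted values are
hypothetical, and this particular set of statistics is
illustrative rather than the set the Model Evaluation Team will
adopt.

```python
import numpy as np

def performance_measures(obs, pred):
    """Paired observed vs. grid-cell-predicted values -> summary stats."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    resid = pred - obs
    return {
        "mean_bias": resid.mean(),                    # systematic over/under-prediction
        "rmse": np.sqrt((resid ** 2).mean()),         # overall scatter
        "correlation": np.corrcoef(obs, pred)[0, 1],  # pattern agreement
        "fractional_bias": 2 * resid.mean() / (obs.mean() + pred.mean()),
    }

# Hypothetical seasonal sulfate wet deposition at five sites (obs)
# and the model predictions for the containing grid cells (pred)
obs = [12.0, 8.5, 15.2, 9.8, 11.1]
pred = [13.1, 7.9, 14.0, 11.2, 10.5]
stats = performance_measures(obs, pred)
```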
A third way that has been discussed is the use of principal
component analysis of both the observational and model output
data. (See Henry and Hidy, 1979, for an example of PCA
application to environmental data.) This multivariate analysis
approach takes a large number of variables, many of which may be
temporally correlated, and groups them into a smaller number of
uncorrelated variables (principal components). The correlations
result from physical and chemical associations of the variables.
Measurement data from two identical natural systems will yield
identical variables and weights in their separately calculated
principal components. Therefore, the similarity between the
principal components calculated from the observational data and
those calculated from the model output data should provide a
measure of how well the model is capturing the physical and
chemical essence of the natural system. How the degree of
similarity would be judged and interpreted remains an unresolved
issue.
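A minimal sketch of how such a comparison might proceed, assuming (as one possible choice, not one the Team has adopted) that similarity is judged by the sign-invariant congruence of corresponding loading vectors; all data below are synthetic:

```python
import numpy as np

def leading_components(data, k=2):
    """Leading principal-component loading vectors of a (time x variables)
    data matrix, from the SVD of the standardized anomalies."""
    anom = (data - data.mean(axis=0)) / data.std(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[:k]  # rows are unit-length loading vectors

def congruence(u, v):
    """Tucker congruence of two loading vectors; sign-invariant,
    with values near 1 indicating a shared component structure."""
    return abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Synthetic example: "model" output reproduces the dominant covariation
# of the "observations" up to independent noise.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))                    # shared driving signal
obs = np.hstack([t, 2.0 * t, -t]) + 0.05 * rng.normal(size=(200, 3))
mod = np.hstack([t, 2.0 * t, -t]) + 0.05 * rng.normal(size=(200, 3))
match = congruence(leading_components(obs)[0], leading_components(mod)[0])
```

Because the two data sets share one dominant mode of covariation, their separately computed first components agree closely; how close is "close enough" is precisely the unresolved judging question noted above.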
These three general approaches to model evaluation should not be
considered exhaustive. The Model Evaluation Team is considering a
number of other statistical and subjective measures of model
performance and will be receptive to any further suggestions that
appear promising.
8.2 Diagnostic Evaluation
Diagnostic evaluations will rely principally on measurement data
from the aircraft, VAR surface stations, continuous analyzers at
surface network stations in the vicinity of measuring aircraft and
enhanced chemistry sites operated by cooperating agencies.
Protocols for conducting the diagnostic evaluations have not been
completed, but will almost certainly involve some form of
point-to-grid-cell comparisons for vertically resolved data and
line-to-linearly-grouped-grid-cells comparisons for horizontal
transect data. Protocol completion will be the joint responsibility
of the Model Evaluation and Diagnostic Measurements Teams.
8.3 How Models Will be Run to Obtain Averages
Operational evaluations rely on comparing temporally averaged
data. The methods for obtaining the observational averages are
straightforward. Those for the model outputs are not, because of
presumed modeling resource constraints.
Four techniques for obtaining long-term averages are under
consideration:
o direct simulation of seasonal and annual cycles using the
models as presently configured,
o aggregation of episodic model runs to statistically represent
average behavior,
o interactive use of a comprehensive model and a simpler, less
computationally intensive model, whereby the comprehensive
model establishes typical chemical environments across the
modeling domain and the simpler model works within that
framework to calculate the actual long-term averages, and
o reconfiguration of model architecture to run more speedily and
efficiently on a parallel processing machine. Each technique
has its advantages and disadvantages.
Direct simulation is expected to be the most expensive and time
consuming of the alternatives. The cost of supercomputer running
time and the effort expended in compiling and manipulating the
requisite input data would be considerable. On the
other side of the coin, no major new software development would be
required and there is a current familiarity with running the
models as presently configured.
EPA and OME have been funding examinations of the feasibility of
breaking down the full range of meteorological variability into a
set of meteorological classes, each of which contributes some
characteristic fraction of the total wet and dry deposition to the
ground and within which exist characteristic aerometric
conditions. Feasibility would mean that by weighting the
deposition and concentrations associated with each class by its
frequency of occurrence, the long term totals and averages could
be estimated. The disadvantages of this technique are that its
feasibility has yet to be established and that because it is an
indirect method of estimating averages, it lacks the credibility
of the direct method. Its advantage is that it is less costly in
terms of money and manpower than the direct method, because it
requires less computing time and data assimilation effort.
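The weighting arithmetic itself is simple; a minimal sketch follows, in which the class names, deposition rates, and frequencies are entirely hypothetical placeholders, not results of the EPA/OME feasibility work:

```python
# Frequency-weighted aggregation of episodic model runs (hypothetical values).
class_deposition = {          # characteristic deposition per class (kg/ha)
    "wet_cyclonic": 4.2,
    "dry_anticyclonic": 0.8,
    "convective": 2.5,
}
class_frequency = {           # climatological frequency of occurrence
    "wet_cyclonic": 0.25,
    "dry_anticyclonic": 0.55,
    "convective": 0.20,
}

# Long-term total estimated by weighting each class's characteristic
# deposition by its frequency of occurrence:
long_term_estimate = sum(
    class_deposition[c] * class_frequency[c] for c in class_deposition
)
```

The credibility of the result thus rests entirely on whether the classes are genuinely characteristic and their frequencies well determined, which is the feasibility question still to be established.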
The feasibility of interactively using comprehensive and simpler
models to obtain long-term averages has not been explored in
depth. The approach was suggested by analogy with Kleinman's (1988)
solution to a related problem, in which RADM would establish a
chemical environment and a simpler model would then be used to
evaluate SO2 emissions change scenarios.
The possibility of running the models on a parallel processing
machine has only recently been brought under consideration. Its
feasibility is being explored by the RADM development staff in
separate consultations with Argonne National Laboratory and with
IBM.
This approach would require substantial modification of the
computer code and the acquisition of an appropriate existing
computer or the development of one custom-designed for this
application. The expense and effort to meet these requirements
are an obvious disadvantage, but its relative magnitude versus
direct simulation remains to be determined.
On the positive side, the very nature of the Eulerian (gridded)
approach and of the processes being simulated, which are inherently
multitudinous and parallel, makes the models ideal candidates for
parallel processing. Had appropriate hardware been available at the
inception of the models' development, they would likely have been
written in parallel mode.
Section 9
REFERENCES
Daly, D.S. and A.R. Olsen, 1988. Data Integration System for the
Eulerian Model Evaluation Field Study. Draft Report, June 1988.
Battelle Pacific Northwest Laboratories, Richland, WA 99352.
Durham, J., R. Dennis, N. Laulainen, D. Renne, B. Pennell, R.
Barchet, and J. Hales, 1986. Regional Eulerian Model Field Study:
Proposed Management and Technical Approaches. Atmospheric
Sciences Research Laboratory, U.S. EPA, Research Triangle Park,
NC. August 1986.
Fox, D.G., 1981. Judging Air Quality Model Performance. Bull.
Amer. Meteor. Soc. 62:599-609.
Henry, R.C. and G.M. Hidy, 1979. Multivariate Analysis of
Particulate Sulfate and Other Air Quality Variables by Principal
Components - Part I. Annual Data from Los Angeles and New York.
Atmos. Environ. 13:1581-1596.
Kleinman, L.I., 1988. Evaluation of SO2 Emission Scenarios with a
Nonlinear Atmospheric Model. Atmos. Environ., in press.
Philip, G.M. and D.F. Watson, 1986. Comment on "Comparing Splines
and Kriging". Computers & Geosciences 12:243-245.
Seilkop, S.K. and P.L. Finkelstein, 1987. Acid Precipitation
Patterns and Trends in Eastern North America, 1980-84. J.
Climate Appl. Meteor. 26:980-994.
APPENDICES
A. PMG Charter
B. List of pertinent quality assurance plans
CHARTER OF THE
PROJECT MANAGEMENT GROUP
FOR REGIONAL EULERIAN ACID DEPOSITION/OXIDANT
MODEL EVALUATION STUDIES
SPONSORS
Atmospheric Environment Service, Environment Canada, Toronto,
Ontario, Canada
Electric Power Research Institute, Palo Alto, CA
Environmental Protection Agency, Research Triangle Park, NC
Florida Electric Power Coordinating Group, Tampa, FL
Ontario Ministry of the Environment, Toronto, Ontario, Canada
BACKGROUND
Each of the sponsoring agencies and institutions is operating or
plans to operate an acid deposition monitoring network and to make
additional measurements for model evaluation. Each of these
approaches has independent sampling procedures. For effective
model evaluation against the common monitoring data, differences
among methods applied by the various sponsors to measure the same
variable must be defined and minimized.
The Regional Model Evaluation Quality Assurance Workshop (Toronto,
10-13 June 1986) recommended that the Sponsors establish a Quality
Assurance Management Committee (QAMC) to function as described in
the workshop report (Olsen, 1986) and proposed QA management
approach (Cox, 1986). This QAMC was constituted immediately
following the workshop and by October 1986, EPA, EPRI, and OME had
become signatories to the QAMC charter. In 1987 AES became a
signatory to the charter, bringing the committee to full
membership.
In response to a recommendation solicited by the QAMC
from the Eulerian Modeling Bilateral Steering Committee (EMBSC),
o the QAMC was renamed the Project Management Group (PMG);
o its purview was enlarged from network monitoring to also
encompass emissions inventories, measurements for diagnostic
evaluations, and the model evaluation process itself; and
o four teams were established to assist the PMG in organizing,
coordinating, and assuring the quality of operational
measurement, diagnostic measurement, emissions estimation,
and model evaluation activities as described in the Project
Plan.
Subsequently, the Florida Electric Power Coordinating Group (FCG)
adopted sampling methods identical to those used by EPRI and
joined the model evaluation field study.
PURPOSE OF THIS CHARTER
The purpose of this Charter is to:
o express the agreement of intent among Sponsoring Agencies and
Institutions to establish the Project Management Group, and
o express the extent of cooperation and obligations of the
Sponsors and the members of the Group.
OBJECTIVES
With assistance from the Teams providing technical oversight of
the Operational Measurements, Diagnostic Measurements, Emissions
Inventories, and Model Evaluation, the Group shall act to provide
a quality assured data set for model evaluation. It shall provide
well documented, scientifically credible operational and
diagnostic evaluations of RADM and ADOM.
FUNCTIONS
The Group shall:
o constitute the four Teams described in the preceding
background statement and convene them at periodic intervals;
o receive status reports from the Teams and recommend
corrective action as needed;
o produce a Project Plan for the model evaluation studies;
o direct the Teams in establishing mechanisms to:
- review and approve Sponsors' Quality Assurance Plans for
measurements and data reduction, validation, and
management;
- review and recommend the methods of establishing estimates
of bias and precision;
- encourage standardization of methods and protocols;
- encourage member agencies to practice active quality
control;
- design inter-network and inter-laboratory studies of
uncertainties; and
- specify common data base characteristics and protocols.
MEMBERSHIP
The Group membership shall consist of one member from each
Sponsoring agency who possesses these characteristics:
o has a detailed knowledge of the monitoring and research tasks
of the model evaluation project;
o is not directly related to data generation from tasks; and
o is knowledgeable in quality assurance or has support of a
quality control staff or contractor.
It is desirable, but not essential, that each Sponsor's member be
in a management position that is effective in recommending
reprogramming of resources to bring about timely corrective
action.
CHAIRMAN
The Group shall elect its chairman, who will serve a term as
agreed upon by the Group members. The chairman's duties will be
to:
o schedule regular quarterly meetings;
o prepare and provide an agenda in advance of each meeting;
o moderate the meeting;
o provide a written summary of the meeting; and
o report on Group accomplishments and model evaluation study
status to the EMBSC.
FINANCIAL SUPPORT
The Sponsoring agencies agree to support this Group in these ways:
o Provide travel and per diem for their members of the Group
and the Teams to attend four meetings per year. These
meetings may be held at one of the agency's facilities or at
a mutually convenient intermediate location such as Chicago,
IL.
o Provide 20% of their member's (or the equivalent in staff's
or contractor's) time for conducting the functions of a Group
member.
o Provide internally a Quality Assurance Officer (staff or
contractor) to assess their quality control data
interactively with the appropriate Measurements Team.
The Group shall not request the Sponsoring agencies to provide any
support or funds other than identified above. The Sponsoring
agencies will fund and manage bias and precision data experiments,
partitioning of precision experiments, and internal quality
assurance and quality control within their respective programs.
DURATION
The Sponsoring agencies may withdraw membership at any time.
This charter expires annually on 1 January, unless its Sponsors
specifically approve its continuation. A record of such action
will appear in the minutes of the fourth quarter's meeting.
APPROVAL
Designated and Approved by Agency's or Institution's Manager
Responsible for the Model Evaluation Studies.
AES Member:
Approved by: Date:
EPA Member:
Approved by: Date:
EPRI Member:
Approved by: Date:
FCG Member:
Approved by: Date:
OME Member:
Approved by: Date:
Appendix B
QUALITY ASSURANCE-RELATED DOCUMENTS
IN USE IN THE EULERIAN MODEL EVALUATION FIELD STUDY
Listed here are the quality assurance plans, work plans, operating
(procedures) manuals, and other pertinent documents that dictate
and describe how activities are to be conducted in support of the
Eulerian Model Evaluation Field Study. They are listed by the
organization to whose operations they apply.
1. Atmospheric Environment Service, Environment Canada
Quality Assurance Reports
The Canadian Air and Precipitation Monitoring Network (CAPMoN)
Quality Assurance Plan for Precipitation Monitoring Systems.
R.J. Vet and S.G. Onlock, Report CSC 110.194-3-1, Concord
Scientific Corporation, 2 Tippett Road, Downsview, Ontario M3H
2V2, March 1983.
The Canadian Air and Precipitation Monitoring Network (CAPMoN)
Quality Assurance Plan for Air Monitoring Systems. R.J. Vet,
Atmospheric Environment Service. TO BE WRITTEN
The Canadian Aircraft Program Quality Assurance Plan.
Atmospheric Environment Service. TO BE WRITTEN
Procedures Manuals
Canadian Air and Precipitation Monitoring Network (CAPMoN)
Operator's Instruction Manual - Precipitation. Air Quality
and Inter-Environmental Research Branch, Atmospheric
Environment Service, 4905 Dufferin Street, Downsview, Ontario
M3H 5T4, April 1985.
Canadian Air and Precipitation Monitoring Network (CAPMoN)
Operator's Reference Manual - Precipitation. Air Quality and
Inter-Environmental Research Branch, Atmospheric Environment
Service, 4905 Dufferin Street, Downsview, Ontario M3H 5T4,
April 1985.
Canadian Air and Precipitation Monitoring Network (CAPMoN)
Precipitation Sampling Instruments Operation and Maintenance
Manual - Operator's Edition. Atmospheric Environment Service,
4905 Dufferin Street, Downsview, Ontario M3H 5T4, April 1985.
Canadian Air and Precipitation Monitoring Network (CAPMoN)
Inspector's Reference Manual - Precipitation. Air Quality and
Inter-Environmental Research Branch, Atmospheric Environment
Service, 4905 Dufferin Street, Downsview, Ontario M3H 5T4,
April 1985.
Canadian Air and Precipitation Monitoring Network (CAPMoN)
Precipitation Sampling Instruments Operation and Maintenance
Manual - Inspector's Edition. Atmospheric Environment
Service, 4905 Dufferin Street, Downsview, Ontario M3H 5T4,
April 1985.
Preliminary Draft - Canadian Air and Precipitation Monitoring
Network (CAPMoN) Site Operator's Manual - Air and Ozone
System, Belfort gauges. Atmospheric Environment Service,
March 1988.
Preliminary Draft - Canadian Air and Precipitation Monitoring
Network (CAPMoN) Inspector's Manual - Air and Ozone System,
Belfort gauges. Atmospheric Environment Service, March 1988.
2. Electric Power Research Institute
EPRI-OEN Field Operation and Maintenance Manual. Document No.
2460-003-332, April 1988. ERT, Inc., Concord, MA and
Environmental Monitoring and Services, Inc., Camarillo, CA.
Volume I: Training and Precipitation Measurements
Volume II: Meteorological Measurements
Volume III: Aerometric Measurements
Operational Evaluation Network Quality Control Procedure
Manual (Draft). Document No. 2460-003-800, May 1988. ERT,
Inc., Concord, MA.
Operational Evaluation Network Work Plan (Draft). Document
No. P-E292-100, August 1986. ERT, Inc., Concord, MA.
Operational Evaluation Network Siting Manual. January 1987.
ERT, Inc., Concord, MA.
3. Environmental Protection Agency
Quality Assurance Reports
Acid Model Operational Diagnostic Evaluation Study Quality
Assurance Project Plan, Document No. 9100-014-800, June 1988.
ERT, Inc., Concord, MA, and Environmental Monitoring and
Services, Inc., Camarillo, CA.
Acid Model Operational Diagnostic Evaluation Study: Option XI
- The Measurement of S(IV) in Precipitation Quality Assurance
Project Plan (Draft), February 1988. Combustion Engineering,
Environmental Monitoring and Services, Inc., Camarillo, CA.
Acid Model Operational Diagnostic Evaluation Study: Option XI
- The Measurement of S(IV) in Precipitation Work Plan (Draft),
January 1988, Combustion Engineering, Environmental Monitoring
and Services, Inc., Camarillo, CA.
Procedures Manuals
Acid MODES Network Siting Manual (Draft), October 1987. ERT,
Inc., Concord, MA.
Acid MODES Field Operations and Maintenance Manual (Draft),
February 1988. ERT, Inc., Concord, MA.
Acid Model Operational Diagnostic Evaluation Study Standard
Operating Procedures Field Measurements (Draft), Document No.
G418-800, February 1988. ERT, Inc., Concord, MA. (Revised
version in preparation)
Acid Model Operational Diagnostic Evaluation Study Standard
Operating Procedures Laboratory Analysis and Data Management
(Draft), Document No. G418-800, February 1988. ERT, Inc.,
Concord, MA. (Revised version in preparation)
4. Florida Electric Power Coordinating Group
Laboratory Operations Manual. Florida Acid Deposition Study.
ESE Document No. 006F/80-610-111. Environmental Science and
Engineering, Inc., Gainesville, FL. September 1981.
Environmental Monitoring Project Quality Assurance Plan.
Florida Acid Deposition Study. ESE Document No. 004F/80-610-
111. Environmental Science and Engineering, Inc.,
Gainesville, FL. September 1981.
Field Operator's Instruction Manual (Phases I and II).
Florida Acid Deposition Study. ESE Document No. 004F/80-610-
600. Environmental Science and Engineering, Inc.,
Gainesville, FL. September 1981.
Field Operator's Instruction Manual Appendices (Phases I-IV).
Florida Acid Deposition Study. ESE Document No. 004FS/82-615-
101. Environmental Science and Engineering, Inc.,
Gainesville, FL. September 1982.
Field Operator's Instruction Manual (Phase III). Florida Acid
Precipitation Study. ESE Document No. 004FS/82-615-101.
Environmental Science and Engineering, Inc., Gainesville, FL.
October 1982.
5. National Water Research Institute, Environment Canada
External Quality Assurance. Cost Factors and Work Plans to
Examine Specific Laboratory Performance of those Laboratories
Providing Precipitation Data to Test the Eulerian Model
(Aqueous Phase).
6. Ontario Ministry of the Environment
Quality Assurance Plan - APIOS Deposition Monitoring Program.
Report ARB-76-84-ARSP. Ontario Ministry of the Environment,
1984.
Acidic Precipitation in Ontario Study, Quality Assurance
Manual: Deposition Monitoring Network. Report ARB-051-85-AQM.
Ontario Ministry of the Environment, 1985.
Technical and Operating Manual APIOS Deposition Monitoring
Program (1st Revised Edition). W.S. Bardswick. Report ARB-
082-87-AQM. Ontario Ministry of the Environment, 1987.
1986 Performance Report: Water Quality Section, Laboratory
Services Branch. W.M. Wright, Ed., 1987.