United States
Environmental Protection
Agency
Atmospheric Research and Exposure
Assessment Laboratory
Research Triangle Park, NC 27711
Research and Development
EPA/600/SR-92/112 Aug. 1992
EPA Project Summary
Protocols for Evaluating Oxidant
Mechanisms for Urban and
Regional Models
H.E. Jeffries, M.W. Gery, and W.P.L. Carter
A task force of chamber operators
and modelers was assembled to ad-
dress needs raised at a prior workshop
on the procedures and practices that
should be followed when evaluating
photochemical reaction mechanisms for
their suitability for use in EPA's urban
and regional air quality models. In ad-
dition to the task force, two workshops
were held that were attended by re-
searchers in the field and at which is-
sues raised by the task force were dis-
cussed and guidance was sought.
Based on our work and the community
input, we describe how to create a pro-
tocol for the evaluation of photochemi-
cal reaction mechanisms. Rather than
prescribe a set of actions, we present
instead criteria that influence decisions
without specifying what those decisions
must be. These specify what the evalu-
ator must consider, what is and is not
relevant, and what must be reported as
the basis of decisions made.
Based on general scientific prin-
ciples, we describe five characteristics
that reaction mechanisms must have if
they are to be acceptable. Our approach is
based on a complex argument in the
form of a cascaded inference chain
showing how to proceed to establish
that a candidate mechanism might ex-
hibit all five characteristics. The evi-
dence in this argument is chamber data.
The report details the elements that
must be considered and describes the
general content of reports the evaluator must produce. We conclude that, because of data incompleteness problems, it is not presently possible to make a compelling case for accepting a mechanism for ambient air use; one can only vindicate its use by showing that the mechanism has been evaluated as well as can currently be done, given the available data.
This Project Summary was developed
by EPA's Atmospheric Research and
Exposure Assessment Laboratory, Re-
search Triangle Park, NC, to announce
key findings of the research project
that is fully documented in a separate
report of the same title (see Project
Report ordering information at back).
Introduction
The EPA must be able to defend its
control policies and regulations. For ozone
and other oxidants these policies depend
upon the application of mathematical mod-
els of emissions, atmospheric transport,
and chemical transformations to forecast
the effects of proposed controls. An im-
portant element in this defense is the
evaluation of the accuracy and reliability
of these models. Because the models
are large and complex, the first approach
to establishing their overall accuracy is to
assess independently the accuracy of each
major component in the models, i.e., emissions, transport, and chemistry. Not only can the accuracy of the atmospheric chemical transformation component of these models be evaluated independently of the emissions and transport terms, but such independent testing is also the least ambiguous method. The need for such testing
raises the need for standards to compare
with the model and for a method by which
the comparisons can best be done. Previous work sponsored by EPA (see "Workshop on Evaluation/Documentation of Chemical Mechanisms," EPA/600/9-87/024, 1987) established that both standards
and methods acceptable to the scientific
community exist. Smog chamber data, provided they were of high quality, were recommended as the best standard. The then-existing mechanism evaluation practice, even though it varied significantly from modeler to modeler, was recommended as the best comparison process, provided that efforts were made to treat the problems caused by the influence of the chamber and to cope with the scarcity and variable quality of measurement data. The workshop recommended that a task force of
chamber operators and modelers be as-
sembled to address these problems. The
work described in this summary is part of
the effort of this task force. During the
project, the task force held two workshops
which were attended by researchers in
the field. In these, issues raised by the
task force were discussed and guidance
was sought. Based on our work and
community input, we describe how to cre-
ate a protocol for the evaluation of photo-
chemical reaction mechanisms that are
candidates for use in EPA urban and re-
gional oxidant models.
Based on general scientific principles,
we present a set of characteristics that
reaction mechanisms must have if they
are to be acceptable. A central assertion
here is that mechanism acceptance is
more than just establishing agreement of
the model's predictions with observations.
We also recognize that any set of objec-
tive criteria may be insufficient to deter-
mine in full any algorithm for mechanism
choice. Therefore, rather than prescribe a
detailed set of actions (i.e., a cookbook),
we present instead criteria that influence
decisions without specifying what those
decisions must be. Such criteria specify:
(1) what each evaluator must consider in
reaching his decisions, (2) what he may
and may not consider relevant, and (3)
what he can legitimately be required to
report as the basis for his choices. Our
goal is to have the evaluation be a com-
plete and persuasive argument and this
requires a clear chain of reasoning based
on credible and complete evidence.
Our approach is based on a complex
argument in the form of a cascaded infer-
ence chain showing how to proceed from
kinetics data and chamber data to the
necessary claims that allow use of the
mechanism in EPA models. A second
part of the inference chain addresses cred-
ibility issues for the kinetics and chamber
data. Further, we provide general war-
rants for why this reasoning chain is legiti-
mate.
Our argument is structured around es-
tablishing that an acceptable mechanism
has five general characteristics: (1) con-
sistency with currently accepted theories
and facts applicable to these theories; (2)
accuracy, that is, its predictions are in
demonstrated agreement with observations
from experiments and such agreement
does not arise from compensating errors;
(3) simplicity, that is, the mechanism
should explain events in terms of neces-
sary causal forces rather than empirical
generalizations; (4) scope, that is, the
mechanism's predictions should extend
beyond the particular observations it was
initially designed to explain; and (5) fertil-
ity, that is, the mechanism is expected to
predict novel phenomena that were not
part of the set to be originally explained.
In an evaluation, data that are relevant to
each of these characteristics are as-
sembled to construct lower-order claims
(e.g., the mechanism is well-formulated)
which in turn are assembled into higher-
order and more abstract claims that lead
to acceptance or rejection of the mecha-
nism.
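The cascade described here, in which evidence-backed lower-order claims are assembled into higher-order and more abstract claims, can be sketched as a small tree structure. The claim names and the all-subclaims-must-hold combination rule below are illustrative assumptions for this sketch, not the report's actual claim set:

```python
# Sketch of a cascaded inference chain: lower-order claims, each backed by
# evidence, are combined into higher-order claims that lead to a verdict.
# Claim names and the "all subclaims must hold" rule are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Claim:
    name: str
    supported: bool = False          # set directly for evidence-backed claims
    subclaims: List["Claim"] = field(default_factory=list)

    def holds(self) -> bool:
        # A higher-order claim holds only if all of its subclaims hold.
        if self.subclaims:
            return all(c.holds() for c in self.subclaims)
        return self.supported


# Hypothetical lower-order claims assembled from kinetics and chamber data:
well_formed = Claim("mechanism is well-formulated", supported=True)
accurate = Claim("predictions agree with chamber observations", supported=True)
no_compensation = Claim("agreement not due to compensating errors",
                        supported=False)

# Higher-order claim that would justify use in EPA models:
acceptable = Claim("mechanism acceptable for ambient-air use",
                   subclaims=[well_formed, accurate, no_compensation])

print(acceptable.holds())  # False: one lower-order claim is unsupported
```

The point of the structure is that acceptance fails as soon as any supporting claim in the chain cannot be established, which mirrors the report's conclusion that missing evidence blocks a compelling case.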
After the first chapter that presents the
mechanism characteristics, the argument,
and a general description of the process,
the report is divided into five substantive
chapters and a conclusions chapter. One
chapter defines the terminology and meth-
ods used and presents an overview of the
operations. Four chapters describe the
elements that must be considered in an
evaluation and provide specification of the
content of reports that must be produced
by the evaluator. These are: (a) reports
on the mechanism, (b) reports on the
chamber data, (c) reports on the simula-
tions, and (d) reports on the evaluation
process.
Names and Definitions
In such a complex procedure involving
models, data, and operations, it is neces-
sary to agree on formal terms and to have
formal definitions. For example, the first
part of the definition of a principal mecha-
nism is "a complete mechanism for an
organic compound or mixtures of organic
compounds combined with oxides of ni-
trogen and air that all chambers and the
ambient air are thought to have in com-
mon at present." The rest of the definition
describes what is included in and what is
not included in a mechanism. Unfortu-
nately, principal mechanisms are not what
are tested with chamber data. This is
because each chamber has a perhaps
unique set of wall-mediated reactions with
reactants and products in common with
the principal mechanism, and thus, these
wall reactions influence the chemistry; their
effects must be included in any simulation
if it is to describe accurately the chamber
observations. Thus, auxiliary reaction
mechanisms, one for each chamber, must
be added to the principal mechanism and
it is this combined mechanism that is
tested with chamber data. Other terms
and definitions given in this chapter are
related to the chamber operations models
and the comparison of observations and
predictions.
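The distinction between the principal mechanism and the combined mechanism that is actually tested can be illustrated with a minimal sketch; the reaction strings and chamber names below are hypothetical placeholders, not drawn from the report:

```python
# Sketch: the mechanism actually tested against a chamber's data is the
# principal mechanism plus that chamber's auxiliary (wall-reaction)
# mechanism.  All reactions and chamber names here are placeholders.
principal = [
    "NO2 + hv -> NO + O",
    "O + O2 + M -> O3 + M",
    "O3 + NO -> NO2 + O2",
]

# One auxiliary mechanism per chamber (wall-mediated reactions):
auxiliary = {
    "chamber_A": ["wall -> HONO", "O3 -> wall"],
    "chamber_B": ["wall -> HCHO", "NO2 -> wall"],
}


def combined_mechanism(chamber: str) -> list:
    """Principal mechanism plus the chamber-dependent wall reactions."""
    return principal + auxiliary[chamber]


print(len(combined_mechanism("chamber_A")))  # 5 reactions
```

Because the auxiliary part differs from chamber to chamber, a simulation of any one experiment always tests the combined mechanism, never the principal mechanism in isolation.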
Reporting Mechanisms
For an evaluator to test a principal
mechanism using our procedures, certain
information about the principal mechanism
must be known and reported. First, the
evaluator must report that he has an ex-
ecuting version of the mechanism and
that he can operate the mechanism as
intended by the developer. Next, he must
report that he has assured that the mecha-
nism is "well-formed." Thus, he must have
knowledge of the development protocol,
knowledge of the supporting data used in
formulating the mechanism, and he must
document that he understands the formu-
lation and condensation rationale used by
the developer. Further, he must deter-
mine the basis for believing that the prin-
cipal mechanism's formulation is indepen-
dent of the chamber data used in its de-
velopment. If the evaluator determines
that the mechanism suffers from flaws in
its formulation, he must decide and report
(justify) one of three courses of action: (1)
decide to act as if the problem would not
influence the mechanism's predictions and
proceed; (2) decide that the mechanism
needs to be "fixed" before proceeding,
deviate from the protocol to fix the
mechanism's formulation, and then pro-
ceed; or (3) decide that the mechanism's
flaws are so bad that it cannot be fixed
and, therefore, stop the evaluation be-
cause any exhibited accuracy by a funda-
mentally flawed mechanism is meaning-
less.
Assuming that he chooses to proceed,
the evaluator must select chamber-depen-
dent auxiliary mechanisms for each cham-
ber that will be used and he must assure
(and document) that these mechanisms
are also as well-formed as we know how
to make them.
Reporting Chamber Data
The primary evidence that would support the claims leading to mechanism acceptance is chamber data. For cham-
ber data to be evidence in these matters,
they must exist and they must be relevant
to the claims being made, i.e., the cham-
ber data must be capable of causing the
evaluator to change his mind about the
validity of the claims.
Thus, the evaluator must first report the chamber data needed to conduct a compelling evaluation, that is, one in which the evidence (if it existed) would permit only one reasonable interpretation of the acceptability of the mechanism. The
second condition to be reported is the
existence of the needed data; and the
third condition is the relevance of any
existing data to the case being made.
The foundations of these arguments rest
on the credibility of the chamber data used,
and if the evaluator is to produce a suc-
cessful evaluation, a reviewer of the evalu-
ation must be convinced that the chamber
data used in the comparisons are believ-
able. Therefore, the fourth condition the
evaluator must report is chamber data
credibility and why he believes it has this
quality. The evaluator then must report
the extent to which the available, relevant,
credible chamber data meet the data need.
Where there is a significant shortfall in
available data, the evaluator must report
on how he will continue to conduct an
evaluation when only part of the chamber
data that is needed is available.
Once the chamber data are selected,
the evaluator must prepare the appropri-
ate summary observations for comparison
with simulation predictions in subsequent
work.
Reporting Simulations
For each selected experiment to be
simulated, the evaluator must build a simu-
lation solver input file containing the
model's representations of the chamber
conditions. First, species present in the
chamber experiment must be represented
in the simulation input file. For explicit
species in the model and in the chamber
this is no problem. For chamber species
that are represented as some type of gen-
eralized model species, the appropriate
transformations must be applied. For spe-
cies thought to be present in the cham-
ber, but that were not measured, the evalu-
ator must report his algorithms for esti-
mating these values. Data on light inten-
sity must also be converted into actinic
flux for each experiment. The evaluator
must report the algorithms he uses for this
calculation. Next, photolysis rates for the
photolyzing species in the mechanism
must be calculated from the actinic flux,
and these algorithms must be reported.
Finally, the simulation's representation of
chamber temperature, dilution, and water
vapor must be created and again the
evaluator must report the algorithms used
to produce these representations from the
observed chamber data. The simulation
solver input files must have internal docu-
mentation for their values and the files
must be available for distribution to inter-
ested parties.
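The actinic-flux-to-photolysis-rate step mentioned above is conventionally a wavelength integral of actinic flux times absorption cross section times quantum yield, discretized over wavelength bins. A minimal sketch, with all numerical values hypothetical rather than measured chamber data, might look like:

```python
# Sketch of the photolysis-rate calculation: for a photolyzing species,
# J = sum over wavelength bins of (actinic flux * absorption cross
# section * quantum yield).  All numbers below are placeholders.

def photolysis_rate(actinic_flux, cross_section, quantum_yield):
    """J = sum_i F_i * sigma_i * phi_i over wavelength bins (1/s)."""
    return sum(f * s * q for f, s, q in
               zip(actinic_flux, cross_section, quantum_yield))


# Three illustrative wavelength bins:
flux = [1.0e14, 2.0e14, 1.5e14]      # photons cm^-2 s^-1 per bin
sigma = [2.0e-19, 1.5e-19, 1.0e-19]  # absorption cross section, cm^2
phi = [1.0, 0.9, 0.5]                # quantum yield, dimensionless

J = photolysis_rate(flux, sigma, phi)  # ~5.45e-5 s^-1 for these inputs
```

Whatever the evaluator's actual algorithm, the report requires that it be documented at each stage: light intensity to actinic flux, and actinic flux to per-species photolysis rates.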
After the simulations are run, the evalu-
ator must compute and report the corre-
sponding summary measures for the pre-
dictions as was done for the observations.
He then must calculate the absolute and
relative errors in amplitudes and in times
to events. Next, he must display these
errors on plots that show results on a
chamber-by-chamber basis. Other re-
quirements for these plots are described
that help detect internal compensating er-
rors among the parts of the principal
mechanism. All simulations with large
errors must be plotted as concentration
time profile plots of observations and pre-
dictions and the evaluator must offer ex-
planations for why the agreements were
so poor.
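The absolute and relative errors in amplitudes and in times to events can be sketched as follows; the observed and predicted values are hypothetical placeholders standing in for summary measures such as an ozone maximum and its time of occurrence:

```python
# Sketch of the error measures: absolute and relative error of a summary
# measure, applied both to an event's amplitude and to its time.
# Observed/predicted values below are hypothetical placeholders.

def errors(observed, predicted):
    """Return (absolute error, relative error) of a summary measure."""
    abs_err = predicted - observed
    rel_err = abs_err / observed
    return abs_err, rel_err


obs_o3_max, pred_o3_max = 0.40, 0.34    # ozone maximum, ppm
obs_t_max, pred_t_max = 360.0, 420.0    # minutes to ozone maximum

amp_abs, amp_rel = errors(obs_o3_max, pred_o3_max)   # -0.06 ppm, -15%
time_abs, time_rel = errors(obs_t_max, pred_t_max)   # +60 min, +16.7%
```

Tabulating both kinds of error per chamber, as the report requires, is what makes compensating errors among parts of the principal mechanism detectable.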
Reporting the Evaluation
The evaluator would now have all the
raw evidence at hand and he must build a
case to accept or reject the mechanism.
Three major obstacles stand in the way of
making compelling cases: (1) Because the
theories to support a chamber-auxiliary
mechanism are much weaker than those
for a principal mechanism and because
there are too few chamber characteriza-
tion data, the evaluator may have found it
necessary to use an auxiliary mechanism
that was more "tuned" than it was formu-
lated. Thus, there is the real possibility
that any demonstrated accuracy of the
combined mechanism is caused by com-
pensation between inaccurate components
in both the auxiliary mechanism and the
principal mechanism. (2) The chamber
data are incomplete in the necessary in-
puts to simulate accurately an experiment
and these inputs must be estimated by
the evaluator. If there is bias in the choice
of unmeasured but necessary simulation
inputs, then again there is the real possi-
bility any demonstrated accuracy of the
combined mechanism is caused by com-
pensating errors between the principal
mechanism and the simulation inputs. (3)
The chamber data are incomplete in their
coverage of single species and simple
mixtures. Thus, the accuracy of mecha-
nisms for some reacting species in the
principal mechanism cannot be tested at
all. Strategies for treating these difficul-
ties are discussed. The evaluator must
report his choices and procedures for ad-
dressing these problems.
The evaluator must make judgments
about each experiment he simulated and
must record these judgments in a table
form described in the text. He must report
these judgments by chamber and by ex-
perimental class. Finally, he must make
summary statements about the five char-
acteristics that acceptable mechanisms
must have.
Conclusions
We have presented criteria that influ-
ence decisions without specifying what
those decisions must be; we have com-
pletely described all the elements the
evaluator must consider in performing an
evaluation; we have made clear those is-
sues that are relevant to an evaluation;
and we have been explicit about what the
evaluator must report as the basis of the
decisions he makes. We have presented
a process for evaluating reaction mecha-
nisms that is based on sound scientific
principles, draws on classical logic and
formal arguments, and considers practical
difficulties. It is based upon evolving cur-
rent practice, but represents a significant
improvement over existing evaluations.
Using this report, an evaluator and a
sponsor can produce a specific set of
agreements—a protocol—for an evalua-
tion of a mechanism using a set of
chamber data. As an aid in this process,
the Appendix contains a summary of all
the necessary reports.
Chamber data from UNC and from UCR
that meet the criteria given in this report
will be available as part of this project
later in 1992. Additional chamber data
from the chambers operated by the Com-
monwealth Scientific and Industrial Re-
search Organization in Sydney, Australia,
are being documented and quality assured
now and should be available by the end
of 1993.
Because of the data incompleteness
problem that is described in Chapter 6,
however, it is presently not possible to
make a compelling case to accept a chemi-
cal reaction mechanism as accurately de-
scribing urban chemical transformations.
In fact, the chamber evidence may not
allow even a presumptive standard to be
used. For many species of interest, there
is only missing evidence. Therefore, until
relevant chamber data are made avail-
able, the evaluator and the sponsor must
be satisfied with vindicating the use of a
mechanism, that is, to claim that the
mechanism has been evaluated the best
we can currently do given the available
data.
U.S. Government Printing Office: 1992 — 648-080/60049
H.E. Jeffries is with the University of North Carolina, Chapel Hill, NC 27599. M.W. Gery is with Atmospheric Research Associates, Boston, MA 02116. W.P.L. Carter is with the University of California, Riverside, CA 92522.
Marcia C. Dodge is the EPA Project Officer (see below).
The complete report, entitled "Protocols for Evaluating Oxidant Mechanisms for
Urban and Regional Models," (Order No. PB92-205 848/AS; Cost: $26.00;
subject to change) will be available only from:
National Technical Information Service
5285 Port Royal Road
Springfield, VA 22161
Telephone: 703-487-4650
The EPA Project Officer can be contacted at:
Atmospheric Research and Exposure Assessment Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711