United States
Environmental Protection
Agency
Environmental Monitoring
Systems Laboratory
Research Triangle Park NC 27711
Research and Development
EPA/600/S4-86/012 June 1986
EPA Project Summary
Precision and Accuracy
Assessments for State and Local
Air Monitoring Networks 1983
Raymond C. Rhodes and E. Gardner Evans
Precision and accuracy data obtained
from state and local agencies during
1983 are summarized and evaluated.
Some comparisons are made with the
results previously reported for 1981
and 1982 to determine the indication
of any trends. Some trends indicating
improvement in the precision and ac-
curacy of monitoring data are given on
a national and regional basis. The annual
average results from each reporting or-
ganization are given so that compari-
sons may be made from 1981 to 1983
and with other reporting organizations.
A comparison of the precision and
accuracy data from the Precision and
Accuracy Reporting System and that
from the independent performance
audit program conducted by the En-
vironmental Monitoring Systems Labo-
ratory is given.
This Project Summary was developed
by EPA's Environmental Monitoring
Systems Laboratory, Research Triangle
Park, NC, to announce key findings of
the research project that is fully docu-
mented in a separate report of the same
title (see Project Report ordering in-
formation at back).
Introduction
The purpose of the full report is to
present the third year of data from the
Precision and Accuracy Reporting System
(PARS). Federal regulations promulgated
on May 10, 1979, require quality assurance
precision and accuracy (P and A)* data to
be collected. Collection started January 1,
1981, according to requirements set forth
in 40 CFR Part 58, Appendix A. These
requirements provide for more uniform
quality assurance programs and specific
precision and accuracy reporting
requirements across all State and local
air monitoring agencies.

*When one speaks of precision and accuracy of
measurement data, one really means the precision
and accuracy of the measurement process from
which the measurement data are obtained. Precision
is a measure of the "repeatability of the measurement
process under specified conditions." Accuracy is a
measure of "closeness to the truth."
The major portion of the full report
consists of summarizations and evalua-
tions of the P&A data obtained by the
efforts of the States and local agencies.
In addition, comparisons have been made
of the accuracy data collected for PARS
with the results of the National Per-
formance Audit Program (NPAP) which
has been an ongoing program conducted
by the Environmental Monitoring Systems
Laboratory (EMSL) since the early 1970's.
These summarizations and evaluations
of precision and accuracy data serve the
following purposes:
1. Quantitative estimates of the preci-
sion and accuracy of their monitor-
ing data are available to State and
local agencies.
2. A comparison of the data from all the
agencies can indicate the need to
improve quality assurance systems
in specific reporting organizations.
3. An evaluation of the results may
indicate a need for improvement in
monitoring methodology.
4. The assessments provide users of
data from the State and Local Air
Monitoring Stations (SLAMS) net-
work a quantitative estimate of the
precision and accuracy of the am-
bient air quality data.
Ambient air quality data, collected by
States and local agencies since 1957,
have been stored in the National Aerometric
Data Bank (NADB). These data are
used in (1) planning the nation's air pol-
lution control strategy, (2) determining if
the National Air Quality Standards are
being achieved, and (3) determining
long-term trends of air quality. Prior to
the EPA air monitoring regulations of
May 10, 1979, the procedures used in
selecting monitoring sites, operating and
controlling the equipment, and calculat-
ing, validating and reporting the data
varied considerably among agencies.
Frequently the procedures being used
were not well documented. These condi-
tions made it difficult to intercompare
data from different sites and agencies.
Furthermore, little information was avail-
able on the reliability of the monitoring
data.
To help alleviate these problems, EPA's
air monitoring regulations imposed uni-
form criteria on network design, siting,
quality assurance, monitoring methods,
and data reporting after December 30,
1980. For example, only EPA reference,
equivalent, or other EPA-approved air
monitoring methods were to be used.
Also, calibration standards were to be
traceable to the National Bureau of Stan-
dards (NBS) or other authoritative stan-
dards. Further, the quality assurance
systems of the states were required to be
documented and approved by the EPA
Regional Offices. Finally, the reporting
organizations must also follow specific
procedures when assessing the P and A
of their measurement systems and must
report the P&A data to EPA quarterly.
Starting January 1, 1981, these regula-
tions became effective for National Air
Monitoring Sites (NAMS), and beginning
January 1, 1983, for all State and Local
Air Monitoring Stations.
The precision assessments were deter-
mined by performing repeated measure-
ments on ambient-level "calibration"
gases at two-week intervals for con-
tinuous methods, or by obtaining duplicate
results from collocated samplers for
manual methods. The accuracy assess-
ments were generally determined by
analyzing blind audit materials traceable
to NBS. During each calendar year, each
site or instrument must be audited at
least once. Details concerning the specific
procedures and computations used to
assess P and A are contained in the
regulations.
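As an illustrative sketch only (the governing formulas are specified in 40 CFR Part 58, Appendix A; the function names and example values here are invented), the single-check percent differences behind these assessments can be computed as follows:

```python
# Illustrative sketch of single-check percent differences for P and A
# assessments. The exact formulas are in 40 CFR Part 58, Appendix A;
# function names and sample values here are assumptions.

def precision_check_pct(indicated, known):
    """Percent difference of an analyzer's response to a known
    ambient-level test concentration (continuous methods)."""
    return 100.0 * (indicated - known) / known

def accuracy_audit_pct(audit_result, true_value):
    """Percent deviation of a blind-audit result from the true
    (NBS-traceable) value."""
    return 100.0 * (audit_result - true_value) / true_value

print(precision_check_pct(0.092, 0.100))  # analyzer reading low by 8%
print(accuracy_audit_pct(0.105, 0.100))   # audit result high by 5%
```

Each biweekly check or annual audit contributes one such percent difference, and the probability limits described later in this summary are computed from the collection of these individual values.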
National Results
National Data Reporting
The third year of data collected by
State and local agencies for P&A has
been compiled and summarized. Obvious
improvements in the network operation
have been made. Table 1 shows the im-
provement in data reporting for the
Nation.
Table 1.  National Percent Data Reporting for Required
          Precision and Accuracy

  Pollutant
  measurement      1981   1982   1983
  CO                 77     89     99
  SO2                82     93     96
  NO2                56     72     88
  O3                 83     89     99
  TSP                94     97     99
  Pb                  —      —     93
  SO2 (manual)        —      —     75
  NO2 (manual)        —      —     86
Improvement continues for the con-
tinuous NO2 method; however, the per-
centage still lags behind that for the
continuous CO, SO2, and O3 methods.
Reporting for the manual methods for Pb,
SO2, and NO2 was required by the regula-
tions beginning January 1, 1983. The
fact that 1983 was the first year for
reporting the manual SO2 and NO2
methods is perhaps one reason for the
percentage data capture being somewhat
low. Another reason may be that
these manual methods are being replaced
by the continuous methods, which are
much more precise and accurate.
1983 Results From the
PARS Program
The measures of precision and accuracy
are required to be computed and reported
by the States and local agencies as per-
centage values. For precision, the repeat-
ability for each check is measured as the
deviation from expected values as a per-
centage of the expected value. For ac-
curacy, the deviation of the audit value
from the true value is measured as a
percentage of the true value. For both
precision and accuracy, 95 percent prob-
ability limits are computed for the per-
centage values from the average and
standard deviations of the individual
percentage values:
"D ± 1.96 S
where D = the average of the individual
percent differences;
S = the standard deviation of
the individual percent dif-
ferences;*
1.96 = the multiplication factor
corresponding to 95%
probability.
*Note: For the precision of manual
methods obtained from paired observa-
tions,the standard deviation, S, is divided
by \J2, to obtain variability estimates that
apply to individual reported values.
These upper and lower 95% probability
limits are reported and discussed in the
full report.
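The computation just described can be sketched as follows (the sample percent differences are invented, not taken from the report; the √2 divisor is applied only for collocated-pair precision):

```python
import math

def probability_limits(pct_diffs, collocated_pairs=False):
    """95% probability limits, D-bar +/- 1.96*S, computed from the
    individual percent differences. For manual-method precision from
    collocated pairs, S is divided by sqrt(2) so that the limits
    apply to individual reported values."""
    n = len(pct_diffs)
    d_bar = sum(pct_diffs) / n
    s = math.sqrt(sum((d - d_bar) ** 2 for d in pct_diffs) / (n - 1))
    if collocated_pairs:
        s /= math.sqrt(2)
    return d_bar - 1.96 * s, d_bar + 1.96 * s

# Invented percent differences from eight biweekly precision checks:
checks = [-3.0, 1.5, -4.2, 0.8, -2.1, 2.4, -1.0, -0.5]
lower, upper = probability_limits(checks)
print(f"95% probability limits: {lower:.1f}% to {upper:.1f}%")
```

A persistent negative average, as reported for the continuous SO2 method, shifts both limits downward even when the spread itself is small.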
Moreover, it should be noted that the
data and the evaluations presented in the
full report include any outlier values
which may have been reported by the
States and local agencies. It is possible
that the presence of outliers might in-
fluence such comparisons by having
undue impact on average values for in-
dividual reporting organizations.
Table 2 shows the national values for
each of the manual pollutants. The prob-
ability limits in Tables 2 and 3 represent
the unweighted arithmetic averages of
all the reported probability limits for 1983.
Historically, probability limits have been
combined in this manner for the full
report. Thus, for continuity and compari-
sons to show trends, the unweighted
average method was used here. A more
statistically pure procedure for combining
probability limits, which is described in
Appendix B of the full report, is now being
used in EPA's PARS system. By examining
the number of valid collocated data pairs
(16,816) and the number of audits (6,989)
performed for TSP, one can appreciate
the amount of effort being expended in
this country to obtain these data quality
assessments.
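The unweighted combination used historically can be sketched as follows (the per-organization limits below are hypothetical, not values from the report):

```python
# Hypothetical per-organization 95% probability limits (lower, upper), %.
org_limits = [(-12.0, 10.0), (-8.0, 9.0), (-15.0, 14.0)]

# Unweighted arithmetic average: every reporting organization counts
# equally, regardless of how many checks or audits it performed.
n = len(org_limits)
national_lower = sum(lo for lo, _ in org_limits) / n
national_upper = sum(up for _, up in org_limits) / n
print(national_lower, national_upper)
```

Because each organization counts equally, a small agency with few checks moves the national value as much as a large one, which is one reason a more statistically rigorous pooling procedure was later adopted.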
The precision limits reflect the repeat-
ability of the methodology used in the
field to collect and analyze the samples at
ambient levels. The spread of the limits
may be somewhat inflated due to mea-
surements at relatively low concentration
levels.
The accuracy of the manual methods
indicates the limits at predetermined
concentration levels for the chemical
analysis performed on the samples for
lead, sulfur dioxide, and nitrogen dioxide.
For the TSP method, the accuracy mea-
surement is for the flow rate only. The
probability limits for manual accuracy are
very good and reflect the quality of work
done in the chemical laboratories for
lead, sulfur dioxide, and nitrogen dioxide
analyses, and in the field for flow rate
measurement for the TSP method. Be-
cause of the continual replacement of
the manual SO2 and NO2 methods with
continuous methods, further discussion
of the manual methods is limited. The
detailed results, however, are tabulated
in an appendix for each reporting
organization.
The precision and accuracy limits for
automated methods are presented in
Table 3. As is apparent from the number of
precision checks (for example, 36,887 for
SO2), the effort expended for the collection
of quality assurance precision and ac-
curacy data is appreciable, but necessary
to assess data quality. Details of the
results are discussed in the analysis
section.
Table 2.  National Precision and Accuracy Probability Limit
          Values for Manual Methods for 1983

                      Number of   Precision              Accuracy
                      valid col-  probability         probability limits (%)
                      located     limits (%)  No. of  Level 1     Level 2     Level 3
  Pollutant           data pairs  Lower Upper audits  Lower Upper Lower Upper Lower Upper
  TSP                   16,816     -11   +12   6,989    —     —    -6    +6    —     —
  Lead                   3,885     -14   +15   1,389   -8    +7    -6    +4    —     —
  Sulfur dioxide           389     -28   +41     301  -14    +7    -9    +5   -7    +4
  Nitrogen dioxide       1,324     -19   +21     348   -6   +10    -5    +6   -5    +6
Table 3.  National Precision and Accuracy Probability Limit
          Values for Automated Analyzers for 1983

              No. of     Precision              Accuracy
              precision  probability         probability limits (%)
              checks     limits (%)  No. of  Level 1     Level 2     Level 3
  Pollutant              Lower Upper audits  Lower Upper Lower Upper Lower Upper
  SO2          36,887     -13    +8   1,791  -15   +10   -12   +10   -11    +9
  O3           21,342     -10    +9   1,920  -11   +10    -8    +7    -8    +6
  CO           15,714      -8    +6   1,515  -12    +9    -6    +6    -5    +4
  NO2           9,299     -13   +12     680  -19   +15   -12    +9   -11    +6

National Precision
Results Comparison

Figure 1 shows the national values for
precision for the various methods. With
data from three years, some minor trends
are evident. Some slight improvement,
as measured by a reduction in the spread
of the limits, is noted for TSP and the
continuous methods, except for NO2. The
persistent negative bias for the continu-
ous SO2 method indicates that on the
average there is some negative instru-
ment drift from the most recent calibration
or instrument adjustment to the time of
the biweekly precision check.

Although the manual methods for Pb,
SO2, and NO2 were not required to be
reported until 1983, a number of agencies
began reporting in 1981. The results for
Pb show a decided improvement. The
manual SO2 and NO2 methods are much
more variable than the continuous
methods, and, although the limits were
worse in 1982 than in 1981, the results for
1983 are appreciably better than in 1981.

[Figure 1. National precision values for 1981, 1982, and 1983.]

National Accuracy
Results Comparison

Figures 2a and 2b show the national
values for accuracy audits for the manual
and continuous methods, respectively.
Improvement for the manual methods is
not evident except perhaps for Pb and
SO2 level 1. Slight improvement is evident
for all the continuous methods. The con-
tinuous methods for SO2 and NO2 show
more inaccuracy than the other methods.
[Figure 2. National accuracy values for 1981, 1982, and 1983:
(a) manual methods; (b) continuous methods.]
However, it is pointed out that the ac-
curacy audits for the manual methods
check only a portion of the measurement
method.

The most consistent improvement has
occurred with the O3 method. Although
the continuous NO2 method is more vari-
able than the other methods, it has shown
the greatest improvement, particularly for
the level 1 concentration.

The general, and expected, pattern of
variability across levels is very evident,
with the greatest percentage variability
at the lowest concentration levels. The
slight negative biases for the continuous
SO2 and NO2 methods are consistent
across all three levels. This indicates
that, on the average, there appears to be
a negative drift with these analyzers from
the time of last calibration or instrument
adjustment until the time of the accuracy
audit.
Comparison of Results
from the PARS and the
Performance Audit
Program
A general comparison between the ac-
curacy data of the PARS program and the
Performance Audit (PA) data is included
in the full report. The audit data are the
results of an independent check, the
National Performance Audit Program
(NPAP), conducted by the Quality Assur-
ance Division (QAD) of the EMSL.
In the NPAP, specially prepared samples
or devices are sent from EMSL to the
ambient air monitoring agencies. The
samples or devices are carefully and
accurately assessed by EMSL utilizing
NBS Standard Reference Materials
(SRMs) or standards. The monitoring
agencies analyze or measure the samples
or devices as unknowns or blinds and
report their results to EMSL for evalua-
tion. Audit programs are conducted for
the following pollutant measurements
using the materials indicated:
  Measurement    Audit materials                 Portion of measure-
                                                 ment system audited
  SO2 (manual)   Freeze-dried sodium sulfite     Chemical analytical
  NO2 (manual)   Aqueous sodium nitrite          Chemical analytical
  Pb             Filter strip with lead nitrate  Chemical analytical
  TSP            Reference flow device           Flow
  CO             Cylinders containing CO gas     Continuous instrument
  SO2            Cylinder containing SO2 gas     Continuous instrument

The audit materials or devices are
prepared at three to six different con-
centrations or flow levels. Separate re-
ports on the evaluation of the PA data are
published by EMSL.

As indicated above, the NPAP does not
yet include an audit for the ozone or
continuous NO2 methods. Therefore, no
comparisons of the NPAP or PA data with
the PARS data are possible for these
measurements.

Since precision assessments are not
made in the PA program, only accuracy
can be compared across the PARS and
the PA programs. In the full report, the
results from PARS and the PA system are
compared at approximately the same
levels by matching laboratories and re-
porting organizations. Since the PARS
data are presented with outliers, the same
approach was taken with the audit data.
Knowledge of the historical audit data
reports, however, indicates that the pre-
sence of outliers may make a significant
difference in the audit results for some
agencies.
Comparisons of the national values of
the probability limits (Table 4) show good
agreement between the results of the
two programs. However, there is con-
siderable variation between the results of
the two programs when comparisons are
made on Regional and reporting organiza-
tion bases. Lack of better agreement
results from several factors. First, the
inclusion of outlier values in the PA data
appears to have introduced some exces-
sive distortion of general trends. Second,
even though the PARS averages in Table
4 are weighted by the number of audits,
variations due to many sources of error
for both data sets are averaged together
to obtain the national values, thereby
masking any correlations which may have
existed for the results of individual
agencies. Third, the concentration levels
for the two systems do not coincide
exactly at each of the audit levels. Fourth,
the PA data are the results of independent
external audits, while the PARS accuracy
data are based on the results of in-
dependent internal audits. The expected
effects of the last-mentioned factor would
cause the spread of the limits for the PA
to be wider than that for the PARS. The
results (see Table 4) confirm these
expectations.
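By contrast with the unweighted national averages used elsewhere in the summary, the audit-weighted PARS averages in Table 4 can be sketched as follows (the numbers are hypothetical, not values from the report):

```python
# Hypothetical (lower probability limit %, number of audits) per
# reporting organization.
orgs = [(-10.0, 120), (-20.0, 30), (-8.0, 50)]

# Weighting by the number of audits gives organizations with more
# audits proportionally more influence on the national value.
total_audits = sum(n for _, n in orgs)
weighted_lower = sum(limit * n for limit, n in orgs) / total_audits
print(weighted_lower)
```

Even with this weighting, averaging over all organizations pools many distinct error sources into a single national figure, which is why correlations visible for individual agencies can be masked at the national level.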
Conclusions and
Recommendations
The results of PARS data for 1983
indicate some general improvement over
the data for 1982. However, considerable
differences exist among Regions and
individual reporting organizations for
most measurement methods. Investiga-
tions should be made by the Regions and
the states to determine the causes of
these significant differences.
Comparison of the PARS and PA data
shows more variability in the PA data than
in the PARS data. These differences are
presumably due to the fact that the
external PA accuracy audits are more
completely independent than the internal
PARS accuracy audits. These differences
have been consistent for the years 1981,
1982, and 1983.
Further improvement in the data quality
assessments, which are measures of the
monitoring data quality, can be achieved
only through continuing efforts of State
and local agency personnel involved (first-
hand) with the operation and quality
control of their measurement systems.
Regional Quality Assurance (QA) Coordi-
nators can also assist through their review
of the operations and quality control
practices across the States in their
Regions.
Each Regional QA Coordinator should
evaluate the PARS data from all the re-
porting organizations within his Region
to identify those organizations having
excessively large variations of probability
limits. Investigation should be made to
determine the causes and correct them
to preclude future excessive deviations.
Similarly, Regional QA Coordinators
should review the operations of the re-
porting organizations having significantly
better precision and accuracy results in
order to identify specific procedures
which should be uniformly used through-
out the Region and the Nation to further
improve the reliability of the monitoring
data in the National Aerometric Data
Base.
Table 4.  Summary Comparison of EMSL Performance Audits
          (PA) vs. PARS Accuracy Audit Data for Year 1983
          (PARS values shown in parentheses)

                            National values, probability limits (%)
                         Level 1       Level 2       Level 3       Level 4
  Pollutant     Audits  Lower Upper   Lower Upper   Lower Upper   Lower Upper
  CO       PA     1753   -23   +21     -10   +13     -14   +16      —     —
           PARS  (1228) (-15) (+13)   (- 8) (+ 8)   (- 7) (+ 6)   (- 4) (+ 3)
  NO2      PA       78    —     —      -15   + 7     - 9   + 7      —     —
  (manual) PARS  ( 248) (- 9) (+12)   (- 8) (+10)   (- 7) (+ 8)     —     —
  SO2      PA       59   -45   +43     -15   +19     -13   +19     - 8   +12
  (manual) PARS  ( 184) (-26) (+15)   (-18) (+11)   (-14) (+ 7)     —     —
  Lead     PA      644   -24   +23     -25   +22     -20   +19      —     —
           PARS  (1097) (-12) (+12)   (-10) (+ 9)     —     —       —     —
  TSP      PA     2700    —     —      -11   +10      —     —       —     —
           PARS  (5996)   —     —     (- 7) (+ 7)     —     —       —     —
  SO2      PA      506   -26   +23     -20   +18     -18   +15     - 6   + 6
  (Cont.)  PARS  (1281) (-18) (+17)   (-12) (+13)   (-12) (+12)   (- 8) (+ 8)
☆ U.S. GOVERNMENT PRINTING OFFICE: 1986/646-116/20856
The EPA authors, Raymond C. Rhodes (also the EPA Project Officer; see below)
and E. Gardner Evans, are with the Environmental Monitoring Systems
Laboratory, Research Triangle Park, NC 27711.
The complete report, entitled "Precision and Accuracy Assessments for State and
Local Air Monitoring Networks 1983," (Order No. PB 86-171 386/AS; Cost:
$16.95, subject to change) will be available only from:
National Technical Information Service
5285 Port Royal Road
Springfield, VA 22161
Telephone: 703-487-4650
The EPA Project Officer can be contacted at:
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711