United States
Environmental Protection
Agency
Environmental Monitoring
Systems Laboratory
Research Triangle Park NC 27711
Research and Development
EPA/600/S4-86/031 Jan. 1987
Project Summary
Precision and Accuracy
Assessments for State and
Local Air Monitoring
Networks 1984
Raymond C. Rhodes and E. Gardner Evans
Precision and accuracy data obtained
from state and local agencies during 1984
are summarized and evaluated. Some
comparisons are made with the results
previously reported for 1981, 1982, and
1983 to determine any trends. Some
trends indicated continued improvement
in the completeness of reporting of preci-
sion and accuracy data. The national sum-
maries indicate a further improvement in
the precision and accuracy assessments
of the pollutant monitoring data collected.
The annual results from each reporting
organization are given so that comparisons
may be made from 1981 to 1984 and also
with other reporting organizations.
A comparison of the precision and accu-
racy data from the Precision and Accuracy
Reporting System with those from the in-
dependent performance audit program
conducted by the Environmental Monitor-
ing Systems Laboratory is given.
This Project Summary was developed
by EPA's Environmental Monitoring
Systems Laboratory, Research Triangle
Park, NC, to announce key findings of the
research project that is fully documented
in a separate report of the same title (see
Project Report ordering information at
back).
Introduction
The purpose of the full document is to
report the fourth year of data from the Preci-
sion and Accuracy Reporting System
(PARS). Federal regulations promulgated
on May 10, 1979, require quality assur-
ance precision and accuracy (P and A)*
data to be collected. Collection started
January 1, 1981, according to require-
ments set forth in 40 CFR Part 58 Appen-
dix A. These requirements provide for
more uniform Quality Assurance programs
and specific precision and accuracy as-
sessment and reporting requirements
across all State and local air monitoring
agencies.
The major portion of the report consists
of summaries and evaluations of the P and
A data obtained by the efforts of the
states and local agencies. In addition,
comparisons have been made of the ac-
curacy data collected for PARS with the
results of the National Performance Audit
Program (NPAP), which has been an ongo-
ing program conducted by the Environ-
mental Monitoring Systems Laboratory
(EMSL) since the early 1970's.
These summaries and evaluations serve
the following purposes:
1. Quantitative estimates of the precision and accuracy of their monitoring data are available to state and local agencies.
2. A comparison of the data from all the agencies can indicate the need to improve quality assurance systems in specific reporting organizations.
3. An evaluation of the results may indicate a need for improvement in monitoring methodology.
4. The assessments provide users of data from the State and Local Air Monitoring Stations (SLAMS) network a quantitative estimate of the precision and accuracy of the ambient air quality data.

*When one speaks of precision and accuracy of measurement data, one really means the precision and accuracy of the measurement process from which the data are obtained. Precision is a measure of the "repeatability of the measurement process under specified conditions." Accuracy is a measure of "closeness to the truth."
Ambient air quality data, collected by
states and local agencies since 1957, have
been stored in the National Aerometric
Data Bank (NADB). These data are used
in (1) planning the nation's air pollution
control strategy, (2) determining if the
National Air Quality Standards are being
achieved, and (3) determining long-term
trends of air quality. Prior to the EPA air
monitoring regulations of May 10, 1979,
the procedures used in selecting monitor-
ing sites, operating and controlling the
equipment, and calculating, validating and
reporting the data varied considerably
among agencies. Frequently the proce-
dures being used were not well docu-
mented. These conditions made it difficult
to intercompare data from different sites
and agencies. Furthermore, little informa-
tion was available on the reliability of the
monitoring data.
To help alleviate these problems, EPA's
air monitoring regulations imposed
uniform criteria on network design, siting,
quality assurance, monitoring methods,
and data reporting after December 30,
1980. For example, only EPA reference,
equivalent, or other EPA-approved air
monitoring methods were to be used.
Also, calibration standards were to be
traceable to the National Bureau of Stand-
ards (NBS) or other authoritative stand-
ards. Further, the quality assurance
systems of the states were required to be
documented and approved by the EPA
Regional Offices. Finally, the reporting
organizations must also follow specific
procedures when assessing the P and A
of their measurement systems and must
report the P and A data to EPA quarterly.
Starting January 1, 1981, these regulations
became effective for National Air Monitor-
ing Sites (NAMS), and beginning January
1, 1983, for all State and Local Air
Monitoring Stations.
The precision assessments were deter-
mined by performing repeated measure-
ments of ambient-level "calibration" gases
at two-week intervals for continuous
methods, or by obtaining duplicate results
from collocated samplers for manual
methods. The accuracy assessments were
generally determined by analyzing blind
audit materials traceable to NBS. During
each calendar year, each site or instrument
must be audited at least once. Details con-
cerning the specific procedures and com-
putations used to assess P and A are con-
tained in the regulations.
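As an illustrative sketch of the two kinds of checks described above (the concentrations and results below are invented for illustration and are not from the report), each check reduces to a simple percent deviation:

```python
# Precision check (continuous method): the analyzer samples a test gas of
# known ambient-level concentration; the check result is the deviation
# expressed as a percentage of the known (expected) value.
expected, indicated = 80.0, 76.0      # ppb; illustrative values only
precision_pct = 100.0 * (indicated - expected) / expected
print(precision_pct)                  # -5.0

# Accuracy audit (manual method): a blind audit sample with an
# NBS-traceable true value is analyzed as an unknown.
true_value, measured = 0.400, 0.412   # ppm; illustrative values only
accuracy_pct = 100.0 * (measured - true_value) / true_value
print(round(accuracy_pct, 1))         # 3.0
```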
National Results
National Data Reporting
The fourth year of data collected by
state and local agencies for P and A has
been compiled and summarized. The net-
work operation has been continually im-
proved. Table 1 shows the improvement in
data reporting for the nation.
Improvement continues for the continuous NO2 method; however, the percentage still lags behind that for the continuous CO, SO2, and O3 methods. Reporting for the manual methods for Pb, SO2, and NO2 was required by the regulations beginning January 1, 1983. Reporting for Pb is negligibly different from 1983 to 1984. Reporting for the manual methods for SO2 and NO2 has significantly improved from 1983 to 1984.
1984 Results from the PARS Program
The measures of precision and accuracy
are required to be computed and reported
for each calendar quarter by each report-
ing organization (a state or local agency)
as percentage deviation values. For preci-
sion, the repeatability for each check is
measured as the deviation from the ex-
pected value as a percentage of the ex-
pected value. For accuracy, the deviation
of the audit value from the true value is
measured as a percentage of the true
value. For both precision and accuracy, 95
percent probability limits are computed for
the percentage values from the average
and standard deviations of the individual
percentage values:
D ± 1.96 S

where
  D = the average of the individual percent differences;
  S = the standard deviation of the individual percent differences;*
  1.96 = the multiplication factor corresponding to 95% probability.
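A minimal sketch of this computation (illustrative only: the check values are invented, and the function name is ours, not the report's). The √2 adjustment for collocated-pair precision data follows the footnoted rule for manual methods:

```python
import math

def probability_limits(percent_diffs, paired=False):
    """95% probability limits, D +/- 1.96*S, for individual percent differences.

    paired: True for manual-method precision from collocated sampler pairs,
    where S is divided by sqrt(2) so the limits apply to an individual
    reported value rather than to the difference of a pair.
    """
    n = len(percent_diffs)
    d_bar = sum(percent_diffs) / n                         # average percent difference, D
    var = sum((d - d_bar) ** 2 for d in percent_diffs) / (n - 1)
    s = math.sqrt(var)                                     # standard deviation, S
    if paired:
        s /= math.sqrt(2)                                  # collocated-pair adjustment
    return d_bar - 1.96 * s, d_bar + 1.96 * s              # lower, upper limits

# Illustrative biweekly precision-check percent differences (made-up numbers):
checks = [-4.0, 2.5, -1.0, 3.5, -2.0, 1.0]
lower, upper = probability_limits(checks)
print(f"{lower:.1f}%, {upper:.1f}%")                       # prints: -5.6%, 5.6%
```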
Table 1. Percent of Reporting Organizations Reporting Precision and Accuracy Data

Pollutant measurement    1981   1982   1983   1984
CO                        77     89     99     95
SO2                       82     93     96     91
NO2                       56     72     88     94
O3                        83     89     99     95
TSP                       94     97     99     98
Pb                        —      —      93     92
SO2 (manual)              —      —      75     80
NO2 (manual)              —      —      86    100
*Note: For the precision of manual methods obtained from paired observations, the standard deviation, S, is divided by √2, to obtain variability estimates that apply to individual reported values.
It is these upper and lower 95% probability limits which are reported and discussed in the full report.

Moreover, it should be noted that the data and the evaluations presented in the report include any outlier values which may have been reported by the states and local agencies. The presence of outliers can influence such comparisons by having undue impact on average values for individual reporting organizations.

Table 2 exhibits the national probability limits for each of the manual methods. The probability limits in Tables 2 and 3 are consolidated and weighted limits of all the reported limits for 1984. They are the limits that would be obtained if the results of all the individual precision (or accuracy) checks in the nation were combined in one sample. The national limits for this report more correctly reflect the total variability in the data and are somewhat wider than the corresponding limits for previous reports due to a change in the computation of these limits.

The precision limits reflect the repeatability of the methodology used in the field to collect and analyze the samples at ambient levels. The spread of the limits may be somewhat inflated due to measurements at relatively low concentration levels.

The accuracy of the manual methods indicates the limits at predetermined concentration levels for the chemical analyses performed on the samples for lead, sulfur dioxide, and nitrogen dioxide. For the TSP method, the accuracy measurement is for the flow rate only. The probability limits for manual accuracy are very good and reflect the quality of work done in the chemical laboratories for lead, sulfur dioxide, and nitrogen dioxide analyses, and in the field for flow rate measurement for the TSP method. Because of the continual replacement of the manual SO2 and NO2 methods with continuous methods, further discussion of the manual methods is
Table 2. National Precision and Accuracy Probability Limit Values for Manual Methods for 1984

                        Precision                              Accuracy
                  No. of valid   Probability              Probability limits (%)
                  collocated     limits (%)    No. of   Level 1       Level 2       Level 3
Pollutant         data pairs    Lower  Upper   audits  Lower Upper   Lower Upper   Lower Upper
TSP                 17,152       -16    +17    7,436     —     —      -8    +8      —     —
Lead                 3,937       -18    +20    1,657    -17   +15    -11   +10      —     —
Sulfur dioxide         297       -33    +31      203    -20   +9     -14   +7     -12    +7
Nitrogen dioxide       691       -27    +27      175     -8   +10     -7    +8     -6    +7
Figure 1. National precision probability limits for 1981 through 1984.
limited. The detailed results for each
reporting organization are tabulated in an
appendix to the full report.
The precision and accuracy limits for
automated methods are presented in Table
3. The effort expended for the collection
of quality assurance precision and accur-
acy data is appreciable, but it is necessary
to assess data quality.
National Precision Results
Comparison
Figure 1 shows the national probability
limits for precision for the various meth-
ods. With data from four years, some
minor trends are evident. Some slight
improvement, as measured by a reduction
in the spread of the limits, is noted for TSP
and the continuous methods, except for
NO2. The slight but persistent negative
bias for the continuous SO2 method indi-
cates that on the average there is some
negative instrument drift from the most
recent calibration or instrument adjust-
ment to the time of the biweekly precision
check.
Although the manual methods for Pb, SO2, and NO2 were not required to be reported until 1983, a number of agencies began reporting in 1981. The results for Pb show a decided improvement. The manual SO2 and NO2 methods are much more variable than the continuous methods. However, they do show considerable improvement over the four-year period.
National Accuracy Results
Comparison
Figures 2a and 2b show the national
probability limits for accuracy audits for
the continuous and manual methods,
respectively. Improvement for the manual
methods is not evident except perhaps for
TSP and SO2. The variability for the Pb
method has increased, and the NO2
method has shown no definite trend.
Slight improvement is evident for all the
continuous methods. The continuous
methods for SO2 and NO2 show more in-
accuracy than all other methods. However,
Table 3. National Precision and Accuracy Probability Limit Values for Automated Analyzers for 1984

                 Precision                           Accuracy
            No. of      Probability    No. of audits           Probability limits (%)
            precision   limits (%)                    Level 1       Level 2       Level 3       Level 4
Pollutant   checks     Lower  Upper   Total  Level 4 Lower Upper   Lower Upper   Lower Upper   Lower Upper
CO          14,692       -9    +8     1,288    23    -14   +13     -9    +8      -9    +8     -10   +9
SO2         38,312      -12   +11     1,666   166    -16   +14    -12   +11     -12   +11     -13   +12
NO2          8,653      -14   +13       613    24    -21   +20    -13   +12     -13   +10     -18   +14
O3          20,031      -12   +10     1,773   144    -16   +14    -12   +10     -11   +10      -6   +5
Figure 2. National accuracy probability limits for 1981 through 1984: (a) continuous methods; (b) manual methods.
in the accuracy audits for the manual
methods, only a portion of the measure-
ment method is checked.
Although the continuous NO2 method is more variable than the other methods, it has shown the greatest improvement, particularly for the level 1 concentration.
The general and expected pattern of
variability across levels is very evident,
with the greatest percentage variability at
the lowest concentration levels. The slight
negative bias for the continuous SO2
method is consistent across all three
levels. A possible cause is that, on the
average, a negative drift occurs with these
analyzers from the time of last calibration
or instrument adjustment until the time of
the accuracy audit.
Comparison of Results from the
PARS and the Performance Audit
Program
A general comparison between the ac-
curacy data of the PARS program and the
Performance Audit (PA) data is included in
the full report. The Performance Audit data
are the results of an independent check
conducted by the Quality Assurance Divi-
sion (QAD) of the EMSL under the Na-
tional Performance Audit Program (NPAP).
In the NPAP, specially prepared audit
samples or devices are sent from QAD to
the participating ambient air monitoring
agencies. The samples or devices are care-
fully and accurately assessed by EMSL
utilizing NBS Standard Reference Materi-
als (SRM's) or standards. The monitoring
agencies analyze or measure the samples
or devices as unknowns or blinds and
report their results to QAD for evaluation.
Audit programs are conducted for the
following pollutant measurements using
the materials indicated:
Measurement      Audit materials                   Portion of measurement system audited
SO2 (manual)     Freeze-dried sodium sulfite       Chemical analytical
NO2 (manual)     Aqueous sodium nitrite            Chemical analytical
Pb               Filter strip with lead nitrate    Chemical analytical
TSP              Reference flow device             Flow
CO               Cylinders containing CO gas       Continuous instrument
SO2              Cylinders containing SO2 gas      Continuous instrument

The audit materials or devices are prepared at three to six different concentrations or flow levels. Separate reports on the evaluation of the PA data are published by EMSL.

As indicated above, the NPAP does not yet include an audit for the ozone or continuous NO2 methods. Therefore, no comparisons of the NPAP or PA data with the PARS data are possible for these pollutants.

Since precision assessments are not made in the PA program, only accuracy can be compared across the PARS and the PA programs. For the purpose of the full report, the results from PARS and the PA system are compared at approximately the same levels by matching laboratories and reporting organizations. Since the PARS data are presented with outliers, the same approach was taken with the audit data. Knowledge of the historical audit data reports, however, indicates that the presence of outliers may make a significant difference in the audit results for some agencies.

Comparisons of the national values of the probability limits (Table 4) exhibit fairly good agreement between the results of the two programs. However, there is considerable variation between the results of the two programs when comparisons are made on Regional and reporting organiza-
Table 4. Summary Comparison of EMSL Performance Audits (PA) vs. PARS Accuracy Audit Data for Year 1984

                           National values, probability limits (%)
                            Level 1       Level 2       Level 3       Level 4
Pollutant        Audits   Lower Upper   Lower Upper   Lower Upper   Lower Upper
CO       PA        771     -9   +12     -20   +21      …     …       …     …
         PARS      974    -14   +13     -8    +8       …     …       …     …
SO2      PA        357    -23   +19     -16   +14      …     …       …     …
         PARS      819    -13   +11     -12   +11      …     …       …     …
TSP      PA      2,447     —     —      -15   +18      …     …
         PARS    6,559     —     —      -6    +7       …     …
Pb       PA        723    -35   +30     -17   +11      …     …
         PARS    1,259    -17   +15     -11   +10      …     …
SO2 (manual)
         PA         30    -18   +8      -15   +6       …     …
         PARS      190     …     …      -12   +6       …     …
NO2 (manual)
         PA         30     -5   +8      -7    …        …     …
         PARS      139     -6    …      -6    …        …     …
tion bases. Lack of better agreement re-
sults from several factors. First, the inclu-
sion of outlier values in the PA data ap-
pears to have introduced some excessive
distortion of general trends. Second, even
though the PARS averages in Table 4 are
weighted by the number of audits, varia-
tions due to many sources of error for both
data sets are averaged together to obtain
the national values, thereby masking any
correlations which may have existed for
the results of individual agencies. Third,
the concentration levels for the two sys-
tems do not coincide exactly at each of
the audit levels. Fourth, the PA data are the
results of independent external audits,
while the PARS accuracy data are based
on the results of independent internal
audits. The expected effects of the last-
mentioned factor would cause the spread
of the limits for the PA to be wider than
that for the PARS. Examination of the
results (see Table 4) confirms these
expectations.
Conclusions and
Recommendations
The results of PARS data for 1984 in-
dicate some general improvement over the
data for previous years. However, consid-
erable differences exist among Regions
and individual reporting organizations for
most measurement methods. Investiga-
tions should be made by the Regions and
the states to determine the causes of
these significant differences.
Comparison of PARS and PA data shows
more variability in the PA data than in
PARS except for CO. These differences are
presumably due to the fact that the exter-
nal PA accuracy audits are more complete-
ly independent than the internal PARS
accuracy audits. These differences have
been consistent for past years.
Further improvement in the data quality
assessments, which are measures of the
monitoring data quality, can be achieved
only through continuing efforts of state
and local agency personnel involved first-
hand with the operation and quality con-
trol of their measurement systems. Re-
gional QA Coordinators can also assist
through their review of the operations and
quality control practices across the states
in their Regions.
Each Regional QA Coordinator should
evaluate the PARS data from all the report-
ing organizations within his Region to
identify those organizations having exces-
sively large variations of probability limits.
Investigation should be made to determine
the causes and correct them to preclude
future excessive deviations. Similarly,
Regional QA Coordinators should review
the operations of the reporting organiza-
tions having significantly better precision
and accuracy results in order to identify
specific procedures that should be uni-
formly used throughout the Region and
the nation to further improve the reliability
of the monitoring data in the National
Aerometric Data Base.
The EPA authors Raymond C. Rhodes (also the EPA Project Officer, see below)
and E. Gardner Evans are with the Environmental Monitoring Systems
Laboratory, Research Triangle Park, NC 27711.
The complete report, entitled "Precision and Accuracy Assessments for State
and Local Air Monitoring Networks 1984," (Order No. PB 87-111 720/AS;
Cost: $18.95, subject to change) will be available only from:
National Technical Information Service
5285 Port Royal Road
Springfield, VA 22161
Telephone: 703-487-4650
The EPA Project Officer can be contacted at:
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711
United States
Environmental Protection
Agency
Center for Environmental Research
Information
Cincinnati OH 45268
Official Business
Penalty for Private Use $300
EPA/600/S4-86/031