EPA
United States Environmental Protection Agency
Environmental Monitoring Systems Laboratory
Research Triangle Park NC 27711
EPA-600/4-83-023
June 1983
Research and Development
Guideline on the
Meaning and Use of
Precision and Accuracy
Data Required by
40 CFR Part 58,
Appendices A and B
-------
EPA-600/4-83-023
June 1983
GUIDELINE ON THE MEANING AND USE OF PRECISION AND ACCURACY
DATA REQUIRED BY 40 CFR PART 58, APPENDICES A AND B
by
Raymond C. Rhodes
Quality Assurance Division
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF RESEARCH AND DEVELOPMENT
ENVIRONMENTAL MONITORING SYSTEMS LABORATORY
RESEARCH TRIANGLE PARK, NORTH CAROLINA 27711
-------
NOTICE
This document has been reviewed in accordance with
U.S. Environmental Protection Agency policy and
approved for publication. Mention of trade names
or commercial products does not constitute endorse-
ment or recommendation for use.
-------
FOREWORD
Measurement and monitoring research efforts are designed to anticipate
potential environmental problems, to support regulatory actions by develop-
ing an in-depth understanding of the nature and processes that impact
health and the ecology, to provide innovative means of monitoring compli-
ance with regulations and to evaluate the effectiveness of health and
environmental protection efforts through the monitoring of long-term trends.
The Environmental Monitoring Systems Laboratory, Research Triangle Park,
North Carolina, has the responsibility for: assessment of environmental
monitoring technology and systems; implementation of agency-wide quality
assurance programs for air pollution measurement systems; and supplying
technical support to other groups in the Agency including the Office of
Air, Noise and Radiation, the Office of Pesticides and Toxic Substances
and the Office of Enforcement Counsel.
Knowledge of the quality of air pollution measurements from the
national monitoring networks is important in determining air quality
trends, assessing compliance with air quality standards, and developing
control strategies. Federal regulations for ambient air quality surveil-
lance were revised May 10, 1979 to require the states to develop and conduct
quality assurance programs approved by the EPA Regional Offices. In addi-
tion, the states are required to submit to EPA the results of specific
tests and comparisons to assess the precision and accuracy of their measure-
ment systems. This document is intended to help states and local agencies
achieve the maximum benefits from the new requirements.
Thomas R. Hauser, Ph.D.
Director
Environmental Monitoring Systems Laboratory
Research Triangle Park, North Carolina
-------
ABSTRACT
Federal regulations for ambient air quality surveillance were revised
May 10, 1979, to include requirements that states perform certain specified
tests to assess the precision and accuracy of their air pollution measure-
ment systems and to report the results to EPA routinely.
This report discusses the concepts and definitions of precision and
accuracy as they relate to ambient air pollution measurement systems. The
rationale used in developing the specified procedures for acquiring
precision and accuracy assessments is explained for both manual and auto-
mated measurement methods. The computational procedures specified for the
handling of the precision and accuracy data and the development of the
statistical assessments to be reported to EPA are reviewed.
Particular emphasis is given to the potential use of the precision
and accuracy data by the states and local agencies as an adjunct to their
routine quality assurance programs. A number of statistical quality
control charts are recommended for routine use by the states and local
agencies.
Finally, answers are provided for many questions concerning inter-
pretation of the requirements of the regulation and procedures for handling
special case situations not specifically detailed in the regulations.
-------
CONTENTS
Number Page
1 General Background 1
2 Overview of Quality Assurance Requirements 13
3 Conditions for Precision Checks 15
4 Conditions for Accuracy Audits 20
5 Statistics of Precision 22
6 Statistics of Accuracy 31
7 Use of Precision and Accuracy Data 38
8 Summary Analysis of Precision and Accuracy Data 40
9 Comparison of SLAMS and PSD Requirements 42
10 Questions and Answers 45
11 References 65
-------
FIGURES
Number Page
1 Multiple reporting organizations with central laboratory
and separate field operations typical of manual
methods 9
2 Multiple reporting organizations with central field
operations and separate laboratories and data analysis
functions for manual methods 10
3 Multiple reporting organizations with central field
operations and data analysis and with separate cali-
bration systems for automated methods 11
4 Multiple reporting organizations with separate field
operations and central calibration and analysis labora-
tories and data analysis unit typical of large-scale
automated methods 12
5 Graphical interpretation of precision data 31
6 Graphical interpretation of accuracy data 37
-------
TABLES
Number Page
1 Automated Analyzer Audit Concentrations 22
2 Computed Probability Limits for Precision for Manual
Methods and Automated Analyzers 28
3 Summary of 95 Percent Probability Limits for Precision
and Their Meanings for Manual Methods and Automated
Methods 29
4 Computed Probability Limits for Accuracy for Manual
Methods and Automated Analyzers 34
5 Summary of 95 Percent Probability Limits for Accuracy
and Their Meanings for Manual Methods and Automated
Analyzers 36
6 Recommended Control Charts and Control Limits for
Precision Checks and Accuracy Audits for State and
Local Agencies 41
7 Comparison of QA Requirements for Appendix A (SLAMS)
and Appendix B (PSD) 43
-------
GUIDELINE ON THE MEANING AND USE OF PRECISION AND ACCURACY
DATA REQUIRED BY 40 CFR PART 58, APPENDICES A AND B
APPENDIX A: QUALITY ASSURANCE REQUIREMENTS FOR STATE AND LOCAL AIR
MONITORING STATIONS (SLAMS)
APPENDIX B: QUALITY ASSURANCE REQUIREMENTS FOR PREVENTION OF
SIGNIFICANT DETERIORATION (PSD) AIR MONITORING
1. GENERAL BACKGROUND
1.1 Need for Mandatory Quality Assurance (QA)
Prior to the May 10, 1979 promulgation of the Regulations set forth
in 40 CFR Part 58, (44 FR 27558-27604), the quality assurance and quality
control practices of State and local agencies were strictly voluntary,
although many forms of guidance and assistance had been provided by the
EPA Regional Offices and the Environmental Monitoring Systems Laboratory,
Research Triangle Park, North Carolina (EMSL/RTP). Consequently, there was
a wide diversity among the State and local agencies in the scope and
effectiveness of their QA programs. As described below, numerous indications
pointed to the need for more extensive and more uniform QA programs among the
State and local agencies.
1.2 Need for Quality Data
Many important EPA decisions are based on the nationwide monitoring
data obtained by the State and local agencies. Data collected and reported
to the National Aerometric Data Bank (NADB) in Durham, North Carolina are
used by EPA to aid in planning the Nation's air pollution control strategy
and to measure achievement toward meeting national air quality standards.
In addition, the data are used locally for determining attainment of the
standards. Further, the data in the NADB are made available to numerous
requestors, who may use the data for various research projects, special
studies, or other purposes.
-------
Unfortunately, none of the data in the NADB prior to January 1, 1981,
are accompanied by estimates of their quality. Although the capability
(accuracy and precision) of the EPA-developed measurement methods has been
determined in interlaboratory collaborative studies, those levels of method
precision and accuracy are seldom achieved in the real world of day-to-day
routine monitoring. To assure the most knowledgeable use of the data, the
quality of the national monitoring data should be determined and made
known to all data users.
Further, many of the monitoring methods used in the past by State and
local agencies were not reference, equivalent, or approved methods, so
designated by EPA after careful and thorough evaluation. Because of the
likelihood of different methods producing differing results, all national
monitoring should be performed using reference, equivalent, or approved
methods.
1.3 Regulation of May 10, 1979
The ambient air monitoring regulations, as revised on May 10, 1979,
contain a new Part 58 (1) that includes various requirements for the tech-
nical improvement of the national monitoring. Some of the new requirements
are the specification of:
• Monitoring methods;
• Instrument siting and probe location;
• Scheduling of monitoring;
• Network design;
• Air quality reporting.
Part 58 contains several appendices, two of which specify requirements
for quality assurance (QA) and data quality assessment:
1. Appendix A. Quality Assurance Requirements for State and
Local Monitoring Stations (SLAMS);
2. Appendix B. Quality Assurance Requirements for Prevention of
Significant Deterioration (PSD) Air Monitoring.
-------
The QA requirements of these appendices involve two separate areas:
1. Documentation of each agency's Quality Control Program.
2. Assessment and reporting of the quality of each agency's air
monitoring data.
Documentation of each agency's Quality Control Program is to cover,
as a minimum, the following:
1. Selection of methods, analyzers, or samplers;
2. Installation of equipment;
3. Calibration;
4. Zero/span checks and adjustments of automated analyzers;
5. Control checks and their frequency;
6. Control limits for zero, span, and other control checks,
and respective corrective actions when such limits are
surpassed;
7. Calibration and zero/span checks for multiple range
analyzers;
8. Preventive and remedial maintenance;
9. Quality control procedures for air pollution episode
monitoring;
10. Recording and validation of data; and
11. Documentation of quality control information.
For the data quality assessment, specific procedures are delineated
using special quantitative checks, to determine the precision and accuracy
of each of the automated and manual methods used to measure the criteria
pollutants.
These latter procedures were developed to measure the precision and
accuracy under operating conditions as nearly typical as possible.
Furthermore, the precision and accuracy (P and A) data are required to be
reported to EPA for several important reasons. First, the P and A data
are appropriately filed by EPA so that users of the routine monitoring data
filed in the National Aerometric Data Bank (NADB) receive the corresponding
P and A data for the particular network(s) and the particular time periods
-------
involved. Second, the P and A data are evaluated from Regional and National
standpoints to identify (a) regions, states, or local agencies that
require improvement in their data quality (i.e., improvement in their QA
system) and (b) pollutant measurement methods that may need remedial
changes in the methodology to improve the precision and/or accuracy of the
methods in the real monitoring world.
In addition to the above documentation and assessment requirements,
the regulations require the following:
1. All criteria pollutant measurement calibration standards and
flow measurement calibration standards must be traceable to
National Bureau of Standards (NBS) Standard Reference Materials
(SRM) or other primary standards;
2. All agencies must participate in National EPA performance audits
and must permit EPA system audits of their monitoring and QA pro-
cedures.
3. All methods used for measurements of criteria air pollutants in
SLAMS must be reference, equivalent, or approved methods.
A recent amendment to 40 CFR Part 58, promulgated on September 3, 1981
(46 FR 44159-44172) (2), makes the requirements for assessing precision and
accuracy applicable to monitoring for lead (Pb) and includes several
corrections to the 40 CFR Part 58 regulations. A separate EPA guideline
document has been issued concerning the monitoring for Pb in the vicinity
of point sources (3).
1.4 Measures of Data Quality
The quality of monitoring data can be expressed in terms of
representativeness, comparability, completeness, precision and accuracy.
Aspects of representativeness have been strongly considered in the
portions of Part 58 which deal with network design, siting, and probe
location—factors that relate to the representativeness of the samples.
The extent to which the samples represent the ideal locations, conditions,
-------
and times of sampling is a measure of representativeness, and has meaning
with respect to the objective or purpose of the monitoring.
Comparability of data obtained across the entire Nation is achieved,
to a large extent, by the use of the standardized (designated) sampling and
analysis methods specified by the regulations, along with consistency in
reporting units.
Completeness of data sets is an important concern in monitoring
because of the adverse effects of gaps or "holes" in the data base. The
statistical validity of sets of monitoring data is a direct function of
the extent and pattern of missing data. Although the completeness of a
given data set is a major concern to the data user, and its importance is
emphasized in the regulation, the regulation does not require any special
reporting with respect to data completeness. The number of individual data
values reported to the NADB for each monitoring site can be determined and
is reported with the data for specific sites.
The measures of data quality which are required to be obtained and
reported by the States and local agencies beginning January 1, 1981, are
those for precision and accuracy. When one speaks of precision and accuracy
of measurement data, one really means the precision and accuracy of the
measurement process from which the measurement data are obtained. Precision
is a measure of the "repeatability of the measurement process under
specified conditions." Accuracy is a measure of "closeness to the truth."
The definitions and concepts of precision and accuracy as they relate to
the requirements of Appendices A and B of the regulation are discussed
further in the next section.
1.5 Precision and Accuracy
As defined above, the accuracy of a measurement is its "closeness to
the truth." Deviations from the truth result from both random errors and
systematic errors. Precision, the repeatability of a measurement process,
is associated with the random errors. The average inaccuracy, or bias, of
a measurement process over some period of time or set of conditions is
associated with the systematic error. Deviation that appears to be
constant, or systematic, under one set of conditions may actually be random
-------
under a set of conditions of wider scope. For example, the systematic
error of a given instrument is associated with average accuracy for that
instrument over some specified period of time. However, the variability of
average inaccuracies from a number of instruments in a network may appear
to be random and can, therefore, be associated with the "precision"
for the network.*
1.5.1 Precision - Precision is used in 40 CFR Part 58, Appendices A and
B, in the sense of "repeatability of measurement values under specified
conditions." Since specified conditions may vary considerably, there are
many levels of repeatability or precision. For example, with an automated
continuous air pollution sensor, the random fluctuations in response over
a short time, e.g., within a minute, when an instrument is measuring a gas
of constant pollutant concentration are a very "local" measurement of pre-
cision. Another measure of repeatability would be the variability of
span measurements made each day on an instrument over some longer period of
time. The measure of precision (repeatability) used in 40 CFR Part 58,
Appendices A and B, for automated methods is the variability of one-
point precision checks made at biweekly intervals on the same instrument
(Instrument Precision). Agency* precision, however, is the average
repeatability of all the instruments of the agency during the calendar
quarter. A given precision check may be considered as representative of
an hourly average value that would have been obtained from the instrument
if the air pollution concentration remained at the same level as that for
the precision check.
*Throughout this guideline, agency or network is used in a general
sense corresponding to the definition of a "reporting organization" as
defined in Section 3 of Appendix A, 40 CFR Part 58, and as discussed in
Section 1.6 of this guideline. A reporting organization may consist of
one or more governmental air pollution agencies (networks), or in some
special cases there may be more than one reporting organization in the
same governmental air pollution agency (network).
-------
Because the lack of precision from hour to hour is generally proportional
to concentration, it may be further assumed without much error that the
same percentage variation exists at other concentration levels, except for
very low or very high concentrations.
1.5.2 Accuracy - Accuracy is used in Appendices A and B in the sense of
"closeness to the truth." Although the ultimate truth cannot be known,
accepted as the closest to the truth are the values determined by NBS or
other nationally recognized measurement standards bodies. In assessing the
accuracy of measurements of an air pollution monitoring agency, measure-
ments are made through the mechanism or procedure of independent audits in
which the measurement systems are challenged with standards (materials or
devices) having traceability as directly as possible to NBS standards.
Some error or uncertainty exists even in NBS Standard Reference
Materials (SRM's), which are labeled with computed tolerances based on
empirical data and which are applicable only under certain specified
conditions and procedures for use. Obviously, some errors are introduced
in the use of secondary standards that have been prepared by reference
against NBS SRM's. Further, if the use of secondary standards in conduct-
ing independent audits involves other measurements, such as flow measure-
ments when diluting audit gases, additional errors are introduced.
Nevertheless, when measurements are made at State and local agencies,
through the independent audits described in Appendices A and B, the
auditors' assessed values are considered as the "truth." Their values are
considered as "true" values in the metrology sense—not in any statistical
sense. As described in sections 3.1.2 and 3.2.2 of Appendix A, periodic
independent sample audits are made using known materials, or using devices
having known properties. These independent audits are used as a check on
the routinely-used calibration materials, equipment, and procedures.
Because of the independence and infrequent and special nature of the
audits, the audit materials and assessments must be considered as the
"known" or true value and any consistent lack of agreement is due to bias
of the routine calibration process and/or drift (change of bias) in the
routine measurement process.
-------
Measurements at a given agency may, on the average, be biased from the
true audit values due to some systematic errors in the local routine cali-
bration process. These average biases over a given time period (e.g., a
calendar quarter) may be considered as the inaccuracy of the agency's
measurement system for that calendar quarter. There will also be some
variability in the inaccuracy of measurements* made at an agency during a
calendar quarter. This variability of inaccuracy may be considered as a
higher level of imprecision when considering a measurement chosen at random
from the given agency during the quarter. Carrying the extension in time
a step further, biases which exist from quarter to quarter at a given
agency may also vary in a random way. Therefore, the annual average of the
quarterly biases may be considered as the bias or average inaccuracy of
the agency's measurement system for the year. And the variability of the
bias from quarter to quarter may be considered a part of the overall
within-year imprecision for the agency.
1.6 Reporting Organization
The Regulation, Section 3 of Appendix A, requires that measures of
data quality, i.e., precision and accuracy, be "reported on the basis of
'reporting organization.' A reporting organization is defined as a State
or subordinate organization within a State which is responsible for a set
of stations which monitor the same pollutant and for which precision and
accuracy assessments can be pooled. . . . and can be expected to be
reasonably homogeneous as a result of common factors."
*The concept of a variable component of systematic error is discussed in
Dr. Churchill Eisenhart's lengthy article, "Realistic Evaluation of the
Precision and Accuracy of Instrument Calibration Systems," Journal of
Research of the National Bureau of Standards, Vol. 67C, No. 2, April-
June 1963. See also "Systematic Measurement Errors," by Rolf B. F.
Schumacher, Journal of Quality Technology, Vol. 13, No. 1, January 1981,
pp. 10-24.
-------
"Common factors which should be considered ... include:
"(1) operation by a common team of field operators,
(2) common calibration facilities, and
(3) support by a common laboratory or headquarters."
Several examples of reporting organizations are presented in Figures
1 through 4.
Figure 1. Multiple reporting organizations with central laboratory
and separate field operations typical of manual methods.
In Figure 1, the field operations, which may be spread over a wide
geographical area, are handled by two different working groups, each
using their separate procedures, field calibration (flow) equipment and
standards, preventive maintenance schedules, etc. The samples are
analyzed, however, in a central laboratory with central laboratory
personnel, procedures, calibration chemicals, calibrated balances, etc.
In this example, there are two separate reporting organizations as indi-
cated by lines 1 and 2.
-------
Figures 2 and 3 illustrate situations where the field operations are
carried out by a single group; however, two different chemical labora-
tories are involved, each of which performs all functions associated with
calibration and analysis. Further, data are analyzed and processed by
separate units in Figure 2, but in Figure 3 the data handling is per-
formed by one unit. In each case, there are two separate reporting
organizations, defined by the two lines.
Figure 2. Multiple reporting organizations with central field
operations and separate laboratories and data analysis
functions for manual methods.
-------
Figure 3. Multiple reporting organizations with central field
operations and data analysis and with separate
calibration systems for automated methods.
-------
Figure 4. Multiple reporting organizations with separate field
operations and central calibration and analysis
laboratories and data analysis unit typical of large-
scale automated methods.
-------
Figure 4 represents three reporting organizations, each with its own
field operations for sampling and instrument maintenance. In such a large
operation, the field operation functions are performed by different sets of
personnel at widely separated locations. However, each organization uses
the same calibration and analysis laboratories and data handling facilities.
As can be deduced from these examples, the definition of reporting
organization does not relate to which agency or organization reports the
routine monitoring or to which agency or organization reports the precision
and accuracy data, but rather to the total operational system involved in
sampling, calibration, analysis, and reporting for routine monitoring for a
specific pollutant.
It is important to emphasize that the definition of reporting organi-
zation is pollutant-specific. It is possible that a given sampling site
may be identified with different reporting organizations for different
pollutants. The concept or definition of reporting organization has no
meaning, of course, for PSD monitoring. For PSD monitoring, the measure-
ment and reporting of precision and accuracy data are accomplished for each
site or sampling location.
2. OVERVIEW OF QUALITY ASSURANCE REQUIREMENTS
Before discussing the details of the requirements for precision and
accuracy determination for SLAMS and PSD, it is desirable to summarize the
general requirements of the regulations relative to quality assurance.
The precision and accuracy determinations are made by performing
specified internal checks made by or for the reporting organization (SLAMS)
or by the owner/operator (PSD). Closely related to these internal checks
are the external performance audits and system audits conducted by EPA.
The responsibility for obtaining and reporting the precision and
accuracy data belongs to the reporting organization and is therefore
considered as "internal" to it. The conduct of the Performance Audit
program by EMSL/RTP and the conduct of the Quality System Audit program by
the EPA Regional Offices are considered as "external" to the reporting
organization, because the programs are conducted by EPA, even though the
reporting organizations are involved by their participation in the audits.
-------
2.1 Manual Methods
Precision for each of the manual methods (except for Pb) is determined
from the results of collocated samplers located at two sites expected to
have a measurable concentration of the pollutant. The precision checks for
Pb are made by analyzing duplicate strips (or duplicate aliquots for equiv-
alent methods) from a single site of expected high Pb concentration.
Accuracy is determined from the results of local independent audits for
the flow or analytical measurement portion of the methods. The accuracy
checks are essentially internal-but-independent checks on the local routine
calibration process.
The external audits for accuracy are EPA performance audits, in which
reference samples or devices from EMSL/RTP are distributed as blinds on an
annual or semi-annual frequency to the organizations involved. The results
from these "unknowns" are transmitted to EMSL/RTP, which then sends the
"true" value to the organizations. Each year an annual summary report,
prepared by EMSL/RTP, provides an overall analysis by EMSL/RTP through the
dissemination of (a) simulated bubbler samples for the S02 and N02 methods,
(b) reference flow devices for the high-volume TSP method, and (c) refer-
ence flow devices and spiked high-volume filter strips for Pb. Note that
only the chemical analysis portion of the bubbler methods is audited; the
flow measurement is not audited. For the TSP method, only the flow
measurement portion of the method is audited; the sample handling, sample
conditioning, and weighing portions of the method are not audited. For
Pb, both the chemical analysis portion and the flow measurement portion
of the method are audited.
Annual systems audits of each state agency may be conducted by the
EPA Regional Offices. These audits should cover all the aspects of the
State QA program, with particular emphasis on the eleven items listed in
Section 2.2 of Appendix A to Part 58 and repeated in Section 1.3 of this
document. See also Section 2.0.11 of Reference 5.
2.2 Automated Analyzers
Precision of automated analyzers is determined from biweekly precision
checks. The precision checks are actually measurements of the analyzer
response at a concentration level near the national average for ambient air.
Accuracy is determined from the results of local audits using in-
dependently prepared standards. The accuracy checks are essentially
internal-but-independent checks on the local routine calibration process.
-------
The external performance and system audits for automated methods are
similar to those for the manual methods, except that at the current time
(December 1982), reference materials are available only for the CO and SO2
measurement systems. Audit materials/devices are being developed by
EMSL/RTP for automated NO2 and O3 methods. When these are developed, the
reporting organizations will be required to participate in these performance
audits as well.
2.3 PSD Requirements
The requirements for precision and accuracy assessment for PSD
monitoring methods are very similar to those for SLAMS. In those instances
where the requirements differ, special note will be made. Otherwise, the
reader should assume that the requirements are the same. A separate
section, Section 9, summarizes the major similarities and differences
between the requirements for SLAMS and PSD. EPA has issued other guidance
concerning the monitoring for PSD (6,7).
2.4 Reporting Precision and Accuracy Data
The procedures for obtaining precision and accuracy data, including
the necessary computations, are included in Appendices A and B of 40 CFR
Part 58. For reference, a copy of the reporting form, Form 1 (front and
back), is given on the following page.
3. CONDITIONS FOR PRECISION CHECKS
3.1 Typical Conditions
It is very important that the estimates of precision for the above
purposes be obtained under conditions that are as typical as possible. The
measurements from which the estimates are made should be obtained under
conditions of operation, maintenance, and calibration that are representa-
tive of normal routine activities of the monitoring agencies. The follow-
ing precautions should be observed by the State and local agencies in
obtaining the data used to estimate precision.
-------
[Form 1 (front): Data Assessment Report for automated analyzers. For each
pollutant (CO, SO2, NO2, O3), the form records the state, reporting
organization, year, and quarter; for precision, the number of analyzers
and precision checks and the lower and upper probability limits; and, for
accuracy, the source of traceability of the audit standards (e.g., NBS SRM,
EMSL reference gas, vendor CRM, photometer, other), the number of audits,
and the lower and upper probability limits at each audit level (1 through
4). Only reference or equivalent monitoring methods are counted. The
completed form is sent to the Regional Office with a copy to EMSL/RTP.]

FORM 1 (FRONT)
-------
[Form 1 (back): Data Assessment Report for manual methods (TSP, SO2, NO2,
Pb). For precision, the form records the number of samplers, collocated
sites, and collocated samples and the lower and upper probability limits;
for accuracy, the number of audits and the lower and upper probability
limits at each audit level. Only reference or equivalent monitoring
methods are counted.]

FORM 1 (BACK)
-------
In preparation for the performance of the precision checks, no special
adjustments, calibrations, or maintenance of the instruments should be
performed. For example, the biweekly precision checks should be made prior
to any routine or special checks or adjustments made in connection with
zero/span, calibration, or maintenance scheduled on the same day as the
precision checks. If routine zero or span checks, adjustments, cali-
brations or maintenance are performed at some scheduled frequency, the
biweekly precision checks should be made at various random times in
between these scheduled operations. In other words, the special checks
for precision should be made at times which, as a sample, are representa-
tive of the typical conditions existing within the calendar quarter. From
practical, logistic considerations, the precision checks could be made
just prior to (a) scheduled zero/span checks that may result in instrument
adjustment, (b) scheduled calibrations, or (c) scheduled maintenance.
3.2 Manual Methods
As previously pointed out (Section 1.5.1), the conditions under which
precision is determined must be very specifically stated. The intent of
the regulation is to obtain precision estimates that reflect the repeat-
ability of the entire measurement process. The best known way of measuring
the repeatability of the entire process is through the use of collocated
samplers. In this way, most of the variables acting throughout the entire
measurement process are independently involved for each of the two separate
samplers. Even so, there will be commonalities of conditions for the
paired data from the collocated samplers that will lead to better agreement
than would be achieved if the two samplers and samples were completely
independent. For example, the paired samples will be handled under the
same conditions, and will be analyzed under the same conditions in the
laboratory. Because of such commonalities, the precision estimates
obtained will be somewhat optimistic, i.e., they will tend to underestimate
the true inherent variability (imprecision) of the total measurement
process.
Internal (local) precision checks are made using collocated samplers
at a minimum of two sites of high concentration. One of the collocated
-------
samplers is randomly designated as the official sampler for routine
monitoring; the other is considered the duplicate. After the designated
sampler is so identified, its designation should not be changed. Results
from the duplicate sampler are to be obtained each day the designated
sampler is operated unless the samplers are operated more frequently than
every sixth day, in which case at least one collocated sample is required
each week.
Ideally, collocated samplers should also be required for Pb. However,
because of the added expense of establishing duplicate samplers at Pb
sites, resort has been made to analyses of duplicate strips or aliquots
from filters from a single sampler at a high concentration Pb site. The
estimates of precision from the duplicate strips will not include vari-
abilities from sampler to sampler and thus will underestimate the
imprecision of the total measurement process.
3.3 Automated Analyzers
For automated analyzers, the use of collocated analyzers would be best
to measure repeatability; however, the cost would be prohibitive. The next
most desirable technique is to perform response checks at approximately
ambient concentration levels at random times between successive instrument
adjustments. In this way, the precision is a measure of instrument drift
from the time of the most recent instrument adjustment or calibration to
the time of the precision check. The regulations require the precision
checks to be made at two-week intervals or more frequently. Although not
stated in the regulations, an average of the instrument output should be
obtained over some relatively short period of time, e.g., five minutes
following introduction of the "precision" gas and after reaching equilib-
rium. Thus, the precision estimates have meaning only with respect to the
time-averaging period over which the average values are obtained. Precision
estimates for other time-averaging periods would have to be determined by
knowing or assuming a drift pattern between successive instrument adjust-
ments and calibrations.
-------
Precision checks are conducted at least biweekly and are made with
concentrations of test gases in the following ranges:
0.08 - 0.10 ppm for SO2, O3, and NO2;
8 - 10 ppm for CO.
These precision checks may use the same materials, equipment, and personnel
routinely used for instrument calibration or span checks.
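As an illustration of the arithmetic (a minimal sketch, not regulatory text; the readings, the five-minute averaging window, and all names are assumptions for this example), a single precision check can be reduced to a signed percentage difference as follows:

def precision_check_difference(analyzer_readings_ppm, known_conc_ppm):
    # Average the analyzer output over the check period (e.g., about
    # five minutes after reaching equilibrium), then compute the signed
    # percentage difference relative to the known concentration.
    y = sum(analyzer_readings_ppm) / len(analyzer_readings_ppm)
    return 100.0 * (y - known_conc_ppm) / known_conc_ppm

# Example: an SO2 analyzer challenged with a 0.090 ppm precision gas.
readings = [0.093, 0.094, 0.092, 0.094, 0.093]  # one-minute averages
print(f"d = {precision_check_difference(readings, 0.090):+.1f}%")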
4. CONDITIONS FOR ACCURACY AUDITS
4.1 Typical Conditions
Data for estimating accuracy should be obtained under conditions as
typical as possible, i.e., under normal, routine activities of the moni-
toring agencies. Thus, consistent with the previous discussion under
Section 3.1 for precision, no special adjustments, calibrations, or
maintenance of the instruments should be performed immediately prior to the
internal accuracy audits.
To measure the closeness of an observed measurement value to the
truth, some material or condition of known (true) property (standard) must
be measured by the measurement system being checked. The measurement
system is "challenged" with the "known" to obtain the observed measurement.
The difference between the observed value and the known value is a measure
of the bias or inaccuracy of the observed value. Standard convention is
to obtain a signed difference by subtracting the known value from the
observed value so that the sign indicates the direction of the bias.
More specific details concerning the conduct of accuracy audits are given
in Reference 1.
4.2 Manual Methods
For manual methods, it is difficult to challenge the total measurement
system with "knowns". Therefore, an accuracy audit is made of only
a portion of the measurement system. The two major portions of manual
measurement systems are the flow measurements and the analytical measure-
ments. The flow measurement portion of the TSP and Pb reference methods,
-------
and the analytical measurement portion of the Pb method and of the NO2 and
SO2 bubbler methods are audited for accuracy. The flow rate audits for the
TSP and Pb methods are made at a flow rate near the normal operating flow
rate. Twenty-five percent of the combined total sites for TSP and Pb must
be audited internally each quarter, so as to represent a random sample for
the entire network. However, at least one site must be audited each
quarter, and all sites must be audited internally each year.
For the NO2 and SO2 methods, analytical audit samples (standards) in
the following ranges are used:
1. 0.2-0.3 µg/ml;
2. 0.5-0.6 µg/ml;
3. 0.8-0.9 µg/ml.
For the Pb method, the standards used are spiked filter strips containing
100-300 µg Pb/strip and 600-1000 µg Pb/strip. An internal audit at each
concentration level must be made on each day of analysis of routine
monitoring samples, and the audits must be made at least twice each quarter.
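One simple way to satisfy the site-selection requirement above is sketched below; the random partitioning approach and the site names are illustrative assumptions, not a prescribed procedure.

import random

def quarterly_audit_schedule(sites, seed=None):
    # Randomly partition the sites into four quarterly groups so that
    # roughly 25 percent are audited each quarter (at least one when
    # there are four or more sites) and every site is audited exactly
    # once during the year.
    rng = random.Random(seed)
    shuffled = list(sites)
    rng.shuffle(shuffled)
    return [shuffled[i::4] for i in range(4)]

sites = [f"site-{n:02d}" for n in range(1, 11)]  # 10 TSP/Pb sites
for quarter, group in enumerate(quarterly_audit_schedule(sites, seed=1), 1):
    print(f"Q{quarter}: {group}")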
4.3 Automated Analyzers
For automated analyzers, "known" gaseous pollutant standard concen-
trations, independently certified and obtained with equipment different
from that used for routine calibration and spanning, are introduced into
the measurement instruments. In this way, two different calibration
systems are involved: the one used for routine monitoring and the one
used to establish the audit standards. For SLAMS, the accuracy audits may
be conducted by the same personnel who normally calibrate the instruments.
However, in the case of PSD, different personnel must be used.
Automated analyzers are challenged (audited) with known pollutant
concentration standards at three levels (four levels in the case of high-
range analyzers) in accordance with Table 1. The internal, independent
accuracy audits are the responsibility of each reporting organization and
can be performed by personnel of the reporting organization. However, the
reporting organization could, if desired, have the accuracy audits
conducted by a contractor, or they could, by mutual agreement, be performed
by a Regional team, a contractor of EPA, or some other independent
organization.
-------
Table 1. Automated Analyzer Audit Concentrations

                      Concentration range, ppm
Audit level        SO2, NO2, O3             CO
    1              0.03 - 0.08            3 - 8
    2              0.15 - 0.20           15 - 20
    3              0.35 - 0.45           35 - 45
    4              0.80 - 0.90           80 - 90
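Table 1 lends itself to a simple range check; the sketch below encodes the table, with the function itself being an illustrative convenience rather than part of the regulation.

GAS_RANGES = {1: (0.03, 0.08), 2: (0.15, 0.20), 3: (0.35, 0.45), 4: (0.80, 0.90)}
CO_RANGES = {1: (3, 8), 2: (15, 20), 3: (35, 45), 4: (80, 90)}

def in_audit_range(pollutant, level, conc_ppm):
    # SO2, NO2, and O3 share one set of ranges; CO has its own.
    low, high = (CO_RANGES if pollutant == "CO" else GAS_RANGES)[level]
    return low <= conc_ppm <= high

print(in_audit_range("CO", 2, 18.0))   # True: within 15-20 ppm
print(in_audit_range("O3", 1, 0.10))   # False: above the 0.03-0.08 ppm range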
External audits of automated analyzer measurement systems are con-
ducted by EPA. Semiannual performance audits for CO and SO2 automated
methods are conducted by EMSL/RTP through the dissemination of small
cylinders containing CO gas and through the dissemination of small
cylinders containing SO2 gas used in conjunction with a dilution system.
Materials or devices for conducting performance audits for the measurement
systems for NO2 and O3 are being developed by EMSL/RTP. Participation in
these latter audits will be required of the State/local agencies when the
audit materials become available.
The annual systems audits conducted by the EPA Regional Offices were
previously discussed.
5. STATISTICS OF PRECISION
The choice of the particular statistics used for precision is
described in the following section. However, it should be stated at this
point that the statistical procedures and computations specified in 40 CFR
Part 58, Appendices A and B represent a tradeoff or compromise between (a)
the amount of effort and data that would be "nice to have" for statistical
analysis, and (b) the amount of effort that can be reasonably expected and
(c) the amount of data that can be efficiently and effectively handled by
State and local agencies. Thus, the statistics of Appendices A and B
represent a compromise between (a) theoretical statistical exactness, and
(b) simplicity and uniformity in computational procedures.
-------
5.1 Signed Percentage Differences
The reason for using percentage differences instead of actual differ-
ences is that errors in precision are generally proportional to concen-
tration levels. In the case of the biweekly precision checks, which are
made at one fixed level, either actual differences or percentage differ-
ences could be used. However, since other comparisons are made on a
percentage basis, percentage differences are used throughout for simplicity
and consistency. It is recognized that the percentage errors may be
somewhat higher at very low concentrations.
To obtain signed differences, the measurement associated with one
identified factor is always subtracted from the measurement associated
with the other identified factor. For example, for collocated samplers
the value from the designated sampler is always subtracted from the
value for the duplicate sampler. Therefore, each difference will be
either positive or negative in sign.
The reason for using signed percentage differences instead of abso-
lute percentage differences is to obtain important information on the
possible presence of systematic errors. The calculation of the average
difference values using signed percentage differences reveals or highlights
any systematic errors which may need investigation and corrective action
to improve the precision of the monitoring data. Further, the statistical
significance of these systematic errors can be determined with the average
and the standard deviation of the signed percentage difference values.
With absolute percentage differences, it would not be possible to separate
the systematic errors from the random errors. Ideally, the average, signed
percentage difference values obtained for each instrument or site should
be zero. Where these values are significantly different from zero, the
resulting probability limits will be noticeably asymmetrical about zero.
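As a worked illustration of signed percentage differences (the data are invented, and the denominator convention shown, the designated sampler's value, is an assumption; Appendix A gives the exact formula):

import statistics

designated = [82.0, 75.0, 91.0, 68.0, 79.0]  # official sampler values
duplicate = [84.0, 74.0, 93.0, 70.0, 80.0]   # collocated duplicate values

# Duplicate minus designated, kept with its sign, as a percentage.
d = [100.0 * (y - x) / x for x, y in zip(designated, duplicate)]
print("d values:", [f"{v:+.1f}%" for v in d])
print(f"mean = {statistics.mean(d):+.2f}% (systematic error), "
      f"std dev = {statistics.stdev(d):.2f}% (random error)")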
5.2 Manual Methods
For manual measurement methods, precision or repeatability is deter-
mined from the discrepancy between measurements from collocated samplers
presumably sampling the same air parcel over the same time period (Instru-
ment or Site Precision).
-------
Because it is desired to obtain a measure of precision associated
with a result from a single sampler, the variability (standard deviation)
of percentage differences between the collocated instruments is divided
by J2, since both instruments are assumed to be equally imprecise. The
division by J2 compensates for the fact that the variability (standard
deviation) of percentage differences from two measurement systems of equal
imprecision is increased by a factor of ^2 over the error variability of a
single measurement system. After division by V2, the repeatability
represents the variation in results which would be obtained if a large
number of like instruments of the same imprecision as those at the
collocation site were located at the same site sampling the same air over
the same period.
Because of the additional cost of establishing collocated samplers
for the estimation of precision for Pb, resort has been made to the
measurement of agreement between the analysis of duplicate strips from a
single filter or analyses of duplicate aliquots of the extracts. Whichever
method is used, the √2 factor should be used in the calculation of the
probability limits. The precision includes only the analytical portion of
the method and does not include the sampling and flow measurement portions
of the method.
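A minimal sketch of the single-site computation just described, assuming the signed percentage differences for the quarter are already in hand (the d values here are invented):

import math
import statistics

d = [2.4, -1.3, 0.8, 3.1, -0.6, 1.9]    # quarter's signed % differences
d_bar = statistics.mean(d)
s_j = statistics.stdev(d)
half_width = 1.96 * s_j / math.sqrt(2)  # sqrt(2) puts the limits on a
                                        # single-sampler basis
print(f"95% probability limits: {d_bar - half_width:+.2f}% "
      f"to {d_bar + half_width:+.2f}%")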
5.3 Automated Analyzers
A given precision check may be considered as representative of an
hourly average which would have been obtained from the instrument if the
air pollution concentration were the same as the concentration level of
the gas used for the precision check. Because the lack of precision is
generally proportional to concentration, nearly the same percentage
variation exists at other concentration levels, except for very low
concentrations for a given measurement system and quality control system.
5.4 Probability Limits
Throughout Appendices A and B, "probability" limits* are computed to
measure the expected spread or variability of the data from a particular
-------
population. These expected limits are expressed simply as a mean plus or
minus a constant (1.96) times the standard deviation, as follows:

    L = x̄ ± ks                                                  (1)

where:
    L = probability limits (upper limit, L_U; lower limit, L_L)
    x̄ = mean value
    k = 1.96, a constant
    s = standard deviation

Under the assumptions of (a) an underlying normal population, (b)
the mean, x̄, being the estimate of the true mean, µ, of the underlying
population, and (c) the standard deviation, s, being the estimate of the
true standard deviation, σ, of the underlying distribution, then
x̄ ± 1.96s represents the expected limits which should include 95 percent
of all the individual measurements of the population. Under the assumptions
given, the x̄ ± 1.96s limits are the expected 95 percent probability limits,
regardless of the sample size.**
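Equation 1 is simple enough to compute directly; the helper below is a minimal sketch with invented example values.

def probability_limits(x_bar, s, k=1.96):
    # Equation 1: expected 95 percent probability limits.
    return x_bar - k * s, x_bar + k * s

# A mean of +1.2% and a standard deviation of 4.0% give limits of
# about -6.6% to +9.0%.
print(probability_limits(1.2, 4.0))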
The requirement for the computation of "probability" limits (rather
than confidence limits) is to provide the State and local agencies with
limits which will be of practical meaning and usefulness for internal
control applications without involving overly complicated and sophisti-
cated statistics. The selection of the 95 percent level was made because
even for non-statisticians, the chance or probability of obtaining one
value out of twenty exceeding the limits has practical meaning.
Note that the limits are not "confidence limits," which could be
computed if one desired to determine limits that would include the true
mean, µ, with a specified confidence probability. With a given average,
x̄, and standard deviation, s, confidence limits on the "true"* statistical
mean would be:

    x̄ ± ts/√n                                                   (2)
*See O.L. Davies, "Statistical Methods in Research and Production,"
Oliver and Boyd (1949), p. 249 for a discussion of probability limits.
**See A. Hald, "Statistical Theory with Engineering Applications," Wiley
(1952), pp. 311-312.
-------
where:
t = a value from the t-distribution
n = number of sample values
With the limits computed for an instrument, site, analysis-day, or
agency, along with the appropriate sample size, confidence limits on the
true mean could be computed, if so desired.
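For contrast, equation 2 can be evaluated with the t-distribution; this sketch uses scipy, and the sample values are invented.

import math
from scipy import stats

x_bar, s, n = 1.2, 4.0, 13
t = stats.t.ppf(0.975, df=n - 1)   # two-sided 95 percent t value
half_width = t * s / math.sqrt(n)
print(f"95% confidence limits on the true mean: "
      f"{x_bar - half_width:.2f} to {x_bar + half_width:.2f}")
# Note how much narrower these are than the probability limits,
# which describe individual values rather than the mean.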
Note also that the limits, x̄ ± 1.96s, are not "tolerance" limits
according to the usual definition of "limits which will include at least
a fraction P of the individual values of a population with a stated degree
of confidence γ." Such two-sided tolerance limits are expressed in the
same form as equation 1, x̄ ± ks, but the value of k here is different
from that in equation 1 and depends on the specified values of population
fraction, P, and confidence coefficient, γ. Tabulated values for k are
often given** for values of P of 0.75 and above, and values of γ of 0.75
and above. For example, for a sample size, n, of 13, P = 0.95, and γ = 0.75,
the k value is 2.424. Thus, the tolerance limits, x̄ ± 2.424s, will include
at least 95 percent of the individual values of the underlying normal
distribution with a confidence of 75 percent.
In a sense, the x̄ ± 1.96s "probability" limits are a special type of
tolerance limit where the confidence level is at the "expectation," or
near the 50 percent confidence level. (It is not exactly the 50 percent
confidence level because the distributions of x̄ + ks and x̄ - ks are not
normal for small sample sizes.) In other words, approximately 50 percent
of the time, the probability limits will include 95 percent of the individ-
ual values of the underlying distribution.
*The "true" mean in the statistical sense is a quantity the confidence
limits for which includes considerations for the variations due to
random sampling and random measurement repeatability. The "true" mean
in the statistical sense is not the same as the "true" mean in the
meteorological sense.
**Handbook 91, "Experimental Statistics," U.S. Dept. of Commerce,
National Bureau of Standards, pp. 2-13 through 2-15 and Table A-6;
see also A. Hald, "Statistical Theory with Engineering Applications,"
pp. 313-315.
-------
A summary of the various probability limits for precision, computed as
outlined in Appendix A, is presented in Table 2 for manual methods and
automated analyzers. Note that a condition and measure of bias or
systematic error is always associated with the d̄j's and the D̄'s, and a
condition and measure of repeatability or random error is associated with
the ±1.96 Sj and ±1.96 Sa terms of the limits.
5.5 Meaning of Precision
Table 3 summarizes and interprets the probability limits for precision.
The d̄j's and the Sj's are the means and standard deviations, respectively,
for the calendar quarter for particular instruments, particular sites,
particular instrument-site combinations, or particular analysis days.
Sj represents the variability of the measurement process under the
most similar conditions and may be considered as the statistical "error."
d̄j can be considered in a statistical sense as a local, within-quarter
instrument bias or inaccuracy. However, the d̄j's may not necessarily be
statistically different from zero. If the d̄j's are significantly different
from zero, a persistent drift in instrument response is occurring, and the
cause must be identified and corrected. Whether or not the d̄j's are signi-
ficantly different from zero for a particular instrument, site, or analysis
day, the d̄j's will probably vary in a random way among instruments, among
sites, and among analysis days; therefore, the variability of the d̄j's
may be considered as another level of precision, when considering the
agency monitoring system as an entity.
For a specific agency, Sa represents the "averaged" or pooled within-
instrument, within-site, or within-analysis-day variability. In other
words, it is the agency estimate for the calendar quarter of the within-
instrument, within-site, or within-analysis-day variability, i.e., an
average error term.
The D̄'s may be considered as a within-quarter agency bias or in-
accuracy. However, the D̄'s may not necessarily be statistically different
from zero. A part of the EPA analysis of the data includes a test for
significance of the D̄'s. (Such a test should also be performed by each
agency.) If the D̄'s are significantly different from zero, a persistent
-------
Table 2. Computed Probability Limits for Precision for
Manual Methods and Automated Analyzers

Manual methods

Precision (from daily signed percentage differences between collocated
instruments for SO2, NO2, and TSP, or from signed percentage differences
between duplicate strips or duplicate analyses for Pb):

    Single site*: d̄j ± 1.96 Sj/√2
        d̄j = bias between samplers, strips, or analyses (systematic error)
        1.96 Sj/√2 = within-site variability, individual daily value basis
                     (random error)

    Agency: D̄ ± 1.96 Sa/√2
        D̄ = agency bias (systematic error)
        1.96 Sa/√2 = average within-site variability (random error)

Automated analyzers

Precision (from biweekly precision checks at one fixed level):

    Single instrument*: d̄j ± 1.96 Sj
        d̄j = instrument bias (systematic error)
        1.96 Sj = within-instrument variability (random error)

    Agency: D̄ ± 1.96 Sa
        D̄ = agency bias (systematic error)
        1.96 Sa = average within-instrument variability (random error)

*Limits for each instrument, site, or analysis day are not required to be
reported to EPA. However, they should be computed for internal agency use.
-------
Table 3. Summary of 95 Percent Probability Limits for Precision and Their
Meaning for Manual Methods and Automated Methods

Manual methods

    Single site*: d̄j ± 1.96 Sj/√2
        Expected variability (imprecision) during the calendar quarter of
        an individual air pollution measurement from the particular site.

    Agency: D̄ ± 1.96 Sa/√2
        Expected variability (imprecision) during the calendar quarter of
        an individual air pollution measurement from any site within the
        agency.

Automated analyzers

    Single instrument*: d̄j ± 1.96 Sj
        Expected variability (imprecision) during the calendar quarter of
        air pollution measurements at the precision check concentration
        from the particular instrument.

    Agency: D̄ ± 1.96 Sa
        Expected variability (imprecision) during the calendar quarter of
        air pollution measurements at the precision check concentration
        from any instrument operated by the agency.

*Limits for each instrument, site, or analysis day are not required to be
reported to EPA. However, they should be computed for internal agency use.
-------
drift in the same direction very likely exists for most of the instruments.
Whether or not the D̄'s are significantly different from zero, the D̄'s may
vary in a random way among quarters for the same agency; therefore, the
variability of the D̄'s may be considered as a third level of imprecision.
The EPA analyses of the data for each calendar year include a test of the
average of the four quarterly D̄'s for a given agency against zero, to
detect any persistent drifts throughout the entire year for all instruments
of the same type.
From the probability limits reported by an agency, one could back-
calculate the agency average, D̄, and the agency standard deviation, Sa. As
discussed previously, if the computations were made using unsigned
percentage differences, it would not be possible to determine the D̄ and Sa
values; thus, it would not be possible to determine the possibly signifi-
cant systematic agency errors. In other words, it would not be possible
to separate the systematic errors from the random errors.
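The back-calculation is straightforward; the sketch below assumes limits of the automated-analyzer form D̄ ± 1.96 Sa (for manual methods the additional √2 divisor would have to be accounted for), and the example limits are invented.

def back_calculate(lower, upper, k=1.96):
    # Invert limits of the form D_bar +/- k * S_a.
    d_bar = (upper + lower) / 2.0
    s_a = (upper - lower) / (2.0 * k)
    return d_bar, s_a

print(back_calculate(-8.5, 11.3))  # about (1.40, 5.05)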
Figure 5 graphically illustrates the meaning of the calculated values
of d_j, S_j, D, S_a, and the 95 percent probability limits for precision. The
individual x's represent the individual d values for each of the four instru-
ments or sites of the example. For each of the instruments or sites,
d_j, the average of the d values, represents the bias from zero, and S_j
represents the variability of the d values. The pair of short parallel
lines in the tails of the distribution represents the 95 percent probability
limits for the assumed underlying normal distribution of individual d
values.

The normal distribution shown in Figure 5 under Quarterly Report shows
D (the weighted average of the d_j's) and S_a, representing the pooled or
weighted "average" of the individual S_j values. The short parallel lines
in the tails of the distribution represent the corresponding 95 percent
probability limits.
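As an illustration of the quarterly roll-up shown in Figure 5, the sketch
below computes D as a weighted average of the d_j's and S_a as a pooled
"average" of the S_j's. The weighted-mean and pooled-variance forms used
here are the standard ones and are assumed for illustration only; the
authoritative formulas are equations 4/4a and 5/5a of Appendix A.

    import math

    def quarterly_summary(d_bars, s_js, n_checks, k=1.96):
        # d_bars, s_js: per-instrument means and standard deviations of the
        # signed percentage differences; n_checks: number of checks n_j each.
        n_total = sum(n_checks)
        d_agency = sum(n * d for n, d in zip(n_checks, d_bars)) / n_total
        pooled_var = (sum((n - 1) * s ** 2 for n, s in zip(n_checks, s_js))
                      / (n_total - len(n_checks)))
        s_agency = math.sqrt(pooled_var)
        return d_agency, (d_agency - k * s_agency, d_agency + k * s_agency)

    # Four instruments, six biweekly checks each (hypothetical values):
    print(quarterly_summary([-1.2, 0.5, 2.0, -0.3], [2.1, 1.8, 2.5, 1.9], [6, 6, 6, 6]))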
[Figure 5. Distributions of individual d values for four instruments or
sites and for the quarterly report, with 95 percent probability limits.
Panel headings: INSTRUMENTS OR SITES; QUARTERLY REPORT.]
Also, as with precision discussed previously, the reason for using
signed percentage differences for accuracy instead of absolute percentage
differences is to obtain important information on possible systematic
errors. The calculation of the difference values using signed percentage
differences reveals or highlights any systematic errors which may need
investigation and corrective action to further improve the accuracy of the
monitoring data. Using the average and the standard deviation of the
signed percentage difference values, the statistical significance of these
systematic errors can be determined. With absolute percentage differences,
it would not be possible to separate the systematic errors from the random
errors. Ideally, the average signed percentage difference values obtained
for each analyzer should be zero. Where the average difference values are
significantly different from zero, the resulting probability limits will
be noticeably asymmetrical about zero.
6.2 Manual Methods

The accuracy of manual sampling methods is assessed by auditing a por-
tion of the measurement process. For TSP and Pb, the flow rate during
sampling is audited. For SO2, NO2, and Pb, the analytical measurement is
audited. For single samplers, the accuracy is the signed percentage dif-
ference value: the observed or measured value minus the known value,
divided by the known value, and converted to a percentage. For an
agency, the accuracy is the mean of the signed percentage difference
values from the samplers.
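As a worked illustration of this definition, with hypothetical audit values:

    # Signed percentage difference for a single accuracy audit:
    # (observed - known) / known, converted to a percentage.
    def signed_pct_difference(observed, known):
        return 100.0 * (observed - known) / known

    # (observed, known) value pairs from three audited samplers:
    audits = [(41.2, 40.0), (38.5, 40.0), (40.9, 40.0)]
    d_values = [signed_pct_difference(y, x) for y, x in audits]
    agency_accuracy = sum(d_values) / len(d_values)  # mean of the signed differences
    print(d_values, agency_accuracy)

A positive d value indicates that the measurement process reads high
relative to the known value; a negative d value indicates that it reads low.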
6.3 Automated Analyzers
The audit is performed by challenging the analyzer with known concen-
trations at three levels (four levels for analyzers with extended ranges).
The accuracy at each level is calculated as described previously for the
manual method (Section 6.2).
6.4 Probability Limits
The statistical concepts discussed previously for precision are also
applicable to the computation of probability limits for accuracy.
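A minimal sketch of that computation, assuming the D ± 1.96 S_a form
summarized in Table 4 and using hypothetical audit results:

    import statistics

    d_values = [2.1, -0.4, 3.0, 1.5]   # signed % differences, one audit per instrument
    D = statistics.mean(d_values)      # agency bias (systematic error)
    S_a = statistics.stdev(d_values)   # between-instrument variability (random error)
    limits = (D - 1.96 * S_a, D + 1.96 * S_a)  # 95 percent probability limits
    print(D, S_a, limits)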
Table 4 summarizes the various probability limits for accuracy
computed as outlined in 40 CFR Part 58, Appendix A, for manual methods and
automated analyzers. The bias or systematic error is associated with the
single-instrument and agency signed percentage differences, the d_j's and
D's, respectively. Repeatability or random error is associated with the
± kS_a terms of the probability limits.
6.5 Meaning of Accuracy

Table 5 summarizes and interprets the aforementioned probability
limits for accuracy for manual methods and automated analyzers. The D's
and the S_a's are the means and standard deviations, respectively, for the
calendar quarter for the agency.

For accuracy, S_a represents the variability of inaccuracies across
instruments, sites, or analysis days. In a sense, this S_a may be con-
sidered a precision estimate. For automated methods with a well-controlled
calibration system, and with good linearity and stability over time, the
S_a for accuracy and the S_a for precision should be approximately equal.
(Whereas the S_a for precision measures the average variation at a single
concentration at biweekly intervals, the S_a for accuracy measures the
variation at given concentration levels but at only one time each quarter
for a given instrument.) A part of the EPA analysis includes a comparison
of the S_a for precision and the S_a for accuracy for continuous instruments.
If the S_a for accuracy at the lowest concentration level is significantly
larger than the S_a for precision, there is likely to be some uncontrolled
variable within the calibration process, which should be investi-
gated.

For integrated sampling methods, the logic of the comparison between
the S_a/√2 for precision and the S_a for accuracy is not as straightforward
as for automated analyzers. For TSP, since the S_a for accuracy includes
variation only from the flow rate portion of the measurement process, this
S_a should be less than the S_a/√2 for precision, which includes variation
from the entire measurement process. For the SO2 and NO2 bubbler methods,
a similar situation exists as for TSP, in that the S_a for accuracy
includes variation only from the chemical analysis portion of the
measurement method, but the S_a/√2 includes variation from the entire
measurement process. If the S_a for accuracy significantly exceeds the
S_a/√2 for precision, the calibration process for the method involved is
not well controlled. For Pb, both the flow rate and analytical portions of
the method are audited, but the flow rate audits are combined with the TSP
data and not reported individually for lead.
Table 4. Computed Probability Limits for Accuracy for
Manual Methods and Automated Analyzers

Manual Methods

Accuracy (TSP and Pb) (from flow rate checks at a fixed level, once
per quarter, 25% of sites each quarter)

Single Site    d_j
    sampler inaccuracy (combined systematic and random errors)

Agency    D ± 1.96 S_a
    D: agency bias (systematic error)
    ± 1.96 S_a: total variability, including between-sampler inaccuracies

Accuracy (NO2, SO2, and Pb), Each Level (from analytical checks at
least twice per quarter)

Single Analysis Day    d_j
    daily inaccuracy (combined systematic and random errors)

Agency    D ± 1.96 S_a
    D: agency bias (systematic error)
    ± 1.96 S_a: total variability, including between-day inaccuracies

Automated Analyzers

Accuracy, Each Level (from calibration audits once per quarter,
25% of instruments each quarter)

Single Instrument    d_j
    instrument inaccuracy (combined systematic and random errors)

Agency    D ± 1.96 S_a
    D: agency bias (systematic error)
    ± 1.96 S_a: total variability, including between-instrument inaccuracies
Table 5. Summary of 95 Percent Probability Limits for Accuracy and
Their Meanings for Manual Methods and Automated Analyzers

Manual Methods

TSP and Pb

Single Site    d_j
    Expected bias (inaccuracy) during the calendar quarter of the flow rate
    portion of the measurement process for the particular site.

Agency    D ± 1.96 S_a
    Expected variation in bias (inaccuracy) during the calendar quarter
    of the flow rate portion of the measurement process for all sites in
    the agency.

SO2, NO2, and Pb

Single Analysis Day    d_j
    Expected bias (inaccuracy) during the calendar quarter of the chemical
    analysis portion of the measurement process for the particular
    analysis day at each concentration level.

Agency    D ± 1.96 S_a
    Expected variation in bias (inaccuracy) during the calendar quarter
    of the chemical analysis portion of the measurement process for all
    analysis days at each concentration level.

Automated Analyzers

Single Instrument    d_j
    Expected bias (inaccuracy) during the calendar quarter of air pollu-
    tion measurements at each audit concentration from the particular
    instrument.

Agency    D ± 1.96 S_a
    Expected variation in bias (inaccuracy) during the calendar quarter
    of air pollution measurements at each audit concentration from all
    instruments in the agency.
Figure 6 graphically illustrates the meaning of the calculated values
of d, D, and S_a, and the 95 percent probability limits for accuracy. For
accuracy at a given level, the individual audit results, d, for four
instruments or sites are represented by the x's. In accordance with the
minimum requirements of the regulations, only one audit value (for a given
level) is shown for each instrument or site.

Under Quarterly Report are shown the same four individual x or d
values, with the D and S_a calculated from the individual values. The 95
percent probability limits are shown by the short parallel lines in the
tails of the distribution.
[Figure 6. Individual audit results (x's) for four instruments or sites
and the corresponding quarterly report distribution, with D, S_a, and the
95 percent probability limits. Panel headings: INSTRUMENTS OR SITES;
QUARTERLY REPORT.]
7. USE OF PRECISION AND ACCURACY DATA
The precision and accuracy data obtained by the networks and reported
to EPA are of considerable value to various organizations. These estimates
will be helpful to users of routine monitoring data by providing them with
information on the quality of the data with which they are working.
The estimates are valuable to EPA in obtaining "real world" information on
the precision and accuracy of the reference, equivalent, and approved
methods. The data should also be of particular interest and value to the
originating agencies as a supplement to the routine quality control system.
7.1 Originating Agencies
7.1.1 Supplement to Internal Quality Control - The measures of precision
and accuracy are obtained by each network in the form of probability or
control chart-type limits that can and should be used within each agency as
supplementary information for internal quality control. The precision and
accuracy information obtained within a network on a given site or instrument
can be used for local quality control purposes for the particular site or
instrument. It is important to emphasize, however, that the precision and
accuracy checks required by Appendix A do not obviate the need to maintain
a routine quality control system. The precision and accuracy checks are
too infrequent to be adequate for day-to-day control. Furthermore, the
precision and accuracy results should not normally be used to make any
after-the-fact adjustments or corrections to the measurement system or to
monitoring data. Excessive deviations, however, should not be ignored and
should trigger investigative action.
7.1.2 Control Charts - The results of the precision and accuracy data
can be plotted on various control charts. As stated above, the results of
the precision and accuracy checks, if used in a timely way, can constitute
a valuable supplement to normal routine internal quality control checks.
With the increased installation and use of computers for acquisition
and/or up-to-date storage of monitoring data, the computers could also be
used for the acquisition and/or storage of the precision and accuracy data.
Further, the computers could be programmed to perform the necessary cal-
culations for precision and accuracy reporting and could also be programmed
to plot the control charts in real (or near-real) time.
In general, the control chart limits will be similar to the computed
probability limits except that the 1.96 value will be replaced by a 3.0.
(The 1.96 corresponds to an expected 95 percent probability; the 3.0 cor-
responds to an expected 99.7 percent probability.) The ± 1.96 S_a limits
could also be used as "2-sigma" warning limits along with the "3-sigma"
control limits. In the case of manual method precision, the √2 factor
is not included because the points to be plotted will be the percentage
differences, which include variability from the imprecision of both
samplers. Also, since the intuitively expected values of d_j and D are
zero for precision and accuracy, the centerline for the control charts
should be zero. Table 6 describes control charts which can be plotted
for the individual precision checks and accuracy audits.
Although the prime objective of the precision and accuracy audits is
to obtain an assessment of data quality, a number of statistical control
charts can be maintained to provide some supplemental long-term internal
control. With control limits established on the basis of past history (at
least one quarter for precision, at least one year for accuracy), future
data values can be plotted to detect significant changes from past
experience. Control charts could be plotted with the D values to detect
within-quarter biases. Similarly, the quarterly values of S_a could be
plotted to control or display the variability aspects of the measurement
systems.
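A control-chart check of this kind reduces to a few lines of computation.
The sketch below, with hypothetical numbers, sets the centerline at zero,
3-sigma control limits, and 1.96-sigma warning limits estimated from one
instrument's past precision-check history:

    import statistics

    history = [1.1, -0.8, 0.3, -1.9, 0.7, 1.4, -0.2, 0.9]  # past d values (%)
    s = statistics.stdev(history)  # estimate of sigma from past history

    def classify(d):
        if abs(d) > 3.0 * s:
            return "beyond control limits - investigate"
        if abs(d) > 1.96 * s:
            return "beyond warning limits"
        return "in control"

    for d in [0.5, 2.6, 4.8]:  # newly plotted precision-check results
        print(d, classify(d))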
Consult Appendix H of the Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I, Principles, EPA 600/9-76-005 for further
details on the construction and use of control charts. Also, the analysis
and interpretation of the results from individual accuracy audits are
given on pages 86-9, Section 2.0.12 of Volume II, Ambient Air Specific
Methods, EPA 600/4-77-027a.
7.2 States and Regional Offices
The precision and accuracy reports will be helpful to the states in
comparing these measures of data quality from the networks within the
States. Similarly, the EPA Regional Offices will be able to make
comparisons within and between States. These comparisons may point out
particular organizations or States in need of further improvement in their
quality assurance programs.
7.3 Environmental Protection Agency (EPA)
Evaluation of the precision and accuracy data is important to EPA
(EMSL, Research Triangle Park, North Carolina) in its role of responsi-
bility for quality assurance of air pollution measurements. The precision
and accuracy data will be used to (a) determine possible needs for addi-
tional research efforts related to the technical or procedural aspects of
particular measurement methods, (b) indicate measurement methods, or
portions thereof, which may require improved quality control, and
(c) indicate particular agencies, States, or Regions that may require
technical assistance or improved quality control. In other words, the
precision and accuracy information will enable comparisons to be made
among measurement methods, and among networks or other organizational
entities for purposes of identifying possible areas in need of improvement
of data quality.
With knowledge of the precision and accuracy information, EPA could
consider appropriate statistical allowances or risks in setting and
enforcing the standards, and in developing control strategies.
7.4 The User
Users of monitoring data maintained in the National Aerometric Data
Bank (NADB) receive, along with the monitoring data, the precision and
accuracy data for the corresponding reporting organizations and time
periods. Knowledge of the precision and accuracy data assists the users
in their interpretation, evaluation, and use of the routine monitoring data.
8. SUMMARY OF ANALYSIS OF PRECISION AND ACCURACY DATA
To assist Regions and States in making the above comparisons and in
performing other analyses of the reported precision and accuracy data,
EMSL/RTP prepares evaluation and summary reports covering each calendar
quarter, as well as an annual summary. Samples of these reports are
available upon request.
Table 6. Recommended Control Charts and Control Limits for Precision
Checks and Accuracy Audits for State and Local Agencies

Manual methods

SO2, NO2, TSP, Pb
    Type of control chart: Precision - Single Site
    Number of control charts: One control chart for each collocated site
        (the single site for Pb)
    Control limits: Zero ± 3 S
    Plotting: Each day, plot the individual d_j for each site
    Variability or inaccuracy to be controlled: Excessive lack of agreement
        between collocated samplers

TSP (flow rate), Pb (flow rate)
    Type of control chart: Accuracy - Single Site
    Number of control charts: One control chart per agency
    Control limits: Zero ± 3 S
    Plotting: After each audit, plot each individual d_j
    Variability or inaccuracy to be controlled: Excessive inaccuracy of
        each instrument

SO2 (analysis), NO2 (analysis), Pb (analysis)
    Type of control chart: Accuracy for each audit level
    Number of control charts: One control chart for each audit level
    Control limits: Zero ± 3 S
    Plotting: After each audit, plot each individual d_j
    Variability or inaccuracy to be controlled: Excessive inaccuracy for
        each audit

Automated methods

SO2, CO, NO2, O3
    Type of control chart: Precision - Single Site
    Number of control charts: One control chart for each instrument
    Control limits: Zero ± 3 S
    Plotting: After each biweekly precision check, plot each individual
        d_j value
    Variability or inaccuracy to be controlled: Excessive variability and
        drift of each instrument

SO2, CO, NO2, O3
    Type of control chart: Accuracy for each audit level
    Number of control charts: One control chart for each audit level
    Control limits: Zero ± 3 S
    Plotting: After each audit check, plot each individual d_j value
    Variability or inaccuracy to be controlled: Excessive inaccuracy of
        each instrument
9. COMPARISON OF SLAMS AND PSD REQUIREMENTS
Table 7 summarizes the major similarities and differences of the
requirements for SLAMS and PSD.
As indicated, the requirements are the same in that both require:
(a) The development, documentation, and implementation of approved
quality control programs.
(b) The assessment of data quality for precision and accuracy.
(c) The use of reference, equivalent, or approved methods.
(d) The use of calibration standards traceable to NBS SRM's or other
primary standards.
(e) The participation in EPA performance audits and the permission
for EPA to conduct system audits.
The monitoring and QA responsibilities for SLAMS rest with the State
or local agency, whereas for PSD they rest with the source owner/operator
seeking the permit. The monitoring duration for SLAMS is indefinite,
whereas for PSD the duration is usually up to 12 months. Whereas the
reporting period for precision and accuracy data is on a calendar-quarter
basis for SLAMS, it is on a continuing sampling-quarter basis for PSD,
since the monitoring may not commence at the beginning of a calendar
quarter. For example, the reporting quarters for PSD might be March,
April, May; June, July, August; etc.

The performance audits for PSD must be conducted by personnel dif-
ferent from those who perform routine span checks and calibrations,
whereas for SLAMS this is preferred but not required. For PSD, the audit
rate is 100 percent of the sites per reporting quarter, whereas for SLAMS
it is 25 percent of the sites or instruments. Note that monitoring for
SO2 and NO2 for PSD must be done with automated analyzers; the manual
bubbler methods are not permitted.
Table 7. Comparison of QA Requirements for Appendix A
(SLAMS) and Appendix B (PSD)

Requirements (both appendices):
    1. Develop and implement an approved quality control program.
    2. Assess data quality in terms of precision and accuracy.
    3. Use reference, equivalent, or approved methods.
    4. Use traceable standards.
    5. Participate in EPA performance audits and permit EPA to perform
       system audits.

Monitoring and QA responsibility
    Appendix A: State/local agency
    Appendix B: Source owner/operator

Monitoring duration
    Appendix A: Indefinite
    Appendix B: Up to 12 months

QA reporting period
    Appendix A: Calendar quarter
    Appendix B: Sampling quarter

Accuracy assessment audits
    Appendix A: Standards and equipment different from those used for
        spanning and calibration; different personnel preferred.
    Appendix B: Personnel, standards, and equipment different from those
        used for spanning and calibration.

Audit rate, automated
    Appendix A: 25% per quarter
    Appendix B: 100% per quarter

Audit rate, manual
    Appendix A: Hi-vol and Pb, 25% per quarter; SO2 and NO2, each analysis
        day, at least twice per quarter.
    Appendix B: 100% per quarter (no manual SO2 or NO2 methods permitted).

Precision assessment, automated (precision gas check)
    Both appendices: One-point precision check biweekly; more frequent
        checks encouraged. Independence not required.

Precision assessment, manual (collocated sampling)
    Appendix A: Two sites, every sixth day, or at least once per week, for
        NO2, SO2, and TSP; duplicate strips or aliquots for Pb.
    Appendix B: One site; at least once per week, or every third day for
        daily monitoring (TSP and Pb).

Reporting
    Appendix A: By reporting organization.
    Appendix B: By site.
The requirements for precision assessment for the automated methods
are the same for both SLAMS and PSD. However, for manual methods, only one
collocated site is required for PSD and the frequency is once per week
instead of every sixth day as is usual for SLAMS.
The precision and accuracy data for PSD is reported separately for
each sampler (site), whereas for SLAMS, the report is by reporting
organization.
It should be recognized that the requirements of Appendix A and
Appendix B are minimum requirements. The permit-granting authority
for PSD may impose more frequent or more stringent requirements than
those stated in Appendix B. Also, the Regional Offices may impose more
frequent or more stringent requirements for SLAMS than those stated in
Appendix A.
10. QUESTIONS AND ANSWERS
A series of workshops was conducted in each of the 10 EPA regions to
review the background, rationale, and requirements of the May 10, 1979
regulation (40 CFR Part 58) with region, state and local agency personnel.
During the conduct of the workshops numerous questions were raised by the
regional, state and local agency personnel concerning interpretations of
the requirements and guidance for implementing the requirements of the
regulation in special cases and circumstances. This section presents the
questions raised and the answers given.
1. Q. What is the relation between the Quality Assurance Criteria
(QAC) program and the Precision and Accuracy Reporting System (PARS)?
A. The QAC program was designed as a qualitative means of
"scoring" data quality from knowledge of siting, probe location, measure-
ment method, etc. (i.e., technical criteria) with judgmental weights.
The QAC program was intended as an interim method of judging data quality
for past periods until the PARS system became effective, January 1, 1981.
Unless required to be continued by the Regional Offices, the QAC program
has been terminated.
2. Q. Are video tapes available of the Regional Workshops conduct-
ed by EPA (EMSL/RTP and OAQPS) on regulation 40 CFR Part 58?
A. Yes, a series of color video tapes on the regulation
covering condensations of the material presented at workshops by members
of the EMSL/RTP and OAQPS at each of the ten Regions is available on loan
from the Air Pollution Training Institute, EPA (MD-17), Research
Triangle Park, North Carolina 27711. These tapes provide a systematic
review of the requirements of the regulations, including those portions
dealing with the precision and accuracy reporting (7).
3. Q. The regulations (CFR Part 58, section 58.23) require that
SLAMS be fully implemented, including the requirements of Appendix A, by
January 1, 1983. The regulations (section 58.34) require that NAMS be
fully implemented, including the requirements of Appendix A, by January 1,
1981. Further, Appendix A specifies the minimum requirements for SLAMS.
Section 4.1.1 of Appendix A, requires precision data from all approved
SLAMS analyzers. Section 4.1.2 of Appendix A requires accuracy data from
all approved SLAMS analyzers. The instructions for Form 1 (Appendix A,
Section 5.3, Block No. 15-17) state that only approved analyzers in the
network be counted and reported. (1) What, if any, is the difference
between approved and reference or equivalent methods?; (2) What
analyzers/methods are to be included in the precision checks and accuracy
audits and reported?; and (3) How does the difference in the implementation
dates (January 1, 1983 for SLAMS and January 1, 1981 for NAMS) affect the
requirements for precision checks and accuracy audits and the reporting
thereof?
A. (1) An approved analyzer is a reference or equivalent
method or an analyzer otherwise approved under 40 CFR Part 58, Appendix C.
(2) Reporting of precision and accuracy data is required
for all reference, equivalent or approved methods used at designated, fully
approved, and operational SLAMS sites.
(3) The results of the special checks for precision and
accuracy for both automated analyzers and manual methods are intended to
represent the precision and accuracy for the entire reporting organization
for the SLAMS network. Since the NAMS is a part of SLAMS, the same
precision and accuracy data represent the NAMS as well as SLAMS. Further,
the intent of the regulation is that the documented QA system (see section
2.2, 40 CFR Part 58) applies equally to NAMS and SLAMS. In other words,
the QA system for NAMS is to be no different from that for all other SLAMS.
NAMS sites are to receive no special treatment with respect to QA.
4. Q. Will EPA certify commercial supplier cylinders or permeation
tubes for users?
A. Yes. EMSL/RTP currently provides a service of certifying gas
cylinders, permeation tubes, and flow measurement devices at no cost.
5. Q. What quality assurance requirements apply to the meteoro-
logical monitoring systems used to obtain data for modeling purposes for
PSD?
A. No requirements are given in the regulations. However,
Ambient Monitoring Guidelines for Prevention of Significant Deterioration
(PSD) EPA-450/2-78-019, May 1978, includes some recommendations (6). The
QA for the meteorology measurement systems should be included in the
documented QA plan. A guideline document, "Quality Assurance Manual for
Meteorological Monitoring Systems" is currently in preparation.
6. Q. Where in the regulation is the form for reporting PSD
precision and accuracy data?
A. There is no special form included in the regulation. How-
ever, Form 1 could be modified for use. A separate form, however, would
be required for each site. The permit-granting authority should specify
the format for reporting PSD data.
7. Q. If SLAMS data are used for PSD purposes, must the precision
and accuracy requirements for PSD (Appendix B) be met?
A. If it is planned in advance by the permit-granting authority
to use data from a SLAMS site for PSD purposes, i.e., the same site is
used for both SLAMS and PSD, then it must meet the precision and accuracy
requirements for both. Special considerations or decisions may be made
by the permit-granting authority to use data from SLAMS sites.
8. Q. (a) Can the flow rate audit for TSP be performed in the
laboratory or must it be performed in the field? (b) Can the precision
checks and audits for automated analyzers be performed in the laboratory?
A. (a) The flow rate audit must be performed in the field
with equipment different from that used for calibration. (b) All precision
and accuracy data are to represent field monitoring results. Consequently,
precision and accuracy checks for automated analyzers must be made in the
field under monitoring conditions.

NOTE: By definition, precision data for manual methods result from
field sampling. Only the accuracy audits for the SO2, NO2, and Pb manual
methods are performed in the laboratory.
9. Q. Can the results of the precision and accuracy checks be
used as a basis for invalidating routine monitoring data?
A. The precision and accuracy checks are not intended for use
as data validation or invalidation checks. Each agency should have
developed and implemented a separate system for routine use in performing
the data validation function, a system which should include various types
of checks with associated validation/invalidation criteria.
It is possible that after a sufficient history of precision
and accuracy data (e.g., after a year) has been accumulated, these data
could, with appropriate statistical analysis, provide a basis for
validation/invalidation criteria, as a part of (in addition to) the
routine data validation system.
If, however, precision and accuracy data are used to
invalidate routine monitoring data, all of the monitoring data from the
particular site or sites (instrument or instruments) involved should be
invalidated back to the last "acceptable" check of the same type. In
such a case, the results of the precision check or accuracy audit involved
should not be included in the calculations for reporting precision and
accuracy data.
10. Q. Precision checks and/or accuracy audits may have been per-
formed during a period for which routine monitoring data have been invali-
dated for cause. Should the results of the precision checks and accuracy
audits be reported?
A. Not if the routine monitoring data obtained immediately before
and after the precision checks and accuracy audits were invalidated for
reasons that could have adversely affected the precision and accuracy
results. The audits should be repeated as soon as practical after the
invalidation of the monitoring data.
11. Q. Although the regulations require only one accuracy audit
for the selected automated analyzers each quarter, an agency may decide to
audit a given analyzer more than once during a quarter. In equations 8
and 9 (section 4.1.2 of Appendix A of the regulation), k is indicated as
the number of analyzers audited during the quarter. If more than one
audit is performed on an analyzer, what is the value of k?
A. With more than one audit on a given analyzer, k is the
number of audits performed. For example, if two analyzers are audited--
one twice and the other three times--the total number of audits is five.
Therefore, k is 5, which should be used in the calculations and reported
in Blocks 36-38 on Form 1 (front). The same procedure would also be
used for audits of manual methods.
12. Q. On Form 1, the upper portion of the leftmost block in each
group of blocks used for the reporting of the probability limits contains
small +/- signs. Are the + or - signs, whichever applies, to be circled,
or should a larger + or - sign be written or typed in?
A. The intent of the +/- signs was to remind those completing
the form that either a + or a - sign must be entered preceding the two
digit value. On some of the forms, the +/- is very faint. It is best
to enter a large + or - sign in the block, rather than to circle the
appropriate sign, or to delete the inappropriate sign.
13. Q. Does EPA provide any guidance on the conduct of independent
performance audits?
A. Yes. Details concerning the conduct of performance audits
for the TSP flow measurement and for automated continuous methods for
sulfur dioxide, nitrogen dioxide, carbon monoxide, and ozone are provided
in the EPA Quality Assurance Handbook for Air Pollution Measurement Systems,
Volume II, Section 2.0.12 (5).
14. Q. Exactly how are precision and accuracy data to be computed
when the samplers or analyzers may be changed at a given site?
A. Precision Checks
Automated methods - Compute results individually for each
analyzer-site combination in actual use to obtain monitoring data during
each quarter. Determine d and s for each analyzer-site combination.
Combine the results as specified by the formulas in Appendix A or B. In
formulas 4, 4a, 5, and 5a, k is the number of analyzer-site combinations.
Manual methods - Compute results individually for each
collocated site, whether or not changes or replacements have been made in
the samplers during the quarter. Compute d and s for each collocated site.
Combine the results as specified by the formulas in Appendix A or B. In
formulas 4, 4a, 5, and 5a, k is the number of collocated sites.
Accuracy Checks
Automated methods - Ideally, each analyzer-site combina-
tion in actual use to obtain monitoring data during the calendar year
should be audited. Due to changes of analyzers at given sites, this will
result in more combinations than there are sites.
In practice, by the end of each calendar year, each
analyzer which has been used for routine monitoring should have been
audited and each site should have been audited.
In planning for auditing, the "25 percent rule" should
take the above into account.
Manual methods (TSP) - Although samplers or motors may be
changed at given sites during the calendar year, it is considered necessary
only to audit 25 percent of the sites each quarter, as a minimum. Ideally,
each sampler-motor combination used to obtain routine monitoring data
during the year should be audited. Normally, because of motor brush wear,
motors or brushes are replaced approximately 3 times per year, with a
once-every-sixth-day schedule.
15. Q. In addition to the audit levels specified in the regulation,
an agency desires, for its own purposes, to audit at other levels. Are the
audit results at these other levels to be reported to EPA?
A. No. Only the results at the levels specified in the regula-
tion are to be reported to EPA.
16. Q. How are the precision and accuracy data applicable to moni-
toring data with respect to compliance to the ambient air standards?
A. The precision and accuracy data (probability limits) re-
ported to EPA cannot be used directly in relation to compliance with air
quality standards or to attainment/non-attainment determinations, for
several reasons. First, compliance with standards is determined from
site-specific information; any consideration of precision and accuracy
data would be limited to the specific sites and time periods involved,
and such data would be available only at the local agency.

Further, to be relatable to a standard, any precision and
accuracy data would have to be appropriately transformed to (a) the same
time-averaging basis as that of the air quality standard and (b) the
same pollutant concentration level as that of the standard for the
measurements obtained.
In the determination of attainment/non-attainment, other
related information may need to be considered, such as
a. Time series history and continuity of the pollut-
ant measurements at the site(s) involved
b. Aggregate frequency distribution of the pollutant
measurements on the same time-averaging basis as
the standards at the site(s) involved
c. Meteorology
d. Frequency of non-compliance to standards
e. Magnitude of exceedance of the monitoring data
above the standard.
The requirements for quality assurance data on precision and
accuracy were not established for purposes of relating the information to
standards or to attainment/non-attainment, but rather to obtain some
measure of data quality and to improve the data quality from the nation's
monitoring networks, where indicated by the precision and accuracy data.
Until appropriate statistical procedures are developed, the
probability limits on precision and accuracy cannot be used directly in
relation to meeting standards or determining attainment/non-attainment.
The probability limits for precision and accuracy are not confidence
limits on true pollutant concentration levels.
17. Q. Section 5.2 of Appendix A of the regulation specifies that
"simple unweighted arithmetic averages of the probability limits for
precision and accuracy from the four quarterly periods of the calendar
year" be computed and reported with the annual SLAMS report. Why are
the results of the precision checks not weighted using formulas 4a and 5a
of section 4.1.1 (b) as is required for quarterly reporting? Also, why
are the results of the accuracy audits not weighted?
A. The major reason for computing the annual limits in this way
was the simplicity of the computations--aimed particularly at agencies
without sophisticated computing capability.
For annual precision limits, it may be more statistically
correct to compute the limits weighting the results from each site or
instrument for the entire year by the number of precision checks made on
each site or instrument by using formulas 4a and 5a. However, it is doubt-
ful that, from a practical standpoint, there will be appreciable differ-
ences between the limits calcualted by the more complicated weighted
procedure and the limits calculated by simple averaging.
For the more complicated weighted procedure to be more
correct statistically, it must be assumed that the ratios of the number of
precision checks at given sites or instruments during the year to the
number of ambient pollutant data values reported from those sites or
instruments during the year are essentially the same.
For the annual accuracy probability limits, it would be most
correct statistically to compute the limits by using the results of all
audit checks made during the year whether or not multiple audits have been
made for a given instrument or site during the year. The computations
would be made using equations 8 and 9 (not equations 4a and 5a) using all
of the audit data for the year. Here again, it is doubtful that the
results will be, from practical standpoints, appreciably different from
those using the simple averaging method. The results of accuracy checks
are aimed primarily at measuring the correctness of the calibration system
used throughout the reporting organization, even though during a given
quarter, only about one-fourth of the instruments or sites may be audited.
If multiple audits are made during a given quarter (as some reporting
organizations may desire to do), presumably the multiple audits of the same
instrument or site would not be performed during the same month; there-
fore, the multiple audits would represent different audits in time of
the reporting organization's calibration system.
Therefore, each audit is considered as a separate audit of
the calibration system, whether or not multiple audits of the same site or
instrument are performed during any given quarter, or during the year.
In effect, the sites or instruments which have been audited
more than once are given additional weight by including the individual
results of each audit.
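For concreteness, the simple method specified in Section 5.2 of Appendix A
amounts to nothing more than the following (the quarterly limits below are
hypothetical values, in percent):

    # Annual limits as unweighted arithmetic averages of the four
    # quarterly probability limits.
    quarterly_lower = [-5.2, -4.8, -6.1, -5.5]
    quarterly_upper = [4.0, 4.6, 5.3, 4.1]

    annual_lower = sum(quarterly_lower) / 4.0
    annual_upper = sum(quarterly_upper) / 4.0
    print(annual_lower, annual_upper)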
Manual Methods
18. Q. For results of the manual methods, it is stated that at
least 25 percent of the high-volume and Pb samplers (Section 3.2.2 (a) of
Appendix A) be audited each quarter. Is it required that the duplicate
sampler in the case of collocated sites be included in the accuracy audits?
A. No. It is not required that the duplicate sampler be
audited. Since the designated sampler will be audited, paired data from
the two samplers are available from the precision check for accuracy
comparison. However, the local agency might consider it desirable to audit
the duplicate sampler.
19. Q. Section 2.3.3, Appendix A, 40 CFR Part 58, states that flow
measurement equipment must be traceable to an authoritative volume or other
standard. Will the use of Class A volumetric glassware satisfy this
requirement?
A. Yes, if it is of sufficient size. For example, when
checking a wet test meter having a 1 liter per revolution dial, a 2 liter
(or larger) volumetric flask should be used. Other means of satisfactory
traceability are (a) commercial NBS-traceable bubble meters, or (b) mass
flowmeters calibrated with wet or dry test meters which are NBS-traceable.
20. Q. How are flow audits conducted for the TSP method when flow
controllers are used? How are the results calculated?
A. A filter is used with the orifice plate or reference flow
device. The flow rate determined by use of the flow transfer standard is
considered the true value (the X value), and the observed level (the Y
value) is the flow rate indicated by the sampler's flow indicator or the
flow rate setting (assumed flow rate) of the flow control device.
21. Q. Can the collocated high-volume samplers use the same timers
for automatic start and stop?
A. Ideally, they should be operated as independently as
possible to involve all of the variables in the measurement process. Some
Regions require that the collocated samplers be independent to the degree
that they are plugged into different electrical outlets. However, since
timer variation should be small, a common timer may be permissible by the
Regional office or permit-granting authority.
22. Q. In some cases, for PSD sampling, as many as 4 (or more)
high-volume samplers may be used with automatic start-stop timers at a
given site in order that the samplers may operate unattended for 4 (or
more) successive days. Must the collocated (duplicate) sampler always be
used with the same designated sampler?
A. No.
23. Q. Must the duplicate sampler, in such a case, always be the
same sampler?
A. No, but it would be desirable. If different duplicate
samplers are used during a report quarter, the results should be examined
separately for each duplicate sampler.
24. Q. Can the collocated bubbler samplers use the same vacuum
pump?
A. Ideally, they should have separate vacuum pumps and be
independent to the degree that they are plugged into different electrical
outlets. However, from practical considerations, a common vacuum pump may
be permissible by the Regional office.
25. Q. For manual methods, Form 1 (back) requires the number of
samplers of each type which are operational in the network to be reported
in Blocks 15-17. Are the duplicate samplers for each collocated site
included?
A. No, the duplicate samplers are not counted. The intent
is to obtain the number of SLAMS sites at which the manual samplers
(reference, equivalent, or approved methods) are being operated.
26. Q. Can the sites selected for collocation be changed at any
time?
A. Yes, although it would be best to change only at the
beginning of a calendar quarter or a calendar year. If a change is made
within a calendar quarter, the new site, or sites, shall be treated
separately for calculation and reporting purposes. Thus, if one of two
collocated sites is changed during a quarter, the results shall be treated
and calculated as three separate sites.
For local quality control purposes, different biases (and
variabilities) might be expected at the new site compared to the old site.
Therefore, the results should not be combined when calculating the averages
and standard deviations for the quarter. Similarly, if quality control
charts are maintained--as they should be--at the local agency, it may be
necessary to establish new control limits for the results from the new site.
27. Q. In the case of collocated samplers, if either the designated
or duplicate sampler gives results that are below the minimum detection
limits, should the precision data be reported?
The minimum detection limits are:
    TSP    1 µg/m3
    NO2    15 µg/m3 (TGS-ANSA)
           9 µg/m3 (sodium arsenite)
           9 µg/m3 (sodium arsenite with Technicon II)
    SO2    25 µg/m3 (for a 30-liter sample in 10 ml of TCM absorbent)
    Pb     0.07 µg/m3
A. No, the precision data must not be reported if either the
designated sampler result or the duplicate sampler result is below the
detection limit. Also note that if a pair of values is not reported,
it is not to be counted in the "No. of valid collocated data pairs" entered
in blocks 58-60, Form 1 (back). Further, the entry for blocks 20-23,
"No. of collocated samplers < limit," must include only those readings from
the designated sampler that have been used in the computation for
precision. Note that the limits given on Form 1 in the block and applic-
able to data blocks 20-23 are not the detection limits.
The determination of the 95 percent probability limits for
precision in no way changes the reporting requirements of SAROAD. All
data, regardless of concentration, shall continue to be reported in the
standard manner.
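The screening rule in this answer can be illustrated as follows (TSP
example; the data pairs are hypothetical):

    # A collocated pair is excluded from the precision computations if
    # either the designated or the duplicate result is below the
    # detection limit (1 ug/m3 for TSP).
    DETECTION_LIMIT_TSP = 1.0  # ug/m3

    pairs = [(35.2, 33.9), (0.8, 1.4), (58.1, 60.3)]  # (designated, duplicate)
    valid_pairs = [(x, y) for x, y in pairs
                   if x >= DETECTION_LIMIT_TSP and y >= DETECTION_LIMIT_TSP]
    print(len(valid_pairs), "valid collocated data pairs")  # -> 2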
28. Q. If past precision results from collocated sites are used
to establish data validation limits,* and the precision results for a
given day exceed the established limits, should the precision data be
reported?
A. If the established data validation limits are exceeded
because of excessive lack of agreement between the results of the
collocated samplers, and the result from the designated sampler is
therefore not reported (i.e., the value has been invalidated), then the
results must not be included in the computations for reporting precision.
The monitoring data from both the designated sampler and the duplicate
sampler should be invalidated, i.e., neither should be reported as routine
monitoring data.
In some instances, the excessive difference may be due to a
known cause affecting only the duplicate (or designated) sampler results,
in which case the result from the designated (or duplicate) sampler may
be reported as monitoring data.

*NOTE: Any such data validation limits should be specified in the agency's
documented quality control program, subject to approval by the EPA Regional
Office. Further, the occurrence of an excessive lack of agreement should
raise questions concerning the validity of data acquired previously, and
should initiate some corrective action investigation.
29. Q. Is a high-volume sampler with an automatic flow controller
a reference or equivalent method?
A. It is considered as a reference method.
30. Q. For collocated high-volume samplers, how should the roofs
of the two instruments be oriented?
A. The high-volume reference method does not restrict or
specify the orientation of the ridge of the roof with respect to compass
direction, the direction of the predominant wind, or any other reference.
Unless the Region, State, or local agency has stipulated some requirement
on roof orientation, as for any individual high-volume sampler, the roof
orientations of the collocated samplers should not be restricted. The roof
orientations should occur in whatever (random) direction results from the
installation. In other words, roof orientation variability is a part of
the method variability, and the two collocated samplers should not be made
more alike than they would be if they were installed separately as
individual samplers without regard to each other.
31. Q. Will EMSL/RTP supply excess bubbler solutions (QC reference
samples used in EPA performance audits) for use as audit materials?
A. Yes, but only if extra (spare) solutions are available.
However, it is not EMSL's practice to procure a large number of excess QC
reference samples for this purpose. Further, the concentrations may not be
in the ranges required by the regulations.
32. Q. (a) When monitoring for lead in particulate matter, can the
flow audits for TSP automatically be used as the flow audit data for lead?
As an example, suppose a network has a total of 16 hi-vols, 12 of which are
used for TSP and 4 of which are used for lead. Must one of the 4 hi-vols
used to monitor lead be audited each quarter? Or could the 4 hi-vols used
to monitor lead be grouped with the 12 hi-vols used to monitor TSP, so that
each quarter, 4 of the total hi-vols are audited, the ones used for lead
being audited randomly within the year?
(b) In some cases, the same hi-vol sampler may be used for
both TSP and lead. Does this mean that such samplers must be audited at
least twice each year, once for TSP and once for lead?
A. (a) According to Section 3.2.2(d) of the Pb regulation (2),
the Pb sites are to be grouped with the TSP sites. The sites audited
should be selected at random from the entire group of sites, such that 25
percent of the sites are audited each quarter. The results obtained from
these audits will be reported as for TSP but will be considered as
representative measures of the precision of flow measurements for both TSP
and Pb.
(b) No. If a sampler is used for both TSP and Pb, it need
be considered only as a single site for auditing purposes.
33. Q. Since the requirements of the regulation are considered as
minimum requirements, some agencies may decide, or some Regional Offices
may require, the use of collocated samplers to estimate precision for Pb.
If so, how will this fact be indicated?
A. It should be indicated by entering a written note on Form 1
beneath data blocks 24-29 (for Pb) stating "dup. samplers."
34. Q. Signed percentage differences of the results from collocated
samplers are used for precision estimates for N02, S02, and TSP. How are
the signs of the percentage differences for Pb results to be assigned?
A. For duplicate strips, one of the strips, by its location
within the filter, can be considered the "designated" value and the
other, the duplicate. Once the designations are made, the same designa-
tions should continue to be used.
In the case of duplicate analyses of the extract from a
single strip, the first analysis should be considered as the "designated"
value, and the second, as the duplicate.
35. Q. For lead monitoring, some states and local agencies prepare
and analyze composite samples formed by combining strips from a number of
filters. In such cases how should the precision data be obtained?
A. The recommended procedure would be to prepare two separate
composites and analyze each independently. The signed percentage differ-
ences would be obtained by subtracting the first analysis result from
the second and dividing by the first value. The first analysis value would
be considered as the designated and reportable value. Equations 10 and 11
of section 4.2.1(b), Appendix A, would be used in computing the appropriate
probability limits. If such a compositing procedure is used, that fact
should be indicated by entering a written note on Form 1 beneath data blocks
24 - 29 (for Pb) stating "duplicate composites".
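A one-line illustration of this computation, with hypothetical analysis
results:

    # Signed percentage difference for duplicate composites: second
    # analysis minus first, divided by the first (designated) value.
    first, second = 1.32, 1.27   # hypothetical Pb composite results, ug/m3
    d = 100.0 * (second - first) / first
    print(d)  # about -3.8 percent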
36. Q. Some agencies have automated analyzers (O3, SO2, NO2) with
ranges as high as 5 ppm, and some use ranges as low as 0.5 ppm. At what
levels should the precision checks and the accuracy audits be performed?

A. The levels of the precision checks and the accuracy audits
must conform to the levels specified in the regulation. Therefore, if the
range of an O3, SO2, or NO2 analyzer equals or exceeds 0.08 ppm (8 ppm for
CO), the precision check can be performed within the specified 0.08 to
0.10 ppm level (8 to 10 ppm for CO).
It is assumed that most SLAMS sites for routine monitoring will
have O3, SO2, and NO2 analyzers with a full-scale range of 0.50 to 1.0 ppm
(50-100 ppm for CO). It is further assumed that analyzers will be cali-
brated over their range (or ranges, if equipped with a range selector
switch) of intended use. For example, if an SO2 analyzer equipped with a
range selector switch is operated with a 0.50 ppm range for routine monitor-
ing and a 1.00 ppm range for episode monitoring and is intended to be used
at both range settings, then the analyzer should be calibrated separately
on each range setting. In this case, the analyzer is audited at three
levels (0.03-0.08 ppm, 0.15-0.20 ppm, and 0.35-0.45 ppm) on the 0.50 ppm
range setting and at four levels (0.03-0.08 ppm, 0.15-0.20 ppm, 0.35-0.45
ppm, and 0.80-0.90 ppm) on the 1.00 ppm range setting. Thus, it will be
audited at seven conditions, and the results will be reported accordingly.
If the range for O3, SO2, or NO2 exceeds 0.90 ppm (even as high as 5 ppm
or 10 ppm), or if the range for CO exceeds 90 ppm, then the audits are
required at the four levels specified. No audits are required to be
reported at levels higher than 0.90 ppm for O3, SO2, or NO2, or higher
than 90 ppm for CO. However, it would seem reasonable for the local
agency, for its own internal quality assurance, to calibrate its high-range
instruments and to perform audits at the higher levels of expected
concentrations.
Automated Methods
37. Q. For automated analyzers, Form 1 (front) requires the
number of analyzers of each type which are operational in the network to
be reported in Blocks 15-17. If the state or local agency is operating
a non-reference, non-equivalent, or non-approved analyzer at a special
purpose site, should the results of this analyzer be reported?
A. No. The report applies only to SLAMS sites which,
beginning on January 1, 1981, must use reference, equivalent, or approved
methods. If, for some reason, a state or local agency is permitted by the
Region to use a non-reference, non-equivalent, or non-approved analyzer
at a SLAMS site, it should then be included. The same rules would apply
to manual methods as well.
38. Q. When performing precision checks or performance audits for
automated analyzers, the average output should be obtained over what time
period of equilibrium response?
A. The time to reach equilibrium conditions and the time
period over which the response should be averaged will depend upon the
instrument and level of the standard being used. As a rule, it should be
the same as is used to obtain calibration data points at the same levels.
39. Q. In some cases, due to instrument replacement or scheduled
start-up at a given site, only one precision check may have been made
on an automated instrument during the quarter. In such cases, the
standard deviation is zero for that instrument. Should the zero be
included in the calculation of the pooled standard deviation, S_a? How is
the precision result reported?
A. It can be handled in one of two ways.
1. If precision checks are made on the instrument during
the succeeding quarter, the single result can be held
over and combined (calculated and reported) with the
results of the succeeding quarter.
2. If, for some reason, no precision checks are performed
or planned to be performed during the succeeding
quarter, the single value could be included in the
calculation of the average, D, but not included in the
calculation of the pooled standard deviation, S_a.
40. Q. Auditing must be conducted with a different standard than
that used for routine multipoint calibration. What relationship must
exist between the standard used for accuracy auditing and the standard
used for routine multipoint calibration?
A. The general rule is: The working standard used for the
accuracy audit must be different from the working standards used for cali-
bration, but both may be certified (referenced) against the same NBS SRM
or CRM. A protocol for certifying the working calibration or audit
standard against an SRM or CRM is given in Section 2.0.7 in Reference 5.
(CRM's were authorized for use as traceable standards by amendments to
40 CFR Parts 50 and 58 on January 20, 1983 (48 FR 2528-2530)).
41. Q. Can the NBS SRM traceability requirement be met if I use an
NBS "traceable" cylinder gas or permeation tube from a commercial supplier?
A. Yes, but caution should be exercised and complete certifi-
cation documentation should be received with the NBS-traceable items.
42. Q. The regulations state that "Direct use of an NBS SRM as a
working standard is not prohibited but is discouraged because of their
limited supply and expense." Should NBS SRM's be used for the accuracy
audits?
A. No. As stated in the regulations, NBS SRM's are in limited
supply, and should be used only sparingly as references to which working
calibration standards (and accuracy audit standards) are assessed. CRM's
may also be used in lieu of NBS SRM's; see answer to previous question.
43. Q. In some networks, the output from automated analyzers is fed
into a data logger or minicomputer or transmitted by telemetry to a central
station or computer. When conducting a performance audit, at what point in
the total measurement system should the observed or measured value be
obtained?
A. The observed or measured value should be obtained at the
same point and in the same manner that routine monitoring data are
obtained. In other words, the performance audit should be an audit of the
entire routine measurement system—not just the analyzer. (The same
conditions should apply for the calibration process.) If the normal output
of an analyzer is measured and reduced by a computer in another location,
the performance audit result (and calibration data) should be obtained in
the same way. Since there is usually a major concern for the analyzer,
however, it would be good practice to also check its output with a (digital)
voltmeter or recorder. It is possible that the analyzer could be
functioning perfectly but the rest of the system could be malfunctioning,
or vice versa.
44. Q. Some automatic instruments have daily (i.e., every 24 hours)
automatic injections for zero. In the normal reduction of data, the data
between successive automatic zero injection checks are corrected by the
average of the "before" and "after" zero drifts. Since the regulations
state that the precision check should be made prior to any adjustment,
what should be the procedure for calculating the result of the precision
check for this type of instrument?
A. The procedure is the same as that used for routine ambient
measurements: the precision check reading should be processed exactly the
same as it would be if it were an ordinary ambient reading. If the zero
injections occur on a fixed schedule, then, to the extent possible, the
precision checks should be randomly timed, i.e., at various times of day or
at various times with respect to the automatic cycle. The same would also
apply to automatic span cycles. See also the answer to the next question.
In either case, do not make any adjustments to the instrument until after
the above checks have been performed.
45. Q. Some instruments (Beckman 866) have built-in automatic
electronic stabilizers which readjust zero and span every 8 hours based on
the automatically performed zero and span checks. When should precision
checks and accuracy audits be performed?
A. Ideally, the precision checks and accuracy audits should be
performed at random times between instrument adjustments. If the schedule
of the automatic adjustments is known to those performing the precision
checks and accuracy audits, the precision checks and accuracy audits should
be scheduled at random times between the adjustments. If the schedule of
the automatic adjustments is not known to these persons, then the precision
checks and accuracy audits could be performed at any time of the day.
(See also the answer to the previous question.)
46. Q. Some ozone analyzers are operated only six months of the
year. How shall precision and accuracy data be obtained and reported?
A. Precision checks will be made on all operating monitoring
instruments at the minimum biweekly frequency specified; no special con-
siderations need to be made, except in the unlikely case where only a
single precision check is made in a quarter. Because the standard
deviation cannot be computed from the single value, the result of the
precision check should be held and combined with data for the previous or
subsequent quarter, as appropriate. Therefore, if no precision check (or
only one precision check) is made in a given calendar quarter, no precision
data will be reported for that quarter.

Because the purpose of obtaining the precision data is to
relate it to scheduled monitoring data, there would be no purpose in ob-
taining precision data when no monitoring data are being obtained.
The regulations require that all operating analyzers be
audited during the year. If ozone analyzers are operated only six months
(e.g., two quarters), then 50 percent of the analyzers must be audited
each quarter. If the six-month period covers more than two quarters
(e.g., May through October), some analyzers could be scheduled for audit
in each of the three quarters. If only one audit is performed in a given
quarter, the audit results should be reported with audits of the previous
or following quarter, since a standard deviation cannot be calculated for
a single value.
47. Q. Philips SO2 instruments require a new calibration whenever
the reagent is changed--approximately every 90 days. When should accuracy
audits be performed?
A. Accuracy audits (and precision checks) should be performed
at random times between multipoint calibrations or other adjustments of the
analyzer. Consequently, neither accuracy audits nor precision checks
should be performed immediately after such calibrations or adjustments. To
minimize costs (i.e., to eliminate extra trips to sites), accuracy audits
and precision checks could be performed just before such calibrations or
adjustments. Since the Philips SO2 instrument may have automatic
daily zero and span checks and adjustments, the precision checks and
accuracy audits should be made at random times between such automatic
adjustments (or immediately before such adjustments).
11. REFERENCES
1. Code of Federal Regulations, Title 40, Part 58, "Ambient Air
Quality Surveillance," promulgated May 10, 1979 (44 FR 27571).

2. Code of Federal Regulations, Title 40, Part 58, as amended
September 3, 1981 (46 FR 44159-44172).

3. Guideline for Lead Monitoring in the Vicinity of Point Sources,
EPA-450/4-81-006, January 1981.

4. Summary of Audit Performance: Measurement of SO2, NO2, Sulfate,
Nitrate, Lead, and Hi-Vol Flow Rate--1978, EPA-600/4-80-017,
Environmental Monitoring Systems Laboratory, Research Triangle
Park, NC, June 1980.

5. Quality Assurance Handbook for Air Pollution Measurement Systems,
Volume II, Ambient Air Specific Methods, EPA-600/4-77-027a, U.S.
Environmental Protection Agency, Research Triangle Park, NC,
June 1977.

6. "Ambient Monitoring Guidelines for Prevention of Significant
Deterioration (PSD)," EPA-450/2-78-019, May 1978.

7. Implementation of Air Quality Monitoring Regulations, color video
tapes, 4 hours (condensed version of workshops conducted at each
Regional Office, April-June 1979, on Section 319 of the 1977
Amendments to the Clean Air Act and 40 CFR Part 58). On free loan
from the USEPA Air Pollution Training Institute, Research Triangle
Park, North Carolina.

8. "A Procedure for Establishing Traceability of Gas Mixtures to
Certain National Bureau of Standards SRMs," EPA-600/7-81-010,
U.S. Environmental Protection Agency, Research Triangle Park,
North Carolina 27711, May 1981.