United States Environmental Protection Agency
Air Pollution Training Institute (MD-20)
Environmental Research Center
Research Triangle Park, NC 27711
EPA 450/2-81-016
May 1984 (Revised December)

APTI Course 470
Quality Assurance
for Air Pollution
Measurement Systems

Student Workbook
Second Edition

-------

Prepared by:
B. Michael Ray

Northrop Services, Inc.
P. O. Box 12313
Research Triangle Park, NC 27709

Under Contract No.
68-02-3573
EPA Project Officer
R. E. Townsend

United States Environmental Protection Agency
Office of Air and Radiation
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711

-------
Notice
This is not an official policy and standards document. The opinions and selections
are those of the authors and not necessarily those of the Environmental Protection
Agency. Every attempt has been made to represent the present state of the art as
well as subject areas still under evaluation. Any mention of products or organizations
does not constitute endorsement by the United States Environmental Protection
Agency.
ii

-------
Table of Contents
Lesson                                                                        Page
Registration, Course Information, and Pretest ................................ 1-1
Basic Areas of Quality Assurance Activities .................................. 2-1
Managerial Quality Assurance Elements for Establishing a Quality
  Assurance Program and Recording Changes .................................... 3-1
Review of Precourse Problem 3 ............................................... 3A-1
Basic Concepts of Statistical Control Charts ................................. 4-1
X̄-R Statistical Control Charts ............................................... 5-1
The Measurement Process with Emphasis on Calibration ......................... 6-1
Group Problem ............................................................... 6A-1
Review of Control Chart Homework ............................................ 6B-1
Regression Analysis and Control Charts for Calibration Data .................. 7-1
Review of Precourse Problems 1 and 2 ........................................ 7A-1
Identification and Treatment of Outliers ..................................... 8-1
Intralaboratory Testing ...................................................... 9-1
Interlaboratory Testing ..................................................... 10-1
Procurement Quality Control ................................................. 11-1
Performance Audits .......................................................... 12-1
System Audits ............................................................... 13-1
Quality Assurance Requirements for SLAMS and PSD ............................ 14-1
Precision Work Session ..................................................... 14A-1
Data Validation ............................................................. 15-1
Quality Costs ............................................................... 16-1
iii

-------
Section 1
Registration, Course Information,
and Pretest
Lesson Goal
To familiarize you with the course structure and objectives, to have you meet
instructors and fellow students, to conduct the pretest, to present pertinent logistical
information, and to obtain registration information.
Lesson Objectives

At the end of this lesson, you should know-
1. the course goal and objectives,
2. the requirements for passing the course,
3. the nature and use of class materials,
4. the teaching method used in the course,
5. the name of the organization conducting the course and any other con-
tributing organization,
6. the source of the course materials and any similar information,
7. the names of all instructors and their affiliations,
8. the location of emergency exits, restrooms, telephones, refreshments,
restaurants, and transportation facilities,
9. the phone number where you may receive messages during the course
offering,
10. where to obtain the name and employer of each student in the class, and
11. the address and phone number of the USEPA Air Pollution Training
Institute.
1-1

-------
Notes
1-2

-------
Course Goal
The goal of this course is to train you in quality assurance principles and techniques
to the extent that you will understand the usefulness of them and be able to apply
them in the development and implementation of a comprehensive quality assurance
program for air pollution measurement systems.
Course Objectives

At the conclusion of this course, you will be able to coordinate the design of a com-
prehensive quality assurance plan for an air pollution measurement system.
Specifically, you will be able to-
1. develop an organizational plan for quality assurance, including the develop-
ment of an organization chart indicating those positions with major quality
assurance responsibilities, the delineation of the quality assurance respon-
sibilities for key personnel, and the development of an implementation
schedule in terms of the various elements of quality assurance,
2. formulate a quality assurance policy for an air pollution monitoring
organization,
3. develop objectives for a measurement process in terms of completeness, preci-
sion, accuracy, representativeness, and comparability,
4. describe the principles that should be considered in preparing quality reports
to management and the quality facts that should be reported,
5. describe the kinds of training that are available to develop and maintain per-
sonnel at the level of knowledge and skill required to perform their jobs,
6. design a reporting format for quality costs that allocates quality-related
activities into cost categories,
7. compare and contrast a quality assurance program plan and a quality
assurance project plan in terms of their components (elements) and functions,
8. explain the importance of establishing a closed-loop corrective-action system,
9. explain the purposes for and describe how a basic document control system
and a basic configuration control system should be established,
10. list the factors that should be considered in designing a preventive
maintenance program,
11. describe the mechanisms that can be used to ensure the quality of procured
items,
12. define the two kinds of audits recommended by USEPA and describe the steps
and factors that must be considered in the design of each,
13. describe the kinds of quality control checks that should be performed on
sample collection and analysis systems (manual and continuous) and what
statistical analyses and records should be maintained,
14. describe the purposes of both intralaboratory and interlaboratory testing pro-
grams, the factors that must be considered in establishing the programs, and
the methods of analyzing and reporting results of each program,
15. develop calibration programs incorporating the elements recommended in the
EPA Quality Assurance Handbook, Volume I,
1-3

-------
16. select the appropriate kinds of control charts to be used to control measure-
ment systems, calculate control limits for them, and interpret plotted results,
17. outline the basic elements of a data qualification scheme for estimating
accuracy and precision, select the appropriate statistical techniques to be
used, and calculate estimates of precision and accuracy, and
18. explain the importance of timely data validation and, using appropriate
techniques, develop a data validation scheme for a given air pollution
monitoring system.
1-4

-------
Name
Date
Course 470
Quality Assurance for Air Pollution Measurement Systems
Pretest
. This test is designed to measure your present knowledge of quality assurance for
air pollution measurement systems.
. It is intended to be a closed-book test. Do not use your notes or books. You may
use a calculator.
. You will have 30 minutes to complete the test.
. On the answer sheet, circle the letter that corresponds to the best answer to each
question. Each question has only one "best" answer. Each correct answer is worth
five points.
. The results of this test will not affect your final course grade.
1. Activities involving the use of standard reference materials would fall under
which basic area of quality assurance?
a. Management
b. Measurement
c. Statistics
d. Systems

2. In addition to having good precision and accuracy, data should be
a. complete
b. representative
c. comparable
d. all of the above

3. The principal objective(s) of quality assurance programs for SLAMS and PSD
air monitoring is(are) to (?).
a. provide data of adequate quality to meet monitoring objectives
b. increase the number of quality assurance coordinators
c. minimize loss of air quality data
d. both a and b, above
e. both a and c, above
4. A control chart shows (?)
a. how a process is behaving
b. how a process should behave
c. when action should be taken to make a process behave as it should
d. all of the above
3/84
1-5

-------
5. A check sample was analyzed each day for three months. A review of the results
indicated that they were normally distributed and that 95% of the results fell
between 18.0 and 24.0 ppm. Establish an upper control limit such that only 13
analyses in 10,000 should exceed the limit. The limit is (?) ppm.
a. 21.0
b. 24.0
c. 25.5
d. 27.0
6. One criterion for data entering USEPA's National Aerometric Data Bank is that
it must have been acquired by application of standard methodologies and
reported in consistent units. That is, the data must be (?).
a. complete
b. accurate
c. comparable
d. precise

7. A specific preventive maintenance schedule should relate to the
a. purpose of monitoring
b. physical location of analyzers
c. level of operator skills
d. all of the above

8. Independent checks made by a supervisor or auditor to quantitatively evaluate
the quality of data produced by a measurement system are called (?)
audits.
a. performance
b. system
c. calibration
d. analysis

9. An on-site inspection and review to qualitatively evaluate the quality assurance
system used for the total measurement system is called a (?) audit.
a. performance
b. system
c. calibration
d. analysis
10. A quality assurance (?) contains general quality assurance requirements
and information for an organization.
a. narrative statement
b. project plan
c. program plan

11. Results of (?) are used to assess the accuracy of SLAMS and PSD air
monitoring data.
a. routine operational checks
b. collocated sampling
c. audits
d. all of the above
1-6

-------
12. (?) measurements are audited in the USEPA's interlaboratory
performance audit program.
a. Ambient air
b. Source emission
c. both a and b, above

13. In a quality report, costs should be reported in terms of (?) costs.
a. prevention
b. appraisal
c. failure
d. all of the above
14. Quality reports should (?)
a. be obtained from source documents
b. have a baseline for comparison and be easy to interpret
c. present data in summary form
d. all of the above
15. During a performance audit, the (?) should convert analyzer responses
to pollutant concentrations.
a. auditor
b. analyzer operator
c. analyzer operator's supervisor

16. The process of establishing the relationship between the output of a measure-
ment process and a known input is (?) .
a. determination of efficiency
b. ratiocination
c. determination of pollutant concentration
d. calibration

17. In selecting a cylinder gas to be used in a performance audit, one option is to
select a high concentration cylinder and use a dilution system. The advantage of
this option is (?) .
a. high concentrations have better stability
b. calibration errors are small
c. the gas concentration can be analyzed before and after the audit
d. cylinders can be traced to standards of higher accuracy

18. An out-of-control condition is indicated on a quality control chart
when (?) .
a. one point is outside the 3-standard deviation limits
b. two consecutive points are outside the 2-standard deviation limits
c. three consecutive points are above the central line
d. either a or b, above
1-7

-------
19. Quality assurance for a monitoring network is concerned with (?)
a. all factors that affect the quality of data collected
b. only field activities
c. only laboratory activities

20. Routine data validation should include (?)
a. checks for transmittal errors
b. checks for spatial and temporal continuity
c. a complete recomputation of any arithmetic
d. both a and b, above
1-8

-------
Name
Date
Course 470
Quality Assurance for Air Pollution Measurement Systems
Pretest
1. a b c d 
2. a b c d 
3. a b c d e
4. a b c d 
5. a b c d 
6. a b c d 
7. a b c d 
8. a b c d 
9. a b c d 
10. a b c  
11. a b c d 
12. a b c  
13. a b c d 
14. a b c d 
15. a b c  
16. a b c d 
17. a b c d 
18. a b c d 
19. a b c  
20. a b c d 
1-9

-------
Section 2
Basic Areas of Quality
Assurance Activities
Lesson Goal
To familiarize you with the four basic areas of quality assurance and the various
activities that relate to each.
Lesson Objectives

At the end of this lesson, you should be able to-
1. define quality assurance,
2. list the four basic areas of quality assurance: management, measurement,
systems, statistics,
3. recognize specific activities that relate to each basic area,
4. explain the need for quality assurance to be involved with the wide scope of
activities that affect air pollution data quality, and
5. explain the dynamic nature of a quality assurance program; i.e., the need for
continual improvement of the program through planning, implementation,
assessment, and corrective action.
2-1

-------
BASIC AREAS OF QUALITY ASSURANCE

[Slide: the four basic areas - Management, Measurement, Systems, Statistics]

[Slide: quality CONTROL (e.g., calibration) and quality ASSURANCE (e.g., audit of
calibration) together form the quality assurance program that turns air monitoring
data into valid data]
MONITORING SYSTEM
[Slide: "m" variables of a monitoring system paired with related QC activities]
. variables: method, materials, machines, maintenance, men/women, measurement,
  monitoring sites, mathematics, management, meteorology, money
. QC activities: technical procedures, procurement, preventive/corrective
  maintenance, training, calibration procedures, operating procedures, siting,
  conditions, computations, objectives, policies, procedures, quality costs
2-3

-------
[Slide: elements of a QA program (circular diagram)]
MANAGEMENT
. QA policy
. QA objectives
. organization
. training
. quality reports
. corrective action
. QA plans
. quality costs
. quality planning
. audit planning

MEASUREMENT
. pre-test preparation
. calibration (standards traceability)
. sample collection and analysis
. measurement system reliability
. audit procedures
SYSTÈME INTERNATIONAL (SI)
(Metric System)
. meter (m): length
. kilogram (kg): mass
. second (s): time
. kelvin (K): temperature
. mole (mol): amount of substance
. candela (cd): luminous intensity
. ampere (A): electric current
2-4

-------
METROLOGY
. the science of measurement

METROLOGY REFERENCES
(available from National Bureau of Standards)
. Special Publication 300, Volume I, "Precision Measurement and Calibration:
  Statistical Concepts and Procedures"
. Special Publication 408, "Standard Reference Materials and Meaningful
  Measurements"
SYSTEMS
. quality planning
. data handling
. data validation
. performance/system audits
. procurement quality control
. document control
. preventive maintenance
. configuration control
. corrective action
. quality costs

STATISTICS
. control charts
. regression analysis
. outlier tests

THE QA CYCLE
Planning -> Implementation -> Assessment -> Corrective Action -> (back to Planning)
2-5

-------
Attachment B
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON DC 20460
THE ADMINISTRATOR
May 30, 1979
MEMORANDUM
TO:      Deputy Administrator
         Assistant Administrators
         Regional Administrators
         General Counsel
         Director, Science Advisory Board
         Director, Office of Regional and Intergovernmental Operations
SUBJECT:
Environmental Protection Agency (EPA) Quality Assurance
Policy Statement
The EPA must have a comprehensive quality assurance effort to
provide for the generation, storage, and use of environmental data which
are of known quality. Reliable data must be available to answer
questions concerning environmental quality and pollution abatement
and control measures. This can be done only through rigorous
adherence to established quality assurance techniques and practices.
Therefore, I am making participation in the quality assurance effort
mandatory for all EPA supported or required monitoring activities.
An Agency quality assurance policy statement is attached which
gives general descriptions of program responsibilities and basic
management requirements. For the purpose of this policy statement,
monitoring is defined as all environmentally related measurements
which are funded by the EPA or which generate data mandated by the EPA.
A detailed implementation plan for a total Agency quality
assurance program is being developed for issuance at a later date.
A Select Committee for Monitoring, chaired by Dr. Richard Dowd, is
coordinating this effort, and he will be contacting you directly
for your participation and support. I know that each of you shares
my concern about the need to improve our monitoring programs and
data; therefore, I know that you will take the necessary actions
that will ensure the success of this effort.
Douglas M. Costle
Attachment
2-7

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON DC 20460
THE ADMINISTRATOR
November 2, 1981
MEMORANDUM
TO:
Associate Administrators
Assistant Administrators
Regional Administrators
SUBJECT:
Mandatory Quality Assurance Program
One of the major concerns of this administration and myself is that we
support all of our actions and decisions with statistically representative
and scientifically valid measurement of environmental quality. To meet
this objective, it is essential that each of you continue to support and
implement the Agency's mandatory Quality Assurance program which is being
implemented by the Office of Research and Development. It is especially
essential that you assure that the appropriate data quality requirements
are included in all of your extramural and intramural environmental
monitoring activities. I also am particularly concerned that you do not
sacrifice quality for quantity when adjusting your program to meet our new
resource targets.
The attached Second Annual Quality Assurance Report demonstrates the
importance of this program in achieving our goals and objectives.
Recognizing its importance, I have asked Dr. Hernandez to closely monitor
this program's implementation and advise me of any problems that affect
the scientific data bases of the Agency.
Anne M. Gorsuch
Attachment
cc:
Deputy Administrator
Office Directors
2-8

-------
CHALLENGES OF IMPLEMENTING QUALITY
ASSURANCE FOR AIR POLLUTION MONITORING SYSTEMS

Raymond C. Rhodes
Quality Assurance Specialist
S. David Shearer, Jr., Ph.D.
Director
ABSTRACT
Special considerations are necessary in implementing a quality assurance system for air pollu-
tion monitoring. Of particular concern are the following:
(1) Quality characteristics of environmental data.
(2) Network design and sampling.
(3) Measurement methods and standard reference materials.
(4) Statistical quality control.
(5) Data analysis and validation.
(6) Preventive maintenance.

Accuracy, precision, completeness and representativeness are the quality characteristics of air
monitoring data. The physical sampling of the air environment presents a number of unique
and difficult problems. The technology of air pollution measurement has created special
demands for measurement methods and standard reference materials. Because of the
variability patterns of pollution data, and the non-uniform error variability of the measurement
methods, particular types of statistical control and data analysis and data validation are
required. The wide diversity in the scope and requirements of compliance and research
monitoring makes it necessary to develop flexible quality assurance procedures. In spite of the
many difficulties involved, much is being accomplished in implementing quality assurance for
air pollution monitoring systems.
INTRODUCTION
With the increased interest and activity in the environment in recent years, a need exists to
apply the principles and techniques of modern quality assurance to the various pollution
monitoring systems. Pollution measurement methods involve field sampling and chemical
laboratory analyses and, to these portions of the measurement process, most of the traditional
laboratory quality control (Q.C.) techniques apply. Of concern, however, is the need to apply
the general principles and techniques to the entire monitoring system.

The following elements of a quality assurance (Q.A.) system are generally applicable to pollu-
tion monitoring systems:

Elements of a Quality Assurance System

1. Quality Policy
2. Quality Objectives
3. Quality Organization and
Responsibility
4. Quality Assurance Manual
5. Quality Assurance Plans
6. Training
7. Procurement Control
Ordering
Receiving
Feedback and Corrective Action
8. Calibration
Standards
Procedures
9. Internal Q. C. Checks
10. Operations
Sampling
Sample Handling
Analysis
2-9

-------
11. Data
Transmission
Computation
Recording
Validation
12. Preventive
Maintenance
13. Reliability Records
and Analysis
14. Document Control
15. Configuration Control
16. Audits
On-Site System Audits
Performance Audits
17. Corrective Action
18. Statistical Analysis
19. Quality Reporting
20. Quality Investigation
21. Interlaboratory Testing
22. Quality Costs
However, in a number of very important areas, special considerations must be made. These
areas, which require special attention, are:
1. Quality Characteristics of Environmental Data.
2. Network Design and Sampling.
3. Measurement Methods and Standard Reference Materials.
4. Statistical Quality Control.
5. Data Analysis and Validation.
6. Preventive Maintenance.

The ultimate uses of air pollution monitoring information are decisions relative to human
health and welfare. Air pollution monitoring data are used as measures of air quality to make
the best decisions for human health and welfare.

The quality of air is measured by the cleanliness of the air-Are the pollutant concentrations
below the levels established as standards? The quality of air pollution data is measured by the
accuracy, precision, completeness and representativeness of the data.
QUALITY CHARACTERISTICS OF ENVIRONMENTAL DATA

These quality characteristics of data may be defined as follows:
1. Accuracy-The closeness of a measured value to the true value.
2. Precision - The repeatability of the data (actually the repeatability of the measure-
ment system).
3. Completeness-The amount of valid data obtained as a fraction of that
intended or planned to be obtained.
4. Representativeness-The typicalness of the pollution samples with respect
to time, location, and conditions from which the pollutant data are
obtained.
These quality characteristics are not evident, nor can they be determined from examination of
the data itself. Measures of accuracy, precision, completeness, and representativeness must
be obtained from other information. Provision for obtaining measures of these characteristics
must be included in the Quality Plan for each monitoring effort because the relative impor-
tance of accuracy, precision, completeness, and representativeness depends upon the specific
objectives of each monitoring program.
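The four quality characteristics defined above can be estimated from auxiliary data rather than from the monitoring data alone. The sketch below is illustrative only (the function names, the percent-difference formulas, and the sample numbers are this editor's assumptions, not values from the text): completeness as the fraction of planned observations obtained, accuracy as percent difference from a known audit standard, and precision as the spread of signed percent differences between collocated duplicate measurements.

```python
import math

def completeness(valid_count, planned_count):
    """Fraction of planned observations that yielded valid data."""
    return valid_count / planned_count

def accuracy_pct(measured, true_value):
    """Percentage difference of a measured value from a known standard."""
    return 100.0 * (measured - true_value) / true_value

def precision_pct(pairs):
    """Sample standard deviation of signed percent differences between
    collocated (duplicate) measurements, as a spread estimate."""
    d = [100.0 * (x - y) / ((x + y) / 2.0) for x, y in pairs]
    mean = sum(d) / len(d)
    return math.sqrt(sum((v - mean) ** 2 for v in d) / (len(d) - 1))

# Hypothetical numbers: 41 valid daily values out of 45 planned,
# and an audit gas known to be 0.40 ppm read as 0.43 ppm.
print(round(completeness(41, 45), 2))      # 0.91
print(round(accuracy_pct(0.43, 0.40), 1))  # 7.5
```

Representativeness has no comparable formula; as the text notes, it must be judged from the time, location, and conditions of sampling.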
NETWORK DESIGN AND SAMPLING
The monitoring network design, which incorporates decisions with respect to time, location
and conditions of sampling, along with the specification of pollution measurement methods
and equipment, specifies to a large extent the "process" of obtaining monitoring data. Quality
assurance personnel should be involved with the network design for pollution monitoring
2-10

-------
because of the statistical aspects involved, and because of the need to establish the best possi-
ble network at the beginning of a monitoring effort. Changes in monitoring networks can
destroy the previous history or baseline necessary for trend studies.

The process or media being sampled for air pollution measurement is not in statistical control,
but is subject to many effects such as diurnal cycles, day-of-week differences, seasonal cycles,
and local and area meteorological factors. The changing pattern of air pollution is a dynamic
process, sometimes "out of control."(6) The objective of a quality assurance program for air
monitoring is to assure that the measurement system remains "in control," no matter what
the state or condition of the air.

Considerations for temporal and spatial effects in the location and scheduling of pollution
sampling are critical concerns with respect to representativeness.

Planning of the network design and sampling schedules is very important since resampling
in air monitoring is impossible. The air which was at the sampling point a moment ago is no
longer available! Although duplicate sampling is desirable, in air monitoring, duplicate
sampling is not possible for particulates, and is not very practical for gaseous pollutants. The
most satisfactory way of duplicate sampling for quality assurance purposes is to use duplicate
sampling equipment at the same site. Although such dual sampling requires an additional
sampling instrument, this procedure is invaluable in estimating the precision of the total
measurement process.

In most chemical analytical work duplicate analyses are desirable. However, for continuous,
automated pollution analysis instruments, reanalysis is not possible. Reanalysis is possible for
some of the manual methods where bubbler solutions or filter media have been used to col-
lect the pollutants.
MEASUREMENT METHODS AND STANDARD REFERENCE MATERIALS
Most of the manual analytical measurement methods for gaseous pollutants involve bubbling
the air through selective absorbing solutions for an extended period (usually 24 hours) and
then analyzing the solution by wet chemical/absorbance techniques. These methods have the
limitation of providing daily averages only. In the interest of obtaining more accurate
measurements on a short-time basis, numerous automated instrumental methods have been
developed in recent years. Problems with these instruments include the manufacturing and
reliability problems associated with newly-designed equipment, and the technological prob-
lems of measuring minute concentrations (parts per million or parts per billion) in the
presence of possible interferents. Further, problems arise relating to the stability and reliability
of these instruments if operated remotely or unattended. The development of completely
satisfactory measurement methods is a very important effort of quality assurance for air
monitoring. Because of the instability of gaseous mixtures, primary standards (Standard
Reference Materials of the National Bureau of Standards) are difficult to prepare, and must
be prepared and assessed from time to time as required by users. For some gases (for exam-
ple, ozone), no primary standard has yet been developed. Neither has a particulate standard
for particle size or chemical content in a naturally-occurring matrix yet been developed.
Because of the problems in developing and using primary standards for air pollution measure-
ment, the achievement of comparability for accuracy among the various agencies and facilities
within a country is not an easy task and this concern is further magnified when comparability
among different countries is considered. In most other physical measurement areas, com-
parability among nations is relatively easily achieved through traceability to common primary
standards.
2-11

-------
STATISTICAL QUALITY CONTROL
In traditional quality control systems much importance is placed on the establishment of
average and range (X̄, R) control charts to control quality. Averages are obtained from
measurement of a sample from some assumed homogeneous rational subgroup of products.
In this way, the average is used as a measure and means of control of the level of the quality
characteristic, and the range of the measurements is used as a measure and means of control
of variability. Except in the laboratory, batches or rational subgroups seldom exist in pollution
measurement; and even in those cases, replication is accomplished usually on a duplicate
basis only, such as duplicate measures of the same sample, duplicate analyses by different
analysts, or measurements from duplicate collocated sampling instruments. Further, except for
repeated measurements of homogeneous control samples, the averages of the duplicates vary
depending upon the concentration level. Therefore, the X̄ chart is of little value in quality con-
trol for pollution measurements.

Further, in the cases of duplicate data, some identity can usually be associated with each of
the pair of measurements, so that the range is not the best value of interest. Because of
suspected bias between the two sources, signed differences should be used rather than the
unsigned range. Further, since the average levels may vary widely between pairs, and the
error variation is usually proportional to levels, the value of concern is the signed percentage
difference (or signed relative difference). This value is an appropriate parameter to plot on
control charts as a means to control variability of the measurement process.
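The signed-percent-difference chart described above can be sketched in a few lines. This is a minimal illustration, not a prescribed procedure: the baseline data are invented, and setting the control limits at the mean plus or minus three standard deviations of the historical differences is one common Shewhart-chart convention.

```python
import math

def signed_pct_diffs(pairs):
    """Signed percent difference for each (x, y) duplicate pair,
    computed relative to the pair average."""
    return [100.0 * (x - y) / ((x + y) / 2.0) for x, y in pairs]

def control_limits(historical_diffs, k=3.0):
    """Center line and k-sigma control limits from in-control history."""
    n = len(historical_diffs)
    mean = sum(historical_diffs) / n
    s = math.sqrt(sum((d - mean) ** 2 for d in historical_diffs) / (n - 1))
    return mean - k * s, mean, mean + k * s

# Hypothetical collocated-sampler pairs (ppm) from an in-control period:
baseline = [(21.0, 20.4), (18.2, 18.9), (25.1, 24.6), (19.8, 20.3),
            (22.4, 21.9), (17.6, 17.9), (23.3, 23.9), (20.1, 19.6)]
lcl, center, ucl = control_limits(signed_pct_diffs(baseline))

# A new pair is flagged when its signed percent difference
# falls outside the limits:
d_new = signed_pct_diffs([(24.0, 20.0)])[0]
print(lcl <= d_new <= ucl)
```

Note that the sign is kept, as the text recommends: a run of differences all on one side of the center line would reveal a bias between the two samplers that an unsigned range chart would hide.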

Control of the accuracy of the data must be maintained by frequent calibrations with
materials traceable to primary standards. Some type of calibration is usually required on air
pollution measurement systems daily or for each use, and occasionally calibration is necessary
before, during, and after analysis of a given batch of samples. Control charts which may be
maintained to assure that the calibration process remains in statistical control are those for the
slope, intercept, and standard error of prediction for the calibration curves for multipoint
calibrations, and zero and span drift checks to control the drift of continuous instruments.
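The multipoint-calibration statistics named above (slope, intercept, standard error of prediction) can be recomputed at each calibration and each plotted on its own control chart. The least-squares sketch below is illustrative only; the concentrations and responses are invented values, not data from the text.

```python
def calibration_stats(x, y):
    """Least-squares slope, intercept, and standard error of prediction
    for a multipoint calibration (x = known concentration, y = response)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    se = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return slope, intercept, se

# One hypothetical calibration: known concentrations (ppm) and responses.
conc = [0.0, 0.1, 0.2, 0.3, 0.4]
resp = [0.01, 0.105, 0.198, 0.306, 0.398]
slope, intercept, se = calibration_stats(conc, resp)
# Each calibration's slope, intercept, and se would be added as one point
# on three separate control charts to verify the calibration process
# remains in statistical control over time.
```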
DATA ANALYSIS AND VALIDATION
A number of special considerations exist in air pollution measurement systems with respect to
data analysis and data validation. For most air pollution measurements, the error variations
are proportional to the pollutant concentration level, thus complicating error analysis of the
measurement system.

The aggregate frequency distributions of air pollution data are skewed, often lognormal or
nearly so, requiring logarithmic or other transformations when summarizing or analyzing data
distributions.(1,3) Complications arise when taking logarithms of zero values! Also, special
treatment of data below the minimum detectable levels may be required in the characteriza-
tion or summarization of air pollution data.
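As a concrete illustration of the transformation problem just noted, a geometric-mean summary of roughly lognormal data must do something with zero or below-detection values before taking logarithms. Substituting half the minimum detectable level is one common convention; it is an assumption of this sketch, not a rule prescribed by the text, and the MDL and data values are invented.

```python
import math

MDL = 0.5  # assumed minimum detectable level, ppm (illustrative)

def geometric_mean(values, mdl=MDL):
    """Geometric mean of skewed (roughly lognormal) pollutant data.
    Zeros and below-detection values are replaced by mdl/2 so the
    logarithm is defined; this substitution is a convention, not a rule."""
    adjusted = [v if v >= mdl else mdl / 2.0 for v in values]
    return math.exp(sum(math.log(v) for v in adjusted) / len(adjusted))

daily = [0.0, 1.2, 3.5, 0.3, 8.9, 2.1]  # ppm; includes a zero and a sub-MDL value
print(round(geometric_mean(daily), 2))
```

The choice of substitution value can noticeably shift the summary statistic, which is one reason the text calls for special treatment of below-detection data rather than silent deletion.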

Because of the many possible causes of variability in air pollution data, the data validation
process as a separate activity is very important in air monitoring.(5) Since the quality of the
data is not evident from the data itself, the routine checks of ancillary data for accuracy and
precision must be made. Some further checks of the data with relation to other data or infor-
mation may be made to validate the final product. Various types of checks which can and
should be made include:

Manual Editing: checks for human error or equipment malfunction, such as:
1. impossibly high .or lew values,
2. spikes, such as caused by electrenic interference, and
3. repetitieus values, such as caused by equipment malfunctien.
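The three manual-editing checks listed above can be automated; the thresholds in this sketch (range limits, spike ratio, repeat count) are illustrative assumptions, not prescribed values.

```python
# Automated versions of the manual-editing checks: impossibly high or
# low values, isolated spikes, and repetitious values.

def screen(values, low=0.0, high=500.0, spike_ratio=5.0, max_repeats=3):
    """Return a list of (index, reason) flags for suspect data points."""
    flags = []
    for i, v in enumerate(values):          # impossibly high/low values
        if v < low or v > high:
            flags.append((i, "out of range"))
    for i in range(1, len(values) - 1):     # spikes vs. both neighbors
        neighbors = (values[i - 1] + values[i + 1]) / 2.0
        if neighbors > 0 and values[i] / neighbors > spike_ratio:
            flags.append((i, "spike"))
    run = 1                                 # repetitious values
    for i in range(1, len(values)):
        run = run + 1 if values[i] == values[i - 1] else 1
        if run == max_repeats + 1:
            flags.append((i, "repetitious"))
    return flags

data = [12.0, 14.0, 130.0, 15.0, 15.0, 15.0, 15.0, 600.0]
flags = screen(data)
```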
2-12

-------
Scientific Validation: checks involving scientific considerations, such as:
1. time continuity,
2. spatial continuity,
3. relationships among different pollutants, and
4. relationships with meteorological data.
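A time-continuity check in the spirit of the list above flags hour-to-hour jumps larger than a chosen threshold; the threshold and data are illustrative assumptions.

```python
# Simple time-continuity check: flag indices where the change from the
# previous hourly value exceeds max_jump (in concentration units).

def jump_flags(hourly, max_jump=50.0):
    return [i for i in range(1, len(hourly))
            if abs(hourly[i] - hourly[i - 1]) > max_jump]

co_hourly = [40.0, 42.0, 45.0, 120.0, 47.0, 44.0]
suspect = jump_flags(co_hourly)   # the jump up and back down is flagged
```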
PREVENTIVE MAINTENANCE
Preventive maintenance activities are not usually considered as part of quality assurance.
However, for air pollution monitoring systems, the effectiveness of preventive maintenance is
critical in determining the continuous operation of remote, unattended sampling equipment,
particularly automatic sampling/analysis instruments. Unplanned malfunctioning of these
instruments can prevent the obtaining of sample results for peak concentration periods, or
prevent the accumulation of sufficient data to establish valid trend information.

Needless to say, all the above special and important features indeed make implementation of
quality assurance of air pollution monitoring systems an interesting, but difficult and challeng-
ing effort.
REFERENCES

1. Curran, Thomas G. and Neil H. Frank, "Assessing the Validity of the Lognormal Model
when Predicting Maximum Air Pollution Concentrations." U.S. Environmental Protection
Agency, Research Triangle Park, North Carolina. 75-51.3.
2. Environmental Protection Agency, "Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I, Principles." EPA-600/9-76-005, March 1976.
3. Larsen, Ralph I., "An Air Quality Data Analysis System for Interrelating Effects, Stan-
dards, and Needed Source Reductions." Air Pollution Control Association Journal,
November 1973 and June 1974.
4. National Bureau of Standards, "Precision Measurement and Calibration, Statistical Con-
cepts and Procedures." Special Publication 300, Volume 1, February 1969.
5. Rhodes, R. C. and R. Jurgens, "Data Validation Techniques Used in the Regional Air
Monitoring Study of the St. Louis Regional Air Pollution Study," proceedings for "A Con-
ference on Environmental Modeling and Simulation." Environmental Protection Agency,
ORD and OPM, Cincinnati, Ohio, April 20-22, 1976.
6. Rhodes, R. C., "Importance of Sampling Errors in Chemical Analysis," symposium on
"Validation of the Measurement Process," American Chemical Society, New York, New
York, April 4-5, 1976.
7. Rhodes, R. C., "Quality Assurance for ADP (and Scientific Interaction with ADP)," pro-
ceedings of second ORD ADP Workshop, Environmental Protection Agency, Gulf Breeze,
Florida, November 11-14, 1975.
8. Rhodes, R. C. and S. Hochheiser, "Quality Costs for Environmental Systems," trans-
actions, 30th Annual Technical Conference, American Society for Quality Control,
Toronto, Ontario, Canada, June 7-9, 1976.
9. von Lehmden, D. V., R. C. Rhodes and S. Hochheiser, "Applications of Quality
Assurance in Major Air Pollution Monitoring Studies-CHAMP and RAMS," proceedings
of International Conference on Environmental Sensing and Assessment, Las Vegas,
Nevada, September 14-19, 1975.
2-13

-------
6/30/76
8/23/77 Rev.
QUALITY ASSURANCE FOR POLLUTANT MONITORING
by
R. C. Rhodes
An on-going monitoring system will already have implemented a number of essential
elements of a total quality assurance system. When reviewing an existing monitoring opera-
tion or when establishing a new monitoring effort, it is very desirable that a systematic review
be made to consider or reconsider the quality assurance activities which should be required.
The various elements of a total quality assurance program, listed below, are discussed in
the "Quality Assurance Handbook for Air Pollution Measurement Systems, Volume I, Prin-
ciples," EPA 600/9-76-005, March 1976.

Quality policy
Quality objectives
Quality organization
and responsibility
QA manual
QA plans
Training
Procurement control
Ordering
Receiving
Feedback and
corrective action
Calibration
Standards
Procedures
Internal QC checks
Operations
Sampling
Sample handling
Analysis

The extent to which each of the above elements should be implemented by a given agency
will depend upon (1) the objective of the monitoring, (2) the duration of the monitoring
period, and (3) the type of sampling/analysis methods utilized. Each monitoring agency
should review the quality assurance elements with respect to their particular needs, and
should establish a prioritized long-range plan (schedule) for implementation. For on-going
monitoring efforts the quality assurance program should be dynamic in nature, being con-
tinually improved and revised according to increased knowledge, changing conditions, and
assigned priorities.
The elements listed above fall into 4 general categories:
(1) Management-those activities which are of particular concern to, and must be
initiated and sustained by management, notwithstanding the fact that all activities of a
monitoring system are management's responsibility.
(2) Measurement-those activities which are directly involved in the sampling and analysis
of pollutant concentrations.
(3) Systems-those activities mainly involving the paperwork systems essential to operate
and support the quality assurance system.
(4) Statistics-those computational and statistical analysis techniques and procedures which
are necessary as part of the quality assurance system.
From the above, it is evident that a total quality assurance program is concerned with all
activities which may affect the quality of the monitoring data, and is not limited in a very nar-
row sense to essential calibrations and a few routine duplicate analytical checks.
Data
Transmission
Computation
Recording
Validation
Preventive maintenance
Reliability records and
analysis
Document control
Configuration control
Audits
On-site system
Performance
Corrective action
Statistical analysis
Quality reporting
Quality investigation
Interlab testing
Quality costs
2-14

-------
Management. It is obvious that management's responsibilities should include a stated writ-
ten policy and objectives concerning quality. The need for monitoring data of high quality
must be continually made evident by the management with a continual awareness of such
need by all the people whose activities affect the quality of the data. One individual of the
organization should be specifically designated and assigned the responsibility to oversee all
quality assurance activities, even though the individual may have other assigned duties, and
even though "Quality assurance is everybody's business." This individual should be
designated as the "Quality Assurance Coordinator."
Management should establish training requirements for each individual whose activities
affect quality. Detailed systematic written plans should be prepared summarizing the various
quality control checks made for each pollutant measurement method or special project. A
manual containing administrative-type procedures applicable to all measurement methods and
projects and to general quality assurance activities should, in time, be prepared to consolidate
in one document all quality-related procedures. The manual should incorporate the above-
mentioned plans by reference.
Management, obviously, is concerned with costs. And after operation of a monitoring
system for, say, a year, a systematic review should be made of the costs related to quality, to
assess the cost-effectiveness of these activities, and to make indicated changes in expenditures
of effort to obtain the most high quality data for the least cost.
Additionally, management should establish some type of periodic (say quarterly) report
summarizing quality assurance activities and providing some continual assessment or measure
of data quality. This report should be prepared by the Quality Assurance Coordinator.
Measurement. Various EPA guideline documents* have been prepared for each measure-
ment method. These documents provide the identification of calibration standards and
detailed procedures for calibration for each of the methods. Also included in these documents
are detailed procedures and internal quality control checks which should be made for the
sampling, sample handling, and analysis for each of the methods.
It may be economically prohibitive to implement all of the recommended checks of these
documents, at least initially. Specific minimum checks for ambient methods are included in
EPA 600/4-77-027a, "Quality Assurance Handbook for Air Pollution Measurement
Systems," Volume II, Ambient Air Specific Methods, May 1977. Specific minimum checks for
source emission methods are included in EPA 600/4-77-027b, "Quality Assurance Hand-
book for Air Pollution Measurement Systems," Volume III, Source Emission Specific
Methods, August 1977. Some judgment may need to be exercised as to which checks seem
to be most critical and need to be implemented first. However, it is best to implement more
checks at a lesser frequency than to concentrate heavily on just a few. The frequency of
quality control checks should be flexible, being increased for those which by experience seem
to give most problems, and being decreased for those which seem consistently to remain "in
control." Similar reasoning applies with respect to the types and frequencies of independent
performance audits described in the guideline documents.
One essential for obtaining high quality data is the procurement of measurement equip-
ment and materials of adequate quality. Adequate specifications should be included in the
procurement ordering documents, and the equipment and materials should be given ade-
quate inspection when received. Generally, procured items should not be paid for until after
they have been determined to meet the specifications. Obviously, those methods and equip-
ment designated or specified by the government as official for determining compliance to
ambient air or source standards should be strictly and consistently complied with.
*EPA R4-73-028, Environmental Monitoring Series (for ambient monitoring methods)
EPA 650/4-74-005, Environmental Monitoring Series (for source emission monitoring
methods)
2-15

-------
One part of the measurement method which may not receive adequate attention is that for
flow measurement. For those methods which require flow measurement, the flow measure-
ment is equally as important as the pollutant measurement.
A critical requirement of the measurement method (for pollutant and flow) is the use of
secondary reference standards for calibration. traceable to a national or international primary
standard.
Systems. Detailed, systematic, and meticulous records need to be kept concerning all of the
necessary measurements and computations integrally involved with the measurement process.
Of equal importance is the recordkeeping concerning (1) the written procedures for calibra-
tion, operation, and computations, (2) preventive maintenance procedures and records, and
(3) measurement equipment records. A document control system should be established to
identify by number and date each written procedure or revisions thereof so that the exact
procedure used at any specified time (past and present) can be determined. A configuration
control system should be established to record the nature and dates of any changes in the
hardware design, or major corrective maintenance of the sampling, sample handling, and
analysis equipment. These records should be kept by manufacturer's serial number or an
agency-assigned identification number. Such records should enable one to determine for any
past and present time, the exact configuration of any specific piece of equipment. Also con-
sidered as part of a configuration control system is the site assignment history for each piece
of identified sampling equipment.
Recordkeeping systems are essential to record changes to the procedures and equipment of
the monitoring system. Experienced quality assurance and statistical personnel are suspicious
of the possible effects of changes to the total measurement process. Their motto might well
be "CAVE VICISSITUDlNES" or "CAVE VARlET AS.". Oftentimes, seemingly innocuous
changes may cause significant changes in the results. As a precaution against the introduction
of such undesirable effects into the system, the basic principle of performing overlap checks or
comparisons should be made to assure that such changes are appropriately valid.
Statistics. The use of statistical analyses is essential to an adequate quality assurance
system. Some of the more basic statistical applications are presented in APTD 1132, "Quality
Control Practices in Processing Air Pollution Samples." Other applications are included in the
Appendices to EPA 600/9-76-005. If a given agency does not have a person with some
training and experience in the basic statistical applications presented in these documents,
either (1) an individual of the agency with mathematical capability should attend a course to
receive such training or (2) a statistician experienced in these applications should work with
individuals of the agency on a temporary consulting basis to establish such techniques and
provide such training. The applications of statistics to air monitoring extend from the simplest
(control charts) to the very complex (modeling and computer simulation) and are limited only
by the statistical and computation capability of available personnel and resources. The tech-
niques of data validation and equipment reliability analyses are several specific applications of
value to a local agency.
In addition to the above, several points deserve further emphasis with respect to the
accuracy and precision of the measurement system. In addition to the use of good calibration
standards and procedures, interlaboratory tests, such as the exchange of stable samples
between peer laboratories, or the dissemination of blind samples from some recognized
national or international laboratory is quite valuable in determining the accuracy of par-
ticipating agencies. Such testing may reveal weaknesses in the system which would require
special quality investigations. The use of statistics in planning such studies and in analyzing
the data therefrom, is emphasized.
*CAVE VICISSITUDINES: Beware of changes
CAVE VARIETAS: Beware of differences
2-16

-------
An excellent way to check the internal precision of an agency's system is to establish at
one (or a few) selected cities a dual or colocated sampling instrument for each measurement
method.* This type of duplicate check is one form of the independent performance audits
described in the EPA QA Guidelines document for manual integrated methods. The duplicate
sampling instruments should be maintained as independently as possible from the regular
instrument. For example, where possible, independent calibrations and flow measurements
should be made for the colocated duplicate instrument. Similarly, for integrated manual
methods the pollutant analyses should be performed as independently as possible in the
laboratory. For example, the samples from the colocated instrument should be analyzed on a
different batch (using a different calibration) from that in which the regular sample is analyzed.
In the above-described manner, the best possible estimate for within-agency precision for the
total measurement process can be made. Excessive differences in results between the paired
instruments will indicate weaknesses in the system which should be isolated by investigation
and corrected by appropriate corrective action.
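The within-agency precision estimate from colocated pairs can be sketched as follows; the example pairs are hypothetical, and the use of signed percent differences follows the approach discussed earlier in this section.

```python
import math

# Estimating within-agency precision from colocated (duplicate) sampler
# pairs, using signed percent differences. A minimal sketch.

def colocated_precision(pairs):
    """Return the mean and standard deviation of the signed percent
    differences between colocated measurement pairs. A mean far from
    zero suggests bias; a large standard deviation, poor precision."""
    d = [100.0 * (a - b) / ((a + b) / 2.0) for a, b in pairs]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
    return mean, sd

pairs = [(98.0, 102.0), (51.0, 49.0), (75.0, 75.0), (32.0, 30.0)]
bias_pct, precision_pct = colocated_precision(pairs)
```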
As a part of the recordkeeping system, each agency should compile (or maintain) a
"Significant Event History." Documentation of the location, nature, dates and times of special
events affecting pollutant concentrations should be kept in a systematic chronological file.
Such events which might explain unusual results would be those such as dust storms, large
fires, construction work, etc.
Quality Assurance System Review. On occasion, the Quality Assurance System of a given
monitoring agency may be subject to an on-site system audit or review by an external
organization, for the purpose of evaluating the capability of the agency to produce data of
acceptable quality. Such an independent review is made of the agency's facilities, equipment,
personnel, organization, procedures, etc. by persons knowledgeable in both quality assurance
technology and the measurement technologies involved. The audit should include a review of
the agency's actual operations, procedures and recordkeeping for all of the elements of the
quality assurance system discussed herein. The audit team's evaluation should include specific
identification of areas of weakness and specific recommendations for improvement.
"This technique may be cost prohibitive for continuous instruments.
2-17

-------
Section 3
Managerial Quality Assurance
Elements for Establishing
a Quality Assurance Program
and Recording Changes
Lesson Goal
To familiarize you with managerial quality assurance elements involved in
establishing a quality assurance program and recording changes in an air pollution
monitoring system.
Lesson Objectives

At the end of this lesson, you should be able to-
1. list the quality assurance elements that are involved in establishing a quality
assurance program and discuss the factors that should be considered in their
implementation,
2. list the quality assurance elements that are involved in recording changes in
an air pollution monitoring system,
3. explain the purpose of document control and design a basic document control
system,
4. explain the purpose of a configuration control system, and
5. explain the purpose of preventive maintenance and discuss the factors that
should be considered in designing a preventive maintenance system.
3-1

-------
- - -    
    MANAGERIAL QUALITY
    ASSURANCE ELEMENTS 
   . Establishing a quality assurance
    program  
   . Recording changes in the air
    quality monitoring system
- - -    
ESTABLISHING A QUALITY
ASSURANCE PROGRAM
. policy and objectives
. organization
. quality assurance plans
. training
. audit procedures
. corrective action
. reports to management
- - -    
    QUALITY
    ASSURANCE POLICY
    AND OBJECTIVES
    Each organization should
    have a written quality
    assurance policy that
    should be made known to
    all organization personnel
- - -    
    QUALITY ASSURANCE
    OBJECTIVES
   . Data meeting user requirements
    . completeness . representativeness
    . precision . comparability
    . accuracy  
- - -    
    QUALITY ASSURANCE
    OBJECTIVES
    . Data are complete if a
    prescribed percentage of total
    measurements is present.
. Precision: spread of data
. Accuracy: nearness to true
value
- - -    
     3.3

-------
QUALITY ASSURANCE
OBJECTIVES
. Data must be representative of the
condition being measured (example:
ambient sampling at midnight is not
representative of CO during rush-hour
traffic)

. Data from several agencies should be
in the same units and corrected to the
same conditions (temperature and
pressure) to allow comparability
among groups
ORGANIZATION
Quality assurance is
normally a separate
function in the
organization
BASIC FUNCTIONS
OF QA ORGANIZATION
QA Policy formulation

. agency policy
. contracts
. procurement
. staff training
and development
QA GUIDANCE
AND ASSISTANCE
. laboratory operations
. monitoring network operations
. data reduction
. special field studies
. instrument maintenance
and calibration
QA GUIDANCE
AND ASSISTANCE
. preparation of legal actions

. source emission testing

. development of control
regulations

. preparation of technical
reports
3-4

-------
- - - -  
     TRAINING
    . essential for all personnel in any
    function affecting data quality
     .sample collection
     . analysis
     .data reduction
     . quality assurance
- - - -  
     TRAINING
    . on-the-job training (OJT)
    . short-term course training
     (normally 2 weeks or less)
    . long-term course training
     (quarter or semester in length)
- - - -  
    AUDIT PROCEDURES
     Performance Audits
    . independent checks 
    . made by supervisor or auditor
. evaluate data quality of total
    measurement system
    . quantitative appraisal of quality
- - - -  
    AUDIT PROCEDURES
     System Audits
    . on-site inspection and review
     of quality assurance system
    . qualitative appraisal of
     quality
- - - -  
     ~"'.~
     . TUE
    CO~~~ic:~ve QA CYCLE Implement
     '-- As_ssm...1
- - - -  3-5

-------
QUALITY REPORTS
TO MANAGEMENT
Quality data usually reported:

. percentage duplication or replication
of determinations

. instrument or equipment downtime
. percentage voided samples versus
total samples

. quality cost in terms of prevention,
appraisal, and correction costs
QUALITY REPORTS
TO MANAGEMENT
Quality data usually reported:
. system audit (on-site inspection) results

. performance audit results

. interlaboratory test results and
intralaboratory test results
(precision and accuracy)

. status of solutions to major quality
assurance problems
GRAPHIC REPORT TO MANAGEMENT
[chart: percentage of valid data plotted by month, 1978-1979]
A SYSTEM FOR RECORDING
CHANGES IN THE MONITORING
SYSTEM IS NEEDED
. for written procedures - document control

. for design and location of the monitoring
system - configuration control

. for routine service after operation has
begun - preventive maintenance
DOCUMENT
CONTROL SYSTEM
Purpose:
To provide the latest
written procedures to
all concerned personnel
3-6

-------
DOCUMENT
CONTROL SYSTEM

Should include:
. an easy way to make changes
. removable pages
. easily identifiable pages
indexed by Section #
Revision #
Date
Page #
Total pages
DOCUMENT
CONTROL SYSTEM
Should include:
. a distribution record
system
DOCUMENT
CONTROL SYSTEM
A new table of contents
should be distributed
with each revision
CONFIGURATION
CONTROL SYSTEM
To record changes in
equipment and the
physical arrangement
of equipment
CONFIGURATION
CONTROL SYSTEM

Purpose:

. Provide a history of changes
during the life of a monitoring
project
. Provide design and operational
data on the first monitoring
equipment or system when
multiples are planned
3-7

-------
PREVENTIVE MAINTENANCE
An orderly program of positive
actions for preventing failure
of a monitoring system
. cleaning equipment
. lubricating
. reconditioning
. adjusting
. testing
PREVENTIVE
MAINTENANCE
Increased Measurement
System Reliability

.
Increased Data Completeness
DEVELOPMENT OF A
PREVENTIVE
MAINTENANCE PROGRAM

. review equipment - highlight
items most likely to fail

. define spare part", list
. define frequency for servicing
. prepare a checklist
DAILY CHECKLIST FOR NO2 ANALYZER
[sample daily checklist form]
3-8

-------
Section 3A
Review of Precourse Problem 3
Lesson Goal
To ensure that you can perform the calculations assigned in precourse problem 3.
Lesson Objectives

At the end of this lesson, you should be able to calculate-
1. arithmetic mean, x;
2. standard deviation, s;
3. range, R;
4. geometric mean, xg; and
5. geometric standard deviation, sg.
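The five statistics above can be sketched in a few lines; the data set is illustrative, not the precourse problem's data.

```python
import math

# Arithmetic mean, standard deviation, range, geometric mean, and
# geometric standard deviation for a small hypothetical data set.

def stats(x):
    n = len(x)
    mean = sum(x) / n                                     # arithmetic mean
    s = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
    r = max(x) - min(x)                                   # range
    logs = [math.log(v) for v in x]                       # requires x > 0
    log_mean = sum(logs) / n
    xg = math.exp(log_mean)                               # geometric mean
    sg = math.exp(math.sqrt(sum((l - log_mean) ** 2 for l in logs) / (n - 1)))
    return mean, s, r, xg, sg

mean, s, r, xg, sg = stats([2.0, 4.0, 8.0, 16.0])
```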
3A-l

-------
Section 4
Basic Concepts of Statistical
Control Charts
Lesson Goal
To familiarize you with basic concepts in developing and using control charts.
Lesson Objectives

At the end of this lesson, you should be able to-
1. describe a control chart based upon a period of acceptable performance,
2. distinguish between assignable (non-random) and unassignable (random)
causes of variation,
3. describe steps in developing a control chart system,
4. describe the characteristics of a normal (Gaussian) frequency distribution, and
5. describe considerations in using control charts.
4-1

-------
- - - -  
CONTROL CHART
A control chart shows:
. how a process
should behave
. how a process
is behaving
. when action should
be taken to make the
process behave as it should
- - - -  
Walter A. Shewhart
Bell Telephone Laboratories
The Economic Control of Quality of
Manufactured Product (1931)
- - - -     
"CONSTANT CAUSE" SYSTEM
A system in which we
measure something
whose variability
remains constant
- - - -     
Measurements will vary
over time due to random
variations.
[scatter plot of measurements over time]
- - - -     
RANDOM VARIATIONS
. unassignable
. statistical control

NONRANDOM VARIATIONS
. assignable
. out-of-control
- - - -     
       4-3

-------
OBJECTIVES
OF A CONTROL CHART
. detect assignable causes

. trigger investigation leading
to corrective action
DEVELOPMENT AND USE
OF A CONTROL CHART
1. Determine what data to chart
2. Accumulate data
3. Prepare histogram
4. Determine form of frequency distribution
5. After eliminating outliers, calculate
mean and standard deviation
6. Establish limits
7. Construct chart
8. Plot points
9. Highlight out-of-control conditions
10. Take corrective action
11. Revise control limits
12. Maintain historical file
Accumulate data
4-4

-------
Prepare histogram

Determine form of
frequency distribution

After eliminating outliers,
calculate mean and
standard deviation

s = √( Σ(x − x̄)² / (n − 1) )
Establish limits

           USA              British
control    ±3σ (99.7%)      ±3.09σ (99.8%)
warning    ±2σ (95.4%)      ±1.96σ (95.0%)

Construct chart
4-5

-------
Plot points

Highlight out-of-control
conditions

Take corrective action

Revise control limits

Maintain historical file
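Steps 5 through 9 of the development sequence above can be sketched for a single-measurement (individuals) chart; the baseline data and new points are illustrative assumptions.

```python
import math

# Sketch of control chart construction: compute mean and standard
# deviation of baseline data, set 3-sigma control and 2-sigma warning
# limits, then flag new points beyond the control limits.

def control_limits(baseline):
    n = len(baseline)
    mean = sum(baseline) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in baseline) / (n - 1))
    return {"center": mean,
            "ucl": mean + 3 * s, "lcl": mean - 3 * s,    # control (99.7%)
            "uwl": mean + 2 * s, "lwl": mean - 2 * s}    # warning (95.4%)

def out_of_control(limits, new_points):
    return [i for i, x in enumerate(new_points)
            if x > limits["ucl"] or x < limits["lcl"]]

baseline = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7]
limits = control_limits(baseline)
flagged = out_of_control(limits, [20.0, 20.4, 21.5])
```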
4-6

-------
Section 5
x-R Statistical Control Charts
Lesson Goal
To familiarize you with the preparation and use of x-R statistical control charts.
Lesson Objectives

At the end of this lesson, you should be able to-
1. explain the Shewhart concept of local control (i.e., use of rational subgroups)
as a basis for developing control charts,
2. distinguish between situations involving rational subgroups and situations
where no rational subgroups exist,
3. distinguish between control charts that are based upon only a period of accep-
table performance and control charts that are based upon rational
subgroups,
4. compute control limits for x-R control charts,
5. recall three rules for detecting out-of-control data points,
6. describe five types of out-of-control patterns that can be visually detected
using a control chart, and
7. list three assumptions concerning the detection and correction of assignable
causes of measurement-process variability.
5-1

-------
LOCAL STATISTICAL CONTROL
Shewhart
Control limits based on:
. short-term rational subgroups
. small or homogeneous variation
Control charts may be based on:
. period of acceptable
performance
. rational subgroup
CONSTRUCTING AN x̄-R CONTROL CHART
. identify rational subgroups
. calculate subgroup arithmetic mean (x̄) and range (R)
. calculate overall average arithmetic mean (x̿) and
average range (R̄)
. use factors to establish control limits for two control
charts (x̄ and R)
. x̄ chart controls between-subgroup variability
. R chart controls within-subgroup variability
-
x̄ CHART CONTROL LIMITS
UCLx̄ = x̿ + (A2)(R̄)
Where:
x̿ = 29.92
A2 = 1.88 (for subgroups containing two
data values)
R̄ = 4
UCLx̄ = 29.92 + (1.88)(4)
UCLx̄ = 37.44
LCLx̄ = x̿ - (A2)(R̄)
LCLx̄ = 29.92 - (1.88)(4)
LCLx̄ = 22.40
5-3

-------
UWLx̄ = x̿ + (2/3)(A2)(R̄)
UWLx̄ = 29.92 + (2/3)(1.88)(4)
UWLx̄ = 34.93
LWLx̄ = x̿ - (2/3)(A2)(R̄)
LWLx̄ = 29.92 - (2/3)(1.88)(4)
LWLx̄ = 24.91
R CHART CONTROL LIMITS

UCLR = (D4)(R̄)
Where:
D4 = 3.27 (for subgroups containing two
data values)
R̄ = 4
UCLR = (3.27)(4)
UCLR = 13.08
LCLR = (D3)(R̄)

Where:
D3 = 0 (for subgroups containing two
data values)
R̄ = 4
LCLR = (0)(4)
LCLR = 0
UWLR = (D6)(R̄)

Where:
D6 = 2.51 (for subgroups containing two
data values)
R̄ = 4
UWLR = (2.51)(4)
UWLR = 10.04
5-4

-------
- - -  
LWLR = (D5)(R̄)
Where:
D5 = 0 (for subgroups containing two data
values)
R̄ = 4
LWLR = (0)(4)
LWLR = 0
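The limit computations above reduce to a few multiplications; this sketch reproduces them using the worked values from the slides (x̿ = 29.92, R̄ = 4, and the factors for subgroups of two values).

```python
# x-bar and R chart limits from factors for subgroups of two values.
A2, D4, D3 = 1.88, 3.27, 0.0
xbarbar, rbar = 29.92, 4.0        # overall mean and average range

ucl_x = xbarbar + A2 * rbar                 # upper control limit, x-bar
lcl_x = xbarbar - A2 * rbar                 # lower control limit, x-bar
uwl_x = xbarbar + (2.0 / 3.0) * A2 * rbar   # upper warning limit, x-bar
lwl_x = xbarbar - (2.0 / 3.0) * A2 * rbar   # lower warning limit, x-bar

ucl_r = D4 * rbar                           # upper control limit, R
lcl_r = D3 * rbar                           # lower control limit, R
```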
- - -  
   . construct x - R control chart
   . draw control and warning limits
   . plot individual x's and R's
   . use prepared x - R control chart for
   evaluating future x's and R's
- - -  
   OUT-Of-CONTROL CRITERIA
   . Points beyond Limits
   . Runs 
   . Patterns 
- - -  
Points beyond Limits
. one point outside
control limits
. two points outside
warning limits
- - -  
Runs
. seven points
up or down
. seven points above
or below central line
- - -  
    5-5

-------
Patterns

. Recurring Cycles
. Change in Level
. Lack of Variability
. Trends
. Most Points near Outside
Limits
Recurring Cycles

Change in Level

Lack of Variability

Trends
5- 6

-------
Most Points near Outside Limits
- - - - 
    ASSUMPTIONS:
    Concerning Assignable Causes
    . possible to identify and correct
    . technically feasible to correct
    . economically practical to correct
- - - - 
5- 7

-------
I. Homework Assignment

A standard material is checked at periodic intervals during routine analyses to
ensure that the analytical measurement process remains in control. Following are
the results in the chronological order in which they were obtained:
1. 19.0
2. 18.3
3. 18.0
4. 17.2
5. 17.4
6. 18.3
7. 19.6
8. 20.7
9. 18.2
10. 18.8
11. 20.4
12. 20.1
13. 19.6
14. 18.5
15. 19.1
16. 21.8
17. 20.1
18. 20.6
19. 18.4
20. 21.0
21. 25.1
22. 21.1
23. 20.9
24. 20.8
25. 23.3
26. 20.2
A. Prepare and plot a control chart with appropriate limits, assuming a single
analysis is performed each day.
B. Prepare and plot x̄ and R control charts with appropriate limits, assuming two
analyses are performed each day; i.e., results Number 1 and 2 were performed
on day 1, results Number 3 and 4 were performed on day 2, etc. (Hint: each
day is a subgroup.)
C. Do the charts indicate any out-of-control conditions? If so, describe them.
5- 9

-------
[Blank x̄-R control chart data form: Project, Name, Measurement Performed,
Measurement Units, No. of Subgroups; rows by Date for measurement results,
Sum, Average (x̄), and Range (R).]

-------
Comments
(Corrective
Action,
etc.)
RANGES, R
AVERAGES, x
5-12
[Rotated continuation of the blank control chart form: average and
range plotting grids with measurement result code and date columns]

-------
Section 6
The Measurement Process
with Emphasis on Calibration
Lesson Goal
To familiarize you with quality control considerations (especially calibration) for the
measurement of air pollutants.
Lesson Objectives

At the end of this lesson, you should be able to-
1. discuss quality control considerations for the three components (pollutant
separation from the air matrix, determination of the amount of pollutant and
the volume of air sampled, and calculation of pollutant concentration) of an
air pollutant measurement,
2. define calibration,
3. list and discuss the six general elements of a calibration program,
4. define traceability, and
5. identify services available from the EPA's Standards Laboratory.
6-1

-------
AIR POLLUTION MEASUREMENT
. separate pollutant from air

. determine pollutant quantity and
air volume

. calculate pollutant concentration
by dividing pollutant quantity by
air volume
SEPARATION OF POLLUTANT
Manual
Automated
DETERMINATION OF AMOUNT OF
POLLUTANT AND VOLUME
OF AIR SAMPLED
[Figure: sampled air volume and collected pollutant quantity]
CALIBRATION
The process of establishing
the relationship between the
output of a measurement
process and a known input.
6-3

-------
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations
VENDOR RECOMMENDATIONS
D
Operatians
Manual
Contact
Users for
Opinions
IN-HOUSE RECORDS, FORMER EXPERIENCE
ZERO AND SPAN LIMITS
[Figure: analyzer response vs. concentration, with zero and span limits]
6-4

-------
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations
. statements of minimum quality of
calibration standards
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations

. statements of minimum quality of
calibration standards
. provisions for standards traceability
. NBS-SRMs

. CRMs
EPA'S STANDARDS
LABORATORY
. certification of client-
owned calibration and
auditing materials
CERTIFICATION
SERVICES AVAILABLE
. cylinder gases
. permeation tube rates
. flow measuring devices
. calibration I audit devices
. static calibration I audit
standards
. special analyses upon
request
6-5

-------
WRITE TO:
ENVIRONMENTAL MONITORING
SYSTEMS LABORATORY

Quality Assurance Division

US EPA, MD-77
Research Triangle Park, NC 27711
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations
. statements of minimum quality of
calibration standards
. provisions for standards traceability
. provisions for written procedures
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations

. statements of minimum quality of
calibration standards
. provisions for standards traceability

. provisions for written procedures

. statements of proper environmental
conditions
ELEMENTS OF A
CALIBRATION PROGRAM
. statements of allowable time between
calibrations
. statements of minimum quality of
calibration standards

. provisions for standards traceability

. provisions for written procedures
. statements of proper environmental
conditions

. provisions for proper record keeping
CALCULATION OF AMBIENT
POLLUTANT CONCENTRATION
25°C
or
298 K
760 mm Hg
or
1 atmosphere
EPA Standard
Ambient
Temperature
EPA Standard
Pressure
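As a sketch of how these standard conditions are used (not from the workbook; the function name and the example readings are made up for illustration), a sampled air volume can be corrected to EPA standard ambient temperature and pressure with the ideal-gas relation:

```python
def volume_at_epa_std(v_sampled_m3, temp_k, press_mmhg):
    """Correct a sampled air volume to EPA standard ambient
    conditions (298 K, 760 mm Hg) via the ideal-gas relation
    V_std = V * (P / 760) * (298 / T)."""
    return v_sampled_m3 * (press_mmhg / 760.0) * (298.0 / temp_k)

# e.g. 2.0 m^3 sampled at 283 K and 740 mm Hg
v_std = volume_at_epa_std(2.0, 283.0, 740.0)
```

Dividing the pollutant quantity by this corrected volume gives the concentration at standard conditions.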
6-6

-------
&EPA
Environmental
Monitoring
Systems Laboratory
Research Triangle
Park. NC 27711
United States
Environmental Protection
Agency
Environmental Monitoring and Support
Laboratory
Cincinnati OH 45268
Volume 3
February 198
Number 1
NEWSLETTER
Quality
Assurance
Sample Repository
A repository of quality control materials is maintained for use by governmental,
industrial, and commercial laboratories. A wide variety of samples is available
without charge to the user. These materials are intended as independent
measures of working standards or as internal quality control samples. A
certification of analysis is furnished with each sample. Materials currently
available are listed below.
Quality Control Samples for Ambient Air and Stationary Source Analyses
Compressed Gases
NITRIC OXIDE
SULFUR DIOXIDE
CARBON MONOXIDE
CARBON DIOXIDE
OXYGEN
NITROGEN DIOXIDE
METHANE
METHANE/PROPANE
Static Samples
LYOPHILIZED MIXTURE
OF SODIUM SULFITE-
TETRACHLOROMER-
CURATE
AQUEOUS SODIUM
NITRITE
DILUTE SULFURIC
SOLUTIONS
AQUEOUS POTASSIUM
NITRATE
6-7
Multiple levels from 50 to 1500 ppm
Multiple levels from 50 to 10,000 ppm
3 levels from 5 to 50 ppm
3 levels from 3 to 8 percent
3 levels from 1 to 8 percent
3 levels from 25 to 100 ppm
Multiple levels from 1 to 10 ppm
2 ppm methane with propane ranging from 0.5
to 6 ppm
Simulate collected ambient level SO2 samples
from 10 to 200 µg/m3 of SO2. Samples are
furnished as a set of 5 different concentrations.
Simulate collected ambient level NO2 samples
from 40 to 200 µg/m3 of NO2. Samples are
furnished as a set of 5 different concentrations.

Simulate collected source level SO2 (Method 6)
samples from 200 to 2500 µg/dscm of SO2.
Samples are furnished as a set of 5 different
concentrations.

Simulate collected source level NO2 (Method 7)
samples from 150 to 900 µg/dscm of NO2.
Samples are furnished as a set of 5 different
concentrations.

-------
EMSl-RTP
(Cont'd)
Filter Samples
LEAD FILTER STRIPS
ARSENIC FILTER
STRIPS
SULFATE-NITRATE
FILTER STRIPS
SULFATE ON
CELLULOSE
MEMBRANE FILTERS
SULFATE-NITRATE
ON TEFLON
MEMBRANE FILTERS
LEAD ON CELLULOSE
MEMBRANE FILTER
Flow Measurement
Devices
HI-VOL REFERENCE
DEVICE
CRITICAL ORIFICES
Organic Materials
BENZENE
ETHYLENE
METHANE/ETHANE
PROPANE
Lead nitrate deposited on 1/2" x 8" glass-fiber filter
strips. Samples simulate collected concentrations
from 0.4 to 15 µg/m3 of lead. Nine levels are
available.

Arsenious oxide deposited on 1/2" x 8" glass-fiber
filter strips. Samples simulate collected concentra-
tions from 0.02 to 1.0 µg/m3 of arsenic. Nine
levels are available.

Sodium sulfate and potassium nitrate deposited on
1/2" x 8" glass-fiber filter strips. Samples simulate
collected concentrations from 0.6 to 40 µg/m3 of
sulfate, and from 0.6 to 15 µg/m3 of nitrate. Nine
levels are available.
Sodium sulfate deposited on 1/4 of 4" diameter
cellulose membranes. Samples simulate collected
concentrations from 25 to 320 µg of sulfate.
Seven levels are available.

Sodium sulfate and potassium nitrate deposited on
37 mm diameter teflon membranes. Samples
simulate collected concentrations from 50 to
200 µg of sulfate and from 50 to 200 µg of nitrate.
Three levels are available.
Lead nitrate deposited on 1/4 of 4" diameter
cellulose membranes. Samples simulate collected
concentrations from 10 to 100 µg of lead. Five
levels are available.
Consists of a set of resistance plates to simulate
various filter loadings.
-------
EMSL-RTP
(Cont'd)
PROPYLENE
Several levels from 5 to 700 ppm
TOLUENE
Several levels from 5 to 700 ppm
METHYL ACETATE
Several levels from 5 to 700 ppm
VINYL CHLORIDE
Several levels from 5 to 40 ppm
HYDROGEN SULFIDE
Several levels from 7 to 650 ppm
m-XYLENE
Several levels from 8 to 600 ppm
CHLOROFORM
Several levels from 5 to 700 ppm
PERCHLOROETHYLENE
Several levels from 5 to 700 ppm
BUTADIENE
One level
25 ppm
HEXANE
Several levels from 30 to 3000 ppm
METHYL MERCAPTAN
Several levels from 5 to 10 ppm
METHYL ETHYL KETONE
One level
50 ppm
(Robert Lampe, FTS 629-2573, COML 919-541-2573)
Standards Laboratory
The Environmental Protection Agency, Environmental Monitoring Systems
Laboratory, Quality Assurance Division, Research Triangle Park, NC
(EPA/EMSL/QAD/RTP) Standards Laboratory offers calibration, standardiza-
tion, and certification of client-owned sample material. There is no charge for this
service. Where applicable, certifications are referenced directly to National
Bureau of Standards (NBS) Standard Reference Materials (SRM). The following
services are offered.
. verification of compressed gas standards used for calibration, span checks,
or audits of air quality analyzers (NO, NO2, SO2, CO, CO2, CH4, hydrocarbons)
. verification of permeation tube rates (gravimetric or direct comparison with
SRM)
. verification of flow measuring devices (mass flowmeter, hi-vol orifice
meters)
. verification of outputs of calibration or audit devices (SO2, ozone, NO, NO2,
CO, CO2, CH4, hydrocarbons)
. verification of static audit or calibration standards (nitrite solution,
potassium tetrachloromercurate sulfite freeze-dried powders; sulfate and
lead on glass-fiber filter strips)
. other special analyses are available upon request.
For detailed information or to receive sample material, contact Berne I. Bennett
at the Quality Assurance Division, Standards Laboratory, EMSL, MD-77,
Research Triangle Park, NC 27711.
(Berne Bennett, FTS 629-2366, COML: 919-541-2366)
6-9

-------
Section 6A
Group Problem
The occurrence in ambient air of a highly toxic gaseous pollutant, cyclolehmdone
(CL), has recently been reported. Each group is to develop a monitoring and quality
assurance plan that will determine the ambient level of CL.
The following data are provided:
. This is a state-wide problem. All efforts are coordinated through the state
central office.
. There are three local offices located throughout the state. The local offices
will be engaged in the field work. Each local office has a laboratory where
CL analyses will be performed. Assume each local office and the state office
have adequate staff and funding.
. Just by coincidence, there are three plants suspected of CL emissions located
in the state - one plant is located in each of the jurisdictional areas of the
local offices. Each plant uses CL in the manufacture of its products.
. Both a manual method and a continuous monitoring (instrumental) method
exist. Each local office and the state office has one gas chromatograph for
analyzing manual samples and one continuous monitoring instrument
available for use in the study. Gas chromatographs must remain in their labs.
Continuous monitoring instruments of the local offices must remain in the
field. The purchase of additional continuous monitors is not possible.
. The length of the sampling program is two months.
. For manual sampling, 24-hour integrated sampling will be done every day.
. Sampling sites have been properly selected around each plant using historical
meteorological data available. The siting team has decided that six stations
are needed:
Plant 1  Plant 2  Plant 3
[Diagram: six sampling stations sited around each plant]
6A-l

-------
. Manual-sampling equipment and supplies must be procured.
. An NBS-SRM (permeation tube) exists (located at the state office); cylinders
of "known" concentrations of CL are available from FBN, Inc. Purchase of
additional permeation tubes is not possible.
Manual Method - Attachment I
Continuous Method - Attachment II
6A-2

-------
Group Problem Planning Sheet
Group
1. Write what you consider to be the QA policy for the group problem.
2. List the data quality objectives that your group will require to be met with regard
to the group problem.
6A-3

-------
Section 6B
Review of Control Chart Homework
Lesson Goal
To ensure that you can perform the tasks assigned in the control chart homework
exerCIse.
Lesson Objectives

At the end of this lesson, you should be able to-
1. prepare a control chart based on individual data values (no rational
subgroups),
2. prepare an x-R control chart (based on rational subgroups), and
3. detect out-of-control conditions indicated by the prepared charts.
6B-l

-------
Section 7
Regression Analysis and Control
Charts for Calibration Data
Lesson Goal
To familiarize you with regression analysis techniques (especially the linear least-
squares method) and control chart considerations for calibration data.
Lesson Objectives

At the end of this lesson, you should be able to-
1. list three advantages of using the least-squares method for determining
calibration curves,
2. list four implied assumptions of the linear least-squares method,
3. discuss the mathematical basis for the least-squares method,
4. compute a linear least-squares calibration equation from calibration data
(given the appropriate formulas),
5. compute the standard error for a calibration curve (given the appropriate
formulas),
6. compute an inverse calibration equation (given the appropriate formulas),
7. select appropriate control-chart calibration parameters to plot for a specific
monitoring situation, and
8. list two non-linear calibration-data analysis techniques.
7-1

-------
REGRESSION
ANALYSIS
AND CONTROL
CHARTS FOR
CALIBRATION
DATA
CALIBRATION
The process of establishing
the relationship between the
output of a measurement
process and a known input.
Observed Output, y
(dependent variable)
Voltage
Known Input, x
(independent variable)
Calibration Gas Concentration
METHODS OF DETERMINING
THE INPUT-OUTPUT
RELATIONSHIP
Manual
Computation
MANUAL
METHODS
. draw line by eye
. draw line using
ruler
7-3

-------
COMPUTATIONAL
METHODS
. mathematically determine
relationship (least-squares
method)
. advantages
. more precise
. everybody gets same line
. provides formula for transfer
LEAST-SQUARES
METHOD

Assumptions:
. linear relationship
. error in y - no error in x
. scatter of error is uniform
. errors normally and
independently distributed
y
d1² + d2² + d3²
(a minimum)
x
x
EXAMPLE PROBLEM
[Figure: plot of the four data points with fitted line y = a + bx
(slope b, intercept a)]
. Obtain sums and averages
of data

 x    y    x²    y²    xy   x−x̄  y−ȳ
 1    2     1     4     2    −2   −5
 2    7     4    49    14    −1    0
 4    7    16    49    28     1    0
 5   12    25   144    60     2    5
Σ = 12   28    46   246   104
avg.: x̄ = 3, ȳ = 7
7-4

-------
. Obtain sums of squares
and sum of products

(x−x̄)²  (x−x̄)(y−ȳ)  (y−ȳ)²
  4        10          25
  1         0           0
  1         0           0
  4        10          25
 10        20          50
. Calculate slope of
line: acceptable
method

Slope: b = Σ(x−x̄)(y−ȳ) / Σ(x−x̄)²
         = 20/10
         = 2
. Calculate slope of line:

preferred - regression
method     analysis

b = [Σxy − (Σx)(Σy)/n] / [Σx² − (Σx)²/n]
  = [104 − (12)(28)/4] / [46 − (12)²/4]
  = (104 − 84) / (46 − 36)
  = 20/10 = 2
. Determine y-intercept
Intercept: a = ȳ − bx̄
             = 7 − 2(3)
             = 1
Equation: y = 1 + 2x
STANDARD
ERROR: Se
The standard
deviation of the
residuals
distribution.
7-5

-------
[Figure: plot of y = 1 + 2x with the observed points and their
residuals (deviations d) from the fitted line]
DETERMINATION OF
STANDARD ERROR
Obs. y  Pred. y   d    d²
  2       3      −1    1
  7       5       2    4
  7       9      −2    4
 12      11       1    1
              Σd² = 10

Se = √[Σd² / (n−2)]

   = √[Σd² / (4−2)]

Se = √(10/2) = √5 = 2.236
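The example-problem arithmetic can be reproduced with a short sketch (not part of the workbook) that computes the regression slope, intercept, and standard error from the same formulas:

```python
# Least-squares fit of the example data, then the standard error Se
# of the residuals.

x = [1, 2, 4, 5]
y = [2, 7, 7, 12]
n = len(x)

sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sxx = sum(xi * xi for xi in x)

# Regression formula for the slope: b = [Σxy − ΣxΣy/n] / [Σx² − (Σx)²/n]
b = (sxy - sx * sy / n) / (sxx - sx ** 2 / n)   # (104 - 84)/(46 - 36)
a = sy / n - b * sx / n                          # a = ȳ − b·x̄

# Standard error: Se = sqrt(Σd² / (n − 2))
residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
se = (sum(d * d for d in residuals) / (n - 2)) ** 0.5
```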
/"
INVERSE CALIBRATION
EQUATION
y = a + bx
. used to relate output value
(y) to input value (x)
. Using the basic equation:
y = a + bx

. Obtain the inverse equation
(by solving for x):
x = (y − a)/b
. Then: x = (1/b)y + (−a/b)
x = b′y + a′, where b′ = 1/b

a′ = −a/b
7-6

-------
y = a + bx          APPLYING
y = 1 + 2x          DATA
                    FROM
x = (y − 1)/2       EXAMPLE

x = (1/2)y + (−1/2) PROBLEM

x = b′y + a′, where b′ = 1/2

a′ = −1/2
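A minimal sketch of the inverse step (not from the workbook; the instrument reading of 10 is an assumed example value):

```python
# Inverse calibration with the example-problem values a = 1, b = 2.
a, b = 1.0, 2.0
b_prime = 1.0 / b          # b' = 1/b
a_prime = -a / b           # a' = -a/b

y_observed = 10.0          # an assumed instrument reading
x = b_prime * y_observed + a_prime   # x = (y - a)/b
```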
CONTROL CHARTS
fOR MULTIPOINT
CALIBRATION DATA
Purpose:
to assure that the calibration
process remains in statistical
control
OPERATING
METHOD
CONTROL
CHART
. no intervening
adjustments for
zero or span
. zero/span drifts
. span−zero
. slope
. intervening
adjustments for
zero and span
. standard error
NON-LINEAR
CALIBRATION DATA
ANALYSIS TECHNIQUES
. make linear by
transformation

. compute non-linear least-
squares equation
CALIBRATION CURVE
[Figure: transmittance vs. concentration, µg/mL - a non-linear curve]
7-7

-------
DATA
µg/mL    Transmittance
0.000    0.863
0.031    0.815
0.081    0.751
0.161    0.650
0.316    0.484
0.663    0.279
0.951    0.165
EQUATION

A = log(1/T)
A = −log T
. Determine
absorbance values
µg/mL    T       A
0.000    0.863   0.064
0.031    0.815   0.089
0.081    0.751   0.124
0.161    0.650   0.187
0.316    0.484   0.315
0.663    0.279   0.555
0.951    0.165   0.781
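The linearizing transformation above can be sketched in a few lines (not part of the workbook):

```python
# Apply A = -log10(T) to the transmittance data to obtain absorbance
# values, which vary approximately linearly with concentration.
import math

conc = [0.000, 0.031, 0.081, 0.161, 0.316, 0.663, 0.951]   # ug/mL
trans = [0.863, 0.815, 0.751, 0.650, 0.484, 0.279, 0.165]
absorb = [-math.log10(t) for t in trans]   # roughly 0.06 up to 0.78
```

The transformed (concentration, absorbance) pairs can then be fitted with the linear least-squares method from Section 7.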
LINEAR CALIBRATION CURVE
[Figure: absorbance vs. concentration, µg/mL - approximately linear
after the transformation]
. Compute
non-linear
least-squares
equation
[Figure: non-linear least-squares curve fitted directly to the
transmittance data]
7-8

-------
Section 7A
Review of Precourse
Problems 1 and 2
Lesson Goal
To ensure that you can perform the tasks assigned in precourse problems 1 and 2.
Lesson Objectives

At the end of this lesson, you should be able to-
1. recognize the usefulness of data plotting in detecting outliers,
2. calculate percentage differences of paired data values, and
3. recognize the usefulness of percentage difference determinations in detecting
outliers.
7A-l

-------
Section 8
Identification and Treatment
of Outliers
Lesson Goal
To familiarize you with the need for identifying and investigating outliers and with
three statistical outlier tests.
Lesson Objectives

At the end of this lesson, you should be able to-
1. define outlier,
2. recall five possible reasons for the existence of an outlier in a data set,
3. discuss the need for identifying and eliminating outliers of quality control
data,
4. recall that data are initially screened for suspect values using visual
techniques,
5. employ the Dixon Ratio and Grubbs T tests (given the appropriate formulas
and critical values tables) to identify outliers,
6. explain in general terms the meaning and derivation of the significance level
critical values of the Dixon and Grubbs critical values tables,
7. discuss advantages and disadvantages of using either the Dixon Ratio test or
the Grubbs T test,
8. recognize the use of control charts for identifying outliers, and
9. recall the underlying assumption of the Dixon Ratio test, the Grubbs T
test, and the control chart technique.
8-1

-------
IDENTIFICATION AND
TREATMENT OF OUTLIERS
[Figure: dot plot of data values on a 0-100 scale, with one value
separated from the main cluster]
CAUSES OF OUTLIERS
[Slide: possible causes of an outlier in a data set]
NEED FOR IDENTIFICATION /
ELIMINATION OF OUTLIERS
Identification:
. indicates need for closer control
Elimination:

. ensures analysis is valid
. ensures conclusions are correct
PROCEDURE FOR
IDENTIFYING OUTLIERS
. screen data

. subject suspect data to
statistical tests
USE OF
DATA PLOTS
FOR INITIAL
SCREENING
8-3

-------
STATISTICAL
OUTLIER TESTS
. Dixon Ratio Test
. Grubbs T Test
. Control Chart Technique
DIXON RATIO TEST
PROCEDURE
[1] Arrange data in ascending or descending
order

[2] Calculate a ratio

[3] Compare ratio to Dixon table

[4] Determine if suspect value is an outlier
[1] Arrange data values in either
ascending or descending order
. smallest data value is suspect
x1 ≤ x2 ≤ x3 ≤ ... ≤ xn
. largest data value is suspect
x1 ≥ x2 ≥ x3 ≥ ... ≥ xn
[2] Calculate a ratio - equation
depends upon sample size
n 3 to 7
r10 = (x2 − x1) / (xn − x1)
n 8 to 10
r11 = (x2 − x1) / (xn−1 − x1)
n 11 to 13
r21 = (x3 − x1) / (xn−1 − x1)
n 14 to 25
r22 = (x3 − x1) / (xn−2 − x1)
[3] Compare ratio value to Dixon
table of critical ratio values
8-4

-------
[4] Suspect value is an outlier if ratio
is greater than critical value
     0.465 > 0.406
     calculated   critical
     ratio value  value
. . . . .       
     EXAMPLE PROBLEM #1
     Using the Dixon Ratio test,
     determine if the data value,
     25.1, is an outlier at the 5%
     significance level.  
. . . . .       
       DATA VALUES 
19.0   19.1 18.3 21.0
18.0   20.1 20.7 21.1
17.4   18.4 18.8 20.8
19.6   25.1 20.1 20.2
18.2   20.9 18.5
20.4   23.3 21.8
19.6   17.2 20.6
. . . . .       
DATA VALUES: ARRANGED
(25.1)   20.7 19.6 18.2
23.3     20.6 19.1 18.0
21.8     20.4 19.0 17.4
21.1     20.2 18.8 17.2
21.0     20.1 18.5
20.9     20.1 18.4
20.8     19.6 18.3
- - - - -       
SOLUTION:
r22 = (25.1 − 21.8) / (25.1 − 18.0)

r22 = 3.3 / 7.1

r22 = 0.465

Since 0.465 > 0.406
Then 25.1 is an outlier
- - - - -       
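The ratio calculation in example problem #1 can be sketched as follows (not from the workbook; only the n = 14 to 25 case with the largest value suspect is shown):

```python
# Dixon r22 ratio for the largest suspect value in a data set of
# 14-25 points: sort descending so x[0] is the suspect value, then
# r22 = (x1 - x3) / (x1 - x_(n-2)) in the slides' 1-based notation.

def dixon_r22_largest(values):
    x = sorted(values, reverse=True)
    n = len(x)
    return (x[0] - x[2]) / (x[0] - x[n - 3])

data = [19.0, 19.1, 18.3, 21.0, 18.0, 20.1, 20.7, 21.1, 17.4, 18.4,
        18.8, 20.8, 19.6, 25.1, 20.1, 20.2, 18.2, 20.9, 18.5, 20.4,
        23.3, 21.8, 19.6, 17.2, 20.6]

ratio = dixon_r22_largest(data)   # (25.1 - 21.8)/(25.1 - 18.0)
is_outlier = ratio > 0.406        # 5% critical value for n = 25
```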
          8-5

-------
GRUBBS T TEST
PROCEDURE
[1] Calculate arithmetic mean
[2] Calculate standard deviation
[3] Calculate a ratio
[4] Compare ratio to Grubbs table
[5] Determine if suspect value is an outlier
[1] Calculate arithmetic mean (x̄)
of data set values
x̄ = Σxi / n
[2] Calculate standard deviation (s)
of data set values
s = √{[Σxi² − (Σxi)²/n] / (n−1)}
[3] Calculate a ratio
. smallest data value is suspect

T1 = (x̄ − x1) / s
. largest data value is suspect
Tn = (xn − x̄) / s
[4] Compare ratio to Grubbs table
8-6

-------
[5] Suspect value is an outlier if ratio
is greater than critical value
2.87
>
2.822
calculated
ratio value
critical
value
EXAMPLE PROBLEM # 2
Using the Grubbs T test, determine if
the data value, 25.1, is an outlier at the
5% significance level for the data set
used in the Dixon Ratio test procedure
(example problem #1).
DATA VALUES
19.0 19.1 18.3 21.0
18.0 20.1 20.7 21.1
17.4 18.4 18.8 20.8
19.6 25.1 20.1 20.2
18.2 20.9 18.5
20.4 23.3 21.8
19.6 17.2 20.6
SOLUTION:
Determine Σxi, Σxi², and n

Σxi = 498.2

Σxi² = 10,005.98

n = 25
To find x̄:

x̄ = Σxi / n

x̄ = 498.2 / 25
x̄ = 19.93
8-7

-------
To find s:

s = √{[Σxi² − (Σxi)²/n] / (n − 1)}

s = √{[10,005.98 − (498.2)²/25] / (25 − 1)}

s = 1.80
Because largest data value is suspect,
calculate Tn:
Tn = (xn − x̄) / s
Tn = (25.10 − 19.93) / 1.80
Tn = 2.87
Since 2.87 > 2.822
Then 25.1 is an outlier
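Example problem #2 can likewise be sketched in a few lines (not from the workbook; only the largest-value-suspect case is shown):

```python
# Grubbs Tn for the largest suspect value: mean, standard deviation,
# then Tn = (x_max - mean) / s.

def grubbs_t_largest(values):
    n = len(values)
    mean = sum(values) / n
    ss = sum(v * v for v in values) - sum(values) ** 2 / n
    s = (ss / (n - 1)) ** 0.5          # sample standard deviation
    return (max(values) - mean) / s

data = [19.0, 19.1, 18.3, 21.0, 18.0, 20.1, 20.7, 21.1, 17.4, 18.4,
        18.8, 20.8, 19.6, 25.1, 20.1, 20.2, 18.2, 20.9, 18.5, 20.4,
        23.3, 21.8, 19.6, 17.2, 20.6]

t = grubbs_t_largest(data)   # about 2.87
is_outlier = t > 2.822       # 5% critical value for n = 25
```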
Control Chart Technique
. Construct from
historical data
. Plot subsequent
data
DIXON RATIO TEST
Advantage
. simple calculations

Disadvantages
. not all data set values used
. limited to data sets with 25 values
or less
GRUBBS T TEST
Advantages
. more powerful than Dixon Ratio Test
. can be used for large data sets

Disadvantage
. involved calculations
8-8

-------
Control Chart Technique
. Detects individual
outliers

. Detects sets
of outliers
An underlying normal (Gaussian)
distribution of data is assumed.
[Figure: normal curve with the axis marked −3σ, −2σ, −1σ, µ, 1σ, 2σ, 3σ]
Treatment of Outliers
. Determine causes
. Eliminate if appropriate
8-9

-------
Section 9
Intralaboratory Testing
Lesson Goal
To familiarize you with intralaboratory testing considerations.
Lesson Objectives

At the end of this lesson, you should be able to-
1. distinguish between intralaboratory and interlaboratory testing,
2. discuss the purposes of intralaboratory testing,
3. distinguish among three levels of precision measurement: replicability,
repeatability, and reproducibility, and
4. discuss considerations necessary for designing an intralaboratory testing
program.
9-1

-------
TESTING
[Diagram: testing within Lab A (intralaboratory) and among Labs A, B,
and C (interlaboratory)]
Intralaboratory
Interlaboratory
PURPOSES OF
INTRALABORATORY
TESTING
Identify:
. sources of measurement error
Estimate:
. bias (accuracy)
. variability (replicability,
repeatability)
THREE LEVELS OF
PRECISION
MEASUREMENT
. Replicability
. Repeatability
. Reproducibility
[Diagram: nested relationship of replicability, repeatability, and
reproducibility]
9-3

-------
INTRALABORA TORY
TESTING DESIGN
CONSIDERATIONS
. types of measurement
methods
. potential sources of error
. testing philosophy
MEASUREMENT
METHODS
Manual:
. collection
. analysis
Continuous:
. collection/analysis
POTENTIAL SOURCES OF ERROR
[Diagram: error sources at each step of the measurement process]
MEASUREMENT OF
OPERATOR PROFICIENCY
Major Problems
. what kinds of audit samples to use
. how to introduce samples into
analytical process without analyst's
knowledge
. how frequently to audit
KINDS OF
AUDIT SAMPLES
. duplicate of real samples

. prepared reference
samples
9-4

-------
AUDIT SAMPLE
INTRODUCTION
. samples should have identical
sample labels and appearance
as real samples
. supervisor and analyst should
alternate the process of logging
in samples
AUDITING
FREQUENCY
Decision based on :
. degree of automation
. total method precision
. analyst's training, attitude,
and past performance
9-5

-------
Section 10
Interlaboratory Testing
Lesson Goal
To familiarize you with interlaboratory performance testing considerations and
USEPA's interlaboratory performance audit program.
Lesson Objectives

At the end of this lesson, you should be able to-
1. describe and distinguish between the two kinds of interlaboratory
tests - collaborative tests and periodic performance tests,
2. describe considerations in designing an interlaboratory performance test,
3. describe USEPA's interlaboratory performance audit program,
4. list the common types of performance audits conducted by USEPA,
5. identify the audit materials that are available from USEPA,
6. list sources of information concerning USEPA's interlaboratory performance
audit program,
7. discuss data analysis performed on the results of USEP A's interlaboratory per-
formance audits, and
8. discuss results of USEPA's interlaboratory performance audits.
10-1

-------
INTERLABORATORY TESTS
- - - - 
    INTERLABORATORY
    PERFORMANCE TEST
    . identifies biased labs (and/or
    analysts)
    . estimates "between laboratory"
    measurement method
    reproducibility
- - - - 
    CONSIDERATIONS
    IN PLANNING THE
    INTERLABORATORY
    PERFORMANCE TEST
- - - - 
. Selection of the parameter
to be tested
. automated method - total
. manual method - portion
- - - -   
    . Selection of the proper
    sample
    . collection
    . analysis
- - - -   
      10-3

-------
SAMPLE SIZE
[Slide: enough sample must be prepared for test 1, test 2, and test 3]
. Sample preparation -
ensure uniformity/stability
[Figure: sample splitting and storage]
. Sample preparation - evaluate
sample-to-sample variability
[Figure: analyses of several splits of the same sample]
. Test instructions
. clear and complete

. only one interpretation

. specify handling - routine
or special?

. specify reporting form and
units
SELECTION OF METHODOLOGY
. inter-method lab variability-

lab selects method
. same method lab variability -
specify method
Always require written copy of method used!
10-4

-------
. Report results to the labs
. timely
. confidential
. recommend corrective
action if needed
[Cartoon: letters reporting each lab's test results and enclosing
another sample with instructions]
. follow-up
RECAP
. select the parameter to be tested
. select the sample
. prepare the sample
. prepare the instructions
. provide feedback of results
. specify corrective action
. follow-up
EPA INTERLABORATORY
PERFORMANCE AUDIT PROGRAM
ENVIRONMENTAL
MONITORING SYSTEMS
LABORATORY
. sample repository

. free samples for
quality control
10-5

-------
ADDITIONAL QC SAMPLES FOR
AMBIENT AIR AND STATIONARY
SOURCE MEASUREMENTS
. cylinder gases
. filter samples
. organic gas mixtures
WRITE TO:
ENVIRONMENTAL MONITORING
SYSTEMS LABORATORY

Quality Assurance Division

US EPA, MD-77
Research Triangle Park, NC 27711
DATA DISTRIBUTION
[Figure: number of reported values vs. true value, ppm]
DISTRIBUTION WITH
OUTLIERS ELIMINATED
[Figure: number of reported values vs. true value, ppm, after
outliers are removed]
MEDIAN / MEAN VALUES
[Figure: audit median/mean values plotted against true values, ppm]
10-6

-------
% Accuracy = [(audit median or mean − true value) / true value] × 100
STANDARD DEVIATION
s = √{[Σxi² − (Σxi)²/n] / (n−1)}
COEFFICIENT OF
VARIATION
CV = (s / audit mean) × 100
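The three audit statistics above can be sketched as follows (not from the workbook; the numeric example values are assumptions for illustration):

```python
# Percent accuracy of the audit median or mean against the true value,
# and the coefficient of variation of the audit results.

def pct_accuracy(audit_value, true_value):
    # % Accuracy = (audit median or mean - true value)/true value * 100
    return (audit_value - true_value) / true_value * 100.0

def coeff_of_variation(values):
    # CV = (s / audit mean) * 100, with s the sample standard deviation
    n = len(values)
    mean = sum(values) / n
    ss = sum(v * v for v in values) - sum(values) ** 2 / n
    s = (ss / (n - 1)) ** 0.5
    return s / mean * 100.0

acc = pct_accuracy(19.4, 20.0)                 # a 3 percent low bias
cv = coeff_of_variation([19.0, 20.0, 21.0])    # 5 percent
```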
CV VERSUS AUDIT MEAN
[Figure: coefficient of variation plotted against audit mean]
1982 AMBIENT AIR AUDIT RESULTS
[Table: for each parameter, the range of accuracy (%), the range of
CV (%), and the percentage of reported values within ±10% of the
true values]

-------
METHOD 5 AUDIT RESULTS
[Figure: audit accuracy results by audit date (1979-1982), with the
number of samples shown for each audit]
METHOD 6 AUDIT RESULTS
[Figure: audit accuracy results by audit date, with the number of
samples shown for each audit]
METHOD 7 AUDIT RESULTS
[Figure: audit accuracy results by audit date, with the number of
samples shown for each audit]
SEPTEMBER 1982
COAL ANALYSIS AUDIT RESULTS
[Table: percentage of labs reporting values within ±5% and within
±10% of the true values, for each coal-analysis parameter]
JUNE 1982
METHOD 3 AUDIT RESULTS
[Table: for CO2, O2, and CO true values, the percentage of labs
reporting values within the stated tolerances of the true values]
10-8

-------
Why are audit results
optimistic?
[Cartoon illustration]
10-9

-------
Section 11
Procurement Quality Control
Lesson Goal
To familiarize you with quality control procedures for the procurement of supplies
and equipment.
Lesson Objectives

At the end of this lesson, you should be able to-
1. recall the four major groups of procured items of concern in procurement
quality control,
2. list at least two procured items from each major group that affect air
monitoring data quality,
3. describe a quality control procedure for the procurement of an ambient air
quality analyzer, and
4. describe quality control considerations in the procurement of calibration stan-
dards, chemicals, and materials.
11-1

-------
PROCUREMENT
QUALITY
CONTROL
[Diagram: the four groups of procured items - Equipment, Calibration
Standards, Chemicals, and Materials]
EQUIPMENT
CALIBRATION STANDARDS
CHEMICALS
MATERIALS
11-3

-------
PROCEDURE FOR PROCURING AN
AMBIENT AIR QUALITY ANALYZER
[1] Prepurchase Evaluation/Selection
[2] Writing of Purchase Contract
Specifications
[3] Acceptance Testing
[4] Overlap Testing
[5] Record Keeping
[1] Prepurchase Evaluation/
Selection
. analysis of analyzer performance
specifications
. assessment of analyzer
Analysis of Analyzer Performance
Specifications
Assessment of Analyzer
. review
operations
manuals
Contact
Users for
Opinions
11-4

-------
In-House
Testing
Field
Testing
Selection
of
Analyzer
[2] Writing of Purchase Contract
Specifications
. inclusion of performance specs test data
. payment contingent upon successful
acceptance testing
. inclusion of warranty
. inclusion of consistent operating manuals
. provision of operator training
. provision for burn-in
. inclusion of consumables and spare parts
[3] Acceptance Testing
[Illustration]
11-5

-------
[4] Overlap Testing
[Illustration: old and new analyzers operated side by side]
[5] Record Keeping
[Illustration]
PROCUREMENT
CONSIDERATIONS FOR
CALIBRATION STANDARDS
. Purchase Contracts

. Overlap Testing
Purchase Contracts
Requirements:
. NBS or CRM traceability
. certificate of analysis
. calibration curves
. user instructions
Overlap Testing
11-6

-------
PROCUREMENT
CONSIDERATIONS FOR
CHEMICALS
. Certified Analyses
. Overlap Testing
. Record Keeping

PROCUREMENT
CONSIDERATIONS FOR
MATERIALS
. Performance Parameter Specs
. Acceptance Testing
. Overlap Testing
11-7

-------
1979 - ASQC TECHNICAL CONFERENCE TRANSACTIONS - HOUSTON
QUALITY ASSURANCE FOR PROCUREMENT
OF AIR ANALYZERS
Mary Jo Kopecky and Bruce Rodger
Wisconsin Department of Natural Resources
Madison, Wisconsin

ABSTRACT

Ambient air monitoring in the vicinity of a point source requires different
characteristics in an analyzer than monitoring for background data in an area
where there are no point sources. Different degrees of sensitivity, different
response times, and the degree of automation required will differ in each
setting.

Before purchasing an analyzer the user must, therefore, define his needs in
terms of sensitivity, accuracy, data completeness, response to changes in ambient
concentrations, reliability and maintainability, degree of automation, ease of
operation, and cost. The Wisconsin Department of Natural Resources has esta-
blished a program of procurement quality assurance to both define the user's
needs and to evaluate the ability of different analyzers to meet these needs.
This program is divided into four stages: 1) User Needs Analysis, 2) Pre-
Purchase Evaluation, 3) Purchase Specifications and Contract Conditions, and 4)
Acceptance Testing.

This four stage process was applied in the recent purchase of twelve sulfur
dioxide analyzers for the Department's Monitoring Program. Surprisingly, the
instrument that looked the best at the beginning of the pre-purchase evaluation,
and toward which the user group was leaning, was not the analyzer that scored
highest in the final evaluation. As a result of the Department's evaluation
process, a different analyzer was purchased. By defining the user needs in
quantifiable form, and then objectively measuring the ability of different
analyzers to meet these needs, the Department of Natural Resources has assured
itself of purchasing the best available analyzer that can do the job required.

INTRODUCTION

Environmental Protection Agency regulations state that no later than
February 1980, all ambient air analyzers used in state monitoring programs as
specified in their state implementation plan must be approved reference or
equivalent analyzers. For most states this will mean replacing "obsolete"
analyzers with new models. The money spent on this new equipment in the next
few years could easily reach ten million dollars. Unless state agencies and
private air monitoring groups take precautions, newly purchased analyzers may
not meet their needs, or if they do, it may be at an excessive cost. To avoid
such problems, a Quality Assurance Plan for procurement of analyzers, and other
capital purchases, is desirable.

The Wisconsin Department of Natural Resources (DNR) has developed such a
plan for its instrument procurement and has recently used the plan in the purchase
of sulfur dioxide analyzers for its statewide monitoring network. This paper
describes the general features of the DNR procurement plan, and how the plan was
applied in selecting a specific model of sulfur dioxide analyzer for Wisconsin.
This plan provided DNR with an objective means of selecting an analyzer which
best meets the needs and resources of the agency. It has general applicability
to all agencies and to private consultants and corporations as well.

The plan consists of three parts:

I. Pre-purchase evaluation and selection of the analyzer.
II. Purchase Contract Specifications based on the pre-purchase evaluation.
III. Acceptance testing of the purchased analyzers.
"Copyright 1979
American Society for Quality Control, Inc.
Reprinted by permission."
11-9

-------
PRE-PURCHASE EVALUATION

The pre-purchase evaluation defines the specifications that the analyzer
must meet and is then used to determine which analyzer best meets these
specifications.

1. Analysis and Ranking of Performance Needs

Before evaluating individual analyzers, the performance required of the
analyzer must be defined. Where will the analyzer be used - around a point
source where concentrations of sulfur dioxide exceeding 500 parts per billion
are not uncommon, or in a rural setting where values as high as 50 parts per
billion are quite rare? What levels of accuracy and precision are needed? What
should the response time of the analyzer be? Do the expected ambient concen-
trations change rapidly or over a period of hours? What maintenance requirements
does the analyzer have - will operators attend the site daily, or only once per
week? How much funding is available for this purchase?

Once the performance specifications are defined, they are ranked in order
of their importance to the monitoring network. The most important specification
receives the highest number and the least important specification receives a
ranking of "1".

2. Instrument Assessment

An evaluation of each specific type of instrument must be made to decide
which analyzers should be brought to the lab for further checkout. This assess-
ment is a two step process.

a. The advantages and disadvantages of each type of instrument are
determined by evaluating information provided by the manufacturer, as well as
that found in the analyzer's operating manual. This involves a comparison of
measurement principles, performance characteristics, and the relative complexity
of operation.

b. Several users of each analyzer are contacted to check on the analyzer's
performance in the field. A user contact questionnaire which was developed by
DNR determines such information as the percent of valid data capture, the
average number of instrument breakdowns since the analyzers were purchased, the
parts replaced most frequently, and the percent down time anticipated.

The analyzer's ability to meet each of the performance specifications is
converted to a numerical rating, with the highest number assigned to the analyzer
which best meets the specification. The rating is multiplied by the ranking
assigned that specification in the earlier needs analysis. This process is
repeated for each specification, and the results for all specifications are
summed. The result is a ranking of instruments according to their apparent
ability to meet the performance specifications. The top rated analyzers
are then evaluated further.

3. Instrument Testing

The analyzers with the highest scores in the Instrument Assessment
are subjected to a laboratory checkout to determine which analyzer should be
purchased. The laboratory testing consists of evaluating the critical performance
parameters identified in the earlier needs analysis. For example, if low
ambient levels are routinely measured, instrument noise will be an important
parameter. Each instrument is then checked for its noise level using the
techniques ...
-------
PURCHASE CONTRACT SPECIFICATIONS

The performance specifications for the instrument with the highest ranking
are written into the contract for purchase. The purchase contract specifies a
60-day period, after instrument delivery, in which DNR can evaluate each instru-
ment to assure that each one meets the performance specifications written into
the contract. Instruments not meeting the specifications can be returned to the
manufacturer for replacement, without charge to DNR.

The contract also requires the vendor to post a performance bond - 20% of
the total purchase price - for a one year period. The bond would be forfeited
for:

a. failure of any instrument to meet the performance specifications for
at least one year,
b. failure of the vendor to honor a one year warranty on all instrument
components,
c. failure of the vendor to provide a substitute analyzer to replace a
faulty analyzer being repaired under the one year warranty, and
d. failure of any instrument to operate properly for more than 30 days
during the first year of operation.

These contract specifications help insure that DNR will have reliable,
functioning analyzers providing maximum data capture.

ACCEPTANCE TESTING

Before a new instrument is considered capable of generating valid ambient
air quality data, it must be checked to insure that it meets the performance
specifications in the purchase contract. As each instrument is received it is:

1. Inspected to be sure that all parts and optional equipment are present,
connections are tight, and that each analyzer is configured the same way - same
number of circuit boards, same type and size of pumps, etc.
2. Operated in the laboratory for at least one week to detect immediate
malfunctions due to defective parts, poor connections, etc.
3. Tested for critical parameters - e.g., the noise level.

In addition, a random sampling of analyzers is chosen and more in-depth
performance checks are conducted. If these checks fail to meet the performance
specifications in the purchase contract, all analyzers will be checked in-depth.

Instruments passing through this process without problems are placed at
monitoring sites and run simultaneously with the "old" analyzers for at least 30
days. The data obtained is used to determine if the new analyzer is functioning
properly, and also to establish any difference in the data base due to the
switch to the new analyzer. It is important to have this information when
evaluating data from a site over a period of years.

PRACTICAL APPLICATION OF PROCUREMENT PLAN

The procedures previously discussed were used during 1978 by
the State of Wisconsin to purchase 12 new sulfur dioxide analyzers. The first
step in this process was to perform a needs analysis. This analysis indicated
we were required to generate valid continuous ambient sulfur dioxide data at
seven permanent stations in the Milwaukee area and at three mobile vans which
collect data statewide. Also, there was a requirement to obtain continuous SO2
data from sites in Green Bay and Madison. As mentioned earlier, by February
1980, all ambient air analyzers in state monitoring programs must be approved
reference or equivalent model analyzers. Therefore, it was determined that the
state needed to purchase 12 sulfur dioxide monitors approved by EPA as being a
reference or equivalent method. In addition to this basic need, the following
items were also specified in the analysis:
11-11

-------
1) Generation of continuous SO2 data.
2) Operation unattended for long periods of time (over weekends, etc.).
3) Generation of valid SO2 data in areas of both high and low ambient con-
centrations ...
-------
The above information for all of the users questioned for each analyzer was
put in table form. Tables III-VI at the end of this report contain that data.
Each manufacturer was then contacted again and asked about the following:

1) Location of factory repair service and response time
2) Warranty terms
3) Auto zero/span availability
4) Standard instrument ranges
5) Unit cost of instrument with auto zero/span and amount of discount
with multiple order

This information was also placed in table form (Table VII) for all the
analyzers to allow for ease of comparison between analyzers. Also considered in
the pretesting segment of the procurement process were the following:

1) Vendor cooperation for pre-purchase agreement concerning in-house
testing - this involved contacting each vendor to determine if they
would allow us to use an analyzer of theirs, without cost, for a
period of up to three months for the purpose of performance testing.
2) Required support equipment, e.g., electronic equipment, gas cylinders,
high mortality parts, etc.
3) Conformity to existing calibration devices and site sampling manifolds.
4) Conformity to existing data acquisition systems and ability to be rack
mounted.

The above information was also placed in a table (Table VIII) to allow for
comparison between the analyzers. Finally a table (Table IX) of major advantages
and disadvantages for each of the analyzers was drawn up for consideration in
determining which three analyzers should be chosen for in-house testing.

To determine which three analyzers would be tested we used a total point-
rating system. Each of the criteria examined in the pretesting data search
was rated from 1-6 depending upon its degree of importance. In our particular
situation noise and precision were considered very important and were given a
rating of 6. Sample flow, not considered as important, was given a rating of 2.
Each analyzer was ranked from 1-5 depending upon how favorably it compared to
other analyzers being checked for a particular criterion. A ranking of 5 meant
that the analyzer was best among the analyzers considered for that particular
criterion. To determine the number of points each analyzer received for each
criterion, the rating and ranking numbers were multiplied together. These
products were then summed for each analyzer. The analyzers with the highest
total points would be the ones chosen for in-house testing. The pretesting work
indicated that analyzers A, B, and C should be chosen for further testing. At
this point in the procurement process analyzer B was still the favored analyzer.
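The rating-times-ranking arithmetic can be sketched as follows; the criteria weights and per-analyzer rankings here are hypothetical stand-ins for illustration, not the Department's actual scores:

```python
# Hypothetical sketch of the total point-rating scheme: importance rating
# (1-6) per criterion, analyzer ranking (1-5) per criterion, points =
# rating x ranking summed over all criteria. Values are illustrative only.
ratings = {"noise": 6, "precision": 6, "sample_flow": 2}

rankings = {
    "A": {"noise": 5, "precision": 4, "sample_flow": 3},
    "B": {"noise": 4, "precision": 5, "sample_flow": 5},
    "C": {"noise": 3, "precision": 3, "sample_flow": 4},
}

def total_points(analyzer):
    """Sum of rating x ranking over all criteria for one analyzer."""
    return sum(ratings[c] * rankings[analyzer][c] for c in ratings)

scores = {a: total_points(a) for a in rankings}
print(scores)   # {'A': 60, 'B': 64, 'C': 44}
```

With these illustrative numbers analyzer B scores highest, so the top-scoring analyzers would advance to in-house testing.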
In-house testing performed on the analyzers generated test data concerning
the following parameters:

1) Noise - zero baseline (0% of full scale)
2) Zero Drift - 12-hour and 24-hour
3) Span Drift - at 20% of full scale and at 80% of full scale
4) Precision - at 80% and 20% of full scale
5) Lag, Fall, Rise, and Calibration Times
11-13

-------
The testing procedures followed were taken from the Federal Register, Vol.
40, No. 33, Part II, Ambient Air Monitoring Reference and Equivalent Methods.
Company C was slow in providing us with an analyzer for testing. We were not
able to complete all the testing procedures on that analyzer. Results of the
testing were summarized in a table (Table X). Prior to the in-house testing we
had feared that response time for analyzer A would be too slow for our needs.
Analyzer B was expected to have the most rapid response time. The surprising
test results indicated that analyzer A had a more rapid response time than
analyzers B and C.

Next analyzers A and B were moved to an active monitoring site where they
were installed and operated for a two week period as if they were being used to
routinely collect ambient SO2 data. This included routine calibrations and
zero/span checks. Testing was also done at the monitoring site to determine if
analyzer response was adversely affected by any interferences. The analysis
method for analyzer B was flame photometry. A Technical Assistance Document
(EPA-600/4-78-024) concerning the use of flame photometric detectors for mea-
surement of SO2 in ambient air referred to a suppression of analyzer response
for this method by carbon dioxide (CO2). We discovered at this point in the
testing that analyzer B was subject to this interference from CO2. We also
found that analyzer B was less stable than analyzer A during calibration and
zero/span checking.

SELECTION OF ANALYZER AND COMMENTS

At the end of the testing we had obtained sufficient information to allow
a decision to be made on instrument procurement. Copies of all the data generated
during the procurement process were distributed to all DNR parties affected by
the instrument purchase. A meeting between these parties was held to decide on
which analyzer to purchase. All the data were reviewed and the advantages and
disadvantages of each of the analyzers were discussed. As mentioned earlier,
analyzer B was heavily favored before the procurement process began. However,
as a result of the data collected and testing done, analyzer A (T.E.C.O. Model
43) was chosen as the analyzer which would best satisfy user needs expressed
early in the needs analysis. Had we not been involved objectively in this procure-
ment process, it is possible we would have purchased analyzer B, and its associated
problems, without giving full consideration to the T.E.C.O. We intend on using
this procurement process for purchases of all capital equipment in the future and
strongly recommend that other agencies use this or a similar process for all their
equipment purchases.
11-14

-------
Section 12
Performance Audits
Lesson Goal
To familiarize you with performance auditing considerations (especially when con-
ducting performance audits of continuous ambient air quality analyzers).
Lesson Objectives

At the end of this lesson, you should be able to-
1. distinguish between a performance audit and a system audit,
2. describe differences in performance audit procedures for continuous versus
manual measurement methods,
3. list the four purposes of performance audits, and
4. describe considerations in conducting performance audits of continuous
ambient air quality analyzers.
12-1

-------
PERFORMANCE
AUDITS

AUDITS
Performance
. Quantitative
System
. Qualitative
PURPOSES OF PERFORMANCE AUDITS
. identify sensors operating out-of-control

. identify systematic bias of monitoring
network

. measure improvement in data quality
. assess accuracy of monitoring data
PERFORMANCE AUDITS
Continuous
. sampling/analysis/data reduction
Manual
. sampling
. analysis
. data reduction
PROCEDURE FOR MANUAL METHODS
QA Handbook
Vol. II
Sec. 2.0.8
Table 8.1
Figure 8.1
. Sampling - check flow rate
with rotameter

. Analysis - analyze reference
samples

. Data Reduction - perform
independent calculations

. Plot audit results on
control chart
12-3

-------
PROCEDURE FOR CONTINUOUS
AMBIENT AIR ANALYZERS
1. Select audit materials
2. Select audit concentration levels
3. Determine auditor's proficiency
4. Select out-of-control limits
5. Establish communications system
6. Conduct audit
7. Verify stability of audit materials
8. Prepare audit report
9. Follow up audit recommendations
1. Select audit materials
. high concentration audit cylinder
with dilution system
. low concentration audit cylinder
. permeation tube
12-4

-------
. use materials traceable to NBS
Standard Reference Materials (SRM)
or CRMs (CO, SO2, NO)

2. Select audit concentration levels

3. Determine auditor's proficiency

Cylinder No. | Known Concentration Value | Auditor's Measured Concentration Value
1 | ... | ...
2 | ... | ...
3 | ... | ...

4. Select out-of-control limits

% Diff. = (analyzer value - known value) / known value x 100

5. Establish communications system
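The out-of-control screen in step 4 is easy to compute directly; a minimal sketch, where the 15% limit is a hypothetical choice since the slides leave limit selection to the agency:

```python
def percent_difference(analyzer_value, known_value):
    """% Diff. = (analyzer value - known value) / known value x 100."""
    return (analyzer_value - known_value) / known_value * 100.0

def out_of_control(analyzer_value, known_value, limit=15.0):
    """Flag an audit point whose |% difference| exceeds the limit (hypothetical 15%)."""
    return abs(percent_difference(analyzer_value, known_value)) > limit

# an analyzer reading 0.37 ppm against a 0.40 ppm audit concentration
d = percent_difference(0.37, 0.40)
print(round(d, 1), out_of_control(0.37, 0.40))   # -7.5 False
```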
12-5

-------
6. Conduct audit

7. Verify stability of audit materials

8. Prepare audit report

9. Follow up audit recommendations
12-6

-------
Section 13
System Audits
Lesson Goal
To familiarize you with system auditing procedures.
Lesson Objectives

At the end of this lesson, you should be able to-
1. state the purpose of system auditing,
2. recognize items that should be evaluated during a system audit, and
3. describe the procedure for conducting a system audit.
13-1

-------
SYSTEM AUDIT
. independent, on-site
inspection and review
of quality assurance
system
. qualitative appraisal
of system

PROCEDURE FOR
CONDUCTING A SYSTEM AUDIT
. Prepare questionnaire
. Review questionnaire
. Identify weaknesses/prepare check list
. Arrange entrance interview
. Perform audit
. Conduct exit interview
. Prepare report
. Follow up recommendations

. Prepare pre-audit survey questionnaire
(Yes/No items):
Organizational Chart
SOPs
Personnel/Training
Facilities
Equipment/Supplies
Methodology
Data Handling
Quality Assurance

. Review completed
questionnaire

. Identify
organization's
weaknesses
and prepare
audit check list
13-3

-------
. Arrange entrance interview
-------
Section 14
Quality Assurance Requirements
for SLAMS and PSD
Lesson Goal
To familiarize you with quality assurance regulations pertaining to ambient air
quality monitoring (especially data-quality assessment in terms of precision and ac-
curacy requirements).
Lesson Objectives

At the end of this lesson, you should be able to-
1. briefly describe the Standing Air Monitoring Work Group (SAMWG) and its
major quality assurance finding and recommendation,
2. list the four types of ambient air monitoring stations defined in 40 CFR
Part 58,
3. list the appendixes of 40 CFR Part 58 that describe quality assurance
requirements for ambient air monitoring,
4. recognize that Appendixes A and B describe quality assurance requirements
for SLAMS and PSD stations, respectively,
5. list the two quality assurance functions required by 40 CFR Part 58 Appen-
dixes A and B,
6. describe air monitoring activities that must be addressed by the quality
assurance program,
7. distinguish between precision and accuracy,
8. recognize the need for precision and accuracy assessments,
9. describe the precision and accuracy checks required for manual and
automated measurement methods,
10. compute precision and accuracy assessments for manual and automated
measurement methods (given necessary equations),
11. describe quality assurance reporting requirements, and
12. compare and contrast quality assurance requirements for SLAMS and PSD
stations.
14-1

-------
QUALITY
ASSURANCE
FOR SLAMS
AND PSD

STANDING
AIR
MONITORING
WORK GROUP
(SAMWG)

MAJOR QA
FINDING
. questionable data quality

MAJOR QA
RECOMMENDATION
. establish formal QA
programs

40 CFR 58
14-3

-------
MONITORING STATIONS

SLAMS - State and Local Air Monitoring Stations
NAMS - National Air Monitoring Stations
SPMS - Special Purpose Monitoring Stations
PSD - Prevention of Significant Deterioration

40 CFR 58

APPENDIX A - "Quality Assurance
Requirements for State
and Local Air Monitoring
Stations (SLAMS)"

APPENDIX B - "Quality Assurance
Requirements for
Prevention of Significant
Deterioration (PSD) Air
Monitoring"
APPENDIX A
QA Functions
. control requirements
. data quality assessment
CONTROL REQUIREMENTS
. are in general terms
. states to develop and implement a
QA program which will:
. provide data of adequate quality to meet
monitoring objectives
. minimize loss of air quality data due to
malfunction or out-of-control conditions

. must be approved by Regional
Administrator
GUIDANCE
. Quality Assurance Handbook for Air
Pollution Measurement Systems
. Volume I - Principles
. Volume II - Ambient Air Specific Methods

. Reference and Equivalent Methods given in
40 CFR 50 and 40 CFR 53

. Operation and Instruction Manuals of
Designated Analyzers
14-4

-------
- - -  
    PROGRAM CONTENT
    . method or analyzer selection
    . equipment installation
    . calibration
    . zero and span checks and adjustment
    . quality control checks
    . control limits for zero, span and other
    quality control checks - corrective action
- - -  
    PROGRAM CONTENT
   (continued)
   . use of multiple ranges
   . preventive maintenance
. quality control procedures for
episode monitoring
   . recording and validating data
   . documentation of QC information
- - -  
TRACEABILITY REQUIREMENTS
. gaseous standards for CO, SO2, and
NO2 traceable to NBS or CRM
. O3 test concentrations measured by
UV photometer
. flow measuring instruments traceable
to authoritative volume
- - -  
EPA INTERLABORATORY
PERFORMANCE AUDIT PROGRAM
LAB
- - -  
    EPA SYSTEM AUDIT
    . facilities
    . equipment
    . procedures
    . documentation
    . personnel
    . (all 23 QA elements)
- - -  
   14-5

-------
QA program reviewed for:

. adequacy
. compliance
DATA QUALITY
ASSESSMENT
. Precision
. Accuracy
PRECISION AND ACCURACY
PRECISION REFERS
TO REPRODUCIBILITY

ACCURACY REFERS
TO CORRECTNESS

Precision is good but
accuracy is poor.

Accuracy is good but
precision is poor.

Both precision
and accuracy
are good.
IMPORTANCE OF
PRECISION AND ACCURACY
DETERMINATIONS
. needed to determine quality of data
recorded

. useful for data validation

. minimizes generation of erroneous data
14-6

-------
MANUAL METHODS
Precision Checks
(table: collocated-sampler precision check requirements for TSP, SO2, NO2, and Pb)

MANUAL METHODS
Accuracy Checks
(table: audit check requirements for TSP, SO2, NO2, and Pb)
MANUAL
METHODS
Precision

COLLOCATED SAMPLER DATA BY SITE

Day | Duplicate Sampler | Official Sampler | Difference | di (%)
1 | Y1 | X1 | Y1 - X1 | (Y1 - X1)/X1 x 100
... | ... | ... | ... | ...
n | Yn | Xn | Yn - Xn | (Yn - Xn)/Xn x 100

95% probability limits = d̄j ± 1.96 sj

d̄j = Σ di / n   (sum over i = 1 to n)

sj = [ (Σdi² - (Σdi)²/n) / (n - 1) ]^1/2
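The site-level statistics above can be sketched in code; the three collocated pairs used below are hypothetical values for illustration:

```python
# Site-level precision from collocated samplers: percent differences d_i,
# their average d_bar, standard deviation s, and 95% probability limits.
# The duplicate/official pairs below are hypothetical.
import math

def site_precision(duplicate, official):
    d = [(y - x) / x * 100.0 for y, x in zip(duplicate, official)]
    n = len(d)
    d_bar = sum(d) / n
    s = math.sqrt((sum(di * di for di in d) - sum(d) ** 2 / n) / (n - 1))
    return d_bar, s, d_bar - 1.96 * s, d_bar + 1.96 * s

d_bar, s, lo, hi = site_precision([86.0, 119.9, 116.4], [81.9, 113.6, 121.7])
print(round(d_bar, 2), round(s, 2))   # 2.07 5.57
```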
14.7

-------
COLLOCATED SAMPLER DATA BY
REPORTING ORGANIZATION

Site | No. of Collocated Pairs | d̄j | sj
1 | n1 | d̄1 | s1
... | ... | ... | ...
k | nk | d̄k | sk

95% probability limits = D̄ ± 1.96 sa

D̄ = (n1·d̄1 + n2·d̄2 + ... + nk·d̄k)/(n1 + n2 + ... + nk)

sa = [ ((n1-1)s1² + (n2-1)s2² + ... + (nk-1)sk²) / ((n1 + n2 + ... + nk) - k) ]^1/2
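The pooling across sites can be sketched the same way, with hypothetical per-site inputs:

```python
# Pooling per-site averages and standard deviations into the
# reporting-organization statistics D and s_a. Inputs are hypothetical.
import math

def pooled_precision(n, d_bar, s):
    k = len(n)
    D = sum(nj * dj for nj, dj in zip(n, d_bar)) / sum(n)
    s_a = math.sqrt(sum((nj - 1) * sj ** 2 for nj, sj in zip(n, s))
                    / (sum(n) - k))
    return D, s_a, D - 1.96 * s_a, D + 1.96 * s_a

# two sites, three collocated pairs each (hypothetical values)
D, s_a, lo, hi = pooled_precision([3, 3], [3.0, -1.0], [2.0, 1.5])
print(round(D, 2), round(s_a, 2))   # 1.0 1.77
```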
COLLOCATED TSP SAMPLERS
To determine 95% probability limits for precision.

Given: duplicate and official sampler TSP concentrations (µg/m³) for three
days at each of two sites.

FOR SITE 1
Day | Duplicate Sampler | Official Sampler | Difference | di (%)
... | ... | ... | ... | ...
d̄1 = 3.80
s1 = 1.81

FOR SITE 2
Day | Duplicate Sampler | Official Sampler | Difference | di (%)
... | ... | ... | ... | ...
d̄2 = 1.90
s2 = 1.55

Site | No. of Collocated Pairs | d̄j | sj
1 | 3 | 3.80 | 1.81
2 | 3 | 1.90 | 1.55

D̄ = [(3)(3.80) + (3)(1.90)] / (3 + 3) = 2.85

sa = [ ((3-1)(1.81)² + (3-1)(1.55)²) / ((3+3) - 2) ]^1/2 = 1.68

95% probability limits = D̄ ± 1.96(sa)
= 2.85 ± 3.30
= +6.15% or +06
= -0.45% or -00
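The example's arithmetic can be checked directly, assuming the site statistics recovered from the tables (d̄1 = 3.80, s1 = 1.81, d̄2 = 1.90, s2 = 1.55, three pairs per site):

```python
# Checking the collocated-TSP example: two sites, three pairs each.
# Site statistics are as recovered from the worked example.
import math

n1, d1, s1 = 3, 3.80, 1.81
n2, d2, s2 = 3, 1.90, 1.55

D = (n1 * d1 + n2 * d2) / (n1 + n2)
s_a = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / ((n1 + n2) - 2))
lower = D - 1.96 * s_a
upper = D + 1.96 * s_a
print(round(D, 2), round(lower, 2), round(upper, 2))   # 2.85 -0.45 6.15
```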
MANUAL
METHODS
Accuracy
ACCURACY DATA BY REPORTING ORGANIZATION

Audit | Measured Value | True Value | Difference | di (%)
1 | Y1 | X1 | Y1 - X1 | (Y1 - X1)/X1 x 100
... | ... | ... | ... | ...
k | Yk | Xk | Yk - Xk | (Yk - Xk)/Xk x 100

95% probability limits = D̄ ± 1.96 (sa)

D̄ = (d1 + d2 + ... + dk)/k

sa = [ (1/(k-1)) ( Σ di² - (1/k)(Σ di)² ) ]^1/2   (sums over i = 1 to k)
14-9

-------
AUTOMATED METHODS
Internal Checks

Pollutant | Precision | Accuracy
SO2 | Precision Check | Local Audit
CO | Precision Check | Local Audit
NO2 | Precision Check | Local Audit
O3 | Precision Check | Local Audit

Precision checks: biweekly
Local audits: 25% of analyzers each quarter (at least 1 per quarter);
all analyzers each year

AUTOMATED METHODS
External Audits

Pollutant | Performance | System
SO2 | EMSL | annual
CO | EMSL | annual
NO2 | - | annual
O3 | - | annual
AUTOMATED
METHODS
Precision

CO PRECISION CHECKS BY ANALYZER 1

Biweekly Check | Analyzer Value (ppm) | Known Value (ppm) | Difference | di (%)
1-6 | ... | ... | ... | ...
d̄1 = -0.95
s1 = 0.69

95% probability limits = d̄1 ± 1.96 (s1)
= -0.95 ± 1.96 (0.69)
= +0.40% or +00
= -2.30% or -02
14-10

-------
-
-
-
-
CO PRECISION CHECKS
BY REPORTING ORGANIZATION

Analyzer | No. of Biweekly Checks | d̄j | sj
1 | n1 | d̄1 | s1
... | ... | ... | ...
k | nk | d̄k | sk

D̄ = ( Σ nj·d̄j ) / ( Σ nj )   (sums over j = 1 to k)

sa = [ Σ(nj - 1)sj² / Σ(nj - 1) ]^1/2

95% probability limits = D̄ ± 1.96(sa)
- - - -  
Analyzer | No. of Biweekly Checks | d̄ | s
1 | 6 | -0.95 | 0.69
2 | 6 | +1.03 | 0.94
3 | 6 | -1.76 | 0.51

D̄ = [(6)(-0.95) + (6)(+1.03) + (6)(-1.76)] / (6 + 6 + 6)
= -0.56

sa = [ ((5)(0.69)² + (5)(0.94)² + (5)(0.51)²) / (5 + 5 + 5) ]^1/2
= 0.73

95% probability limits = D̄ ± 1.96 (sa)
= -0.56 ± 1.96 (0.73)
= +0.87% or +01
= -1.99% or -02
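The CO pooling example can be verified the same way; note that the slide's +0.87 and -1.99 follow from rounding sa to 0.73 before multiplying by 1.96:

```python
# Checking the CO pooling example: three analyzers, six biweekly checks each.
# Per-analyzer values are as read from the slide.
import math

n = [6, 6, 6]
d_bar = [-0.95, +1.03, -1.76]
s = [0.69, 0.94, 0.51]

D = sum(nj * dj for nj, dj in zip(n, d_bar)) / sum(n)
s_a = math.sqrt(sum((nj - 1) * sj ** 2 for nj, sj in zip(n, s))
                / sum(nj - 1 for nj in n))
# the slide rounds s_a to two decimals before computing the limits
lower = round(D, 2) - 1.96 * round(s_a, 2)
upper = round(D, 2) + 1.96 * round(s_a, 2)
print(round(D, 2), round(s_a, 2))   # -0.56 0.73
```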
- - - -    
      14-11

-------
AUTOMATED
METHODS
Accuracy

ACCURACY DATA BY
CONCENTRATION LEVEL

Audit | Observed Level | Known Level | Difference | di (%)
1 | Y1 | X1 | Y1 - X1 | (Y1 - X1)/X1 x 100
... | ... | ... | ... | ...
k | Yk | Xk | Yk - Xk | (Yk - Xk)/Xk x 100

95% probability limits = D̄ ± 1.96 (sa)

SO2 AUTOMATED METHOD
LEVEL 3 (0.35 - 0.45 ppm)

Audit | Observed Level | Known Level | Difference | di (%)
1 | 0.39 | 0.43 | -0.04 | -9.3
2 | 0.41 | 0.43 | -0.02 | -4.7
3 | 0.45 | 0.44 | +0.01 | +2.3

D̄ = -3.9
sa = 5.8

95% probability limits = D̄ ± 1.96 (sa)
= -3.9 ± 1.96 (5.8)
= +7.5% or +08
= -15.3% or -15
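The Level 3 accuracy computation can be checked from the audit concentrations as read from the slide, with each di rounded to one decimal before averaging:

```python
# Checking the SO2 Level 3 accuracy example from the audit concentrations.
# Observed/known pairs are as read from the slide.
import math

observed = [0.39, 0.41, 0.45]   # analyzer responses, ppm
known = [0.43, 0.43, 0.44]      # audit concentrations, ppm

d = [round((y - x) / x * 100.0, 1) for y, x in zip(observed, known)]
k = len(d)
D = sum(d) / k
s_a = math.sqrt((sum(di ** 2 for di in d) - sum(d) ** 2 / k) / (k - 1))
lower = D - 1.96 * round(s_a, 1)
upper = D + 1.96 * round(s_a, 1)
print(d, round(D, 1), round(s_a, 1))   # [-9.3, -4.7, 2.3] -3.9 5.8
```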
REPORTING
REQUIREMENTS

SLAMS
. pooled quarterly precision and
accuracy averages
. reported through EPA Regional
Office to EMSL within 90 days
after end of quarter
14-12

-------
REPORTING
ORGANIZATION
A state or subordinate organization
responsible for a set of stations
which monitor the same
pollutant and for which precision
and accuracy assessments can be
pooled.
A reporting organization
should usually have:
. common team of field
operators
. common calibration facilities
. common laboratory support
PRECISION AND ACCURACY
SUMMARY ANALYSIS
. quarterly summary analysis from
EMSL to states - within 6 months
after end of each quarter
. annual summary analysis from
EMSL to states - within 9 months
after end of year
EPA REGIONAL SYSTEM AUDIT
. Verbal Report
From: Regional Audit
Team
To: Auditee
When: Immediately
following audit
. Written Report
From: Regional Audit
Team
To: Auditee
Copy: State
When: Within 1 month
of audit
. Annual Regional
Summary

From: EPA Regional
Offices
To: States/EMSL
When: Within 6
months after
end of year
. Annual National
Summary
From: EMSL
To: States (EPA
Regional Offices)
When: Within 12
months after
end of year
14-13

-------
EMSL PERFORMANCE AUDITS
. True Values (written)
From: EMSL
To: Each Reporting
Organization
When: Within 1
month after
each audit

. Annual Summary
Report
From: EMSL
To: Regions, States,
Reporting
Organizations
When: Within 9
months after
end of year
APPENDIX B
Quality assurance
requirements are the
same as Appendix A
requirements except
for the following:
APPENDIX B

Topic | Appendix A | Appendix B
Types of air quality data | SLAMS | PSD
Monitoring duration | indefinite | usually up to 12 months
QA responsibility | state and local agencies | source owner or operator
Reporting periods | calendar quarter and year | each sampling quarter
14-14

-------
DATA ASSESSMENT REPORT

STATE | REPORTING ORGANIZATION | YEAR | QUARTER
SEND COMPLETED FORM TO REGIONAL OFFICE WITH COPY TO EMSL/RTP
NAME OF REPORTING ORGANIZATION

AUTOMATED ANALYZERS

PRECISION
For each pollutant (A. CO, B. NO2, C. O3, D. SO2, E. other): no. of
analyzers,1 no. of precision checks, and lower and upper probability limits.

ACCURACY
For each pollutant: source of local primary standard traceability,2 no. of
audits, lower and upper probability limits at audit Levels 1-3, and no. of
audits and limits at Level 4.

1 COUNT ONLY REFERENCE OR EQUIVALENT MONITORING METHODS
2 Identify according to the following code:
A. NBS SRM
B. EMSL REFERENCE GAS
C. VENDOR CRM
D. PHOTOMETER
E. BAKI
F. OTHER, SPECIFY

-------
[Figure: Data Assessment Report form, manual methods (OMB No. 151-R0012). The
form records the reporting state and organization, year, and quarter, and is
sent to the Regional Office with a copy to EMSL/RTP. For each method, the
precision section records the number of samplers, the number of collocated
sites, the number of collocated samples, the applicable probability limits, and
the number of valid collocated samples.]
-------
Precision and Accuracy Data from State and local
Air Monitoring Networks: Meaning and Usefulness

Raymond C. Rhodes
U.S. Environmental Protection Agency
Presented at the 73rd APCA Annual Meeting and Exhibition in Montreal, Quebec, Canada,
June 1980.

Raymond C. Rhodes is a Quality Assurance Specialist for the U.S. Environmental Protec-
tion Agency. He received a B.S. degree in Chemical Engineering and an M.S. degree in
Statistics from Virginia Polytechnic Institute. He has more than 30 years of experience in
quality assurance work. He is a fellow of the American Society for Quality Control (ASQC),
a past chairman of its Chemical Division, and currently chairman of the Environmental
Technical Committee.

Raymond C. Rhodes
Quality Assurance Specialist
Quality Assurance Division (MD- 77)
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
14-17

-------
Precision and Accuracy Data from State and Local
Air Monitoring Networks: Meaning and Usefulness

R. C. Rhodes
Introduction

Appendix A of the EPA air monitoring regulations of May 10, 1979 includes requirements
aimed at improving the quality of air monitoring data obtained by state and local networks.
The requirements involve such aspects as network design, site and probe location, use of
reference or equivalent methodology, and the establishment and documentation of quality
assurance programs. State and local agencies are also required to perform special checks to
determine the precision and accuracy of their pollutant measurement systems and to report
the results of the checks to EPA Regional Offices and to the Environmental Monitoring
Systems Laboratory (EMSL) at Research Triangle Park, North Carolina. The requirements for
reporting precision and accuracy data are effective January 1, 1981.
Precision and Accuracy

Precision and accuracy are two fundamental measures of the quality of data from a
measurement process. Simply stated, "precision" is a measure of repeatability of the measure-
ment process when measuring the same thing, and "accuracy" is a measure of closeness of
an observed measurement value to the truth. Precision and accuracy of air monitoring or
measurement data cannot be ascertained from the data themselves, but require the use of
specially planned checks from which precision and accuracy can be estimated.
Precision

In general, precision can be determined under various conditions. For example, precision
will be better when repeated laboratory measurements are made with a single instrument on
the same day and by the same analyst than when the repeated measurements are made on
different instruments, on different days, and by different analysts. The conditions under which
precision is measured are carefully defined in the regulation to properly interpret and use the
estimates and to assure comparability of the precision estimates.
Because all components of a total measurement process contribute error to a reported
value, it is necessary to determine the precision under conditions which involve all com-
ponents of the measurement process. For air monitoring systems, the best and easiest way to
accomplish this is to use duplicate, or collocated, measurement systems to obtain duplicate
results when sampling the same air. The agreement between the results is a measure of preci-
sion of the entire measurement process.
For manual methods, the regulations specify the technique of using collocated samplers for
estimation of precision. Not only does this technique involve all parts of the total measure-
ment process, but it determines the precision using actual concentrations of pollutants in the
ambient air.
For automated analyzers, the use of collocated sampling instruments would be best to
measure repeatability. However, the cost would be prohibitive. The next most desirable
technique would be to perform "span" checks at approximately ambient concentration levels
at random points in time between successive instrument adjustments. In this way, the preci-
sion is a measure of instrument drift from the time of the most recent instrument adjustment
or calibration to the time of the precision check. The regulations require the precision checks
to be made at two-week intervals or more frequently. Although not stated in the regulation,
following introduction of the "precision" gas and after reaching equilibrium conditions, an
average of the instrument output should be obtained over some relatively short period of
time, e.g., five minutes. Thus, the precision estimates have meaning only with respect to the
time-averaging period over which the average values are obtained. Precision estimates for
other time-averaging periods would have to be determined by knowing or assuming a drift
pattern between successive instrument adjustments/calibrations.
80.43.1
14-18

-------
Accuracy

To measure the closeness of an observed measurement value to the truth, some material
or condition of known (true) property must be measured by the measurement system being
checked. The measurement system is "challenged" with the "known" to obtain the observed
measurement. For automated analyzers, "known" gaseous pollutant concentrations, deter-
mined using different standards and different equipment from those used for routine calibra-
tion and spanning, are introduced into the measurement instruments. In this way, two dif-
ferent calibration systems are involved: the one used for routine monitoring and the one used
to assess the "known."
For manual methods, it is difficult to challenge the total measurement system with
"knowns." Therefore, an accuracy audit is made of only a portion of the measurement
system. The two major portions of manual measurement systems are the flow and the
analytical measurements. The flow measurement portion of the TSP method, and the
analytical measurement portion of the N02 and S02 bubbler methods are audited for
accuracy.
Regulation Requirements

Based on the above considerations, special checks/audits were devised. Table I sum-
marizes the minimum requirements specified in Appendix A of the May 10, 1979 regulation.
Precision, Automated Analyzers

Precision checks are conducted at least biweekly and are made with the following concen-
trations of gases: 0.08-0.10 ppm for SO2, O3, and NO2, and 8-10 ppm for CO. These preci-
sion checks may be made using the same materials, equipment, and personnel routinely used
for instrument calibration and spanning.
Table I. Special checks and audits for estimation of precision and accuracy.

Automated analyzers (SO2, CO, NO2, O3)
  Precision
    Type check: precision check at one concentration
    Frequency: biweekly
    Scope: all monitoring instruments
  Accuracy (local audit)
    Type check: 3 or 4 concentration levels
    Frequency: 25% of the analyzers each quarter; at least 1 per quarter
    Scope: all analyzers each year

Manual methods (SO2, NO2, TSP)
  Precision
    Type check: collocated samplers at 2 sites (of high concentration)
    Frequency: each monitoring day
    Scope: 2 sites
  Accuracy (local audit)
    TSP (flow): 1 level; 25% of the sites each quarter; at least 1 per
      quarter; all sites each year
    SO2, NO2 (analytical): 3 levels; each analysis day; at least twice
      per quarter
14-19

-------
Precision, Manual Methods

Precision checks are made using collocated samplers at a minimum of two sites (of high concen-
tration). One of the collocated samplers will be randomly designated as the official sampler for
routine monitoring; the other shall be considered the duplicate. Results from the duplicate are
to be obtained each day the designated sampler is operated unless the samplers are operated
more frequently than every sixth day, in which case at least one duplicate is required each
week.
Accuracy, Automated Analyzers

Automated analyzers are challenged (audited) with known pollutant concentrations at three
levels (or four levels, in the case of episode analyzers), in accordance with Table II:
Table II. Automated analyzer audit concentrations (ppm)

Audit level    SO2, NO2, O3       CO
    1           0.03-0.08        3-8
    2           0.15-0.20       15-20
    3           0.40-0.45       40-45
    4           0.80-0.90       80-90
Twenty-five percent of the automated analyzers of each type in the monitoring network are
to be audited once each calendar quarter so as to represent a random sample for the entire
network. Thus, for each quarter, the results represent a random sample from all of the
analyzers. However, at least one analyzer shall be audited each quarter and all analyzers shall
be audited each year. Since the audits are to be conducted with standards and equipment
different from those used for calibration and spanning (the analyst should also be different),
the timing of the audit within the quarter is not critical.
Accuracy, Manual Methods

For manual methods, an accuracy audit is made of only a portion of the measurement
system. For TSP, only the flow measurement portion is audited; for NO2 and SO2, only the
chemical analytical portion is audited.
The flow rate audits for TSP are made at the normal operating level. Twenty-five percent
of the sites shall be audited each quarter, so as to represent a random sample for the entire
network. However, at least one site shall be audited each quarter and all sites shall be
audited each year.
For the NO2 and SO2 methods, audit samples in the following ranges are used: 0.2-0.3
µg/ml; 0.5-0.6 µg/ml; 0.8-0.9 µg/ml. An audit at each concentration level shall be made
on each day of analysis of routine monitoring samples, and the audits shall be made at least
twice each quarter.
14-20

-------
Computations
Signed Percentage Differences

The general form for computing individual signed percentage differences, di, whether for
precision checks or for accuracy audits, is:

di = [(Yi − Xi)/Xi] × 100,    (1)

where, for accuracy audits (both automated analyzers and manual methods) and for
automated analyzer precision checks, Y represents the observed value and X represents the
known value. For manual method precision estimates (collocated samplers), Y represents the
duplicate sampler value and X represents the designated sampler value.
Percentage differences instead of actual differences are used because errors in precision and
errors in accuracy are generally proportional to concentration levels.
Signed percentage differences instead of absolute percentage differences are used to reveal
or highlight any systematic errors that may need to be investigated and corrected to further
improve the precision and accuracy of the monitoring data. Absolute percentage differences
would not enable a separation of the systematic errors from the random errors.
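Equation 1 can be sketched as follows; the function name is illustrative, not
from the regulation:

```python
# Sketch of Equation 1: the individual signed percentage difference d_i.
# Y is the observed (or duplicate-sampler) value and X is the known
# (or designated-sampler) value.
def signed_pct_diff(y, x):
    """d_i = (Y_i - X_i) / X_i * 100"""
    return (y - x) / x * 100.0

# e.g., an analyzer reading 0.095 ppm against a 0.100 ppm known concentration
d = signed_pct_diff(0.095, 0.100)   # about -5.0 percent
```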
Data Summarization

Precision and accuracy data are summarized and reported for each calendar quarter.
Precision. For each analyzer or site, the individual signed percentage differences are sum-
marized by calculating an arithmetic average, d̄j, and a standard deviation, Sj. Ninety-five per-
cent probability limits can be calculated for each instrument or site for local network informa-
tion, using the following formula:

d̄j ± 1.96 Sj.    (2)

Although the regulations do not require such limits to be computed, they should be of par-
ticular interest and value for the local network as a supplement to their routine internal quality
control. However, for reporting to EPA, a consolidated set of 95 percent probability limits,

D̄ ± 1.96 Sa,    (3)

are computed for automated analyzers, where D̄ is the weighted average of the d̄j, and
Sa is the pooled, weighted value computed from the Sj.

The expression for the probability limits for precision for collocated samplers is:

D̄ ± 1.96 Sa/√2.    (4)

This √2 factor is introduced to correct for the statistical accumulation of imprecision of results
from both the duplicate and the designated samplers. The probability limits are thereby put in
terms of individual reported values, the same as for the other probability limits.

Accuracy. From the di values obtained from the accuracy audit checks at a given concen-
tration (or flow) level, an average D̄ and Sa are computed. For reporting to EPA, 95 percent
probability limits are computed using Equation 3.
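A minimal sketch of Equations 2 through 4, assuming the sample standard
deviation of the signed percentage differences; the helper name is illustrative:

```python
# Sketch: 95 percent probability limits from a set of signed percentage
# differences d_i (Equations 2-4).
import math
import statistics

def probability_limits(diffs, collocated=False):
    """Return (lower, upper) 95% probability limits for a list of d_i values.

    For collocated samplers the standard deviation is divided by sqrt(2)
    (Equation 4) so the limits are expressed in terms of individual
    reported values.
    """
    d_bar = statistics.mean(diffs)
    s = statistics.stdev(diffs)          # sample standard deviation
    if collocated:
        s /= math.sqrt(2)
    return d_bar - 1.96 * s, d_bar + 1.96 * s
```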
Meaning of Probability Limits
Average Value, Precision

Automated Analyzers. The d̄j values for each instrument represent the average bias of
results due to instrument drift. The D̄ simply represents, for the network, the average of the
d̄j.
14-21

-------
Manual Methods. The d̄j values at each collocated site represent the average bias between
the results from the collocated samplers. The D̄ simply represents, for the network, the
average of the d̄j.
-------
Various control charts can be used for plotting the results of the precision and accuracy
data. As indicated above, the results of the precision and accuracy checks, if used in a timely
way, can provide a valuable supplement to normal routine internal quality control checks.
Quality Control Charts. Although the prime objective of the precision and accuracy audits
is to obtain an assessment of data quality, a number of statistical control charts can be main-
tained to provide some long-term internal control. With control limits established on the basis
of past history (at least one quarter for precision, at least one year for accuracy), future data
values can be plotted to detect any significant change from past experience.
In general, the control chart limits will be similar to the computed probability limits except
that the 1.96 value will be replaced by a 3. (The 1.96 corresponds to an expected 95 per-
cent probability; the 3 corresponds to an expected 99.7 percent probability.) In the case of
manual method precision, the √2 factor is not included because the points to be plotted will
be the percentage differences, which include variability from the imprecision of both samplers.
Also, since the intuitively expected value of d̄ is zero for precision and accuracy, the
centerline for the control charts should be zero. Table III summarizes the various control
charts which can be plotted for the individual precision checks and accuracy audits.
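The zero-centered 3-sigma control limits described above can be sketched as
follows; the function names are illustrative:

```python
# Sketch: control chart limits with centerline at zero and limits at
# plus/minus 3S, where S is estimated from past history (at least one
# quarter for precision, at least one year for accuracy).
def control_limits(s_history):
    """Zero-centered 3-sigma control limits from a historical S estimate."""
    return -3.0 * s_history, 3.0 * s_history

def out_of_control(d_i, s_history):
    """True when an individual d_i value falls outside the control limits."""
    lower, upper = control_limits(s_history)
    return d_i < lower or d_i > upper
```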
14-23

-------
Table III. Recommended control charts and limits for state and local agencies.

Automated methods for SO2, NO2, O3, and CO
  Precision, single instrument: one control chart for each instrument;
  control limits zero ± 3S; after each biweekly precision check, plot each
  individual di value; controls excessive variability and drift of each
  instrument.
  Accuracy, single instrument, each audit level: one control chart for each
  audit level; control limits zero ± 3S; after each audit check, plot each
  individual di value; controls excessive bias of each instrument.

Manual methods (TSP, SO2, NO2)
  Precision, single site: one control chart for each collocated site;
  control limits zero ± 3S; each day, plot di for each site; controls
  excessive lack of agreement between collocated samplers.

TSP (flow rate)
  Accuracy, single site: one control chart per agency; control limits
  zero ± 3S; after each audit, plot each individual di; controls excessive
  bias of each instrument.

SO2 (analysis), NO2 (analysis)
  Accuracy for each audit level: one control chart for each audit level;
  control limits zero ± 3S; after each audit, plot each individual di;
  controls excessive bias for each audit.

-------
Other control charts could be plotted with the d̄ values to detect biases from quarter to
quarter. Similarly, the quarterly values of Sa could be plotted to control or display the
variability aspects of the measurement systems.
States and Regional Offices

The precision and accuracy reports will be helpful to the states in comparing these
measures of data quality from the networks within the states. Similarly, the EPA Regional
Offices will be able to make comparisons within and between Regions. These comparisons
may point out particular organizations, states, or Regions in need of further improvement in
their quality assurance programs.
Environmental Protection Agency (EPA)

Evaluation of the precision and accuracy data is important to EPA (EMSL, Research
Triangle Park, North Carolina) in its role of responsibility for quality assurance of air pollution
measurements. The precision and accuracy data will be useful in (a) determining possible
needs for additional research efforts related to particular measurement methods, (b) indicating
measurement methods or portions thereof, which may require improved quality control, and
(c) indicating particular agencies, states, or Regions that may require technical assistance or
improved quality control. In other words, the precision and accuracy information will enable
comparisons to be made across measurement methods, and across networks or other
organizational entities for purposes of identifying possible areas in need of improvement of
data quality. With knowledge of the precision and accuracy information, EPA can consider
appropriate statistical allowances or risks in setting and enforcing the standards, and in
developing control strategies.
User

After January 1, 1981, when the precision and accuracy reporting becomes effective, users
of monitoring data maintained in the National Aerometric Data Bank (NADB) will receive
along with the monitoring data, the precision and accuracy data for the corresponding time
periods and locations. The availability of the precision and accuracy data will assist the users
in their interpretation, evaluation, and use of the routine monitoring data.
Environmental Monitoring Systems Laboratory Reports

To assist Regions and states in making the above comparisons as well as to perform other
analyses of the reported precision and accuracy data, EMSL/RTP will perform various types
of statistical analyses and will prepare evaluation and summary reports each quarter and each
year.
Summary

The implementation of the May 10, 1979, regulation should result in an improvement in
the quality of air pollution data obtained from the states and local agencies. Particularly from
a quality assurance standpoint, the quality assurance plans of the states and local agencies
will be documented in detail, and quantitative estimates of precision and accuracy will be
available for users of air monitoring data.
14-25

-------
Section 14A
Precision Work Session
Lesson Goal
To ensure that you can perform precision and accuracy calculations as described in
Lesson 14, "Quality Assurance Requirements for SLAMS and PSD."
Lesson Objective

At the end of this lesson, you should be able to calculate 95 % probability limits for
the precision of air monitoring data collected by a reporting organization using col-
located samplers.
14A-1

-------
I. Problem

Under the conditions described below, calculate the upper and lower 95% pro-
bability limits for the precision of TSP monitoring data collected by the reporting
organization.
Given:
Collocated TSP Sampling Data
for the Reporting Organization

Sampling Site 1

Sampling period    Duplicate sampler results    Official sampler results
                   (µg/std m³)                  (µg/std m³)
      1                227                          236
      2                268                          275
      3                258                          256

Sampling Site 2

Sampling period    Duplicate sampler results    Official sampler results
                   (µg/std m³)                  (µg/std m³)
      1                245                          257
      2                227                          240
      3                164                          166
      4                212                          221
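The calculation requested above can be sketched as follows, assuming the simple
count-weighted average and pooled variance described in Lesson 14; the exact
Appendix A formulas should be consulted for reportable results:

```python
# Sketch: 95% probability limits for collocated-sampler precision,
# using the (duplicate, official) pairs given in the problem.
import math
import statistics

site1 = [(227, 236), (268, 275), (258, 256)]               # (duplicate, official)
site2 = [(245, 257), (227, 240), (164, 166), (212, 221)]

def site_stats(pairs):
    """Per-site count, mean, and standard deviation of the d_i values."""
    d = [(dup - off) / off * 100.0 for dup, off in pairs]
    return len(d), statistics.mean(d), statistics.stdev(d)

sites = [site_stats(s) for s in (site1, site2)]
n_total = sum(n for n, _, _ in sites)
d_weighted = sum(n * m for n, m, _ in sites) / n_total       # weighted average
s_pooled = math.sqrt(sum((n - 1) * s * s for n, _, s in sites)
                     / (n_total - len(sites)))               # pooled S
half = 1.96 * s_pooled / math.sqrt(2)                        # Equation 4
lower, upper = d_weighted - half, d_weighted + half
```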
14A-3

-------
Section 15
Data Validation
Lesson Goal
To familiarize you with data validation considerations.
Lesson Objectives

At the end of this lesson, you should be able to-
1. define data validation,
2. describe nine characteristics of a data validation system,
3. describe factors that affect the selection of data validation techniques,
4. list the levels of data validation for State Implementation Plan (SIP) air
monitoring data, and
5. explain the importance of having data validation performed by the organiza-
tion that generates the data.
15-1

-------
DATA
VALIDATION
The process whereby data
are filtered and either
accepted or flagged for
further investigation based
on a set of criteria.
DATA
VALIDATION

A systematic procedure of
reviewing a body of data
against a set of criteria to
provide assurance of its
validity prior to its
intended use.
RELATED TERMS

. data editing
. data screening
. data auditing
. data verification
. data evaluation
. data qualification
. data quality assessment
CHARACTERISTICS OF A
DATA VALIDATION SYSTEM

. Is an after-the-fact review
. Is applied to blocks of data
. Is systematically/uniformly
applied
. Uses set of criteria
. Checks for internal consistency
15-3

-------
CHARACTERISTICS OF A
DATA VALIDATION SYSTEM
(continued)
. Checks for temporal/spatial
continuity

. Checks for proper identification

. Checks for transmittal errors

. Flags/rejects questionable data
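A minimal sketch of such a flagging pass, with a hypothetical concentration-limit
criterion and hypothetical data:

```python
# Sketch (hypothetical criterion and data): a data validation pass that
# applies one uniform criterion to a block of data and flags, rather than
# deletes, questionable values for further investigation.
def flag_block(values, concentration_limit):
    """Return (value, flagged) pairs for a block of reported values."""
    return [(v, v < 0 or v > concentration_limit) for v in values]

block = [0.0450, 0.4555, 0.0600, 0.0298, 0.0123]    # ppm, hypothetical
checked = flag_block(block, concentration_limit=0.25)
flagged = [v for v, bad in checked if bad]           # only 0.4555 is flagged
```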
CRITERIA

. Uses a set of criteria, for example:
  Maximum span drift
  Maximum temperature
  Concentration limit
15-4

-------
. Checks for internal consistency
  . uniform sampling methodology
  . uniform monitor siting
  . uniform data reduction and reporting
  . pollutant relationships
  . pollutant/meteorological relationships

. Checks for temporal/spatial continuity
[Figures: SO2 concentration plotted Monday through Sunday, showing temporal
(weekly) continuity; O3 concentration plotted by season (spring, summer, fall,
winter), showing temporal (seasonal) continuity; a site map illustrating
spatial continuity.]
15-5

-------
. Checks for proper identification (e.g., pollutant, parameter, city name,
street address, units, time interval)

. Checks for transmittal errors

. Flags/rejects questionable data

[Figure: example data listing in which the value 0.4555 is flagged among
routine values such as 0.0450, 0.0600, 0.0298, and 0.0123.]
TECHNIQUES EMPLOYED

Monitoring Network Characteristics

. Nature of Response Output
. Data Reduction Methodology
. Data Transmittal Methodology
. Types and Amount of Ancillary Data
. Computing/Plotting Capability
. Intended Uses of Data
. Amount of Data
. Nature of Response Output
[illustration]
15-6

-------
. Data Reduction Methodology
[illustration]

. Data Transmittal Methodology
[illustration: data form]

. Types and Amount of Ancillary Data
[illustration: meteorological data, calibration records, maintenance records,
unusual events]

. Computing/Plotting Capability
[illustration: concentration isopleths plot]
15-7

-------
. Intended Uses of Data
[illustration]
-------
. personnel
. equipment/supplies
. operating procedures
. calibration materials
Validation should be
performed by someone
other than the person
who collected or
reported the data.
15-9

-------
QUALITY ASSURANCE AND DATA VALIDATION FOR THE
REGIONAL AIR MONITORING SYSTEM OF THE
ST. LOUIS REGIONAL AIR POLLUTION STUDY
By
Robert B. Jurgens*
Environmental Sciences Research Laboratory
Research Triangle Park, North Carolina 27711

And
Raymond C. Rhodes
Environmental Monitoring and Support Laboratory
Research Triangle Park, North Carolina 27711
The success of model development and evaluation
from a body of monitoring data depends heavily upon
the quality of that data. The quality of the
monitoring data in turn is dependent upon the various
quality assurance (QA) activities which have been
implemented for the entire system, commencing with
the design, procurement, and installation of the
system and ending with validation of monitoring data
prior to archiving. Of the many sources of aerometric
and emissions data that exist, the St. Louis
Regional Air Pollution Study (RAPS) is the only known
study specifically designed for model development and
evaluation on an urban/rural scale.1,2
The prime objective of RAPS is to develop and
evaluate mathematical models which will be useful in
predicting air pollution concentrations from informa-
tion on source emissions and meteorology. In addition
to detailed emissions and meteorological data, an
extensive base of high quality pollutant monitoring
data is required to verify and to refine the models.

The Regional Air Monitoring System (RAMS) is the
ground-based aerometric measurement system of RAPS and
consists of 25 automated data acquisition sites
situated in and about the St. Louis metropolitan area.
Data from these 25 stations are transmitted over
telephone lines to a central computer facility for
processing and then sent to Research Triangle Park for
archival. Details of RAMS have been described by
Meyers and Reagan.3 The complex air pollution,
meteorological, and solar radiation measurements that
are made at RAMS sites are shown in Table 1. Also
shown are the recording intervals and the number of
recording stations for each instrument.
Two main challenges exist for an effort of the
magnitude of the St. Louis study:

1. To efficiently and effectively handle the
large quantity of monitoring data, and
2. To obtain high quality monitoring data.

In general, data validity results from: (1) a
quality assurance system aimed at acquiring acceptable
data, and (2) a screening process to detect spurious
values which exist in spite of the quality control
process.

*On assignment from the National Oceanic and Atmos-
pheric Administration, U.S. Department of Commerce.
15-11
[Table 1. RAMS network measurements, with measurement interval and number of
recording stations for each instrument. Air quality: sulfur dioxide, total
sulfur, hydrogen sulfide, ozone, nitric oxide, oxides of nitrogen, nitrogen
dioxide, carbon monoxide, methane, total hydrocarbons. Meteorological: wind
speed, wind direction, temperature, temperature gradient, pressure, dew point,
aerosol scatter. Solar radiation: pyranometer and related radiometers.]
QUALITY ASSURANCE SYSTEM

The following list includes the elements of a
total quality assurance system for aerometric
monitoring:

Quality policy
*Quality objectives
*Quality organization and responsibility
QA manual
*QA plans
Training
*Procurement control
  Ordering
  Receiving
  Feedback and corrective action
*Calibration
  Standards
  Procedures
Internal QC checks
Operations
  Sampling
  Sample handling
  Analysis
Data
  Transmission
  Computation
  Recording
*Validation
*Preventive maintenance
*Reliability records and analysis
*Document control
*Configuration control
*Audits
  On-site system
  Performance
Corrective action
Statistical analysis
Quality reporting
Quality investigation
Interlab testing
Quality costs
Detailed definitions and discussion of the
elements of quality assurance for air pollution
measurement systems have recently been published.

The elements of particular concern to RAMS*
fall into three general categories:

1. Procurement and management, those activities
which need to be established or accomplished early
in the program;

2. Operation and maintenance, those activities
which need to be performed routinely to assure
continued operation of the system, and

*These particular elements, of major concern to data
screening, are discussed herein.

-------
3. Specific data quality control activities,
those activities which involve the calibration and
data output from the meteorological and pollutant
measurement instruments and are explicitly involved
in acquiring quality data.

Procurement and Management

Data Quality Objectives. A requirement of the
initial contract stated that 90% valid data were to
be achieved. Valid data for pollutant measurements
were defined as the data obtained during periods
when the daily zero and span drifts were less than
2 percent, with an allowance for the time required
to perform daily zero/span checks and periodic
multi-point calibrations.
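The 2 percent zero/span drift criterion can be sketched as a validity filter;
the helper name and data are hypothetical:

```python
# Sketch (hypothetical data): the RAPS validity criterion that pollutant
# data are valid only on days when the daily zero and span drifts are
# both under 2 percent.
def percent_valid(daily_drifts):
    """daily_drifts: list of (zero_drift_pct, span_drift_pct) per day."""
    valid = sum(1 for z, s in daily_drifts if abs(z) < 2.0 and abs(s) < 2.0)
    return 100.0 * valid / len(daily_drifts)
```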

Procurement. In planning to achieve the
objectives, very stringent requirements were placed
on the suppliers of the various instruments of the
system, and extensive performance tests (with numerous
rejections) were conducted prior to final acceptance.

System Acceptance Test (SAT). After installation
of the entire network, a one-month system performance
demonstration was required to assure satisfactory
operation with respect to obtaining data of adequate
quantity and quality. The SAT was completed in
December 1974.

Incentive Contract. The current contract has
introduced award fee performance incentives for manage-
ment, schedule, and for quality. The quality portion
of the award fee provides a continual motivation for
obtaining and improving data quality.


QA Plans. The QA plan (and its implementation)
is dynamic, continually being revised and improved
based upon experience with the system. The QA plan
outlines in detail the activities of the various
QA elements previously mentioned.
Organization. To implement the QA plan, one
full-time employee is assigned to overall QA
responsibilities, reporting directly to the Program
Manager. In addition, two persons are assigned for QA
on a half-time basis, one for the remote monitoring
stations, and the other for the central computer
facility.
Operation and Maintenance

Document Control. Detailed operation and
maintenance manuals have been prepared for the remote
stations and for the central computer facility. These
manuals are issued in a loose-leaf, revisable,
document-control format so that needed additions
and/or revisions can be made. Also, a complete history
of changes is kept so that traceability to the
procedures in effect for any past period of time can
be made. A document control system also exists for
the computer programs.

Preventive Maintenance. Record-keeping and
appropriate analysis of the equipment failure records
by instrument type and mode of failure have enabled
more efficient and effective scheduling of maintenance
and optimum spare parts inventory, with resultant
improvement in instrument performance. Remote station
preventive maintenance is completed twice each week.
Normally, the remote stations are unattended except
for the weekly checks, for other scheduled maintenance,
or for special corrective maintenance.

Central Computer Monitoring. Central computer
personnel, using a display, periodically monitor
the output from all stations to detect problems as
soon as possible. To maximize the satisfactory opera-
tion of the network equipment, the assigned QA
personnel review the following activities associated
with preventive maintenance:

1. remote station logbook entries,
2. remote station corrective maintenance reports,
3. laboratory corrective maintenance reports, and
4. central computer operator log.

Additionally, the QA individuals are in frequent
verbal communication with field and laboratory
supervisors to discuss quality aspects of the
operations.
Reliability Records and Analysis

Telecommunications Status Summaries. Each
day, a summary of telecommunications operations is
prepared to determine which stations and/or telephone
lines are experiencing significant problems that
might require corrective action.

Daily Analog/Status Check Summaries. Each
day, the central computer prepares a summary of analog/
status checks by station so that major problems can be
corrected as soon as possible by available field
technicians. These analog/status checks are explained
in the section on data validation.

Configuration Control. Histories are kept
of the station assignment of specific instruments,
by serial number, so that possible future problems
with specific instruments can be traced back to the
stations. A logbook for each instrument is maintained
for recording in a systematic manner the nature and
date of any changes or modifications to the hardware
design of the instruments.
Specific Data Quality Control Activities
Calibration
Calibration References for Gaseous Pollutants.
NBS standard reference materials are used for calibra-
tion standards if available. Otherwise, commercial
gases are procured and certified at NBS for use as
standards.
Multipoint Calibrations. As a check on the
linearity of instrument response, an on-site, 5-point
calibration is scheduled at each station at 8-week
intervals. Originally, acceptability was determined
by visual evaluation of the calibration data plots;
more recently, quantitative criteria are being
established for linearity.

Measurement Audits. Independent measurement
audits for pollutant instruments are performed by the
contractor using a portable calibration unit and
independent calibration sources at each station once
each calendar quarter. Similar audits are performed
on the same frequency for temperature, radiation, and
15-12

-------
mass flowmeters; and independent checks are made on
relative humidity, windspeed, and wind direction
instruments. In addition to the internal audits per-
formed by the contractor on his own operation, a
number of external audits have been performed by EPA
and other contractors5 to check the entire measurement
system.

On-Site Systems Audit. A thorough, on-site quality
systems audit of RAMS was performed for EPA by an
independent contractor.6 The results of this audit
pointed out several areas of weakness for which
corrective actions have been implemented.

Data Validation. As a part of the overall QA
system, a number of data validation steps are
implemented. Several data validation criteria and
actions are built into the computer data acquisition
system:

Status Checks. About 35 electrical checks
are made to sense the condition of certain critical
portions of the monitoring system and record an
on-off status. For example, checks are made on power
on/off, valve open/shut, instrument flame-out, and air
flow. When these checks are unacceptable, the
corresponding monitoring data are automatically
invalidated.

Analog Checks. Several conditions, including
reference voltage, permeation tube bath temperature,
and calibration dilution gas flow, are sensed and
recorded as analog values. Acceptable limits for
these checks have been determined, and, if exceeded,
the corresponding affected monitoring data are invalidated.
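The combined status/analog invalidation logic can be sketched as follows; the check names and acceptance limits here are illustrative stand-ins, not the actual RAMS values:

```python
# Illustrative sketch of RAMS-style status/analog data validation.
# Check names and limits are hypothetical examples, not actual RAMS values.

def validate_minute(status, analog, limits):
    """Return True if the minute's monitoring data should be accepted.

    status: dict of on/off status checks, True = acceptable
    analog: dict of sensed analog values (e.g., reference voltage)
    limits: dict mapping analog check name -> (low, high) acceptable range
    """
    # Any failed status check (power, valve, flame-out, air flow)
    # automatically invalidates the corresponding data.
    if not all(status.values()):
        return False
    # Any analog value outside its acceptable limits also invalidates data.
    for name, value in analog.items():
        low, high = limits[name]
        if not (low <= value <= high):
            return False
    return True

limits = {"reference_voltage": (4.9, 5.1), "bath_temp_C": (29.0, 31.0)}
ok = validate_minute({"power": True, "flame": True},
                     {"reference_voltage": 5.0, "bath_temp_C": 30.2}, limits)
bad = validate_minute({"power": True, "flame": False},
                      {"reference_voltage": 5.0, "bath_temp_C": 30.2}, limits)
```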

Zero/Span Checks. Each day, between 8-12 p.m.,
each of the gaseous pollutant instruments in each
station is zeroed and spanned by automatic, sequenced
commands from the central computer. The results of
the zero/span checks provide the basis for a two-point
calibration equation, which is automatically computed
by the central computer and is used for converting
voltage outputs to pollutant concentrations for the
following calendar day's data. In addition, the
instrument drift at zero and span conditions between
successive daily checks is computed by the central
computer and used as a basis for validating the
previous day's monitoring data. Originally, zero and
span drifts were considered as acceptable if less than
2 percent, but the span drift criterion has recently
been increased to 5 percent, a more realistic level.
If the criteria are not met, the minute data for the
previous day are flagged. Hourly averages are
computed during routine data processing only with data
which have not been flagged as invalid.
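A minimal sketch of the two-point calibration and drift test described above, assuming illustrative voltages and a hypothetical 0.40 ppm span gas (only the drift criterion itself comes from the text):

```python
# Sketch of the daily two-point (zero/span) calibration and drift check.
# The voltages, span concentration, and full-scale value below are
# illustrative, not actual RAMS settings.

def two_point_cal(v_zero, v_span, c_span):
    """Slope and intercept for converting instrument volts to concentration."""
    slope = c_span / (v_span - v_zero)   # concentration per volt
    intercept = -slope * v_zero          # so that the zero response maps to 0
    return slope, intercept

def to_concentration(volts, slope, intercept):
    return slope * volts + intercept

def span_drift_ok(v_span_today, v_span_yesterday, full_scale, limit=0.05):
    """Drift between successive daily span checks, as a fraction of full
    scale; the criterion was originally 2%, later relaxed to 5%."""
    drift = abs(v_span_today - v_span_yesterday) / full_scale
    return drift <= limit

slope, intercept = two_point_cal(v_zero=0.10, v_span=0.90, c_span=0.40)
conc = to_concentration(0.50, slope, intercept)  # 0.50 V reading -> ppm
```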
DATA SCREENING IN RAMS
The tests which are used to screen RAMS data are
summarized in Table 2. Specific tests and associated
data base flags are listed. The types of screens that
have been employed or tested will be detailed, the
mechanisms for flagging will be reviewed, and then
the implementation of screening within RAMS will be
discussed.
15-13
Table 2. Screening categories and associated flags for RAMS data.
For descriptive purposes, the tests are divided
into three categories. The first category, "Modus
Operandi," contains checks which document the network
instrument configuration and operating mode of the
recording system. Included are checks for station
instrumentation, missing data, system analog and
status sense bits, and instrument calibration mode.
These checks, which have been described above, are
part of the quality control program incorporated in
the data acquisition system and central facility data
processing, and are an important data management
function used to document system performance.

The second category, "Continuity and Relational,"
contains temporal and spatial continuity checks and
relational checks between parameters which are based
on physical and instrumental considerations or on
statistical patterns of the data. A natural sub-
division can be made between intrastation checks,
those checks which apply only to data from one station,
and interstation checks, which test the measured
parameters for uniformity across the RAMS network.
Intrastation checks include tests for gaseous
analyzer drift, gross limits, aggregate frequency
distributions, relationships, and temporal continuity.
The drift calculations, which are part of the quality
control program, have been discussed above.

Gross limits, which are used to screen impossible
values, are based on the ranges of the recording
instruments. These, together with the parametric
relationships which check for internal consistency
between values, are listed in Table 3. Setting limits
for relationship tests requires a working knowledge of
the noise levels of the individual instruments. The
relationships used are based on meteorology, atmos-
pheric chemistry, or on the principle of chemical mass
balance. For example, at a station for any given
minute, TS cannot be less than SO2 + H2S with allow-
ances for noise limits of the instruments.
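A sketch of how the gross-limit and relational screens might look in code. The total sulfur relation TS >= SO2 + H2S comes from the text; the ranges and noise allowance below are invented placeholders, not the Table 3 values:

```python
# Sketch of minute-level gross-limit and relational screening.
# Gross limits and the noise allowance are hypothetical, not Table 3 values.

GROSS_LIMITS = {"SO2": (0.0, 1.0), "H2S": (0.0, 1.0), "TS": (0.0, 2.0)}  # ppm

def gross_limit_flags(minute):
    """Flag parameters whose values fall outside the instrument range."""
    flags = []
    for param, value in minute.items():
        low, high = GROSS_LIMITS[param]
        if not (low <= value <= high):
            flags.append(param)
    return flags

def sulfur_balance_ok(minute, noise=0.02):
    """Total sulfur cannot be less than SO2 + H2S, allowing for
    instrument noise."""
    return minute["TS"] >= minute["SO2"] + minute["H2S"] - noise

good = {"SO2": 0.05, "H2S": 0.01, "TS": 0.07}
bad = {"SO2": 0.30, "H2S": 0.10, "TS": 0.10}
```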

-------
Table 3. Gross limits and relational checks.
A refinement of the gross limit checks can be
made using aggregate frequency distributions. With a
knowledge of the underlying distribution, statistical
limits can be found which have narrower bounds than
the gross limits and which represent measurement
levels that are rarely exceeded. A method for fitting
a parametric probability model to the underlying
distribution has been developed by Dr. Wayne Ott of
EPA's Office of Research and Development.7 B.E.
Suta and G.V. Lucha8 have extended Dr. Ott's program
to estimate parameters, perform goodness-of-fit tests,
and calculate quality control limits for the normal
distribution, 2- and 3-parameter lognormal distribu-
tion, the gamma distribution, and the Weibull
distribution. These programs have been implemented
on the OSI computer in Washington and tested on
water quality data from STORET. This technique is
being studied for possible use in RAMS as a test for
potential recording irregularities as well as a
refinement of the gross limit check currently
employed.
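The statistical-limit refinement can be illustrated with a 2-parameter lognormal fit by the method of moments in log space. This is only a sketch, not the Ott or Suta/Lucha programs, and the sample concentrations are invented:

```python
# Sketch of deriving a statistical screening limit from an assumed
# underlying distribution, here a 2-parameter lognormal fitted in log
# space. (The actual programs also handled normal, 3-parameter lognormal,
# gamma, and Weibull fits with goodness-of-fit tests.)
import math
import statistics

def lognormal_upper_limit(values, z=3.0):
    """Upper screening limit that is rarely exceeded if the data are
    lognormally distributed.

    values: positive measurements
    z: number of log-space standard deviations for the limit
    """
    logs = [math.log(v) for v in values]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)
    return math.exp(mu + z * sigma)

# Illustrative: values clustered near 0.05 ppm give a screening limit
# well inside a hypothetical 1.0 ppm gross (full-scale) limit.
sample = [0.04, 0.05, 0.05, 0.06, 0.05, 0.07, 0.04, 0.05]
limit = lognormal_upper_limit(sample)
```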

Under intrastation checks are specific tests
which examine the temporal continuity of the data as
output from each sensor. It is useful to consider,
in general, the types of atypical or erratic responses
that can occur from sensors and data acquisition
systems. Figure 1 illustrates graphically examples
of such behavior, all of which have occurred to some
extent within RAMS. Physical causes for these
reactions include sudden discrete changes in component
operating characteristics, component failure, noise,
telecommunication errors and outages, and errors in
software associated with the data acquisition system
or data processing. For example, it was recognized
early in the RAMS program that a constant voltage
output from a sensor indicated mechanical or electri-
cal failures in the sensor instrumentation. One of
the first screens that was implemented was to check
for 10 minutes of constant output from each sensor.
Barometric pressure is not among the parameters
tested since it can remain constant (to the number of
digits recorded) for periods much longer than 10
minutes. The test was modified for other parameters
which reach a low constant background level during
night-time hours.
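The constant-output screen can be sketched as a run-length check; the parameter names and exemption handling are illustrative:

```python
# Sketch of the "constant output" screen: flag a sensor when its minute
# values are unchanged for 10 consecutive minutes. Barometric pressure is
# exempt; parameters with a constant night-time background would need the
# modified test mentioned in the text (not shown here).

EXEMPT = {"barometric_pressure"}

def constant_output_flag(param, minutes, window=10):
    """True if any `window` consecutive values are identical."""
    if param in EXEMPT or len(minutes) < window:
        return False
    run = 1
    for prev, cur in zip(minutes, minutes[1:]):
        run = run + 1 if cur == prev else 1
        if run >= window:
            return True
    return False

stuck = [0.12] * 12                       # sensor frozen at one value
live = [0.12, 0.13, 0.12, 0.11] * 3       # normally varying output
```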
Figure 1. Examples of atypical sensor responses (single outlier, step function, spike).
A technique which can detect any sudden jump in
the response of an instrument, whether it is from an
individual outlier, step function or spike, is the
comparison of minute successive differences with
predetermined control limits. These limits are
determined for each parameter from the distribution
of successive differences for that parameter. These
differences will be approximately normally distributed
with mean zero (and computed variance) when taken over
a sufficiently long time series of measurements.

Exploratory application of successive differences,
using 4 standard deviation limits which will flag 6
values in 100,000 if the differences are truly
normally distributed, indicates that there are abnormal
occurrences of "jumps" within certain parameters.
Successive difference screening will be implemented
after further testing to examine the sensitivity of
successive difference distributions to varying
computational time-periods and to station location.

The type of "jump" can easily be identified. A
single outlier will have a large successive difference
followed by another about the same magnitude but of
opposite sign. A step function will not have a return,
and a spike will have a succession of large successive
differences of one sign followed by those of opposite
sign.
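A sketch of successive-difference screening and jump classification; the series, the assumed noise sigma, and the classification heuristic are all illustrative:

```python
# Sketch of successive-difference screening with 4-sigma control limits,
# plus the outlier/step classification described above. The data and the
# assumed noise sigma are invented.

def successive_diffs(series):
    return [b - a for a, b in zip(series, series[1:])]

def jump_flags(series, sigma, z=4.0):
    """Indices i where |x[i+1] - x[i]| exceeds z * sigma."""
    limit = z * sigma
    return [i for i, d in enumerate(successive_diffs(series)) if abs(d) > limit]

def classify_jump(diffs, i):
    """Single outlier: a large difference followed by one of about the
    same magnitude but opposite sign (so the pair nearly cancels).
    Step function: no such return. (A spike would show a run of large
    same-sign differences before the return; not handled here.)"""
    if i + 1 < len(diffs) and abs(diffs[i] + diffs[i + 1]) < abs(diffs[i]) / 2:
        return "outlier"
    return "step"

base = [1.0, 1.1, 1.0, 9.0, 1.1, 1.0]   # single outlier at index 3
step = [1.0, 1.1, 1.0, 9.0, 9.1, 9.0]   # step function at index 3
sigma = 0.1                              # assumed noise level of the series
```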
The interstation or network uniformity screening
tests that have been implemented in RAMS will now be
described. Meteorological network tests are performed
on hourly average data and are based on the principle
that meteorological parameters should show limited
differences between stations under certain definable
conditions typically found in winds of at least
moderate speeds (>4 m/sec). Each station value is
compared with the network mean. The network mean is
defined as the average value for a given parameter
from all stations having reported valid data. (If
more than 50% are missing, a network mean is not
15-14

-------
computed and the test is not made.) Values exceeding
prescribed limits are flagged. The limits have been
set on the advice of experienced meteorologists. The
tested parameters and flagging limits are listed
below.

Maximum allowable deviations from network mean
under moderate winds (network mean > 4 m/sec):

Wind speed: 2 m/sec or MEAN/3 (whichever is larger)
Wind direction: 30°
Temperature: 3°C
Temperature difference: 0.5°C
Dew point: 3°C
Adjusted pressure: 5.0 millibars
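The network-uniformity test, including the 50%-missing rule, can be sketched as follows; the station values are invented, and the deviation limit is simply passed in as a parameter:

```python
# Sketch of the network-uniformity screen for meteorological parameters:
# compare each station's hourly value to the network mean, skipping the
# test when more than half the stations are missing. Station values and
# the limit used below are illustrative.

def network_screen(values, limit, n_stations):
    """values: dict of station -> hourly value (missing stations omitted).

    Returns (network_mean, flagged_stations), or (None, []) when more
    than 50% of the stations are missing and the test is not made.
    """
    if len(values) <= n_stations / 2:
        return None, []
    mean = sum(values.values()) / len(values)
    flagged = [s for s, v in values.items() if abs(v - mean) > limit]
    return mean, flagged

temps = {"s01": 21.0, "s02": 21.5, "s03": 20.5, "s04": 30.0}  # deg C
mean, flagged = network_screen(temps, limit=3.0, n_stations=5)
```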
In addition to network screening techniques
which are based on knowledge of underlying physical
processes, methods from statistical outlier
theory9,10 were also examined. Specifically, the
Dixon ratio test11 was implemented to determine
extreme observations of a parameter across the RAMS
network. The Dixon ratio test is based entirely on
ratios of differences between observations from an
assumed normal distribution and is easy to calculate.
The Dixon criteria for testing a low suspect value
from a sample size of n, n ≤ 25, are shown in
Figure 2. Though the entire sample is shown as
ranked, only the extreme 2 or 3 values need to be
ordered. Associated with each value of n are
tabulated ratios for statistical significance at
various probability levels. For example, if n = 25,
X1 would be considered as an outlier at the 1% level
of significance when r22 > .489. Since the under-
lying distribution may not be normal, the calculated
probabilities may not be exact, but are used as
indicators of heterogeneity of the network observations
at a given time.
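The Dixon ratio test for a low suspect value can be sketched as follows, using the r22 form of the statistic (appropriate for samples of roughly 14 to 25) and the 1% critical value 0.489 for n = 25 cited above; the station values are invented:

```python
# Sketch of the Dixon ratio test (r22 form) for a low suspect value X1 in
# a ranked sample. The critical value 0.489 (n = 25, 1% significance)
# comes from the text; the data are illustrative.

def dixon_r22_low(values):
    """r22 = (X3 - X1) / (X(n-2) - X1) for the ranked sample."""
    x = sorted(values)
    n = len(x)
    return (x[2] - x[0]) / (x[n - 3] - x[0])

def is_low_outlier(values, critical=0.489):
    return dixon_r22_low(values) > critical

# 25 stations: one anomalously low reading among values near 0.10 ppm
sample = [0.02] + [0.09 + 0.001 * i for i in range(24)]
r = dixon_r22_low(sample)
```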
Figure 2. Dixon criteria for testing a low suspect value X1.
-------
Figure 3. Generalized data flow for environmental measurement systems.

Data screening should take place as near to
data acquisition as possible, either in data processing,
which is traditionally concerned with laboratory
analysis, conversion to engineering units, transcribing
intermediate results, etc., or in a separate module,
as illustrated, designed specifically for the screening
process. Screening data soon after data acquisition
permits system feedback in the form of corrective
maintenance, changes to control processes, and even
changes in system design. This feedback is
essential to minimize the amount of lost or marginally
acceptable data.

The RAMS screening tests, which have been
developed at Research Triangle Park (RTP), are now
part of the data processing carried out at the RAPS
central facility in St. Louis. Slow computation
speeds of the St. Louis PDP 11/40 computer required
restricting the intrastation screening tests to hourly
average data. RAMS data is still passed through the
RTP screening module before archiving.

SUMMARY
The experiences gained in RAMS and applicable to
other monitoring systems are:

1. Data validity is a function of quality
assurance and data screening.

2. A QA plan and data screening rules should
be established initially and maintained throughout
the program.

3. The QA plan and screening rules are dynamic,
being improved as additional knowledge and experience
is gained.

4. Applied during data acquisition or shortly
thereafter, quality control and screening checks
constitute an important feedback mechanism, indicating
a requirement for corrective action.
REFERENCES
1. Burton, C.S. and G.M. Hidy. Regional Air
Pollution Study Program Objectives and Plans.
EPA 630/3-75-009, Dec. 1974.

2. Thompson, J.E. and S.L. Kopczynski. The Role of
Aerial Platforms in RAPS. Presented at an EPA
meeting on monitoring, Las Vegas, Nevada,
March 1975 (unpublished).

3. Myers, R.L. and J.A. Reagan. Regional Air
Monitoring System at St. Louis, Missouri.
International Conference on Environmental Sensing
and Assessment, Sept. 1975 (unpublished).

4. Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I, Principles.
EPA 600/9-76-005, March 1976.
15-16
5. von Lehmden, D.J., R.C. Rhodes and S. Hochheiser.
Applications of Quality Assurance in Major Air
Pollution Monitoring Studies-CHAMP and RAMS.
International Conference on Environmental Sensing
and Assessment, Las Vegas, Nevada, Sept. 1975.

6. Audit and Study of the RAMS/RAPS Programs and
Preparation of a Quality Assurance Plan for RAPS.
Research Triangle Institute, Research Triangle
Park, N.C. 27707, EPA Contract No. 68-02-1772.

7. Ott, W.R. Selection of Probability Models for
Determining Quality Control Data Screening
Range Limits. Presented at 88th Meeting of the
Association of Official Analytical Chemists,
Washington, D.C., Oct. 1974.

8. Suta, B.E. and G.V. Lucha. A Statistical
Approach for Quality Assurance of STORET-Stored
Parameters. SRI, EPA Contract No. 68-01-2940,
Jan. 1975.

9. Grubbs, F.E. Procedures for Detecting
Outlying Observations in Samples. Technometrics
11 (1), 1-21, 1969.

10. Anscombe, F.J. Rejection of Outliers.
Technometrics 2 (2), 123-147, 1960.

11. Dixon, W.J. Processing Data for Outliers.
Biometrics 9 (1), 74-89, 1953.

-------
Section 16
Quality Costs
Lesson Goal
To familiarize you with the concept of quality costs and with considerations when
establishing a quality cost system.
Lesson Objectives

At the end of this lesson, you should be able to-
1. recall the three types of cost that compose the total cost per measurement
result of an air quality measurement system,
2. describe the relationship between unacceptable data cost and quality
assurance cost,
3. explain the purpose of a quality cost system,
4. list and define the three cost categories of a quality cost system,
5. identify at least two groups of activities that are related to each of the three
cost categories, and
6. describe a procedure for establishing a quality cost system.
16-1

-------
[Slide: QUALITY COSTS]

[Slide: QUALITY PAYS]

[Slide: Cost per result plotted against data quality (poor to excellent), showing curves for Total cost, Unacceptable Data cost, and Quality Assurance cost]

[Slide: QUALITY RELATED COSTS]

[Slide: Prevention Cost Groups: Preventive Maintenance, System Calibration, Procurement Specs and Acceptance, Planning and Documentation, Training]

[Slide: Appraisal Cost Groups: Quality Control Procedures, Audit Measures, Data Validation]
16-3

-------
[Slide: Failure Cost Groups: Problem Investigation, Corrective Action, Lost Data]

[Slide: ACCUMULATION OF COSTS
. lost data costs
. other costs
. Fd = f x B
  where: Fd = lost data cost
         f = % lost data
         B = part of network budget associated with lost data
. prorate personnel salaries]

[Slide: COST EFFECTIVENESS: relative proportions of Prevention (P), Appraisal (A), and Failure (F) costs under different allocations]
16-4

-------
QUALITY COST
REPORTING
. data obtained from source
documents
. reports understandable
at a glance

. data summarized
. graphs preferred
[Slide: QUALITY COST TREND CHART: quality costs plotted by quarter, 1979-80]

-------
80-43.3
GUIDELINES FOR IMPLEMENTING A QUALITY
COST SYSTEM FOR ENVIRONMENTAL MONITORING PROGRAMS

Presented at 73rd APCA Annual Meeting
and Exhibition in Montreal, Quebec,
Canada, June 1980
Ronald B. Strong
Research Triangle Institute

J. Harold White
Research Triangle Institute

Franklin Smith
Research Triangle Institute

Raymond C. Rhodes
U.S. Environmental Protection Agency
Messrs. Strong, White, and Smith are with the Research Triangle Institute, P.O. Box 1294,
Research Triangle Park, North Carolina 27709.

Mr. Raymond C. Rhodes is in the Quality Assurance Division, Environmental Monitoring
Systems Laboratory, U. S. Environmental Protection Agency, Mail Drop 77, Research
Triangle Park, North Carolina 27711.
16-7

-------
Introduction

Program managers with Governmental agencies and industrial organizations involved in
environmental measurement programs are concerned with overall program cost-effectiveness
including total cost, data quality and timeliness. There are several costing techniques designed
to aid the manager in monitoring and controlling program costs. One particular technique
specifically applicable to the operational phase of a program is a quality cost system.
The objective of a quality cost system for an environmental monitoring program is to
minimize the cost of those operational activities directed toward controlling data quality while
maintaining an acceptable level of data quality. The basic concept of the quality cost system is
to minimize total quality costs through proper allocation of planned expenditures for the
prevention and appraisal efforts in order to control the unplanned correction costs. That is,
the system is predicated on the idea that prevention is cheaper than correction.
There is no pre-set formula for determining the optimum mode of operation. Rather, the
cost effectiveness of quality costs is optimized through an iterative process requiring a con-
tinuing analysis and evaluation effort. Maximum benefits are realized when the system is
applied to a specific measurement method in a stable long term monitoring program. For
example, a monitoring program with a fixed number of monitoring sites, scheduled to
operate for a year or more, would be a desirable candidate for a quality cost system.
Quality costs for environmental monitoring systems have been treated by Rhodes and
Hochheiser.1 The purpose of this paper is to present guidelines for the implementation of a
quality cost system. The contents of this paper are based on work performed by the Research
Triangle Institute under contract to the U.S. Environmental Protection Agency.2
Structuring of Quality Costs

The first step in developing a quality cost system is identifying the cost of quality-related
activities, including all operational activities that affect data quality, and dividing them into the
major cost categories.
Costs are divided into category, group, and activity. Category, the most general classifica-
tion, refers to the standard cost subdivisions of prevention, appraisal, and failure. The
category subdivision of costs provides the basic format of the quality cost system. Activity is
the most specific classification and refers to the discrete operations for which costs should be
determined. Similar types of activities are summarized in groups for purposes of discussion
and reporting.
Cost Categories

The quality cost system structure provides a means for identification of quality-related
activities and for organization of these activities into prevention, appraisal, and failure cost
categories. These categories are defined as follows:
. Prevention Costs-Costs associated with planned activities whose purpose is to ensure
the collection of data of acceptable quality and to prevent the generation of data of
unacceptable quality.
. Appraisal Costs-Costs associated with measurement and evaluation of data quality.
This includes the measurement and evaluation of materials, equipment, and processes
used to obtain quality data.
. Failure Costs-Costs incurred directly by the monitoring agency or organization
producing the failure (unacceptable data).
16-8

-------
Cost Groups

Quality cost groups provide a means for subdividing the costs within each category into a
small number of subcategories which eliminates the need for reporting quality costs on a
specific activity basis. Although the groups listed below are common to all environmental
measurement methods, the specific activities included in each group may differ between
methods.
Groups within prevention costs. Prevention costs are subdivided into five groups:
. Planning and Documentation-Planning and documentation of procedures for all
phases of the measurement process that may have an effect on data quality.
. Procurement Specification and Acceptance-Testing of equipment parts, materials, and
services necessary for system operation. This includes the initial on-site review and
performance test, if any.
. Training-Preparing or attending formal training programs, evaluation of training status
of personnel, and informal on-the-job training.
. Preventive Maintenance-Equipment cleaning, lubrication, and parts replacement per-
formed to prevent (rather than correct) failures.
. System Calibration-Calibration of the monitoring system, the frequency of which could
be adjusted to improve the accuracy of the data being generated. This includes
initial calibration and routine calibration checks and a protocol for tracing the cali-
bration standards to primary standards.
Groups within appraisal costs. Appraisal costs are subdivided into four groups:
. Quality Control (QC) Measures-QC-related checks to evaluate measurement equip-
ment performance and procedures.
. Audit Measures-Audit of measurement system performance by persons outside the
normal operating personnel.
. Data Validation-Tests performed on processed data to assess its correctness.
. Quality Assurance (QA) Assessment and Reporting-Review, assessment, and reporting
of QA activities.
Groups within failure costs. Under most quality cost systems, the failure category is sub-
divided into internal and external failure costs. Internal failure costs are those costs incurred
directly by the agency or organization producing the failure.
Internal failure costs are subdivided into three groups:
. Problem Investigation-Efforts to determine the cause of poor data quality.
. Corrective Action-Cost of efforts to correct the cause of poor data quality, imple-
menting solutions, and measures to prevent problem reoccurrence.
. Lost Data-The cost of efforts expended for data which was either invalidated or not
captured (unacquired and/or unacceptable data). This cost is usually prorated from
the total operational budget of the monitoring organization for the percentage of data
lost.
External failure costs are associated with the use of poor quality data external to the
monitoring organization or agency collecting the data. In air monitoring work these costs are
significant but are difficult to systematically quantize. Therefore, this paper will only address
failure costs internal to the monitoring agency. However, external failure costs are important
and should be considered when making decisions on additional efforts necessary for
increasing data quality or for the allocation of funds for resampling and/or reanalysis.
Examples of failure cost groups are:
. Enforcement actions-Cost of attempted enforcement actions lost due to questionable
monitoring data.
. Industry-Expenditures by industry as a result of inappropriate or inadequate standards
established with questionable data.
. Historical Data-Loss of data base used to determine trends and effectiveness of control
measures.
16-9

-------
Cost Activities

Examples of specific quality-related activities which affect data quality are presented in
Table 1. These activities are provided as a guide for implementation of a quality cost system
for an air quality program utilizing continuous monitors. Uniformity across agencies and
organizations in the selection of activities is desirable and encouraged; however, variations
may exist, particularly between monitoring agencies and industrial/research
projects.
Agencies should make an effort to maintain uniformity regarding the placement of activities
in the appropriate cost group and cost category. This will provide a basis for future "between
agency" comparison and evaluation of quality cost systems.
Development and Implementation of the Quality Cost System

Guidelines are presented in this section for the development and implementation of a
quality cost system. These cover planning the system, selecting applicable cost activities, iden-
tifying sources of quality cost data, and tabulating and reporting the cost data.
Planning

Implementation of a quality cost system need not be expensive and time consuming. It can
be kept simple if existing data sources are used wherever possible. The importance of plan-
ning cannot be overemphasized. For example, implementation of the quality cost system will
require close cooperation between the quality cost system manager and other managers or
supervisors. Supervisors should be thoroughly briefed on quality cost system concepts,
benefits, and goals.
System planning should include the following activities:
. Determining scope of the initial quality cost program.
. Setting objectives for the quality cost program.
. Evaluating existing cost data.
. Determining sources to be utilized for the cost data.
. Deciding on the report formats, distribution, and schedule.
To gain experience with quality cost system techniques, an initial pilot program could be
developed for a single measurement method or project within the agency. The unit selected
should be representative, i.e., exhibit expenditures for each cost category: prevention,
appraisal, and failure. Once a working system for the initial effort has been established, a full-
scale quality cost system can then be implemented.
Activity Selection

The first step for a given agency to implement a quality cost system is to prepare a detailed
list of the quality-related activities most representative of the agency's monitoring operation
and to assign these activities to the appropriate cost groups and cost categories. Worksheets
and cost summaries for collecting and tabulating cost data for specific measurement methods
will then need to be assigned and methods developed to accumulate the costs as easily as
possible. Ultimately and most important is the analysis of the accumulated costs, discussed in
the next section.

The general definitions of the cost groups and cost categories, presented in the previous
section, are applicable to any measurement system. Specific activities contributing to these
cost groups and categories, however, may vary significantly between agencies, depending on
the scope of the cost system, magnitude of the monitoring network, parameters measured,
and duration of the monitoring operation. The activities listed in Table 1 are provided as a
guide only, and they are not considered to be inclusive of all quality-related activities. An
agency may elect to add or delete certain activities from this list. It is important, however, for
an agency to maintain uniformity regarding the cost groups and categories the activities are
listed under. As indicated previously, this will provide a basis for future cost system com-
parison and evaluation.
16-10

-------
Quality Cost Data Sources

Most accounting records do not contain cost data detailed enough to be directly useful to
the operating quality cost system. Some further calculation is usually necessary to determine
actual costs which may be entered on the worksheets. The cost of a given activity is usually
estimated by prorating the person's charge rate by the percentage of time spent on that activ-
ity. A slightly rougher estimate can be made by using average charge rates for each position
instead of the actual rates.
Failure costs are more difficult to quantize than either prevention or appraisal costs. The
internal failure cost of lost data (unacquired and/or unacceptable data), for example, must be
estimated from the total budget.
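The two estimates described here, prorating a person's charge rate by time spent and estimating the lost-data failure cost from the budget (the Fd = f x B form shown in the workbook viewgraphs), can be sketched as follows; the rates, hours, and percentages are illustrative:

```python
# Sketch of quality cost estimation from accounting data. The charge
# rates, hours, and percentages below are invented examples.

def activity_cost(charge_rate_per_hr, hours, fraction_on_activity=1.0):
    """Cost of a quality-related activity, prorating the person's charge
    rate by the fraction of time spent on that activity."""
    return charge_rate_per_hr * hours * fraction_on_activity

def lost_data_cost(fraction_lost, budget):
    """Fd = f x B: fraction of data lost times the part of the
    operational budget associated with the lost data."""
    return fraction_lost * budget

# Half of an 8-hour day at $25/hr spent on calibration checks:
cal_cost = activity_cost(charge_rate_per_hr=25.0, hours=8, fraction_on_activity=0.5)
# 8% of data lost against a $120,000 associated budget:
fd = lost_data_cost(fraction_lost=0.08, budget=120_000.0)
```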
Cost Accumulation and Tabulation

Cost collection and tabulation methods should be kept simple and conducted within the
framework of the agency's normal reporting format whenever possible. During initial system
development, a manual approach will allow needed flexibility, whereas automatic quality cost
data tabulation would be complicated, since many of the quality-related activities are not
typical in existing accounting systems. Automatic tabulation of costs may be practical after the
basic quality cost system has been developed.
Also, an effective cost system does not require precise cost accounting. Reasonable cost
estimates are adequate when actual cost records are not available.
Worksheets and summaries used to collect and tabulate the cost data should be designed
to represent expenditures by activity.
Quality Cost Worksheets

Worksheets for collecting and tabulating quality cost data should be prepared for each
specific measurement method. The worksheet should be designed to allow cost tabulation for
each quality-related activity performed and to accommodate more than one personnel level per
activity. In addition, activities should be organized into appropriate cost groups and cost
categories so that when total costs are computed, they can be transferred directly to cost
summaries later.
Quality Cost Analysis Techniques

Techniques for analyzing and evaluating cost data range from simple charts comparing the
major cost categories to sophisticated mathematical models of the total program. Common
techniques include trend analysis and Pareto analysis.
Trend analysis. Trend analysis compares present to past quality expenditures by category.
A history of quality cost data, typically a minimum of 1 year, is required for trend evaluation.
(An example is given in Figure 1 of the next section.)
Cost categories are plotted within the time frame of the reporting period (usually quarterly).
Costs are plotted either as total dollars (if the scope of the monitoring program is relatively
constant) or as "normalized" dollars/data unit (if the scope may change). Groups and
activities within the cost categories contributing the highest cost proportions are plotted
separately.
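The "normalized" plotting option can be sketched as follows, using the quarterly totals and the data-unit count shown in Figure 1 of the next section:

```python
# Normalize quarterly quality costs to dollars per data unit so trends
# remain comparable even if the monitoring program's scope changes.
# Totals and the data-unit count are taken from Figure 1.

quarterly_totals = [22504, 24810, 24026, 21442]  # 2nd qtr 1978 - 1st qtr 1979
data_units_per_quarter = 33792

normalized = [round(t / data_units_per_quarter, 3) for t in quarterly_totals]
print(normalized)  # [0.666, 0.734, 0.711, 0.635] dollars per data unit
```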
Pareto analysis. Pareto analysis identifies the areas with greatest potential for quality
improvement by:
• Listing factors and/or cost segments contributing to a problem area.
• Ranking factors according to magnitude of their contribution.
• Directing corrective action toward the largest contributor.
Pareto techniques may be used to analyze prevention, appraisal, or failure costs. They are
most logically applied to the failure cost category, since the relative costs associated with
activities in the failure category indicate the major source of data quality problems. Typically,
relatively few contributors will account for most of the failure costs.3,4 (An example is given in
Figure 3 of the next section.)
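The three steps above can be sketched directly, using the fourth-quarter 1978 failure costs from Figure 1 of the next section:

```python
# Pareto analysis of failure costs: list the contributors, rank them by
# magnitude, and report each one's share of the total so corrective
# action can be directed at the largest contributor.
# Costs are the 4th-quarter 1978 failure costs from Figure 1.

failure_costs = {
    "lost data": 13162,
    "problem investigation": 1760,
    "corrective action": 1365,
}

total = sum(failure_costs.values())  # 16,287
ranked = sorted(failure_costs.items(), key=lambda kv: kv[1], reverse=True)

for name, cost in ranked:
    print(f"{name}: {cost / total:.0%} of failure costs")
# Lost data accounts for about 81% of the total, illustrating that
# relatively few contributors dominate the failure costs.
```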
16-11

-------
Quality Cost Reports

Quality cost reports prepared and distributed at regular intervals should be brief and factual,
consisting primarily of a summary discussion, a tabulated data summary, and a graphic
representation of cost category relationships, trends, and data analysis. The summary discus-
sion should emphasize new or continuing problem areas and progress achieved during the
reporting period.
Written reports should be directed toward specific levels of management. Managers and
supervisors receiving reports should be thoroughly briefed on the concepts, purpose, and
potential benefits of a quality cost system, i.e., identification of quality-related problems,
potential input into problem solution, and quality cost budgeting.
Quality Cost System Example

A hypothetical case history of a quality cost system is presented in this section. In this
example, a cost system is developed for an agency operating sixteen sulfur dioxide monitor-
ing stations. The stations are located within a 50-mile radius and each is equipped with a
continuous sulfur dioxide monitor. The monitoring network has been in operation for 2 years.
The QA Coordinator is given the responsibility for implementing the quality cost system.
The QA Coordinator plans the implementation of the pilot cost system. Planning for the
system includes selecting cost activities, determining cost methods, and establishing pro-
cedures for maintaining the system.
To establish an historical basis, quality costs are estimated for the past year. This allows for
trend observation over an adequate period of time. These costs are shown (see Figure 1) and
discussed in the following paragraphs.
Unacceptable data costs are a major cost group in the failure category. In order to establish
the value of "lost data," the overall monitoring budget is determined from contracts,
accounting documents, and other source documents. Table II summarizes total monitoring
costs for the criteria pollutants; the sulfur dioxide costs are used in this example quality
cost system. The cost data include the maximum possible number of data units and cost per
data unit.
Quality-related costs are estimated for each quarter over the preceding year. The estimated
costs are subject to the following considerations:
• Estimates of time spent by an operator performing a specific activity take into account
the capability of the operator to perform several activities simultaneously. For
example, an operator performing daily analyzer zero/span checks will have time to simul-
taneously perform other duties while the analyzers stabilize to the zero/span inputs.
• The activities are performed by three personnel types: manager, supervisor, and
operator. The cost per hour for each level is consistent with "Cost of Monitoring Air
Quality in the United States."
Analysis and evaluation of the collected cost data will determine several facts about the
example agency's quality effort. The cost data should reflect the present status of the quality
program, where major problem areas exist, and what immediate goals should be established.
A graph of the expenditures for each cost category is shown in Figure 2. Throughout the
preceding year prevention costs were relatively small, appraisal costs were moderate, and
failure costs were significant. Also, failure costs showed an increasing trend throughout the
year.
A Pareto distribution of the failure costs (Figure 3) shows that the major cost contributor is
"lost" data. The "lost" data cost represents over 80 percent of the total failure costs. Although
the "lost" data cost represents less than 20 percent of the total data possible, the cost of this
loss is significant.
16-12

-------
An investigation determines the major cause of the problem to be a shortage of station
operators. The workload of the one full-time operator does not allow adequate time for an
effective preventive maintenance program. The lack of proper preventive maintenance
increases the frequency of analyzer/equipment failure, resulting in an additional workload for
the station operator, i.e., equipment repair.
The quality manager prepares a quality cost report covering the initial study results. The
report presents several recommendations, including:
• Hire and train an additional operator.
• Increase prevention efforts for the monitoring operation.
• Reduce failure costs 50% by the end of the next reporting period.
During the following quarter, an additional operator was hired and trained. Preventive
maintenance procedures were reviewed and modified as required. At the end of this
reporting period, quality costs were collected, analyzed, and evaluated. The quality cost
report covering this reporting period shows that failure costs were reduced 37%, prevention
costs were increased 81%, and appraisal costs increased 32%. A net decrease in total quality
cost, amounting to $2,584 (11%), was experienced for the quarter, as seen in Figure 1 when
comparing the first quarter of 1979 with the fourth quarter of 1978.
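The percentage changes quoted above follow directly from the category totals in Figure 1; a minimal check of the arithmetic:

```python
# Quarter-over-quarter changes computed from Figure 1's category totals
# (4th quarter 1978 versus 1st quarter 1979).
q4 = {"prevention": 1973, "appraisal": 5766, "failure": 16287, "total": 24026}
q1 = {"prevention": 3576, "appraisal": 7610, "failure": 10256, "total": 21442}

for cat in ("failure", "prevention", "appraisal"):
    change = (q1[cat] - q4[cat]) / q4[cat]
    print(f"{cat}: {change:+.0%}")  # failure -37%, prevention +81%, appraisal +32%

net_decrease = q4["total"] - q1["total"]
print(f"net decrease: ${net_decrease:,} ({net_decrease / q4['total']:.0%})")  # $2,584 (11%)
```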
The changes in category expenditures (Figure 2) reflect specific corrective measures
initiated during the reporting period. These measures included hiring and training an addi-
tional operator and increasing the preventive maintenance effort.
Although the unacceptable data costs were decreased significantly, these costs are still
excessive, and a preliminary analysis of the latest sulfur dioxide data indicates that additional
effort in preventive maintenance is necessary to further reduce the network's operating costs.
16-13

-------
TOTAL QUALITY COST SUMMARY
(Combined network costs, 1978-79)

COST GROUP                          2nd Quarter  3rd Quarter  4th Quarter  1st Quarter
PREVENTION
  Planning & documentation                                                       179
  Procurement                                                                    179
  Training                                                                       459
  Preventive maintenance                    588          559          587      1,046
  System calibration and operation        1,254        1,317        1,386      1,713
TOTAL PREVENTION COSTS                    1,842        1,876        1,973      3,576
APPRAISAL
  QC measures                               768          806          742      1,631
  Audits                                  1,308        1,508        1,470      1,913
  Data validation                         1,468        1,668        1,868      1,887
  QA assessment & reporting               1,748        1,839        1,686      2,179
TOTAL APPRAISAL COSTS                     5,292        5,821        5,766      7,610
FAILURE
  Problem investigation                   1,579        1,886        1,760        704
  Corrective action                       1,361        1,334        1,365        546
  Lost data (unacquired data)            12,430       13,893       13,162      9,506
TOTAL FAILURE COSTS                      15,370       17,113       16,287     10,256
TOTAL QUALITY COSTS                      22,504       24,810       24,026     21,442
MEASUREMENT BASES
  Total program cost per quarter         48,304
  Total data units per quarter           33,792

Figure 1. Total quality cost summary.

-------
[Line graph: quality cost ($1,000) versus quarter, second quarter 1978 through first quarter 1979, with curves for total, failure, appraisal, and prevention costs.]
Figure 2. Quality cost trends.
[Bar chart: percent of total failure cost attributable to problem investigation, lost data, and corrective action.]
Figure 3. Failure cost distribution.
16-15

-------
TABLE 1. EXAMPLE COST ACTIVITIES GUIDE FOR AMBIENT AIR MEASUREMENT METHODS

APPRAISAL COST CATEGORY
  QC Measures: Intercomparison of calibration teams. Operation of collocated
    samplers. Special activities performed by station operator strictly for
    obtaining data to evaluate accuracy and precision.
  Audits: Independent performance and system audits.
  Data Validation: Statistical evaluation of data. Review of data handling
    procedures. Comparison of data with historical data and data from nearby
    stations. Review and evaluation of: control tests, audit schemes, audit
    results, special interference tests.
  QA Assessment and Reporting: Assessment of overall data quality. Generation
    of data quality reports. Maintenance of quality cost system records.

PREVENTION COST CATEGORY
  Planning and Documentation: Planning and preparation of: audit schedules and
    procedures, operating standards, apparatus configuration, control
    procedures, special tests, operating procedures, preventive maintenance
    procedures.
  Procurement: Specification and acceptance testing of: calibration
    gases/devices, equipment.
  Training: Operator training/proficiency evaluation. Certification of
    calibration personnel and equipment.
  Preventive Maintenance: Maintenance of records, logs (operation, calibration,
    and maintenance). Performance of preventive maintenance procedures.
  System Calibration: Routine multipoint calibration of analyzers. Zero/span
    checks. Special checks on calibration system (voltage, temperature,
    variation, etc.).

FAILURE COST CATEGORY
  Problem Investigation: Special tests and information collection to detect
    source and characteristics of problem. Data and procedure review to
    identify problem area.
  Corrective Action: Cost of correcting data. Cost of changing established
    procedures to prevent recurrence of problems.
  Lost Data (unacquired and/or unacceptable data): Estimated value of data
    which was lost or invalidated.

-------
TABLE II. Total monitoring cost (dollars).

            Annualized      Maximum
            Total Cost      Data Units*     Cost Per
Pollutant   Per Station     Per Station     Data Unit
CO              9,969           8,448           1.18
SO2            12,076           8,448           1.43
O3              8,713           8,448           1.03
TSP             1,535              61          25.10
NO2             8,757           8,448           1.04
THC             9,231           8,448           1.09

TOTAL FOR SO2 = $12,076 x 16 = $193,216
*Maximum data units for continuous analyzers based on total possible
hourly averages per year.
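The sulfur dioxide line of Table II follows directly from the per-station figures; a minimal sketch of the arithmetic:

```python
# Cost per data unit and total network cost for the SO2 stations in Table II.
annual_cost_per_station = 12076    # dollars, from Table II
max_data_units_per_station = 8448  # possible hourly averages per year
stations = 16

cost_per_data_unit = annual_cost_per_station / max_data_units_per_station
network_total = annual_cost_per_station * stations

print(round(cost_per_data_unit, 2))  # 1.43, matching Table II
print(network_total)                 # 193216
```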
Summary

The first step in implementing a quality cost system for an environmental monitoring pro-
gram is to categorize quality-related activities into prevention, appraisal, and failure
categories. An example listing for measurement methods involving continuous gaseous
analyzers is given in this paper. Major items to be considered when implementing a system
have been discussed, along with an example quality cost system. Management should place
emphasis on preventive activities to decrease the total cost of quality-related
activities.
References

1. Rhodes, Raymond C., and Seymour Hochheiser. "Quality Costs for Environmental
Monitoring Systems," American Society for Quality Control, Technical Conference, 1977,
p. 328.
2. Strong, R. B., J. H. White, and F. Smith. "Guidelines for the Development and
Implementation of a Quality Cost System for Air Pollution Measurement Programs,"
Research Triangle Institute, Research Triangle Park, North Carolina, 1980, EPA Contract
No. 68-02-2722.
3. American Society for Quality Control, Quality Costs Technical Committee. "Guide for
Reducing Quality Costs," Milwaukee, Wisconsin, 1977.
4. American Society for Quality Control, Quality Cost-Effectiveness Committee. "Quality
Costs-What and How," Milwaukee, Wisconsin, 1977.
5. PEDCo Environmental, Inc. "Cost of Monitoring Air Quality in the United States,"
December 1978.
16-17

-------
TECHNICAL REPORT DATA
(Please read instructions on the reverse before completing)
1. REPORT NO.: EPA 450/2-81-016
3. RECIPIENT'S ACCESSION NO.:
4. TITLE AND SUBTITLE: APTI Course 470, Quality Assurance for Air Pollution
   Measurement Systems, Student Workbook, Second Edition
5. REPORT DATE: May 1984
6. PERFORMING ORGANIZATION CODE:
7. AUTHOR(S): B. Michael Ray
8. PERFORMING ORGANIZATION REPORT NO.:
9. PERFORMING ORGANIZATION NAME AND ADDRESS: Northrop Services, Inc.,
   P.O. Box 12313, Research Triangle Park, NC 27709
10. PROGRAM ELEMENT NO.: B18A2C
11. CONTRACT/GRANT NO.: 68-02-3573
12. SPONSORING AGENCY NAME AND ADDRESS: U.S. Environmental Protection Agency,
    Manpower and Technical Information Branch, Air Pollution Training
    Institute, Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED: Student Workbook
14. SPONSORING AGENCY CODE: EPA-OAR-OAQPS
15. SUPPLEMENTARY NOTES: Project Officer for this publication is
    R. E. Townsend, EPA-ERC, RTP, NC 27711
16. ABSTRACT: This workbook is designed for use in APTI Course 470, "Quality
    Assurance for Air Pollution Measurement Systems." The workbook contains
    course and lesson objectives, selected reading material, representations
    of course visuals, and problem exercises covering the following topics:
      Basic Areas of QA Activities; Managerial QA Elements; Statistical
      Control Charts; Calibration; Regression Analysis; Data Validation;
      Outliers; Intralaboratory/Interlaboratory Testing; Procurement Quality
      Control; Performance/System Audits; QA Requirements for SLAMS/PSD;
      Quality Costs
    This workbook is designed for use with the APTI Course 470 Instructor's
    Guide, Second Edition (EPA 450/2-81-015) and Volumes I and II of the
    Quality Assurance Handbook for Air Pollution Measurement Systems
    (EPA 600/9-76-005, EPA 600/4-77-027a).
17. KEY WORDS AND DOCUMENT ANALYSIS
    a. DESCRIPTORS: Training; Quality Assurance; Quality Control;
       Air Pollution; Measurement
    b. IDENTIFIERS/OPEN ENDED TERMS: Training Course; Air Pollution
       Monitoring; Data Validation; Control Charts; Calibration
    c. COSATI Field/Group: 13; 5I; 68A
18. DISTRIBUTION STATEMENT: Unlimited; available from the National Technical
    Information Service, 5285 Port Royal Rd., Springfield, VA 22161
19. SECURITY CLASS (This Report): Unclassified
20. SECURITY CLASS (This page): Unclassified
21. NO. OF PAGES: 165
22. PRICE:
EPA Form 2220-1 (9-73)
16-18
-------