United States
Environmental Protection
Agency
Air Pollution Training Institute
EPA 450/2-81-016A
MD 20
Environmental Research Center
April 1981
Research Triangle Park NC 27711
APTI
Course 470
Quality Assurance
for Air Pollution
Measurement Systems
Student Workbook
Prepared By:
B. M. Ray
Northrop Services, Inc.
P.O. Box 12313
Research Triangle Park, NC 27709
Under Contract No.
68-02-2374
EPA Project Officer
R. E. Townsend
United States Environmental Protection Agency
Office of Air, Noise, and Radiation
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711

-------
Notice
This is not an official policy and standards document. The opinions and selections are
those of the authors and not necessarily those of the Environmental Protection
Agency. Every attempt has been made to represent the present state of the art as well
as subject areas still under evaluation. Any mention of products or organizations does
not constitute endorsement by the United States Environmental Protection Agency.
Availability
This document is issued by the Manpower and Technical Information Branch, Control
Programs Development Division, Office of Air Quality Planning and Standards,
USEPA. It was developed for use in training courses presented by the EPA Air Pollu-
tion Training Institute and others receiving contractual or grant support from the
Institute. Other organizations are welcome to use the document.
This publication is available, free of charge, to schools or governmental air pollution
control agencies intending to conduct a training course on the subject covered. Submit
a written request to the Air Pollution Training Institute, USEPA, MD-20, Research
Triangle Park, NC 27711.
Others may obtain copies, for a fee, from the National Technical Information Service
(NTIS), 5285 Port Royal Road, Springfield, VA 22161.
Sets of slides and films designed for use in the training course of which this publication
is a part may be borrowed from the Air Pollution Training Institute upon written
request. The slides may be freely copied. Some films may be copied; others must be
purchased from the commercial distributor.
ii

-------
COURSE 470
QUALITY ASSURANCE FOR AIR POLLUTION MEASUREMENT SYSTEMS
TABLE OF CONTENTS
LESSON	CHAPTER
Registration, Course Information, and Pre-test	1
Basic Areas of Quality Assurance Activities	2
Managerial Quality Assurance Elements for Establishing
a Quality Assurance Program and Recording Changes	3
Review of Pre-Course Problem 3	3A
Basic Concepts of Statistical Control Charts	4
X-R Statistical Control Charts	5
The Measurement Process with Emphasis on Calibration	6
Group Problem	6A
Review of Control Chart Homework	6B
Regression Analysis and Control Charts for Calibration Data	7
Review of Pre-Course Problems 1 and 2	7A
Identification and Treatment of Outliers	8
Intralaboratory Testing	9
Interlaboratory Testing	10
Procurement Quality Control	11
Performance Audits	12
Systems Audits	13
Quality Assurance Requirements for SLAMS and PSD	14
Precision Work Session	14A
Data Validation	15
Quality Costs	16
iii

-------
APPENDIX	SECTION
The Quality Assurance Bibliography
iv

-------
REGISTRATION, COURSE INFORMATION, AND PRE-TEST
LESSON GOAL:	To familiarize students with the overall structure
of the course.
LESSON OBJECTIVES:	At the end of this session, you should know:
1.	By whom this course, 470, "Quality Assurance for
Air Pollution Measurement Systems," is presented.
2.	The name of all instructors and their affiliations.
3. The phone number where you may receive messages
during the course offering.
4.	The course goal and objectives.
5.	The requirements for passing the course.
6.	The teaching approach used in this course.
7.	The location of
	a.	Restrooms
	b.	Refreshments
	c.	Emergency exits
8.	The address and phone number of EPA - Air Pollution
Training Institute:
United States Environmental Protection Agency
Environmental Research Center
Air Pollution Training Institute
MD-20
Research Triangle Park, NC 27711
919-541-2766
FTS: 629-2766
1-1

-------
NOTES
1-3

-------
#470 March 19-22, 1984 Student Roster
Sammy L. Archer
Scott Paper Co.
2600 Federal Ave.
Everett, WA 98201
(206) 259-7422
Charles L. Goerner
Texas Air Control Board
6330 Hwy. 290 E.
Austin, Texas 78723
Diana S. Gould
Ventura County
800 S. Victoria Ave.
Ventura, CA 93009
(805) 654-3818
Karl P. Barke
Springfield-Greene Co. APCA
227 E. Chestnut Expressway
Springfield, MO 65802
(417) 864-1662
William J. Basbagill
EPA Region 8
1960 Lincoln St.
Denver, CO 80295
(303) 234-6849
Tim Booker
EIC/Air Quality Bureau
P.O. Box 968
Santa Fe, New Mexico 87504
(505) 984-0020
Leslie Ann Carpenter
Wash. Dept. of Ecology
4350 NE 150th
Redmond, WA 98052
(206) 731-1111
John A. Coefield
Montana Air Quality Bureau
Cogswell Bldg.
Helena, MT 59620
(406)444-3454
Jerald A. Cross
EPA Air Quality Lab
P.O. Box 25366
Denver Federal Center
Lakewood, Colorado 80225
(303) 234-6849
W.S. Davis, III
Florida Env. Reg. Dept.
2600 Blair Stone Rd.
Tallahassee, FL 32301
(904) 488-1344
Robert Givens
Wash. Dept. of Ecology
4350 NE 150th
Redmond, WA 98052
(206) 459-6240
James T. Hansen
Wisconsin Dept. Nat. Resources
Box 12436
Milwaukee, Wisconsin 53212
(414) 562-9577
David L. Kay
US Army Depot
Activity: UMATILLA
Attention: SDSTE-UA-QA
Hermiston, Oregon 97838
(503) 567-6421 x-248
Ken A. Knowle
Puget Sound Air Pollution
Control Agency
200 West Mercer
P.O. Box 9863
Seattle, WA 98109
(206) 344-7323
Anupan Komkrichwarakool
Electricity Generating
Authority of Thailand
Nonthaburi 11000
Thailand
or
Acres Consulting Services
P.O. Box 1001
Niagara Falls, Ontario
3/19/84
Warren Krug
Northwest Air Poll. Authority
207 Pioneer Bldg.
Mount Vernon, WA 98273
(206) 336-5705
Alben T. Myren, Jr.
InterMountain Ambient
P.O. Box 5106
Missoula, MT 59806
(406) 543-6174
James J. Olsen
Montana Air Quality
Cogswell Bldg.
Helena, MT 59620
(406) 444-3454
Duane Pollock
Puget Sound APCA
P.O. Box 9863
Seattle, WA 98109
(206) 344-7323
Herman A. Ragsdale
Ventura APCD
800 S. Victoria Ave.
Ventura, CA 93009
(805) 654-2668
Robert Raisch
Air Quality Bureau
Cogswell Bldg.
Helena, MT 59620
(406) 444-3454
Michael A. Stahl
Getty Oil Co.
P.O. Box 10267
Bakersfield, CA 93389
(805) 832-4010
Rick Volpel
6305 S.E. Needham
Milwaukee, OR 97222
(Oregon - DEQ)
(503) 229-5983
Gary L. Young
Polk Co. Air Poll. Control
1915 Hickman
Des Moines, Iowa 50314
(515) 286-3992
Joyce L. Hargreaves
Vermont Agency of
Environmental Conservation
Air & Solid Waste Program
Montpelier, VT 0552
(802) 828-3395
E. Craig Jackson
Getty Oil Co.
Rt. 1 Box 197-x
Bakersfield, CA 93308
(805) 399-2961

-------
COURSE GOAL
To train students in quality assurance principles and techniques to the
extent that they will understand their usefulness and be able to apply
them in the development and implementation of a comprehensive quality
assurance program for air pollution measurement systems.
COURSE OBJECTIVES
Upon successful completion of this course, the student will be able to
coordinate the design of a comprehensive quality assurance plan for an
air pollution measurement system. Specifically, the student should be
able to:
a.	Develop an organizational plan for quality assurance, including the
development of an organization chart indicating those positions with
major quality assurance responsibilities, the delineation of the
quality assurance responsibilities for key personnel, and the de-
velopment of an implementation schedule in terms of the various
elements of quality assurance.
b.	Formulate a quality assurance policy for an air pollution monitoring
organization.
c.	Develop objectives for a measurement process in terms of completeness,
precision, accuracy, representativeness, and comparability.
d.	Describe the principles that should be considered in preparing
quality reports to management and the quality facts that should
be reported.
e.	Describe the types of training that are available to develop and
maintain personnel at the level of knowledge and skill required to
perform their jobs.
f.	Design a reporting format for quality costs that allocates the
quality assurance elements into cost categories.
g.	Compare and contrast a quality assurance manual and quality assurance
plan in terms of their components (elements) and functions.
h.	Explain the importance of establishing a closed loop corrective
action system.
i.	Explain the purposes for and describe how a basic document control
system and a basic configuration control system should be established.
j.	List the factors that should be considered in designing a preventive
maintenance program.
k.	Describe the mechanisms which can be used to assure the quality of
procured items.
l.	Define the two types of audits recommended by EPA and describe the
steps and factors that must be considered in the design of each.
m.	Describe the types of quality control checks that should be performed
on sample collection and analysis systems (manual and continuous),
and what statistical analyses and records should be maintained.
n.	Describe the purposes of both intralaboratory and interlaboratory
testing programs, the factors that must be considered in establishing
the programs, and the methods of analyzing and reporting results
of each program.
o.	Develop calibration programs incorporating the elements recommended
in the EPA Quality Assurance Handbook, Volume I.
1-5

-------
p.	Select the appropriate types of control charts to be used to control
measurement systems, calculate control limits for them, and interpret
plotted results.
q.	Outline the basic elements of a data qualification scheme for estimating
accuracy and precision, select the appropriate statistical techniques
to be used, and calculate estimates of precision and accuracy.
r.	Explain the importance of timely data validation and, using appropriate
techniques, develop a data validation scheme for a given air pollution
monitoring system.
1-6

-------
BASIC AREAS OF QUALITY ASSURANCE ACTIVITIES
LESSON GOAL:	To familiarize students with the four basic areas of
quality assurance and the various activities that
relate to each.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Define quality assurance.
2.	List the four basic areas of quality assurance:
management, measurement, systems, statistics.
3.	Recognize specific activities that relate to
each basic area.
4.	Explain the need for quality assurance to
be involved with the wide scope of activities
which affect air pollution data quality.
5.	Explain the dynamic nature of a quality assurance
program, i.e., the need for continual improvement
of the program through planning, implementation,
assessment, and corrective action.
2-1

-------
BASIC AREAS OF QUALITY
ASSURANCE ACTIVITIES
•	Measurement
•	Management
•	Systems
•	Statistics

[Slide: diagram of an air monitoring network feeding data through a quality
assurance program (calibration, audit of calibration, control) to produce
valid data]

MONITORING SYSTEM
Variable	QC Activity
•	method	technical procedure
•	materials	procurement
•	machines	maintenance (preventive/corrective)
•	men/women	training
•	measurement	calibration procedures,
	operating procedures

MONITORING SYSTEM
Variable	QC Activity
•	monitoring sites	conditions
•	mathematics	computations
•	management	objectives
•	manual	policies
•	motivation	procedures
•	meteorology	siting
•	money	quality costs
2-3

-------
"2 5S
uratlon	CorrMthr# Action
c##,,,S5tf«» QA program 0o
V©*	*	**¦/»/-
X	<• o °<\ "
^	:?'« \
«*  3 4 5 6 / meter m
- 1 ' ' • 1 -' length
kilogram-kg
mass
—if 7\i— second-s 	
	Iks ~ I!«l 	 ..			

— time
ampere-A
electric current
_ temperature
mole-mol
amount of
substance
~ candela-cd
luminous
intensity
2-4

-------
METROLOGY: the science of measurement

METROLOGY REFERENCES
(available from
National Bureau of Standards)
Special Publication 300, Volume I,
"Precision Measurement
and Calibration:
Statistical Concepts
and Procedures"
Special Publication 408,
"Standard Reference
Materials
and Meaningful
Measurements"

SYSTEMS
•	quality planning
•	procurement
quality control
•	document control
•	preventive
maintenance
•	configuration control
•	data handling
•	data validation
•	performance/
systems audits
•	corrective action
•	quality costs

STATISTICS
•	control charts
•	regression analysis
•	outlier tests

THE QA CYCLE
Planning → Implementation → Assessment → Corrective Action

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C.
[Reproduction of a memorandum; only the closing is legible:]
"...of you shares my concern about the need to improve our monitoring programs and
data; therefore, I know that you will take the necessary actions
that will ensure the success of this effort."
Attachment
Douglas M. Costle
2-7

-------
CHALLENGES OF IMPLEMENTING QUALITY
ASSURANCE FOR AIR POLLUTION MONITORING SYSTEMS
Raymond C. Rhodes
Quality Assurance Specialist
S. David Shearer, Jr., Ph.D.
Director
ABSTRACT
Special considerations are necessary in implementing a quality assurance system
for air pollution monitoring. Of particular concern are the following:
(1)	Quality characteristics of environmental data.
(2)	Network design and sampling.
(3)	Measurement methods and standard reference materials.
(4)	Statistical quality control.
(5)	Data analysis and validation.
(6)	Preventive maintenance.

Accuracy, precision, completeness and representativeness are the quality
characteristics of air monitoring data. The physical sampling of the air
environment presents a number of unique and difficult problems. The technology
of air pollution measurement has created special demands for measurement methods
and standard reference materials. Because of the variability patterns of
pollution data, and the non-uniform error variability of the measurement
methods, particular types of statistical control and data analysis and data
validation are required. The wide diversity in the scope and requirements of
compliance and research monitoring makes it necessary to develop flexible
quality assurance procedures. In spite of the many difficulties involved, much
is being accomplished in implementing quality assurance for air pollution
monitoring systems.

-------
INTRODUCTION
With the increased interest and activity in the environment in recent years, a
need exists to apply the principles and techniques of modern quality assurance
to the various pollution monitoring systems. Pollution measurement methods
involve field sampling and chemical laboratory analyses and, to these portions
of the measurement process, most of the traditional laboratory quality control
(Q.C.) techniques apply. Of concern, however, is the need to apply the general
principles and techniques to the entire monitoring system.

The following elements of a quality assurance (Q.A.) system are generally appli-
cable to pollution monitoring systems: (2,8)

Elements of a Quality Assurance System
1.	Quality Policy
2.	Quality Objectives
3.	Quality Organization and Responsibility
4.	Quality Assurance Manual
5.	Quality Assurance Plans
6.	Training
7.	Procurement Control
	Ordering
	Receiving
	Feedback and Corrective Action
8.	Calibration
	Standards
	Procedures
9.	Internal Q.C. Checks
10.	Operations
	Sampling
	Sample Handling
	Analysis
11.	Data
	Transmission
	Computation
	Recording
	Validation
12.	Preventive Maintenance
13.	Reliability Records and Analysis
14.	Document Control
15.	Configuration Control
16.	Audits
	On-Site System Audits
	Performance Audits
17.	Corrective Action
18.	Statistical Analysis
19.	Quality Reporting
20.	Quality Investigation
21.	Interlaboratory Testing
22.	Quality Costs

However, in a number of very important areas, special considerations must be
made. These areas, which require special attention, are:
1.	Quality Characteristics of Environmental Data.
2.	Network Design and Sampling.
3.	Measurement Methods and Standard Reference Materials.
2-10

-------
4.	Statistical Quality Control.
5.	Data Analysis and Validation.
6.	Preventive Maintenance.

The ultimate uses of air pollution monitoring information are decisions relative
to human health and welfare. Air pollution monitoring data are used as measures
of air quality to make the best decisions for human health and welfare.

The quality of air is measured by the cleanliness of the air: are the pollutant
concentrations below the levels established as standards? The quality of air
pollution data is measured by the accuracy, precision, completeness and repre-
sentativeness of the data.
QUALITY CHARACTERISTICS OF ENVIRONMENTAL DATA
These quality characteristics of data may be defined as follows:
1.	Accuracy — The closeness of a measured value to the true value.
2.	Precision — The repeatability of the data (actually the repeata-
bility of the measurement system).
3.	Completeness — The amount of valid data obtained as a fraction of
that intended or planned to be obtained.
4.	Representativeness — The typicalness of the pollution samples with
respect to time, location, and conditions from which
the pollutant data are obtained.

These quality characteristics are not evident nor can they be determined from
examination of the data itself. Measures of accuracy, precision, completeness,
and representativeness must be obtained from other information. Provision for
obtaining measures of these characteristics must be included in the Quality Plan
for each monitoring effort because the relative importance of accuracy, precision,
completeness, and representativeness depends upon the specific objectives of each
monitoring program.
2-11

-------
NETWORK DESIGN AND SAMPLING

The monitoring network design, which incorporates decisions with respect to time,
location and conditions of sampling, along with the specification of pollution
measurement methods and equipment, specifies to a large extent the "process" of
obtaining monitoring data. Quality assurance personnel should be involved with
the network design for pollution monitoring because of the statistical aspects
involved, and because of the need to establish the best possible network at the
beginning of a monitoring effort. Changes in monitoring networks can destroy
the previous history or baseline necessary for trend studies.

The process or media being sampled for air pollution measurement is not in
statistical control, but is subject to many effects such as diurnal cycles,
day-of-week differences, seasonal cycles, and local and area meteorological
factors. The changing pattern of air pollution is a dynamic process, sometimes
"out of control." (6) The objective of a quality assurance program for air
monitoring is to assure that the measurement system remains "in control," no
matter what the state or condition of the air.

Consideration for temporal and spatial effects in the location and scheduling of
pollution sampling are critical concerns with respect to representativeness.
Planning of the network design and sampling schedules is very important since
resampling in air monitoring is impossible. The air which was at the sampling
point a moment ago is no longer available! Although duplicate sampling is
desirable, in air monitoring duplicate sampling is not possible for particulates
and is not very practical for gaseous pollutants. The most satisfactory way of
duplicate sampling for quality assurance purposes is to use duplicate sampling
equipment at the same site. Although such dual sampling requires an additional
2-12

-------
sampling instrument, this procedure is invaluable in estimating the precision of
the total measurement process.

In most chemical analytical work duplicate analyses are desirable. However, for
continuous, automated pollution analysis instruments, reanalysis is not possible.
Reanalysis is possible for some of the manual methods where bubbler solutions or
filter media have been used to collect the pollutants.
MEASUREMENT METHODS AND STANDARD REFERENCE MATERIALS

Most of the manual analytical measurement methods for gaseous pollutants involve
bubbling the air through selective absorbing solutions for an extended period
(usually 24 hours) and then analyzing the solution by wet chemical/absorbance
techniques. These methods have the limitation of providing daily averages only.
In the interest of obtaining more accurate measurements on a short-time basis,
numerous automated instrumental methods have been developed in recent years.
Problems with these instruments include the manufacturing and reliability
problems associated with newly-designed equipment, and the technological problems
of measuring minute concentrations (parts per million or parts per billion) in
the presence of possible interferents. Further, problems arise relating to the
stability and reliability of these instruments if operated remotely or
unattended. The development of completely satisfactory measurement methods is a
very important effort of quality assurance for air monitoring. Because of the
instability of gaseous mixtures, primary standards (Standard Reference Materials
of the National Bureau of Standards) are difficult to prepare, and must be
prepared and assessed from time to time as required by users. For some gases
(for example, ozone), no primary standard has yet been developed. Neither has a
particulate standard for particle size or chemical content in a naturally-
occurring matrix yet been developed.
2-13

-------
Because of the problems in developing and using primary standards for air
pollution measurement, the achievement of comparability for accuracy among the
various agencies and facilities within a country is not an easy task, and this
concern is further magnified when comparability among different countries is
considered. In most other physical measurement areas, comparability among
nations is relatively easily achieved through traceability to common primary
standards.
STATISTICAL QUALITY CONTROL

In traditional quality control systems much importance is placed on the
establishment of average and range (x̄, R) control charts to control quality.
Averages are obtained from measurement of a sample from some assumed homogeneous
rational sub-group of products. In this way, the average is used as a measure
and means of control of the level of the quality characteristic, and the range
of the measurements is used as a measure and means of control of variability.

Except in the laboratory, batches or rational sub-groups seldom exist in
pollution measurement; and even in these cases, replication is accomplished
usually on a duplicate basis only, such as duplicate measures of the same sample,
duplicate analyses by different analysts, or measurement from duplicate co-
located sampling instruments. Further, except for repeated measurements of
homogeneous control samples, the averages of the duplicates vary depending upon
the concentration level. Therefore, the x̄ chart is of little value in quality
control for pollution measurements.

Further, in the cases of duplicate data, some identity can usually be associated
with each of the pair of measurements, so that the range is not the best value
of interest. Because of suspected bias between the two sources, signed
differences should be used rather than the unsigned range. Further, since the
2-14

-------
average levels may vary widely between pairs, and the error variation is
usually proportional to levels, the value of concern is the signed percentage
difference (or signed relative difference). This value is an appropriate
parameter to plot on control charts as a means to control variability of the
measurement process.
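A minimal sketch (in Python, with hypothetical collocated-sampler values) of how such signed percentage differences might be computed and given control limits; the relative-to-the-pair-average convention and the ±3 standard deviation limits are assumptions for illustration, not a prescription from the text:

# Sketch: signed percentage differences for collocated (duplicate) samplers.
# Data values are hypothetical; limits are taken as +/- 3 standard deviations
# about the mean of the historical signed percentage differences.
regular   = [41.0, 38.5, 55.2, 60.1, 47.3]   # regular sampler, ug/m3
duplicate = [39.8, 40.1, 54.0, 62.5, 46.0]   # collocated duplicate, ug/m3

def signed_pct_diff(x, y):
    # Signed difference relative to the average of the pair, in percent.
    return 100.0 * (x - y) / ((x + y) / 2.0)

d = [signed_pct_diff(x, y) for x, y in zip(regular, duplicate)]
mean_d = sum(d) / len(d)
sd_d = (sum((v - mean_d) ** 2 for v in d) / (len(d) - 1)) ** 0.5
print("signed % differences:", [round(v, 2) for v in d])
print("center line %.2f, control limits %.2f to %.2f"
      % (mean_d, mean_d - 3 * sd_d, mean_d + 3 * sd_d))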
Control on the accuracy of the data must be maintained by frequent calibrations
with materials traceable to primary standards. Some type of calibration is
usually required on air pollution measurement systems daily or for each use, and
occasionally calibration is necessary before, during and after analysis of a
given batch of samples. Control charts which may be maintained to assure that
the calibration process remains in statistical control are those for the
slope, intercept, and standard error of prediction for the calibration curves
for multipoint calibrations, and zero and span drift checks to control the drift
of continuous instruments.
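As an illustration of the calibration-curve quantities named above, a minimal sketch fitting a hypothetical multipoint calibration by ordinary least squares; each of the three statistics printed would be plotted on its own control chart:

# Sketch: least-squares fit of a hypothetical multipoint calibration.
# x = known input concentration (ppm), y = instrument response.
x = [0.0, 0.1, 0.2, 0.3, 0.4]
y = [0.002, 0.098, 0.205, 0.296, 0.404]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = my - slope * mx
# Standard error of prediction: residual standard deviation with n - 2 df.
sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
std_err = (sse / (n - 2)) ** 0.5
print("slope = %.4f, intercept = %.4f, std. error = %.4f"
      % (slope, intercept, std_err))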
DATA ANALYSIS AND VALIDATION

A number of special considerations exist in air pollution measurement systems
with respect to data analysis and data validation. For most air pollution
measurements, the error variations are proportional to the pollutant concentra-
tion level, thus complicating error analysis of the measurement system.

The aggregate frequency distributions of air pollution data are skewed, often
lognormal or nearly so, requiring logarithmic or other transformations when
summarizing or analyzing data distributions. Complications arise when
taking logarithms of zero values! Also, special treatment of data below the
minimum detectable levels may be required in the characterization or summari-
zation of air pollution data.
2-15

-------
Because of the many possible causes of variability in air pollution data, the
data validation process as a separate activity is very important in air
monitoring. (5) Since the quality of the data is not evident from the data
itself, routine checks of ancillary data for accuracy and precision must
be made. Some further checks of the data with relation to other data or
information may be made to validate the final product. Various types of checks
which can and should be made include:

Manual Editing — checks for human error or equipment malfunction, such as:
1.	impossibly high or low values,
2.	spikes, such as caused by electronic interference, and
3.	repetitious values, such as caused by equipment malfunction.

Scientific Validation — checks involving scientific considerations, such as:
1.	time continuity,
2.	spatial continuity,
3.	relationships among different pollutants, and
4.	relationships with meteorological data.
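A minimal sketch of the manual-editing checks listed above, applied to a hypothetical hourly record; the thresholds are illustrative assumptions, not EPA criteria:

# Sketch: flag impossibly high or low values, spikes, and repetitious values.
hourly = [0.04, 0.05, 0.05, 0.61, 0.05, 0.05, 0.05, 0.05, -0.01]  # ppm

MIN_OK, MAX_OK = 0.0, 0.50   # plausible range for this pollutant (assumed)
SPIKE_FACTOR = 5.0           # a value > 5x its neighbors is suspect (assumed)
MAX_REPEATS = 3              # > 3 identical values in a row is suspect (assumed)

flags = []
run = 1
for i, v in enumerate(hourly):
    if v < MIN_OK or v > MAX_OK:
        flags.append((i, v, "outside plausible range"))
    if 0 < i < len(hourly) - 1:
        neighbors = (hourly[i - 1] + hourly[i + 1]) / 2.0
        if neighbors > 0 and v > SPIKE_FACTOR * neighbors:
            flags.append((i, v, "possible spike"))
    if i > 0 and v == hourly[i - 1]:
        run += 1
        if run > MAX_REPEATS:
            flags.append((i, v, "repetitious values"))
    else:
        run = 1

for i, v, reason in flags:
    print("hour %d: %6.2f  -- %s" % (i, v, reason))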
PREVENTIVE MAINTENANCE

Preventive maintenance activities are not usually considered as part of quality
assurance. However, for air pollution monitoring systems, the effectiveness of
preventive maintenance is critical in determining the continuous operation of
remote, unattended sampling equipment, particularly automatic sampling/analysis
instruments. Unplanned malfunctioning of these instruments can prevent the
obtaining of sample results for peak concentration periods, or prevent the
accumulation of sufficient data to establish valid trend information.

Needless to say, all the above special and important features indeed make
implementation of quality assurance of air pollution monitoring systems an
interesting, but difficult and challenging, effort.
2-16

-------
REFERENCES
1.	Curran, Thomas G. and Neil H. Frank, "Assessing the Validity of the
Lognormal Model when Predicting Maximum Air Pollution Concentrations."
U.S. Environmental Protection Agency, Research Triangle Park, North
Carolina. 75-51.3.
2.	Environmental Protection Agency, "Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume I, Principles." EPA-600/9-76-005,
March 1976.
3.	Larsen, Ralph I., "An Air Quality Data Analysis System for Interrelating
Effects, Standards, and Needed Source Reductions." Air Pollution Control
Association Journal, November 1973 and June 1974.
4.	National Bureau of Standards, "Precision Measurement and Calibration,
Statistical Concepts and Procedures." Special Publication 300, Volume I,
February 1969.
5.	Rhodes, R.C. and R. Jurgens, "Data Validation Techniques Used in the
Regional Air Monitoring Study of the St. Louis Regional Air Pollution
Study," proceedings for "A Conference on Environmental Modeling and
Simulation." Environmental Protection Agency, ORD and OPM, Cincinnati,
Ohio, April 20-22, 1976.
6.	Rhodes, R.C., "Importance of Sampling Errors in Chemical Analysis,"
symposium on "Validation of the Measurement Process," American Chemical
Society, New York, New York, April 4-5, 1976.
7.	Rhodes, R.C., "Quality Assurance for ADP (and Scientific Interaction
with ADP)," proceedings of second ORD ADP Workshop, Environmental
Protection Agency, Gulf Breeze, Florida, November 11-14, 1975.
8.	Rhodes, R.C. and S. Hochheiser, "Quality Costs for Environmental
Systems," transactions, 30th Annual Technical Conference, American
Society for Quality Control, Toronto, Ontario, Canada, June 7-9, 1976.
9.	von Lehmden, D.V., R.C. Rhodes and S. Hochheiser, "Applications of
Quality Assurance in Major Air Pollution Monitoring Studies—CHAMP
and RAMS," proceedings of International Conference on Environmental
Sensing and Assessment, Las Vegas, Nevada, September 14-19, 1975.
2-17

-------
6/30/76
8/23/77 Rev.
QUALITY ASSURANCE FOR POLLUTANT MONITORING
by
R.C. Rhodes
An on-going monitoring system will already have implemented a number
of essential elements of a total quality assurance system. When reviewing
an existing monitoring operation or when establishing a new monitoring effort,
it is very desirable that a systematic review be made to consider or
reconsider the quality assurance activities which should be required.
The various elements of a total quality assurance program, listed
below, are discussed in the "Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I, Principles," EPA 600/9-76-005, March 1976.
Quality policy
Quality objectives
Quality organization
	and responsibility
QA manual
QA plans
Training
Procurement control
	Ordering
	Receiving
	Feedback and
	corrective action
Calibration
	Standards
	Procedures
Internal QC checks
Operations
	Sampling
	Sample handling
	Analysis
Data
	Transmission
	Computation
	Recording
	Validation
Preventive maintenance
Reliability records and
	analysis
Document control
Configuration control
Audits
	On-site system
	Performance
Corrective action
Statistical analysis
Quality reporting
Quality investigation
Interlab testing
Quality costs
The extent to which each of the above elements should be implemented
by a given agency will depend upon (1) the objective of the monitoring,
2-18

-------
-2-
(2) the duration of the monitoring period, and (3) the type of sampling/
analysis methods utilized. Each monitoring agency should review the
quality assurance elements with respect to their particular needs, and
should establish a prioritized long-range plan (schedule) for implementation.
For on-going monitoring efforts the quality assurance program should be
dynamic in nature, being continually improved and revised according to
increased knowledge, changing conditions, and assigned priorities.
The elements listed above fall into 4 general categories:
(1)	Management — those activities which are of particular concern
to, and must be initiated and sustained by management, notwith-
standing the fact that all activities of a monitoring system
are management's responsibility.
(2)	Measurement — those activities which are directly involved
in the sampling and analysis of pollutant concentrations.
(3)	Systems — those activities mainly involving the paperwork
systems essential to operate and support the quality assurance
system.
(4)	Statistics — those computational and statistical analysis
techniques and procedures which are necessary as part of the
quality assurance system.
From the above, it is evident that a total quality assurance program is
concerned with all activities which may affect the quality of the monitoring
data, and is not limited in a very narrow sense to essential calibrations
and a few routine duplicate analytical checks.
Management. It is obvious that management's responsibilities should
include a stated written policy and objectives concerning quality. The need
2-19

-------
-3-
for monitoring data of high quality must be continually made evident
by the management with a continual awareness of such need by all the
people whose activities affect the quality of the data. One individual
of the organization should be specifically designated and assigned the
responsibility to oversee all quality assurance activities, even though
the individual may have other assigned duties, and even though "Quality
assurance is everybody's business." This individual should be designated
as the "Quality Assurance Coordinator."
Management should establish training requirements for each individual
whose activities affect quality. Detailed systematic written plans should
be prepared summarizing the various quality control checks made for each
pollutant measurement method or special project. A manual containing
administrative-type procedures applicable to all measurement methods and
projects and to general quality assurance activities should, in time, be
prepared to consolidate in one document all quality-related procedures.
The manual should incorporate the above-mentioned plans by reference.
Management, obviously, is concerned with costs. And after operation
of a monitoring system for, say, a year, a systematic review should be
made of the costs related to quality, to assess the cost-effectiveness of
these activities, and to make indicated changes in expenditures of effort
to obtain the most high quality data for the least cost.
Additionally, management should establish some type of periodic
(say quarterly) report summarizing quality assurance activities and providing
some continual assessment or measure of data quality. This report should
be prepared by the Quality Assurance Coordinator.
2-20

-------
-4-
Measurement. Various EPA guideline documents* have been prepared
for each measurement method. These documents provide the identification
of calibration standards and detailed procedures for calibration for each
of the methods. Also included in these documents are detailed procedures
and internal quality control checks which should be made for the sampling,
sample handling, and analysis for each of the methods.
It may be economically prohibitive to implement all of the recommended
checks of these documents, at least initially. Specific minimum checks
for ambient methods are included in EPA 600/4-77-027a, "Quality Assurance
Handbook for Air Pollution Measurement Systems," Volume II, Ambient Air
Specific Methods, May 1977. Specific minimum checks for source emission
methods are included in EPA 600/4-77-027b, "Quality Assurance Handbook for
Air Pollution Measurement Systems," Volume III, Source Emission Specific
Methods, August 1977. Some judgment may need to be exercised as to which
checks seem to be most critical and need to be implemented first. However,
it is best to implement more checks at a lesser frequency than to concentrate
heavily on just a few. The frequency of quality control checks should be
flexible, being increased for those which by experience seem to give most
problems, and being decreased for those which seem consistently to remain
"in control." Similar reasoning applies with respect to the types and
frequencies of independent performance audits described in the guideline
documents.
*EPA R4-73-028, Environmental Monitoring Series (for ambient monitoring
methods)
EPA 650/4-74-005, Environmental Monitoring Series (for source emission
monitoring methods)
2-21

-------
-5-
One essential for obtaining high quality data is the procurement
of measurement equipment and materials of adequate quality. Adequate
specifications should be included in the procurement ordering documents,
and the equipment and materials should be given adequate inspection when
received. Generally, procured ite-is should not be paid for until after
they have been determined to meet the specifications. Obviously, those
methods and equipment designated or specified by the government as
official for determining compliance to ambient air or source standards
should be strictly and consistently complied with.
One part of the measurement method which may not receive adequate
attention is that for flow measurement. For those methods which require
flow measurement, the flow measurement is equally as important as the
pollutant measurement.
A critical requirement of the measurement method (for pollutant and
flow) is the use of secondary reference standards for calibration, traceable
to a national or international primary standard.
Systems. Detailed, systematic and meticulous records need to be
kept concerning all of the necessary measurements.and computations integrally
involved with the measurement process. Of equal importance is the record-
keeping concerning (1) the written procedures for calibration, operation
and computations, (2) preventive maintenance procedures and records, and
(3) measurement equipment records. A document control system should be
established to identify by number and date each written procedure or
revisions thereof so that the exact procedure used at any specified time
(past and present) can be determined. A configuration control system
should be established to record the nature and dates of any changes in the
2-22

-------
-6-
hardware design, or major corrective maintenance of the sampling, sample
handling, and analysis equipment. These records should be kept by
manufacturer's serial number or an agency-assigned identification number.
Such records should enable one to determine for any past and present
time, the exact configuration of any specific piece of equipment. Also
considered as part of a configuration control system is the site assignment
history for each piece of identified sampling equipment.
Recordkeeping systems are essential to record changes to the
procedures and equipment of the monitoring system. Experienced quality
assurance and statistical personnel are suspicious of the possible effects
of changes to the total measurement process. Their motto might well
be "CAVE VICISSITUDINES" or "CAVE VARIETASOftentimes, seemingly
innocuous changes may cause significant changes in the results. As a
precaution against the introduction of such undesirable effects into the
system, the basic principle of performing overlap checks or comparisons
should be made to assure that such changes are appropriately valid.
Statistics. The use of statistical analyses is essential to an
adequate quality assurance system. Some of the more basic statistical
applications are presented in APTD 1132, "Quality Control Practices in
Processing Air Pollution Samples." Other applications are included in
the Appendices to EPA 600/9-76-005. If a given agency does not have a
person with some training and experience in the basic statistical applica-
tions presented in these documents, either (1) an individual of the agency
*CAVE VICISSITUDINES: Beware of changes
CAVE VARIETAS: Beware of differences
2-23

-------
-7-
with mathematical capability should attend a course to receive such
training or (2) a statistician experienced in these applications should
work with individuals of the agency on a temporary consulting basis
to establish such techniques and provide such training. The applications
of statistics to air monitoring extend from the simplest (control charts)
to the very complex (modeling and computer simulation) and are limited
only by the statistical and computation capability of available personnel
and resources. The techniques of data validation and equipment reliability
analyses are two specific applications of value to a local agency.
In addition to the above, several points deserve further emphasis
with respect to the accuracy and precision of the measurement system. In
addition to the use of good calibration standards and procedures,
interlaboratory tests, such as the exchange of stable samples between
peer laboratories, or the dissemination of blind samples from some
recognized national or international laboratory is quite valuable in
determining the accuracy of participating agencies. Such testing may
reveal weaknesses in the system which would require special quality
investigations. The use of statistics in planning such studies and in
analyzing the data therefrom is emphasized.
An excellent way to check the internal precision of an agency's
system, is to establish at one (or a few) selected cities a dual or colocated
sampling instrument for each measurement method.* This type of duplicate
check is one form of the independent performance audits described in the
EPA QA Guidelines document for manual integrated methods. The duplicate
*This technique may be cost prohibitive for continuous instruments.
2-24

-------
-8-
sampling instruments should be maintained as independently as possible
from the regular instrument. For example, where possible, independent
calibrations and flow measurements should be made for the colocated
duplicate instrument. Similarly, for integrated manual methods the
pollutant analyses should be performed as independently as possible in
the laboratory. For example, the samples from the colocated instrument
should be analyzed on a different batch (using a different calibration)
from that in which the regular sample is analyzed. In the above-described
manner, the best possible estimate for within-agency precision for the
total measurement process can be made. Excessive differences in results
between the paired instruments will indicate weaknesses in the system which
should be isolated by investigation and corrected by appropriate corrective
action.
As a part of the recordkeeping system, each agency should compile
(or maintain) a "Significant Event History." Documentation of the location,
nature, dates and times of special events affecting pollutant concentrations
should be kept in a systematic chronological file. Such events which might
explain unusual results would be those such as dust storms, large fires,
construction work, etc.
Quality Assurance System Review. On occasion, the Quality Assurance
System of a given monitoring agency may be subject to an on-site system
audit or review by an external organization, for the purpose of evaluating
the capability of the agency to produce data of acceptable quality. Such
an independent review is made of the agency's facilities, equipment, personnel,
organization, procedures, etc. by persons knowledgeable in both quality
assurance technology and the measurement technologies involved. The audit
2-25

-------
-9-
should include a review of the agency's actual operations, procedures
and recordkeeping for all of the elements of quality assurance system
discussed herein. The audit team's evaluation should include specific
identification of areas of weakness and specific recommendations for
improvement.
2-26

-------
MANAGERIAL QUALITY ASSURANCE ELEMENTS FOR ESTABLISHING
A QUALITY ASSURANCE PROGRAM AND RECORDING CHANGES
LESSON GOAL:	To familiarize students with managerial quality
assurance elements involved in establishing a quality
assurance program and recording changes in an air
pollution monitoring system.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	List the quality assurance elements that are involved
in establishing a quality assurance program and
discuss the factors that should be considered in
their implementation.
2.	List the quality assurance elements that are involved
in recording changes in an air pollution monitoring
system.
3.	Explain the purpose of document control and design
a basic document control system.
4.	Explain the purpose of a configuration control system.
5.	Explain the purpose of preventive maintenance and
discuss the factors that should be considered in
designing a preventive maintenance system.
3-1

-------
MANAGERIAL QUALITY
ASSURANCE ELEMENTS
•	Establishing a quality assurance
program
•	Recording changes in the air
quality monitoring system
ESTABLISHING A QUALITY
ASSURANCE PROGRAM
•	policy and objectives
•	organization
•	quality assurance manual
•	training
•	audit procedures
•	corrective action
•	reports to management
QUALITY
ASSURANCE POLICY
AND OBJECTIVES
Each organization should
have a written quality
assurance policy that
should be made known to
all organization personnel
QUALITY ASSURANCE
OBJECTIVES
• Data meeting user requirements
•	completeness • representativeness
•	precision	• comparability
•	accuracy
QUALITY ASSURANCE
OBJECTIVES
•	Data are complete if a
prescribed percentage of total
measurements is present.
•	Precision - spread of data
•	Accuracy - nearness to true
value

-------
QUALITY ASSURANCE
OBJECTIVES
• Data must be representative of the
condition being measured (example:
ambient sampling at midnight is not
representative of CO during rush-hour
traffic)
• Data from several agencies should be
in the same units and corrected to the
same conditions (temperature and
pressure) to allow comparability
among groups
ORGANIZATION
Quality assurance is
normally a separate
function in the
organization
BASIC FUNCTIONS
OF QA ORGANIZATION
QA Policy Formulation
•	agency policy
•	contracts
•	procurement
•	staff training
and development
QA GUIDANCE
AND ASSISTANCE
•	laboratory operations
•	monitoring network operations
•	data reduction
•	special field studies
•	instrument maintenance
and calibration
QA GUIDANCE
AND ASSISTANCE
•	preparation of legal actions
•	source emission testing
•	development of control
regulations
•	preparation of technical
reports

-------
TRAINING
• essential for all personnel in any
function affecting data quality
•	sample collection
•	analysis
•	data reduction
•	quality assurance
•	on-the-job training (OJT)
•	short-term course training
(normally 2 weeks or less)
•	long-term course training
(quarter or semester in length)
AUDIT PROCEDURES
Performance Audits
•	independent checks
•	made by supervisor or auditor
•	evaluate data quality of total
measurement system
•	quantitative appraisal of quality
AUDIT PROCEDURES
System Audits
•	on-site inspection and review
of quality assurance system
•	qualitative appraisal of
quality
THE QA CYCLE
Plan → Implement → Assessment → Corrective Action

-------
QUALITY REPORTS
TO MANAGEMENT
Quality facts usually reported:
•	percentage duplication or replication
of determinations
•	instrument or equipment downtime
•	percentage voided samples versus
total samples
•	quality cost in terms of prevention,
appraisal, and correction costs
QUALITY REPORTS
TO MANAGEMENT
Quality facts usually reported:
•	system audit (on-site inspection) results
•	performance audit results
•	interlaboratory test results and
intralaboratory test results
(precision and accuracy)
•	status of solutions to major quality
assurance problems
GRAPHIC REPORT TO MANAGEMENT
A SYSTEM FOR RECORDING
CHANGES IN THE MONITORING
SYSTEM IS NEEDED
•	for written procedures - document control
•	for design and location of the monitoring
system - configuration control
•	for routine service during and after
operation has begun - preventive
maintenance
DOCUMENT
CONTROL SYSTEM
Purpose:
To provide the latest
written procedures to
all concerned personnel

-------
DOCUMENT
CONTROL SYSTEM
Should include:
• an easy way to make changes
•	removable pages
•	easily identifiable pages
indexed by Section #
Revision #
Date
Page #
Total pages
DOCUMENT
CONTROL SYSTEM
Should include:
• a distribution record
system
DOCUMENT
CONTROL SYSTEM
A new table of contents
should be distributed
with each revision
CONFIGURATION
CONTROL SYSTEM
To record changes in
equipment and the
physical arrangement
of equipment
CONFIGURATION
CONTROL SYSTEM
Purpose:
•	Provide a history of changes
during the life of a monitoring
project
•	Provide design and operational
data on the first monitoring
equipment or system when
multiples are planned

-------
PREVENTIVE MAINTENANCE
An orderly program of positive
actions for preventing failure
of a monitoring system
•	cleaning equipment
•	lubricating
•	reconditioning
•	adjusting
•	testing
PREVENTIVE
MAINTENANCE
Increased Measurement
System Reliability
Increased Data Completeness
DEVELOPMENT OF A
PREVENTIVE
MAINTENANCE PROGRAM
•	review equipment - highlight
items most
likely to fail
•	define spare parts list
•	define frequency for servicing
•	prepare a checklist
[Slide: DAILY CHECK LIST FOR NO2 ANALYZER. A largely illegible form with a
column for each day of the week and rows for items such as date, last
calibration, span knob setting, and initial oxygen pressure.]

-------
REVIEW OF PRE-COURSE PROBLEM 3
LESSON GOAL:	To assure that students can perform the
calculations assigned in pre-course problem 3.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to
calculate:
1.	Arithmetic mean, x̄
2.	Standard deviation, S
3.	Range, R
4.	Geometric mean, xg
5.	Geometric standard deviation, Sg
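These five statistics can be computed as in the following minimal Python sketch (the data values are hypothetical; the geometric statistics require all values to be positive):

import math

data = [19.0, 18.3, 18.0, 17.2, 17.4]            # hypothetical results
n = len(data)
xbar = sum(data) / n                              # arithmetic mean
s = math.sqrt(sum((x - xbar) ** 2 for x in data) / (n - 1))   # std. deviation
r = max(data) - min(data)                         # range
logs = [math.log(x) for x in data]
mlog = sum(logs) / n
xg = math.exp(mlog)                               # geometric mean
sg = math.exp(math.sqrt(sum((v - mlog) ** 2 for v in logs) / (n - 1)))  # geometric std. dev.
print("xbar=%.3f S=%.3f R=%.3f xg=%.3f Sg=%.3f" % (xbar, s, r, xg, sg))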
3A-1

-------
BASIC CONCEPTS OF STATISTICAL CONTROL CHARTS
LESSON GOAL:	To familiarize students with basic concepts in
developing and using control charts.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Describe a control chart based upon a period
of acceptable performance.
2.	Distinguish between assignable (non-random)
and unassignable (random) causes of variation.
3.	Describe steps in developing a control chart
system.
4.	Describe the characteristics of a normal
(Gaussian) frequency distribution.
5.	Describe considerations in using control
charts.
4-1

-------
CONTROL CHART
•	how a process
should behave
•	how a process
is behaving
•	when action should
be taken to make the
process behave as it should

Walter A. Shewhart
Bell
Telephone
Laboratories
The Economic
Control of
Quality of
Manufactured
Product (1931)

"CONSTANT CAUSE" SYSTEM
A system in which we
measure something that
remains constant
Measurements will vary
over time due to random
variations.

[Slide: scatter of measurements plotted over time]

RANDOM VARIATIONS
•	unassignable
•	statistical control

NON-RANDOM VARIATIONS
•	assignable
•	out-of-control
4-3

-------
OBJECTIVES
OF A CONTROL CHART
•	detect assignable causes
•	trigger investigation leading
to corrective action
DEVELOPMENT AND USE
OF A CONTROL CHART
1.	Determine what data to chart
2.	Accumulate data
3.	Prepare histogram
4.	Determine form of frequency distribution
5.	Calculate mean and standard deviation
6.	Establish limits
7.	Construct chart
8.	Plot points
9.	Highlight out-of-control conditions
10.	Take corrective action
11.	Revise control limits
12.	Maintain historical file
Determine
what data to chart
Accumulate data
4-4

-------
Prepare histogram

Determine form of
frequency distribution
[Slide: normal curve marked at -3σ, -2σ, -1σ, μ, +1σ, +2σ, +3σ]

Calculate mean
and standard deviation
[Slide: formulas for x̄ and S; roughly 68.27% of a normal distribution lies
within ±1σ and 95.45% within ±2σ]

Establish limits
	control	warning
USA	±3σ	±2σ
	99.7%	95.4%
British	±3.09σ	±1.96σ
	99.8%	95.0%

Construct chart

-------
Plot points
Highlight out-of-control
conditions
Take corrective action
Revise control limits
Maintain historical file

-------
X-R STATISTICAL CONTROL CHARTS
LESSON GOAL:	To familiarize students with the preparation and
use of X-R statistical control charts.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Explain the Shewhart concept of local control
(i.e., use of rational subgroups) as a basis
for developing control charts.
2.	Distinguish between situations involving rational
subgroups and situations where no rational
subgroups exist.
3.	Distinguish between control charts that are
based upon a period of acceptable performance
and control charts that are based upon rational
subgroups.
4.	Compute control limits for X-R control charts.
5.	Recall three rules for detecting out-of-control
data points.
6.	Describe five types of out-of-control patterns
which can be visually detected using a control
chart.
7.	List three assumptions concerning the detection
and correction of assignable causes of measurement
process variability.
5-1

-------
LOCAL STATISTICAL CONTROL
Shewhart
Control limits based on:
•	short term rational subgroups
•	small or homogeneous variation

Control charts may be based on:
•	rational subgroup
•	period of acceptable
performance

CONSTRUCTING A x̄-R CONTROL CHART
•	identify rational subgroup
•	calculate subgroup arithmetic mean (x̄) and range (R)
•	calculate overall average arithmetic mean (x̿) and
average range (R̄)
•	use factors to establish control limits for two control
charts (x̄ and R)
•	x̄ chart controls day-to-day variability
•	R chart controls within-day variability

x̄ CHART CONTROL LIMITS
UCLx̄ = x̿ + (A2)(R̄)
Where:
x̿ = 29.92
A2 = 1.88 (for subgroups containing two
data values)
R̄ = 4
UCLx̄ = 29.92 + (1.88)(4)
UCLx̄ = 37.44
LCLx̄ = x̿ - (A2)(R̄)
LCLx̄ = 29.92 - (1.88)(4)
LCLx̄ = 22.40

-------
UWLx̄ = x̿ + (2/3)(A2)(R̄)
UWLx̄ = 29.92 + (2/3)(1.88)(4)
UWLx̄ = 34.93
LWLx̄ = x̿ - (2/3)(A2)(R̄)
LWLx̄ = 29.92 - (2/3)(1.88)(4)
LWLx̄ = 24.91

R CHART CONTROL LIMITS
UCLR = (D4)(R̄)
Where:
D4 = 3.27 (for subgroups containing two
data values)
R̄ = 4
UCLR = (3.27)(4)
UCLR = 13.08
LCLR = (D3)(R̄)
Where:
D3 = 0 (for subgroups containing two
data values)
R̄ = 4
LCLR = (0)(4)
LCLR = 0
UWLR = (D5)(R̄)
Where:
D5 = 2.51 (for subgroups containing two
data values)
R̄ = 4
UWLR = (2.51)(4)
UWLR = 10.04
5-4

-------
LWLR = (D6)(R̄)
Where:
D6 = 0 (for subgroups containing two data
values)
R̄ = 4
LWLR = (0)(4)
LWLR = 0
•	construct x̄-R control chart
•	draw control and warning limits
•	plot individual x̄'s and R's
•	use prepared x̄-R control chart for
evaluating future x̄'s and R's
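The limit calculations above can be collected into a small function, as in this minimal Python sketch; the factor values are those given in the slides for subgroups of two:

# Sketch: control and warning limits for x-bar and R charts (subgroups of n = 2).
A2, D3, D4, D5, D6 = 1.88, 0.0, 3.27, 2.51, 0.0

def xbar_r_limits(grand_mean, rbar):
    return {
        "UCLx": grand_mean + A2 * rbar,
        "LCLx": grand_mean - A2 * rbar,
        "UWLx": grand_mean + (2.0 / 3.0) * A2 * rbar,
        "LWLx": grand_mean - (2.0 / 3.0) * A2 * rbar,
        "UCLR": D4 * rbar,
        "LCLR": D3 * rbar,
        "UWLR": D5 * rbar,
        "LWLR": D6 * rbar,
    }

# The worked example above: grand mean 29.92, average range 4.
for name, value in xbar_r_limits(29.92, 4.0).items():
    print("%s = %.2f" % (name, value))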
OUT-OF-CONTROL CRITERIA
•	Points beyond Limits
•	Runs
•	Patterns
Points beyond Limits
•	one point outside
control limits
•	two points outside
warning limits

Runs
•	seven points -
up or down
•	seven points - above
or below central line
-------
Patterns
•	Recurring Cycles
•	Change in Level
•	Lack of Variability
•Trends
•	Most Points near Outside
Limits
Recurring Cycles
Change in Level
Lack of Variability
Trends
5-6

-------
Most Points near Outside Limits
ASSUMPTIONS:
Concerning Assignable Causes
•	possible to identify and correct
•	technically feasible to correct
•	economically practical to correct

-------
I. Homework Assignment
A standard material is checked at periodic intervals during
routine analyses to assure that the analytical measurement
process remains in control. Following are the results in
the chronological order in which they were obtained:
1.	19.0		14.	18.5
2.	18.3		15.	19.1
3.	18.0		16.	21.8
4.	17.2		17.	20.1
5.	17.4		18.	20.6
6.	18.3		19.	18.4
7.	19.6		20.	21.0
8.	20.7		21.	25.1
9.	18.2		22.	21.1
10.	18.8		23.	20.9
11.	20.4		24.	20.8
12.	20.1		25.	23.3
13.	19.6		26.	20.2

1.	Prepare and plot a control chart with appropriate limits,
assuming a single analysis is performed each day.
2.	Prepare and plot x̄ and R control charts with appropriate
limits, assuming two analyses are performed each day,
i.e., results Number 1 and 2 were performed on day 1,
results Number 3 and 4 were performed on day 2, etc.
(Hint: each day is a subgroup.)
3.	Do the charts indicate any out-of-control conditions?
If so, describe them.
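For checking hand calculations on part 1, a minimal Python sketch of an individuals chart based on the overall mean and standard deviation of the 26 results (in practice the limits would be set from a period of acceptable performance; part 2 would use the subgroup factors from the lesson):

import math

results = [19.0, 18.3, 18.0, 17.2, 17.4, 18.3, 19.6, 20.7, 18.2, 18.8,
           20.4, 20.1, 19.6, 18.5, 19.1, 21.8, 20.1, 20.6, 18.4, 21.0,
           25.1, 21.1, 20.9, 20.8, 23.3, 20.2]

n = len(results)
xbar = sum(results) / n
s = math.sqrt(sum((x - xbar) ** 2 for x in results) / (n - 1))
print("center line    = %.2f" % xbar)
print("control limits = %.2f to %.2f" % (xbar - 3 * s, xbar + 3 * s))
print("warning limits = %.2f to %.2f" % (xbar - 2 * s, xbar + 2 * s))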
5-9

-------
X and R CHART
[Blank worksheet (no subgroups): spaces for project name, measurement
performed, measurement units, dates, measurement codes, results, sums,
averages (x̄), and ranges (R), with plotting grids for individual values and
ranges and a column for comments (corrective action, etc.).]
-------
X and R CHART
[Blank worksheet (using subgroups): same layout as the preceding form, with a
plotting grid for subgroup averages (x̄) in place of individual values.]
-------
THE MEASUREMENT PROCESS WITH EMPHASIS ON CALIBRATION
LESSON GOAL:	To familiarize students with quality control
considerations (especially calibration) for the
measurement of air pollutants.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Discuss quality control considerations for the
three components (pollutant separation from the
air matrix, determination of the amount of pollutant
and the volume of air sampled, and calculation of
pollutant concentration) of an air pollutant
measurement.
2.	Define calibration.
3.	List and discuss the six general elements of a
calibration program.
4.	Define traceability.
5.	Identify services available from EPA's Standards
Laboratory.
6-1

-------
AIR POLLUTION MEASUREMENT
•	separate pollutant from air
•	determine pollutant quantity and
air volume
•	calculate pollutant concentration
by dividing pollutant quantity by
air volume
SEPARATION OF POLLUTANT
Manual	Automated
DETERMINATION OF AMOUNT OF
POLLUTANT AND VOLUME
OF AIR SAMPLED
CALIBRATION
The process of establishing
the relationship between the
output of a measurement
process and a known input.

-------
ELEMENTS OF A
CALIBRATION PROGRAM
•	statements of allowable time between
calibrations
•	statements of minimum quality of
calibration standards
•	provisions for standards traceability
•	provisions for written procedures
•	statements of proper environmental
conditions
•	provisions for proper record keeping
NBS-SRMs
EPA'S STANDARDS
LABORATORY
•	certification of client-
owned calibration and
auditing materials
CERTIFICATION
SERVICES AVAILABLE
•	cylinder gases
•	permeation tube rates
•	flow measuring devices
•	calibration/audit devices
•	static calibration/audit
standards
•	special analyses upon
request
WRITE TO:
ENVIRONMENTAL MONITORING
SYSTEMS LABORATORY
Quality Assurance Division
US EPA, MD-77
Research Triangle Park, NC 27711

-------
CALCULATION OF AMBIENT
POLLUTANT CONCENTRATION
EPA Standard Ambient Temperature:
25°C or 298 K
EPA Standard Pressure:
760 mm Hg or 1 atmosphere
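As an illustration of the calculation (the sample volume, pressure, temperature, and particulate mass below are hypothetical, not from the workbook), a sampled air volume can be corrected to the EPA standard conditions shown above before dividing it into the pollutant quantity:

# Minimal sketch: ideal-gas correction of a measured air volume to EPA
# standard conditions (25 C = 298 K, 760 mm Hg), then concentration =
# pollutant mass / standard volume.  All input values are hypothetical.

def std_volume(v_m3, p_mmhg, t_kelvin):
    """Correct a measured volume to 298 K and 760 mm Hg."""
    return v_m3 * (p_mmhg / 760.0) * (298.0 / t_kelvin)

v_std = std_volume(v_m3=2400.0, p_mmhg=745.0, t_kelvin=288.0)  # hypothetical sample
mass_ug = 180_000.0                                            # hypothetical catch
print(f"concentration = {mass_ug / v_std:.1f} ug/std m3")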

-------
United States
Environmental Protection
Agency
Environmental Monitoring and Support
Laboratory
Cincinnati OH 45268
Volume 3
Number 1
February 1980
EPA NEWSLETTER
Quality
Assurance
Environmental
Monitoring
Systems Laboratory
Research Triangle Park, NC 27711
Sample Repository
A repository of quality control materials is maintained for use by governmental,
industrial and commercial laboratories. A wide variety of samples is available
without charge to the user. These materials are intended as independent
measures of working standards or as internal quality control samples. A
certification of analysis is furnished with each sample. Materials currently
available are listed below.
Quality Control Samples for Ambient Air and Stationary Source Analyses
Compressed Gases
NITRIC OXIDE	Multiple levels from 50 to 1500 ppm
SULFUR DIOXIDE	Multiple levels from 50 to 10,000 ppm
CARBON MONOXIDE	3 levels from 5 to 50 ppm
CARBON DIOXIDE	3 levels from 3 to 8 percent
OXYGEN	3 levels from 1 to 8 percent
NITROGEN DIOXIDE	3 levels from 25 to 100 ppm
METHANE	Multiple levels from 1 to 10 ppm
METHANE/PROPANE	2 ppm methane with propane ranging from 0.5 to 6 ppm
Static Samples
LYOPHILIZED MIXTURE OF SODIUM SULFITE-TETRACHLOROMERCURATE	Simulate collected ambient level SO2 samples from 10 to 200
AQUEOUS SODIUM NITRITE
DILUTE SULFURIC ACID SOLUTIONS
AQUEOUS POTASSIUM NITRATE
-------
EMSL-RTP
(Cont'd)
Filter Samples
LEAD FILTER STRIPS
	Lead nitrate deposited on 1/2" x 8" glass-fiber filter strips. Samples
	simulate collected concentrations from 0.4 to 15 µg/m3 of lead. Nine
	levels are available.
ARSENIC FILTER STRIPS
	Arsenious oxide deposited on 1/2" x 8" glass-fiber filter strips. Samples
	simulate collected concentrations from 0.02 to 1.0 µg/m3 of arsenic.
	Nine levels are available.
SULFATE-NITRATE FILTER STRIPS
	Sodium sulfate and potassium nitrate deposited on 1/2" x 8" glass-fiber
	filter strips. Samples simulate collected concentrations from 0.6 to
	40 µg/m3 of sulfate, and from 0.6 to 15 µg/m3 of nitrate. Nine levels
	are available.
SULFATE ON CELLULOSE MEMBRANE FILTERS
	Sodium sulfate deposited on 1/2 of a 4" diameter cellulose membrane.
	Samples simulate collected concentrations from 25 to 320 µg of sulfate.
	Seven levels are available.
SULFATE-NITRATE ON TEFLON MEMBRANE FILTERS
	Sodium sulfate and potassium nitrate deposited on 37 mm diameter Teflon
	membranes. Samples simulate collected concentrations from 50 to 200 µg
	of sulfate and from 50 to 200 µg of nitrate. Three levels are available.
LEAD ON CELLULOSE MEMBRANE FILTERS
	Lead nitrate deposited on 1/2 of a 4" diameter cellulose membrane.
	Samples simulate collected concentrations from 10 to 100 µg of lead.
	Five levels are available.
Flow Measurement Devices
HI-VOL REFERENCE DEVICE
	Consists of a set of resistance plates to simulate various filter loading
	conditions, used to confirm flow calibration for the measurement of
	suspended particulate in the air by the High Volume (Hi-Vol) method.
CRITICAL ORIFICES
	Consists of an orifice assembly to verify the volume meter calibration of
	a Method 5 sampling train.
Organic Materials
	Compressed gas mixtures of the following organic materials. Very limited
	quantities are available for short-term loan only.
BENZENE	Multiple levels from 8 to 300 ppm
ETHYLENE	Multiple levels from 5 to 20,030 ppm
METHANE/ETHANE	Several levels from 1000 to 8000 ppm of methane, and 200 to 700 ppm of ethane
PROPANE	Several levels from 5 to 700 ppm
6-8

-------
EMSL-RTP (Cont'd)
PROPYLENE	Several levels from 5 to 700 ppm
TOLUENE	Several levels from 5 to 700 ppm
METHYL ACETATE	Several levels from 5 to 700 ppm
VINYL CHLORIDE	Several levels from 5 to 40 ppm
HYDROGEN SULFIDE	Several levels from 7 to 650 ppm
m-XYLENE	Several levels from 8 to 600 ppm
CHLOROFORM	Several levels from 5 to 700 ppm
PERCHLOROETHYLENE	Several levels from 5 to 700 ppm
BUTADIENE	One level - 25 ppm
HEXANE	Several levels from 30 to 3000 ppm
METHYL MERCAPTAN	Several levels from 5 to 10 ppm
METHYL ETHYL KETONE	One level - 50 ppm
(Robert Lampe, FTS: 629-2573; COML: 919-541-2573)
Standards Laboratory
The Environmental Protection Agency, Environmental Monitoring Systems
Laboratory, Quality Assurance Division, Research Triangle Park, NC
(EPA/EMSL/QAD/RTP) Standards Laboratory offers calibration, standardiza-
tion and certification of client-owned sample material. There is no charge for this
service. Where applicable, certifications are referenced directly to National
Bureau of Standards (NBS) Standard Reference Materials (SRM). The following
services are offered.
•	verification of compressed gas standards used for calibration, span checks
or audits of air quality analyzers (NO, NO2, SO2, CO, CO2, CH4, hydrocarbons)
•	verification of permeation tube rates (gravimetric or direct comparison with
SRM)
•	verification of flow measuring devices (mass flowmeters, hi-vol orifice
meters)
•	verification of outputs of calibration or audit devices (SO2, ozone, NO, NO2,
CO, CO2, CH4, hydrocarbons)
•	verification of static audit or calibration standards (nitrite solution,
potassium tetrachloromercurate sulfite freeze-dried powders; sulfate and
lead on glass-fiber filter strips)
•	other special analyses are available upon request.
For detailed information or to receive sample material, contact Berne I. Bennett
at the Quality Assurance Division, Standards Laboratory, EMSL, MD-77,
Research Triangle Park, NC 27711.
(Berne Bennett, FTS: 629-2366; COML: 919-541-2366)
6-9

-------
COURSE 470
QUALITY ASSURANCE FOR AIR POLLUTION MEASUREMENT SYSTEMS
Group Problem
The occurrence in ambient air of a highly toxic gaseous pollutant, cyclolehmdone
(CL), has recently been reported. Each group is to develop a monitoring
and quality assurance plan for the project of determining the ambient level of
CL.
The following data are provided:
•	This is a state-wide problem. All efforts are coordinated
through the state central office.
•	There are 3 local offices located throughout the state.
The local offices will be engaged in the field work.
Each local office has a laboratory where CL analyses
will be performed. Assume each local office and the state
office have adequate staff and funding.
•	Just by coincidence, there are 3 plants suspected of CL
emissions located in the state - one plant is located in
each of the jurisdictional areas of the local offices. Each
plant utilizes CL in the manufacture of its products.
•	Both a manual method and a continuous monitoring (instru-
mental) method exist. Each local office and the state office
has one gas chromatograph for analyzing manual samples and one
continuous monitoring instrument available for use in the study.
Gas chromatographs must remain in their labs. Continuous monitoring
instruments must remain in the field. The purchase of additional
continuous monitors is not possible.
•	The length of the sampling program is 2 months.
•	For manual sampling, 24-hour integrated sampling will be done
every day.
•	Sampling sites have been properly selected around each plant using
historical meteorological data available. The siting team has de-
cided that six stations are needed:
[Diagram: six sampling stations sited around PLANT 1, PLANT 2, and PLANT 3]
6A-1

-------
•	Manual sampling equipment and supplies must be procured.
•	An NBS-SRM (permeation tube) exists (located at the state
office); cylinders of "known" concentrations of CL are
available from FBN, Inc. Purchase of additional permeation
tubes is not possible.
Manual Method - Attachment I
Instrumental Method - Attachment II
6A-3

-------
GROUP PROBLEM PLANNING SHEET
NAME	GROUP
1. Write what you consider to be the QA policy for the group problem.
2. List the quality objectives that your group will require to be met
with regard to the group problem.
6A-5

-------
REVIEW OF CONTROL CHART HOMEWORK
LESSON GOAL:	To assure that students can perform the tasks
assigned in the control chart homework exercise.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Prepare a control chart based on individual
data values (no rational subgroups).
2.	Prepare an x̄-R control chart (based on
rational subgroups).
3.	Detect out-of-control conditions indicated by the
prepared charts.
6B-1

-------
REGRESSION ANALYSIS AND CONTROL CHARTS
FOR CALIBRATION DATA
LESSON GOAL:	To familiarize students with regression analysis
techniques (especially the linear least-squares
method) and control chart considerations for
calibration data.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	List three (3) advantages of using the least-
squares method for determining calibration curves.
2.	List four (4) implied assumptions of the linear
least-squares method.
3.	Discuss the mathematical basis for the least-
squares method.
4.	Compute a linear least-squares calibration equation
from calibration data (given the appropriate formulas).
5.	Compute the standard error for a calibration curve
(given the appropriate formulas).
6.	Compute an inverse calibration equation (given the
appropriate formulas).
7.	Select appropriate control chart calibration para-
meters to plot for a specific monitoring situation.
8.	List two (2) non-linear calibration data analysis
techniques.
7-1

-------
REGRESSION
ANALYSIS
AND CONTROL
CHARTS FOR
CALIBRATION
DATA
CALIBRATION
The process of establishing
the relationship between the
output of a measurement
process and a known input.
[Calibration plot: Observed Output, y (dependent variable, e.g., voltage)
versus Known Input, x (independent variable, e.g., calibration gas
concentration)]
METHODS OF DETERMINING
THE INPUT-OUTPUT
RELATIONSHIP
Manual	Computational
MANUAL
METHODS
•	draw line by eye
•	draw line using
ruler

-------
COMPUTATIONAL
METHODS
•	mathematically determine
relationship (least-squares
method)
•	advantages
•	more precise
•	everybody gets same line
•	provides formula for transfer
LEAST-SQUARES
METHOD
Assumptions:
•	linear relationship
•	error in y - no error in x
•	scatter of error is uniform
•	errors normally and
independently distributed
[Plot: the least-squares line is chosen so that the sum of the squared
vertical deviations from the line is a minimum]
EXAMPLE PROBLEM
[Scatter plot of the example data: y (0 to 16) versus x (0 to 5)]
•	Obtain sums and averages
of data

	x	y	x2	y2	xy	x-x̄	y-ȳ
	1	2	1	4	2	-2	-5
	2	7	4	49	14	-1	0
	4	7	16	49	28	1	0
	5	12	25	144	60	2	5
Sum	12	28	46	246	104
Avg.	3	7

-------
•	Obtain sums of squares
and sum of products

	(x-x̄)2	(x-x̄)(y-ȳ)	(y-ȳ)2
	4	10	25
	1	0	0
	1	0	0
	4	10	25
Sum	10	20	50
•	Calculate slope of line: acceptable method
b = Σ(x-x̄)(y-ȳ)/Σ(x-x̄)2 = 20/10 = 2
•	Calculate slope of line: preferred method (regression analysis)
b = [Σxy - (Σx)(Σy)/n]/[Σx2 - (Σx)2/n]
  = (104 - 84)/(46 - 36) = 20/10 = 2
•	Determine y-intercept
Intercept: a = ȳ - bx̄
  = 7 - 2(3)
  = 1
Equation: y = 1 + 2x
STANDARD
ERROR: Se
The standard
deviation of the
residuals
distribution.
7-5

-------
[Plot: the example data with the fitted line y = 1 + 2x]
DETERMINATION OF
STANDARD ERROR

x	Obs. y	Pred. y	d	d2
1	2	3	-1	1
2	7	5	2	4
4	7	9	-2	4
5	12	11	1	1
			Σd2 = 10
Se = [Σd2/(n - 2)]1/2 = (10/2)1/2 = √5 = 2.236
INVERSE CALIBRATION
EQUATION
y=a+bx
• used to relate output value
(y) to input value (x)
•	Using the basic equation:
y = a + bx
•	Obtain the inverse equation
(by solving for x):
x = (y - a)/b
•	Then: x = (1/b)y + (-a/b)
x = b'y + a', where b' = 1/b and a' = -a/b

-------
APPLYING DATA FROM
EXAMPLE PROBLEM
y = a + bx
y = 1 + 2x
x = (y - 1)/2
x = (1/2)y + (-1/2)
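The entire worked example can be reproduced in a few lines. The sketch below (a modern illustration, not part of the original course materials) recomputes the slope, intercept, standard error, and inverse equation from the x-y data:

# Minimal sketch reproducing the worked example: least-squares slope and
# intercept, standard error of the residuals, and the inverse equation.
from math import sqrt

x = [1, 2, 4, 5]
y = [2, 7, 7, 12]
n = len(x)

sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))

b = (sxy - sx * sy / n) / (sxx - sx * sx / n)   # slope = 20/10 = 2
a = sy / n - b * sx / n                         # intercept = 7 - 2(3) = 1

residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
se = sqrt(sum(d * d for d in residuals) / (n - 2))  # sqrt(10/2) = 2.236

print(f"y = {a:.0f} + {b:.0f}x,  Se = {se:.3f}")
print(f"inverse: x = {1/b}y + ({-a/b})")        # x = 0.5y - 0.5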
-------
DATA
µg/ml	Transmittance
0.000	0.863
0.032	0.815
0.081	0.752
0.162	0.650
0.326	0.484
0.663	0.279
0.952	0.165
EQUATION
A = log(1/T)
A = -log T
•	Determine
absorbance values

µg/ml	T	A
0.000	0.863	0.064
0.032	0.815	0.089
0.081	0.752	0.124
0.162	0.650	0.187
0.326	0.484	0.315
0.663	0.279	0.555
0.952	0.165	0.782
LINEAR CALIBRATION CURVE
y = 0.065 + 0.750x
[Plot: absorbance versus concentration, µg/ml, from 0.0 to 1.0]
•	Compute non-linear
least-squares equation
- Quadratic
- Exponential
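A sketch of that computation (using numpy, purely as a modern illustration): absorbance values are derived from the transmittances, and linear or higher-order least-squares curves are fitted with polyfit:

# Minimal sketch (numpy assumed available): compute absorbance
# A = -log10(T) from the transmittance data, then fit linear and
# quadratic least-squares calibration curves.
import numpy as np

conc = np.array([0.000, 0.032, 0.081, 0.162, 0.326, 0.663, 0.952])  # ug/ml
trans = np.array([0.863, 0.815, 0.752, 0.650, 0.484, 0.279, 0.165])
absorb = -np.log10(trans)

lin = np.polyfit(conc, absorb, 1)    # [slope, intercept] ~ [0.750, 0.065]
quad = np.polyfit(conc, absorb, 2)   # quadratic alternative
print("linear:   A = %.3f + %.3fc" % (lin[1], lin[0]))
print("quadratic coefficients:", quad)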
7-8

-------
REVIEW OF PRE-COURSE PROBLEMS 1 AND 2
LESSON GOAL:	To assure that students can perform the tasks assigned
in pre-course problems 1 and 2.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Recognize the usefulness of data plotting in
detecting outliers.
2.	Calculate percent differences of paired data
values.
3.	Recognize the usefulness of percent difference
determinations in detecting outliers.
7A-1

-------
IDENTIFICATION AND TREATMENT OF OUTLIERS
LESSON GOAL:	To familiarize students with the need for identifying
and eliminating outliers of quality control data, and
with two statistical outlier tests (Dixon Ratio and
Grubbs T).
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Define outlier.
2.	Recall four (4) possible reasons for the existence
of an outlier in a data set.
3.	Discuss the need for identifying and eliminating
outliers of quality control data.
4.	Recall that data is initially screened for
suspect values using visual techniques.
5.	Employ the Dixon Ratio and Grubbs T Tests (given
the appropriate formulas and critical values
tables) to identify outliers.
6.	Explain, in general terms, the meaning and derivation
of the significance level critical values of the
Dixon and Grubbs critical values tables.
7.	Discuss advantages and disadvantages of using either
the Dixon Ratio Test or the Grubbs T Test.
8-1

-------
IDENTIFICATION AND
TREATMENT OF OUTLIERS
[Dot plot: data values from 0 to 100 with one point far from the rest]
CAUSES OF OUTLIERS
•	inaccurate reading of output
•	instrument malfunction
•	transcribing error
•	calculation errors
NEED FOR IDENTIFICATION /
ELIMINATION OF OUTLIERS
Identification:
•	indicates need for closer control
Elimination:
•	assures analysis is valid
•	assures conclusions are correct
PROCEDURE FOR
IDENTIFYING OUTLIERS
•	screen data
•	subject suspect data to
statistical tests
USE OF
DATA PLOTS
FOR INITIAL
SCREENING

-------
STATISTICAL
OUTLIER TESTS
•	Dixon Ratio Test
•	Grubbs T Test
DIXON RATIO TEST
PROCEDURE
1.	Arrange data in ascending or descending order
2.	Calculate a ratio
3.	Compare ratio to Dixon table
4.	If appropriate, eliminate suspect value
1. Arrange data values in either
ascending or descending order
•	If smallest data value is suspect
x1 ≤ x2 ≤ x3 ≤ ... ≤ xn
•	If largest data value is suspect
x1 ≥ x2 ≥ x3 ≥ ... ≥ xn
2. Calculate a ratio - equation
depends upon sample size
(x1 is the suspect value)
•	3 to 7	r10 = (x2 - x1)/(xn - x1)
•	8 to 10	r11 = (x2 - x1)/(x(n-1) - x1)
•	11 to 13	r21 = (x3 - x1)/(x(n-1) - x1)
•	14 to 25	r22 = (x3 - x1)/(x(n-2) - x1)
3. Compare ratio value to Dixon
table of critical ratio values
4. Eliminate suspect value if ratio
is greater than critical value
.465 > .400
calculated	critical
ratio value	value
EXAMPLE PROBLEM #1
Using the Dixon Ratio Test
determine if the data value,
25.1, is an outlier at the 5%
significance level
DATA VALUES
19.0	19.1	17.2	20.6
18.0	20.1	18.3	21.0
17.4	18.4	20.7	21.1
19.6	25.1	18.8	20.8
18.2	20.9	20.1	20.2
20.4	23.3	18.5
19.6	18.3	21.8

DATA VALUES: ARRANGED
(25.1)	20.7	19.6	18.3
(23.3)	20.6	19.1	18.2
(21.8)	20.4	19.0	18.0
21.1	20.2	18.8	17.4
21.0	20.1	18.5	17.2
20.9	20.1	18.4
20.8	19.6	18.3
(suspect high values in parentheses)

SOLUTION:
r22 = (25.1 - 21.8)/(25.1 - 18.0)
r22 = 3.3/7.1
r22 = .465
Since .465 > .400
Then 25.1 is an outlier
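The r22 calculation is easy to script. The sketch below (illustrative only, not part of the original workbook) reproduces example problem #1; the 5% critical value .400 is the one quoted on the slide above:

# Minimal sketch of the Dixon r22 statistic, with the suspect (largest)
# value placed first by sorting in descending order.
data = sorted([19.0, 18.3, 18.0, 17.2, 17.4, 18.3, 19.6, 20.7, 18.2, 18.8,
               20.4, 20.1, 19.6, 18.5, 19.1, 21.8, 20.1, 20.6, 18.4, 21.0,
               25.1, 21.1, 20.9, 20.8, 23.3, 20.2], reverse=True)

def r22(x):
    """Dixon r22 ratio with the suspect value at x[0]."""
    return (x[0] - x[2]) / (x[0] - x[-3])

ratio = r22(data)   # (25.1 - 21.8)/(25.1 - 18.0) = 0.465
print(f"r22 = {ratio:.3f}; outlier at the 5% level: {ratio > 0.400}")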

-------
EXAMPLE PROBLEM #2
After eliminating 25.1 from the
data set, use the Dixon Ratio
Test to determine if 23.3 is an
outlier at the 5% significance
level.
DATA VALUES: ARRANGED
(23.3)	20.7	19.6	18.3
	20.6	19.1	18.2
(21.8)	20.4	19.0	18.0
(21.1)	20.2	18.8	17.4
21.0	20.1	18.5	17.2
20.9	20.1	18.4
20.8	19.6	18.3
(values used in the ratio in parentheses)

SOLUTION:
r22 = (23.3 - 21.1)/(23.3 - 18.0)
r22 = 2.2/5.3
r22 = .415
Since .415 > .406
Then 23.3 is an outlier
GRUBBS T TEST
PROCEDURE
1.	Calculate arithmetic mean
2.	Calculate standard deviation
3.	Calculate a ratio
4.	Compare ratio to Grubbs table
5.	If appropriate, eliminate suspect value
1. Calculate arithmetic mean (x̄)
of data set values
2. Calculate standard deviation (s)
of data set values
s = [(Σxi2 - (Σxi)2/n)/(n - 1)]1/2
3. Calculate a ratio
•	If smallest data value is suspect
T1 = (x̄ - x1)/s
•	If largest data value is suspect
Tn = (xn - x̄)/s
4. Compare ratio to Grubbs table
5. Eliminate suspect value if ratio
is greater than critical value
2.92 > 2.84
calculated	critical
ratio value	value
EXAMPLE PROBLEM #3
Using the Grubbs T Test, determine if
the data value, 25.1, is an outlier at the
5% significance level for the data set
used in the Dixon Ratio Test procedure
(example problem # 1).

-------
DATA VALUES
19.0	19.1	17.2	20.6
18.0	20.1	18.3	21.0
17.4	18.4	20.7	21.1
19.6	25.1	18.8	20.8
18.2	20.9	20.1	20.2
20.4	23.3	18.5
19.6	18.3	21.8

SOLUTION:
Determine Σxi, Σxi2, and n
Σxi = 516.5
Σxi2 = 10,340.87
n = 26
To find x̄:
x̄ = Σxi/n
x̄ = 516.5/26
x̄ = 19.87
To find s:
s = [(Σxi2 - (Σxi)2/n)/(n - 1)]1/2
s = [(10,340.87 - (516.5)2/26)/(26 - 1)]1/2
s = 1.79
Because largest data value is suspect,
calculate Tn:
Tn = (25.1 - 19.87)/1.79
Tn = 2.92
Since 2.92 > 2.841
Then 25.1 is an outlier
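A minimal sketch (illustrative only) reproducing the Grubbs T arithmetic of example problem #3:

# Minimal sketch of the Grubbs T calculation, using the same formula for
# s as the worked solution above.
from math import sqrt

data = [19.0, 18.3, 18.0, 17.2, 17.4, 18.3, 19.6, 20.7, 18.2, 18.8,
        20.4, 20.1, 19.6, 18.5, 19.1, 21.8, 20.1, 20.6, 18.4, 21.0,
        25.1, 21.1, 20.9, 20.8, 23.3, 20.2]
n = len(data)                                   # 26
xbar = sum(data) / n                            # 516.5 / 26 = 19.87
s = sqrt((sum(v * v for v in data) - sum(data) ** 2 / n) / (n - 1))  # 1.79

t_n = (max(data) - xbar) / s                    # (25.1 - 19.87)/1.79 = 2.92
print(f"xbar={xbar:.2f}  s={s:.2f}  Tn={t_n:.2f}  outlier: {t_n > 2.841}")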

-------
DIXON RATIO TEST
Advantage
•	simple calculations
Disadvantages
•	not all data set values used
•	limited to data sets with ~25 values
or less
GRUBBS T TEST
Advantages
•	more powerful than Dixon Ratio Test
•	can be used for large data sets
Disadvantage
•	involved calculations
Both tests assume an underlying normal
(Gaussian) distribution of data
[Normal curve: -3σ, -2σ, -1σ, µ, +1σ, +2σ, +3σ]

-------
INTRALABORATORY TESTING
LESSON GOAL:	To familiarize students with intralaboratory
testing considerations.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Distinguish between intralaboratory and inter-
laboratory testing.
2.	Discuss the purposes of intralaboratory testing.
3.	Distinguish among the three levels of precision
measurement: replicability, repeatability, and
reproducibility.
4.	Discuss considerations necessary for designing
an intralaboratory testing program.
9-1

-------
TESTING
Intralaboratory Interlaboratory
PURPOSES OF
INTRALABORATORY
TESTING
Identify:
•	sources of measurement error
Estimate:
•	bias (accuracy)
•	variability (replicability,
repeatability)
THREE LEVELS OF
PRECISION
MEASUREMENT
•	Replicability
•	Repeatability
•	Reproducibility
[Diagram: nested circles - replicability within repeatability within
reproducibility]
9-3

-------
INTRALABORATORY
TESTING DESIGN
CONSIDERATIONS
•	types of measurement
methods
•	potential sources of error
•	testing philosophy
MEASUREMENT
METHODS
Manual:
•	collection
•	analysis
Continuous:
•	collection/analysis
POTENTIAL SOURCES OF ERROR
[Diagram: sources of error in the measurement process, including the operator
and calibration]
MEASUREMENT OF
OPERATOR PROFICIENCY
Major Problems
•	what kinds of audit samples to use
•	how to introduce samples into
analytical process without analyst's
knowledge
•	how frequently to audit
KINDS OF
AUDIT SAMPLES
•	duplicate of real samples
•	prepared reference
samples

-------
AUDIT SAMPLE
INTRODUCTION
•	samples should have identical
sample labels and appearance
as real samples
•	supervisor and analyst should
overlap the process of logging
in samples
AUDITING
FREQUENCY
Decision based on:
•	degree of automation
•	total method precision
•	analyst's training, attitude,
and past performance

-------
INTERLABORATORY TESTING
LESSON GOAL:	To familiarize students with interlaboratory performance
testing considerations and EPA's interlaboratory performance
audit program.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Describe and distinguish between the two types
of interlaboratory tests - collaborative tests
and periodic performance tests.
2.	Describe considerations in designing an inter-
laboratory performance test.
3.	Describe EPA's interlaboratory performance
audit program.
4.	List the six common types of performance audits
conducted by EPA.
5.	Identify the audit materials that are available
from EPA.
6.	List sources of information concerning EPA's inter-
laboratory performance audit program.
7.	Discuss data analysis performed on the results
of EPA's interlaboratory performance audits.
8.	Discuss results of EPA's interlaboratory per-
formance audits.
10-1

-------
INTERLABORATORY AUDITS
[Diagram: identical samples distributed to several laboratories (e.g., Lab D,
Lab E)]
INTERLABORATORY
TEST
•	identifies biased labs (and/or
analysts)
•	estimates "between laboratory"
measurement method
reproducibility
CONSIDERATIONS
IN PLANNING THE
INTERLABORATORY
TEST
•	Selection of the parameter
to be measured
•	automated method - total
•	manual method - portion
•	Selection of the proper
sample
•	collection
•	analysis

-------
SAMPLE SIZE
•	Sample preparation -
insure uniformity, stability
["Pigeon Sampling" cartoon]
•	Sample preparation - evaluate
sample-to-sample variability
• Test instructions
•	clear and complete
•	only one interpretation
•	specify handling - routine
or special?
•	specify reporting form and
units
SELECTION OF METHODOLOGY
•	inter-method lab variability —
lab selects method
•	same method lab variability—
specify method
Always require written copy of method used!

-------
•	Report results to the labs
•	timely
•	confidential
•	recommend corrective
action if needed
[Letter: "Dear Lab A: enclosed is another sample for you to try.
Please follow instructions."]
•	Follow-up
RECAP
•	select the parameter to be measured
•	select the sample
•	prepare the sample
•	prepare the instructions
•	provide feedback of results
•	specify corrective action
•	follow-up
EPA INTERLABORATORY
PERFORMANCE AUDIT PROGRAM
EPA
ENVIRONMENTAL
MONITORING SYSTEMS
LABORATORY
•	sample repository
•	free samples for
quality control

-------
QC SAMPLES FOR AMBIENT AIR AND
STATIONARY SOURCE ANALYSES
•	cylinder gases
•	static samples
•	filter samples
•	critical orifices
•	organic gas mixtures
WRITE TO:
ENVIRONMENTAL MONITORING
SYSTEMS LABORATORY
Quality Assurance Division
US EPA, MD-77
Research Triangle Park, NC 27711
[Plot: DATA DISTRIBUTION - reported value, ppm, versus true value, ppm]
[Plot: DISTRIBUTION WITH OUTLIERS ELIMINATED - reported value versus true
value, ppm]
[Plot: AVERAGE VALUES - reported value versus true value, ppm]

-------
STANDARD DEVIATION
S = [(Σxi2 - (Σxi)2/n)/(n - 1)]1/2
COEFFICIENT OF
VARIATION
Cv = (S/true value) x 100
[Plot: Cv VERSUS TRUE VALUE - coefficient of variation (%) versus NO2,
µg/m3, for the 1977 and 1978 surveys]
[Plot: CO SURVEY RESULTS - coefficient of variation versus CO, ppm, for
surveys from 1973 through 1978]
Why are audit test results
optimistic?

-------
PROCUREMENT QUALITY CONTROL
LESSON GOAL:	To familiarize students with quality control procedures
for the procurement of supplies and equipment.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Recall the four (4) major groups of procured
items of concern in procurement quality control.
2.	List at least two (2) procured items from each
major group that affect air monitoring data quality.
3.	Describe a quality control procedure for the
procurement of an ambient air quality analyzer.
4.	Describe quality control considerations in the
procurement of calibration standards, chemicals,
and materials.
11-1

-------
PROCUREMENT
QUALITY
CONTROL
•	EQUIPMENT
•	CALIBRATION STANDARDS
•	CHEMICALS
•	MATERIALS
11-3

-------
PROCEDURE FOR PROCURING AN
AMBIENT AIR QUALITY ANALYZER
1.	Prepurchase Evaluation/Selection
2.	Writing of Purchase Contract
Specifications
3.	Acceptance Testing
4.	Overlap Testing
5.	Record Keeping
1. Prepurchase Evaluation/
Selection
•	analysis of analyzer performance
specifications
•	assessment of analyzer
Analysis of Analyzer Performance
Specifications
Assessment of Analyzer
•	review operations manuals
•	contact users for opinions
11-4

-------
[Diagram: In-House Testing and Field Testing lead to Selection of Analyzer]
2. Writing of Purchase Contract
Specifications
•	inclusion of performance specs test data
•	payment contingent upon successful
acceptance testing
•	inclusion of warranty
•	inclusion of consistent operating manuals
•	provision of operator training
•	provision for burn-in
•	inclusion of consumables and spare parts
3. Acceptance Testing
11-5

-------
4. Overlap Testing
5. Record Keeping
PROCUREMENT
CONSIDERATIONS FOR
CALIBRATION STANDARDS
•	Purchase Contracts
•	Overlap Testing
Purchase Contracts
Requirements:
•	NBS traceability
•	certificate of analysis
•	calibration curves
•	user instructions
Overlap Testing

-------
PROCUREMENT
CONSIDERATIONS FOR
CHEMICALS
•	Certified Analyses
•	Overlap Testing
•	Record Keeping
PROCUREMENT
CONSIDERATIONS FOR
MATERIALS
•	Performance Parameter Specs
•	Acceptance Testing
•	Overlap Testing
11-7

-------
1979-ASQC TECHNICAL CONFERENCE TRANSACTIONS - HOUSTON	35
QUALITY ASSURANCE FOR PROCUREMENT
OF AIR ANALYZERS
Mary Jo Kopecky and Bruce Rodger
Wisconsin Department of Natural Resources
Madison, Wisconsin
ABSTRACT
Ambient air monitoring in the vicinity of a point source requires different
characteristics in an analyzer than monitoring for background data in an area
where there are no point sources. Different degrees of sensitivity, different
response times, and the degree of automation required, will differ in each
setting.
Before purchasing an analyzer the user must, therefore, define his needs in
terms of sensitivity, accuracy, data completeness, response to changes in ambient
concentrations, reliability and maintainability, degree of automation, ease of
operation and cost. The Wisconsin Department of Natural Resources has esta-
blished a program of procurement quality assurance to both define the user's
needs and to evaluate the ability of different analyzers to meet these needs.
This program is divided into four stages: 1) User Needs Analysis, 2) Pre-
Purchase Evaluation, 3) Purchase Specifications and Contract Conditions, and 4)
Acceptance Testing.
This four stage process was applied in the recent purchase of twelve sulfur
dioxide analyzers for the Department's Monitoring Program. Surprisingly, the
instrument that looked the best at the beginning of the pre-purchase evaluation,
and toward which the user group was leaning, was not the analyzer that scored
highest in the final evaluation. As a result of the Department's evaluation
process, a different analyzer was purchased. By defining the user needs in
quantifiable form, and then objectively measuring the ability of different
analyzers to meet these needs, the Department of Natural Resources has assured
itself of purchasing the best available analyzer that can do the job required.
INTRODUCTION
Environmental Protection Agency regulations state that no later than
February 1980, all ambient air analyzers used in state monitoring programs as
specified in their state implementation plan must be approved reference or
equivalent analyzers. For most states this will mean replacing "obsolete"
analyzers with newer models. The money spent on this new equipment in the next
two years could easily reach ten million dollars. Unless state agencies and
private air monitoring groups take precautions, newly purchased analyzers may
not meet their needs, or if they do, it may be at an excessive cost. To avoid
such problems, a Quality Assurance Plan for procurement of analyzers and other
capital purchases is desirable.
The Wisconsin Department of Natural Resources (DNR) has developed such a
plan for its instrument procurement and has recently used the plan in the purchase
of sulfur dioxide analyzers for its statewide monitoring network. This paper
describes the general features of the DNR procurement plan, and how the plan was
applied in selecting a specific model of sulfur dioxide analyzer for Wisconsin.
This plan provided DNR with an objective means of selecting an analyzer which
best meets the needs and resources of the agency. It has general applicability
to all agencies and to private consultants and corporations as well.
The plan consists of three parts:
I.	Pre-purchase evaluation and selection of the analyzer.
II.	Purchase Contract Specifications based on the pre-purchase evaluation.
III.	Acceptance Testing of the purchased analyzers.
"Copyright 1979 American Society for Quality Control, Inc.
Reprinted by permission."
© 1979 American Society for Quality Control
11-9

-------
PRE-PURCHASE EVALUATION
The pre-purchase evaluation defines the specifications that the analyzer
must meet and then determines which analyzer best meets these specifications.
1.	Analysis and Rating of Performance Needs
Before evaluating individual analyzers, the performance required of the
analyzer must be defined. Where will the analyzer be used - around a point
source where concentrations of sulfur dioxide exceeding 500 parts per billion
are not uncommon, or in a rural setting where values as high as 50 parts per
billion are quite rare? What levels of accuracy and precision are needed? What
should the response time of the analyzer be? Do the expected ambient concen-
trations change rapidly or over a period of hours? What maintenance requirements
does the agency have - will operators attend the site daily, or only once per
week? How much funding is available for this purchase?
Once the performance specifications are defined, they are ranked in order
of their importance to the monitoring network. The most important specification
receives the highest number and the least important specification receives a
ranking of "1".
2.	Instrument Assessment
An evaluation of each specific type of instrument must be made to decide
which analyzers should be brought to the lab for further checkout. This as-
sessment is a two step process.
a.	The advantages and disadvantages of each type of instrument are
determined by evaluating information provided by the manufacturer, as well as
that found in the analyzer's operating manual. This involves a comparison of
measurement principles, performance characteristics and the relative complexity
of operation.
b.	Several users of each analyzer are contacted to check on the analyzer's
performance in the field. A user contact questionnaire developed by DNR
includes such information as the percent of valid data capture, the
average number of instrument breakdowns since the analyzers were purchased, the
parts replaced most frequently, and the percent span drift experienced.
The analyzer's ability to meet each of the performance specifications is
converted to a numerical rating, with the highest number assigned to the analyzer
which best meets the specification. The rating is multiplied by the ranking
assigned that specification in the earlier needs analysis. This process is
repeated for each specification, and the results for all specifications are
added. The result is a ranking of instruments according to their apparent
ability to meet the performance specifications. The three top rated analyzers
are then evaluated further.
3.	In-House Testing
The three analyzers with the highest scores in the Instrument Assessment
are subjected to a laboratory checkout to determine which analyzer should be
purchased. The in-house testing consists of evaluating the critical performance
parameters identified in the earlier needs analysis. For example, if low
ambient levels are routinely measured, instrument noise will be an important
parameter. Each instrument is then checked for its noise level using the
methods described by EPA in their regulations for equivalency testing. If low
maintenance costs are required, the instrument is evaluated as to the type of
parts used and the expected frequency of replacement, in an effort to estimate
the costs.
Each analyzer is rated using the same rating method used in the earlier
instrument assessment. The in-house testing scores are combined with the scores
from the instrument assessment to give a grand total for each analyzer. The
instrument with the highest score will be the one which best meets the monitoring
need.
11-10

-------
PURCHASE CONTRACT SPECIFICATIONS
The performance specifications for the instrument with the highest ranking
are written into the contract for purchase. The purchase contract specifies a
60-day period, after instrument delivery, in which DNR can evaluate each instru-
ment to assure that each one meets the performance specifications written into
the contract. Instruments not meeting the specifications can be returned to the
manufacturer for replacement, without charge to DNR.
The contract also requires the vendor to post a performance bond - 20% of
the total purchase price - for a one year period. The bond would be forfeited
for:
a.	failure of any instrument to meet the performance specifications for
at least one year,
b.	failure of the vendor to honor a one year warranty on all instrument
components,
c.	failure of the vendor to provide a substitute analyzer to replace a
faulty analyzer being repaired under the one year warranty, and
d.	failure of any instrument to operate properly for more than 30 days
during the first year of operation.
These contract specifications help insure that DNR will have reliable,
functioning analyzers providing data capture.
ACCEPTANCE TESTING
Before a new instrument is considered capable of generating valid ambient
air quality data, it must be checked to insure that it meets the performance
specifications in the purchase contract. As each instrument is received it is:
1.	Inspected to be sure that all parts and optional equipment are present,
connections are tight, and that each analyzer is configured the same way - same
number of circuit boards, same type and size of pumps, etc.
2.	Operated in the laboratory for at least one week to detect immediate
malfunctions due to defective parts, poor connections, etc.
3.	Tested for critical parameters - e.g., the noise level.
In addition, a random sampling of analyzers is chosen and more in-depth
performance checks are conducted. If these checks fail to meet the performance
specifications in the purchase contract, all analyzers will be checked in-depth.
Instruments passing through this process without problems are placed at
monitoring sites and run simultaneously with the "old" analyzers for at least 30
days. The data obtained is used to determine if the new analyzer is functioning
properly, and also to establish any difference in the data base due to the
switch to the new analyzer. It is important to have this information when
evaluating data from a site over a period of years.
PRACTICAL APPLICATION OF PROCUREMENT PLAN
The procedures previously discussed were used during the summer of 1978 by
the State of Wisconsin to purchase 12 new sulfur dioxide analyzers. The first
step in this process was to perform a needs analysis. This analysis indicated
we were required to generate valid continuous ambient sulfur dioxide data at
seven permanent stations in the Milwaukee area and at three mobile vans which
collect data statewide. Also, there was a requirement to obtain continuous SO2
data from sites in Green Bay and Madison. As mentioned earlier, by February
1980, all ambient air analyzers in state monitoring programs must be approved
reference or equivalent model analyzers. Therefore, it was determined that the
state needed to purchase 12 sulfur dioxide monitors approved by EPA as being a
reference or equivalent method. In addition to this basic need, the following
items were also specified in the analysis:
11-11

-------
1)	Generation of continuous SO2 data.
2)	Operate unattended for long periods of time (over weekends, etc.).
3)	Generate valid SO2 data in areas of both high and low ambient con-
centrations (minimum detectable limit to 1.000 ppm).
4)	Capability of automated zeroing and spanning.
5)	Efficient, cost-effective operation (low maintenance).
At the time of our study EPA-designated equivalent continuous SO2 analyzers
were available for purchase from five manufacturers. Following the needs
analysis our next step in the procurement process involved contacting these
companies for operating manuals of these analyzers, written results of their
equivalency testing, plus a list of firms or governmental agencies that owned or
operated their analyzers. We will refer to the five analyzers as A, B, C, D,
and E. It should be noted here that most members of our monitoring staff leaned
toward purchase of analyzer B at the beginning of the study. Analyzer B was
favored due to the fact that it used the same method of detection presently in
use by the Department - it was familiar to monitoring personnel. A number of
major improvements in this method incorporated into analyzer B also made it much
more attractive than the existing monitors using the same basic method of
detection.
Within two weeks of notification by telephone, all companies had furnished
us with operational manuals for their analyzers. Only company A provided us
with a written report on their equivalency testing. The other companies indicated
the data were available; however, it was in the form of very extensive technical
documentation which they would provide us with if we absolutely needed the data.
All of the manuals were examined and judged on the following criteria:
1)	Readability and ease of understanding.
2)	Sufficient information available to allow a chemist to troubleshoot
the analyzer at the site.
3)	Sufficient information available to allow an electronic technician to
work on the analyzer (circuit diagrams, etc.).
4)	Understandable start-up, operation, calibration, and maintenance
instruction.
5)	Listing of spare parts inventory.
In addition to the above information, operating specifications for each of
the analyzers were taken from the manuals. This information included the
following:
1)	Standard ranges
2)	Noise
3)	Lower detectable limit
4)	Rise, fall, lag time
5)	Precision
6)	Sample flow
7)	Length of unattended operation
8)	Hydrogen flow rate (if using H2)
9)	Ambient operating temperature
All the above information was organized into tables to allow easy comparison
of criteria between analyzers. These are shown in Tables I and II attached to
this report.
The user's list in all cases did not come as quickly as the manuals.
Company E was so late in sending their user's list that we did not have sufficient
time to contact users of their analyzers. A minimum of four users of each
analyzer was contacted and questioned concerning each of the following:
1)	Mechanical dependability
2)	Electrical dependability
3)	Chemical dependability
4)	Ease of working with instrument
5/6)	User experience with vendor
7)	Cost of operation
8)	Instrument downtime
9)	Interference problems
10)	Number of instruments in use and number of years in use
11-12

-------
The above information for all of the users questioned for each analyzer was
put in table form. Tables III-VI at the end of this report contain that data.
Each manufacturer was then contacted again and asked about the following:
1)	Location of factory repair service and response time
2)	Warranty terms
3)	Auto zero/span availability
4)	Standard instrument ranges
5)	Unit cost of instrument with auto zero/span and amount of discount
This information was also placed in table form (Table VII) for all the
analyzers to allow for ease of comparison between analyzers. Also considered in
the pretesting segment of the procurement process were the following:
1)	Vendor cooperation for pre-purchase agreement concerning in-house
testing - This involved contacting each vendor to determine if they
would allow us to use an analyzer of theirs, without cost, for a
period of two to three weeks for the purpose of performance testing.
2)	Required support equipment, e.g., electronic equipment, gas cylinders,
high mortality parts, etc.
3)	Conformity to existing calibration devices and site sampling manifolds.
4)	Conformity to existing data acquisition systems and ability to be rack
mounted.
The above information was also placed in a table (Table VIII) to allow for
comparison between the analyzers. Finally a table (Table IX) of major advantages
and disadvantages for each of the analyzers was drawn up for consideration in
determining which three analyzers should be chosen for in-house testing.
To determine which three analyzers would be tested we used a total point-
rating system. Each of the criteria considered in the pretesting data search
was rated from 1-6 depending on its degree of importance. In our particular
situation noise and precision were considered very important and were given a
rating of 6. Sample flow, not considered as important, was given a rating of 2.
Each analyzer was ranked from 1-5 depending upon how favorably it compared to
other analyzers being checked for a particular criterion. A ranking of 5 meant
that the analyzer was best among the analyzers considered for that particular
criterion. To determine the number of points each analyzer received for each
criterion, the rating and ranking numbers were multiplied together. These
products were then summed for each analyzer. The analyzers with the highest
total points would be the ones chosen for in-house testing. The pretesting work
indicated that analyzers A, B, and C should be chosen for further testing. At
this point in the procurement process analyzer B was still the favored analyzer.
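The rating-times-ranking arithmetic can be illustrated with a short sketch. All numbers below are hypothetical, not the values from the actual 1978 evaluation, which appear only in the tables cited above:

# Minimal sketch of the DNR rating-times-ranking score (hypothetical
# ratings and rankings; 6 = most important criterion, 5 = best analyzer).
criteria_rating = {"noise": 6, "precision": 6, "sample flow": 2}

rankings = {
    "A": {"noise": 5, "precision": 4, "sample flow": 3},
    "B": {"noise": 4, "precision": 5, "sample flow": 5},
    "C": {"noise": 3, "precision": 3, "sample flow": 4},
}

scores = {name: sum(criteria_rating[c] * r for c, r in ranks.items())
          for name, ranks in rankings.items()}
print(scores)   # highest total points -> chosen for in-house testing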
In-house testing performed on the analyzers generated test data concerning
the following parameters:
1)	Noise (80% of full scale)
2)	Zero drift (24-hour, at 20% of full scale)
3)	Span drift (24-hour, at 80% of full scale)
4)	Precision (80% of full scale; 20% of full scale)
5)	Lag, fall, rise, and calibration times
11-13

-------
The testing procedures followed were taken from the Federal Register, Vol.
40, No. 33, Part II, Ambient Air Monitoring Reference and Equivalent Methods.
Company C was slow in providing us with an analyzer for testing. We were not
able to complete all the testing procedures on that analyzer. Results of the
testing were summarized in a table (Table X). Prior to the in-house testing we
had feared that response time for analyzer A would be too slow for our needs.
Analyzer B was expected to have the most rapid response time. The surprising
test results indicated that analyzer A had a more rapid response time than
analyzers B and C.
Next analyzers A and B were moved to an active monitoring site where they
were installed and operated for a two week period as if they were being used to
routinely collect ambient SO2 data. This included routine calibrations and
zero/span checks. Testing was also done at the monitoring site to determine if
analyzer response was adversely affected by any interference. The analysis
method for analyzer B was flame photometry. A Technical Assistance Document
(EPA-600/4-78-024) concerning the use of flame photometric detectors for mea-
surement of SO2 in ambient air referred to a suppression of analyzer response
for this method by carbon dioxide (CO2) gas. We discovered at this point in the
testing that analyzer B was subject to the above interference from CO2. We also
found that analyzer B was less stable than analyzer A during calibration and
zero/span checking.
SELECTION OF ANALYZER AND CHOICE
At the end of the testing we had obtained sufficient information to allow
a decision to be made on instrument procurement. Copies of all the data generated
during the procurement process were distributed to all DNR parties affected by
the instrument purchase. A meeting between these parties was held to decide on
which analyzer to purchase. All the data was reviewed and the advantages and
disadvantages of each of the analyzers were discussed. As mentioned earlier
analyzer B was heavily favored before the procurement process began. However,
as a result of the data collected and testing done, analyzer A (T.E.C.O. Model
043) emerged as the analyzer which would best satisfy our needs expressed
earlier in the needs analysis. Had we not involved ourselves in this procurement
process, it is possible we would have purchased analyzer B, and its associated
problems, without giving full consideration to the T.E.C.O. We intend to use
this procurement process for purchasing all capital equipment in the future and
strongly recommend other agencies use this or a similar process for all their
equipment purchases.
11-14

-------
PERFORMANCE AUDITS
LESSON GOAL:	To familiarize students with performance auditing
considerations (especially when conducting performance
audits on continuous ambient air quality analyzers).
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Distinguish between a performance audit and a
systems audit.
2.	Describe differences in performance audit procedures
for continuous vs. manual measurement methods.
3.	List the four purposes of performance audits.
4.	Describe considerations in conducting performance
audits on continuous ambient air quality analyzers.
12-1

-------
AUDITS
Performance	Systems
Quantitative	Qualitative
PURPOSES OF PERFORMANCE AUDITS
•	identify sensors operating out-of-control
•	identify systematic bias of monitoring
network
•	measure improvement in data quality
•	assess accuracy of monitoring data
PERFORMANCE AUDITS
Continuous
•	sampling/analysis/data reduction
Manual
•	sampling
•	analysis
•	data reduction
PROCEDURE FOR MANUAL METHODS
•	Sampling - check flow rate
with rotameter
•	Analysis - analyze reference
samples
•	Data Reduction - perform
independent calculations
•	Plot audit results on
control chart
QA
Handbook
Vol II
Sec 2.18
pg 10 of 10
pg 4 of 10

-------
PROCEDURE FOR CONTINUOUS
AMBIENT AIR ANALYZERS
1.	Select audit materials
2.	Select audit concentration levels
3.	Determine auditor's proficiency
4.	Select out-of-control limits
5.	Establish communications system
6.	Conduct audit
7.	Verify stability of audit materials
8.	Prepare audit report
9.	Follow up audit recommendations
1. Select audit materials
-------
• use materials
traceable to NBS
Standard Reference
Materials
2. Select audit concentration levels
3. Determine auditor's proficiency

Cylinder No.	Known Concentration Value	Auditor's Measured Concentration Value
1
2
3
4. Select out-of-control limits
% Diff. = [(analyzer value - known value)/known value] x 100
5. Establish communications system

-------
6. Conduct
audit
7. Verify stability of audit materials
Audit
Gas	SRM
8. Prepare
audit report
9. Follow up audit
recommendations
12-6

-------
SYSTEMS AUDITS
LESSON GOAL:	To familiarize students with systems auditing
procedures.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	State the purpose of systems auditing.
2.	Recognize items that should be evaluated during
a systems audit.
3.	Describe the procedure for conducting a systems
audit.
13-1

-------
SYSTEMS AUDIT
• independent, on-site
inspection and review
of quality assurance
system
•	qualitative appraisal
of system
PROCEDURE FOR
CONDUCTING A SYSTEMS AUDIT
•	Prepare questionnaire
•	Review questionnaire
•	Identify weaknesses/prepare checklist
•	Arrange entrance interview
•	Perform audit
•	Conduct exit interview
•	Prepare report
•	Follow up recommendations
•	Prepare
pre-audit
survey
questionnaire
[Questionnaire checklist: Organizational Chart; SOPs; Personnel/Training;
Facilities; Equipment/Supplies; Monitoring; Data Handling; Quality Assurance]
•	Review completed
questionnaire
•	Identify
agency
weaknesses
and prepare
audit checklist

-------
•	Arrange entrance interview
•	Perform
audit
•	Conduct exit interview
•	Report
findings
to audited
agency
•	Follow up audit
recommendations

-------
QUALITY ASSURANCE REQUIREMENTS FOR SLAMS AND PSD
LESSON GOAL:	To familiarize students with quality assurance regulations
pertaining to ambient air quality monitoring (especially
data quality assessment in terms of precision and accuracy
requirements).
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Briefly describe the Standing Air Monitoring Work
Group (SAMWG) and its major quality assurance
finding and recommendation.
2.	List the four types of ambient air monitoring stations
defined in 40 CFR Part 58.
3.	List the appendices of 40 CFR, Part 58 that describe quality
assurance requirements for ambient air monitoring.
4.	Recognize that appendices A and B describe quality
assurance requirements for SLAMS and PSD stations,
respectively.
5.	List the two quality assurance functions required
by 40 CFR Part 58 appendices A and B.
6.	Describe air monitoring activities that must be
addressed by the quality assurance program.
7.	Distinguish between precision and accuracy.
8.	Recognize the need for precision and accuracy assessments.
9.	Describe the precision and accuracy checks required
for manual and automated measurement methods.
10.	Compute precision and accuracy assessments for manual
and automated measurement methods (given necessary
equations).
11.	Describe quality assurance reporting requirements.
12.	Compare and contrast quality assurance requirements
for SLAMS and PSD stations.
14-1

-------
QUALITY
ASSURANCE
FOR SLAMS
AND PSD
SAMWG
Standing Air
Monitoring Work
Group
MAJOR QA
FINDING
• questionable data quality
MAJOR QA
RECOMMENDATION
• establish formal QA
programs
40CFR58
14-3

-------
MONITORING STATIONS
SLAMS
State and Local Air Monitoring

Stations
NAMS
National Air Monitoring Stations
SPMS
Special Purpose Monitoring

Stations
PSD
Prevention of Significant

Deterioration
40CFR58
APPENDIX A — "Quality Assurance
Requirements for State
and Local Air Monitoring
Stations (SLAMS)"
APPENDIX B — "Quality Assurance
Requirements for
Prevention of Significant
Deterioration (PSD) Air
Monitoring"
APPENDIX A
QA Functions
*	control requirements
*	data quality assessment
CONTROL REQUIREMENTS
•	will be in general terms
•	states to develop and implement a
QA program which will:
•	provide data of adequate quality to meet
monitoring objectives
•	minimize loss of air quality data due to
malfunction or out-of-control conditions
•	must be approved by Regional
Administrator
GUIDANCE
•	Quality Assurance Handbook for Air
Pollution Measurement Systems
•	Volume I - Principles
•	Volume II - Ambient Air Specific Methods
•	Reference and Equivalent Methods given in
40 CFR 50 and 40 CFR 53
•	Operation and Instruction Manuals of
Designated Analyzers

-------
PROGRAM CONTENT
•	method or analyzer selection
•	equipment installation
•	calibration
•	zero and span checks and adjustment
•	control checks
•	control limits for zero, span and
control checks - corrective action
PROGRAM CONTENT
(continued)
•	use of multiple ranges
•	preventive maintenance
•	quality control procedures for
episode monitoring
•	recording and validating data
•	documentation of QC information
TRACEABILITY REQUIREMENTS
•	gaseous standards for CO, SO2, and
NO2 traceable to NBS
•	O3 test concentrations generated by
UV photometer
•	flow measuring instruments traceable
to authoritative volume
EPA INTERLABORATORY
PERFORMANCE AUDIT PROGRAM
EPA SYSTEM AUDIT
•	facilities
•	equipment
•	procedures
•	documentation
•	personnel
•	(all 23 QA elements)

-------
QA program reviewed for:
•	adequacy
•	compliance

DATA QUALITY
ASSESSMENT
Precision and
Accuracy
PRECISION AND ACCURACY
PRECISION REFERS	ACCURACY REFERS
TO REPRODUCIBILITY	TO CORRECTNESS
[Target diagrams]
Precision is good but	Accuracy is good but
accuracy is poor.	precision is poor.
Both precision
and accuracy
are good.
IMPORTANCE OF
PRECISION AND ACCURACY
DETERMINATIONS
•	needed to determine quality of data
recorded
•	useful for data validation
•	minimizes generation of erroneous data

-------
MANUAL
METHODS
Precision
MANUAL METHODS
Internal Checks
[Table: internal precision checks (collocated samplers) and accuracy checks
(flow and analytical audits) for the manual methods (SO2, NO2, TSP), with the
extent or frequency of each check, e.g., at least one per quarter, all sites
each year, 3 levels]
MANUAL METHODS
External Audits
[Table: external audits of the manual methods - performance audits (flow,
analytical) by EMSL and system audits by the EPA Region; extent or frequency:
annual or semi-annual]



COLLOCATED SAMPLER DATA BY SITE

Day	Duplicate Sampler (Y)	Designated Sampler (X)	Difference (Y - X)	di (%)
1
.
.
n
di = [(Yi - Xi)/Xi](100)
d̄j = Σdi/n
Sj = [(Σdi2 - (Σdi)2/n)/(n - 1)]1/2
95% probability limits = d̄j ± (1.96/√2)Sj

-------
COLLOCATED SAMPLER DATA BY
REPORTING ORGANIZATION

Site	Number of Days	Average Percent Difference, d̄	Standard Deviation of Percent Difference, S
1	n1	d̄1	S1
.	.	.	.
k	nk	d̄k	Sk
D̄ = (n1d̄1 + n2d̄2 + ... + nkd̄k)/(n1 + n2 + ... + nk)
Sa = [((n1 - 1)S12 + (n2 - 1)S22 + ... + (nk - 1)Sk2)/(n1 + n2 + ... + nk - k)]1/2
95% probability limits = D̄ ± (1.96/√2)Sa
COLLOCATED TSP SAMPLERS
To do:
Compute 95% probability limits
Given:

Site	Duplicate	Designated
1	83.0 µg/m3	81.9
	119.9	113.6
	128.4	122.7
2	127.9	129.0
	137.5	134.2
	118.0	113.4
FOR SITE 1

Day	Duplicate Sampler	Designated Sampler	Difference	d (%)
1	83.0	81.9	+1.1	+1.3
2	119.9	113.6	+6.3	+5.5
3	128.4	122.7	+5.7	+4.6
d̄1 = +3.80
S1 = 2.21
FOR SITE 2

Day	Duplicate Sampler	Designated Sampler	Difference	d (%)
1	127.9	129.0	-1.1	-0.9
2	137.5	134.2	+3.3	+2.5
3	118.0	113.4	+4.6	+4.1
d̄2 = +1.90
S2 = 2.55

-------


Site	Number of Days	Average Percent Difference	Standard Deviation of Percent Difference
1	3	+3.80	2.21
2	3	+1.90	2.55
D̄ = [(3)(3.80) + (3)(1.90)]/(3 + 3) = 2.85
Sa = [((2)(2.21)2 + (2)(2.55)2)/(3 + 3 - 2)]1/2 = 2.38
95% probability limits = D̄ ± (1.96/√2)(Sa)
= 2.85 ± (1.96/√2)(2.38)
= 2.85 ± 3.30
= +6.15% or +06
= -0.45% or -00
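The collocated-sampler computation above can be scripted directly from the formulas shown. The sketch below (illustrative only) reproduces the TSP example; its limits differ from the slide's 2.85 ± 3.30 in the second decimal place only because the slide rounds each percent difference before pooling:

# Minimal sketch of the collocated-sampler precision assessment, using the
# formulas shown above.  Small differences from the slide come from the
# slide rounding each percent difference before pooling.
from math import sqrt

site1 = [(83.0, 81.9), (119.9, 113.6), (128.4, 122.7)]   # (duplicate, designated)
site2 = [(127.9, 129.0), (137.5, 134.2), (118.0, 113.4)]

def site_stats(pairs):
    d = [(y - x) / x * 100 for y, x in pairs]             # percent differences
    n = len(d)
    dbar = sum(d) / n
    s = sqrt((sum(v * v for v in d) - sum(d) ** 2 / n) / (n - 1))
    return n, dbar, s

stats = [site_stats(site1), site_stats(site2)]
N = sum(n for n, _, _ in stats)
D = sum(n * dbar for n, dbar, _ in stats) / N             # pooled average
Sa = sqrt(sum((n - 1) * s * s for n, _, s in stats) / (N - len(stats)))
half = 1.96 / sqrt(2) * Sa
print(f"95% probability limits: {D - half:.2f}% to {D + half:.2f}%")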
MANUAL
METHODS
Accuracy
ACCURACY DATA BY REPORTING ORGANIZATION

Analysis Day or Sampler	Observed Level (Y)	Known Level (X)	Difference (Y - X)	d (%)
1
.
.
k
d = [(Y - X)/X](100)
D̄ = (n1d̄1 + n2d̄2 + ... + nkd̄k)/(n1 + n2 + ... + nk)
95% probability limits = D̄ ± 1.96(Sa)


-------
AUTOMATED METHODS
Internal Checks
[Table: for SO2, CO, NO2, and O3 analyzers - precision checks (biweekly; at
least one per quarter) and accuracy via local audits (3-4 levels; 25% of
analyzers each quarter; all analyzers each year)]
AUTOMATED METHODS
External Audits
[Table: performance audits by EMSL (SO2, CO) and system audits by the EPA
Region (SO2, CO, NO2, O3); extent or frequency: semi-annual and annual]


AUTOMATED
METHODS
Precision
CO PRECISION CHECKS BY ANALYZER 1

Biweekly Check	Observed Concentration (Y)	Known Concentration (X)	Difference (Y - X)	d (%)
[six biweekly checks of observed versus known CO concentration]
d̄1 = -0.95
S1 = 0.69
95% probability limits = d̄1 ± 1.96(S1)
= -0.95 ± 1.96(0.69)
= +0.40 or +00
= -2.30 or -02

-------
CO PRECISION CHECKS
BY REPORTING ORGANIZATION

Analyzer   Number of          d̄i       Si
           Biweekly Checks
   1            n1            d̄1       S1
   .             .             .        .
   k            nk            d̄k       Sk

D = (n1d̄1 + n2d̄2 + ... + nkd̄k)/(n1 + n2 + ... + nk)

Analyzer   Number of          d̄        S
           Biweekly Checks
   1            6            -0.95     0.69
   2            6            +1.03     0.94
   3            6            -1.76     0.51

D = [(6)(-0.95) + (6)(+1.03) + (6)(-1.76)]/(6 + 6 + 6)
  = -0.56

Sa = [((5)(0.69)² + (5)(0.94)² + (5)(0.51)²)/(5 + 5 + 5)]^(1/2)
   = 0.73

95% probability limits = D ± 1.96 (Sa)
                       = -0.56 ± 1.96 (0.73)
                       = +0.87 or +01
                         -1.99 or -02
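
The pooling step above can be written out directly. A short Python
sketch (ours; the variable names are assumed, not from the workbook):

import math

# (n_j, d_bar_j, S_j) for analyzers 1-3, from the slide
analyzers = [(6, -0.95, 0.69), (6, +1.03, 0.94), (6, -1.76, 0.51)]

n_tot = sum(n for n, _, _ in analyzers)
k = len(analyzers)
D = sum(n * db for n, db, _ in analyzers) / n_tot     # weighted average
Sa = math.sqrt(sum((n - 1) * s * s for n, _, s in analyzers)
               / (n_tot - k))                         # pooled std. deviation
print(D, D - 1.96 * Sa, D + 1.96 * Sa)
# D = -0.56 and limits near -2.00 and +0.88; the slide, rounding
# Sa to 0.73, reports -1.99 and +0.87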
14-11

-------
AUTOMATED
METHODS
Accuracy
ACCURACY DATA BY
CONCENTRATION LEVEL
Analyzer   Observed   Known   Difference            d
           Level      Level   (Y - X)              (%)
   1          Y         X       Y - X      [(Y - X)/X](100)
   .
   k

95% probability limits = D ± 1.96 (Sa)
SO2 AUTOMATED METHOD
LEVEL 3 (.40-.45)

Analyzer   Observed   Known    Difference      d
           Level      Level                   (%)
   1         .39        .43       -.04        -9.3
   2         .40        .42       -.02        -4.8
   3         .45        .44       +.01        +2.3

D = -3.9
Sa = 5.8

95% probability limits = D ± 1.96 (Sa)
                       = -3.9 ± 1.96 (5.8)
                       = +7.5 or +08
                         -15.3 or -15
REPORTING
REQUIREMENTS
SLAMS
•	pooled quarterly precision and
accuracy averages
•	reported through EPA Regional
Office to EMSL within 90 days
after end of quarter
14-12

-------
REPORTING
ORGANIZATION
A state or subordinate organization
responsible for a set of stations
which monitor the same
pollutant and for which precision
and accuracy assessments can be
pooled.
A reporting organization
should usually have:
•	common team of field
operators
•	common calibration facilities
•	common laboratory support
PRECISION AND ACCURACY
SUMMARY ANALYSIS
•	quarterly summary analysis from
EMSL to states — within 6 months
after end of each quarter
•	annual summary analysis from
EMSL to states — within 9 months
after end of year
EPA REGIONAL SYSTEM AUDIT
• Verbal Report
From: Regional Audit
Team
To: Auditee
When: Immediately
following audit
• Written Report
From: Regional Audit
Team
To: Auditee
Copy: State
When: Within 1 month
of audit
• Annual Regional
Summary
From: EPA Regional
Offices
To: States/EMSL
When: Within 6
months after
end of year
• Annual National
Summary
From: EMSL
To: States (EPA
Regional Offices)
When: Within 12
months after
end of year

-------
EMSL PERFORMANCE AUDITS
• True Values
(written)
From: EMSL
To: Each Reporting
Organization
When: Within 1
month after
each audit
• Annual Summary
Report
From: EMSL
To: Regions/States/
Reporting
Organizations
When: Within 9
months after
end of year
APPENDIX B
Quality assurance
requirements are the
same as Appendix A
requirements except
for the following:
APPENDIX B

Topic                 Appendix A                  Appendix B
Monitoring and QA     State/local agency          Source Owner/Operator
Responsibility
Monitoring Duration   Indefinitely                Up to 12 months
QA Reporting Period   Calendar Quarter            Sampling Quarter
Accuracy Assessment   Standards and equipment     Personnel, standards, and
(Audits)              different from those used   equipment different from
                      for spanning and            those used for spanning
                      calibration; different      and calibration
                      personnel preferred
-------
DATA ASSESSMENT REPORT
OMB No. 158-R0012

[Figure 1, Form 1 (front). Header blocks identify the state, the
reporting organization, the year, and the quarter; the completed form
is sent to the EPA Regional Office with a copy to EMSL/RTP. For each
automated analyzer method (A. CO, B. NO2, C. O3, D. SO2, E. other),
the form provides entry blocks for precision (number of analyzers1,
number of precision checks, and lower/upper 95% probability limits)
and for accuracy (source of traceability of the primary standard2,
number of audits at each level, and lower/upper probability limits
for each audit level).]

1 COUNT ONLY REFERENCE OR EQUIVALENT MONITORING METHODS
2 Identify according to the following code:
A.	NBS SRM
B.	EMSL REFERENCE GAS
C.	VENDOR CRM
D.	PHOTOMETER
E.	BAKI
F.	OTHER, SPECIFY	
FIGURE 1 FORM 1 (FRONT)

-------
DATA ASSESSMENT REPORT
OMB No. 158-R0012

[Figure 1, Form 1 (back). Header blocks as on the front (state,
reporting organization, year, quarter; send the completed form to the
Regional Office with a copy to EMSL/RTP). For each manual method
(A. TSP, B. SO2, C. NO2, D. Pb), the form provides entry blocks for
precision (number of samplers1, number of collocated sites, number of
collocated samples, number of valid collocated data pairs, and
lower/upper 95% probability limits) and for accuracy (lower/upper
probability limits at each audit level).

Limits applicable to blocks 20-23:
TSP: 20 μg TSP/m³
SO2: 40 μg SO2/m³
NO2: 30 μg NO2/m³
Pb: 0.15 μg Pb/m³]

1 COUNT ONLY REFERENCE OR EQUIVALENT MONITORING METHODS.
FIGURE 1 FORM 1 (BACK)

-------
Precision and Accuracy Data from State and Local
Air Monitoring Networks: Meaning and Usefulness
Raymond C. Rhodes
U.S. Environmental Protection Agency
Presented at 73rd APCA Annual Meeting and Exhibition in Montreal,
Quebec, Canada, June 1980.
Raymond C. Rhodes is a Quality Assurance Specialist for the U.S.
Environmental Protection Agency. He received a B.S. degree in Chemical
Engineering and M.S. degree in Statistics from Virginia Polytechnic
Institute. He has more than 30 years of experience in quality assurance
work. He is a fellow of the American Society for Quality Control (ASQC);
is a past chairman of the Chemical Division and is currently Chairman of
the Environmental Technical Committee.
Raymond C. Rhodes
Quality Assurance Specialist
Quality Assurance Division (MD-77)
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
14-17

-------
Precision and Accuracy Data from State and Local
Air Monitoring Networks: Meaning and Usefulness
R. C. Rhodes
Introduction
Appendix A of the EPA air monitoring regulations of May 10, 1979,
includes requirements aimed at improving the quality of air monitoring data
obtained by state and local networks. The requirements involve such
aspects as network design, site and probe location, use of reference or
equivalent methodology, and the establishment and documentation of quality
assurance programs. State and local agencies are also required to perform
special checks to determine the precision and accuracy of their pollutant
measurement systems and to report the results of the checks to EPA Regional
Offices and to the Environmental Monitoring Systems Laboratory (EMSL) at
Research Triangle Park, North Carolina. The requirements for reporting
precision and accuracy data are effective January 1, 1981.
Precision and Accuracy
Precision and accuracy are two fundamental measures of the quality of
data from a measurement process. Simply stated, "precision" is a measure
of repeatability of the measurement process when measuring the same thing,
and "accuracy" is a measure of closeness of an observed measurement value
to the truth. Precision and accuracy of air monitoring or measurement data
cannot be ascertained from the data themselves, but require the use of
specially planned checks from which precision and accuracy can be estimated.
Precision
In general, precision can be determined under various conditions. For
example, precision will be better when repeated laboratory measurements are
made with a single instrument on the same day and by the same analyst than
when the repeated measurements are made on different instruments, on
different days, and by different analysts. The conditions under which
precision is measured are carefully defined in the regulation to properly
interpret and use the estimates and to assure comparability of the precision
estimates.
Because all components of a total measurement process contribute
error to a reported value, it is necessary to determine the precision
under conditions which involve all components of the measurement process.
For air monitoring systems, the best and easiest way to accomplish this is
to use duplicate, or collocated, measurement systems to obtain duplicate
results when sampling the same air. The agreement between the results is a
measure of precision of the entire measurement process.
For manual methods, the regulations specify the technique of using
collocated samplers for estimation of precision. Not only does this tech-
nique involve all parts of the total measurement process, but it determines
the precision using actual concentrations of pollutants in the ambient air.
14-19

-------
For automated analyzers, the use of collocated sampling instruments
would be best to measure repeatability. However, the cost would be pro-
hibitive. The next most desirable technique would be to perform "span"
checks at approximately ambient concentration levels at random points in
time between successive instrument adjustments. In this way, the precision
is a measure of instrument drift from the time of the most recent instru-
ment adjustment or calibration to the time of the precision check. The
regulations require the precision checks to be made at two-week intervals
or more frequently. Although not stated in the regulation, following
introduction of the "precision" gas and after reaching equilibrium condi-
tions, an average of the instrument output should be obtained over some
relatively short period of time, e.g., five minutes. Thus, the precision
estimates have meaning only with respect to the time-averaging period over
which the average values are obtained. Precision estimates for other time-
averaging periods would have to be determined by knowing or assuming a
drift pattern between successive instrument adjustments/calibrations.
Accuracy
To measure the closeness of an observed measurement value to the
truth, some material or condition of known (true) property must be measured
by the measurement system being checked. The measurement system is
"challenged" with the "known" to obtain the observed measurement. For
automated analyzers, "known" gaseous pollutant concentrations, determined
using different standards and different equipment from those used for
routine calibration and spanning, are Introduced into the measurement
instruments. In this way, two different calibration systems are involved:
the one used for routine monitoring and the one used to assess the "known."
For manual methods, it is difficult to challenge the total measurement
system with "knowns." Therefore, an accuracy audit is made of only a
portion of the measurement system. The two major portions of manual
measurement systems are the flow and the analytical measurements. The
flow measurement portion of the TSP method, and the analytical measurement
portion of the NC^ and SC^ bubbler methods are audited for accuracy.
Regulation Requirements
Based on the above considerations, special checks/audits were devised.
Table I summarizes the minimum requirements specified in Appendix A of
the May 10, 1979 regulation.
Precision, Automated Analyzers
Precision checks are conducted at least biweekly and are made with
the following concentrations of gases: 0.08-0.10 ppm for SO2, O3, and NO2,
and 8-10 ppm for CO. These precision checks may be made using the same
materials, equipment, and personnel routinely used for instrument cali-
bration and spanning.
14-20

-------
Table I. Special checks and audits for estimation
of precision and accuracy.

                      Precision                Accuracy (local audit)

Automated analyzers
(SO2, CO, NO2, O3)
  Type check          Precision check at       3 or 4 concentrations
                      one concentration
  Frequency           Biweekly                 25% of the analyzers each
                                               quarter; at least 1 per
                                               quarter
  Scope               All monitoring           All analyzers each year
                      instruments

Manual methods                                 Flow (TSP)     Analytical (SO2, NO2)
  Type check          Collocated samplers,     1 level        3 levels
                      2 sites (of high
                      concentration)
  Frequency           Each monitoring day      25% of the     Each analysis day;
                                               sites each     at least twice
                                               quarter; at    per quarter
                                               least 1 per
                                               quarter
  Scope               2 sites                  All sites      (Not applicable)
                                               each year
Precision, Manual Methods
Precision checks are made using collocated samplers at at least two
sites (of high concentration). One of the collocated samplers will be
randomly designated as the official sampler for routine monitoring; the
other shall be considered the duplicate. Results from the duplicate are to
be obtained each day the designated sampler is operated unless the samplers
are operated more frequently than every sixth day, in which case at least
one duplicate is required each week.
14-21

-------
Accuracy, Automated Analyzers
Automated analyzers are challenged (audited) with known pollutant
concentrations at three levels (or four levels, in the case of episode
analyzers), in accordance with Table II:
Table II. Automated analyzer audit concentrations (ppm)

                     Concentration range
Audit level     SO2, NO2, O3          CO
     1          0.03 - 0.08         3 - 8
     2          0.15 - 0.20        15 - 20
     3          0.40 - 0.45        40 - 45
     4          0.80 - 0.90        80 - 90
Twenty-five percent of the automated analyzers of each type in the
monitoring network are to be audited once each calendar quarter so as
to represent a random sample for the entire network. Thus, for each
quarter, the results represent a random sample from all of the analyzers.
However, at least one analyzer shall be audited each quarter and all
analyzers shall be audited each year. Since the audits are to be conducted
with standards and equipment different from that used for calibration and
spanning (the analyst should also be different), when the audit is per-
formed within the quarter is not critical.
Accuracy, Manual Methods
For manual methods an accuracy audit is made of only a portion of the
measurement system. For TSP, only the flow measurement portion is audited;
for NO2 and SO2, only the chemical analytical portion is audited.
The flow rate audits for TSP are made at the normal operating level.
Twenty-five percent of the sites shall be audited each quarter, so as to
represent a random sample for the entire network. However, at least one
site shall be audited each quarter and all sites shall be audited each
year.
For the NO2 and SO2 methods, audit samples in the following ranges are
used: 0.2-0.3 μg/ml; 0.5-0.6 μg/ml; 0.8-0.9 μg/ml. An audit at each
concentration level shall be made on each day of analysis of routine
monitoring samples, and the audits shall be made at least twice each
quarter.
Computations
Signed Percentage Differences
The general form for computing individual signed percentage differences,
di, whether for precision checks or for accuracy audits, is:
14-22

-------
    di = [(Yi - Xi)/Xi](100)                                (1)
where, for accuracy audits (both automated analyzers and manual methods)
and for automated analyzer precision checks, Y represents the observed
value and X represents the known value. For manual method precision
estimates (collocated samplers), Y represents the duplicate sampler
value and X represents the designated sampler value.
Percentage differences instead of actual differences are used because
errors in precision and errors in accuracy are generally proportional to
concentration levels.
Signed percentage differences instead of absolute percentage differ-
ences are used to reveal or highlight any systematic errors that may need
to be investigated and corrected to further improve the precision and
accuracy of the monitoring data. Absolute percentage differences would
not enable a separation of the systematic errors from the random errors.
Data Summarization

Precision and accuracy data are summarized and reported for each
calendar quarter.

Precision. For each analyzer or site, the individual signed percen-
tage differences are summarized by calculating an arithmetic average, d̄j,
and a standard deviation, Sj. Ninety-five percent probability limits can be
calculated for each instrument or site for local network information,
using the following formula:

    d̄j ± 1.96 Sj                                            (2)

Although the regulations do not require such limits to be computed,
they should be of particular interest and value for the local network as
a supplement to their routine internal quality control. However, for
reporting to EPA, a consolidated set of 95 percent probability limits
is computed for automated analyzers:

    D ± 1.96 Sa                                             (3)

where D is the weighted average of the d̄j, and Sa is the pooled, weighted
value computed from the Sj. The expression for the probability limits
for precision for collocated samplers is:

    D ± 1.96 Sa/√2                                          (4)
14-23

-------
This √2 factor is introduced to correct for the statistical accumulation
of imprecision of results from both the duplicate and the designated
samplers. The probability limits are thereby put in terms of individual
reported values, the same as for the other probability limits.
Accuracy. From the di values obtained from the accuracy audit checks
at a given concentration (or flow) level, an average D and Sa are computed.
For reporting to EPA, 95 percent probability limits are computed using
Equation 3.
Meaning of Probability Limits
Average Value, Precision

Automated Analyzers. The d̄j values for each instrument represent
the average bias of results due to instrument drift. The D simply rep-
resents, for the network, the average of the d̄j.

Manual Methods. The d̄j values at each collocated site represent the
average bias between the results from the collocated samplers. The D
simply represents, for the network, the average of the d̄j.

Probability Limits, Precision

Automated Analyzers. The width or spread of the limits represents
the variability of the individual instrument drift values for each instru-
ment. The spread of the limits, D ± 1.96 Sa, represents the average with-
in-instrument variability for the network.

Manual Methods. The spread of the limits, d̄j ± 1.96 Sj/√2,
represents 1/√2 times the variability of differences between the daily
results of the collocated samplers, or the expected variability of results
from a given site, if repeated daily values could be obtained when measur-
ing the same pollutant concentration. For the network, the spread of the
limits, D ± 1.96 Sa/√2, represents the corresponding values based on
the average within-site variability.

Average Value, Accuracy

Automated Analyzers. The dj represents, for each instrument, the
bias at the concentration level audited. The D represents the average
bias for the network at the given concentration level.

Manual Methods. The dj represents either (a) for TSP, the bias of
the flow rate at each site audited, or (b) for NO2 and SO2, the bias of
the analytical results for a given audit at a given concentration. The
D represents the average bias for the laboratory.

Probability Limits, Accuracy

Automated Analyzers. The width or spread of the limits, D ± 1.96 Sa,
represents the variability of accuracy of the audited analyzers, at a
given concentration level.
14-24

-------
Manual Methods. The width or spread of the limits, D ± 1.96 Sa,
represents either (a) for TSP, the variability of accuracy for the audited
sites, or (b) for NO2 and SO2, the variability of accuracy for the
audits at each given concentration.
Use of Precision and Accuracy Data
The precision and accuracy data obtained by the networks and reported
to EPA will be of considerable value to various organizations. These
estimates will be helpful to the user of routine monitoring data by
providing the user information on the quality of the data with which he
is working. The estimates will be valuable to EPA in obtaining "real
world" information on the precision and accuracy of the reference and
equivalent methods. The data should also be of particular interest and
value to the originating agencies as a supplement to the routine quality
control system.
Originating Agencies
The measures of precision and accuracy are obtained by each network
in the form of probability or control chart-type limits that can and
should be used within each agency as supplementary information for in-
ternal quality control. The information obtained within a network on a
given site or instrument can be used for local quality control purposes
for the particular site or instrument. It is important to emphasize,
however, that the precision and accuracy checks required by Appendix A do
not take the place of the need to maintain a routine quality control
system. Such checks are too infrequent to be adequate for day-to-day
control. Furthermore, the precision and accuracy results should not be
used to make any after-the-fact adjustments or corrections to monitoring
data.
Various control charts can be used for plotting the results of the
precision and accuracy data. As indicated above, the results of the
precision and accuracy checks, if used in a timely way, can provide a
valuable supplement to normal routine internal quality control checks.
Quality Control Charts. Although the prime objective of the precision
and accuracy audits is to obtain an assessment of data quality, a number
of statistical control charts can be maintained to provide some long-term
internal control. With control limits established on the basis of past
history (at least one quarter for precision, at least one year for accuracy),
future data values can be plotted to detect any significant change from
past experience.
In general, the control chart limits will be similar to the computed
probability limits except that the 1.96 value will be replaced by a 3.
(The 1.96 corresponds to an expected 95 percent probability—the 3
corresponds to an expected 99.7 percent probability.) In the case of
manual method precision, the √2 factor is not included because the
points to be plotted will be the percentage differences, which include
variability from the imprecision of both samplers. Also, since the
intuitively expected value for dj is zero for precision and accuracy, the
centerline for the control charts should be zero. Table III summarizes
the various control charts which can be plotted for the individual precision
checks and accuracy audits.
14-25

-------
Table III. Recommended control charts and limits for state and local agencies.

Automated methods for SO2, NO2, O3, and CO
  Control chart:   Precision - Single Instrument
                   (one control chart for each instrument)
  Control limits:  Zero ± 3 Sj
  Plotting:        After each biweekly precision check, plot each
                   individual dj value
  Controls:        Excessive variability and drift of each instrument

  Control chart:   Accuracy - Single Instrument, each audit level
                   (one control chart for each audit level)
  Control limits:  Zero ± 3 Sa
  Plotting:        After each audit check, plot each individual dj value
  Controls:        Excessive bias of each instrument

Manual methods
  TSP (collocated samplers)
    Control chart:   Precision - Single Site
                     (one control chart for each collocated site)
    Control limits:  Zero ± 3 Sj
    Plotting:        Each day, plot dj for each site
    Controls:        Excessive lack of agreement between collocated samplers

  TSP (flow rate)
    Control chart:   Accuracy - Single Site
                     (one control chart per agency)
    Control limits:  Zero ± 3 Sa
    Plotting:        After each audit, plot each individual dj
    Controls:        Excessive bias of each instrument

  SO2 (analysis), NO2 (analysis)
    Control chart:   Accuracy, for each audit level
                     (one control chart for each audit level)
    Control limits:  Zero ± 3 Sa
    Plotting:        After each audit, plot each individual dj
    Controls:        Excessive bias for each audit

-------
Other control charts could be plotted with the D values to detect
biases from quarter to quarter. Similarly, the quarterly values of S could
be plotted to control or display the variability aspects of the measurement
systems.
States and Regional Offices
The precision and accuracy reports will be helpful to the states in
comparing these measures of data quality from the networks within the
states. Similarly, the EPA Regional Offices will be able to make compari-
sons within and between Regions. These comparisons may point out parti-
cular organizations, states, or Regions in need of further improvement in
their quality assurance programs.
Environmental Protection Agency (EPA)
Evaluation of the precision and accuracy data is important to EPA
(EMSL, Research Triangle Park, North Carolina) in its role of responsibi-
lity for quality assurance of air pollution measurements. The precision
and accuracy data will be useful in (a) determining possible needs for
additional research efforts related to particular measurement methods,
(b) indicating measurement methods or portions thereof, which may require
improved quality control, and (c) indicating particular agencies, states,
or Regions that may require technical assistance or improved quality
control. In other words, the precision and accuracy information will
enable comparisons to be made across measurement methods, and across
networks or other organizational entities for purposes of identifying
possible areas in need of improvement of data quality. With knowledge of
the precision and accuracy information, EPA can consider appropriate
statistical allowances or risks in setting and enforcing the standards,
and in developing control strategies.
User
After January 1, 1981, when the precision and accuracy reporting
becomes effective, users of monitoring data maintained in the National
Aerometric Data Bank (NADB) will receive along with the monitoring data,
the precision and accuracy data for the corresponding time periods and
locations. The availability of the precision and accuracy data will
assist the users in their interpretation, evaluation, and use of the
routine monitoring data.
Environmental Monitoring Systems Laboratory Reports
To assist Regions and states in making the above comparisons as well
as to perform other analyses of the reported precision and accuracy data,
EMSL/RTP will perform various types of statistical analyses and will
prepare evaluation and summary reports each quarter and each year.
14-27

-------
Summary
The implementation of the May 10, 1979, regulation should result in
an improvement in the quality of air pollution data obtained from the
states and local agencies. Particularly from a quality assurance standpoint,
the quality assurance plans of the states and local agencies will be
documented in detail, and quantitative estimates of precision and accuracy
will be available for users of air monitoring data.
14-28

-------
PRECISION WORK SESSION
LESSON GOAL:	To assure that students can perform precision
and accuracy calculations as described in Lesson
14, "Quality Assurance Requirements for SLAMS and
PSD."
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1. Calculate 95% precision probability limits
for a reporting organization using collocated
sampler data.
14A-1

-------
COLLOCATED SAMPLERS
SO2 (μg/m³)

SLAMS SITE    DUPLICATE    DESIGNATED
    1            227           236
                 268           275
                 258           256
    2            245           257
                 227           240
                 164           166
                 212           221

Compute 95% probability limits for the reporting organization
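
One way to set the problem up (a Python sketch of our own, following
the Lesson 14 formulas; it is not an official solution sheet):

import math

sites = {
    1: [(227, 236), (268, 275), (258, 256)],
    2: [(245, 257), (227, 240), (164, 166), (212, 221)],
}

stats = []
for pairs in sites.values():
    d = [100.0 * (y - x) / x for y, x in pairs]   # duplicate vs. designated
    n, db = len(d), sum(d) / len(d)
    s = math.sqrt(sum((v - db) ** 2 for v in d) / (n - 1))
    stats.append((n, db, s))

n_tot = sum(n for n, _, _ in stats)
D = sum(n * db for n, db, _ in stats) / n_tot
Sa = math.sqrt(sum((n - 1) * s * s for n, _, s in stats)
               / (n_tot - len(stats)))
half = (1.96 / math.sqrt(2)) * Sa        # collocated samplers: 1/sqrt(2)
print(D - half, D + half)                # the 95% probability limits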

-------
DATA VALIDATION
LESSON GOAL:	To familiarize students with data validation
considerations.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Define data validation.
2.	Describe nine (9) characteristics of a data
validation system.
3.	Describe factors that affect the selection of data
validation techniques.
4.	List the levels of data validation.
5.	Explain the importance of having data validation
performed by the agency that generates the data.
15-1

-------
DATA
VALIDATION
QA
Handbook
DATA
VALIDATION
The process whereby data
are filtered and accepted
or rejected based on a set
of criteria.
DATA
VALIDATION
A systematic procedure of
reviewing a body of data
against a set of criteria to
provide assurance of its
validity prior to its
intended use.
RELATED TERMS
*	data editing
*	data screening
*	data auditing
*	data verification
*	data evaluation
*	data qualification
" data quality assessment
CHARACTERISTICS OF A
DATA VALIDATION SYSTEM
•	Is an after-the-fact review
•	Is applied to blocks of data
•	Is systematically/uniformly
applied
•	Uses set of criteria
•	Checks for internal consistency
15-3

-------
CHARACTERISTICS OF A
DATA VALIDATION SYSTEM
(continued)
•	Checks for temporal/spatial
continuity
•	Checks for proper identification
•	Checks for transmittal errors
•	Flags/rejects questionable data
• Is an after-the-fact
review
• Is applied to blocks of data
• Is uniformly/
systematically
applied
Data Validation
Protocol
[facsimile of a written data validation protocol]
• Uses set
of criteria
CRITERIA
Maximum
span drift
Maximum
temperature
Concentration
limit

-------
• Checks for internal
consistency
•	uniform sampling methodology
•	uniform monitor siting
•	uniform data reduction and
reporting
•	pollutant relationships
•	pollutant/meteorological
relationships
• Checks for
temporal/
spatial
continuity
ppm
Temporal
(diurnal)
so2
Concentration
Mon Tues Wed Thurs Fri Sat Sun
Temporal
(weekly)
Concentration
Spring Summer Fall
Temporal
(seasonal)
Spatial
15-5

-------



[illustration: data report listing with columns for Agency, Parameter,
Observed Data, City Name, Time Interval, Site, and Units]
• Checks for
proper
identification
• Checks for transmittal errors
Flags/rejects
questionable
data
0.0298
0.0123
0.4555  ← flagged
0.0450
0.0600
TECHNIQUES EMPLOYED
Monitoring Network Characteristics
*	Nature of Response Output
*	Data Reduction Methodology
*	Data Transmittal Methodology
*	Types and Amount of Ancillary Data
*	Computing/Plotting Capability
*	Intended Uses of Data
*	Amount of Data
• Nature of Response Output
15-6

-------
• Data Reduction Methodology
• Data Transmittal
Methodology
Data Form
Types and Amount of
Ancillary Data
Maintenance
Records
• Computing/Plotting
Capability
Concentration Isopleths Plot
• Intended Uses of Data
15-7

-------
• Intended Uses of Data
(continued)
• Amount of Data

LEVELS OF DATA
VALIDATION

State Agency
EPA Regional Office
NADB
Validation should be performed
by originating agency because
it has more information
concerning:
*	local meteorology
*	local emissions sources
*	unusual events
*	site/instrument logbooks
•	personnel
•	equipment/supplies
•	operating procedures
•	calibration materials

-------
Validation should be
performed by someone
other than the person
who collected or
reported the data.
15-9

-------
QUALITY ASSURANCE AND DATA VALIDATION FOR THE
REGIONAL AIR MONITORING SYSTEM OF THE
ST. LOUIS REGIONAL AIR POLLUTION STUDY
By
Robert B. Jurgens*
Environmental Sciences Research Laboratory
Research Triangle Park, North Carolina 27711
And
Raymond C. Rhodes
Environmental Monitoring and Support Laboratory
Research Triangle Park, North Carolina 27711
The success of model development and evaluation
from a body of monitoring data depends heavily upon
the quality of that data. The quality of the
monitoring data in turn is dependent upon the various
quality assurance (QA) activities which have been
implemented for the entire system, commencing with
the design, procurement, and installation of the
system and ending with validation of monitoring data
prior to archiving. Of the many sources of aeromet-
tric and emissions data that exist, the St. Louis
Regional Air Pollution Study (RAPS) is the only known
study specifically designed for model development and
evaluation on an urban/rural scale.1,2

Table 1. RAMS NETWORK MEASUREMENTS

AIR QUALITY:      SULFUR DIOXIDE, TOTAL SULFUR, HYDROGEN SULFIDE, OZONE,
                  NITRIC OXIDE, OXIDES OF NITROGEN, NITROGEN DIOXIDE,
                  CARBON MONOXIDE, METHANE, TOTAL HYDROCARBONS
METEOROLOGICAL:   WIND SPEED, WIND DIRECTION, TEMPERATURE, TEMPERATURE
                  GRADIENT, PRESSURE, DEW POINT, AEROSOL SCATTER
SOLAR RADIATION:  PYRANOMETER, PYRHELIOMETER, PYRGEOMETER
-------
3. Specific data quality control activities,
those activities which involve the calibration and
data output from the meteorological and pollutant
measurement instruments and are explicitly involved
in acquiring quality data.
Procurement and Management
Data Quality Objectives. A requirement of the
initial contract stated that 90% valid data were to
be achieved. Valid data for pollutant measurements
were defined as the data obtained during periods
when the daily zero and span drifts were less than
2 per cent, with an allowance for the time required
to perform daily zero/span checks and periodic
multi-point calibrations.
Procurement. In planning to achieve the
objectives very stringent requirements were placed
on the suppliers of the various instruments of the
system and extensive performance tests (with numerous
rejections) were conducted prior to final acceptance.
First Article Configuration Inspection (FACI).
The first remote station was installed and performance
tested by the contractor under EPA review. Various
indicated corrections were made before proceeding
with the installation of the entire network.
System Acceptance Test (SAT). After installation
of the entire network, a one-month system performance
demonstration was required to assure satisfactory
operation with respect to obtaining data of adequate
quantity and quality. The SAT was completed in
December 1974.
Incentive Contract. The current contract has
introduced award fee performance incentives for manage-
ment, schedule, and for quality. The quality portion
of the award fee provides a continual motivation for
obtaining and improving data quality.
Quality Assurance Plans. An extensive QA plan has
been developed by the contractor. A point of emphasis
is that the QA plan (and its implementation) is dynamic
--continually being revised and improved based upon
experience with the system. The QA plan outlines in
detail the activities of the various QA elements
previously mentioned.
Organization. To implement the QA plan, one
full-time employee is assigned to overall QA
responsibilities reporting directly to the Program
Manager. In addition, two persons are assigned for QA
on a half-time basis, one for the remote monitoring
stations, and the other for the central computer
facility.
Operation and Maintenance
Document Control. Detailed operation and
maintenance manuals have been prepared for the remote
stations and for the central computer facility. These
manuals are issued in a loose-leaf revisable and
document-control format so that needed additions
and/or revisions can be made. Also, a complete history
of changes is kept so that traceability to the
procedures in effect for any past period of time can
be made. A document control system also exists for
the computer programs.
Preventive Maintenance. Record-keeping and
appropriate analysis of the equipment failure records
by instrument type and mode of failure have enabled
more efficient and effective scheduling of maintenance
and optimum spare parts inventory with resultant
improvement in instrument performance. RAMS station
preventive maintenance is completed twice each week.
Normally, the remote stations are unattended except
for the weekly checks, for other scheduled maintenance,
or for special corrective maintenance.
Central Computer Monitors. Central computer
personnel, using a CRT display, periodically monitor
the output from all stations to detect problems as
soon as possible. To maximize the satisfactory opera-
tion of the network equipment, the assigned QA
personnel review the following activities associated
with preventive maintenance:
1.	remote station logbook entries,
2.	remote station corrective maintenance reports,
3.	laboratory corrective maintenance reports,
and
4.	central computer operator log.
Additionally, the QA individuals are in frequent
verbal communication with field and laboratory
supervisors to discuss quality aspects of the
operations.
Reliability Records and Analysis
Telecommunications Status Summaries. Each
day, a summary of telecommunications operations is
prepared to determine which stations and/or telephone
lines are experiencing significant problems that
might require corrective action.
Daily Analog/Status Check Summaries. Each
day, the central computer prepares a summary of analog/
status checks by station so that major problems can be
corrected as soon as possible by available field
technicians. These analog/status checks are explained
in the section on data validation.
Configuration Control. Histories are kept
of the station assignment of specific instruments,
by serial number, so that possible future problems
with specific instruments can be traced back to the
stations. A logbook for each instrument is maintained
for recording in a systematic manner the nature and
date of any changes or modifications to the hardware
design of the instruments.
Specific Data Quality Control Activities
Calibration
Calibration References for Gaseous Pollutants.
NBS standard reference materials are used for calibra-
tion standards if available. Otherwise, commercial
gases are procured and certified at NBS for use as
standards.
Multipoint Calibrations. As a check on the
linearity of instrument response, an on-site, 5-point
calibration is scheduled at each station at 8-week
intervals. Originally, acceptability was determined
by visual evaluation of the calibration data plots;
more recently, quantitative criteria are being
established for linearity.
Measurement Audits. Independent measurement
audits for pollutant instruments are performed by the
contractor using a portable calibration unit and
independent calibration sources at each station once
each calendar quarter. Similar audits are performed
on the same frequency for temperature, radiation, and
15-12

-------
mass flowmeters; and independent checks are made on
relative humidity, windspeed, and wind direction
instruments. In addition to the internal audits per-
formed by the contractor on his own operation, a
number of external audits have been performed by EPA
and other contractors5 to check the entire measurement
system.
On-Site System Audit. A thorough, on-site quality
system audit of RAMS was performed for EPA by an
independent contractor.6 The results of this audit
pointed out several areas of weakness for which
corrective actions have been implemented.
Data Validation. As a part of the overall QA
system, a number of data validation steps are
implemented. Several data validation criteria and
actions are built into the computer data acquisition
system:
Status Checks. About 35 electrical checks
are made to sense the condition of certain critical
portions of the monitoring system and record an
on-off status. For example, checks are made on power
on/off, valve open/shut, instrument flame-out, air
flow. When these checks are unacceptable, the
corresponding monitoring data are automatically
invalidated.
Analog Checks. Several conditions including
reference voltage, permeation tube bath temperature,
and calibration dilution gas flow are sensed and
recorded as analog values. Acceptable limits for
these checks have been determined, and, if exceeded,
the corresponding affected monitoring data are invalidated.
Zero/Span Checks. Each day, between 8-12 pm,
each of the gaseous pollutant instruments in each
station are zeroed and spanned by automatic, sequenced
commands from the central computer. The results of
the zero/span checks provide the basis for a two-point
calibration equation, which is automatically computed
by the central computer and is used for converting
voltage outputs to pollutant concentrations for the
following calendar day's data. In addition, the
instrument drift at zero and span conditions between
successive daily checks are computed by the central
computer and used as a basis for validating the
previous day's monitoring data. Originally, zero and
span drifts were considered as acceptable if less than
2 per cent, but the span drift criterion has recently
been increased to 5 per cent, a more realistic level.
If the criteria are not met, the minute data for the
previous day are flagged. Hourly averages are
computed during routine data processing only with data
which have not been flagged as invalid.
DATA SCREENING IN RAMS
The tests which are used to screen RAMS data are
summarized in Table 2. Specific tests and associated
data base flags are listed. The types of screens that
have been employed or tested will be detailed, the
mechanisms for flagging will be reviewed, and then
the implementation of screening within RAMS will be
discussed.
Table 2. Screening tests and associated flags for RAMS data
(the flag codes are largely illegible in this copy).

Category
I.   Module Operational
       RAMS validation
       Calibration
II.  Continuity/Relational
       A. Intrastation
            Gaseous analyzer drift
            Gross limits
            Frequency distributions
            Relationships
            Temporal continuity:
              Constant output
              Successive difference
       B. Interstation
            Meteorological network uniformity
            Dixon ratio
III. A posteriori
-------
Table 3. GROSS LIMITS AND RELATIONAL CHECKS

                      GROSS LIMITS
Parameter             Lower           Upper               Interparameter Condition
Ozone                 0 ppm           5 ppm               NO2 ± 0.04
Nitric Oxide          0 ppm           5 ppm               NO - NOx ≤ Noise (NO)
Oxides of Nitrogen    0 ppm           5 ppm               NO - NOx ≤ Noise (NOx)
Carbon Monoxide       0 ppm           50 ppm
Methane               0 ppm           50 ppm              CH4 - THC ≤ Noise (CH4)
Total Hydrocarbons    0 ppm           50 ppm              CH4 - THC ≤ Noise (THC)
Sulfur Dioxide        0 ppm           1 ppm               SO2 - TS ≤ Noise (SO2)
Total Sulfur          0 ppm           1 ppm               SO2 - TS ≤ Noise (TS)
Hydrogen Sulfide      0 ppm           1 ppm               H2S - TS ≤ Noise (H2S)
Aerosol Scatter       0.000001 m-1    0.00099 m-1
Wind Speed            0 m/s           12.2 m/s
Wind Direction        0°              360°
Temperature           -20°C           45°C
Dew Point             -30°C           45°C                DP - 0.5 ≤ T
Temperature Gradient  -5°C            5°C
Barometric Pressure   950 mb          1050 mb
Pyranometers          -0.50           2.50 langleys/min
Pyrgeometers          0.30            0.75 langleys/min
Pyrheliometers        -0.50           2.50 langleys/min
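
As an illustration, gross-limit and relational screening of this kind
might be coded as below (a Python sketch of our own, not RAMS code;
the limits are copied from Table 3, while the parameter keys and the
noise threshold are placeholders).

GROSS_LIMITS = {            # parameter: (lower, upper)
    "SO2": (0.0, 1.0),      # ppm
    "TS":  (0.0, 1.0),      # total sulfur, ppm
    "CO":  (0.0, 50.0),     # ppm
    "WS":  (0.0, 12.2),     # wind speed, m/s
    "T":   (-20.0, 45.0),   # temperature, deg C
}

def screen(record, noise_so2=0.02):
    # returns a list of flag strings for one record of measurements
    flags = []
    for p, value in record.items():
        lo, hi = GROSS_LIMITS.get(p, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flags.append(p + ": outside gross limits")
    # one relational check from Table 3: SO2 cannot exceed total
    # sulfur by more than instrument noise
    if "SO2" in record and "TS" in record:
        if record["SO2"] - record["TS"] > noise_so2:
            flags.append("SO2 - TS exceeds noise")
    return flags

print(screen({"SO2": 0.31, "TS": 0.25, "CO": 3.2, "WS": 4.0, "T": 21.0}))
# ['SO2 - TS exceeds noise']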
A refinement of the gross limit checks can be
made using aggregate frequency distributions. With a
knowledge of the underlying distribution, statistical
limits can be found which have narrower bounds than
the gross limits and which represent measurement
levels that are rarely exceeded. A method for fitting
a parametric probability model to the underlying
distribution has been developed by Dr. Wayne Ott of
EPA's Office of Research and Development.7 B.E.
Suta and G.V. Lucha8 have extended Dr. Ott's program
to estimate parameters, perform goodness-of-fit tests,
and calculate quality control limits for the normal
distribution, 2- and 3-parameter lognormal distribu-
tions, the gamma distribution, and the Weibull
distribution. These programs have been implemented
on the OSI computer in Washington and tested on
water quality data from STORET. This technique is
being studied for possible use in RAMS as a test for
potential recording irregularities as well as a
refinement of the gross limit check currently
employed.

Under intrastation checks are specific tests
which examine the temporal continuity of the data as
output from each sensor. It is useful to consider,
in general, the types of atypical or erratic responses
that can occur from sensors and data acquisition
systems. Figure 1 illustrates graphically examples
of such behavior, all of which have occurred to some
extent within RAMS. Physical causes for these
reactions include sudden discrete changes in component
operating characteristics, component failure, noise,
telecommunication errors and outages, and errors in
software associated with the data acquisition system
or data processing. For example, it was recognized
early in the RAMS program that a constant voltage
output from a sensor indicated mechanical or electri-
cal failures in the sensor instrumentation. One of
the first screens that was implemented was to check
for 10 minutes of constant output from each sensor.
Barometric pressure is not among the parameters
tested since it can remain constant (to the number of
digits recorded) for periods much longer than 10
minutes. The test was modified for other parameters
which reach a low constant background level during
night-time hours.

[Figure 1. Irregular instrument response. Panels:
A) single outlier, B) step function, C) spike,
D) missing, E) drift, F) calibration.]

A technique which can detect any sudden jump in
the response of an instrument, whether it is from an
individual outlier, step function or spike, is the
comparison of minute successive differences with
predetermined control limits. These limits are
determined for each parameter from the distribution
of successive differences for that parameter. These
differences will be approximately normally distributed
with mean zero (and computed variance) when taken over
a sufficiently long time series of measurements.
Exploratory application of successive differences,
using 4 standard deviation limits which will flag 6
values in 100,000 if the differences are truly
normally distributed, indicates that there are abnormal
occurrences of "jumps" within certain parameters.
Successive difference screening will be implemented
after further testing to examine the sensitivity of
successive difference distributions to varying
computational time-periods and to station location.

The type of "jump" can easily be identified. A
single outlier will have a large successive difference
followed by another about the same magnitude but of
opposite sign. A step function will not have a return,
and a spike will have a succession of large successive
differences of one sign followed by those of opposite
sign.
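
A minimal Python sketch (ours) of the successive-difference screen;
in practice sigma would be estimated per parameter and per station
from a long, clean series of differences, as described above.

def jump_flags(series, sigma, n_sigma=4.0):
    # indices whose difference from the previous minute exceeds
    # n_sigma * sigma
    diffs = [b - a for a, b in zip(series, series[1:])]
    return [i + 1 for i, d in enumerate(diffs) if abs(d) > n_sigma * sigma]

# a single outlier produces two flagged differences of opposite sign;
# a step function would produce only one, with no return
print(jump_flags([5.0, 5.1, 5.0, 9.8, 5.1, 5.0, 5.1], sigma=0.08))
# [3, 4]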
The interstation or network uniformity screening
tests that have been implemented in RAMS will now be
described. Meteorological network tests are performed
on hourly average data and are based on the principle
that meteorological parameters should show limited
differences between stations under certain definable
conditions typically found in winds of at least
moderate speeds (>4 m/sec). Each station value is
compared with the network mean. The network mean is
defined as the average value for a given parameter
from all stations having reported valid data. (If
more than 50% are missing, a network mean is not
15-14

-------
computed and the test is not made.) Values exceeding
prescribed limits are flagged. The limits have been
set on the advice of experienced meteorologists. The
tested parameters and flagging limits are listed
below.
Maximum allowable deviations from network mean
under moderate winds (network mean > 4 m/sec)

Wind speed               2 m/sec or MEAN/3
                         (whichever is larger)
Wind direction           30°
Temperature              3°C
Temperature difference   0.5°C
Dew point                3°C
Adjusted pressure        5.0 millibars
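
A Python sketch (ours) of the uniformity test; the station names and
values are invented, and the wind-speed limit (2 m/sec or MEAN/3,
whichever is larger) would need its own small variant.

def uniformity_flags(values, limit):
    # values: station -> hourly average (None if invalid)
    valid = {s: v for s, v in values.items() if v is not None}
    if len(valid) * 2 < len(values):     # more than 50% missing: no test
        return []
    mean = sum(valid.values()) / len(valid)
    return [s for s, v in valid.items() if abs(v - mean) > limit]

temps = {"site101": 21.2, "site102": 20.8, "site103": 26.9, "site104": None}
print(uniformity_flags(temps, limit=3.0))   # temperature limit: 3 deg C
# ['site103']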
In addition to network screening techniques
which are based on knowledge of underlying physical
processes, methods from statistical outlier
theory9,10 were also examined. Specifically, the
Dixon ratio test11 was implemented to determine
extreme observations of a parameter across the RAMS
network. The Dixon ratio test is based entirely on
ratios of differences between observations from an
assumed normal distribution and is easy to calculate.
The Dixon criteria for testing a low suspect value
from a sample size of n, n ≤ 25, are shown in
Figure 2. Though the entire sample is shown as
ranked, only the extreme 2 or 3 values need to be
ordered. Associated with each value of n are
tabulated ratios for statistical significance at
various probability levels. For example, if n = 25,
X1 would be considered as an outlier at the 1% level
of significance when r22 ≥ 0.489. Since the under-
lying distribution may not be normal, the calculated
probabilities may not be exact, but are used as
indicators of heterogeneity of the network observations
at a given time.
ORDERED SAMPLE: X1 ≤ X2 ≤ ... ≤ Xn   (X1 suspect)

r10 = (X2 - X1)/(Xn - X1)          3 ≤ n ≤ 7
r11 = (X2 - X1)/(Xn-1 - X1)        8 ≤ n ≤ 10
r21 = (X3 - X1)/(Xn-1 - X1)        11 ≤ n ≤ 13
r22 = (X3 - X1)/(Xn-2 - X1)        14 ≤ n ≤ 25

Figure 2. Dixon ratio test for suspected low value.
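
A Python sketch (ours) of the Dixon ratio for a suspected low value;
the critical values, such as 0.489 for n = 25 at the 1% level, must
still be looked up in the published tables.

def dixon_low(sample):
    # ratio for the lowest value, using the r-statistic for the
    # sample size as in Figure 2
    x = sorted(sample)
    n = len(x)
    if n < 3 or n > 25:
        raise ValueError("Dixon test tabulated for 3 <= n <= 25")
    if n <= 7:
        return (x[1] - x[0]) / (x[-1] - x[0])     # r10
    if n <= 10:
        return (x[1] - x[0]) / (x[-2] - x[0])     # r11
    if n <= 13:
        return (x[2] - x[0]) / (x[-2] - x[0])     # r21
    return (x[2] - x[0]) / (x[-3] - x[0])         # r22

r = dixon_low([4.1, 9.8, 10.1, 10.0, 9.9, 10.3])
print(r)   # (9.8 - 4.1)/(10.3 - 4.1) = 0.92, well above the
           # tabulated critical values, so 4.1 is flagged as extreme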
The third screening category, a posteriori, was
established to provide a mechanism for overriding the
automated flagging schemes which have been implemented
in the instrumentation at the remote sites and in the
data screening module. From a review of station logs
and preventive maintenance records, a knowledge of
unusual events, or through visual inspection of data,
it may be determined that previously valid data
should be flagged as questionable. Conversely,
it may be determined that previously invalid data
should be validated by removing existing flags. An
example of when data would be invalidated is when an
instrument, such as a wind direction indicator, becomes
misaligned or uncalibrated because of some non-linear
or unknown reason. Removal of flags or revalidation
can occur, for example, when the recording instruments
function properly, but the sense bit or analog status
circuitry is known to have malfunctioned.
Implementation of a posteriori changes of RAMS
data requires special software, inserted in the data
flow during data processing, screening and archiving,
or during a special pass through the data after
archival.
Data Flagging
Embedded in the data base structure must be
a flagging mechanism which can distinguish the various
data screens. In general, data which have been
filtered by the various screens must be either
removed from the data base or qualified by attaching
a uniquely identifying flag.
Two details of the RAMS archival data base
are important to understanding the implementation of
the RAMS data flagging. First, all data are stored
as floating point numbers in Univac internal
binary representation. Floating point notation is a
natural representation of numerical data and can
readily accommodate a variety of flagging schemes.
Second, each potential measurement has a reserved
location in the data base. Thus, substitutions must
be made for missing and removed data. For instance,
RAMS data rejected by the gross limit checks are
removed and replaced by a value of 10^34.
Data which have been screened and which are
not obviously impossible may have limited application
and should not be eliminated from the data base. If
each screening test can be associated with a unique
flag, then modelers and other users can establish
their own criteria for accepting or rejecting the
flagged data.
Three flagging mechanisms suggest themselves
when the value of the measurement is to be retained:
(1) exponent offset, (2) range offset, and (3) binary
bit encoding. These techniques are listed for
reference only. A full description and comparison of
these techniques is being prepared.
Exponent offsetting, which is used for RAMS
data, is accomplished by multiplying the value by a
power of 10. Special considerations must be given to
the dynamic range of the data and to values which are
identically zero. The flags which are associated
with each of the individual screens are listed in
Table 2.
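
The exponent-offset idea might look as follows in Python (a sketch of
our own; the offset power, the assumed dynamic range, and the zero
handling are illustrative, not the values used in the RAMS archive).

MAX_RAW = 1.0e6          # assumed upper bound of unflagged data values
FLAG_FACTOR = 1.0e10     # hypothetical offset for one screening test

def flag(value):
    # qualify a nonzero value while retaining its magnitude;
    # values identically zero need special handling, as noted above
    return value * FLAG_FACTOR

def unflag(value):
    if abs(value) > MAX_RAW:
        return value / FLAG_FACTOR, True
    return value, False

v = flag(0.042)
print(unflag(v))   # (0.042, True): value recovered, flag detected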
Implementing the Screening Module Into
the Data Flow
We emphasize the importance of considering
the sequence in which screening is integrated into
the data flow by considering a generalized data flow
diagram, or basic system design, which is applicable
to any environmental measurement system. This flow
diagram is Illustrated in Figure 3.
15-15

-------
Figure 3. Generalized data flow for environmental measurement systems.
Data screening should take place as near to
data acquisition as possible either in data processing,
which is traditionally concerned with laboratory
analysis, conversion to engineering units, transcribing
intermediate results, etc., or in a separate module,
as illustrated, designed specifically for the screening
process. Screening data soon after data acquisition
permits system feedback in the form of corrective
maintenance, changes to control processes, and even
to changes in system design. This feedback is
essential to minimize the amount of lost or marginally
acceptable data.
The RAMS screening tests, which have been
developed at Research Triangle Park (RTP), are now
part of the data processing carried out at the RAPS
central facility in St. Louis. Slow computation
speeds of the St. Louis PDP 11/40 computer required
restricting the intrastation screening tests to hourly
average data. RAMS data is still passed through the
RTP screening module before archiving.
SUMMARY
The experiences gained in RAMS and applicable to
other monitoring systems are:
1.	Data validity is a function of quality
assurance and data screening.
2.	A QA plan and data screening rules should
be established initially and maintained throughout
the program.
3.	The QA plan and screening rules are dynamic,
being improved as additional knowledge and experience
is gained.
4.	Applied during data acquisition or shortly
thereafter, quality control and screening checks
constitute an important feedback mechanism, indicating
a requirement for corrective action.
REFERENCES
1.	Burton, C.S. and G.M. Hidy. Regional Air
Pollution Study Program Objectives and Plans,
EPA 630/3-75-009, Dec. 1974.
2.	Thompson, J.E. and S.L. Kopczynski. The Role of
Aerial Platforms in RAPS, Presented at an EPA
meeting on Monitoring from Las Vegas, Nevada,
March 1975 (unpublished).
3.	Meyers, R.L. and J.A. Reagan. Regional Air
Monitoring System at St. Louis, Missouri,
International Conference on Environmental Sensing
and Assessment, Sept. 1975 (unpublished).
4.	Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I, Principles,
EPA 600/9-76-005, March 1976.
5.	von Lehmden, D.J., R.C. Rhodes and S. Hochheiser.
Applications of Quality Assurance in Major Air
Pollution Monitoring Studies - CHAMP and RAMS,
International Conference on Environmental Sensing
and Assessment, Las Vegas, Nevada, Sept. 1975.
6.	Audit and Study of the RAMS/RAPS Programs and
Preparation of a Quality Assurance Plan for RAPS,
Research Triangle Institute, Research Triangle
Park, N.C. 27707, EPA Contract No. 68-02-1772.
7.	Ott, W.R. Selection of Probability Models for
Determining Quality Control Data Screening
Range Limits, Presented at 88th Meeting of the
Association of Official Analytical Chemists,
Washington, D.C., Oct. 1974.
8.	Suta, B.E. and G.V. Lucha. A Statistical
Approach for Quality Assurance of STORET-Stored
Parameters, SRI, EPA Contract No. 68-01-2940,
Jan. 1975.
9.	Grubbs, F.E. Procedures for Detecting
Outlying Observations in Samples, Technometrics
11 (1), 1-21, 1969.
10.	Anscombe, F.J. Rejection of Outliers,
Technometrics 2 (2), 123-147, 1960.
11.	Dixon, W.J. Processing Data for Outliers,
Biometrics 9 (1), 74-89, 1953.
15-16

-------
QUALITY COSTS
LESSON GOAL:	To familiarize students with the concept of quality
costs and with considerations when establishing a quality
cost system.
LESSON OBJECTIVES:	At the end of this lesson, you should be able to:
1.	Recall the three (3) types of cost that
compose the total cost per measurement
result of an air quality measurement
system.
2.	Describe the relationship between unacceptable
data cost and quality assurance costs.
3.	Explain the purposes of a quality cost system.
4.	List and define the three (3) cost categories of
a quality cost system.
5.	Identify at least two (2) groups of activities
which are related to each of the three cost
categories.
6.	Describe a procedure for establishing a quality
cost system.
16-1

-------
$
QUALITY
COSTS
QUALITY COSTS

[chart: as data quality improves from poor to excellent,
unacceptable-data costs fall while prevention and
appraisal costs rise]
QUALITY
RELATED
COSTS
0
Prevention Cost Groups
Preventive
Maintenance
Training
Procurement
Specs/
Acceptance
Planning/
Documentation
System
Calibration/
Operation
Appraisal Cost Groups
QA Status/
Reporting
Quality
Control
Validation
Audit Procedures
16-3

-------
Failure Cost Groups
Problem Investigation
ACCUMULATION
OF COSTS
•	lost data costs
•	other costs
Lost
Data
• Fd = f × B
Where:
Fd = lost data cost
f = % lost data
B = part of
network budget
associated
with lost data
• Prorate
Personnel
Salaries
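
The lost-data cost formula above, Fd = f × B, worked in a short
Python sketch (figures invented):

f = 0.12          # fraction of data lost during the period (12%)
B = 250000.0      # hypothetical budget share tied to the lost data
Fd = f * B        # lost data cost
print(Fd)         # 30000.0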
COST EFFECTIVENESS
16-4

-------
QUALITY COST
REPORTING
•	facts obtained from source
documents
•	reports understandable
at a glance
•	facts summarized
•	graphs preferred
QUALITY COST TREND CHART
[bar chart of quality costs by quarter, 1979-80]

-------
80-43.3
GUIDELINES FOR IMPLEMENTING A QUALITY
COST SYSTEM FOR ENVIRONMENTAL MONITORING PROGRAMS
Presented at 73rd APCA Annual Meeting
and Exhibition in Montreal, Quebec,
Canada, June 1980
Ronald B. Strong
Research Triangle Institute
J. Harold White
Research Triangle Institute
Franklin Smith
Research Triangle Institute
Raymond C. Rhodes
U. S. Environmental Protection Agency
Messrs. Strong, White, and Smith are with the Research Triangle
Institute, P.O. Box 1294, Research Triangle Park, North Carolina
27709.
Mr. Raymond C. Rhodes is in the Quality Assurance Division,
Environmental Monitoring Systems Laboratory, U. S. Environmental
Protection Agency, Mail Drop 77, Research Triangle Park, North
Carolina, 27711.
16-7

-------
80-43.3
GUIDELINES FOR IMPLEMENTING A QUALITY
COST SYSTEM FOR ENVIRONMENTAL MONITORING PROGRAMS
Introduction
Program managers in governmental agencies and industrial organiza-
tions involved in environmental measurement programs are concerned with
overall program cost-effectiveness, including total cost, data quality, and
timeliness. There are several costing techniques designed to aid the man-
ager in monitoring and controlling program costs. One particular tech-
nique specifically applicable to the operational phase of a program is a
quality cost system.
The objective of a quality cost system for an environmental monitor-
ing program is to minimize the cost of those operational activities
directed toward controlling data quality while maintaining an acceptable
level of data quality. The basic concept of the quality cost system is to
minimize total quality costs through proper allocation of planned expendi-
tures for the prevention and appraisal efforts in order to control the
unplanned correction costs. That is, the system is predicated on the idea
that prevention is cheaper than correction.
There is no pre-set formula for determining the optimum mode of oper-
ation. Rather, the cost effectiveness of quality costs is optimized
through an iterative process requiring a continuing analysis and evalua-
tion effort. Maximum benefits are realized when the system is applied to
a specific measurement method in a stable long term monitoring program.
For example, a monitoring program with a fixed number of monitoring sites,
scheduled to operate for a year or more, would be a desirable candidate
for a quality cost system.
Quality costs for environmental monitoring systems have been treated
by Rhodes and Hochheiser.1 The purpose of this paper is to present
guidelines for the implementation of a quality cost system. The contents
of this paper are based on work performed by the Research Triangle Insti-
tute under contract to the U.S. Environmental Protection Agency.2
Structuring of Quality Costs
The first step in developing a quality cost system is identifying
the cost of quality-related activities, including all operational activi-
ties that affect data quality, and dividing them into the major cost cate-
gories.
Costs are divided into category, group, and activity. Category, the
most general classification, refers to the standard cost subdivisions of
prevention, appraisal, and failure. The category subdivision of costs
provides the basic format of the quality cost system. Activity is the
most specific classification and refers to the discrete operations for
which costs should be determined. Similar types of activities are summa-
rized in groups for purposes of discussion and reporting.
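This category/group/activity hierarchy maps naturally onto a nested
data structure. The following minimal sketch (in Python; the layout and
the illustrative activity names and dollar figures are our own, and only
the category and group names come from this paper) shows one way to
record activity-level costs and roll them up by category:

    # Category -> group -> activity -> cost. Category and group names
    # follow the paper; activities and amounts are illustrative only.
    quality_costs = {
        "prevention": {
            "planning_and_documentation": {"write_procedures": 120.0},
            "preventive_maintenance": {"analyzer_cleaning": 310.0},
        },
        "appraisal": {
            "qc_measures": {"daily_zero_span_check": 240.0},
            "data_validation": {"range_checks": 95.0},
        },
        "failure": {
            "problem_investigation": {"trace_span_drift": 150.0},
            "lost_data": {"invalidated_hours": 900.0},
        },
    }

    def totals_by_category(costs):
        # Sum activity costs within each group, then across groups.
        return {
            category: sum(sum(acts.values()) for acts in groups.values())
            for category, groups in costs.items()
        }

    print(totals_by_category(quality_costs))
    # {'prevention': 430.0, 'appraisal': 335.0, 'failure': 1050.0}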
16-8

-------
80-43.3
Cost Categories
The quality cost system structure provides a means for identifica-
tion of quality-related activities and for organization of these activi-
ties into prevention, appraisal, and failure cost categories. These cate-
gories are defined as follows:
o Prevention Costs—Costs associated with planned activities whose
purpose is to ensure the collection of data of acceptable quality
and to prevent the generation of data of unacceptable quality.
o Appraisal Costs—Costs associated with measurement and evaluation of
data quality. This includes the measurement and evaluation of
materials, equipment, and processes used to obtain quality data.
o Failure Costs—Costs incurred directly by the monitoring agency or
organization producing the failure (unacceptable data).
Cost Groups
Quality cost groups provide a means for subdividing the costs within
each category into a small number of subcategories, which eliminates the
need for reporting quality costs on a specific activity basis. Although
the groups listed below are common to all environmental measurement
methods, the specific activities included in each group may differ between
methods.
Groups within prevention costs. Prevention costs are subdivided
into five groups:
o Planning and Documentation—Planning and documentation of procedures
for all phases of the measurement process that may have an effect
on data quality.
o Procurement Specification and Acceptance—Testing of equipment parts,
materials, and services necessary for system operation. This
includes the initial on-site review and performance test, if any.
o Training—Preparing or attending formal training programs, evaluation
of the training status of personnel, and informal on-the-job
training.
o Preventive Maintenance—Equipment cleaning, lubrication, and parts
replacement performed to prevent (rather than correct) failures.
o System Calibration—Calibration of the monitoring system, the fre-
quency of which could be adjusted to improve the accuracy of the
data being generated. This includes initial calibration, routine
calibration checks, and a protocol for tracing the calibration
standards to primary standards.
Groups within appraisal costs. Appraisal costs are subdivided into
four groups:
o Quality Control (QC) Measures—QC-related checks to evaluate measure-
ment equipment performance and procedures.
o Audit Measures—Audit of measurement system performance by persons
outside the normal operating personnel.
o Data Validation—Tests performed on processed data to assess its
correctness.
16-9

-------
80-43.3
o Quality Assurance (QA) Assessment and Reporting—Review, assessment,
and reporting of QA activities.
Groups within failure costs. Under most quality cost systems, the
failure category is subdivided into internal and external failure costs.
Internal failure costs are those costs incurred directly by the agency or
organization producing the failure.
Internal failure costs are subdivided into three groups:
o Problem Investigation—Efforts to determine the cause of poor data
quality.
o Corrective Action—Cost of efforts to correct the cause of poor data
quality, implementing solutions, and measures to prevent problem
recurrence.
o Lost Data—The cost of efforts expended for data which was either
invalidated or not captured (unacquired and/or unacceptable data).
This cost is usually prorated from the total operational budget of
the monitoring organization for the percentage of data lost (see
the sketch below).
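The slide formula Fd = f x B presented earlier in this lesson makes
this proration concrete: the lost data cost is the fraction of data lost
multiplied by the part of the budget associated with that data. A
minimal sketch in Python (the numbers are illustrative only and are not
taken from the example later in this paper):

    def lost_data_cost(fraction_lost, budget_share):
        # Fd = f x B: prorate the associated budget share by the
        # fraction of data that was invalidated or not captured.
        return fraction_lost * budget_share

    # Illustrative only: 15% of the data lost against a $40,000
    # quarterly operating budget.
    print(lost_data_cost(0.15, 40_000))  # -> 6000.0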
External failure costs are associated with the use of poor quality
data external to the monitoring organization or agency collecting the
data. In air monitoring work these costs are significant but are diffi-
cult to quantify systematically. Therefore, this paper will only address
failure costs internal to the monitoring agency. However, external
failure costs are important and should be considered when making decisions
on additional efforts necessary for increasing data quality or for the
allocation of funds for resampling and/or reanalysis.
Examples of external failure cost groups are:
o Enforcement Actions—Cost of attempted enforcement actions lost due
to questionable monitoring data.
o Industry—Expenditures by industry as a result of inappropriate or
inadequate standards established with questionable data.
o Historical Data—Loss of the data base used to determine trends and
the effectiveness of control measures.
Cost Activities
Examples of specific quality-related activities which affect data
quality are presented in Table I. These activities are provided as a
guide for implementation of a quality cost system for an air quality
program utilizing continuous monitors. Uniformity across agencies and
organizations in the selection of activities is desirable and encouraged;
however, variations may exist, particularly between monitoring agencies
and industrial/research projects.
Agencies should make an effort to maintain uniformity regarding the
placement of activities in the appropriate cost group and cost category.
This will provide a basis for future "between agency" comparison and
evaluation of quality cost systems.
Development and Implementation of the Quality Cost System
Guidelines are presented in this section for the development and
implementation of a quality cost system. These cover planning the system,
selecting applicable cost activities, identifying sources of quality cost
data, and tabulating and reporting the cost data.
16-10

-------
80-43.3
Planning
Implementation of a quality cost system need not be expensive or
time consuming. It can be kept simple if existing data sources are used
wherever possible. The importance of planning cannot be overemphasized.
For example, implementation of the quality cost system will require close
cooperation between the quality cost system manager and other managers or
supervisors. Supervisors should be thoroughly briefed on quality cost
system concepts, benefits, and goals.
System planning should include the following activities:
o Determining the scope of the initial quality cost program.
o Setting objectives for the quality cost program.
o Evaluating existing cost data.
o Determining sources to be utilized for the cost data.
o Deciding on the report formats, distribution, and schedule.
To gain experience with quality cost system techniques, an initial
pilot program could be developed for a single measurement method or
project within the agency. The unit selected should be representative,
i.e., exhibit expenditures for each cost category: prevention, appraisal,
and failure. Once a working system for the initial effort has been
established, a full-scale quality cost system can then be implemented.
Activity Selection
The first step for a given agency implementing a quality cost system
is to prepare a detailed list of the quality-related activities most
representative of the agency's monitoring operation and to assign these
activities to the appropriate cost groups and cost categories. Worksheets
and cost summaries for collecting and tabulating cost data for specific
measurement methods will then need to be designed, and methods developed
to accumulate the costs as easily as possible. Ultimately, the most
important step is the analysis of the accumulated costs, discussed in the
next section.
The general definitions of the cost groups and cost categories,
presented in the previous section, are applicable to any measurement
system. Specific activities contributing to these cost groups and
categories, however, may vary significantly between agencies, depending on
the scope of the cost system, the magnitude of the monitoring network, the
parameters measured, and the duration of the monitoring operation. The
activities listed in Table I are provided as a guide only, and they are
not considered to be inclusive of all quality-related activities. An
agency may elect to add or delete certain activities from this list. It
is important, however, for an agency to maintain uniformity regarding the
cost groups and categories the activities are listed under. As indicated
previously, this will provide a basis for future cost system comparison
and evaluation.
Quality Cost Data Sources
Most accounting records do not contain cost data detailed enough to
be directly useful to the operating quality cost system. Some further
calculation is usually necessary to determine actual costs which may be
entered on the worksheets. The cost of a given activity is usually
16-11

-------
80-43.3
estimated by prorating the person's charge rate by the percentage of time
spent on that activity. A slightly rougher estimate can be made by using
average charge rates for each position instead of the actual rates.
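For example, the proration might be computed as in the following
minimal sketch (the charge rate, hours, and time fraction are invented
for illustration and are not taken from this paper):

    def activity_cost(hours_in_period, charge_rate, fraction_on_activity):
        # Prorate a person's charge rate by the share of time spent
        # on the activity during the reporting period.
        return hours_in_period * charge_rate * fraction_on_activity

    # Illustrative only: an operator at $12/hour spending 15% of a
    # 520-hour quarter on daily zero/span checks.
    print(activity_cost(520, 12.00, 0.15))  # -> 936.0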
Failure costs are more difficult to quantify than either prevention
or appraisal costs. The internal failure cost of lost data (unacquired
and/or unacceptable data), for example, must be estimated from the total
budget.
Cost Accumulation and Tabulation
Cost collection and tabulation methods should be kept simple and
conducted within the framework of the agency's normal reporting format
whenever possible. During initial system development, a manual approach
will allow needed flexibility, whereas automatic quality cost data
tabulation would be complicated, since many of the quality-related
activities are not typical in existing accounting systems. Automatic
tabulation of costs may be practical after the basic quality cost system
has been developed.
Also, an effective cost system does not require precise cost account-
ing. Reasonable cost estimates are adequate when actual cost records are
not available.
Worksheets and summaries used to collect and tabulate the cost data
should be designed to represent expenditures by activity.
Quality Cost Worksheets
Worksheets for collecting and tabulating quality cost data should be
prepared for each specific measurement method. The worksheet should be
designed to allow cost tabulation for each quality-related activity
performed and to accommodate more than one personnel level per activity.
In addition, activities should be organized into appropriate cost groups
and cost categories so that when total costs are computed, they can be
transferred directly to cost summaries later.
Quality Cost Analysis Techniques
Techniques for analyzing and evaluating cost data range from simple
charts comparing the major cost categories to sophisticated mathematical
models of the total program. Common techniques include trend analysis and
Pareto analysis.
Trend analysis. Trend analysis compares present to past quality
expenditures by category. A history of quality cost data, typically a
minimum of 1 year, is required for trend evaluation. (An example is given
in Figure 1 of the next section.)
Cost categories are plotted within the time frame of the reporting
period (usually quarterly). Costs are plotted either as total dollars (if
the scope of the monitoring program 1s relatively constant) or as
"normalized" dollars/data unit (1f the scope may change). 6roups and
activities within the cost categories contributing the highest cost
proportions are plotted separately.
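A minimal plotting sketch of such a trend chart, using the category
totals transcribed from Figure 1 (the use of matplotlib and the chart
styling are our own choices, not prescribed by the paper):

    import matplotlib.pyplot as plt

    quarters = ["2Q 1978", "3Q 1978", "4Q 1978", "1Q 1979"]
    series = {
        "Prevention": [1_842, 1_876, 1_973, 3_576],   # Figure 1 totals
        "Appraisal":  [5_292, 5_821, 5_766, 7_610],
        "Failure":    [15_370, 17_113, 16_287, 10_256],
    }

    for label, costs in series.items():
        plt.plot(quarters, costs, marker="o", label=label)
    plt.ylabel("Quality cost (dollars per quarter)")
    plt.legend()
    plt.title("Quality cost trends (cf. Figure 2)")
    plt.show()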
16-12

-------
80-43.3
Pareto analysis. Pareto analysis identifies the areas with the greatest
potential for quality improvement by:
o Listing factors and/or cost segments contributing to a problem area.
o Ranking factors according to the magnitude of their contribution.
o Directing corrective action toward the largest contributor.
Pareto techniques may be used to analyze prevention, appraisal, or
failure costs. They are most logically applied to the failure cost
category, since the relative costs associated with activities in the
failure category indicate the major sources of data quality problems.
Typically, relatively few contributors will account for most of the
failure costs.3,4 (An example is given in Figure 3 of the next
section.)
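A minimal sketch of the ranking step, using the first-quarter 1979
failure costs from Figure 1 of the next section:

    failure_costs = {                 # 1st quarter 1979, Figure 1
        "Lost data": 9_006,
        "Problem investigation": 704,
        "Corrective action": 546,
    }

    total = sum(failure_costs.values())
    cumulative = 0
    # Rank contributors by magnitude; print cumulative percentages.
    for name, cost in sorted(failure_costs.items(),
                             key=lambda item: item[1], reverse=True):
        cumulative += cost
        print(f"{name:22s} {cost:6,d}  {100 * cumulative / total:5.1f}%")
    # Lost data alone is about 88% of failure cost: the "vital few."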
Quality Cost Reports
Quality cost reports prepared and distributed at regular intervals
should be brief and factual, consisting primarily of a summary discussion,
a tabulated data summary, and a graphic representation of cost category
relationships, trends, and data analysis. The summary discussion should
emphasize new or continuing problem areas and progress achieved during the
reporting period.
Written reports should be directed toward specific levels of
management. Managers and supervisors receiving reports should be
thoroughly briefed on the concepts, purpose, and potential benefits of a
quality cost system, i.e., identification of quality-related problems,
potential input into problem solutions, and quality cost budgeting.
Quality Cost System Example
A hypothetical case history of a quality cost system is presented in
this section. In this example, a cost system is developed for an agency
operating sixteen sulfur dioxide monitoring stations. The stations are
located within a 50-mile radius, and each is equipped with a continuous
sulfur dioxide monitor. The monitoring network has been in operation for
2 years.
The QA coordinator is given the responsibility for implementing the
quality cost system and plans the implementation of the pilot cost
system. Planning for the system includes selecting cost activities,
determining cost methods, and establishing procedures for maintaining
the system.
To establish a historical basis, quality costs are estimated for the
past year. This allows for trend observation over an adequate period of
time. These costs are shown (see Figure 1) and discussed in the following
paragraphs.
Unacceptable data costs are a major cost group in the failure
category. In order to establish the value of "lost data," the overall
monitoring budget is determined from contracts, accounting documents, and
other source documents. Table II summarizes total monitoring costs for
the criteria pollutants; the sulfur dioxide costs are used in this
example quality cost system. The cost data includes the maximum possible
number of data units and the cost per data unit.
Quality-related costs are estimated for each quarter over the
preceding year. The estimated costs are subject to the following consid-
erations:
16-13

-------
80-43.3
o Estimates of time spent by an operator performing a specific activity
take into account the capability of the operator to perform
several activities simultaneously. For example, an operator
performing daily analyzer zero/span checks will have time to
perform other duties while the analyzers stabilize to the
zero/span inputs.
o The activities are performed by three personnel types: manager,
supervisor, and operator. The cost per hour for each level is
consistent with "Cost of Monitoring Air Quality in the United
States."5
Analysis and evaluation of the collected cost data will determine
several facts about the example agency's quality effort. The cost data
should reflect the present status of the quality program, where major
problem areas exist, and what immediate goals should be established.
A graph of the expenditures for each cost category is shown in Figure
2. Throughout the preceding year, prevention costs were relatively small,
appraisal costs were moderate, and failure costs were significant. Also,
failure costs showed an increasing trend throughout the year.
A Pareto distribution of the failure costs (Figure 3) shows that the
major cost contributor is "lost" data. The "lost" data cost represents
over 80 percent of the total failure costs. Although the "lost" data
represents less than 20 percent of the total data possible, the cost of
this loss is significant.
An investigation determines the major cause of the problem to be a
shortage of station operators. The workload of the one full-time operator
does not allow adequate time for an effective preventive maintenance
program. The lack of proper preventive maintenance increases the
frequency of analyzer/equipment failure, resulting in an additional work-
load for the station operator, i.e., equipment repair.
The quality manager prepares a quality cost report covering the
initial study results. The report presents several recommendations,
including:
o Hire and train an additional operator.
o Increase prevention efforts for the monitoring operation.
o Reduce failure costs 50% by the end of the next reporting period.
During the following quarter, an additional operator was hired and
trained. Preventive maintenance procedures were reviewed and modified as
required. At the end of this reporting period, quality costs were
collected, analyzed, and evaluated. The quality cost report covering this
reporting period shows that failure costs were reduced 37%, prevention
costs were increased 81%, and appraisal costs were increased 32%. A net
decrease in total quality cost, amounting to $2,584 (11%), was experienced
for the quarter, as seen in Figure 1 when comparing the first quarter of
1979 with the fourth quarter of 1978.
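These percentages follow directly from the Figure 1 category totals,
as the following minimal check shows (the dollar figures are transcribed
from Figure 1; the code itself is only an illustration):

    q4_1978 = {"prevention": 1_973, "appraisal": 5_766, "failure": 16_287}
    q1_1979 = {"prevention": 3_576, "appraisal": 7_610, "failure": 10_256}

    for category in q4_1978:
        change = (q1_1979[category] - q4_1978[category]) / q4_1978[category]
        print(f"{category}: {change:+.0%}")
    # prevention: +81%   appraisal: +32%   failure: -37%

    old_total, new_total = sum(q4_1978.values()), sum(q1_1979.values())
    print(new_total - old_total)                         # -> -2584
    print(f"{(new_total - old_total) / old_total:.0%}")  # -> -11%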
The changes in category expenditures (Figure 2) reflect specific
corrective measures initiated during the reporting period. These measures
included hiring and training an additional operator and increasing the
preventive maintenance effort.
Although the unacceptable data costs were decreased significantly,
these costs are still excessive, and a preliminary analysis of the latest
sulfur dioxide data indicates that additional effort in preventive
maintenance is necessary to further reduce the network's operating costs.
16-14

-------


TOTAL QUALITY COST SUMMARY
(Combined network costs, 1978-79)

COST GROUP                         2nd Quarter  3rd Quarter  4th Quarter  1st Quarter
PREVENTION
  Planning & documentation               ---          ---          ---          179
  Procurement                            ---          ---          ---          179
  Training                               ---          ---          ---          459
  Preventive maintenance                 588          559          587        1,046
  System calibration and operation     1,254        1,317        1,386        1,713
  TOTAL PREVENTION COSTS               1,842        1,876        1,973        3,576
APPRAISAL
  QC measures                            768          806          742        1,631
  Audits                               1,308        1,508        1,470        1,913
  Data validation                      1,468        1,668        1,868        1,887
  QA assessment & reporting            1,748        1,839        1,686        2,179
  TOTAL APPRAISAL COSTS                5,292        5,821        5,766        7,610
FAILURE
  Problem investigation                1,579        1,886        1,760          704
  Corrective action                    1,361        1,334        1,365          546
  Lost data (unacquired data)         12,430       13,893       13,162        9,006
  TOTAL FAILURE COSTS                 15,370       17,113       16,287       10,256
TOTAL QUALITY COSTS                   22,504       24,810       24,026       21,442
MEASUREMENT BASES
  Total program cost per quarter:     48,304
  Total data units per quarter:       33,792

Figure 1. Total quality cost summary.

-------
80-43.3
Figure 2. Quality cost trends. (Line graph of total, failure,
appraisal, and prevention costs by quarter, 1978-1979.)
Figure 3. Failure cost distribution. (Pareto chart: percent of total
failure costs by contributor.)
16-16

-------
80-43.3
TABLE I. Examples of quality-related activities by cost category
and cost group. (The table text is not legibly reproduced in the
source.)
16-17

-------
TABLE II. Total monitoring cost (dollars).

             Annualized      Maximum
             Total Cost      Data Units      Cost Per
Pollutant    Per Station     Per Station*    Data Unit
CO               9,969          8448            1.18
SO2             12,076          8448            1.43
O3               8,713          8448            1.03
TSP              1,535            61           25.10
NO2              8,757          8448            1.04
THC              9,231          8448            1.09

TOTAL FOR SO2 = $12,076 x 16 = $193,216
* Maximum data units for continuous analyzers based on
total possible hourly averages per year.
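The cost-per-data-unit column is simply the annualized station cost
divided by the maximum data units, and the network figure scales by the
station count. A minimal check against the sulfur dioxide row of Table II:

    annual_cost_per_station = 12_076   # SO2 row of Table II, dollars
    max_data_units = 8_448             # possible hourly averages per year
    stations = 16

    print(round(annual_cost_per_station / max_data_units, 2))  # -> 1.43
    print(annual_cost_per_station * stations)                  # -> 193216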
16-18

-------
80-43.3
Summary
The first step in implementing a quality cost system for an environ-
mental monitoring program is to categorize quality-related activities into
prevention, appraisal, and failure categories. An example listing for
measurement methods involving continuous gaseous analyzers is given in
this paper. Major items to be considered when implementing a system have
been discussed, along with an example quality cost system. Management
should place emphasis on preventive activities to decrease the total cost
of quality-related activities.
16-19

-------
80-43.3
References
1.	Rhodes, Raymond C. and Seymour Hochheiser. "Quality Costs for
Environmental Monitoring Systems," American Society for Quality
Control, Technical Conference, Vol. 77, 1971, p. 101.
2.	Strong, R. B., J. H. White, and F. Smith. "Guidelines for the
Development and Implementation of a Quality Cost System for Air
Pollution Measurement Programs," Research Triangle Institute,
Research Triangle Park, North Carolina, 1980, EPA Contract No.
68-02-2722.
3.	American Society for Quality Control, Quality Costs Technical
Committee. "Guide for Reducing Quality Costs," Milwaukee, Wisconsin,
1977.
4.	American Society for Quality Control, Quality Cost-Effectiveness
Committee. "Quality Costs—What and How," Milwaukee, Wisconsin,
1977.
5.	PEDCo Environmental, Inc. "Cost of Monitoring Air Quality in the
United States," December 1978.
16-20

-------
APPENDIX

-------
EPA 600/4-80-009
February 1980
THE QUALITY ASSURANCE BIBLIOGRAPHY
by
Monitoring Technology Division
Office of Research and Development
Washington, D.C. 20460
OFFICE OF MONITORING AND TECHNICAL SUPPORT
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
A-1

-------
DISCLAIMER
This report has been reviewed by the Office of Research and
Development, EPA, and approved for publication. Approval does not signify
that the contents necessarily reflect the views and policies of the
Environmental Protection Agency, nor does mention of trade names or
commercial products constitute endorsement or recommendation for use.
A-2

-------
Section I
AIR - QUALITY ASSURANCE PUBLICATIONS - GENERAL
Environmental Monitoring Systems Laboratory
Research Triangle Park, North Carolina (EMSL-RTP)
Reference
EPA 600/4-77-027a
NTIS-PB-273518/AS
EPA 600/4-77-027b
EPA 600/9-76-005
NTIS-PB-254658/AS
Report Title
Quality Assurance Guidelines for Air Pollution
Measurement Systems, Volume II - Ambient Air
Specific Methods
May 1977
Quality Assurance Guidelines for Air Pollution
Measurement Systems, Volume III - Stationary
Source Specific Methods
August 1977
Quality Assurance Handbook for Air Pollution
Measurement Systems, Volume I - Principles
March 1976
EPA 650/4-74-016
NTIS-PB-235774/7BE
EPA 650/4-74-005q
Concepts for Development of Field Usable Test
Atmosphere Generation Devices
December 1973
Guidelines for Development of Quality Assurance
Program for Stationary Source Characterization
Techniques
November 1976
EPA 650/4-74-005p
Guidelines for Development of a Quality Assurance
Program, Volume XVI - Method for the Determination
of Nitrogen Dioxide in the Atmosphere (Sodium
Arsenite Procedure)
March 1976
EPA 650/4-74-005o
NTIS-PB-256859/OBE
Guidelines for Development of a Quality Assurance
Program, Volume XV - Determination of Sulfur
Dioxide Emissions from Stationary Sources by
Continuous Monitors
March 1976
A-3

-------
Section 1
Air - Continued
Reference
EPA 650/4-74-005n
NTIS-PB-244167/3BE
EPA 650/4-74-005m
NTIS-PB-241264/1BE
EPA 650/4-74-005l
NTIS-PB-240751/8BE
EPA 650/4-74-005k
NTIS-PB-256858/2BE
EPA 650/4-74-005i
NTIS-PB-257758/AS
EPA 650/4-74-005h
NTIS-PB-257757/5BE
EPA 650/4-74-005g
NTIS-PB-257803/7BE
EPA 650/4-74-005f
NTIS-PB-257756/7BE
Report Title
Guidelines for Development of a Quality Assurance
Program, Volume XIV - Determination of Lead in
Gasoline by Colorimeter
February 1975
Guidelines for Development of a Quality Assurance
Program, Volume XIII - Determination of Lead
Gasoline by A.A.
November 1974
Guidelines for Development of a Quality Assurance
Program, Volume XII - Determination of Phosphorous
in Gasoline by A.A.
November 1974
Guidelines for Development of a Quality Assurance
Program, Volume XI - Determination of Beryllium
Emissions from Stationary Sources
April 1976
Guidelines for Development of a Quality Assurance
Program, Volume IX - Visual Determination of
Opacity of Emissions from Stationary Sources
November 1975
Guidelines for Development of a Quality Assurance
Program, Volume VIII - Determination of CO Emis-
sions from Stationary Sources by NDIR Spectrometry
February 1975
Guidelines for Development of a Quality Assurance
Program, Volume VII - Determination of Sulfuric
Acid Mist and Sulfur Dioxide Emissions from
Stationary Sources
March 1976
Guidelines for Development of a Quality Assurance
Program, Volume VI - Determination of Nitrogen
Oxide Emissions from Stationary Sources
December 1975

-------
Section I
Air - Continued
Reference
EPA 650/4-74-005e
NTIS-PB-257755/9BE
EPA 650/4-74-005d
NTIS-PB-256857/4BE
EPA 650/4-74-005c
NTIS-PB-240354/1BE
EPA 650/4-74-005b
NTIS-PB-235755/6BE
EPA 650/4-74-005a
NTIS-PB-232437/4BE
EPA-R4-73-028e
NTIS-PB-237351/2BE
EPA-R4-73-028d
NTIS-PB-226486/9BA
EPA-R4-73-028c
NTIS-PB-222336/0BA
Report Title
Guidelines for Development of a Quality Assurance
Program, Volume V - Determination of Sulfur
Dioxide Emissions from Stationary Sources
November 1975
Guidelines for Development of a Quality Assurance
Program, Volume IV - Determination of Particulate
Emissions from Stationary Sources
August 1974
Guidelines for Development of a Quality Assurance
Program, Volume III - Determination of Moisture
March 1974
Guidelines for Development of a Quality Assurance
Program, Volume II - Gas Analysis for Carbon
Dioxide, Excess Air, and Dry Molecular Weight
February 1974
Guidelines for Development of a Quality Assurance
Program, Volume I - Determination of Stack Gas
Velocity and Volumetric Flow Rate (Type-S Pitot
Tube)
February 1974
Guidelines for Development of a Quality Assurance
Program - Measuring Pollutants for Which National
Ambient Air Quality Standards Have Been Promul-
gated - Final Report
August 1973
Guidelines for Development of a Quality Assurance
Program - Reference Method for the Determination of
Sulfur Dioxide in the Atmosphere
August 1973
Guidelines for Development of a Quality Assurance
Program - Reference Method for the Measurement of
Photochemical Oxidants
June 1973
A-5

-------
Section I
Air - QA Continued
Reference	Report Title
EPA-R4-73-028b	Guidelines for Development of a Quality Assurance
NTIS-PB-223051/4BA	Program - Reference Method for the Determination
of Suspended Particulates in the Atmosphere
June 1973
EPA-R4-73-028a	Guidelines for Development of a Quality Assurance
NTIS-PB-222512/6BA	Program - Reference Method for the Continuous
Measurement of Carbon Monoxide in the Atmosphere
June 1973
A-6

-------
Section II
AIR - QUALITY ASSURANCE - STATIONARY SOURCES
(EMSL-RTP)
Reference	Report Title
EPA 600/4-78-023
NTIS-PB-283643/AS
EPA 600/4-77-050
NTIS-PB-278849/AS
EPA 600/4-77-048
NTIS-PB-276745/AS
EPA 600/4-77-030
NTIS-PB-273070/AS
EPA 600/4-77-026
NTIS-PB-271513/4BE
EPA 600/4-77-022
NTIS-PB-270959/AS
EPA 600/4-77-008b
NTIS-PB-266887/AS
EPA 600/4-77-008a
NTIS-PB-266886/AS
EPA 600/4-77-007
NTIS-PB-268240/AS
Survey of Transmissometers Used in Conducting
Visible Emissions Training Course
May 1978
Collaborative Study of EPA Methods 13A and 13B
December 1977
Methods for Determining the Polychlorinated
Biphenyl Emissions from Incineration and from
Capacitor and Transformer Filling Plants
November 1977
A Study on the Accuracy of Type-S Pitot Tubes
June 1977
Standardization of Stationary Source Methods for
Vinyl Chloride
May 1977
Survey of Continuous Emission Monitors - Survey
No. 1 NOx and SO2
April 1977
Standardization of Method 11 at a Petroleum
Refinery, Volume II
January 1977
Standardization of Method 11 at a Petroleum
Refinery, Volume I
January 1977
Determination of Hydrogen Sulfide in Refinery
Fuel Gases
January 1977
A-7

-------
Section II
Air - Stationary Sources Continued
Reference
EPA 600/4-76-044
NTIS-PB-258847/AS
EPA 600/4-76-038
NTIS-PB-257104/AS
EPA 600/4-76-014
NTIS-PB-252028/AS
EPA 650/4-75-025
NTIS-PB-245045/AS
EPA 650/4-75-009
NTIS-PB-257948/OBE
EPA 650/4-75-003
NTIS-PB-240752/AS
Report Title
The EPA Program for the Standardization of
Stationary Source Emission Test Methodology
August 1976
Application of EPA Method 6 to High Sulfur
Dioxide Concentrations
July 1976
Collaborative Study of Particulate Emissions
Measurements by EPA Methods 2, 3, & 5 Using Paired
Particulate Sampling Trains (Municipal Incinerator)
March 1976
A Method to Obtain Replicate Particulate Samples
from Stationary Sources
July 1975
Evaluation and Collaborative Study of the Method
for Visual Determination of Opacity of Emissions
from Stationary Sources
January 1975
Collaborative Study of the Method for the
Determination of Sulfuric Acid Mist and Sulfur Dioxide
Emissions from Stationary Sources
November 1974
EPA 650/4-75-001
NTIS-PB-241284/AS
EPA 650/4-74-039
NTIS-PB-238267/AS
Collaborative Study of Method 10 - Reference Method
for Determination of Carbon Monoxide Emissions from
Stationary Sources - Report of Testing
January 1975
Laboratory and Field Evaluations of EPA Methods
2, 6, and 7
October 1973
A-8

-------
Section II
Air - Stationary Sources Continued
Reference
EPA 650/4-74-033
NTIS-PB-240342/6BE
EPA 650/4-74-029
NTIS-PB-237346/AS
Report Title
Collaborative Study of the Method for the
Determination of Stack Gas Velocity and Volumetric
Flow Rate in Conjunction with EPA Method 5
September 1974
Collaborative Study of the Method for the
Determination of Particulate Emissions for
Stationary Sources (Portland Cement Plants)
May 1974
EPA 650/4-74-028
NTIS-PB-236930/AS
Collaborative Study of the Method for the
Determination of Nitrogen Oxide Emissions from
Stationary Sources (Nitric Acid Plants)
May 1974
EPA 650/4-74-026
NTIS-PB-236929/AS
Collaborative Study of the Method for Stack Gas
Analysis and Determination of Moisture Fraction
with Use of Method 5
June 1974
EPA 650/4-74-025
NTIS-PB-238555/AS
Collaborative Study of the Method for the
Determination of Nitrogen Oxide Emissions for
Stationary Sources (Fossil Fuel-Fired Steam
Generators)
October 1974
EPA 650/4-74-024
NTIS-PB-238293/AS
Collaborative Study of the Method for the
Determination of Sulfur Dioxide Emissions from
Stationary Sources (Fossil Fuel-Fired Steam
Generators)
December 1973
EPA 650/4-74-023
NTIS-PB-245011/AS
Collaborative Study of Method 104 - Reference
Method for Determination of Beryllium Emissions
from Stationary Sources
June 1974

-------
Section II
Air - Stationary Sources Continued
Reference
EPA-650/4-74-022
NTIS-PB-234151/9BE
EPA-650/4-74-021
NTIS-PB-234150/1BE
EPA-650/4-74-015
NTIS-PB-234326/AS
EPA-600/4-74-013
NTIS-PB-237695/AS
Report Title
Collaborative Study of the Method for the Deter-
mination of Particulate Matter Emissions from
Stationary Sources (Municipal Incinerators)
July 1974
Collaborative Study of the Method for the Deter-
mination of Particulate Matter Emissions from
Stationary Sources (Fossil Fuel-Fired Steam
Generators)
June 1974
Survey of Manual Methods of Measurements of
Asbestos, Beryllium, Lead, Cadmium, Selenium, and
Mercury in Stationary Source Emissions
September 1974
Collaborative Study of EPA Methods 5, 6, and 7 in
Fossil Fuel-Fired Generators
May 1974
A-10

-------
Section III
AIR - QUALITY ASSURANCE - AMBIENT AIR
(EMSL-RTP)
Reference
EPA-600/4-78-047
NTIS-PB-291386/AS
EPA-600/4-78/024
NTIS-PB-285171/5BE
EPA-600/4-78-018
NTIS-PB-279873/AS
EPA-600/7-77-128
NTIS-PB-276620/AS
EPA-600/4-77-021
NTIS-PB-269350/AS
EPA-600/4-77-005
NTIS-PB-267985/AS
EPA-600/4-77-003
NTIS-PB-262397/AS
Report Title
Investigation of Flow Rate Calibration Procedure
Associated with the High Volume Method for Deter-
mination of Suspended Particulates
August 1978
Use of the Flame Photometric Detector Method for
Measurement of Sulfur Dioxide in Ambient Air
May 1978
Improved Temperature Stability of Sulfur Dioxide
Samples Collected by the Federal Reference Method
April 1978
Comparison of Wet Chemical and Instrumental Methods
for Measuring Airborne Sulfate
November 1977
Methodology for Measurement of Polychlorinated
Biphenyls in Ambient Air and Stationary Sources
April 1977
Evaluation of 1 Percent Neutral Buffered Potassium
Iodide Procedure for Calibration of Ozone Monitors
January 1977
Evaluation of the EPA Reference Method for the
Measurement of Non-Methane Hydrocarbons - Final
Report
June 1977
EPA-600/4-76-044
NTIS-PB-258847/AS
EPA Program for the Standardization of Stationary
Source Emission Test Methodology - A Review
August 1976
A-11

-------
Section III
Ambient Air - Continued
Reference
EPA 600/4-76-024
NTIS-PB-253778/AS
EPA 600/4-76-015
NTIS-PB-253349/AS
EPA 600/4-76-008
NTIS-PB-254387/AS
EPA 650/4-75-022
NTIS-PB-243462/AS
EPA 650/4-75-021
NTIS-PB-242294/AS
EPA 650/4-75-019
NTIS-PB-242285/AS
EPA 650/4-75-016
NTIS-PB-244105/AS
EPA 650/4-75-013
NTIS-PB-246843/AS
Report Title
Effect of Temperature on Stability of Sulfur
Dioxide Samples Collected by the Federal Reference
Method
May 1976
Measurement of Atmospheric Sulfates
March 1976
Measurement of Atmospheric Sulfates: Literature
Search and Methods Selection
March 1976
Evaluation of a Continuous Colorimetric Measurement
Method of Nitrogen Dioxide in Ambient Air
April 1975
Evaluation of Gas Phase Titration Technique as
Used for Calibration of Nitrogen Dioxide Chemi-
luminescence Analyzers
April 1975
Evaluation of Effects of NO, CO2, and Sampling Flow
Rate on Arsonite Procedure for Measurement of NO2
in Ambient Air
April 1975
Collaborative Study of the Reference Method for
Measurement of Ozone in the Atmosphere (Ozone
Ethylene Chemiluminescent Method)
February 1975
Collaborative Test of the Chemiluminescent Method
for Measurement of NO2 in Ambient Air
February 1975
A-12


-------
Section III
Ambient Air - Continued
Reference
EPA-650/4-75-011
NTIS-PB-253914/6BE
EPA-600/4-75-003
NTIS-PB-268456/AS
EPA-650/4-74-048
NTIS-PB-239727/AS
Report Title
Collaborative Test of the Continuous Colorimetric
Method for Measurement of Nitrogen Dioxide in
Ambient Air
February 1975
Technical Assistance Document for the Chemilumi-
nescence Measurements of Nitrogen Dioxide
October 1976
Evaluation of the Arsenite Procedure for the
Determination of Nitrogen Dioxide in Ambient
Air
November 1974
EPA-650/4-74-047
NTIS-PB-238097/AS
EPA-650/4-74-046
NTIS-PB-257799/7BE
EPA-650/4-74-031
NTIS-PB-237348/8BE
EPA-650/4-74-027
NTIS-PB-239731/3BE
EPA-650/4-74-020
NTIS-PB-245050/0BE
Evaluation of TGS-ANSA Procedure for Determina-
tion of Nitrogen Dioxide in Ambient Air
July 1974
Collaborative Test of the TGS-ANSA Method for
Measurement of Nitrogen Dioxide in Ambient Air
September 1974
Evaluation of the Triethanolamine Procedure for
the Determination of Nitrogen Dioxide in Ambient
Air
July 1974
Collaborative Study of the Reference Method for
Determination of Sulfur Dioxide in the Atmosphere
(Pararosaniline Method) (24-Hour Sampling)
December 1973
Development of Technical Specifications for
Standard Gas-Diluent Mixtures for Use in Measure-
ment of Mobile Source Emissions
June 1974
A-13

-------
Section III
Ambient Air - Continued
Reference
EPA 650/4-74-019a
NTIS-PB-244902/3BE
EPA-R4-72-009
NTIS-PB-211265/BA
Report Title
Collaborative Testing Methods for Measurements
of NO2 in Ambient Air, Volume I - Report of
Testing
June 1974
Collaborative Study of the Reference Method for
the Continuous Measurement of Carbon Monoxide
in the Atmosphere (Non-Dispersive Infrared
Spectrometry)
May 1972
A-14

-------
Section IV
WATER - QUALITY ASSURANCE
Environmental Monitoring and Support Laboratory
Cincinnati, Ohio (EMSL-CI)
Reference
EPA 600/8-79-006
NTIS-PB-297164/AS
EPA 600/4-79-020
NTIS-PB-297686/AS
EPA 600/4-79-019
NTIS-PB-297451/AS
EPA 440/1-79-102
EPA 600/8-78-017
NTIS-PB-290329/AS
EPA 600/8-78-008
NTIS-PB-287118/AS
EPA 600/4-78/038
NTIS-PB-288153/AS
EPA 600/4-78-019
NTIS-PB-281572/AS
Report Title
An EPA Manual for Organic Analysis Using Gas
Chromatography/Mass Spectrometry
March 1979
Methods for Chemical Analysis of Water and Wastes
March 1979
Handbook for Analytical Quality Control in Water
and Wastewater Laboratories
March 1979
Self-Monitoring Program Analytical Methods Package
(Organic Chemicals)
July 1979
Microbiological Methods for Monitoring the
Environment
December 1978
Manual for the Interim Certification of Laboratories
Involved in Analyzing Public Drinking Water
Supplies - Criteria and Procedures
August 1978
Arsenic Determination by the Silver Diethyldithio-
carbamate Method and the Elimination of Metal Ion
Interference
July 1978
Comparison of Methods for the Determination of
Total Available Residual Chlorine in Various Sample
Matrices
March 1978
A-15

-------
Water - Continued
Reference
EPA-600/4-78-017
NTIS-PB-280718/AS
EPA-600/A-78-012
NTIS-PB-289605/AS
EPA-600/4-77-012
NTIS-PB-266270/AS
EPA-600/4-76-052
NTIS-PB-262250/AS
EPA-600/4-76-049
NTIS-PB-259946/AS
EPA-600/4-75-012
NTIS-PB-248733/AS
EPA-600/4-75-011
NTIS-PB-248286/AS
EPA-600/4-75-008
NTIS-PB-253258/AS
EPA-600/4-75-007
NTIS-PB-245823/0BE
Section IV
Report Title
Procedure for the Evaluation of Environmental
Monitoring Laboratories
March 1978
Methods for Measuring the Acute Toxicity of
Effluents to Aquatic Organisms
July 1978
EPA Method Study 8, Total Mercury in Water
February 1977
Development of Suspended Solids Quality Control
and Performance Evaluation Samples
October 1976
Handbook for Sampling and Sample Preservation
of Waters/Wastewaters
December 1976
Recommended Design of Sample Intake Systems
for Automatic Instrumentation
November 1975
Analysis of Carbon-14 and Tritium in Reactor
Stack Gas
September 1975
Interim Radiochemical Methodology for Drinking
Water
March 1976
Analytical Quality Assurance for Trace
Organics by Gas Chromotography/Mass Spectro-
metry
September 1975
A-16

-------
Section IV
Water - Continued
Reference
EPA-670/4-75-006
NTIS-PB-243256/5BE
EPA-670/4-75-005
NTIS-PB-241802/8BE
EPA-670/4-75-004b
NTIS-PB-241708/7BE
EPA-670/4-75-004a
NTIS-PB-241707/9BE
EPA-670/4-75-002
NTIS-PB-241086/8BE
EPA-670/4-75-001
NTIS-PB-240700/5BE
EPA-670/4-74-007
NTIS-PB-237561/6BE
EPA-670/4-74-001
NTIS-PB-232765/8BE
EPA-670/4-73-001
NTIS-PB-227183/1BE
Report Title
Activities and Needs Related to Radioactive
Standards for Environmental Measurements
August 1975
Investigation of the Orion Research Cyanide
Monitor
April 1975
Fortran Programs for Analyzing Collaborative
Test Data, Part II - Scatter Plots
April 1975
Fortran Programs for Analyzing Collaborative
Test Data, Part I - General Statistics
April 1975
Interfacing a 24-Point Analog Recorder to a
Computer Controlled Telemetry Line
February 1975
Performance of the ISCO Model 1391 Water/
Wastewater Sampler
February 1975
Evaluation of the Ryan's Waterproof Thermograph
(Model F-30)
April 1974
Literature Survey of Instrumental Measurements
of Biochemical Oxygen Demand for Control
Application
February 1974
Biochemical Field and Laboratory Methods for
Measuring the Quality of Surface Waters and
Effluents
July 1973
A-17

-------
Section V
RADIATION - QUALITY ASSURANCE
Reference
EPA-600/7-78-122
NTIS-PB-285435/AS
EPA-600/7-78-105
NTIS-PB-285438/AS
EPA-600/4-78/033
NTIS-PB-286981/AS
EPA-600/4-78-032
NTIS-PB-284850/AS
EPA-600/7-77-144
NTIS-PB-277377/AS
Environmental Monitoring Systems Laboratory
Las Vegas, Nevada (EMSL-LV)
Report Title
An Ion Exchange Method for the Determination
of Plutonium in Water: Single-Laboratory
Evaluation and Interlaboratory Collaborative
Study
June 1978
Intercomparison of Plutonium-239 Measurements
June 1978
Radioactivity Standards Distribution Program
1978-1979
June 1978
Environmental Radioactivity Laboratory Inter-
comparison Studies Program 1978-1979
June 1978
Quality Control for Environmental Measurements
Using Gamma-Ray Spectrometry
December 1977
EPA-600/7-77-088
TVA-E-EP-77-4
NTIS-PB-277254/9BE
EPA-600/7-77-078
NTIS-PB-271965/AS
EPA-600/4-77-047
NTIS-PB-276816/AS
Handbook for Analytical Quality Control in
Radioanalytical Laboratories
August 1977
Fusion Method for the Measurement of Plutonium
in Soil - Single-Laboratory Evaluation and
Interlaboratory Collaborative Test
July 1977
Status and Quality of Radiation Measurements -
Food and Human Urine
October 1977
A-18

-------
Section V
Radiation - Continued
Reference
EPA-600/4-77-043
NTIS-PB-276813/AS
EPA-600/4-77-001
NTIS-PB-263900/AS
EPA-600/4-76-054
NTIS-PB-262392/AS
EPA-600/4-76-053
NTIS-PB-261330/AS
EPA-600/4-76-017
NTIS-PB-255107/AS
EPA-600/4-76-012
NTIS-PB-251313/AS
EPA-600/4-76-011
NTIS-PB-251312/AS
EPA-600/4-75-014
NTIS-PB-248171/AS
EPA-600/4-75-013
NTIS-PB-251244/AS
Report Title
The Status and Quality of Radiation Measure-
ments for Air
October 1977
Environmental Radioactivity Laboratory Inter-
comparison Studies Program - FY 1977
January 1977
Interlaboratory Intercomparison of Plutonium-
210 Measurements
October 1976
Radioactivity Standards Distribution Program -
FY 1977
October 1976
Status and Quality of Radiation Measurements
of Water
April 1976
Measurement of Total Radium and Radium-226 in
Environmental Water - A Tentative Reference
Method
March 1976
Measurement of Strontium-89 and Strontium-90 in
Environmental Waters - A Tentative Reference
Method
March 1976
Radiation Quality Assurance Intercomparison
Studies 1974-1975
October 1975
Tentative Reference Method for Measurement of
Tritium in Environmental Waters
December 1975
A-19

-------
Section V
Radiation - Continued
Reference
EPA 680/4-75-007
NTIS-PB-245598/AS
EPA 680/4-75-005
NTIS-PB-245890/9BE
EPA 680/4-75-002b
NTIS-PB-243636/8BE
EPA 680/4-75-002a
NTIS-PB-243696/2BE
EPA 680/4-73-001b
NTIS-PB-240962/1BE
GPO-EP1.23/5:680/
4-73-001b
EPA 680/4-73-001a
NTIS-PB-240955/5BE
GPO-EP1.23/5:680/
4-73-001a
Report Title
"Preliminary Milk Report"
June 1975
Tentative Reference Method for the Measurement of
Gross Alpha and Gross Beta Radioactivities in
Environmental Waters
June 1975
Environmental Radioactivity Laboratory Intercom-
parison Studies Program
May 1975
Radioactivity Standards Distribution Program
April 1975
Environmental Radioactivity Laboratory Intercom-
parison Studies Program 1973-1974
February 1974
Radioactivity Standards Distribution Program
1973-1974
February 1974
A-20

-------
Section VI
BIOLOGICAL TESTING (MULTIMEDIA) QUALITY ASSURANCE
(EMSL-LV)
Reference
EPA 600/4-78-043
NTIS-PB-285369/AS
EPA 600/4-78-051
NTIS-PB-288198/AS
Report Title
Quality Assurance Guidelines for Biological Testing
August 1978
Mercury, Lead, Arsenic, and Cadmium in Biological
Tissue - The Need for Adequate Standard Reference
August 1978
To request copies of any of the listed documents, please write:
U.S. Environmental Protection Agency
TIOS, CERI
Cincinnati, Ohio 45268
or
National Technical Information Service
5285 Port Royal Road
Springfield, Virginia 22161
Please include either the EPA report number or the PB number with
all requests.
A-21

-------
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO. 2.
EPA 450/2-81-016
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
APTI Course 470
Quality Assurance for Air Pollution Measurement Systems
Student Workbook
5. REPORT DATE
April 1981
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
B. M. Ray
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Northrop Services, Inc.
P.O. Box 12313
Research Triangle Park, NC 27709
10. PROGRAM ELEMENT NO.
B18A2C
11. CONTRACT/GRANT NO.
68-02-2374
12. SPONSORING AGENCY NAME AND ADDRESS
U.S. Environmental Protection Agency
Manpower and Technical Information Branch
Air Pollution Training Institute
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED

-------