United States
Environmental Protection
Agency
Air Pollution Training Institute
MD 20
Environmental Research Center
Research Triangle Park, NC 27711
EPA 450/2-83-008
May 1984
Air
APTI
Correspondence Course 471
General Quality Assurance
Considerations for Ambient
Air Monitoring
Guidebook
-------
Course Developed by:
B. M. Ray
Instructional Design by:
M. L. Loewy
Northrop Services, Inc.
P.O. Box 12313
Research Triangle Park, NC 27709
Under Contract No.
68-02-3573
EPA Project Officer
R. E. Townsend
United States Environmental Protection Agency
Office of Air and Radiation
Office of Air Quality Planning and Standards
Research Triangle Park, NC 27711
-------
Notice
This is not an official policy and standards document. The opinions and selections
are those of the authors and not necessarily those of the Environmental Protection
Agency. Every attempt has been made to represent the present state of the art as
well as subject areas still under evaluation. Any mention of products or organizations
does not constitute endorsement by the United States Environmental Protection
Agency.
-------
Table of Contents
Page
Course Introduction 0-1
Section 1. Quality Assurance Policy and Principles 1-1
Excerpts of Volume I of Quality Assurance Handbook for Air Pollution
Measurement Systems 1-11
Review Exercise 1-65
Answers to Review Exercise 1-69
Section 2. Quality Assurance for Air Quality Monitoring Systems 2-1
Review Exercise 2-7
Answers to Review Exercise 2-12
Section 3. Quality Assurance for SLAMS and PSD Air Monitoring
Networks 3-1
Review Exercise 3-5
Answers to Review Exercise 3-10
Section 4. Performance Auditing of Air Quality Monitoring Systems 4-1
Review Exercise 4-5
Answers to Review Exercise 4-10
Section 5. System Auditing of SLAMS Networks 5-1
Review Exercise 5-4
Answers to Review Exercise 5-5
iii
-------
Course Introduction
Overview of Course
Course Description
This training course is a 30-hour correspondence course concerning general quality
assurance considerations for ambient air monitoring. It is a supplement for EPA
APTI Course 470 Quality Assurance for Air Pollution Measurement Systems.
Course topics include the following:
• quality assurance policy and principles,
• quality assurance for air quality monitoring systems,
• quality assurance for SLAMS and PSD air monitoring networks,
• performance auditing of air quality monitoring systems, and
• system auditing of SLAMS networks.
Course Goal
The purpose of this course is to familiarize you with general quality assurance con-
siderations for ambient air monitoring.
Course Objectives
Upon completion of this course, you should be able to —
1. describe general principles of quality assurance,
2. describe general quality assurance considerations for the acquisition, installa-
tion, and operation of air quality monitoring systems,
3. describe quality control programs and data quality assessment for SLAMS and
PSD air monitoring, and
4. describe audit criteria and procedures for air quality monitoring networks.
0-1
-------
Sections and Trainee Involvement Time
Section                                                            Trainee involvement
number   Section title                                             time (hours)
1        Quality Assurance Policy and Principles
2        Quality Assurance for Air Quality Monitoring Systems
         Quiz 1
3        Quality Assurance for SLAMS and PSD Air Monitoring Networks
4        Performance Auditing of Air Quality Monitoring Systems
         Quiz 2
5        System Auditing of SLAMS Networks                         6
         Final Examination
Requirements for Successful Completion of this Course
In order to receive three Continuing Education Units (CEUs) and a certificate of
course completion, you must fulfill the following requirements:
• take two supervised quizzes and a supervised final examination
• achieve a final course grade of at least 70 (out of 100), determined as follows:
• 20% from Quiz 1
• 20% from Quiz 2
• 60% from the final examination.
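The weighting above is simple arithmetic; as an informal sketch (not part of the official course materials, and assuming each score is reported on a 0-100 scale), the final grade can be computed as:

```python
# Weighted course grade: 20% Quiz 1, 20% Quiz 2, 60% final examination.
# Informal sketch only; assumes all scores are on a 0-100 scale.

def course_grade(quiz1: float, quiz2: float, final_exam: float) -> float:
    """Return the overall course grade (out of 100)."""
    return 0.20 * quiz1 + 0.20 * quiz2 + 0.60 * final_exam

def passes(grade: float, cutoff: float = 70.0) -> bool:
    """A final grade of at least 70 earns the CEUs and certificate."""
    return grade >= cutoff

grade = course_grade(80, 75, 68)
print(round(grade, 1), passes(grade))  # 71.8 True
```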
Use of Course Materials
Necessary Materials
• EPA 450/2-83-008, APTI Correspondence Course 471 General Quality Assurance
Considerations for Ambient Air Monitoring: Guidebook
• EPA 600/4-77-027a, Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II—Ambient Air Specific Methods
• pencil or pen
• calculator
How to Use this Guidebook
Relationship Between Guidebook and Assigned Reading Materials
This guidebook directs your progress through Section 2.0 of Volume II of the
Quality Assurance Handbook for Air Pollution Measurement Systems. Excerpts from
Volume I are included in the guidebook to provide you with a basic introduction to
quality assurance principles.
0-2
-------
If you use the guidebook instructions with the provided reading material, we think
you will find the subject material both interesting and enjoyable. Review exercises
and problems focus on specific and important aspects of the quality assurance
manual.
Description of Guidebook Sections
This guidebook contains reading assignment sections that correspond to lessons of
the course.
Each section contains the following:
• reading assignment
• section's learning goal and objectives
• reading guidance
• review exercise.
Complete the review exercises immediately after reading the assigned materials.
You may find it helpful to look over the review questions before reading. By having
an idea of what to look for in the reading materials, your attention will be better
focused and your study will be more efficiently directed.
NOTE: If more than one person will be using these materials, we recommend that
you use a separate sheet of paper to record your answers to the review exercises.
Instructions for Completing the Quizzes and the Final Examination
• You should have received, along with this guidebook, a separate sealed envelope
containing two quizzes and a final examination.
• You must arrange to have someone serve as your test supervisor.
• You must give the sealed envelope containing the quizzes and final examination to
your test supervisor.
• At designated times during the course, under the supervision of your test super-
visor, complete the quizzes and the final exam.
• After you have completed a quiz or the exam, your test supervisor must sign a
statement on the quiz/exam answer sheet certifying that the quiz or exam was
administered in accordance with the specified test instructions.
• After signing the quiz/exam answer sheet, your test supervisor must mail the quiz
or exam and its answer sheet to the following address:
Air Pollution Training Institute
Environmental Research Center
MD20
Research Triangle Park, NC 27711
• After completing a quiz, continue with the course. Do not wait for quiz results.
• Quiz/exam and course grade results will be mailed to you.
0-3
-------
If you have questions, contact:
Air Pollution Training Institute
Environmental Research Center
MD 20
Research Triangle Park, NC 27711
Telephone numbers:
Commercial: (919) 541-2401
FTS: 629-2401
0-4
-------
Section 1
Quality Assurance Policy
and Principles
Reading Assignment
Read, in the following order:
• Pages 1-5 through 1-63 of this guidebook.
• Section 2.0.5 of Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II—Ambient Air Specific Methods, EPA 600/4-77-027a.
Reading Assignment Topics
• USEPA's quality assurance policy
• Quality assurance principles
• Development and implementation of quality assurance programs
Learning Goal and Objectives
Learning Goal
The purpose of this section is to familiarize you with the USEPA's quality assurance
policy and with general principles of quality assurance.
Learning Objectives
At the end of this section, you should be able to —
1. explain why quality assurance procedures are a vital part of the USEPA's air
monitoring programs,
2. initiate the development of a quality assurance program for ambient air
monitoring,
3. list at least eight elements of an effective quality assurance program,
4. define quality assurance, quality control, accuracy, precision, quality assurance
program plan, quality assurance project plan, performance audit, and system
audit,
5. describe the importance of a quality assurance program to an organization,
6. recognize the contribution of training programs to the reporting of high-quality
data,
1-1
-------
7. list at least four items that should be included in a quality assurance report to
management,
8. distinguish between a quality assurance program plan and a quality assurance
project plan, and
9. describe the responsibilities of the USEPA and State and local air pollution
control agencies for the development and implementation of quality assurance
programs.
Reading Guidance*
This assignment reviews the United States Environmental Protection Agency's quality
assurance policy for air pollution data and guidance for the development of quality
assurance programs. Memos giving the rationale for the development of quality
assurance programs, and portions of Volume I of the USEPA Quality Assurance
Handbook are included in this section.
In-depth training in the principles discussed in Volume I can be obtained by
attending EPA APTI Course 470, Quality Assurance for Air Pollution Measurement
Systems. This lecture course is designed for quality assurance coordinators or
managers involved with quality assurance activities and for field and laboratory
personnel working with air pollution measurements.
Volume I of the Quality Assurance Handbook focuses on 23 elements of a quality
assurance program. Eight of these elements are considered essential for setting up a
new program. The discussion given in Volume I for each of these eight elements is
reproduced in this section for your review.
The following comments introduce you to the reading assignment. Read them
along with the assignment; they will help familiarize you with some of the main
ideas of quality assurance, and will make the reading assignment easier to
understand.
1. Definition of quality assurance (QA)
• Be sure to note the distinction between quality control and quality
assurance.
• What is the product with which USEPA quality assurance programs are
concerned?
2. Elements of quality assurance
• The elements have been placed on a wheel to indicate that the relative
importance of individual elements depends on the measurement program
objectives.
*This section has been adapted from J. A. Jahnke (1982), APTI Correspondence Course 414
Quality Assurance for Source Emission Measurement Methods: Guidebook, EPA 450/2-82-003.
1-2
-------
• The eight elements reviewed here should be considered from the very begin-
ning in the development of a program. They are as follows:
(1) document control and revisions
(2) policy and objectives
(3) organization
(4) training
(5) audit procedures
(6) quality reports to management
(7) quality assurance program plan
(8) quality assurance project plans.
• The 23 quality assurance elements can be grouped into four general
categories:
(1) management activities
(2) measurement activities
(3) routine systems for program operation and support
(4) statistical techniques.
3. Document control and revisions
• Note the indexing format of the Volume I excerpts included in the reading
assignment. Also note the indexing format in Volume II. The format pro-
vides a means of tracking and updating entries in QA plans.
• The purpose of document control is to provide the latest procedures to all
concerned.
4. Quality assurance policy and objectives
• The QA policy of an organization must have the support of upper-level
management if the QA program is to be effective. Note the memos repro-
duced in the reading assignment.
• When reading the discussion of this element, note that data quality objec-
tives should be specified for:
(1) completeness
(2) precision
(3) accuracy
(4) representativeness
(5) comparability.
However, data use should be kept in mind.
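As a hedged illustration of how the first three objectives are typically quantified (the function names and formulas below are illustrative conventions, not text from the Handbook or the regulations), one might compute:

```python
# Illustrative sketch of three data quality indicators named above.
# Formula forms follow common audit practice but are not regulatory text.

def completeness(n_valid: int, n_scheduled: int) -> float:
    """Percent of scheduled measurements that produced valid data."""
    return 100.0 * n_valid / n_scheduled

def percent_difference(measured: float, reference: float) -> float:
    """Accuracy indicator: deviation from a known audit standard."""
    return 100.0 * (measured - reference) / reference

def precision_range(duplicate_a: float, duplicate_b: float) -> float:
    """Precision indicator from duplicate (collocated) measurements."""
    mean = (duplicate_a + duplicate_b) / 2.0
    return 100.0 * abs(duplicate_a - duplicate_b) / mean

print(round(completeness(350, 365), 1))          # 95.9 (percent)
print(round(percent_difference(0.52, 0.50), 1))  # 4.0 (percent high)
print(round(precision_range(0.49, 0.51), 1))     # 4.0 (percent)
```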
5. Organization
• Quality assurance is normally a separate function. The separation helps
prevent bias and provides easier access to upper-level management.
• A QA coordinator should be appointed and his function spelled out in a
position description.
6. Training
• You cannot produce high-quality data with people who do not know how to
do their jobs. It is the responsibility of management to see that the job is
done right.
1-3
-------
7. Audit procedures
• Audit procedures should be implemented as quickly as possible when setting
up a QA program.
• Audits are one of the best ways of checking the quality of data.
8. Quality reports to management
• Feedback is very important for managers. Anything that affects manage-
ment decisions should be included in the report.
• Reports should be understood at a glance. When possible, use charts and
graphs to present data, rather than tables.
9. The quality assurance program plan
• Note that a quality assurance program plan is general. It covers QA for all
of the projects of an organization.
10. The quality assurance project plan
• Note that a quality assurance project plan gives the specific QA require-
ments for a measurement project.
• In this sense, the USEPA Quality Assurance Handbook, Volume I, provides
guidance for the development of an organization's QA program plan.
Volume II provides guidance for the development of QA project plans for
ambient air monitoring.
11. Although not included in the excerpts, it is important to mention the QA
cycle. The cycle is shown below.
Plan -> Implement -> Assess -> Take corrective action -> (back to Plan)
The effect of applying quality assurance techniques should result in the
development of an on-going corrective action system. Once a QA plan is writ-
ten and implemented, assessment procedures point out necessary corrective
action which, in effect, revises the plan. The cycle continues in this manner,
providing quality air monitoring data.
12. Final note: The number of air monitoring organizations having formal quality
assurance programs has greatly increased since the promulgation of Federal
quality assurance regulations for ambient air monitoring (40 CFR 58, Appen-
dixes A and B; May 10, 1979).
When you have finished the reading assignment, complete the review exercise that
begins on page 1-65 and check your answers. The correct answers are listed on the
page following the review exercise. After you have reviewed your incorrect answers (if
any), proceed to Section 2 of this guidebook.
1-4
-------
Attachment B
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON DC 20460
THE ADMINISTRATOR
May 30, 1979
MEMORANDUM
TO: Deputy Administrator
Director, Science Advisory Board
Director, Office of Regional and Intergovernmental Operations
Regional Administrators
Assistant Administrators
General Counsel
SUBJECT: Environmental Protection Agency (EPA) Quality Assurance
Policy Statement
The EPA must have a comprehensive quality assurance effort to
provide for the generation, storage, and use of environmental data which
are of known quality. Reliable data must be available to answer
questions concerning environmental quality and pollution abatement
and control measures. This can be done only through rigorous
adherence to established quality assurance techniques and practices.
Therefore, I am making participation in the quality assurance effort
mandatory for all EPA supported or required monitoring activities.
An Agency quality assurance policy statement is attached which
gives general descriptions of program responsibilities and basic
management requirements. For the purpose of this policy statement,
monitoring is defined as all environmentally related measurements
which are funded by the EPA or which generate data mandated by the EPA.
A detailed implementation plan for a total Agency quality
assurance program is being developed for issuance at a later date.
A Select Committee for Monitoring, chaired by Dr. Richard Dowd, is
coordinating this effort, and he will be contacting you directly
for your participation and support. I know that each of you shares
my concern about the need to improve our monitoring programs and
data; therefore, I know that you will take the necessary actions
that will ensure the success of this effort.
Douglas M. Costle
Attachment
1-5
-------
DATE: 05-19-80
Page 1 of 15
Strategy for the Implementation
of the
Environmental Protection Agency's
Mandatory Quality Assurance (QA) Program
I. Introduction
The EPA must have a comprehensive QA program to provide for the generation,
storage, and use of environmental data. Valid data of verifiable quality
must be available to provide a sound basis for effective decisions concerning
environmental quality, pollution abatement, and control measures. The QA
program can succeed only through rigorous adherence to established QA tech-
niques and practices.
In the past, there has been a high degree of fragmentation, lack of coordination,
poorly identified needs and resources, and duplication of efforts in the QA pro-
gram. For these reasons, it is now Agency policy, as enunciated by the Adminis-
trator in memoranda of May 30, 1979 and June 14, 1979, that all Regional Offices,
Program Offices, EPA Laboratories, and those monitoring and measurement efforts
supported or mandated through contracts, regulations, or other formalized agree-
ments participate in a centrally managed QA program. Regional Offices should work
cooperatively with States to assist them in developing and implementing QA programs.
The mandatory QA program covers all environmentally-related measurements.
Environmentally-related measurements are defined as "essentially all field and
laboratory investigations that generate data involving the measurement of chemical,
physical, or biological parameters in the environment; determining the presence or
absence of pollutants in waste streams; health and ecological effect studies;
clinical and epidemiological investigations; engineering and process evaluations;
1-6
-------
DATE: 05-19-80
Page 2 of 15
studies involving laboratory simulation of environmental events; and studies
or measurements on pollutant transport, including diffusion models."
This document presents the strategy for the development of an Agency QA program
in accordance with the Agency policy. This strategy describes, in general, the
total program effort with respect to what must be done. This strategy does not
attempt to describe how, in detail, the program is to be implemented within the
individual Program and Regional Offices, or the EPA Laboratories. Subsequent
guidance documents will enable the Program and Regional Offices and the EPA
Laboratories to develop detailed QA plans.
II. Quality Assurance Goals and Objectives
The primary goal of the QA program is to insure that all environmentally-related
measurements supported or required by the EPA result in data of known quality.
To meet this goal, the QA program must provide for the establishment and use of
reliable monitoring and measurement systems to obtain data of necessary quality
to meet planned Agency needs.
Initial objectives are the development and implementation of QA program plans by
each of the Program and Regional Offices and EPA Laboratories which will ensure
that the QA goal can be achieved nationally.
Long-term objectives include (1) providing quantitative estimates of the quality
of all data supported or required by the Agency, (2) improving data quality where
indicated, and (3) documenting progress in achieving data quality.
A continuing objective is to promote and develop optimally uniform approaches,
1-7
-------
DATE: 05-19-80
Page 3 of 15
procedures, techniques, reporting methods, etc., across media and across
Regional Offices, Program Offices, and EPA Laboratories. It is important (and
most efficient and effective) for all organizations within EPA to employ the
same QA language, consistent policies, procedures, and techniques when inter-
acting with the States, industry, the public, contractors, grantees, QA-involved
professional societies, other Governmental agencies, and national and inter-
national organizations.
1-8
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON DC 20460
Attachment C
THE ADMINISTRATOR
June 14, 1979
MEMORANDUM
SUBJECT: Quality Assurance Requirements for all EPA Extramural
Projects Involving Environmental Measurements
FOR: The Deputy Administrator
Assistant Administrators
Regional Administrators
General Counsel
Over the past several years, the EPA has become more and more
dependent on extramural projects to provide the environmental measure-
ments we use as a foundation for our standards, regulations and
decisions. While in most instances these projects are providing
data of proven quality that is acceptable for the Agency's purposes,
there have been, regrettably, some instances in which Agency funds
paid for poor-quality, unusable data.
In order to assure that all environmental measurements done by
extramural funding result in usable data of known quality, I am
making the inclusion of the attached "Quality Assurance Requirements"
mandatory for all EPA grants, contracts, cooperative agreements, and
interagency agreements that involve environmental measurements. In
addition to these general requirements, I expect every Project Officer
to include whatever additional specific quality assurance requirements
are necessary in each extramural project under his control. Criteria
and guidelines in this area will be forthcoming from the Agency's
Quality Assurance Implementation Work Group. Further, I direct the
Assistant Administrator for Planning and Management to provide the
appropriate contract and grant regulations such that the attached
form "Quality Assurance Review for Extramural Projects Involving
Environmental Measurements" will be satisfactorily completed where
appropriate prior to the approval of any contracts or grants in FY-80.
I recognize that this may increase the cost per environmental
measurement, but the benefits of a credible Agency data base that
provides a level of quality that meets the needs of users far out-
weigh any such increases.
Douglas M. Costle
Attachment
1-9
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON DC 20460
THE ADMINISTRATOR
November 2, 1981
MEMORANDUM
TO: Associate Administrators
Assistant Administrators
Regional Administrators
SUBJECT: Mandatory Quality Assurance Program
One of the major concerns of this administration and myself is that we
support all of our actions and decisions with statistically representative
and scientifically valid measurement of environmental quality. To meet
this objective, it is essential that each of you continue to support and
implement the Agency's mandatory Quality Assurance program which is being
implemented by the Office of Research and Development. It is especially
essential that you assure that the appropriate data quality requirements
are included in all of your extramural and intramural environmental
monitoring activities. I also am particularly concerned that you do not
sacrifice quality for quantity when adjusting your program to meet our new
resource targets.
The attached Second Annual Quality Assurance Report demonstrates the
importance of this program in achieving our goals and objectives.
Recognizing its importance, I have asked Dr. Hernandez to closely monitor
this program's implementation and advise me of any problems that affect
the scientific data bases of the Agency.
Anne M. Gorsuch
Attachment
cc: Deputy Administrator
Office Directors
1-10
-------
EPA-600/9-76-005
January 1976
QUALITY ASSURANCE HANDBOOK
FOR
AIR POLLUTION MEASUREMENT SYSTEMS
Volume I — Principles
U.S. ENVIRONMENTAL PROTECTION AGENCY
Office of Research and Development
Environmental Monitoring and Support Laboratory
Research Triangle Park, North Carolina 27711
1-11
-------
Section No. 1.3
Revision No. 1
Date January 9, 1984
Page 1 of 2
1.3 DEFINITION OF QUALITY ASSURANCE1-6
Quality assurance and quality control have been defined and
interpreted in many ways. Some authoritative sources differen-
tiate between the two terms by stating that quality control is
"the operational techniques and the activities which sustain a
quality of product or service (in this case, good quality data)
that meets the needs; also the use of such techniques and
activities," whereas quality assurance is "all those planned or
systematic actions necessary to provide adequate confidence that
a product or service will satisfy given needs."1
Quality control may also be understood as "internal quality
control;" namely, routine checks included in normal internal
procedures; for example, periodic calibrations, duplicate
checks, split samples, and spiked samples. Quality assurance
may also be viewed as "external quality control," those activi-
ties that are performed on a more occasional basis, usually by a
person outside of the normal routine operations; for example,
on-site system surveys, independent performance audits, inter-
laboratory comparisons, and periodic evaluation of internal
quality control data. In this Handbook, the term quality assur-
ance is used collectively to include all of the above meanings
of both quality assurance and quality control.
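An internal quality control check of the kind listed above (a periodic calibration check) can be sketched roughly as follows; the control limit and gas values here are hypothetical placeholders, not regulatory criteria:

```python
# Illustrative internal QC check: flag analyzer span-calibration drift.
# The 5% control limit and concentrations below are made-up placeholders.

def drift_check(response: float, target: float, limit_pct: float) -> bool:
    """Return True if the check passes (drift within the control limit)."""
    drift_pct = 100.0 * (response - target) / target
    return abs(drift_pct) <= limit_pct

# Span check: analyzer reads 0.437 ppm against a 0.450 ppm span gas,
# with a hypothetical 5% control limit (about -2.9% drift, so it passes).
print(drift_check(0.437, 0.450, limit_pct=5.0))  # True
```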
While the objective of EPA's air programs is to improve the
quality of the air, the objective of quality assurance for air
programs is to improve or assure the quality of measured data,
such as pollutant concentrations, meteorological measurements,
and stack variables (e.g., gas velocity and mass emissions).
Thus the "product" with which quality assurance is concerned is
data.
Since air pollution measurements are made by numerous
agencies and private organizations at a large number of field
stations and laboratories, quality assurance is also concerned
1-12
-------
Section No. 1.3
Revision No. 1
Date January 9, 1984
Page 2 of 2
with establishing and assessing comparability of data quality
among organizations contributing to data bases.
1.3.1 REFERENCES
1. Juran, J. M. Quality Control Handbook, 3rd Ed. McGraw-
Hill, 1974. Section 2.
2. ASTM. Designation E548-79, "Recommended Practice for
Generic Criteria for Use in the Evaluation of Testing and
Inspection Agencies."
3. ANSI/ASQC. Standard A3-1978. "Quality Systems Terminolo-
gy."
4. ANSI/ASQC. Standard Z1.15-1980. "Generic Guidelines for
Quality Systems."
5. Feigenbaum, A. V. Total Quality Control, Engineering and
Management. McGraw-Hill, 1961.
6. Canadian Standards Association. CSA Standard Z299.1-1978.
Quality Assurance Program Requirements.
1-13
-------
Section No. 1.4
Revision No. 1
Date January 9, 1984
Page 1 of 3
1.4 ELEMENTS OF QUALITY ASSURANCE
A quality assurance program for air pollution measurement
systems should cover a number of areas or elements. These
elements are shown in Figure 1.4.1 in a "Quality Assurance
Wheel." The wheel arrangement illustrates the need for a
quality assurance system that addresses all elements and at the
same time allows program managers the flexibility to emphasize
those elements that are most applicable to their particular
program. Quality assurance elements are grouped on the wheel
according to the organization level to which responsibility is
normally assigned. These organizational levels are the quality
assurance coordinator (normally a staff function), supervisor
(a line function), and the operator. Together the supervisor
and quality assurance coordinator must see that all these
elements form a complete and integrated system and are working
to achieve the desired program objectives.
The three-digit numbers shown on the wheel show the loca-
tion in Section 1.4 where a description of the element is pro-
vided. Each element is described in three subsections as fol-
lows:
1. ABSTRACT - A brief summary that allows the program
manager to review the section at a glance.
2. DISCUSSION - Detailed text that expands on items
summarized in the ABSTRACT.
3. REFERENCES - List of resource documents used in prepa-
ration of the discussion. In addition, where applicable, a list
of resource documents for recommended reading is shown under
BIBLIOGRAPHY.
The DISCUSSION subsection is designed to be relatively
brief. In those cases where a topic would require considerable
detailed discussion, the reader is referred to the appropriate
1-14
-------
Section No. 1.4
Revision No. 1
Date January 9, 1984
Page 2 of 3
[Quality assurance wheel graphic: elements grouped by responsibility among
the operator, the operator and supervisor, and the supervisor and quality
assurance coordinator; legible labels include Procurement Quality Control
(1.4.11) and Statistical Analysis of Data (1.4.18).]
Figure 1.4.1. Quality assurance elements and responsibilities
(the quality assurance wheel).
1-15
-------
Section No. 1.4
Revision No. 1
Date January 9, 1984
Page 3 of 3
APPENDIX. A case in point is Section 1.4.18 on Statistical
Analysis of Data. In this section the statistical methods are
briefly summarized. For more details on the methods the reader
is referred to the appropriate APPENDICES. For example, for
statistical treatment of audit data the reader is referred to
Appendix G.
1-16
-------
[Quality assurance wheel graphic (see Figure 1.4.1), with labels for
Procurement Quality Control and Statistical Analysis of Data.]
1-17
-------
Section No. 1.4.1
Revision No. 1
Date January 9, 1984
Page 1 of 3
1.4.1 DOCUMENT CONTROL AND REVISIONS
1.4.1.1 ABSTRACT
A quality assurance program should include a system for
documenting operating procedures and subsequent revisions. The
system used for this Handbook is described and is recommended.
1.4.1.2 DISCUSSION
A quality assurance program should include a system for
updating the formal documentation of operating procedures. The
suggested system is the one used in this Handbook and described
herein. This system uses a standardized indexing format and
provides for convenient replacement of pages that may be changed
within the technical procedure descriptions.
The indexing format includes, at the top of each page, the
following information:
Section No.
Date
Page
A digital numbering system identifies sections within the text.
The "Section No." at the top of each page identifies major
three-digit or two-digit sections, where applicable. Almost all
of the references in the text are to the section number, which
can be found easily by scanning the top of the pages. Refer-
ences to subsections are used within a section. For example,
Section 1.4.4 represents "Quality Planning" and Section 1.4.5
represents "Training." "Date" represents the date of the latest
revision. "Page No." is the specific page in the section. The
total number of pages in the section is shown in the "Table of
Contents." An example of the page label for the first page of
"Quality Planning" in Section 1.4.4 follows:
1-18
-------
Section No. 1.4.1
Revision No. 1
Date January 9, 1984
Page 2 of 3

Section No. 1.4.4
Date January 9, 1984
Page 1
For each new three-digit level, the text begins on a new page.
This format groups the pages together to allow convenient revi-
sion by three-digit section.
The Table of Contents follows the same structure as the
text. It contains a space for total number of pages within each
section. This allows the Handbook user to know how many pages
are supposed to be in each section. When a revision to the text
is made, the Table of Contents page must be updated. For exam-
ple, the Table of Contents page detailing Section 1.4 might
appear as follows:
Pages Date
1.4.1 Document Control and Revisions 5 1-9-84
1.4.2 Quality Assurance Policy and 4 1-9-84
Objectives
1.4.3 Organization 7 1-9-84
A revision to "Organization" would change the Table of Contents
to appear as follows:
Pages Date
1.4.1 Document Control and Revisions 5 1-9-84
1.4.2 Quality Assurance Policy and 4 1-9-84
Objectives
1.4.3 Organization 9 6-2-88
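The Table of Contents record described above can be sketched as a small data structure, assuming (as an illustration, not the Handbook's actual mechanism) that each three-digit section carries a page count and a revision date:

```python
# Minimal sketch of a document-control record for the Table of Contents:
# each three-digit section tracks its page count and latest revision date.
# Titles, counts, and dates are taken from the examples in the text.

toc = {
    "1.4.1": {"title": "Document Control and Revisions", "pages": 5, "date": "1-9-84"},
    "1.4.2": {"title": "Quality Assurance Policy and Objectives", "pages": 4, "date": "1-9-84"},
    "1.4.3": {"title": "Organization", "pages": 7, "date": "1-9-84"},
}

def revise(section: str, pages: int, date: str) -> None:
    """Record a revision: replace the section's page count and date."""
    toc[section]["pages"] = pages
    toc[section]["date"] = date

# The revision to "Organization" shown in the second example table:
revise("1.4.3", pages=9, date="6-2-88")

for num, entry in toc.items():
    print(f'{num} {entry["title"]:<42} {entry["pages"]:>3} {entry["date"]}')
```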
A Handbook distribution record has been established and
will be maintained up to date so that future versions of exist-
ing Handbook sections and the addition of new sections may be
distributed to Handbook users. In order to enter the user's
name and address in the distribution record system, the "Distri-
bution Record Card" in the front of Volume I of this Handbook
must be filled out and mailed to the EPA address shown. (Note:
1-19
-------
Section No. 1.4.1
Revision No. 1
Date January 9, 1984
Page 3 of 3
A separate card must be filled out for each volume of the Hand-
book). Any future change in name and/or address should be sent
to the following:
U.S. Environmental Protection Agency
ORD Publications
26 West St. Clair Street
Cincinnati, Ohio 45268
Attn: Distribution Record System
Changes may be made by the issuance of (1) an entirely new
document or (2) replacement of complete sections. The recipient
of these changes should remove and destroy all revised sections
from his/her copy.
The document control system described herein applies to
this Handbook and it can be used, with minor revisions, to
maintain control of quality assurance procedures developed by
users of this Handbook and quality assurance coordinators. The
most important elements of the quality assurance program to
which document control should be applied include:
1. Sampling procedures.
2. Calibration procedures.
3. Analytical procedures.
4. Data analysis, validation, and reporting procedures.
5. Performance and system audit procedures.
6. Preventive maintenance.
7. Quality assurance program plans.
8. Quality assurance project plans.
1-20
-------
[Divider graphic: elements of quality assurance, including
procurement quality control and statistical analysis of data]
1-21
-------
Section No. 1.4.2
Revision No. 1
Date January 9, 1984
Page 1 of 4
1.4.2 QUALITY ASSURANCE POLICY AND OBJECTIVES
1.4.2.1 ABSTRACT
1. Each organization should have a written quality assur-
ance policy that should be made known to all organization
personnel.
2. The objectives of quality assurance are to produce
data that meet the users' requirements, measured in terms of
completeness, precision, accuracy, representativeness, and
comparability, while at the same time reducing quality costs.
1.4.2.2 DISCUSSION
Quality assurance policy - Each organization should have a
written quality assurance policy. This policy should be distri-
buted so that all organization personnel know the policy and
scope of coverage.
Quality assurance objectives1,2,3 - To administer a quality
assurance program, the objectives of the program must be de-
fined, documented, and issued to all involved in activities that
affect the quality of the data. Such written objectives are
needed because they:
1. Unify the thinking of those concerned with quality
assurance.
2. Stimulate effective action.
3. Are a necessary prerequisite to an integrated, planned
course of action.
4. Permit comparison of completed performances against
stated objectives.
Data can be considered to be complete if a prescribed per-
centage of the total possible measurements is present. Preci-
sion and accuracy (bias) represent measures of the data quality.
Data must be representative of the condition being measured.
1-22
-------
Section No. 1.4.2
Revision No. 1
Date January 9, 1984
Page 2 of 4
Ambient air sampling at midnight is not representative of carbon
monoxide levels during rush hour traffic. Stationary source
emission measurements are not representative if measured at
reduced load production conditions when usual operation is at
full load. Data available from numerous agencies and private
organizations should be in consistent units and should be cor-
rected to the same standard conditions of temperature and pres-
sure to allow comparability of data among groups.
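As an illustration of the correction to standard conditions described above, the sketch below applies the ideal gas law to re-express a mass-per-volume concentration at reference temperature and pressure. This is not a Handbook procedure; the function name and the defaults of 25 °C (298.15 K) and 760 mmHg are illustrative assumptions.

```python
def to_standard_conditions(conc, temp_k, press_mmhg,
                           std_temp_k=298.15, std_press_mmhg=760.0):
    """Re-express a mass-per-volume concentration (e.g., ug/m3)
    measured at site temperature and pressure at standard
    conditions.  The sampled mass is unchanged; only the air
    volume is rescaled via the ideal gas law."""
    return conc * (std_press_mmhg / press_mmhg) * (temp_k / std_temp_k)

# At reference conditions the correction factor is 1; at reduced
# pressure (higher altitude) the corrected concentration is larger.
corrected = to_standard_conditions(100.0, 298.15, 630.0)
```

Because only the air volume is rescaled, two agencies reporting in the same units at the same reference conditions can compare values directly.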
Figure 1.4.2.1 shows three examples of data quality with
varying degrees of precision and bias. These examples hypothe-
size a true value that would result if a perfect measurement
procedure were available and an infinitely large number of
measurements could be made under specified conditions. If the
average value coincides with the true value (reference stan-
dard), then the measurements are not biased. If the measurement
values also are closely clustered about the true value, the
measurements are both precise and unbiased. Figure 1.4.2.2
shows an example of completeness of data.
Each laboratory should have quantitative objectives set
forth for each monitoring system in terms of completeness,
precision, and bias of data. An example is included below for
continuous measurement of carbon monoxide (nondispersive in-
frared spectrometry) to illustrate the point.
1. Completeness - For continuous measurements, 75 percent
or more of the total possible number of observations must be
present.4
2. Precision - Determined with calibration gases, preci-
sion is ±0.5 percent full scale in the 0 through 58 mg/m3
range.5,6
3. Accuracy - Depends on instrument linearity and the
absolute concentrations of the calibration gases. An accuracy
of ±1 percent of full scale in the 0 through 58 mg/m3 range can
be obtained.5,6
1-23
-------
Section No. 1.4.2
Revision No. 1
Date January 9, 1984
Page 3 of 4
[Figure: three normal distribution curves comparing the measured
average and the true value of concentration, with precision (sigma)
and bias labeled]
(a) Example of positively biased but precise measurements
(b) Example of unbiased but imprecise measurements
(c) Example of precise and unbiased measurements
Figure 1.4.2.1. Examples of data with varying degrees of precision
and bias (normal distribution assumed).
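The distinction drawn in Figure 1.4.2.1 can be expressed numerically: bias is the difference between the measured average and the true (reference) value, and precision is the spread of repeated measurements about their own average. A minimal sketch, not part of the Handbook, with illustrative data:

```python
import statistics

def bias_and_precision(measurements, true_value):
    """Estimate bias (measured average minus true value) and
    precision (sample standard deviation) from repeated
    measurements of a reference standard."""
    avg = statistics.mean(measurements)
    return avg - true_value, statistics.stdev(measurements)

# Positively biased but precise: readings cluster tightly above
# the true value of 50.0 (bias is about +2.04, precision about 0.11).
bias, precision = bias_and_precision([52.1, 52.0, 52.2, 51.9, 52.0], 50.0)
```

Measurements with a small standard deviation and a near-zero bias correspond to the "precise and unbiased" case in the figure.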
1-24
-------
Section No. 1.4.2
Revision No. 1
Date January 9, 1984
Page 4 of 4
[Figure: timeline of system operation across sampling periods 0
through 35, showing uptime (U) and downtime (D) intervals caused by
diagnostics and maintenance and by measurement system malfunctions]
Figure 1.4.2.2. Example illustrating a measure of completeness
of data, U/(D + U).
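The completeness measure U/(D + U) illustrated in Figure 1.4.2.2 can be computed directly from accumulated uptime and downtime. A brief sketch; the function name and example values are illustrative only:

```python
def completeness(uptime, downtime):
    """Fraction of the total possible sampling time during which
    valid data were collected: U / (D + U)."""
    total = uptime + downtime
    if total == 0:
        raise ValueError("no sampling time recorded")
    return uptime / total

# 30 valid sampling periods out of 35 possible satisfies the
# 75 percent completeness objective for continuous measurements.
frac = completeness(uptime=30, downtime=5)
```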
For further discussion of completeness, precision, accuracy,
and comparability, see the following:
1. Completeness and comparability, Section 1.4.17 of this
volume.
2. Precision and accuracy, Appendix G of this volume.
Employment of the elements of quality assurance discussed
in Section 1.4 should lead to the production of data that are
complete, accurate, precise, representative, and comparable.
1.4.2.3 REFERENCES
1. Juran, J. M., (ed.). Quality Control Handbook. 3rd Ed.
McGraw-Hill, New York, 1974. Sec. 2, pp. 4-8.
2. Feigenbaum, A. V. Total Quality Control. McGraw-Hill, New
York, 1961. pp. 20-21.
3. Juran, J. M., and Gryna, F. M. Quality Planning and Ana-
lysis. McGraw-Hill, New York, 1970. pp. 375-377.
4. Nehls, G. J., and Akland, G. G. Procedures for Handling
Aerometric Data. Journal of the Air Pollution Control
Association, 23(3):180-184, March 1973.
5. Appendix A - Quality Assurance Requirements for State and
Local Air Monitoring Stations (SLAMS), Federal Register,
Vol. 44, Number 92, May 10, 1979.
6. Appendix B - Quality Assurance Requirements for Prevention
of Significant Deterioration (PSD) Air Monitoring, Federal
Register, Vol. 44, Number 92, May 10, 1979.
1-25
-------
1-26
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 1 of 7
1.4.3 ORGANIZATION
1.4.3.1 ABSTRACT
1. Organizing a quality assurance function includes
establishing objectives, determining the amount of emphasis to
place on each quality assurance activity, identifying quality
assurance problems to be resolved, preparing a quality assurance
program and/or project plan, and implementing the plan.
2. The overall responsibility for quality assurance is
normally assigned to a separate individual or group in the
organization.
3. Quality assurance has input into many functions of an
air pollution control agency. (See Figure 1.4.3.2 for details.)
4. The basic organizational tools for quality assurance
implementation are:
a. Organization chart and responsibilities.
b. Job descriptions. (See Figure 1.4.3.3 for the job
description for the Quality Assurance Coordinator.)
c. Quality assurance plan.
1.4.3.2 DISCUSSION
Organizing the quality assurance function1 - Because of the
differences in size, workloads, expertise, and experience in
quality assurance activities among agencies adopting the use of
a quality assurance system, it is useful here to outline the
steps for planning an efficient quality assurance system.
1. Establish quality assurance objectives (precision,
accuracy, and completeness) for each measurement system (Section
1.4.2).
2. Determine the quality assurance elements appropriate
for the agency (Figure 1.4.1).
1-27
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 2 of 7
3. Prepare quality assurance project plans for all mea-
surement projects (Section 1.4.23).
4. Identify the quality assurance problems which must be
resolved on the basis of the quality assurance project plan.
5. Implement the quality assurance project plan.
Location of the responsibility for quality assurance in the
organization2 - If practical, one individual within an organiza-
tion should be designated the Quality Assurance (QA) Coordina-
tor. The QA Coordinator should have the responsibility for
coordinating all quality assurance activity so that complete
integration of the quality assurance system is achieved. The QA
Coordinator could also undertake specific activities such as
quality planning and auditing. The QA Coordinator should,
therefore, gain the cooperation of other responsible heads of
the organization with regard to quality assurance matters.
As a general rule, it is not good practice for the quality
assurance responsibility to be directly located in the organiza-
tion responsible for conducting measurement programs. This
arrangement could be workable, however, if the person in charge
maintains an objective viewpoint.
Relationship of the quality assurance function to other
functions - The functions performed by a comprehensive air
pollution control program at the state or local level are shown
in Figure 1.4.3.1.3 The relationship of the quality assurance
function to the other agency functions is shown in Figure
1.4.3.2. The role of quality assurance can be grouped into two
categories:
1. Recommend quality assurance policy and assist its
formulation with regard to agency policy, administrative support
(contracts and procurements), and staff training.
2. Provide quality assurance guidance and assistance for
monitoring networks, laboratory operations, data reduction and
validation, instrument maintenance and calibration, litigation,
source testing, and promulgation of control regulations.
1-28
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 3 of 7
Management Services
0 Agency policy
0 Administrative and clerical support
0 Public information and community relations
0 Intergovernmental relations
0 Legal counsel
0 Systems analysis, development of strategies, long-range planning
0 Staff training and development
Technical Services
0 Laboratory operations
0 Operation of monitoring network
0 Data reduction
0 Special field studies
0 Instrument maintenance and calibration
Field Enforcement Services
0 Scheduled inspections
0 Complaint handling
0 Operation of field patrol
0 Preparation for legal actions
0 Enforcement of emergency episode procedures
0 Source identification and registration
Engineering Services
0 Calculation of emission estimates
0 Operation of permit system
0 Source emission testing
0 Technical development of control regulations
0 Preparation of technical reports, guides, and criteria on control
0 Design and review of industrial emergency episode procedures
Figure 1.4.3.1. List of functions performed by comprehensive air
pollution control programs.
1-29
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 4 of 7
Management Services
Quality assurance
Agency policy
[Matrix columns relating quality assurance to each function; the
rotated column headings are not legible in this reproduction.]
Technical Services
Laboratory operations
Operation of monitoring network
Data reduction
Special field studies
Instrument maintenance and calibration
Field Enforcement Services
Scheduled inspections
Complaint handling
Operation of field patrol
Preparation for legal actions
Enforcement of emergency episode procedures
Source identification and registration
Engineering Services
Calculation of emission estimates
Operation of permit system
Source emission testing
Technical development of control regulations
Preparation of technical reports, guides, and
criteria on control
Design and review of industrial emergency episode
procedures
Figure 1.4.3.2.
Relationship of the quality assurance function to other
air pollution control program functions.
1-30
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 5 of 7
Basic organizational tools for quality assurance implemen-
tation are:
1. The organization chart4 - The quality assurance orga-
nization chart should display line and staff relationships, and
lines of authority and responsibility. The lines of authority
and responsibility, flowing from the top to bottom, are usually
solid, while staff advisory relationships are depicted by dashed
lines.
2. The job description5 - The job description lists the
responsibilities, duties, and authorities of the job and rela-
tionships to other positions, individuals, or groups. A sample
job description for a Quality Assurance Coordinator is shown in
Figure 1.4.3.3.
3. The quality assurance plan - To implement quality
assurance in a logical manner and identify problem areas, a
quality assurance program plan and a quality assurance project
plan are needed. For details on preparation of quality assur-
ance program and project plans, see Sections 1.4.22 and 1.4.23,
respectively.
1-31
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 6 of 7
TITLE: Quality Assurance Coordinator
Basic Function
The Quality Assurance Coordinator is responsible for the conduct of the
quality assurance program and for taking or recommending measures.
Responsibilities and Authority
1. Develops and carries out quality control programs, including statisti-
cal procedures and techniques, which will help agencies meet authorized
quality standards at minimum cost.
2. Monitors quality assurance activities of the agency to determine con-
formance with policy and procedures and with sound practice; and makes
appropriate recommendations for correction and improvement as may be
necessary.
3. Seeks out and evaluates new ideas and current developments in the field
of quality assurance and recommends means for their application
wherever advisable.
4. Advises management in reviewing technology, methods, and equipment,
with respect to quality assurance aspects.
5. Coordinates schedules for measurement system functional check calibra-
tions, and other checking procedures.
6. Coordinates schedules for performance and system audits and reviews re-
sults of audits.
7. Evaluates data quality and maintains records on related quality control
charts, calibration records, and other pertinent information.
8. Coordinates and/or conducts quality-problem investigations.
Figure 1.4.3.3. Job description for the Quality Assurance Coordinator.
1-32
-------
Section No. 1.4.3
Revision No. 1
Date January 9, 1984
Page 7 of 7
1.4.3.3 REFERENCES
1. Feigenbaum, A.V. Total Quality Control. McGraw-Hill, New
York. 1961. Chapter 4, pp. 43-82.
2. Covino, C.P., and Meghri, A.W. Quality Assurance Manual.
Industrial Press, Inc., New York. 1967. Step 1, pp. 1-2.
3. Walsh, G.W., and von Lehmden, D.J. Estimating Manpower
Needs of Air Pollution Control Agencies. Presented at the
Annual Meeting of the Air Pollution Control Association,
Paper 70-92, June 1970.
4. Juran, J.M., (ed.). Quality Control Handbook, 3rd Edition.
McGraw-Hill, New York. 1974.
5. Industrial Hygiene Service Laboratory Quality Control
Manual. Technical Report No. 78, National Institute for
Occupational Safety and Health, Cincinnati, Ohio. 1974.
BIBLIOGRAPHY
1. Brown, F.R. Management: Concepts and Practice. Indus-
trial College of the Armed Forces, Washington, D.C. 1967.
Chapter II, pp. 13-34.
1-33
-------
1-34
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 1 of 8
1.4.5 TRAINING
1.4.5.1 ABSTRACT
All personnel involved in any function affecting data
quality (sample collection, analysis, data reduction, and qual-
ity assurance) should have sufficient training in their appoint-
ed jobs to contribute to the reporting of complete and high
quality data. The first responsibility for training rests with
organizational management, program and project managers. In
addition, the QA coordinator should recommend to management that
appropriate training be available.
The training methods commonly used in the air pollution
control field are the following:
1. On-the-job training (OJT).
2. Short-term course training (including self-instruction
courses). A list of recommended short-term training courses is
in Figure 1.4.5.1.
3. Long-term training (quarter or semester in length).
Training should be evaluated in terms of the trainee and
the training per se. The following are techniques commonly used
in the air pollution control field to evaluate training.
1. Testing (pretraining and posttraining tests).
2. Proficiency checks.
3. Interviews (written or oral with the trainee's super-
visor and/or trainee).
1.4.5.2 DISCUSSION
All personnel involved in any function affecting data
quality should have sufficient training in their appointed jobs
to contribute to the reporting of complete and high quality
data. The first responsibility for training rests with organi-
zational management, program and project managers. In addition,
1-35
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 2 of 8
Course number and title                                        Days/h  Contact

Quality Assurance/Quality Control Training
470  Quality Assurance for Air Pollution Measurement Systems     4     APTI(a)
556  Evaluation and Treatment of Outlier Data                    3     NIOSH
587  Industrial Hygiene Laboratory Quality Control               5     NIOSH
597  How to Write a Laboratory Quality Control Manual            3     NIOSH
     Quality Management                                          5     UCC(c)
9104 Quality Engineering                                         5     ETI(d)
9108 Quality Audit - Development and Administration              3     ETI
9101 Managing for Quality                                        5     ETI
9114 Probability and Statistics for Engineers and Scientists     5     ETI
9113 Managing Quality Costs                                      3     ETI
514Y Practical Application of Statistics to Quality Control      3     SAMI(e)
210Y Quality Management                                          5     SAMI
215Y Managing Quality Costs                                      3     SAMI
138Y Quality Program - Preparation and Audit                     5     SAMI
919Y Software Quality Assurance                                  4     SAMI
284  Operating Techniques for Standards and Calibration          5     GWU(f)
641  Software Quality Assurance                                  3     GWU
     Effective Quality Control Management                        4     CPA(g)
     Corporate Quality Assurance                                 3     MCQR(h)

Air Pollution Measurement Method Training
413  Control of Particulate Emissions                            4     APTI
415  Control of Gaseous Emissions                                4     APTI
420  Air Pollution Microscopy                                    4.5   APTI
427  Combustion Evaluation                                       5     APTI
435  Atmospheric Sampling                                        4.5   APTI
444  Air Pollution Field Enforcement                             3.5   APTI
450  Source Sampling for Particulate Pollutants                  4.5   APTI
464  Analytical Methods for Air Quality Standards                5     APTI

Figure 1.4.5.1. Selected quality assurance and air pollution training
available in 1984. (continued)
1-36
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 3 of 8

Course number and title                                        Days/h  Contact

Air Pollution Measurement Method Training (continued)
468  Source Sampling and Analysis of Gaseous Pollutants          4     APTI
474  Continuous Emission Monitoring                              5     APTI

Air Pollution Measurement Systems Training
411  Air Pollution Dispersion Models: Fundamental Concepts       4.5   APTI
423  Air Pollution Dispersion Models: Application                4.5   APTI
426  Statistical Evaluation Methods for Air Pollution Data       4.5   APTI
452  Principles and Practice of Air Pollution Control            4.5   APTI
463  Ambient Air Quality Monitoring Systems: Planning
     and Administrative Concepts                                 5     APTI
482  Sources and Control of Volatile Organic Air Pollutants      4     APTI

Self-Instruction, Video-Instruction, and Other Training
406  Effective Stack Height/Plume Rise                          10 h   APTI
422  Air Pollution Control Orientation Course (3rd Edition)     30 h   APTI
448  Diagnosing Vegetation Injury Caused by Air Pollution       30 h   APTI
473  Introduction to Environmental Statistics                   70 h   APTI
472  Aerometric and Emissions Reporting System (AEROS)            -    APTI
475  Comprehensive Data Handling System (CDHS--AQDHS-II,
     EIS/P&R)                                                     -    APTI
409  Basic Air Pollution Meteorology                            25 h   APTI
410  Introduction to Dispersion Modeling                        35 h   APTI
412A Baghouse Plan Review                                       20 h   APTI
414  Quality Assurance for Source Emission Measurements         35 h   APTI
416  Inspection Procedures for Organic Solvent Metal
     Cleaning (Degreasing) Operations                           20 h   APTI
417  Controlling VOC Emissions from Leak Process Equipment      20 h   APTI
424  Receptor Model Training                                    30 h   APTI
431  Introduction to Source Emission Control                    40 h   APTI
434  Introduction to Ambient Air Monitoring                     50 h   APTI

Figure 1.4.5.1 (continued)
1-37
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 4 of 8

Course number and title                                        Days/h  Contact

436  Site Selection for Monitoring of S02 and TSP in
     Ambient Air                                                35 h   APTI
437  Site Selection for Monitoring of Photochemical
     Pollutants and CO in Ambient Air                           35 h   APTI
412B Electrostatic Precipitator Plan Review                     20 h   APTI
412C Wet Scrubbers Plan Review                                    -    APTI(i)
483A Monitoring the Emissions of Organic Compounds
     to the Atmosphere                                            -    APTI(i)
476A Transmissometer Operation and Maintenance                    -    APTI(i)
438  Reference and Automated Equivalent Measurement
     Methods for Ambient Air Monitoring                         30 h   APTI
443  Chain of Custody                                            2 h   APTI
453  Prevention of Significant Deterioration                    15 h   APTI
449  Source Sampling Calculations                                 -    APTI
491A NSPS Metal-Coil Surface Coating                              -    APTI(i)
491B NSPS Surface Coating of Metal Furniture                      -    APTI(i)
491C NSPS Industrial Surface Coating                              -    APTI(i)
491D NSPS Surface Coating Calculations                            -    APTI(i)
428A NSPS Boilers                                                 -    APTI(i)

(a) Additional information may be obtained from: Air Pollution
    Training Institute, MD-20, Environmental Research Center,
    Research Triangle Park, North Carolina 27711, Attention:
    Registrar.
(b) R&R Associates, Post Office Box 46181, Cincinnati, Ohio 45246,
    Attention: Thomas Ratliff.
(c) The University of Connecticut, Storrs, Connecticut 06268.
(d) Education and Training Institute, American Society for Quality
    Control, 161 West Wisconsin Avenue, Milwaukee, Wisconsin 53203.
(e) Stat-A-Matrix Institute, New Brunswick, New Jersey.
(f) George Washington University, Continuing Engineering Education,
    Washington, D.C. 20052.
(g) The Center for Professional Advancement, Post Office Box H,
    East Brunswick, New Jersey 08816.
(h) Paul D. Krensky Associates, Inc., Adams Building, 9 Meriam
    Street, Lexington, MA 02173.
(i) Completion planned by October 1984.

Figure 1.4.5.1 (continued)
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 5 of 8
the QA Coordinator should be concerned that the required train-
ing is available for these personnel and, when it is not, should
recommend to management that appropriate training be made avail-
able.
Training objective1,2 - The training objective should be to
develop personnel to the necessary level of knowledge and skill
required for the efficient selection, maintenance, and operation
of air pollution measurement systems (ambient air and source
emissions).
Training methods and availability - Several methods of
training are available to promote achievement of the desired
level of knowledge and skill required. The following are the
training methods most commonly used in the air pollution control
field; a listing of available training courses for 1984 is given
in Figure 1.4.5.1.
1. On-the-job training (OJT) - An effective OJT program
could consist of the following:
a. Observe experienced operator perform the differ-
ent tasks in the measurement process.
b. Study the written operational procedures for the
method as described in this Handbook (Volume II or III), and use
it as a guide for performing the operations.
c. Perform procedures under the direct supervision
of an experienced operator.
d. Perform procedures independently but with a high
level of quality assurance checks, utilizing the evaluation
technique described later in this section to encourage high
quality work.
2. Short-term course training - A number of short-term
courses (usually 2 weeks or less) are available that provide
knowledge and skills for effective operation of an air pollution
measurement system. Some of the courses are on the measurement
methods per se and others provide training useful in the design
1-39
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 6 of 8
and operation of the total or selected portions of the measure-
ment system. In addition, Figure 1.4.5.1 lists self-instruction
courses and video-tapes available from:
Registrar
Air Pollution Training Institute (MD-20)
U.S. Environmental Protection Agency-
Research Triangle Park, North Carolina 27711
(919) 541-2401
3. Long-term course training - Numerous universities,
colleges, and technical schools provide long-term (quarter and
semester length) academic courses in statistics, analytical
chemistry, and other disciplines. The agency's training or
personnel officer should be contacted for information on the
availability of long-term course training.
Training evaluation - Training should be evaluated in terms
of (1) level of knowledge and skill achieved by the trainee from
the training; and (2) the overall effectiveness of the training,
including determination of training areas that need improvement.
If a quantitative performance rating can be made on the trainee
during the training period (in terms of knowledge and skill
achieved), this rating may also provide an assessment of the
overall effectiveness of the training as well.
Several techniques are available for evaluating the trainee
and the training per se. One or more of these techniques should
be used during the evaluation. The most common types of evalua-
tion techniques applicable to training in air pollution measure-
ment systems are the following:
1. Testing - A written test before (pretest) and one
after (post-test) training are commonly used in short-term
course training. This allows the trainee to see areas of per-
sonal improvement and provides the instructor with information
on training areas that need improvement.
2. Proficiency checks - A good means of measuring skill
improvement in both OJT and short-term course training is to
assign the trainee a work task. Accuracy and/or completeness
1-40
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 7 of 8
are commonly the indicators used to score the trainee's pro-
ficiency. The work tasks could be of the following form:
a. Sample collection - Trainee would be asked to
list all steps involved in sample collection for a hypothetical
case. In addition, the trainee could be asked to perform se-
lected calculations. Proficiency could be judged in terms of
completeness and accuracy.
b. Analysis - Trainee could be provided unknown
samples for analysis. As defined here, an unknown is a sample
whose concentration is known to the work supervisor (OJT) or
training instructor (short-term course training) but unknown to
the trainee. Proficiency could be judged in terms of accuracy
of analysis.
c. Data reduction - Trainees responsible for data
reduction could be provided data sets to validate. Proficiency
could be judged in terms of completeness and accuracy.
If proficiency checks are planned on a recurring basis, a
quality control or other type chart may be used to show progress
during the training period as well as after the training has
been completed. Recurring proficiency checks are a useful
technique for determining if additional training may be re-
quired.
3. Interviews - In some cases, a written or oral inter-
view with the trainee's supervisor and/or trainee is used to
determine if the training was effective. This interview is
normally not conducted until the trainee has returned to the job
and has had an opportunity to use the training. This technique
is most often used to appraise the effectiveness of a training
program (OJT or short-term course) rather than the performance
of the trainee.
1.4.5.3 REFERENCES
1. Feigenbaum, A. V. Total Quality Control. McGraw-Hill, New
York. 1961. pp. 605-615.
1-41
-------
Section No. 1.4.5
Revision No. 1
Date January 9, 1984
Page 8 of 8
2. Feigenbaum, A. V. Company Education in the Quality Prob-
lem. Industrial Quality Control, X(6):24-29, May 1954.
BIBLIOGRAPHY
1. Juran, J. M., (ed.). Quality Control Handbook. 2nd edi-
tion. McGraw-Hill, New York, 1962. Section 7, pp. 13-20.
2. Reynolds, E.A. Industrial Training of Quality Engineers
and Supervisors. Industrial Quality Control, X(6):29-32,
May 1954.
3. Industrial Quality Control, 23(12), June 1967. (All
articles deal with education and training.)
4. Seder, L. A. QC Training for Non-Quality Personnel.
Quality Progress, VII(7):9.
5. Reynolds, E. A. Training QC Engineers and Managers.
Quality Progress, III(4):20-21, April 1970.
1-42
-------
1-43
-------
Section No. 1.4.16
Revision No. 1
Date January 9, 1984
Page 1 of 7
1.4.16 AUDIT PROCEDURES
1.4.16.1 ABSTRACT
1. Performance audits are made to quantitatively evaluate
the quality of data produced by the total measurement system
(sample collection, sample analysis and data processing). The
individuals performing the audit, their standards and equipment
are different from the regular team (operating the measurement
system) and their standards and equipment in order to obtain an
independent assessment. The performance audit is commonly
limited to a portion of the total measurement system (e.g., flow
rate measurement, sample analysis) but may include the entire
measurement system (e.g., continuous ambient air analyzer).
2. A system audit is a qualitative on-site inspection and
review of the total measurement system. The auditor should
have extensive background experience with the measurement system
being audited.
1.4.16.2 DISCUSSION
1.4.16.2.1 Performance Audits - The purposes of performance
audits include:
1. Objective assessment of the accuracy of the data col-
lected by a given measurement system,
2. Identification of sensors out-of-control,
3. Identification of systematic bias of a sensor or of
the monitoring network,
4. Measurement of improvement in data quality based on
data from previous and current audits.
The role of audits in the overall management program is
verification. While audits do not improve data quality if all
work is correctly performed, they do provide assurance that the
work prescribed for the measurement program has been conducted
1-44
-------
Section No. 1.4.16
Revision No. 1
Date January 9, 1984
Page 2 of 7
properly. Audits conducted by individuals not responsible for
the day-to-day operations provide a control and assessment mech-
anism to program managers. A performance audit procedure for
continuous ambient air analyzers is given herein to illustrate
items that must be considered in conducting a performance audit.
1. Select audit materials
a. Use high concentration (10 to 100 ppm) audit
cylinder gas in conjunction with a dilution system. Advantage--
better gas stability at high concentration; disadvantage--dilu-
tion system calibration errors are possible.
b. Use low concentration (<1 ppm except for CO)
audit cylinder gas. Advantage--no dilution system needed; dis-
advantages--probability of gas instability and thus inaccurate
concentration, and the number of cylinders required.
c. Use permeation tubes. Advantage--better sta-
bility than low concentration cylinder gas; disadvantages--the
permeation rate, which is temperature dependent, must stabilize
before the audit is performed, and dilution system calibration
errors are possible.
d. Use materials traceable to NBS-SRM or com-
mercial CRM if possible.
e. Table 1.4.16.1 lists the primary standards appli-
cable to ambient audit equipment calibration. The list is not
all inclusive but includes the standards of high accuracy that
will fulfill the traceability requirements.
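The trade-off in option (a) above can be made concrete with the basic dilution relation: the delivered concentration is the cylinder concentration scaled by the ratio of cylinder flow to total flow, so an error in either flow calibration propagates directly into the audit concentration. A sketch under these assumptions, not a Handbook procedure:

```python
def diluted_concentration(cyl_conc_ppm, cyl_flow, dilution_flow):
    """Concentration delivered by blending a high-concentration
    audit cylinder with zero air: C = C_cyl * Q_cyl / (Q_cyl + Q_dil).
    Both flows must be in the same units (e.g., L/min)."""
    total_flow = cyl_flow + dilution_flow
    if total_flow <= 0:
        raise ValueError("total flow must be positive")
    return cyl_conc_ppm * cyl_flow / total_flow

# A 50 ppm cylinder metered at 0.1 L/min into 9.9 L/min of zero
# air delivers a 0.5 ppm audit concentration.
audit_conc = diluted_concentration(50.0, 0.1, 9.9)
```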
2. Select audit concentration levels - As a minimum, use
a low scale and a high scale point in order to check the ana-
lyzer's linearity, and use a third point near the site's ex-
pected concentration level. Audit concentration levels are
specified in 40 CFR Part 58, Appendices A and B for a minimum QA
program.1,2
3. Determine the auditor's proficiency - The auditor must analyze the audit materials (including verification of their stability), and his results must be compared with the known values before he performs an audit.
TABLE 1.4.16.1. PRIMARY STANDARDS

Parameter    Range           Usable standard               Primary standard
Flow rate    0-3 L/min       Soap bubble flow kit          NBS-traceable flow kit or
                                                           gravimetrically calibrated
                                                           flow tubes
Flow rate    0.5-3 L/min     1 L/revolution wet test       Primary standard spirometer
                             meter; 3 L/revolution
                             wet test meter
Flow rate    0.1-2.5 m3/min  Positive displacement         Roots meter
                             Roots meter
Time         0-5 minutes     Stopwatch                     NBS-time
SO2          0-0.5 ppm       Permeation tube               NBS-SRM 1626
SO2          50-90 ppm       Cylinder gas (SO2/N2)         NBS-SRM 1693, 1694, or
                                                           commercial CRM
NO-NO2-NOx   0-0.5 ppm       NO2 permeation tube           NBS-SRM 1629
NO-NO2-NOx   50 ppm          NO cylinder gas (NO/N2/GPT)   NBS-SRM 1683 or
                                                           commercial CRM
O3           0-1.0 ppm       O3 generator/UV photometer    Standard laboratory
                                                           photometer
CO           10-100 ppm      Cylinder gas (CO/N2 or        NBS-SRM 1677, 1678, 1679,
                             CO/air)                       2635, 2612, 2613, 2614, or
                                                           commercial CRM

Note: Descriptions of NBS-SRM are shown in Figure 1.4.12.3. A list of currently available CRM may be obtained from EPA at the address shown in Section 1.4.12.
4. Select analyzer out-of-control limits - Select the maximum allowable difference between the analyzer and auditor results. For gaseous analyzers, limits of 10 to 20% are commonly used.
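The comparison implied by this step reduces to a percent-difference calculation against each known audit value. A minimal sketch, with a 15% limit chosen from the 10 to 20% range quoted above; the function names and readings are hypothetical:

```python
def percent_difference(analyzer_ppm, audit_ppm):
    """Analyzer response relative to the known audit concentration, in %."""
    return (analyzer_ppm - audit_ppm) / audit_ppm * 100.0

def out_of_control(analyzer_ppm, audit_ppm, limit_pct=15.0):
    """True if the absolute difference exceeds the allowable limit."""
    return abs(percent_difference(analyzer_ppm, audit_ppm)) > limit_pct

# Three audit levels (ppm): known audit value vs. analyzer response.
for audit, analyzer in [(0.08, 0.075), (0.25, 0.29), (0.40, 0.41)]:
    d = percent_difference(analyzer, audit)
    flag = "OUT" if out_of_control(analyzer, audit) else "ok"
    print(f"{audit:.3f} ppm: {d:+.1f}% {flag}")
```

In this hypothetical run only the mid-scale point (+16%) exceeds the 15% limit.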
5. Conduct the audit in the field
a. Record site data (address, operating organization, type of analyzer being audited, zero and span pot settings, type of in-station calibration used, and general operating procedures).
b. Mark the data recording, identifying the time interval in which the audit was performed. A data stamp may be used to document the station data system. This ensures that recorder traces cannot be switched in future reference.
c. Have the station operator make the necessary notations on the data acquisition system before disconnecting a monitor or sampler from the normal sampling mode. Initiate the audit. Audit techniques are listed in Table 1.4.16.2.
d. Have the station operator convert all station data to engineering units (ppm, m3/min, etc.) in the same manner that actual data are handled.
e. Record all pertinent data in an orderly fashion on field data forms.
f. Return all equipment to the normal sampling mode upon completion of the audit, so that no data are lost.
g. Make data computations and comparisons before vacating the test site. This ensures that no extraneous or inconsistent differences are discovered after vacating the test site; it is often impossible to rectify a difference after leaving. Hence, calculations and comparisons made in the field are cost effective. Verbally relate as much information as possible to the analyzer operator immediately after the audit.
6. Verify the audit material stability after the audit
(e.g., reanalysis of audit material).
TABLE 1.4.16.2. AUDIT TECHNIQUES

Pollutant/      Audit                       Audit                  Traceability to
parameter       technique                   standard               primary standard
SO2             Dynamic dilution of a       50 ppm SO2 in air      NBS-SRM 50 ppm SO2/N2
                stock cylinder              or N2                  standard
SO2             Dynamic dilution of a       Permeation tube        NBS-SRM permeation tube
                permeation tube
CO              Dynamic dilution of a       900 ppm CO in air      NBS-SRM 1000 ppm CO/N2
                stock cylinder              or N2                  standard
CO              Separate cylinders          5, 20, 45 ppm CO in    NBS-SRM 50 ppm CO/N2
                                            air or N2 cylinders    standard
NO-NOx-NO2      Dynamic dilution/gas        50 ppm NO/N2 with      NBS-SRM 50 ppm NO/N2
                phase titration             0.5 ppm NO2 impurity
NO-NOx-NO2      Dynamic dilution of stock   50 ppm NO/N2           NBS-SRM 50 ppm NO/N2
                cylinder/dynamic            cylinder; NO2          cylinder; NBS NO2
                permeation dilution         permeation tube        permeation tube
O3              O3 generation with          Standard photometer    Standard laboratory-
                verification by UV                                 maintained UV photometer
                photometry
TSP flow rate   Simultaneous flow rate      ReF device             Primary standard Roots
                comparison                                         meter system
7. Prepare audit report - Prepare a written report and mail it to the pertinent personnel. It should include:
a. Assessment of the accuracy of the data collected by the audited measurement system,
b. Identification of sensors out of control,
c. Identification of monitoring network bias, and
d. Measurement of improvement in data quality since the previous audit(s).
8. Corrective action - Determine whether corrective actions are implemented.
Detailed guidance to State and local agencies on how to conduct performance audits of ambient air measurement systems is given in Section 2.0.12 of Volume II of this Handbook.
System Audit - Detailed guidance to State and local agencies for conducting a system audit of an ambient air monitoring program is given in Section 2.0.11 of Volume II of this Handbook. Data forms are provided as an aid to the auditor. These forms should be submitted to the agency being evaluated 4 to 6 weeks prior to the on-site system audit. This allows the agency to locate and enter detailed information (often not immediately accessible) required by the forms. When the completed forms are returned, they should be reviewed, and the auditor should prepare a list of specific questions he would like to discuss with the agency. An entrance interview date should be arranged to discuss these questions.
The next step is the systems audit. A convenient method is
to trace the ambient data from the field measurement through the
submittal to EPA, noting each step in the process, documenting
the written procedures that are available and followed, and
noting the calibration and quality control standards that are
used.
After the auditor collects the information, an exit inter-
view is conducted to explain the findings of the evaluation to
the agency representatives. A written report is then prepared
as soon as possible to summarize the results of the audit.
Guidance on how to evaluate the capabilities of a source emission test team is given in Reference 3. Data forms are included as an aid to the auditor.
1.4.16.3 REFERENCES
1. Appendix A - Quality Assurance Requirements for State and Local Air Monitoring Stations (SLAMS), Federal Register, Vol. 44, Number 92, May 1979, pp. 27574-27582.
2. Appendix B - Quality Assurance Requirements for Prevention of Significant Deterioration (PSD) Air Monitoring, Federal Register, Vol. 44, Number 92, May 1979, pp. 27582-27584.
3. Estes, E. D., and Mitchell, W. J., Technical Assistance Document: Techniques to Determine a Company's Ability to Conduct a Quality Stack Test, EPA-600/4-82-018, March 1982.
-------
Section No. 1.4.21
Revision No. 1
Date January 9, 1984
Page 1 of 4
1.4.21 QUALITY REPORTS TO MANAGEMENT
1.4.21.1 ABSTRACT
Several reports are recommended in the performance of the
quality assurance tasks. Concise and accurate presentation of
the data and derived results is necessary. Some of the quality
assurance reports for management are:
1. Data quality assessment reports (e.g., those specified
in 40 CFR, Part 58, Appendices A and B),
2. Performance and system audit reports,
3. Interlaboratory comparison summaries,
4. Data validation reports,
5. Quality cost reports,
6. Instrument or equipment downtime,
7. Quality assurance program and project plans, and
8. Control charts.
Reports should be prepared with the following guidelines, as appropriate:
1. All raw data should be included in the report when practical.
2. The objective of the measurement program should be stated in terms of the data required, together with an uncertainty statement concerning the results.
3. Methods of data analysis should be described unless they are well documented in the open literature.
4. A statement on any limitations and on the applicability of the results should be included.
5. The precision and accuracy of the measurement methods should be stated.
6. Quality control information should be provided as appropriate.
7. Reports should be placed into a storage system so that they may be retrieved as needed for future reference.
1.4.21.2 DISCUSSION
There are several quality assurance reports that should be
prepared periodically (quarterly or annually) summarizing the
items of concern. These reports will be briefly discussed
below.
1. Data Quality Assessment Reports
40 CFR Part 58, Appendices A and B require that reports of the precision and accuracy calculations be submitted each quarter along with the air monitoring data. See References 1 and 2 for details of the calculations and for the specific data/results to be reported.
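In outline, the Appendix A precision statistics are the mean and standard deviation of a quarter's percentage differences between analyzer responses and known check values, expressed as 95 percent probability limits. A simplified sketch; consult the appendix itself for the exact formulas and reporting requirements, and note that the check values below are hypothetical:

```python
import statistics

def probability_limits(percent_diffs):
    """95% probability limits (mean +/- 1.96 s) for a quarter's
    precision-check percentage differences."""
    m = statistics.mean(percent_diffs)
    s = statistics.stdev(percent_diffs)  # sample standard deviation
    return m - 1.96 * s, m + 1.96 * s

# Hypothetical biweekly precision checks for one analyzer, in percent.
diffs = [2.1, -1.4, 0.8, 3.0, -0.5, 1.2]
lower, upper = probability_limits(diffs)
print(f"{lower:.1f}% to {upper:.1f}%")  # -2.3% to 4.1%
```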
2. Performance and System Audit Reports
Upon completion of a performance and/or system audit, the auditing organization should submit a report summarizing the audit and present the results to the auditee so that any necessary corrective action can be initiated.
3. Interlaboratory Comparison Summaries
EPA prepares annual reports summarizing the interlaboratory
comparisons for the National Performance Audit Program. In
addition, the results from this audit are submitted to the
participating labs as soon as possible after the audit. These
data can then be used by the participants to take any necessary
corrective action with regard to their measurement procedures.
See Appendix K for a further discussion of the contents of the annual report.3,4
4. Data Validation Report
It is recommended in Section 1.4.17 that a data validation
process be implemented in order to minimize the reporting of
data of poor quality. A periodic report of the results of the
data validation procedure should be made, summarizing, for example, the number of items (values) flagged as questionable, the results of followup investigations of these anomalies, the final number of data values rejected or corrected as a result of the procedure, the corrective action recommended, and the effectiveness of the data validation procedures.5,6
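A tally of the kind described can be produced mechanically once each value carries a validation flag. A minimal sketch; the flag codes and records below are hypothetical, not taken from the Handbook:

```python
from collections import Counter

# Hypothetical hourly record: (hour, value_ppm, flag). Flag codes are ours:
# None = accepted, "Q" = flagged questionable, "R" = rejected after followup.
records = [
    (1, 0.041, None), (2, 0.043, None), (3, 0.310, "Q"),
    (4, 0.044, None), (5, 0.650, "R"), (6, 0.042, "Q"),
]

counts = Counter(flag for _, _, flag in records)
print("questionable:", counts["Q"])  # values needing followup
print("rejected:", counts["R"])      # values removed from the record
print("accepted:", counts[None])
```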
5. Quality Cost Report
A quality cost system is recommended in Section 1.4.14.
After the system has been implemented, a quality cost report
should be made periodically to include the prevention, apprai-
sal, and correction costs.7
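One workable form of the report is a periodic roll-up of expenditures by the three categories. A minimal sketch, with hypothetical cost items:

```python
# Hypothetical quarterly cost items, tagged prevention/appraisal/correction.
items = [
    ("operator training", "prevention", 1200.0),
    ("span gas and audits", "appraisal", 800.0),
    ("analyzer repair", "correction", 450.0),
    ("data re-validation", "correction", 300.0),
]

totals = {}
for _, category, cost in items:
    totals[category] = totals.get(category, 0.0) + cost

for category in ("prevention", "appraisal", "correction"):
    print(f"{category}: ${totals.get(category, 0.0):,.2f}")
```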
6. Instrument or Equipment Downtime
In Section 1.4.7 it is recommended that records be main-
tained of the equipment in terms of failures, cause of failures,
repair time, and total downtime. These data should be summar-
ized periodically and submitted to management as an aid in
future procurement.
7. Quality Assurance Program (or Project) Plans
Although these are not reports of results, they are the plans for the QA activities of a QA program or project, and they indicate which QA reports should be prepared.
8. Control Charts
The control charts are a visual report of the analytical
work and hence they are a significant part of the reporting
system. A summary of the results of the control chart applica-
tions should appear in the summary report to management.
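For example, a Shewhart-style chart of daily span checks derives its centerline and 3-sigma control limits from historical responses and flags any new response that falls outside them. A minimal sketch; the span-check values are hypothetical:

```python
import statistics

# Hypothetical daily span-check responses (percent of expected span value).
history = [99.2, 100.4, 99.8, 100.9, 99.5, 100.1, 99.9, 100.6]

center = statistics.mean(history)
sigma = statistics.stdev(history)
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

for day, value in enumerate([100.2, 98.0, 103.1], start=1):
    status = "in control" if lcl <= value <= ucl else "OUT OF CONTROL"
    print(f"day {day}: {value:.1f} -> {status}")
```

A point outside the limits is the chart's signal that the measurement process needs investigation and possible corrective action.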
Some guidelines in the preparation of these reports are
given in the Abstract portion of this section.
1.4.21.3 REFERENCES
1. Appendix A - Quality Assurance Requirements for State and Local Air Monitoring Stations (SLAMS), Federal Register, Vol. 44, Number 92, May 1979.
2. Appendix B - Quality Assurance Requirements for Prevention of Significant Deterioration (PSD) Air Monitoring, Federal Register, Vol. 44, Number 92, May 1979.
3. Streib, E. W., and M. R. Midgett, A Summary of the 1982 EPA National Performance Audit Program on Source Measurements, EPA-600/4-83-049, December 1983.
4. Bennett, B. I., R. L. Lampe, L. F. Porter, A. P. Hines, and J. C. Puzak, Ambient Air Audits of Analytical Proficiency 1981, EPA-600/4-83-009, April 1983.
5. Nelson, A. C., Jr., D. W. Armentrout, and T. R. Johnson, Validation of Air Monitoring Data, Research Triangle Park, North Carolina, EPA-600/4-80-030, June 1980.
6. U.S. Environmental Protection Agency, Screening Procedures for Ambient Air Quality Data, EPA-450/2-78-037, July 1978.
7. Strong, R. B., J. H. White, and F. Smith, "Guidelines for the Development and Implementation of a Quality Cost System for Air Pollution Measurement Programs," Research Triangle Institute, Research Triangle Park, North Carolina, 1980, EPA Contract No. 68-02-2722.
-------
Section No. 1.4.22
Revision No. 1
Date January 9, 1984
Page 1 of 4
1.4.22 QUALITY ASSURANCE PROGRAM PLAN1
1.4.22.1 ABSTRACT
1. The QA Program Plan is a document that stipulates the policies, objectives, management structure, responsibilities, and procedures of the total QA program for each major organization.1 EPA policy requires participation by all EPA Regional Offices, EPA Program Offices, EPA Laboratories, and States in a centrally managed QA program, which includes all monitoring and measurement efforts mandated or supported by EPA through regulations, grants, contracts, or other formalized means not currently covered by regulation.
2. Each EPA Program Office, EPA Regional Office, EPA Laboratory, State, and other organization is responsible for the preparation and implementation of a QA Program Plan covering all environmentally related measurement activities supported or required by EPA. A basic requirement of each plan is that it can be implemented and that its implementation can be measured.
3. Each QA Program Plan should include the following
elements:
a. Identification of the office/laboratory submitting the plan,
b. Introduction - brief background, purpose, and scope,
c. QA policy statement,
d. QA management structure,
e. Personnel qualifications and training needs,
f. Facilities, equipment, and services - approach to selection, evaluation, calibration, operation, and maintenance,
g. Data generation - procedures to assure the generation of reliable data,
h. Data processing - collection, reduction, validation, and storage of data,
i. Data quality assessment - accuracy, precision, completeness, representativeness, and comparability of data to be assessed,
j. Corrective action - QA reporting and feedback channels established to ensure early and effective corrective action, and
k. Implementation requirements and schedule.
4. Plans should be submitted through normal channels for
review and/or approval.
1.4.22.2 DISCUSSION
The QA Program Plan is an orderly assembly of management policies, objectives, principles, and general procedures by which an agency or laboratory outlines how it intends to produce quality data. The content of the plan (outlined in Section 1.4.22.1) is briefly described below; eleven essential elements should be considered and addressed.
1. Identification - Each plan should have a cover sheet with the following information: document title, document control number, unit's full name and address, responsible individual (name, address, and telephone number), QA Officer, plan coverage, concurrences, and approval date.
2. Introduction - The brief background, purpose, and scope of the program plan are set forth in this section.
3. QA policy statement - The policy statement provides
the framework within which a unit develops and implements its QA
program. It must emphasize the requirements and activities
needed to ensure that all data obtained are of known quality.
4. QA management - This section of the plan shows the interrelationships between the functional units and subunits that generate or manage data. It includes the assignment of responsibilities, communications (an organizational chart to indicate information flow), document control, and QA program assessment.
5. Personnel - Each organization should ensure that all
personnel performing tasks and functions related to data quality
have the needed education, training, and experience; personnel
qualifications and training needs should be identified.
6. Facilities, equipment, and services - The QA Program Plan should address, for example, the selection and evaluation of equipment, environmental aspects of equipment that might have an impact on data quality, maintenance requirements, and monitoring and inspection procedures.
7. Data generation - Procedures should be given to assure
the generation of data that are scientifically valid, defen-
sible, comparable, and of known precision and accuracy. QA
Project Plans (as described in Section 1.4.23) should be pre-
pared and followed. Standard operating procedures (SOP) should
be developed and used for all routine monitoring programs,
repetitive tests and measurements, and for inspection and main-
tenance of facilities, equipment, and services.
8. Data processing - The plan should describe how all
aspects of data processing will be managed and separately evalu-
ated in order to maintain the integrity and quality of the data.
The collection, validation, storage, transfers, and reduction of
the data should be described.
9. Data quality assessment - The plan should describe how
all generated data are to be assessed for accuracy, precision,
completeness, representativeness, and comparability.
10. Corrective action - Plans should describe the mecha-
nism(s) to be used when corrective actions are necessary.
Results from the following QA activities may initiate a correc-
tive action: performance audits, system audits, interlaboratory
comparison studies, and failure to adhere to a QA Program or
Project Plan or to SOP.
11. Implementation requirements and schedule - A schedule
for implementation is given in Reference 1.
1.4.22.3 REFERENCE
Guidelines and Specifications for Preparing Quality Assurance Program Plans, Quality Assurance Management Staff, Office of Research and Development, USEPA, Washington, D.C., QAMS-004/80, September 1980. This document (EPA-600/3-83-024; NTIS PB 83-219667) may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, Virginia 22161.
-------
Section No. 1.4.23
Revision No. 1
Date January 9, 1984
Page 1 of 2
1.4.23 QUALITY ASSURANCE PROJECT PLAN1
1.4.23.1 ABSTRACT
1. A QA Project Plan is an orderly assembly of detailed
and specific procedures by which an agency or laboratory delin-
eates how it produces quality data for a specific project. A
given agency or laboratory would have only one QA Program Plan,
but would have a project plan for each project or for each group
of projects using the same measurement methods (e.g., a labora-
tory service group might develop a plan by analytical instrument
since the same service is provided to several projects). Every
project that involves environmentally-related measurements
should have a written and approved QA Project Plan.
2. Each of the 16 items listed below should be considered
for inclusion in each QA Project Plan.1
1. Title page, with provision for approval signatures
2. Table of contents
3. Project description
4. Project organization and responsibilities
5. QA objectives for measurement data in terms of preci-
sion, accuracy, completeness, representativeness and comparabil-
ity
6. Sampling procedures
7. Sample custody
8. Calibration procedures and frequency
9. Analytical procedures
10. Data analysis, validation, and reporting
11. Internal quality control checks and frequency
12. Performance and system audits and frequency
13. Preventive maintenance procedures and schedules
14. Specific procedures to be used to routinely assess
data precision, accuracy, and completeness of specific measure-
ment parameters involved
15. Corrective action
16. Quality assurance reports to management.
It is EPA policy that precision and accuracy of data must be
assessed on all monitoring and measurement projects. Therefore,
Item 14 must be described in all QA Project Plans.
1.4.23.2 DISCUSSION
The guidelines and specifications for preparing QA Project
Plans are in Appendix M. Appendix M also includes pertinent
references, definition of terms, availability of performance
audit materials/devices and QA technical assistance, and a model
QA Project Plan.
1.4.23.3 REFERENCE
1. Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans, Quality Assurance Management Staff, Office of Research and Development, USEPA, Washington, D.C., QAMS-005/80, December 1980. This document (EPA-600/4-83-004; NTIS PB 83-170514) may be obtained from the National Technical Information Service, 5285 Port Royal Road, Springfield, Virginia 22161.
-------
Review Exercise
Now that you've completed the assignment for Section 1, please answer the following
questions. These will help you determine whether you are mastering the material.
1. Valid and verifiable air quality monitoring data are needed by regulatory
agencies (?)
a. so that air monitoring costs can be increased
b. to provide a sound basis for regulatory decisions
c. to derive theoretical models for combustion processes
d. to ensure that data follow a log-normal distribution
2. Which one of the following is not a long-term objective of the USEPA's man-
datory QA program?
a. to provide quantitative estimates of data quality
b. to document progress in improving the quality of environmental measure-
ments reported to the agency
c. to improve data quality
d. to increase the number of quality assurance coordinators
3. The USEPA's mandatory QA program affects (?)
a. USEPA laboratories
b. organizations receiving USEPA grants or contracts
c. organizations having cooperative agreements with the USEPA
d. all of the above
4. True or False? The USEPA's mandatory QA program is applicable only to air
pollution measurements.
5. Quality control is (?)
a. a system of management equivalent to a quality assurance system
b. a system of data management used to ensure correct reporting of test results
c. the system of statistical procedures used to ensure data quality
d. the system of activities used to sustain a quality of product or service
6. Quality assurance is (?)
a. the system of activities used to provide assurance that a product or service
will satisfy given needs
b. the system of activities used to statistically determine confidence levels for air
quality data
c. the system of activities used to sustain a quality of product or service
d. the system of activities used to provide assurance that air quality is improving
7. The quality assurance wheel illustrates (?)
a. the elements that should be considered when planning a quality assurance
program
b. the costs associated with quality assurance programs
c. mandatory requirements for any quality assurance program
d. the management structure of the USEPA Quality Assurance Division
8. The objectives of a quality assurance program should be to produce data that
are:
a.
b.
c.
d.
e.
9. The data illustrated below are (?)
[Figure: illustration of measured values relative to the true value]
a. both precise and accurate
b. precise but not accurate
c. both imprecise and inaccurate
d. accurate but not precise
10. The quality assurance coordinator of an air monitoring organization should (?)
a. be the newest employee of the organization
b. be a supervisor
c. be independent from other organizational programs
d. report to the laboratory supervisor
11. List the three most common training methods used in the field of air pollution control.
a.
b.
c.
12. List three techniques that can be used to check the effectiveness of a training program.
a.
b.
c.
13. An auditing procedure is one of the elements of a QA program that should be
implemented as soon as possible. What are two types of audits that could be set
up in an air monitoring QA program?
a. (?)
b. (?)
14. True or False? A performance audit is qualitative, whereas a system audit is
quantitative.
15. Which of the following should be included in a QA report to management?
a. performance and system audit results
b. major quality problems and planned or implemented corrective action
c. both a and b, above
16. True or False? A quality assurance program plan contains general QA
requirements and information for an organization, whereas a quality assurance
project plan contains specific QA requirements and information for a project of
the organization.
17. True or False? State and local air pollution control agencies are responsible for
the implementation of QA programs for their monitoring activities.
-------
Section 1
Answers to Review Exercise
1. b
2. d
3. d
4. False
5. d
6. a
7. a
8. a. complete
   b. precise
   c. accurate
   d. representative
   e. comparable
9. c
10. c
11. a. on-the-job
    b. short-term
    c. long-term
12. a. written tests
    b. proficiency checks
    c. interviews
13. a. performance audit
    b. system audit
14. False
15. c
16. True
17. True
-------
Section 2
Quality Assurance for Air Quality
Monitoring Systems
Reading Assignment
Read, in the following order, sections 2.0.1, 2.0.2, 2.0.4, 2.0.7, 2.0.3, and 2.0.6 of
Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II—
Ambient Air Specific Methods, EPA 600/4-77-027a.
Reading Assignment Topics
Sampling network design and site selection
Sampling considerations
Reference and equivalent methods
Gas traceability protocol for calibrations and audits of ambient air analyzers
Data handling and reporting
Chain-of-custody procedures for ambient air samples
Learning Goal and Objectives
Learning Goal
The purpose of this section is to familiarize you with quality assurance considerations
for the acquisition, installation, and operation of air quality monitoring systems.
Learning Objectives
At the end of this section, you should be able to —
1. recognize the need to define the objectives of a monitoring program before
designing the air quality monitoring network,
2. identify at least six factors that are involved in collecting air quality samples
to meet defined monitoring objectives,
3. recognize at least six site characteristics that should be included in the descrip-
tion of an air quality monitoring station,
4. identify at least four environmental parameters that must be controlled to
obtain a representative air quality sample,
5. describe the design, siting, and maintenance of sampling manifolds and
probes,
6. recognize the need to carefully select air quality analyzers,
7. describe a protocol for certifying the concentrations of standard gases as
traceable to National Bureau of Standards (NBS) standard reference materials
(SRMs) or to certified reference materials (CRMs),
8. identify information that should be recorded on air quality data forms,
9. describe operational checks and maintenance of strip-chart recorders,
10. describe manual and automated techniques for data validation,
11. recognize the purpose of chain-of-custody procedures for ambient air samples,
and
12. identify air monitoring activities that should be subjected to chain-of-custody
procedures.
Reading Guidance
This assignment reviews general quality assurance and quality control guidance for
constructing and operating ambient air quality monitoring networks. Each section of
the Quality Assurance Handbook included in the assignment addresses an aspect of
network construction or operation.
Considerations for obtaining air quality data that represent what is intended to be
measured are addressed in Section 2.0.1. Data representativeness depends on design-
ing a monitoring network so that it meets the objectives of its associated monitoring
program. Therefore, the types and quantities of monitoring systems, the location of
monitoring systems, and the frequency of monitoring must correspond to specific
monitoring purposes.
The approximate numbers of SO2 and TSP National Air Monitoring Stations
(NAMS), stations selected from State and Local Air Monitoring Stations (SLAMS)
networks, required for urban areas are described in Table 1.4 of Section 2.0.1. Note
that as the population or pollutant concentration for an urban area increases, so
does the number of required NAMS. This is because the primary monitoring objec-
tive of NAMS networks is to monitor in areas where pollutant concentrations and
population exposure are expected to be the highest.
Also, considerations for siting monitors to meet monitoring objectives are
described in Section 2.0.1. In addition to information presented in this section, two
correspondence courses concerning the siting of ambient air quality monitors, APTI
Course 436, Site Selection for the Monitoring of SO2 and TSP in Ambient Air, and
APTI Course 437, Site Selection for the Monitoring of CO and Photochemical
Pollutants in Ambient Air, have been prepared for the USEPA. If you would like
information concerning these courses, contact the USEPA's Air Pollution Training
Institute at the address or phone number given in the Course Introduction section of
this guidebook. Also, a document concerning quality assurance for meteorological
measurements, Quality Assurance Handbook for Air Pollution Measurement
Systems: Volume IV—Meteorological Measurements (EPA 600/4-82-060), has been
published by the USEPA. For information concerning this document, contact
USEPA, ORD Publications, 26 West St. Clair Street, Cincinnati, Ohio 45268.
The designing and siting of sampling manifolds and probes are discussed in Sec-
tion 2.0.2. In addition to the information presented in the section, the following
should be considered when designing a sampling manifold:
• suspending strips of paper in front of the blower's exhaust permits a visual
check of blower operation,
• positioning air conditioner vents away from the manifold reduces condensation
of water vapor in the manifold, and
• positioning sample ports of the manifold toward the ceiling reduces the poten-
tial for accumulation of moisture in analyzer sampling lines.
Also, probe-siting criteria for Pb NAMS and SLAMS have been added to 40 CFR 58
since the publication of Section 2.0.2 (July 1, 1979). The following information is
provided to update Table 4 of the section.
Table 2-1. Probe-siting criteria for Pb NAMS and SLAMS.

Pollutant: Pb

Scale: Micro
Height above ground: 2-7 meters
Distance from supporting structure: vertical, none specified; horizontal, >2 meters(a)
Other spacing criteria:
1. Should be >20 meters from trees.
2. Distance from sampler to obstacle, such as a building, must be at least twice the height that the obstacle protrudes above the sampler.
3. Must have unrestricted airflow 270° around the sampler.
4. No furnace or incineration flues should be nearby.(b)
5. Must be 5 to 15 meters from major roadway.

Scale: Middle, neighborhood, urban, and regional
Height above ground: 2-15 meters
Distance from supporting structure: vertical, none specified; horizontal, >2 meters(a)
Other spacing criteria:
1. Should be >20 meters from trees.
2. Distance from sampler to obstacle, such as a building, must be at least twice the height that the obstacle protrudes above the sampler.
3. Must have unrestricted airflow 270° around the sampler.
4. No furnace or incineration flues should be nearby.(b)
5. Spacing from roads varies with traffic (see Table 4 of Appendix E(c)).

(a) When a probe is located on a rooftop, this separation is in reference to walls, parapets, or penthouses located on the roof.
(b) Distance is dependent on the height of the furnace or the incineration flue, the type of fuel or waste burned, and the quality of the fuel (sulfur and ash content). This is to avoid undue influences from minor pollutant sources.
(c) 40 CFR 58.
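Several of the spacing criteria above are simple arithmetic tests that can be checked mechanically. A sketch of the obstacle rule (criterion 2), with hypothetical dimensions and a function name of our choosing:

```python
def obstacle_ok(distance_m, obstacle_height_m, sampler_height_m):
    """Criterion 2: sampler-to-obstacle distance must be at least twice
    the height the obstacle protrudes above the sampler."""
    protrusion = obstacle_height_m - sampler_height_m
    if protrusion <= 0:  # obstacle does not rise above the sampler
        return True
    return distance_m >= 2 * protrusion

# A 12 m building, 15 m from a sampler whose inlet is 4 m above ground:
# the 8 m protrusion requires at least 16 m of separation.
print(obstacle_ok(15.0, 12.0, 4.0))  # False
```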
Information concerning USEPA-designated reference and equivalent measurement
methods for air quality monitoring is presented in Section 2.0.4. Note that the
designation program does not guarantee that a designated analyzer will operate
properly; the program guarantees only that a designated analyzer is capable of
operating properly. It is the user's responsibility to ensure the adequate performance
of designated reference and equivalent methods by implementing a quality assurance
program. The quality assurance program should include quality control for the pro-
curement of air quality analyzers. An excellent discussion of procurement quality
control is given by M. J. Kopecky and B. Rodger (Wisconsin Department of Natural
Resources) in "Quality Assurance for Procurement of Air Analyzers," 1979 ASQC
Technical Conference Transactions—Houston, Texas (American Society for Quality
Control, 1979:35-40).
A protocol for certifying the concentrations of gases used to calibrate or audit air
quality analyzers as traceable to National Bureau of Standards (NBS) standard
reference materials (SRMs) is described in Section 2.0.7. The following list expands
on the partial listing of SRM cylinder gases provided in Table 7.1 of the section.
Table 2-2. SRM cylinder gases.

SRM      Type                             Certified component   Nominal concentration
1658a    Methane in air                   CH4                   1 ppm
1659a    Methane in air                   CH4                   10 ppm
1660a    Methane-propane in air           CH4/C3H8              4/1 ppm
1661a    Sulfur dioxide in nitrogen       SO2                   500 ppm
1662a    Sulfur dioxide in nitrogen       SO2                   1000 ppm
1663a    Sulfur dioxide in nitrogen       SO2                   1500 ppm
1664a    Sulfur dioxide in nitrogen       SO2                   2500 ppm
1665b    Propane in air                   C3H8                  3 ppm
1666b    Propane in air                   C3H8                  10 ppm
1667b    Propane in air                   C3H8                  50 ppm
1668b    Propane in air                   C3H8                  100 ppm
1669b    Propane in air                   C3H8                  500 ppm
1670     Carbon dioxide in air            CO2                   0.033 percent
1671     Carbon dioxide in air            CO2                   0.034 percent
1672     Carbon dioxide in air            CO2                   0.035 percent
1674b    Carbon dioxide in nitrogen       CO2                   7.0 percent
1675b    Carbon dioxide in nitrogen       CO2                   14.0 percent
1677c    Carbon monoxide in nitrogen      CO                    10 ppm
1678c    Carbon monoxide in nitrogen      CO                    50 ppm
1679c    Carbon monoxide in nitrogen      CO                    100 ppm
1680b    Carbon monoxide in nitrogen      CO                    500 ppm
1681b    Carbon monoxide in nitrogen      CO                    1000 ppm
1683b    Nitric oxide in nitrogen         NO                    50 ppm
1684b    Nitric oxide in nitrogen         NO                    100 ppm
1685b    Nitric oxide in nitrogen         NO                    250 ppm
1686b    Nitric oxide in nitrogen         NO                    500 ppm
1687b    Nitric oxide in nitrogen         NO                    1000 ppm
1693     Sulfur dioxide in nitrogen       SO2                   50 ppm
1694     Sulfur dioxide in nitrogen       SO2                   100 ppm
1696     Sulfur dioxide in nitrogen       SO2                   3500 ppm
2-4
-------
Table 2-2. SRM cylinder gases (continued).

SRM      Type                             Certified component   Nominal concentration
1805     Benzene in nitrogen              C6H6                  0.25 ppm
1806     Benzene in nitrogen              C6H6                  10 ppm
1808     Tetrachloroethylene in nitrogen  C2Cl4                 0.25 ppm
1809     Tetrachloroethylene in nitrogen  C2Cl4                 10 ppm
2612a    Carbon monoxide in air           CO                    10 ppm
2613a    Carbon monoxide in air           CO                    20 ppm
2614a    Carbon monoxide in air           CO                    45 ppm
2619a    Carbon dioxide in nitrogen       CO2                   0.5 percent
2620a    Carbon dioxide in nitrogen       CO2                   1.0 percent
2621a    Carbon dioxide in nitrogen       CO2                   1.5 percent
2622a    Carbon dioxide in nitrogen       CO2                   2.0 percent
2623a    Carbon dioxide in nitrogen       CO2                   2.5 percent
2624a    Carbon dioxide in nitrogen       CO2                   3.0 percent
2625a    Carbon dioxide in nitrogen       CO2                   3.5 percent
2626a    Carbon dioxide in nitrogen       CO2                   4.0 percent
2627     Nitric oxide in nitrogen         NO                    5 ppm
2628     Nitric oxide in nitrogen         NO                    10 ppm
2629     Nitric oxide in nitrogen         NO                    20 ppm
2630     Nitric oxide in nitrogen         NO                    1500 ppm
2631     Nitric oxide in nitrogen         NO                    3000 ppm
2632     Carbon dioxide in nitrogen       CO2                   300 ppm
2633     Carbon dioxide in nitrogen       CO2                   400 ppm
2634     Carbon dioxide in nitrogen       CO2                   800 ppm
2635     Carbon monoxide in nitrogen      CO                    25 ppm
2636     Carbon monoxide in nitrogen      CO                    250 ppm
2637     Carbon monoxide in nitrogen      CO                    2500 ppm
2638     Carbon monoxide in nitrogen      CO                    5000 ppm
2639     Carbon monoxide in nitrogen      CO                    1 percent
2640     Carbon monoxide in nitrogen      CO                    2 percent
2641     Carbon monoxide in nitrogen      CO                    4 percent
2642     Carbon monoxide in nitrogen      CO                    8 percent
2643     Propane in nitrogen              C3H8                  100 ppm
2644     Propane in nitrogen              C3H8                  250 ppm
2645     Propane in nitrogen              C3H8                  500 ppm
2646     Propane in nitrogen              C3H8                  1000 ppm
2647     Propane in nitrogen              C3H8                  2500 ppm
2648     Propane in nitrogen              C3H8                  5000 ppm
2649     Propane in nitrogen              C3H8                  1 percent
2650     Propane in nitrogen              C3H8                  2 percent
2651     Propane and oxygen in nitrogen   C3H8/O2               0.01/5.0 percent
2652     Propane and oxygen in nitrogen   C3H8/O2               0.01/10.0 percent
2653     Nitrogen dioxide in air          NO2                   250 ppm
2654     Nitrogen dioxide in air          NO2                   500 ppm
2655     Nitrogen dioxide in air          NO2                   1000 ppm
2656     Nitrogen dioxide in air          NO2                   2500 ppm
2657     Oxygen in nitrogen               O2                    2 percent
2658     Oxygen in nitrogen               O2                    10 percent
2659     Oxygen in nitrogen               O2                    21 percent
2-5
-------
In addition to the permeation tubes listed in Table 7.1, a benzene permeation
device is available from NBS. Purchase orders or quotation requests for SRMs should
be addressed to the Office of Standard Reference Materials, Room B311, Chemistry
Building, National Bureau of Standards, Washington, DC 20234 (telephone
number: 301-921-2045).
Since the publication of Section 2.0.7, Federal air quality monitoring regulations
have been revised to allow certified reference materials (CRMs) to be substituted for
SRMs in calibration and audit activities that previously required traceability to
SRMs only. CRMs are prepared in batches by gas manufacturers. Each batch is audited
by the USEPA and approved by NBS. For a list of CRMs, including vendor informa-
tion, contact USEPA, EMSL, QAD, MD 77, Research Triangle Park, North
Carolina 27711.
Considerations for the processing and reporting of air quality data are discussed in
Section 2.0.3. Validation is a critical step in data processing because it helps to
ensure that only sound data are reported. In addition to data-validation information
presented in this section, an excellent document concerning this topic, Validation of
Air Monitoring Data (EPA 600/4-80-030), has been published by the USEPA. It is
available as document number PB 81 112534 from the National Technical Informa-
tion Service, 5285 Port Royal Road, Springfield, Virginia 22161.
Chain-of-custody procedures needed for proving the legal integrity of air pollution
samples and data are described in Section 2.0.6. Such procedures should be
established before sampling begins, and each individual involved should be aware of
his responsibilities in the chain.
Chain-of-custody procedures are used to document who did what to samples or
data, and when and how they did it. The chain of custody begins with the prepara-
tion of anything that becomes an integral part of the sample (such as a filter) and
continues through to the disposal of the sample and its associated sampling data.
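A custody record of this kind can be sketched as a simple append-only event log. This is a hypothetical structure for illustration only, not a form prescribed by Section 2.0.6:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustodyEvent:
    who: str        # individual responsible at this step of the chain
    action: str     # what was done to the sample or data
    when: datetime  # when it was done

@dataclass
class CustodyRecord:
    sample_id: str
    events: list = field(default_factory=list)

    def log(self, who: str, action: str) -> None:
        # Append-only: custody entries are never edited or removed
        self.events.append(CustodyEvent(who, action, datetime.now(timezone.utc)))

# The chain begins with preparation of the filter and continues to disposal.
rec = CustodyRecord("hi-vol-filter-0471")
rec.log("J. Smith", "weighed and numbered unexposed filter")
rec.log("J. Smith", "installed filter in sampler")
rec.log("R. Jones", "removed exposed filter; sealed and transported to lab")
```

The names and sample identifier above are invented; the point is that each transfer of responsibility produces its own dated entry.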
When you have finished the reading assignment, complete the review exercise that
follows and check your answers. The correct answers are listed on the page following
the review exercise. After you have reviewed your incorrect answers (if any), take
Quiz 1. Follow the directions listed in the Course Introduction section of this
guidebook. After completing Quiz 1, proceed to Section 3 of this guidebook.
2-6
-------
Review Exercise
Now that you've completed the assignment for Section 2, please answer the following
questions. These will help you determine whether you are mastering the material.
1. The objective(s) of a monitoring program should be clearly defined (?)
the designing of the monitoring network that is associated with the monitoring
program.
a. before
b. during
c. before and, if necessary, revised during
d. after
2. The spatial scale represented by an air sample measured at a monitoring site
should be (?) the appropriate spatial scale for the site's monitoring
objective.
a. greater than
b. less than
c. the same as
3. True or False? The frequency of sampling can affect the representativeness of
the samples obtained.
4. True or False? A sampling site for one pollutant may be inappropriate for
another pollutant.
5. True or False? Noncontinuous sampling for TSP, SO2, and NO2 at the most
polluted sites in an urban area should take place every six days.
6. To calculate a pollutant's 24-hour average concentration using continuous
measurement methods, at least (?) hourly concentration averages for
the pollutant must be present.
a. 12
b. 16
c. 18
d. 22
7. To calculate a pollutant's monthly average concentration using continuous
measurement methods, at least (?) daily concentration averages
must be present.
a. 14
b. 18
c. 21
d. 26
8. Air quality is usually monitored continuously in areas (?)
a. having poor air quality
b. having high population densities
c. both a and b, above
9. True or False? The identification record for a monitoring site should include the
data acquisition objective for the site.
2-7
-------
10. True or False? Unless an emergency episode monitoring site has an automatic
data-transmission system, personnel should be present at the site during an
episode.
11. True or False? During operation, air quality analyzers do not need to be located
in a temperature-controlled environment.
12. Which of the following sampling manifolds should be constructed of inert
materials?
a. vertical laminar flow
b. conventional
c. both a and b, above
13. Significant losses of reactive gas concentrations have been observed in glass and
Teflon® sampling lines when the sample residence times exceeded (?)
second(s).
a. 1
b. 5
c. 20
14. Appendix E of 40 CFR 58 requires that the probe of an SO2 monitor be located
away from obstacles such as buildings, so that the distance between an obstacle
and the probe is at least (?) times the height that the obstacle pro-
trudes above the probe.
a. 2
b. 4
c. 5
d. 10
15. True or False? The flow rate of sample air through a manifold should equal the
sum of the individual flow rates required by the sampling systems connected to
the manifold.
16. Which of the following activities should be performed in obtaining reagent
water?
a. development of purchasing guidelines
b. testing water for conductivity
c. both a and b, above
17. True or False? The use of new calibration gases should be overlapped with the
use of old calibration gases.
18. True or False? Reference or equivalent methods are generally not required in
SLAMS monitoring networks.
19. Requiring the use of reference or equivalent methods helps to assure that air
quality measurements are made with methods that are capable of having
adequate (?)
a. accuracy
b. reliability
c. both a and b, above
2-8
-------
20. True or False? The designation status of an analyzer can be determined by
referring to its model number.
21. True or False? Any modification to a reference or equivalent method made by a
user must be approved by the USEPA if the method's designation status is to be
maintained.
22. True or False? No competitive differences exist among designated analyzers.
23. True or False? A quality assurance program is not needed for designated
reference or equivalent methods.
24. The USEPA's traceability protocol requires that true concentrations of standard
gases be determined using (?)
a. NBS Class S weights
b. air quality analyzers that have been calibrated with NBS SRMs or CRMs
c. both a and b, above
25. True or False? Gas manufacturers are required to provide traceability for all
standard gases that they sell.
26. For establishing traceability of commercial cylinder gases to NBS SRM or CRM
cylinder gases, the USEPA traceability protocol requires that an analyzer be
calibrated using zero gas and (?) NBS SRM or CRM cylinder gas(es).
a. 1
b. 2
c. 3
d. 6
27. True or False? The USEPA traceability protocol does not require zero and span
checks of analyzers having linear responses.
28. For establishing the true concentration of a cylinder gas, the USEPA traceability
protocol requires that the cylinder gas be analyzed (?) time(s).
a. 1
b. 2
c. 3
29. True or False? For establishing the true permeation rate of a permeation tube,
the USEPA traceability protocol requires the analysis of one pollutant concentra-
tion generated by the tube.
30. True or False? The USEPA traceability protocol requires that the stability of
cylinder gases be verified before they are used.
31. The USEPA traceability protocol requires reanalysis of (?) .
a. cylinder gases
b. permeation tubes
c. permeation devices
d. all of the above
2-9
-------
32. The USEPA traceability protocol requires that commercial permeation tubes or
devices be equilibrated at a specified temperature for at least (?) hours.
a. 3
b. 12
c. 24
d. 36
33. True or False? Identification numbers of NBS SRMs or CRMs used as primary
standards should be included in traceability records for cylinder gases and
permeation tubes.
34. Ambient air monitoring data forms should include (?)
a. pollutant information
b. monitoring site identification
c. both a and b, above
35. Which of the following is an(are) advantage(s) of using preprinted data forms?
a. minimizes identification errors
b. ensures that everyone is using up-to-date forms
c. both a and b, above
36. Suspended particulate matter concentrations (µg/std m³) should be reported to
(?) decimal place(s).
a. 0
b. 1
c. 2
d. 3
37. True or False? The timing of the chart drive of a strip-chart recorder should be
verified periodically.
38. True or False? A strip-chart-recorder trace having a cyclic pattern could be
caused by analyzer-shelter temperature fluctuations that are beyond the
operating temperature limits of the analyzer.
39. True or False? Data validation should be performed by personnel who are
directly involved in collecting the data.
40. Manual scanning of data sets is most appropriate for detecting (?)
a. unusually high or low values
b. erroneous intermediate values
c. both a and b, above
41. True or False? Erroneous intermediate values of a data set are more easily
detected using automated validation procedures.
42. Systematic data management (?) duplicate reporting of data.
a. increases
b. decreases
c. does not affect
43. True or False? Chain-of-custody procedures are necessary to make a prima facie
showing of the representativeness of sampling data.
2-10
-------
44. Which of the following should be subjected to chain-of-custody procedures?
a. preparation of sampling equipment and reagents
b. samples
c. sampling data
d. both b and c, above
e. a, b, and c, above
45. True or False? Chain-of-custody procedures should be applied only during sam-
ple collection and transport to the laboratory.
46. True or False? Original sampling and analysis records may be discarded after
completion of their associated final report.
2-11
-------
Section 2
Answers to Review Exercise
1. c 24. b
2. c 25. False
3. True 26. b
4. True 27. False
5. False 28. c
6. c 29. False
7. c 30. True
8. c 31. d
9. True 32. d
10. True 33. True
11. False 34. c
12. b 35. a
13. c 36. a
14. a 37. True
15. False 38. True
16. c 39. False
17. True 40. a
18. False 41. True
19. c 42. b
20. False 43. True
21. True 44. e
22. False 45. False
23. False 46. False
2-12
-------
Section 3
Quality Assurance for SLAMS
and PSD Air Monitoring Networks
Reading Assignment
Read, in the following order, sections 2.0.9 and 2.0.8 of Quality Assurance Hand-
book for Air Pollution Measurement Systems, Volume II—Ambient Air Specific
Methods, EPA 600/4-77-027a.
Reading Assignment Topics
• Quality control programs for SLAMS and PSD air monitoring
• Precision and accuracy assessment for SLAMS and PSD air monitoring data
Learning Goal and Objectives
Learning Goal
The purpose of this section is to familiarize you with quality control programs and
data quality assessment for SLAMS and PSD air monitoring.
Learning Objectives
At the end of this section, you should be able to —
1. recognize the two principal objectives of quality control programs for SLAMS
and PSD air monitoring networks,
2. describe guidance for the calibration of automated air quality analyzers,
including the use of Level 1 and Level 2 zero and span checks,
3. describe guidance for calibration, performance evaluation, and operational
checks of manual air quality measurement methods,
4. recognize monitoring activities that require written operational procedures,
5. recognize that standard gases used in SLAMS or PSD air monitoring networks
must be certified as traceable to NBS SRMs or CRMs,
6. identify recommended recalibration frequencies for flow rate measuring devices,
7. recognize activities that are required for assessing precision and accuracy of
SLAMS and PSD air monitoring data, and
8. calculate precision and accuracy estimates for SLAMS and PSD air monitoring
data.
3-1
-------
Reading Guidance
This assignment reviews quality control guidance and required data quality assess-
ment for SLAMS and prevention of significant deterioration (PSD) air monitoring.
Recommendations for developing and implementing quality control programs are
given in Section 2.0.9. Section 2.0.8 contains procedures required for estimating the
precision and accuracy of air monitoring data.
The USEPA reference method for SO2 has been revised since the publication of
Section 2.0.9 (July 11, 1979). Because of these revisions, some of the recommenda-
tions described in Section 2.0.9 do not reflect present Federal requirements described
in the revised reference method. These recommendations are contrasted with present
Federal requirements below.
Table 3-1. Section 2.0.9 recommendations (as of July 11, 1979) versus Federal
requirements of the revised SO2 reference method (after July 11, 1979).

1. Recommendation: The wavelength calibration of the spectrophotometer should be
   verified when the spectrophotometer is initially received.
   Requirement: The wavelength calibration of the spectrophotometer should be
   verified when the spectrophotometer is initially received and after each
   160 hours of normal use or every 6 months, whichever comes first.

2. Recommendation: The flow rate through the sampler may be calibrated in the
   laboratory.
   Requirement: The flow rate through the sampler must be determined at the
   sampling site.

3. Recommendation: No specification is given for the correlation coefficient of
   the calibration curve.
   Requirement: The correlation coefficient of the calibration curve must be
   greater than 0.998.

4. Recommendation: The difference between measured and known values of control-
   standard solutions should not be greater than ±0.07 µg SO2/mL.
   Requirement: The difference between measured and known values of control-
   standard solutions should not be greater than 1 µg SO2.

5. Recommendation: The temperature of the sample during sampling, storage, and
   transport following sampling must be maintained between 5° and 25°C. During
   storage and transport, the temperature should be maintained as close to 5°C
   as possible.
   Requirement: The temperature of the sample must be maintained between 5° and
   25°C during sampling and between 0° and 10°C after sampling.

6. Recommendation: If the difference between the initial and final sampling flow
   rates is more than 10% of the initial flow rate, corrective action should be
   performed.
   Requirement: If the difference between the initial and final flow rates is
   more than 5% of the final flow rate, the sample must be invalidated.

7. Recommendation: Control-standard solutions containing 1.0 µg SO2/mL should be
   analyzed during sample analysis.
   Requirement: Control-standard solutions containing approximately 5 µg SO2 and
   15 µg SO2 must be analyzed during sample analysis.
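The revised flow-rate requirement, which invalidates a sample when the initial and final flow rates differ by more than 5% of the final rate, can be expressed as a one-line check. This is a sketch; the function name and example flow values are my own:

```python
def so2_sample_valid(initial_flow, final_flow):
    # Revised reference method: the sample must be invalidated when the
    # difference between initial and final flow rates exceeds 5% of the
    # final flow rate. Units cancel, so any consistent flow unit works.
    return abs(initial_flow - final_flow) <= 0.05 * final_flow

print(so2_sample_valid(0.200, 0.195))  # True: difference is about 2.6% of final
print(so2_sample_valid(0.220, 0.195))  # False: difference is about 12.8% of final
```

Note that the older Section 2.0.9 recommendation compared the difference to 10% of the initial flow rate and called only for corrective action, not invalidation.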
Also, Ambient Monitoring Guidelines for Prevention of Significant Deterioration
(PSD), given as a reference in Section 2.0.9, has been revised since the publication of
the section. Its present publication number and publication date are EPA 450/4-80-012
and November 1980.
3-2
-------
As described in Section 2.0.8, the accuracy of air monitoring data obtained from
air quality analyzers is estimated by auditing the analyzers within specific concentra-
tion ranges. Since the publication of Section 2.0.8 (July 1, 1979), the specified range
of 0.40 to 0.45 ppm for SO2, NO2, and O3 analyzers and the specified range of 40 to
45 ppm for CO analyzers have been changed to 0.35 to 0.45 ppm and 35 to 45 ppm,
respectively.
Procedures for assessing the precision and accuracy of lead monitoring data have
been added to 40 CFR 58 since the publication of Section 2.0.8. Estimates of preci-
sion and accuracy are obtained in different ways, depending on whether a SLAMS
or PSD air monitoring network is involved, and whether the reference method or an
equivalent method for lead monitoring is used.
For a SLAMS network, lead monitoring data are obtained by using either the lead
reference method or an equivalent measurement method. If the lead reference
method is used, the data's precision is estimated by analyzing duplicate strips of lead
filter samples collected at the monitoring site where the highest lead concentrations
are expected to occur. If an equivalent measurement method is used, the data's
precision may be estimated by analyzing duplicate aliquots of the lead samples. In
either case, the analytical results are used to calculate 95% probability limits for
precision. Probability limits for the precision of lead monitoring data are calculated
using the equations described in Section 2.0.8 for calculating the precision of TSP
monitoring data.
For a PSD network, lead monitoring data are obtained by using either the lead
reference method or an equivalent measurement method. In either case, the data's
precision is estimated by analyzing lead samples from two collocated samplers at one
monitoring site. The analytical results are used to calculate 95% probability limits
for precision. Probability limits are calculated using the equations described in Sec-
tion 2.0.8 for calculating the precision of TSP monitoring data.
The accuracy of lead monitoring data is estimated from audit results. Audit
results are obtained by auditing the flow rate measurements of lead samplers and
analyzing lead audit samples.
For a SLAMS network that uses the lead reference method, the number of lead
and TSP high-volume samplers are combined for auditing purposes. The flow rate
measurements of 25% of the combined number of samplers are audited once each
calendar quarter. Audit procedures and calculations described in Section 2.0.8 for
estimating the accuracy of TSP monitoring data are used. In addition, three audit
samples prepared by depositing a lead solution on unexposed filter strips and having
known lead concentrations from each of the two concentration ranges, 100 to 300 µg
Pb and 600 to 1000 µg Pb, are analyzed each calendar quarter that lead air samples
are analyzed. The analytical results are used to calculate 95% probability limits for
accuracy. Probability limits are calculated using the equations described in Section
2.0.8 for calculating the accuracy of SO2 and NO2 monitoring data obtained from
manual measurement methods.
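The probability-limit arithmetic for accuracy data can be sketched as follows, using hypothetical audit values. Unlike the collocated-precision calculation, no divisor of the square root of 2 is applied, because each percentage difference compares a single measurement against a known audit value:

```python
import math

# Hypothetical audit results: measured vs. known values (e.g., µg Pb per strip)
measured = [95.0, 102.0, 98.0]
known = [100.0, 100.0, 100.0]

# Percentage difference for each audit analysis
d = [(m - k) / k * 100 for m, k in zip(measured, known)]
n = len(d)
d_bar = sum(d) / n
s = math.sqrt((sum(v * v for v in d) - sum(d) ** 2 / n) / (n - 1))

# 95% probability limits for accuracy (no sqrt(2) divisor: each d compares
# a single measured value with a known audit value)
upper = d_bar + 1.96 * s
lower = d_bar - 1.96 * s
```

For the PSD lead audits described below, the individual percentage differences themselves are reported for each analysis day rather than quarterly probability limits.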
For a PSD network that uses the lead reference method, the flow rate measure-
ments of each lead sampler are audited once each sampling quarter. Audit pro-
cedures and calculations described in Section 2.0.8 for estimating the accuracy of
TSP monitoring data are used. In addition, audit samples prepared by depositing a
3-3
-------
lead solution on unexposed filter strips and having a known lead concentration from
each of the two concentration ranges, 100 to 300 µg Pb and 600 to 1000 µg Pb, are
analyzed each day that lead air samples are analyzed. The analytical results are used
to calculate percentage differences. Percentage differences are calculated using the
equations described in Section 2.0.8 for calculating single-analysis-day accuracy for
SO2 and NO2 monitoring data obtained from manual measurement methods.
For both SLAMS and PSD air monitoring networks, the accuracy of a lead
equivalent measurement method is assessed in the same manner as described above
for the lead reference method. However, the flow rate auditing device and the lead
analysis audit samples must be compatible with the specific requirements of the
equivalent method.
In many cases, data quality assessment requirements for SLAMS and PSD air
monitoring are the same. However, there are some differences. These differences are
listed below.
Table 3-2. Differences between data quality assessment requirements for
SLAMS and PSD air monitoring.

Audit rates for accuracy assessment
• Automated measurement methods
  SLAMS: 25% per quarter
  PSD: 100% per quarter
• Manual measurement methods
  SLAMS: Hi-vol samplers, 25% per quarter; Pb analysis, 3 times per quarter;
  SO2 and NO2, each analysis day
  PSD: Hi-vol samplers, 100% per quarter; Pb analysis, each analysis day;
  manual measurement methods for SO2 and NO2 are not allowed

Precision assessment
• Collocated sampling
  SLAMS: 2 sites for SO2, NO2, and TSP
  PSD: 1 site for TSP and Pb
• Pb precision assessment
  SLAMS: Analysis of duplicate filter strips or duplicate aliquots of Pb samples
  PSD: Collocated sampling

Reporting period for data quality assessment
  SLAMS: Calendar quarter
  PSD: Sampling quarter

Reporting of data quality assessment
  SLAMS: Assessment reported for organization
  PSD: Assessment reported for each monitoring system
In addition to data quality assessment information presented in Section 2.0.8, the
USEPA has published a document, Guideline on the Meaning and Use of Precision
and Accuracy Data Required by 40 CFR Part 58, Appendices A and B (EPA
600/4-83-023), that contains a detailed discussion of the concepts and uses of preci-
sion and accuracy data required for ambient air quality monitoring. It is available as
document number PB 83 238949 from the National Technical Information Service,
5285 Port Royal Road, Springfield, Virginia 22161.
When you have finished the reading assignment, complete the review exercise that
follows and check your answers. The correct answers are listed on the page following
the review exercise. After you have reviewed your incorrect answers (if any), proceed
to Section 4 of this guidebook.
3-4
-------
Review Exercise
Now that you've completed the assignment for Section 3, please answer the following
questions. These will help you determine whether you are mastering the material.
1. The principal objective(s) of quality control programs for SLAMS and PSD air
monitoring is(are) to (?)
a. provide data of adequate quality to meet monitoring objectives
b. increase the number of quality assurance coordinators
c. minimize loss of air quality data
d. both a and b, above
e. both a and c, above
2. Reference or equivalent methods are generally required in (?) monitor-
ing networks.
a. SLAMS
b. PSD
c. GEM
d. both a and b, above
e. both a and c, above
3. Automated air quality analyzers used for SLAMS or PSD air monitoring should
be recalibrated at least every (?)
a. two weeks
b. month
c. three months
d. year
4. Level 1 zero and span checks of automated air quality analyzers should be used
to (?)
a. make analyzer zero and span adjustments
b. decide on the need for analyzer recalibration
c. invalidate monitoring data
d. all of the above
5. True or False? Level 2 zero and span checks of automated air quality analyzers
should be used to decide on the need for analyzer recalibration.
6. Control charts should be maintained for Level 1 zero and span checks of
automated air quality analyzers so that (?)
a. control limits can be determined
b. warning limits can be determined
c. drift patterns can be readily identified
d. all of the above
3-5
-------
7. Under the conditions described below, the span drift of an automated air qual-
ity analyzer is (?) percent.
a. 0.10
b. 10
c. 0.15
d. 15
Given: Unadjusted span reading: 0.900 ppm
Zero drift: 0.020 ppm
Span gas concentration: 0.800 ppm
8. True or False? Results from quarterly performance audits of automated air
quality analyzers should be directly used for data validation.
9. Operational procedures should be written for which of the following activities?
a. analyzer calibration
b. Level 1 and 2 zero and span checks
c. data validation
d. both a and b, above
e. a, b, and c, above
10. True or False? Air quality analyzers should be subjected to a performance audit
as soon as possible after an air pollution episode.
11. True or False? After maintenance is performed on a high-volume sampler's
motor, the sampler's flow rate measuring device should be recalibrated.
12. The flow rate through an SO2 sampling train should be determined (?)
a. before and after each sampling period
b. once per week
c. once per month
d. once per quarter
13. A high-volume sampler should be recalibrated if a performance audit results in
a relative difference of greater than ± (?) percent between the
sampler's measured flow rate and the audit flow rate.
a. 3
b. 7
c. 15
d. 20
14. True or False? Control standards should be periodically analyzed, using an
atomic absorption spectrophotometer, during the analyses of lead samples.
15. Which of the following is a(are) desirable characteristic(s) of glass-fiber filters
used to collect particulate lead?
a. minimal variation of lead content from filter to filter
b. high lead content
c. both a and b, above
3-6
-------
16. (?) should be used directly to validate air quality data collected by
manual measurement methods.
a. Results from quarterly performance audits
b. Results from operational checks
c. both a and b, above
17. Which of the following standard gases used in SLAMS or PSD air monitoring
networks must be certified as traceable to NBS SRMs or CRMs?
a. SO2
b. CO
c. O3
d. both a and b, above
e. a, b, and c, above
18. True or False? Mass flowmeters do not require periodic recalibration.
19. Results of (?) are used to assess the precision of TSP, SO2, and NO2
data collected by manual measurement methods in SLAMS monitoring
networks.
a. routine operational checks
b. collocated sampling
c. audits
20. Results of (?) are used to assess the accuracy of SLAMS and PSD air
monitoring data.
a. routine operational checks
b. collocated sampling
c. audits
21. Precision checks of SO2, NO2, O3, and CO automated analyzers of SLAMS and
PSD air monitoring networks must be performed at least once every (?)
a. week
b. two weeks
c. month
d. quarter
22. True or False? Equipment used for calibrating SLAMS and PSD air monitoring
methods may also be used for auditing the methods.
23. True or False? The routine operators of PSD automated air monitoring methods
may also audit the methods.
24. A precision check that is made in conjunction with a zero/span adjustment must
be made (?) the adjustment.
a. before
b. after
c. during
25. True or False? During a precision check, the test atmosphere must pass through
as much of the ambient inlet system as is practicable.
3-7
-------
26. Under the conditions described below, the upper and lower 95% probability
limits for the precision of TSP monitoring data collected by the reporting
organization are (?) percent and (?) percent, respectively.
a. 2, 5
b. -1, -6
c. 1, -7
d. 0, -6
Given:

Collocated TSP Sampling Data for the Reporting Organization

Sampling Site 1

Sampling period   Duplicate sampler results   Official sampler results
                  (µg/std m³)                 (µg/std m³)
1                 227                         236
2                 268                         275
3                 258                         256

Sampling Site 2

Sampling period   Duplicate sampler results   Official sampler results
                  (µg/std m³)                 (µg/std m³)
1                 245                         257
2                 227                         240
3                 164                         166
4                 212                         221
27. Under the conditions described below, the upper and lower 95% probability
limits for the precision of CO monitoring data collected by the CO analyzer are
(?) percent and (?) percent, respectively.
a. 4, -4
b. 6, -6
c. 0, -4
d. 0, -6
Given:

Precision Check Data for the CO Analyzer

Precision check   Measured CO concentration   Known CO concentration
                  (ppm)                       (ppm)
1                 9.1                         9.0
2                 8.9                         9.0
3                 9.2                         9.0
4                 9.3                         9.0
5                 8.6                         9.0
6                 8.8                         9.0
3-8
-------
28. Under the conditions described below, the upper and lower 95% probability
limits for the accuracy of CO monitoring data within the 15- to 20-ppm range
for the reporting organization are (?) percent and (?) percent,
respectively.
a. 13, -20
b. 8, -16
c. 8, -20
d. 13, -16
Given:

Audit Data for CO Analyzers

Analyzer audit   Measured CO concentration   Known CO concentration
                 (ppm)                       (ppm)
1                16                          18
2                17                          18
3                19                          18
3-9
-------
Section 3
Answers to Review Exercise
1. e
2. d
3. c
4. d
5. False
6. c
7. b
8. False
9. e
10. True
11. True
12. a
13. b
14. True
15. a
16. b
17. d
18. False
19. b
20. c
21. b
22. False
23. False
24. a
25. True
*26. d
*27. b
*28. a
* Solutions for questions 26, 27, and 28 follow.
3-10
-------
Solution for Question 26
1. For site 1, calculate the percentage difference (d_i) for each pair of results using
the following equation:

   d_i = ((y_i - x_i)/x_i)100

   Where: y_i = duplicate sampler result
          x_i = official sampler result

   Calculated percentage differences:

   d_1 = ((227 - 236)/236)100 = -3.8
   d_2 = ((268 - 275)/275)100 = -2.5
   d_3 = ((258 - 256)/256)100 = 0.8

2. Calculate the average percentage difference (d̄_1) for site 1 using the following
   equation:

   d̄_1 = Σd_i/n

   Where: n = the number of pairs of results

   Calculated average percentage difference:

   d̄_1 = [-3.8 + (-2.5) + 0.8]/3
       = -1.8
3. Calculate the standard deviation of the percentage differences (s_1) for site 1 using
   the following equation:

   s_1 = [(Σd_i² - (Σd_i)²/n)/(n - 1)]^1/2

   Where: n = the number of pairs of results

   Calculated standard deviation of the percentage differences:

   s_1 = [(21.33 - 10.08)/(3 - 1)]^1/2
       = 2.4
3-11
-------
4. Calculate the percentage differences (d_i), average percentage difference (d̄_2), and
   standard deviation of the percentage differences (s_2) for site 2.

   Calculated percentage differences:

   d_1 = ((245 - 257)/257)100 = -4.7
   d_2 = ((227 - 240)/240)100 = -5.4
   d_3 = ((164 - 166)/166)100 = -1.2
   d_4 = ((212 - 221)/221)100 = -4.1

   Calculated average percentage difference:

   d̄_2 = [-4.7 + (-5.4) + (-1.2) + (-4.1)]/4
       = -3.8

   Calculated standard deviation of the percentage differences:

   s_2 = [(69.50 - 59.29)/3]^1/2
       = 1.8
5. Calculate the average percentage difference (D) for sites 1 and 2 using the fol-
   lowing equation:

   D = (n_1·d̄_1 + n_2·d̄_2)/(n_1 + n_2)

   Calculated average percentage difference:

   D = [(3)(-1.8) + (4)(-3.8)]/(3 + 4)
     = -2.9
3-12
-------
6. Calculate the pooled standard deviation (s_a) for sites 1 and 2 using the following
   equation:

   s_a = {[(n_1 - 1)s_1² + (n_2 - 1)s_2²]/(n_1 + n_2 - k)}^1/2

   Where: k = the number of collocated sampling sites

   Calculated pooled standard deviation:

   s_a = {[(3 - 1)(2.4)² + (4 - 1)(1.8)²]/(3 + 4 - 2)}^1/2
       = 2.1
7. Calculate the upper 95% probability limit using the following equation:

   Upper 95% probability limit = D + (1.96)s_a/√2

   Calculated upper 95% probability limit:

   Upper 95% probability limit = -2.9 + (1.96)(2.1)/√2
   Upper 95% probability limit = 0.0 or 0

8. Calculate the lower 95% probability limit using the following equation:

   Lower 95% probability limit = D - (1.96)s_a/√2

   Calculated lower 95% probability limit:

   Lower 95% probability limit = -2.9 - (1.96)(2.1)/√2
   Lower 95% probability limit = -5.8 or -6
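The steps above can be sketched as a short script. The following Python code is a minimal illustration, not part of the original guidebook, assuming only the percentage-difference and pooled-deviation formulas shown in the solution; it carries full precision throughout rather than the hand-rounded intermediates, so the rounded limits still come out at 0 and -6.

```python
import math

# Collocated TSP data from Question 26: (duplicate, official) pairs per site.
site1 = [(227, 236), (268, 275), (258, 256)]
site2 = [(245, 257), (227, 240), (164, 166), (212, 221)]

def pct_diffs(pairs):
    """Percentage difference d_i = ((y_i - x_i)/x_i)100 for each pair."""
    return [(y - x) / x * 100 for y, x in pairs]

def mean_and_sd(d):
    """Average and sample standard deviation of the percentage differences."""
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt((sum(v * v for v in d) - sum(d) ** 2 / n) / (n - 1))
    return mean, sd

d1_mean, s1 = mean_and_sd(pct_diffs(site1))
d2_mean, s2 = mean_and_sd(pct_diffs(site2))
n1, n2, k = len(site1), len(site2), 2

# Weighted average difference and pooled standard deviation across the k sites.
D = (n1 * d1_mean + n2 * d2_mean) / (n1 + n2)
sa = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - k))

# The 1/sqrt(2) factor applies because each d_i is the difference of two
# independent measurements (duplicate and official samplers).
upper = D + 1.96 * sa / math.sqrt(2)
lower = D - 1.96 * sa / math.sqrt(2)
print(round(upper), round(lower))
```

Run against the Question 26 data, the rounded limits agree with answer d (0 and -6 percent).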
3-13
-------
Solution for Question 27
1. Calculate the percentage difference (d_i) for each precision check using the follow-
   ing equation:

   d_i = ((y_i - x_i)/x_i)100

   Where: y_i = measured CO concentration
          x_i = known CO concentration

   Calculated percentage differences:

   Precision check
   1    d_1 = ((9.1 - 9.0)/9.0)100 = 1.1
   2    d_2 = ((8.9 - 9.0)/9.0)100 = -1.1
   3    d_3 = ((9.2 - 9.0)/9.0)100 = 2.2
   4    d_4 = ((9.3 - 9.0)/9.0)100 = 3.3
   5    d_5 = ((8.6 - 9.0)/9.0)100 = -4.4
   6    d_6 = ((8.8 - 9.0)/9.0)100 = -2.2
2. Calculate the average percentage difference (d̄) using the following equation:

   d̄ = Σd_i/n

   Where: n = the number of precision checks

   Calculated average percentage difference:

   d̄ = [1.1 + (-1.1) + 2.2 + 3.3 + (-4.4) + (-2.2)]/6
     = -0.2
3. Calculate the standard deviation of the percentage differences (s) using the
   following equation:

   s = [(Σd_i² - (Σd_i)²/n)/(n - 1)]^1/2

   Where: n = the number of precision checks
3-14
-------
Calculated standard deviation of the percentage differences:

   s = {[(1.1)² + (-1.1)² + (2.2)² + (3.3)² + (-4.4)² + (-2.2)² - (-1.1)²/6]/(6 - 1)}^1/2
     = [(42.35 - 0.20)/5]^1/2
     = 2.9
4. Calculate the upper 95% probability limit using the following equation:

   Upper 95% probability limit = d̄ + (1.96)s

   Calculated upper 95% probability limit:

   Upper 95% probability limit = -0.2 + (1.96)(2.9)
   Upper 95% probability limit = 5.5 or 6

5. Calculate the lower 95% probability limit using the following equation:

   Lower 95% probability limit = d̄ - (1.96)s

   Calculated lower 95% probability limit:

   Lower 95% probability limit = -0.2 - (1.96)(2.9)
   Lower 95% probability limit = -5.9 or -6
Solution for Question 28
1. Calculate the percentage difference (d_i) for each analyzer audit using the follow-
   ing equation:

   d_i = ((y_i - x_i)/x_i)100

   Where: y_i = measured CO concentration
          x_i = known CO concentration

   Calculated percentage differences:

   Analyzer audit
   1    d_1 = ((16 - 18)/18)100 = -11.1
   2    d_2 = ((17 - 18)/18)100 = -5.6
   3    d_3 = ((19 - 18)/18)100 = 5.6
3-15
-------
2. Calculate the average percentage difference (D) using the following equation:

   D = Σd_i/k

   Where: k = the number of analyzer audits

   Calculated average percentage difference:

   D = [-11.1 + (-5.6) + 5.6]/3
     = -3.7
3. Calculate the standard deviation of the percentage differences (s_a) using the
   following equation:

   s_a = [(Σd_i² - (Σd_i)²/k)/(k - 1)]^1/2

   Where: k = the number of analyzer audits

   Calculated standard deviation of the percentage differences:

   s_a = {[(-11.1)² + (-5.6)² + (5.6)² - (-11.1)²/3]/(3 - 1)}^1/2
       = [(185.93 - 41.07)/2]^1/2
       = 8.5
4. Calculate the upper 95% probability limit using the following equation:

   Upper 95% probability limit = D + (1.96)s_a

   Calculated upper 95% probability limit:

   Upper 95% probability limit = -3.7 + (1.96)(8.5)
   Upper 95% probability limit = 13.0 or 13

5. Calculate the lower 95% probability limit using the following equation:

   Lower 95% probability limit = D - (1.96)s_a

   Calculated lower 95% probability limit:

   Lower 95% probability limit = -3.7 - (1.96)(8.5)
   Lower 95% probability limit = -20.4 or -20
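Questions 27 and 28 use the same statistics without pooling and without the √2 factor, since each percentage difference compares a single instrument against a known concentration. The Python sketch below is a minimal illustration of that shared calculation, assuming only the formulas from the two solutions above; it uses the standard-library `statistics` module, whose sample standard deviation matches the n - 1 formula in the text.

```python
import statistics

def prob_limits(measured, known):
    """95% probability limits for d_i = ((y_i - x_i)/x_i)100.

    Used as-is for single-instrument precision checks (Question 27)
    and accuracy audits (Question 28); collocated-sampler data would
    additionally divide the half-width by sqrt(2).
    """
    d = [(y - x) / x * 100 for y, x in zip(measured, known)]
    mean = statistics.mean(d)   # average percentage difference
    sd = statistics.stdev(d)    # sample standard deviation (n - 1)
    return mean + 1.96 * sd, mean - 1.96 * sd

# Question 27: CO precision checks against a known 9.0 ppm test gas.
upper27, lower27 = prob_limits([9.1, 8.9, 9.2, 9.3, 8.6, 8.8], [9.0] * 6)

# Question 28: CO analyzer audits against a known 18 ppm audit gas.
upper28, lower28 = prob_limits([16, 17, 19], [18] * 3)

print(round(upper27), round(lower27))  # Question 27 answer: 6, -6
print(round(upper28), round(lower28))  # Question 28 answer: 13, -20
```

Carrying full precision rather than the hand-rounded intermediates still reproduces the keyed answers b (6, -6) and a (13, -20).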
3-16
-------
Section 4
Performance Auditing of
Air Quality Monitoring Systems
Reading Assignment
Read, in the following order, sections 2.0.12 and 2.0.10 of Quality Assurance Hand-
book for Air Pollution Measurement Systems, Volume II—Ambient Air Specific
Methods, EPA 600/4-77-027a.
Reading Assignment Topics
• General guidance for conducting performance audits of air quality measurement
systems
• Performance audit procedures for SO2, NO2, CO, and O3 analyzers
• Performance audit procedure for TSP high-volume samplers
• Interpretation of performance audit results
• USEPA's national performance audit program
Learning Goal and Objectives
Learning Goal
The purpose of this section is to familiarize you with performance audit procedures
for air quality monitoring systems and with the USEPA's national performance audit
program.
Learning Objectives
At the end of this section, you should be able to —
1. describe, in sequence, the activities involved in conducting a performance audit,
2. recognize equipment requirements, including critical specifications and
operating parameters, for conducting performance audits of SO2, NO2, CO,
and O3 analyzers and high-volume samplers,
3. describe performance audit procedures for SO2, NO2, CO, and O3 analyzers
and high-volume samplers,
4. describe techniques for the interpretation of performance audit data,
4-1
-------
5. recognize that organizations operating SLAMS or PSD air monitoring networks
are required to participate in the USEPA's national performance audit
program,
6. recognize the two purposes of the USEPA's national performance audit
program, and
7. identify audit materials for ambient air monitoring that are presently used in
the USEPA's national performance audit program.
Reading Guidance
This assignment reviews performance auditing of ambient air quality monitoring
systems. Audit procedures for SO2, NO2, CO, and O3 analyzers and TSP high-
volume samplers are described in Section 2.0.12. Section 2.0.10 contains information
concerning the USEPA's national performance audit program for ambient air quality
monitoring.
At first glance, Section 2.0.12 appears to be lengthy. However, much of it consists
of redundant information and audit data forms. Descriptions of flow controllers,
flowmeters, mixing chambers, output manifolds, and sampling lines are generally
the same for all of the analyzer audit systems discussed. Procedures for auditing SO2
and CO analyzers are the same except for the specification and calculation of audit
gas concentrations. Also, the calculation of audit results is the same for all of the
analyzers. After you have thoroughly read this information once, you can just skim it
in subsequent parts of the section.
The USEPA reference method for suspended particulate matter and the reference
measurement principle and calibration procedures for CO have been revised since
the publication of Section 2.0.12 (July 1, 1980). Therefore, some of the recommen-
dations described in Section 2.0.12 do not reflect present Federal requirements
described in the revised reference method and in the revised reference measurement
principle and calibration procedure. These recommendations are contrasted with
present Federal requirements below.
• Section 2.0.12 states that TSP high-volume samplers not equipped with constant
flow controllers are usually calibrated in terms of actual flow rates. However,
the revised reference method requires that all high-volume samplers be
calibrated in terms of standard flow rates.
• Section 2.0.12 states that zero air, nitrogen, or helium may be used to dilute
CO used in auditing CO analyzers. However, the revised reference measurement
principle and calibration procedure requires that CO used in calibrating CO
analyzers be diluted with zero air.
In addition to the above revisions, note the following as you read Section 2.0.12:
• The nitrogen oxides analyzer that is included in the description of the gas phase
titration audit system is not needed for the described NO2 audit procedure.
• The second and third pages of the gas phase titration audit data report form
are reversed.
4-2
-------
• In the description of the dilution air source for the carbon monoxide dynamic
dilution audit system, NO2 should be nitrogen.
• The variance equations presented in Table 12.10 should be

  s_y² = [Σy² - (Σy)²/N]/(N - 1)

  and

  s_x² = [Σx² - (Σx)²/N]/(N - 1)
• The y-intercept (b) of the line depicted in Figure 12.16 is -0.014, not -0.009
as indicated.
Materials that are used in the USEPA's national performance audit program for
auditing ambient air quality monitoring systems are described in Section 2.0.10. The
USEPA periodically sends these materials to air quality monitoring organizations,
where they are used to audit the performance of manual and automated air quality
measurement methods. The organizations report their audit results to the USEPA.
Subsequently, the USEPA analyzes the audit results and reports its findings back to
the organizations.
Since the publication of Section 2.0.10 (July 1, 1979), a device for auditing SO2
automated analyzers has been added to the performance audit program. It consists
of a cylinder of SO2, a cylinder of zero air, and a dynamic gas dilution system. Seven
SO2 concentrations, including a zero concentration, can be generated using the
device (Figure 4-1).
Figure 4-1. Audit device for SO2 analyzers.
4-3
-------
Source emission measurement methods are also audited in the USEPA's national
performance audit program. If you are interested in participating in these audits,
contact the USEPA, Quality Assurance Division, EMSL, Research Triangle Park,
North Carolina 27711.
When you have finished the reading assignment, complete the review exercise that
follows and check your answers. The correct answers are listed on the page following
the review exercise. After you have reviewed your incorrect answers (if any), take
Quiz 2. Follow the directions listed in the Course Introduction section of this
guidebook. After completing Quiz 2, proceed to Section 5 of this guidebook.
4-4
-------
Review Exercise
Now that you've completed the assignment for Section 4, please answer the following
questions. These will help you determine whether you are mastering the material.
1. True or False? The operator of an air monitoring site should be informed of
performance auditing results immediately following an audit.
2. True or False? During a performance audit, the auditor should convert analyzer
responses to pollutant concentrations.
3. Audit responses of an analyzer, whose responses are routinely telemetered to a
computer, should be obtained from (?)
a. a strip-chart recorder
b. a digital voltmeter
c. the computer
4. Audit gas manifolds for air quality analyzers should be composed of
a. polypropylene
b. glass
c. Teflon®
d. either b or c, above
5. True or False? Preparing an audit gas by dynamic dilution requires accurate
flow rate measurements.
6. Which of the following must be complied with when auditing an air quality
analyzer?
a. audit gas manifold vented to the outside atmosphere
b. audit gases introduced through analyzer's calibration port
c. gas flow to audit gas manifold 10 to 50% greater than analyzer's sample flow
demand
d. a and c, above
e. a, b, and c, above
7. Dilution air used in the audit of flame photometric SO2 analyzers should contain
which of the following?
a. 20.9 ±0.2% oxygen
b. nitrogen
c. approximately 350 ppm carbon dioxide
d. a and b, above
e. a, b, and c, above
8. Audit gases must be delivered to the inlet of an air quality analyzer (?)
ambient pressure.
a. at
b. below
c. above
4-5
-------
9. Gases used to audit SO2, NO2, and CO analyzers should be traceable to a
a. standard reference material of the National Bureau of Standards
b. certified reference material
c. either a or b, above
10. (?) cylinder regulators should be composed of stainless steel.
a. SO2
b. NO
c. CO
d. both a and b, above
e. both a and c, above
11. (?) must be removed from regulators of SO2, NO, and CO cylinders
that are used for auditing air quality analyzers.
a. N2
b. 02
c. Ar
12. True or False? In auditing an NO2 analyzer, data resulting from an audit of the
analyzer's NO response are used to determine the concentrations of NO2 audit
gases.
13. Which of the following is(are) evaluated during a performance audit of an NO2
analyzer?
a. calibration of the analyzer's NOx response
b. calibration of the analyzer's NO2 response
c. the efficiency of the analyzer's converter
d. both a and b, above
e. a, b, and c, above
14. In auditing O3 analyzers, audit gas concentrations are verified using
a. gas-phase chemiluminescence
b. ultraviolet photometry
c. flame photometry
d. nondispersive infrared spectroscopy
15. The identity(ies) of which of the following should be documented during a per-
formance audit?
a. the measurement system that is audited
b. the audit system
c. the auditor
d. both a and b, above
e. a, b, and c, above
4-6
-------
16. Which of the following equations is used to calculate percentage differences of
performance audit data?
a. % Difference = ((CA - CM)/CA)100
b. % Difference = (CA/CM)100
c. % Difference = ((CM - CA)/CA)100
d. % Difference = ((CM - CA)/CM)100

Where: CA = audit value
       CM = measurement system response
17. True or False? A Reference Flow (ReF) device can be used to audit the flow rate
calibrations of high-volume samplers.
4-7
-------
18. Which of the following is a(are) possible cause(s) for the audit data described
below?
a. The audit system's zero air source has a positive bias.
b. The analyzer has a positive zero drift.
c. The audit system's zero air source has a negative bias.
d. either a or b, above
e. either b or c, above
Given:

Audit       Audit concentration    Analyzer response
point no.   (ppm)                  (ppm)               % Difference
1           0.000                  0.013               —
2           0.056                  0.064               14.3
3           0.116                  0.132               13.8
4           0.221                  0.235               6.3
5           0.276                  0.282               2.2
6           0.405                  0.409               1.0

[Figure: plot of analyzer response versus audit concentration, with fitted
line statistics r = 0.9997, m = 0.980, b = 0.014]
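The slope, intercept, and correlation printed on the plot can be reproduced from the tabulated audit points with an ordinary least-squares fit. The Python sketch below is illustrative only and is not part of the original exercise; the variable names are arbitrary.

```python
import math

# Audit points from Question 18: (audit concentration, analyzer response), ppm.
points = [(0.000, 0.013), (0.056, 0.064), (0.116, 0.132),
          (0.221, 0.235), (0.276, 0.282), (0.405, 0.409)]

n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
syy = sum(y * y for _, y in points)
sxy = sum(x * y for x, y in points)

# Ordinary least-squares slope (m), intercept (b), and correlation (r).
m = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
b = (sy - m * sx) / n
r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))

# A positive intercept with a slope near 1 indicates a constant offset:
# either analyzer zero drift or a biased audit zero air source (answer d).
print(f"m={m:.3f} b={b:.3f} r={r:.4f}")
```

The fit reproduces the plotted values (slope 0.980, intercept 0.014, correlation 0.9997), confirming that the large low-range percentage differences come from a constant offset rather than a span error.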
4-8
-------
19. Organizations that operate (?) monitoring networks must participate in
the USEPA's national performance audit program.
a. SLAMS
b. PSD
c. either a or b, above
20. The purpose(s) of the USEPA's national performance audit program is(are) to
a. provide participating organizations with a means of self-appraisal for specific
operations audited
b. increase the number of quality assurance coordinators
c. provide information concerning the quality of monitoring data reported to
the USEPA
d. both a and b, above
e. both a and c, above
21. True or False? CO cylinder gases are used in USEPA's national performance
audit program.
4-9
-------
Section 4
Answers to Review Exercise
1. True
2. False
3. c
4. d
5. True
6. d
7. e
8. a
9. c
10. d
11. b
12. True
13. e
14. b
15. e
16. c
17. True
18. d
19. c
20. e
21. True
4-10
-------
Section 5
System Auditing
of SLAMS Networks
Reading Assignment
Read Section 2.0.11 of Quality Assurance Handbook for Air Pollution Measurement
Systems, Volume II—Ambient Air Specific Methods, EPA 600/4-77-027a.
Reading Assignment Topics
• System audit criteria for SLAMS networks
• Laboratory operations
• Field operations
• System audit procedure
Learning Goal and Objectives
Learning Goal
The purpose of this section is to familiarize you with criteria and procedures for the
system auditing of SLAMS networks.
Learning Objectives
At the end of this section, you should be able to —
1. identify items that should be evaluated during a system audit,
2. recognize criteria that should be used for evaluating items during a system
audit, and
3. describe, in sequence, the activities involved in conducting a system audit.
Reading Guidance
Specific criteria and a procedure for conducting system audits of agencies that
operate SLAMS networks are presented in this assignment. Much of the audit
criteria described in Section 2.0.11 have been discussed in previous reading
assignments, so this part of the assignment serves as a review of this information. A
system auditing procedure is also described in Section 2.0.11. An important con-
5-1
-------
sideration that is not addressed in the procedure is the need to determine if correc-
tive action recommended in the audit report is actually performed by the audited
agency. This determination is necessary because unresolved problems may affect data
quality.
Questionnaires to aid in conducting system audits are also provided in Section
2.0.11. You do not need to read these thoroughly. However, skim them to become
familiar with their formats.
The USEPA reference methods for suspended particulate matter and SO2 have
been revised since the publication of Section 2.0.11 (July 1, 1980). Therefore, some
of the information given in Section 2.0.11 does not reflect present Federal require-
ments described in the revised reference methods. These recommendations are con-
trasted with present Federal requirements below.
Table 5-1. Section 2.0.11 information versus reference methods for total suspended
particulate matter and sulfur dioxide.

Information given in Section 2.0.11 (as of July 1, 1980):

• The high-volume filter conditioning room or desiccator must be maintained
  between 15° and 35°C and at less than 50% relative humidity.
• Glass-fiber filters should be used for TSP sampling.
• Equipment specifications for TSP and lead sampling are described in the
  following sections of 40 CFR 50, Appendix B:
  Flow rate measuring device: 5.1.3
  Orifice calibration unit: 5.1.4 and 5.1.5
  Positive-displacement meter: 5.1.6
  Timers: 2.2
  Barometer: 5.1.7
• Equipment specifications for SO2 sampling are described in the following
  sections of 40 CFR 50, Appendix A:
  Sample train: 5.1.1 and 5.1.2
  Flow rate measuring device: 5.1.3
  Temperature-control equipment: 7.1.2
  Barometer: 5.1.7
  Pump: 5.1.2
  Vacuum gauge: 5.1.2
  Impinger gauge: 5.1.1
  Thermometers: 8.2.1, 8.2.2.1, and 9.1

Federal requirements described in revised suspended particulate matter and SO2
reference methods (after July 1, 1980):

• The high-volume filter conditioning environment must be maintained between
  15° and 30°C with less than a ±3°C variation and at less than 50% relative
  humidity, constant within ±5%.
• TSP sampling filters may be composed of glass fibers or other inert,
  nonhygroscopic material.
• Equipment specifications for TSP and lead sampling are presently described
  in the following sections of 40 CFR 50, Appendix B:
  Flow rate measuring device: 7.4.1 and 7.4.2
  Flow rate transfer standard: 7.8
  Positive-displacement meter: 9.2.1
  Timers: 7.7.2
  Barometer: 7.6
• Equipment specifications for SO2 sampling are presently described in the
  following sections of 40 CFR 50, Appendix A:
  Sample train: 7.1.2 through 7.1.6 and 7.1.8
  Flow rate measuring device: 7.1.7
  Temperature-control device: 7.1.10
  Barometer: No specifications are given.
  Pump: 7.1.9
  Vacuum gauge: 7.1.9
  Impinger gauge: 7.1.3
  Thermometers: No specifications are given.
5-2
-------
Design criteria for lead SLAMS and NAMS have been added to 40 CFR 58 since
the publication of Section 2.0.11. At least two lead SLAMS are required in areas
where the national ambient air quality standard (NAAQS) for lead is currently being
exceeded, and may be required in areas where the lead NAAQS has been exceeded
since January 1, 1974. Two lead NAMS are required in urbanized areas having
populations greater than 500,000.
Because Table 11-6 of Section 2.0.11 contains several errors, refer to Table 4 of
Section 2.0.2 for correct probe-siting criteria. Also, note the following as you read
Section 2.0.11:
• Subsection 11.1.3.1.e should be "control checks and their frequency," not
"control checks for their frequency."
• In Table 11.7, <60,000 should be >60,000.
• In Table 11.9, >10,000 should be <10,000.
• The superscript that appears in the footnote of the questionnaire entitled
"V. Analytical Methodology Operations: Total Suspended Particulates*"
should be 21, not 18.
• The superscripts that appear in the footnotes of the questionnaire entitled
"VI. Analytical Methodology Operations: SO2* and NO2† Impinger Bubbler
Samples" should be 22 and 23, not 19 and 20.
When you have finished the reading assignment, complete the review exercise
that follows and check your answers. The correct answers are listed on the page
following the review exercise. After you have reviewed your incorrect answers (if
any), take the final examination for this course. Follow the directions listed in the
Course Introduction section of this guidebook. Your course grade results will be
mailed to you.
5-3
-------
Review Exercise
Now that you've completed the assignment for Section 5, please answer the following
questions. These will help you determine whether you are mastering the material.
1. Which of the following activities would not be performed during a system audit
of an air quality monitoring program?
a. reviewing standard operating procedures
b. evaluating laboratory facilities
c. challenging air quality analyzers with audit gases
d. reviewing records of monitoring sites
2. True or False? During a system audit of an air quality monitoring program, the
air pollution training and experience of the program's staff should be reviewed.
3. Which of the following should be evaluated during a system audit of an air
quality monitoring program?
a. laboratory space
b. office space
c. field monitoring facilities
d. both a and c, above
e. a, b, and c, above
4. Questionnaires should be submitted (?) a system audit.
a. 4 to 6 weeks before
b. during
c. immediately after
5. True or False? Answers to system audit questionnaires should be typed.
6. When should system audit interviews be conducted?
a. before the audits
b. after the audits
c. both a and b, above
7. A convenient method for conducting a portion of a system audit is to trace
ambient air quality data from field measurement through (?)
a. data recording
b. data validation
c. data reporting
8. True or False? During a system audit, the auditor should determine if written
procedures are available and followed.
9. True or False? A written system audit report is not necessary if an exit interview
is conducted.
10. During a system audit, the (?) of equipment should be determined.
a. quantity
b. condition
c. both a and b, above
5-4
-------
Section 5
Answers to Review Exercise
1. c
2. True
3. e
4. a
5. False
6. c
7. c
8. True
9. False
10. c
5-5
-------
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
EPA 450/2-83-008
3. RECIPIENT'S ACCESSION-NO.
4. TITLE AND SUBTITLE
APTI Correspondence Course 471
General Quality Assurance Considerations for
Ambient Air Monitoring; Guidebook
5. REPORT DATE
May 1984
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
B. M. Ray
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Northrop Services, Inc.
P.O. Box 12313
Research Triangle Park, NC 27709
10. PROGRAM ELEMENT NO.
B18A2C
11. CONTRACT/GRANT NO.
68-02-3573
12. SPONSORING AGENCY NAME AND ADDRESS
U.S. Environmental Protection Agency
Manpower and Technical Information Branch
Air Pollution Training Institute
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED
Student Guidebook
14. SPONSORING AGENCY CODE
EPA-OAR-OAQPS
15. SUPPLEMENTARY NOTES
Project officer for this publication is R. E. Townsend, EPA-ERC, RTP, NC
27711.
16. ABSTRACT
This guidebook was developed for use in the Air Pollution Training Institute's
Correspondence Course 471, "General Quality Assurance Considerations for
Ambient Air Monitoring." It contains reading assignments and review exercises
covering the following topics:
—Quality Assurance Policy and Principles
—Quality Assurance for Air Quality Monitoring Systems
—Quality Assurance for SLAMS and PSD Air Monitoring Networks
—Performance Auditing of Air Quality Monitoring Systems
—System Auditing of SLAMS Networks
This guidebook is designed for use in conjunction with "Quality Assurance
Handbook for Air Pollution Measurement Systems, Volume II—Ambient Air Specific
Methods" (EPA 600/4-77-027a).
17. KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS
b. IDENTIFIERS/OPEN ENDED TERMS
COSATI Field/Group
Training
Air Pollution
Measurement
Quality Assurance
Ambient Air Monitoring
Training Course
13B
51
68A
18. DISTRIBUTION STATEMENT Unlimited.
Available from the National Technical
Information Service, 5285 Port Royal Road,
Springfield, VA 22161.
19. SECURITY CLASS (This Report)
Unclassified
21. NO. OF PAGES
121
20. SECURITY CLASS (This page)
Unclassified
22. PRICE
EPA Form 2220-1 (9-73)
5-7
------- |