EPA-600/2-76-159
June 1976
Environmental Protection Technology Series
IERL-RTP DATA QUALITY MANUAL
Industrial Environmental Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
-------
RESEARCH REPORTING SERIES
Research reports of the Office of Research and Development, U.S. Environmental
Protection Agency, have been grouped into five series. These five broad
categories were established to facilitate further development and application of
environmental technology. Elimination of traditional grouping was consciously
planned to foster technology transfer and a maximum interface in related fields.
The five series are:
1. Environmental Health Effects Research
2. Environmental Protection Technology
3. Ecological Research
4. Environmental Monitoring
5. Socioeconomic Environmental Studies
This report has been assigned to the ENVIRONMENTAL PROTECTION
TECHNOLOGY series. This series describes research performed to develop and
demonstrate instrumentation, equipment, and methodology to repair or prevent
environmental degradation from point and non-point sources of pollution. This
work provides the new or improved technology required for the control and
treatment of pollution sources to meet environmental quality standards.
EPA REVIEW NOTICE
This report has been reviewed by the U.S. Environmental
Protection Agency, and approved for publication. Approval
does not signify that the contents necessarily reflect the
views and policy of the Agency, nor does mention of trade
names or commercial products constitute endorsement or
recommendation for use.
This document is available to the public through the National Technical Informa-
tion Service, Springfield, Virginia 22161.
-------
EPA-600/2-76-159
June 1976
IERL-RTP
DATA QUALITY
MANUAL
by
Franklin Smith and James Buchanan
Research Triangle Institute
P.O. Box 12194
Research Triangle Park, North Carolina 27709
Contract No. 68-02-1398, Task 35
Program Element No. EHB524
EPA Project Officer: L. D. Johnson
Industrial Environmental Research Laboratory
Office of Energy, Minerals, and Industry
Research Triangle Park, NC 27711
Prepared for
U.S. ENVIRONMENTAL PROTECTION AGENCY
Office of Research and Development
Washington, DC 20460
-------
TABLE OF CONTENTS
SECTION PAGE
I INTRODUCTION 1
1.1 DEFINITIONS 1
1.2 BACKGROUND 2
II PURPOSE AND SCOPE OF DATA QUALITY MANUAL 3
2.1 PURPOSE 3
2.2 SCOPE 3
III QUALITY POLICIES AND OBJECTIVES 4
3.1 QUALITY POLICIES 4
3.2 QUALITY OBJECTIVES 5
IV ORGANIZATION FOR DATA QUALITY 7
4.1 ORGANIZATION STRUCTURE 7
4.2 FUNCTIONAL RESPONSIBILITIES 7
V IMPLEMENTATION PLAN AND SCHEDULE 13
5.1 IMPLEMENTATION SCHEDULE 13
5.2 ESTIMATED COST OF IMPLEMENTATION 15
VI COMPONENTS OF A COMPREHENSIVE QUALITY CONTROL
PROGRAM 16
6.1 GENERAL REMARKS 16
6.2 FACILITIES AND EQUIPMENT 16
6.3 CONFIGURATION CONTROL 18
iii
-------
TABLE OF CONTENTS (CONT.)
SECTION PAGE
6.4 PERSONNEL TRAINING 18
6.5 DOCUMENTATION CONTROL 19
6.6 CONTROL CHARTS 20
6.7 IN-PROCESS QUALITY CONTROL 20
6.8 PROCUREMENT AND INVENTORY PROCEDURES 22
6.9 PREVENTIVE MAINTENANCE 22
6.10 RELIABILITY 22
6.11 DATA VALIDATION 23
6.12 FEEDBACK AND CORRECTIVE ACTION 23
6.13 CALIBRATION PROCEDURES 24
VII GUIDELINES FOR QUALITY ASSURANCE PROGRAMS 25
7.1 GENERAL STATEMENT 25
7.2 THE REQUEST FOR PROPOSAL - QUALITY CONTROL ASPECTS 25
7.3 EVALUATION OF QUALITY CONTROL IN THE PROPOSAL 26
7.4 EVALUATION OF QUALITY CONTROL IN THE WORK PLAN 27
7.5 THE ON-SITE QUALITATIVE SYSTEMS AUDIT 27
7.6 THE PERFORMANCE AUDIT 28
7.7 DATA QUALITY ASSESSMENT 28
VIII DEVELOPMENT OF QUALITY CONTROL AND QUALITY ASSUR-
ANCE PLANS FOR IERL PROJECT CATEGORIES 30
8.0 GENERAL 30
-------
TABLE OF CONTENTS (CONT.)
SECTION PAGE
8.1 ENVIRONMENTAL ASSESSMENTS 30
8.2 INDUSTRY SYSTEM STUDIES/POLLUTANT SYSTEM STUDIES 33
8.3 FIELD STUDIES 36
8.4 RESEARCH AND BENCH-SCALE PROJECTS 38
8.5 DEVELOPMENT OR PILOT PROGRAM 39
8.6 DEMONSTRATION PROJECTS 42
IX REFERENCES 46
APPENDIX A QUALITATIVE ON-SITE SYSTEMS REVIEW
CHECKLIST 47
APPENDIX B STANDARD TECHNIQUES USED IN QUANTITATIVE
PERFORMANCE AUDITS 71
APPENDIX C DEFINITIONS AND STATISTICAL TECHNIQUES
USEFUL IN QUALITY ASSURANCE PROGRAMS 77
APPENDIX D SOME STANDARD AMBIENT AIR AND SOURCE
SAMPLING TECHNIQUES 85
-------
LIST OF FIGURES
FIGURE NO. PAGE
1 IERL-RTP ORGANIZATION CHART 8
2 DATA QUALITY PROGRAM ORGANIZATION 9
3 FLOWCHART OF FUNCTIONS AND RELATIONSHIPS 12
4 DATA QUALITY PROGRAM IMPLEMENTATION SCHEDULE 14
5 APPLICABILITY OF QUALITY CONTROL PROGRAM COMPONENTS TO
PROJECT CATEGORIES 17
6 STANDARD QUALITY CONTROL CHART 21
-------
SECTION I INTRODUCTION
1.1 DEFINITIONS
To facilitate understanding of this manual, it is necessary to define
three terms: quality, quality control, and quality assurance. It
is sometimes difficult to distinguish quality control from quality assurance
unless the proper definitions are kept in mind. The following definitions are
based on those given by the American Society For Quality Control (ref. 1) and
those given by the Environmental Protection Agency (ref. 2).
Quality
The totality of features and characteristics of a product or service
that bear on its ability to satisfy a given purpose. For measurement systems,
the product is measurement data and the characteristics of major importance
are accuracy, precision, and completeness. For monitoring systems, complete-
ness—or the amount of valid measurements obtained relative to the amount
expected to have been obtained—is usually a very important measure of quality.
The relative importance of accuracy, precision, and completeness depends upon
the particular project requirements.
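To make the completeness measure concrete, the following minimal sketch (in
Python, with hypothetical counts) computes completeness as a percentage of
expected measurements:

    # Completeness: valid measurements obtained relative to those expected.
    def completeness(valid_count, expected_count):
        """Return completeness as a percentage of expected measurements."""
        return 100.0 * valid_count / expected_count

    # Hypothetical example: 870 valid hourly readings out of 960 expected.
    print(completeness(870, 960))  # prints 90.625, i.e., about 90.6 percent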
Quality control
The overall system of activities, the purpose of which is to provide
a quality of product or service that meets the needs of users; also, the use
of such a system. Maintaining a quality control program is the responsibility
of the organization/individual implementing the project.
The aim of quality control is to provide quality that is adequate, depend-
able, and economical. The overall system involves integrating the quality
aspects of several related steps, including the proper specification of what
is wanted; production to meet the full intent of the specification; inspection
to determine whether the resulting product or service is in accordance with
the specification; and review of usage to provide for revision of specification.
-------
Quality assurance
A system of activities, the purpose of which is to provide assurance
that the overall quality control job is in fact being done effectively. The
system involves a continuing evaluation of the adequacy and effectiveness of
the overall quality control program with a view to having corrective measures
initiated where necessary. For a specific product or service, this involves
verifications, audits, and the evaluation of the quality factors that affect
the specification, production, inspection, and use of the product or service.
Maintaining an IERL-RTP quality assurance program is a function of the
Process Measurements Branch.
1.2 BACKGROUND
The IERL-RTP has long recognized the importance of quality control as an
integral part of its research and measurement activities. Heretofore, quality
control has been practiced on a project-by-project basis, with the preparation
and implementation of a quality control plan being the responsibility of the
EPA project officer and the contractor. However, the recent increased
impetus in EPA energy and industrial processes programs, and the consequent
substantial increase in the number, scope, and importance of environmental
assessment and technology development projects within IERL-RTP, have made
the need for a formal, comprehensive, and integrated laboratorywide data
quality program more critical. The development and implementation of a formal IERL-RTP data
quality program was initiated in December 1974 by the preparation and distri-
bution to senior staff members of the "Planning Document for an IERL-RTP
Quality Assurance Program" (ref. 3). This planning document groups all
IERL-RTP projects into five categories. Six categories now exist, with the
inclusion of environmental assessments. Further progress in developing and
implementing an IERL-RTP data quality program has been realized through the
preparation and on-site trial implementation of quality assurance procedures
at IERL-RTP projects (refs. 4,5,6). Ultimately, quality assurance guideline
documents will be generated for each project category.
-------
SECTION II PURPOSE AND SCOPE OF DATA QUALITY MANUAL
2.1 PURPOSE
The purposes of this manual are:
1. To direct the establishment of and provide guidance for the imple-
mentation of an integrated data quality program for Industrial
Environmental Research Laboratory - Research Triangle Park (IERL-
RTP) projects.
2. To serve as a source of directive and instructive material for
IERL-RTP project officers and contractors in the establishment and
maintenance of project-specific quality control programs sufficient
to insure that project data objectives are realized in the most
economical manner (sections VI and VIII).
3. To serve as guidelines for the Process Measurements Branch within
IERL-RTP in establishing and maintaining a quality assurance program
to monitor, assess, and document the efficiency of the various
project quality control programs (sections VII and VIII).
2.2 SCOPE
This manual describes the administrative systems pertaining to the
establishment and maintenance of an IERL-RTP data quality program and presents
guidelines for developing or designing quality control and quality assurance
plans specific to given projects.
The administrative systems include: quality policies that provide both
guidance for the establishment and implementation of a data quality program
and quality objectives to guide in the designing of quality control and quality
assurance plans (section III); organization, naming key quality personnel and
groups (section IV); and a plan and schedule for implementing the quality
program (section V).
Guidelines for developing project-specific quality control plans are given
in sections VI and VIII and for quality assurance plans in sections VII and
VIII.
-------
SECTION III QUALITY POLICIES AND OBJECTIVES
This section contains the IERL-RTP policies to be followed in the estab-
lishment and maintenance of a data quality program. The objectives to be
realized through a well-planned and conscientiously applied data quality program
are also given here. A time schedule and details for implementation of these
policies are given in section V. The organizational structure for establishing
and maintaining the data quality program is given in section IV.
3.1 QUALITY POLICIES
IERL-RTP policies pertaining to the development, implementation, and
maintenance of a data quality program are described by category below.
3.1.1 Coverage of the Data Quality Program
The IERL-RTP data quality program will have the following characteristics.
It will be complete in nature, encompassing both in-house and contract experi-
ments, tasks, and projects that either generate or use experimental data. It
will be integrated, in that all experiments, tasks, and projects must have a
quality control plan delineating the practices and procedures to be implemented
at each level (e.g., operator, bench chemist, project leader, etc.) and each
phase of the project. This plan will be evaluated and approved by the Process
Measurements Branch. All experiments, tasks, and projects will also have a
quality assurance plan for monitoring the effectiveness of the quality control
program to be implemented by the Process Measurements Branch. Finally, the
data quality program will be applied on a project-by-project basis according
to project objectives and requirements.
3.1.2 Levels of Quality Application
Quality practices and procedures will be implemented at two levels.
1. Quality control. The design and implementation of quality control
practices and procedures required to assure that data quality is
sufficient to meet project requirements are the responsibility of
the individual or organization conducting the project. For example,
on projects conducted under contract, the quality control plan will
be prepared by the contractor, and reviewed and approved by the
-------
EPA project officer with assistance from the Process Measurements
Branch if desired. Inhouse projects will have quality control plans
prepared by the responsible EPA staff member, with assistance from
the Process Measurements Branch if desired.
2. Quality assurance. Quality assurance procedures for independently
monitoring and assessing the efficiency and adequacy of individual
quality control programs will be established and administered by the
IERL-RTP director through the Process Measurements Branch. The quality
assurance procedures shall be applied uniformly throughout the
duration of the project. However, at any specific time during the
project life, either at the request of the subject project officer
or if deemed necessary by the Process Measurements Branch, the
Process Measurements Branch may, using accepted quality assurance tech-
niques, assess the IERL-RTP project's ongoing quality control program.
3.2 QUALITY OBJECTIVES
The primary objective of the IERL-RTP data quality program is to assure,
assess, and document that the quality (i.e., precision, accuracy, and complete-
ness) of measurements made by and/or experimental data used in IERL-RTP
activities and publications is commensurate with the end use of the data.
Management, administrative, statistical, investigative, preventive, and
corrective techniques will be employed to maximize the end effectiveness of the
data.
Specific data quality objectives are:
1. To establish acceptable limits on data quality as a function of
project objectives, available resources, and measurement method
capabilities;
2. To establish recommended procedures and require their use to insure
the comparability of like data between projects;
3. To establish guidelines for the selection and use of additional
measurement methods necessary to assure the collection of data of
acceptable quality (i.e., of acceptable precision, accuracy and
completeness) on a project-by-project basis;
-------
4. To develop and implement quality control programs on each specific
IERL-RTP project;
5. To develop and implement the quality assurance procedures necessary
to independently monitor the efficiency of the individual project
quality control programs;
6. To identify areas requiring new or improved measurement methods in
order to achieve the level of quality required to satisfy project
objectives.
-------
SECTION IV ORGANIZATION FOR DATA QUALITY
The organizational structure, functional responsibilities, levels of
authority, and lines of internal and external communication for management,
direction, and execution of the quality program are given here. Individuals
and groups or organizations discussed include the quality assurance manager,
the quality assurance coordinator, the project officer, the Process Measure-
ments Branch, and contractors.
4.1 ORGANIZATION STRUCTURE
IERL-RTP organizational structure is shown in figure 1. Figure 2 presents
the laboratory's data quality program organization.
The chief of the Process Measurements Branch (PMB) is designated as the
quality assurance manager and on data quality matters reports to the IERL-RTP
director. The quality assurance coordinator directs the activities of the
quality assurance group, which is composed of the Process Measurements Branch
staff members and contract support. The coordinator is directly responsible
to the quality assurance manager.
The quality control organizational hierarchy shows the control line moving
from the appropriate division director, to branch chief, project officer, then
contractor.
4.2 FUNCTIONAL RESPONSIBILITIES
The functional responsibility assignments for individuals and organizational
components are given in this section.
4.2.1 Quality Assurance Manager
The chief of the Process Measurements Branch is the quality assurance
manager. This person is responsible for the design, development, implementation,
and maintenance of the IERL-RTP data quality program. The chief directs the
efforts of the quality assurance coordinator and thus the data quality
activities of the Process Measurements Branch.
-------
[Figure: organization chart, ENVIRONMENTAL PROTECTION AGENCY, INDUSTRIAL
ENVIRONMENTAL RESEARCH LABORATORY, RESEARCH TRIANGLE PARK, NORTH CAROLINA.
IERL-RTP: Director, Dr. John Burchard; Deputy Director, Dr. Norbert Jaworski.
OFFICE OF PROGRAM OPERATIONS: Dr. John O. Smith; Special Studies Staff,
Dr. W. Gene Tucker; Planning, Management and Administration Staff,
Mr. William Rice.
UTILITIES AND INDUSTRIAL POWER DIVISION: Mr. Everett Plyler;
Emissions/Effluent Technology Branch, Mr. Mike Maxwell (Act.); Process
Technology Branch, Mr. Richard Stern; Particulate Technology Branch.
ENERGY ASSESSMENT AND CONTROL DIVISION: Mr. Robert Hangebrauck; Combustion
Research Branch, Dr. Joshua Bowen; Fuel Process Branch, Mr. T. Kelly Janes;
Advanced Process Branch, Mr. P. P. Turner, Jr.
INDUSTRIAL PROCESSES DIVISION: Mr. Alfred B. Craig; Chemical Processes
Branch, Dr. Dale Denny; Metallurgical Processes Branch, Mr. Norman Plaks;
Process Measurements Branch, Mr. James Dorsey.]
Figure 1. IERL-RTP organization chart. November 1975.
-------
[Figure: two parallel IERL-RTP organizational lines. The quality assurance
line runs from the Chief, Process Measurements Branch (quality assurance
manager) through the PMB quality assurance coordinator to the quality
assurance group (staff, Process Measurements Branch). The quality control
line runs from the division director through the branch chief and the
project officer to the contractor.]
Figure 2. Data quality program organization.
-------
4.2.2 Quality Assurance Coordinator
The quality assurance coordinator is responsible for overseeing all IERL-
RTP data quality efforts. This person coordinates the activities of the
Process Measurements Branch in the data quality program. He/she works with the
appropriate project officers in designing and implementing project-specific
quality control and quality assurance plans.
4.2.3 Process Measurements Branch
The Process Measurements Branch is responsible for coordinating
all IERL-RTP quality activities. It initiates measures to insure fulfillment
of the laboratory's overall quality objectives and carries out the data
quality policies in the most efficient and economical manner commensurate
with insuring continuing acceptable levels of completeness, accuracy, and
precision of the experimental data produced. A summary of the data quality
responsibilities and authority of the Process Measurements Branch is as follows:
1. It develops quality control guidelines and quality assurance programs,
including statistical procedures and techniques, which will help the
laboratory to meet desired quality standards at minimum cost; and
coordinates the implementation of such programs with the appropriate
project officer.
2. It reviews all measurement programs and insures that appropriate
methods have been selected for data acquisition.
3. It reviews quality control activities of the various projects and
makes appropriate suggestions to the project officer regarding
corrections and improvement.
4. It seeks out and evaluates new ideas and current developments in the
field of quality assurance/control and recommends implementation
wherever advisable.
5. It advises project officers in preparing and/or reviewing requests
for proposals, work plans, project implementation, and
reports of work with respect to quality aspects of technology,
methods, and equipment.
6. It advises on packaging materials and procedures for sample handling
and on requirements for maintaining sample integrity.
7. It advises project officers concerning schedules for system checks,
calibrations, and other checking procedures.
10
-------
8. It evaluates data quality statistically and maintains related quality
assurance records and other pertinent information.
9. It coordinates the IERL-RTP program with the Environmental Monitoring
and Support Laboratory/Quality Assurance Branch (EMSL/QAB) program.
10. It prepares and issues periodic quality assurance reports on specific
projects to the project officer and the IERL-RTP director.
11. It prepares and issues periodic quality assurance summaries to the
IERL-RTP director and to the EMSL-RTP director.
The quality assurance function is in the Process Measurements Branch and is
under the direction of the quality assurance manager. The branch will have in-house
and contract technical support as required to establish and maintain the data
quality program.
4.2.4 Project Officer
The IERL-RTP project officer is ultimately responsible for the success or
failure of the project. This person thus has the responsibility for determining
the optimum level of quality control for the project and for seeing that the
program is implemented and maintained. The project officer is responsible for
quality control practices, beginning with the preparation of the request for
proposal (RFP) and extending through the final report, as discussed in detail
in section VIII.
4.2.5 IERL-RTP Contractors
For IERL-RTP projects conducted under contract, the contractor is respon-
sible for designing, developing, implementing, and maintaining a quality control
program to insure that the experimental data generated will be of suitable
quality to satisfy the project requirements, and also for documenting that
quality. The quality control plan must be approved by the IERL-RTP project
officer.
4.2.6 Functional Relationships
Once a project has been defined or a contract negotiated, the time sequence
of events and the interrelations among the Process Measurements Branch, the
project officer, and the contractor are illustrated in figure 3. The first column in the
figure shows the data quality functions of the Process Measurements Branch. The
second column gives the functions of the project officer (in-house project) or the
project officer/contractor (contracts). The lines across columns show the points
of interaction between different quality components of the data quality program.
11
-------
[Figure: two-column flowchart. Quality assurance group (Process Measurements
Branch): develop standard measurement methods; establish data quality
specifications; review measurements program; define quality control
guidelines; develop quality assurance program; perform quality assurance
audit; issue quality assurance report. Quality control group (project
officer/contractor): select measurement methods; develop quality control
program; implement quality control; issue quality control reports.]
Figure 3. Flowchart of functional relationships.
12
-------
SECTION V IMPLEMENTATION PLAN AND SCHEDULE
The plan for developing and implementing a comprehensive data quality
program for IERL-RTP is predicated on the philosophy that a quality program
should be implemented gradually and in phases, such that the results from one
phase can be used constructively in implementing the next phase. Also, the
differences involved in applying data quality programs to ongoing projects
as compared with new projects must be considered.
The implementation plan requires that quality control and quality assur-
ance procedures be applied to new projects, both in-house and contract, from
the program's inception. That is, the Process Measurements Branch is presently
available to assist the project officer (contract projects) in specifying
quality control requirements for inclusion in the RFP and evaluating quality
aspects of proposals and work plans. For new in-house projects, the Process
Measurements Branch is available to help project officers prepare quality
control plans for their projects.
The general plan and schedule for developing and implementing a laboratory-
wide data quality program, starting with the preparation of a planning
document, are given below.
5.1 IMPLEMENTATION SCHEDULE
A schedule for developing and implementing a laboratorywide data quality
program is given in figure 4. In December 1974, a planning document for an
IERL-RTP quality assurance program was prepared under the direction of the
Process Measurements Branch and distributed to the laboratory's senior staff
members. The date of the distribution of the planning document is indicated
as the first milestone on the implementation schedule.
Milestones 2, 3 and 4 in figure 4 show, respectively, the preparation of
a general quality assurance plan for demonstration-size projects (EPA-600/2-
76-081); making the plan specific for the EPA Wet-Limestone demonstration
project at the Shawnee Steam-Electric Plant in Paducah, Kentucky (EPA-600/2-76-
082); and on-site implementation of that plan for evaluation purposes (EPA-
600/2-76-083). These three milestones occurred in the last two quarters of
1975.
13
-------
TASKS (milestones plotted by quarter, 1974 through 1977; the individual
milestone marks are not reproducible from the source):
1. Planning document for IERL-RTP quality assurance
2. General QA plan for demonstration-size project category
3. QA plan for EPA Shawnee Scrubber Demonstration Project
4. Trial implementation of QA plan at Shawnee Scrubber Demonstration Project
5. QA group available to implement QA at existing demonstration projects as
requested
6. All new demonstration projects require QC and QA programs
7. General QC/QA plans for the environmental assessment project category
8. General QC/QA plans for development or pilot programs
9. Implement and maintain QA programs for development or pilot programs not
covered by 6 alone
10. General QC/QA plans for field studies
11. General QC/QA plans for research and bench-scale projects
12. General QC/QA plans for industry system studies/pollutant system studies
Figure 4. Data quality program implementation schedule.
14
-------
Since completing the general quality assurance plan and testing it under
field conditions, the primary effort has been directed toward extending
quality assurance procedures to other existing demonstration-size projects.
Effective September 1, 1976, all new demonstration projects and those
existing demonstration projects having 1 year or longer to run must have quality
control and quality assurance plans approved by the Process Measurements Branch
(milestone 6). These plans include both quality control procedures to be carried
out by the individual or organization conducting the project and quality
assurance procedures to be administered through the Process Measurements Branch
for monitoring the efficiency of the quality control procedures.
After September 1, 1976, general quality assurance plans will be prepared
for the remaining five project categories, shown as milestones 7, 8, 10, 11, and
12 in figure 4. Because of the current interest in environmental assessments,
the second general guideline document will be for this category. As figure 4
illustrates, a complete data quality program covering all IERL-RTP projects
should be in effect by early 1978.
5.2 ESTIMATED COST OF IMPLEMENTATION
A definitive cost estimate for implementing and maintaining a data
quality program cannot be made at this point in time. However, with the degree
of importance placed on data quality by IERL-RTP, realistic estimates of
average costs per project are: (1) quality control costs in the range of 8
to 10 percent of the total project budget, and (2) quality assurance costs
between 2 and 5 percent of the total project budget. From these estimates,
then, the costs of the IERL-RTP data quality program as described in this manual
when fully implemented should be in the range of 10 to 15 percent of the total
laboratory project budget.
The above cost estimates apply to new projects. The net additional cost,
if any, for implementing a data quality program to an ongoing project will
depend upon the project's current level of quality control and quality assurance
activities.
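As a worked illustration of the percentage ranges above, the following sketch
(in Python; the budget figure is hypothetical) computes the implied quality
control and quality assurance cost ranges for a single project:

    # Apply the estimated quality program cost ranges to a hypothetical budget.
    budget = 500_000  # total project budget in dollars (hypothetical)

    qc_low, qc_high = 0.08 * budget, 0.10 * budget  # quality control: 8 to 10 percent
    qa_low, qa_high = 0.02 * budget, 0.05 * budget  # quality assurance: 2 to 5 percent

    print(f"QC: ${qc_low:,.0f} to ${qc_high:,.0f}")  # QC: $40,000 to $50,000
    print(f"QA: ${qa_low:,.0f} to ${qa_high:,.0f}")  # QA: $10,000 to $25,000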
15
-------
SECTION VI COMPONENTS OF A COMPREHENSIVE QUALITY
CONTROL PROGRAM
6.1 GENERAL REMARKS
It is highly desirable that quality control and quality assurance programs
be built into an IERL-RTP project from its inception; i.e., that quality control
requirements be stated in the RFP, and that the contractor's provision for
meeting those requirements be evaluated in the proposal and in the ensuing work
plan. In this way, quality control is not treated as an extra-cost or add-on
problem, but rather as an integral part of the total project.
Similarly, the project officer must be aware of the desirability of main-
taining an adequate quality assurance program in order to insure the validity
of the data from the project. Individual project categories, from environmental
assessments through demonstration projects, have varying quality control and
quality assurance requirements. Section VIII will treat some of these require-
ments individually, especially with regard to the RFP, proposal evaluation,
and work plan review.
For efficiency of presentation, the major components of a quality control
program are treated in a general way in the following subsections. Figure 5
is a matrix that attempts to show the applicability of each of these components
to each project category. (A description of each category is given in section
VIII.) The matrix is obviously arbitrary, and should serve as a general guide
only, since individual projects will each have unique data requirements. The
coding of the various quality control components is by the appropriate sub-
section number; e.g., component 6.2 in the matrix refers to subsection 6.2,
Facilities and Equipment.
6.2 FACILITIES AND EQUIPMENT
A good beginning point in the assessment of an ongoing project is a
general survey of the facilities and equipment available for day-to-day opera-
tion. Are they adequate for the job at hand? Do standards exist for evalua-
tion of facilities, equipment, and materials?
The laboratories and data processing and other operational areas should
be neat and orderly, within common-sense limits imposed by the nature of the
16
-------
QUALITY CONTROL PROGRAM COMPONENTS
ff
ft
PROJECT CATEGORIES
ENVIRONMENTAL*
ASSESSMENTS
INDUSTRY/POLLUTANT
SYSTEM STUDIES
FIELD STUDIES
RESEARCH AND BENCH-
SCALE PROJECTS
DEVELOPMENT OR
PILOT-PLANT PROGRAMS
DEMONSTRATION
PROJECTS
1**
X
X
X
X
8
X
X
X
7_r / / W/ 7 / */"/ ,
X
X
X
X
X
X
X
X
X
1
X
x
x
X
x
X
X
X
X
x
X
x
X
x
Level I only; Levels II & III require more extensive QC programs, with Level III requiring all program
components, as with demonstration projects.
Figure 5. Applicability of quality control program components to project categories.
-------
operation. Laboratory benches, particularly areas where critical operations
such as weighing are carried out, should be kept clear of all but necessary
tools, glassware, etc. Personal items (coats, hats, lunch boxes) should not
be left in the work area. Provision should be made for storage of these items
in personal lockers. A neat, well-organized laboratory area serves to inspire
neatness and organization among the laboratory workers.
Good laboratory maintenance, particularly for certain types of instrumen-
tation, requires complete manuals that are kept in a convenient place so they
are readily available to appropriate personnel. Responsibility for keeping up
with any necessary manuals should be given to an individual, with the under-
standing that he/she must devise a system (check-in/checkout) for quick
location of each document.
6.3 CONFIGURATION CONTROL
For IERL projects of moderate to long-term duration, the documentation of
design changes in the system must be carried out unfailingly. Procedures for
such documentation should be written and be accessible to any individual
responsible for configuration control. It is all too easy, as the system is
modified repeatedly, to allow one key person to hold, largely by memory, great
amounts of vital information. Much of this information would be lost if this
person were no longer available. Engineering schematics should be maintained
current on both the system and subsystem level, and all computer programs
should be listed and flow charted. Changes in computer hardware and software
must be documented, even when such changes are apparently trivial. Significant
design changes must be documented and forwarded to the EPA project officer by
way of established procedure.
6.4 PERSONNEL TRAINING
For long-term projects, it is highly desirable that there be a programmed
training system for new employees. This system should include motivation
toward producing data of acceptable quality standards. A part of the program
should involve "practice work" by the new employee. The quality of the work
can be immediately verified and discussed with the supervisor, with appropriate
corrective action taken. This system is to be preferred to on-the-job
18
-------
training, which may be excellent or slipshod, depending upon a number of
circumstances.
Key personnel (laboratory supervisors, senior engineers) should be
required to document their specialized knowledge and techniques so far as
possible. They should each be required to develop an assistant, if the
program personnel situation allows, who could take responsibility when the
senior person is unavailable. A most undesirable situation arises when
replacement personnel must be brought in and forced to gain knowledge of the
program through the experience of trial and error (see subsection 6.3 above).
This is not an infrequent occurrence, however, when budgeting constraints
override other priorities.
A thorough personnel training program should focus particular attention
on those people whose work directly affects data quality (calibration personnel,
bench chemists, etc.). These people must be cognizant of the quality standards
fixed for the project and the reasons for those standards. They must be made
aware of the various ways of achieving and maintaining quality data. As these
people progress to higher degrees of proficiency, their accomplishments should
be reviewed and then documented. A motivating factor for high performance
could be direct and obvious rewards (monetary, status, or both), offered in a
manner visible to other comparable personnel.
6.5 DOCUMENTATION CONTROL
If the project is of the type that generates a number of documents, pro-
cedures for making revisions to these documents must be clearly written out.
The revisions themselves should be written and distributed to all affected
parties, thus insuring that the change will be implemented and become permanent.
If a technical document change pertains to an operational activity, that change
should be analyzed for side effects. The change should not be rendered
permanent until any harmful side effects have been controlled.
Revisions to computer software should be written with reasons for the
changes clearly spelled out. The revisions should be distributed to all
affected parties.
19
-------
6.6 CONTROL CHARTS
For demonstration or pilot-plant programs, or any project where data are
taken on a long-term basis, control charts are essential as a routine check on
the consistency or "sameness" of the data precision. A control chart should
be kept for each measurement that directly affects the quality of the data.
Typically, control charts are maintained for duplicate analyses, percent iso-
kinetic sampling rates, calibration constants, and the like. An example control
chart is given as figure 6. The symbol σ (sigma) represents a difference, d,
of one standard deviation unit in two duplicate measurements, one of which is
taken as a standard or audit value. Two σ is taken as a warning limit and 3σ
as a control limit. If a laboratory measurement differs from the audit value
by more than 3σ, the technique is considered out of control. Control charts
are dealt with in depth in a number of standard texts on quality control of
engineering processes*.
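The following minimal sketch (in Python) illustrates the limit logic described
above, assuming σ has already been estimated from historical duplicate
analyses; the numeric values are hypothetical:

    # Classify duplicate-measurement differences against 2-sigma warning and
    # 3-sigma action (control) limits, as on the chart in figure 6.
    def chart_status(d, sigma):
        """Classify a difference d between a measurement and its audit value."""
        if abs(d) > 3 * sigma:
            return "out of control"  # beyond the action limit
        if abs(d) > 2 * sigma:
            return "warning"         # beyond the warning limit
        return "in control"

    sigma = 0.8  # hypothetical sigma from historical duplicate analyses
    for d in (0.5, -1.9, 2.7):
        print(d, chart_status(d, sigma))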
6.7 IN-PROCESS QUALITY CONTROL
During routine operation, critical measurement methods should be checked
for conformance to standard operating conditions (flow rates, reasonableness
of data being produced, and the like). The capability of each method to pro-
duce data within specification limits should be ascertained by means of
appropriate control charts. When a discrepancy appears in a measurement method,
it should be analyzed and corrected as soon as possible.
For all standard methods, the operating conditions must be clearly defined
in writing, with specific reference to each significant variable. Auxiliary
measuring, gaging, and analytical instruments should be maintained operative,
accurate, and precise by regular checks and calibrations against stable
standards that are traceable to a primary standard, preferably furnished by
the U.S. Bureau of Standards (if available).
*Quality Assurance Handbook for Air Pollution Measurement Systems, Vol. I,
Principles. EPA-600/9-76-005.
20
-------
[Figure: blank control chart with a center line (CL), warning limits at ±2σ,
and action limits at ±3σ (UCL and LCL), plotted against check number, with
rows for recording date/time, operator, and problem and corrective action.]
Figure 6. Standard quality control chart.
21
-------
6.8 PROCUREMENT AND INVENTORY PROCEDURES
There should be well-defined and documented purchasing guidelines for all
equipment and reagents having an effect on data quality. Performance specifica-
tions should be documented for all critical items of equipment. Chemical
reagents considered critical to an analytical procedure are best procured from
suppliers who agree to submit samples for testing and approval prior to initial
shipment. In the case of incoming equipment, there should be an established
and documented inspection procedure to determine if procurements meet the
quality control and acceptance requirements. The results of this inspection
procedure should be documented.
Whenever discrepant materials are detected, they should be either
returned or disposed of, at the discretion of the quality control supervisor.
Once an item has been received and accepted, it should be documented in
a receiving record log giving a description of the material, the date of the
receipt, results of the acceptance test, and the signature of the responsible
individual. It is then placed in inventory, which should be maintained on a
first-in, first-out basis. It should be identified as to type, age, and
acceptance status. In particular, reagents and chemicals that have limited
shelf life should be identified as to shelf expiration date and issued from
stock only if they are still within that date.
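The shelf-life and first-in, first-out rules lend themselves to a brief
illustration; the sketch below (in Python; item names and dates are
hypothetical) issues the oldest item in stock that is still within its
expiration date:

    from datetime import date

    # Hypothetical receiving-log records: (description, received, expires).
    inventory = [
        ("buffer solution, lot A", date(1976, 3, 1), date(1976, 9, 1)),
        ("buffer solution, lot B", date(1976, 5, 15), date(1977, 5, 15)),
    ]

    def issue_reagent(stock, today):
        """Issue the oldest (first-in) item still within its shelf life."""
        for item in sorted(stock, key=lambda rec: rec[1]):  # first-in, first-out
            description, received, expires = item
            if today <= expires:
                stock.remove(item)
                return description
        return None  # nothing in stock is still usable

    # Lot A has passed its expiration date, so lot B is issued.
    print(issue_reagent(inventory, date(1976, 10, 1)))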
6.9 PREVENTIVE MAINTENANCE
For long-term projects, it is important that preventive maintenance
procedures be clearly defined and written for each measurement system and its
support equipment. When maintenance activity is necessary, it should be
documented on standard forms maintained in logbooks. A history of the
maintenance record of each system serves to throw light on the adequacy of its
maintenance schedule and parts inventory.
6.10 RELIABILITY
The reliability of each component of a measurement system relates directly
to the probability of obtaining valid data from that system. It follows that
procedures for reliability data collection, processing, and reporting should
be clearly defined and in written form for each system component. Reliability
22
-------
data should be recorded on standard forms and kept in a logbook. If this
procedure is followed, the data can be utilized in revising maintenance and/or
replacement schedules.
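As one simple use of logged reliability data, the sketch below (in Python; the
failure intervals are hypothetical) estimates mean time between failures,
which could in turn inform maintenance and replacement schedules:

    # Estimate mean time between failures (MTBF) from logged intervals.
    failure_hours = [310, 450, 95, 720, 260]  # hypothetical hours between failures

    mtbf = sum(failure_hours) / len(failure_hours)
    print(f"MTBF: {mtbf:.0f} hours")  # MTBF: 367 hours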
6.11 DATA VALIDATION
Data validation procedures must be defined for each project. An environ-
mental assessment and a demonstration project will have entirely different
procedures, since in one case the data are taken on a "one-point" basis and in
the other a great quantity of data is accumulated over a long period of time,
usually years. Whatever the nature of the project, it is important that the
criteria for data validation be documented. Whenever practical, acceptance
limits should be established, these limits being subject to modification as
the program continues. Any required data validation activities should be
recorded in standard form in a logbook. Where possible (as in most demonstra-
tion projects), validation criteria should be programmed, so that routine data-
taking will include automatic flagging of invalid data.
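A minimal sketch of such automatic flagging (in Python; the acceptance limits
and readings are hypothetical) follows:

    # Flag readings that fall outside documented acceptance limits.
    limits = {"SO2_ppm": (0.0, 2000.0), "flow_scfm": (50.0, 150.0)}

    def validate(record):
        """Return the (field, value) pairs that violate acceptance limits."""
        flags = []
        for name, value in record.items():
            low, high = limits[name]
            if not low <= value <= high:
                flags.append((name, value))
        return flags

    print(validate({"SO2_ppm": 2450.0, "flow_scfm": 97.0}))  # [('SO2_ppm', 2450.0)]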
Most projects should, on a random but regular basis, be subjected to
quality audits. These audits must be independent of normal project operations,
preferably performed by an independent organization. Such audits should
include both systems reviews and independent measurement checks. Section VII
discusses these elements of a quality assurance program.
6.12 FEEDBACK AND CORRECTIVE ACTION
Closely tied to the detection of invalid data is the establishment of a
closed-loop mechanism for problem detection, reporting, and correction. Here
it is important that problems be reported to those personnel
who can take appropriate action. For most projects, a feedback and corrective
action mechanism should be written out, with individuals assigned specific
areas of responsibility. Again, documentation of problems encountered and
actions taken is most important. Standard forms, kept in a logbook, are
recommended. If appropriate, a periodic summary report on problems and correc-
tive action should be prepared and distributed to the appropriate levels of
management. This report should include: a listing of major problems for the
reporting period; names of persons responsible for corrective action;
23
-------
criticality of problems; due dates; present status; trend of quality performance
(i.e., response time, etc.); and a listing of items still open from previous
reports.
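One way to hold such report entries in a standard form is sketched below (in
Python; the field names and values are illustrative, not a prescribed format):

    from dataclasses import dataclass
    from typing import Optional

    # One standard-form entry in a feedback and corrective action log,
    # mirroring the report contents listed above.
    @dataclass
    class ProblemReport:
        description: str
        responsible_person: str
        criticality: str              # e.g., "high", "medium", "low"
        due_date: str
        status: str                   # e.g., "open", "closed"
        corrective_action: Optional[str] = None

    entry = ProblemReport("pump calibration drift", "J. Smith",
                          "high", "1976-08-15", "open")
    print(entry.status)  # open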
6.13 CALIBRATION PROCEDURES
All IERL-RTP project categories except paper studies involve the taking of
experimental data. The quality of these data relates directly to the care with
which calibration procedures are carried out. It is not an exaggeration to say
that calibration procedures are the crux of any attempt to produce quality data
from a measurement system. For this reason it is extremely important that the
procedures be technically sound and consistent with whatever data quality re-
quirements exist for that system. Calibration standards must be specified for
all systems and measurement devices, with written procedures for assuring, on
a continuing basis, traceability to primary standards. Since calibration
personnel may change from time to time, the procedures must be in each instance
clearly written in step-by-step fashion. Frequency of calibration should be
set and documented, subject to rescheduling as the data are reviewed. Full
documentation of each calibration and a complete history of calibrations per-
formed on each system are absolutely essential. This permits a systematic
review of each system's reliability.
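A minimal sketch of calibration-frequency tracking from a calibration history
log (in Python; device names and intervals are hypothetical) follows:

    from datetime import date, timedelta

    # (last calibration date, calibration interval in days) for each device.
    history = {
        "dry gas meter": (date(1976, 5, 1), 90),
        "orifice meter": (date(1976, 7, 20), 30),
    }

    def due_for_calibration(log, today):
        """Return the devices whose calibration interval has elapsed."""
        return [name for name, (last, days) in log.items()
                if today - last > timedelta(days=days)]

    print(due_for_calibration(history, date(1976, 9, 1)))
    # ['dry gas meter', 'orifice meter']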
Good calibration techniques and procedures are also vital to short-term
experimental projects, since data taken from these studies may be used as
justification for expanded efforts in certain directions.
24
-------
SECTION VII GUIDELINES FOR QUALITY ASSURANCE PROGRAMS
7.1 GENERAL STATEMENT
The objective of a quality assurance program is to independently evaluate
the quality control program of a project. The quality assurance program must
be appropriate to the work being done and to the quality control program for
the project data. As mentioned earlier, it is most important that each project
have provision for an adequate quality control program. Section 7.2 will give
appropriate quality control statements for the RFP, while sections 7.3 and 7.4
will deal with the evaluation of quality control in the proposal and the work
plan. The initial stage of a quality assurance program should be assistance
in planning and development of the project data quality control program.
Short-term experimental projects must have rigorous quality control of the
ensuing data. Quality assurance on such projects may consist of "one-shot"
audits, or there may be no formal quality assurance because of time constraints.
A minimum quality assurance approach would involve sampling a percentage of the
raw data collected and verifying the calculations. The techniques used and
general approach could also be reviewed for appropriateness, and an attempt
made at comparing the work being done with that of other investigators. This
can be done off-site, if necessary, and at minimal expense.
For IERL projects of moderate to long duration, the assessment of quality
control should normally consist of a series of systems and performance audits.
The frequency of such audits obviously should be dictated by the specific
project. It is recommended that audits be conducted at least once each calendar
year. The initial systems and performance audit should take place within the
first quarter of the first project year. Subsequent scheduling should be
dependent on the requirements of management and the apparent quality of the
day-to-day data being obtained. More frequent auditing may be necessary in
the initial stages of the project.
7.2 THE REQUEST FOR PROPOSAL - QUALITY CONTROL ASPECTS
The design of the RFP is predicated on stating as clearly as possible what
the objectives of the project are; e.g., to design, construct, and maintain a
given control system, systematically examining the interaction of appropriate
25
-------
system parameters. The quality of the data obtained from the project will
depend upon numerous factors—instrumentation, personnel, sampling technique,
sample size, statistical expertise. It is therefore critical that the RFP be
as explicit as possible in delineating two things—what quality of data is
expected, and how that quality is to be insured.
Generally speaking, the RFP should require that the bidding organizations
address each of the major areas of quality control discussed in sections 6.2
through 6.13 of this manual. Reference to figure 5 gives those areas of
quality control considered particularly appropriate for each project category.
Since most RFP's are limited in length, it would in most cases be inappro-
priate to include more than a brief (one- or two-paragraph) statement of
quality control requirements. Nevertheless, it is most important that the bid
solicitation be as explicit as possible concerning quality control. In those
cases where an RFP is quite lengthy, the quality control statement may be
several pages long.
7.3 EVALUATION OF QUALITY CONTROL IN THE PROPOSAL
The proposal should contain a statement of the precise position the
bidder's company takes regarding quality control programs. This should include
past projects and the effectiveness of the quality control programs on those projects.
In particular, there should be a clear and explicit response to the quality
control requirements stated in the RFP. This response must be compared directly,
item-by-item, with other proposals submitted against the RFP. The evaluation
should result in a determination of a "figure of merit" for the bidder's
quality control organization and the competence of the staff.
There should be provision for changes in procedures when it is evident
that data being obtained are not sufficiently accurate or appropriate for the
intent of the project as outlined by the project officer.
If a contractor has a good proposal but is unclear on some phases of data
quality, it would seem worthwhile to have him clarify his proposal by asking
him to answer specific questions. If the answers to these questions are still
vague, it is a good indication that the quality for these phases of the project
may be questionable if this contractor carries out the project.
26
-------
7.4 EVALUATION OF QUALITY CONTROL IN THE WORK PLAN
The work plan should be a detailed accounting of the actual steps to be
taken to complete the work delineated in the proposal and should be in direct
accord with the requirements of the RFP and other agreements with the project
officer. Particular attention should be placed on the areas considered
critical with respect to quality control, in order to realize the collection
of data having acceptable precision, accuracy, representativeness, and complete-
ness.
In cases where the submitted proposal has been accepted but lacks the
completeness required by the project officer, the problem areas should be
directly addressed in the work plan, showing the details of the work to be
done.
The work plan must be submitted to the project officer before any work is
begun by the contractor. The plan can be accepted in draft form, which will
allow for minor changes prior to the final plan's acceptance and approval.
7.5 THE ON-SITE QUALITATIVE SYSTEMS AUDIT
The objective of the on-site qualitative systems audit is to assess and
document (1) facilities; (2) equipment; (3) systems; (4) recordkeeping;
(5) data validation; (6) operation, maintenance, and calibration procedures;
and (7) reporting aspects of the total quality control program for a project.
The review should accomplish the following:
1. Identify existing system documentation; i.e., maintenance manuals,
organizational structure, operating procedures, etc.;
2. Evaluate the adequacy of the procedures as documented; and
3. Evaluate the degree of use of and adherence to the documented
procedures in day-to-day operations, based on observed conditions
and a review of applicable records on file.
To aid the auditor in performing the review, a checklist is included as
appendix A. This checklist should be modified, as appropriate, for various
projects.
27
-------
7.6 THE PERFORMANCE AUDIT
In addition to a thorough on-site systems review, quantitative performance
audits should be periodically undertaken. The objective of these audits is to
evaluate the validity of project data by independent measurement techniques.
It is convenient to classify the major measurement methods into three areas:
physical measurements, gas stream measurements, and liquid stream measurements
(the latter including analysis of any suspended solids). Appendix B lists in
table form a number of standard techniques for auditing in the three major areas
just mentioned. Table 1 (of appendix B) is a compilation of commonly measured
physical properties, with a selection of possible measurement, calibration, and
audit techniques. Table 2, concentrating on analysis of gas effluent streams,
lists the material to be analyzed, and measurement, calibration, and audit
techniques for that material. Finally, table 3 very briefly and generally
deals with measurement methods appropriate to liquids and solids. The specific
techniques vary widely from project to project, but for the analytical phase
the audit technique generally involves use of reference samples of known
composition and/or splitting a sample among several laboratories for independent
analyses. It is desirable to perform calibration checks on individual system
components, and/or do side-by-side sampling runs to compare both sampling and
analysis technique precision and accuracy.
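The comparison of field and audit measurements can be expressed as relative
differences, as in the minimal sketch below (in Python; the paired values are
hypothetical):

    # Compare field measurements against independent audit measurements.
    pairs = [(101.2, 98.7), (55.0, 57.1), (200.5, 199.0)]  # (field, audit)

    for field_value, audit_value in pairs:
        rel_diff = 100.0 * (field_value - audit_value) / audit_value
        print(f"relative difference: {rel_diff:+.1f}%")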
7.7 DATA QUALITY ASSESSMENT
Standard methods exist for estimation of the precision and accuracy of
measurement data. Efficient usage of the audit data requires that a rationale
be followed which gives the best possible estimates of precision and accuracy
within the limits imposed by timing, number of samples taken, and the general
situation at the project site.
Appendix C lists statistical definitions and techniques often used in
quality assurance work. Other statistical techniques exist which may apply to
specific projects (or to highly specialized areas of a given project). For
projects of sufficient duration, it is usually worthwhile to acquire the
services of a statistical consultant to most effectively treat the available
data.
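One common rationale is to estimate accuracy as the mean of the differences
(field value minus audit value) and precision as their standard deviation; a
minimal sketch (in Python, with hypothetical differences):

    import statistics

    # Differences d_i = field value minus audit value for six audit checks.
    d = [1.4, -0.6, 2.1, 0.3, -1.0, 0.8]

    bias = statistics.mean(d)        # average difference: estimated bias
    precision = statistics.stdev(d)  # sample standard deviation of differences
    print(f"bias = {bias:.2f}, precision (s) = {precision:.2f}")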
28
-------
As a general guide to expected data quality for a number of reference
methods, appendix D lists in table form both ambient air and source sampling
methods. An estimate of the method bias and precision with comments on major
error sources is given, and the appropriate EPA quality assurance guideline
document for each method is referenced.
29
-------
SECTION VIII DEVELOPMENT OF QUALITY CONTROL AND QUALITY
ASSURANCE PLANS FOR IERL PROJECT CATEGORIES
8.0 GENERAL
One approach to developing quality control and quality assurance plans
within IERL is to divide all projects into six categories, with projects within
a given category having common characteristics as to size, duration, objectives,
and data quality requirements. This makes them amenable to the same general
set of quality control and quality assurance practices and procedures. Sugges-
tions as to applicable practices and procedures are given for each of the
project categories in the following subsections.
The primary characteristics and objectives of projects within each project
category are given, followed by a discussion of quality control and quality
assurance applications to the different phases of a project cycle. These
phases, when the project is performed by a contractor, include (1) request for
proposal, (2) proposal evaluation, (3) work plan review, (4) project implementa-
tion, and (5) final report. Quality control practices for in-house projects
could begin with the work plan review. The categories 1 through 6 are:
1. Environmental assessments,
2. Industry system studies/pollutant system studies,
3. Field studies,
4. Research and bench-scale projects,
5. Development or pilot programs, and
6. Demonstration projects.
8.1 ENVIRONMENTAL ASSESSMENTS
8.1.1 Description of Project Category
An environmental assessment involves "(1) a systematic evaluation of the
physical, chemical, and biological characteristics of the streams associated
with a process; (2) predictions of the probable effects of the streams on the
environment; (3) the prioritization of the streams; and (4) identification of
30
-------
any necessary control technology programs."* The objectives of an assessment
include identification and measurement of pollutants for which specific
standards have been set, and also those suspected to have harmful effects on the
environment. The ultimate goal of environmental assessments is to insure that
the effluent streams from a process are within current environmental standards
of acceptability.
Sampling and analysis programs for environmental assessments are extensive,
and presently are conducted in three distinct phases or levels. Level I in-
volves a broad survey of all process streams. Level II concentrates on those
streams identified in Level I as "high priority," i.e., having significant
amounts of harmful materials present. These prioritized streams are subjected
to more quantitative studies, for the purpose of establishing the appropriate
control technology. Level III is a long-term, continuous monitoring program
for selected "indicator" materials.
8.1.2 Applicability of Quality Control and Quality Assurance Procedures
Request for Proposal
The RFP must specify the data quality requirements† for each level of the
assessment. For purposes of process stream prioritization, the RFP for a given
industrial or energy process should indicate, if possible, the "threshold"
concentrations of specified suspected or known pollutants. These concentrations
would be the minimum for designating those pollutants for more quantitative
study in Level II. It should be stated that such designated concentrations are
subject to modification in light of the results of Level I studies.
Because of the completeness and comprehensiveness of the sampling and
analysis programs in an environmental assessment, the RFP will not generally be
able to anticipate the data quality requirements for each measurement. Never-
theless, every attempt should be made to specify the quality control and quality
assurance requirements as completely as possible, so that prospective contractors
*Taken from a draft copy of "Guidelines for Environmental Sampling and
Analysis Programs - Historical Development and Strategy of a Phased Approach,"
by Dorsey, Lochmuller, Johnson and Statnick, IERL-EPA-RTP, 1976.
†As quantitatively as possible.
31
-------
will be realistic in their estimations of the cost of quality control for the
assessment.
Proposal Evaluation
Environmental assessments generally require extensive sampling and analyt-
ical capability. Each proposal must be evaluated on the known personnel and
equipment inventory of the bidding organization. Instrumentation requirements
may be large, and there must be experienced technical and professional personnel
to carry out each level of the assessment. The ability to produce high-quality
data, within whatever specifications were made in the RFP, should enter heavily
into the overall evaluation of each proposal.
Work Plan Review
The work plan affords the contractor the opportunity to demonstrate in
some detail how the assessment will be carried out, and in particular how the
data quality will be assured. The project officer, after reviewing the work
plan with the Quality Assurance Group representative, may wish to negotiate the allocation of more or less time or resources among the various phases of the assessment, in line with EPA's conception of the program's priorities and data quality requirements.
The work plan should be as definite as possible with respect to how the data
will be acquired and what quality control procedures will be employed.
Project Implementation
Periodic reports to the project officer should contain appropriate sample
data, statistical treatments, and contractor judgments as to the quality of
data being currently obtained. These reports should be carefully read and
analyzed by the project officer and Quality Assurance Group representative,
since these reports serve as the primary means of communication from the
contractor. If inadequate data quality procedures are being used, the project
officer should immediately make arrangements to improve quality control at the
project, and perhaps should modify the nature and/or frequency of systems
reviews and performance audits.* These quality assurance programs can, if correctly timed, point out contractor quality control deficiencies and make for minimum data loss.
*The assumption here is that the reviews and audits can be carried out by an audit team which visits the industrial or energy site and makes comparable measurements, as well as observing the contractor team in operation. This may not be practical in some situations.
Final Report
Quality control and quality assurance procedures must be included in the
final report of an environmental assessment, since the assessment is a phased
operation, and decisions as to priority will be made after each phase is
completed. Quality control and quality assurance must be documented such that
sufficient confidence may be placed in the data to justify the decisions made.
8.2 INDUSTRY SYSTEM STUDIES/POLLUTANT SYSTEM STUDIES
8.2.1 Description of Project Category
This category includes projects that are paper studies involving the
collection and analysis of air pollution measurement data from the literature
or personal contacts.
An industry system study, as performed in-house or by a contractor for IERL, is of short to medium duration (normally 3 to 6 months).
Typical objectives of an industry system study are the following:
1. To characterize an industry with respect to the type and magnitude
of air pollutant emissions;
2. To assess the present degree of control of these emissions; and
3. To evaluate the technical feasibility and the economic and environ-
mental costs and benefits of improved control methods.
Pollutant system studies are also usually of short duration. The purpose
of these studies is to identify the sources and to estimate the concentration
of a particular pollutant.
The data reported in a paper study must be of sufficient quality to allow IERL to determine the appropriate followup action. Quality requirements will depend upon the financial implications of the decisions to be made on the basis of the data. The quantitative confidence level required varies with the type of pollutant. For example, an industry system study reporting a nitrogen oxide emission level as 50 ppm ± 30 ppm would probably provide an adequate basis from which to determine that the emissions were of little consequence, whereas reporting the emission of vinyl chloride as 50 ppm ± 30 ppm would most likely spur an immediate followup for additional information. In general, the accuracy and precision of data reported in a paper study are not as critical as the data quality from, for example, a specific pilot plant study, since the paper study will be used for general program development decisions and will be followed by more detailed and specific projects when necessary.
8.2.2 Application of Quality Control and Quality Assurance Procedures
Request for Proposal
Quality assurance for a study begins with preparation of the RFP. The
project officer should consult with the Quality Assurance Group to insure that
proposals can be evaluated on an equivalent basis. The RFP should include the
specific type of data to be obtained and an indication of the type of precision
and accuracy statements to be attached to the reported data. For example, in
a pollutant system study (or industry system study), it may be stated that:
1. "All pollutant sources should be identified for which the concen-
tration of NO exceeds 50 ppm;"
-cv
or
2. "Concentration of NO is to be given for identified sources with
j^
estimates of the precision of the reported levels, e.g., + 20 percent
of the reported value."
The RFP should specify or request the bidder to estimate the sample size
necessary to insure that the sample is representative of the population of
interest. General requirements for statistical evaluation of the data obtained
as well as guidelines for referencing and assignment of confidence levels to
the data should be given in the RFP.
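As one hedged illustration of such a sample size estimate, the sketch below uses the familiar normal-theory formula n = (z * s / E)^2, where s is an assumed population standard deviation, E the allowable error in the estimated mean, and z the standard normal quantile; the RFP could of course specify another estimator.

    # Minimal sketch under normal-theory assumptions.
    import math

    def sample_size(assumed_sd, allowable_error, z=1.96):
        """Smallest n that estimates a mean within +/- allowable_error
        at roughly 95 percent confidence."""
        return math.ceil((z * assumed_sd / allowable_error) ** 2)

    print(sample_size(25.0, 10.0))  # 25 measurements for s = 25, E = 10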
Proposal Evaluation
A representative from the Quality Assurance Group should evaluate the
proposals and prepare a report to the project officer describing the ability of
each contractor to accomplish the above requirements. The rationale in the contractor's proposal for analyzing the data and drawing inferences and conclusions also provides a preview of quality control for the project. For example, if objective 2 above is specified, a contractor might propose to estimate the data quality by: (1) using the Federal Register guidelines on precision and accuracy for standard reference methods when the information is not presented in a report; (2) comparing to the reference methods when results are quoted with precision statements; or (3) contacting the author of the report or data tabulations if neither of the above is applicable.
Work Plan Review
When the contract is awarded, preparation and review of the work plan
allows refinement of the quality control procedures outlined in the proposal.
The project officer in conjunction with the Quality Assurance Group representa-
tive can ascertain if the contractor has allowed sufficient time and resources
to develop a reasonable data base from which statistical correlations can be
developed. The proposed methodology of analyzing and drawing conclusions from
the data can also be reviewed at this time for adequacy and completeness. The
work plan is expected to be more definitive than the proposal and should address
the data quality aspects, when required, even if the proposal did not.
Project Implementation
During the course of the project, monthly reports from the contractor to
the project officer provide the opportunity for continuing quality assurance.
Review of monthly reports provides the project officer the opportunity to
detect quality problems by comparing the interim results against acceptance
criteria established in the work plan. Early detection of problems can lead
to correction in a timely manner to avoid the use of questionable data or the
loss of time occasioned by correcting deficiencies late in the program. Correc-
tion of the problem may consist of modification of the information collection
or correlation procedure or perhaps a redefinition of the project objectives
in light of the accumulated experiences.
Final Report
The results of the study should be evaluated against several points of consideration. The depth of the evaluation clearly depends on the data quality objectives. Some of the points to be considered are:
1. Data sources;
2. Inferences from the data, considering:
   a. the effect of data precision and accuracy on conclusions, and
   b. the appropriateness of the analysis procedures (engineering and statistical);
3. Comparison with comparable studies/research that are considered to be of good quality;
4. Limitations of the data; and
5. The need to make additional measurements.
A summary of the applicability of the reported data along with the
appropriate comments on their quality should be included in the final report
when requested. This information should be reviewed by the Quality Assurance
Group and reported in abbreviated form to the laboratory director with a copy
to the project officer. These statements provide a source of information on
the adherence of the contractor to the data quality criteria as described in
the request for proposal, proposal, and/or work plan. This report by the
Quality Assurance Group serves as a project evaluation file and a source of
quality assurance information.
8.3 FIELD STUDIES
8.3.1 Description of Project Category
Projects in this category are usually of short duration, i.e., less than
2 or 3 months. There will be a great variation in data quality requirements
depending on the project objectives. A field study may, for example, be for
evaluation of a new device for sizing submicron particles.
The conduct of such studies will typically require a contractor familiar
with the industry, control system, or device to be assessed or evaluated. The
quality control techniques for the pertinent measurement systems will need to
be supplied by the project officer and Quality Assurance Group. The contractor
should be required to use standard reference methods or an equivalent method, if possible. The burden of proving that an alternative method is equivalent would rest upon the contractor, and the justification for using nonreference methods should be clearly stated.
8.3.2 Application of Quality Control and Quality Assurance Procedures
The degree of precision and accuracy requested in the reported results of
field studies may vary considerably from one study to another, and consequently
it is necessary to specify as carefully as possible the requirements of the
prospective contractor in the RFP. For example, measurement methods may be
specified, the desired precision of the results given, and the responder asked
to indicate in the proposal how the desired data quality levels can be attained
through proper measurement methods and appropriate quality control procedures.
The Quality Assurance Group can provide advice as to what levels of precision
are attainable, based on past experience.
Proposal Evaluation
The proposals submitted can be reviewed and rated as to their responses to the data quality aspects and the quality assurance methods. The Quality
Assurance Group would utilize its file of past experience on contractors to
aid in this evaluation.
Work Plan Review
After contract award, the work plan becomes one of the most effective
means for assurance of data quality through the use of appropriate measurement
methods and quality control techniques, and the use of suggested standards for
checking on the precision and accuracy of the method. The Quality Assurance
Group will review the work plan in detail for its inclusion of applicable
quality control techniques.
Project Implementation
During the project implementation phase, the Quality Assurance Group will
be kept informed through periodic reports from the contractor to the project
officer and from the project officer concerning adherence to the work plan, any
problems in doing so, and suggested corrections to problems. If the Quality
Assurance Group can advise as to the means for assuring data quality, it should
indicate this to the project officer for consideration/use by the contractor.
The contractor's proficiency in making the measurements can be evaluated and
estimates of the precision and accuracy of the measurements made by the use of
on-site system reviews and audits, conducted at the beginning and periodically
throughout the project life.
Final Report
At the termination of the contract and upon submittal of the final report,
the Quality Assurance Group should file an evaluation report on the study
briefly summarizing the pertinent data quality information, the capability of
the contractor to produce high-quality data, and general comments on the
contractor's quality assurance procedures as appropriate. This file of
evaluative information provides a ready source of data for future contractor
evaluation and data quality improvement techniques. The good techniques are
just as important to include in the evaluation as the deficient ones.
8.4 RESEARCH AND BENCH-SCALE PROJECTS
8.4.1 Description of Project Category
Research and bench-scale projects are exploratory studies of a pollutant
measurement or control method, with the primary objectives of developing and
evaluating a successful process or device. The duration of this type of
project is variable and may continue concurrently with pilot plant or prototype
studies.
Research projects are defined here as primarily paper studies consisting
of the theoretical development of a new technology. The bench-scale project is
the first experimental attempt to demonstrate the postulated results. The
data quality requirements of a bench-scale study are dependent on the nature
of the particular project. For example, documentation of the precision and
accuracy of measurements would be critical when bench-scale project results
support pilot plant studies, whereas measurements with an accuracy of ± 20 percent may be adequate for a preliminary demonstration of the feasibility of a new pollutant control device.
8.4.2 Application of Quality Control and Quality Assurance Procedures
Request for Proposal
An RFP leading to research and bench-scale studies may of necessity be
general since IERL cannot always anticipate the characteristics of novel devices
and innovative techniques proposed by contractors. However, the RFP and
proposal evaluation should require a discussion of measurement systems and basis
for selection.
Work Plan Review
When the contract is awarded, more specific guidelines can be laid out in
the preparation and review of the work plan, although a stringently defined
test plan is not always applicable to a bench-scale study. The project may be
conducted in a more cost-effective manner by allowing the researcher to make
exploratory studies with inexact measurements in the early stages of the project
and then to concentrate on additional measurement development specific to the
most promising directions.
Project Implementation
During the course of the project, frequent communication between the
contractor, the project officer, and the Quality Assurance Group is essential
for quality assurance. Written monthly reports are useful, but frequently more cost-effective project management and quality assurance can be achieved through personal conferences to review the progress of the work and the intended
direction for continuation. If the project is of such a nature that measurements
of high precision and accuracy are required, the project officer should require
documentation of the procedure for operations, calibrations, quality checks,
and validation of data. Data quality may be assessed by means of performance
audits using duplicate, control, and/or blind samples. These methods may
detect quality problems and allow for their correction early in the program.
Final Report
Quality assurance in the final report of a research or bench-scale project
is of critical importance, since the study may lead to additional funding of
the process or equipment development at a more sophisticated level. The methods
used for sampling and measurement of data, for analysis, and for correlation
must be rigorously presented so that the results could be duplicated in an
independent study if desired.
8.5 DEVELOPMENT OR PILOT PROGRAM
8.5.1 Description of Project Category
This program has as one of its major objectives the development and refine-
ment of theoretical and empirical models relating the process variables and
system parameters to the response variables of interest, e.g., concentration of
particulate matter. Hence, this is the stage of development in which an
experiment consisting of a series of runs is planned for deriving the relation-
ships of interest. In some cases the determination will be made of "optimum"
values of control system parameters for various plant parameters. Data quality
aspects are described in the steps of the contract procurement and implementa-
tion cycle.
8.5.2 Application of Quality Control and Quality Assurance Procedures
Request for Proposal
In this stage of control equipment development, the data quality objectives of the program should be clearly indicated. The statement of work
should include the request that the contractor suggest an experimental plan
that will assure the attainment of the quality objectives (assuming these
objectives to be compatible with the existing or developed technology). The
RFP should give relatively high priority to the data quality control plan as an
evaluation criterion of the proposal. Types of statements which might be
employed in the RFP are:
"Submit a proposed experimental plan to relate variables X, Y, Z,
and W to concentration of the pollutant in the effluent stream
and indicate how the precision of the relationship will be checked
by the use of the design plan and the associated analysis."
"Indicate the methods of measurement of critical variables and the
expected precision and accuracy of these methods."
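Purely as an illustration of the first statement, a fitted relationship and one simple precision check (the residual standard error) might look like the sketch below; the data are fabricated placeholders, and nothing here is prescribed by the RFP language above.

    # Fabricated data: least-squares fit of pollutant concentration on
    # process variables X, Y, Z, W, with the residual standard error as
    # one check on the precision of the fitted relationship.
    import numpy as np

    rng = np.random.default_rng(0)
    runs = rng.uniform(size=(20, 4))               # 20 runs of X, Y, Z, W
    conc = runs @ np.array([5.0, -2.0, 1.0, 0.5]) + rng.normal(0.0, 0.1, 20)

    design = np.column_stack([np.ones(20), runs])  # intercept + variables
    coef, *_ = np.linalg.lstsq(design, conc, rcond=None)
    resid = conc - design @ coef
    rse = float(np.sqrt(resid @ resid / (20 - design.shape[1])))
    print(coef.round(2), round(rse, 3))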
Proposal Evaluation
At this stage, the project officer and Quality Assurance Group should
review the data quality aspects of the proposals for the purpose of rating the
prospective contractors on their apparent ability to provide data of good
quality. At this point it should be noted that past records of contractor
performance will be very helpful. Judgment of new prospective contractors must be made on the basis of a careful evaluation of the discussion in their proposals. Absence of any details on data quality aspects, even where the discussion contains very good general ideas, should be a signal for concern and lower ratings.
Work Plan Review
Shortly after the award of the contract, the Quality Assurance Group should assist the project officer in evaluating the adequacy of the quality control aspects of the work plan. The plan should have considerably more detail concerning data quality than the proposal. Also, it is
possible that the selected contractor was strong on several other rating
criteria in the proposal evaluation but lower in the quality control aspects.
This would be the appropriate time to request that the contractor strengthen
the data quality program, in particular the experimental plan. At this point,
in order to insure good quality data that will meet the needs of IERL, the
following factors should be considered:
1. The experimental plan to attain objectives;
2. The means for adherence to the plan—control of system parameters;
3. Measurement methods of sufficient precision and accuracy for meeting
the project needs, and use of standards in checking the data quality;
4. Appropriate checks of measurements under specified conditions/control system parameters (i.e., independent or repeat runs should be included in the experimental plan, and duplicate measurements should be made for those results with large variation, for example, if the coefficient of variation [100 x standard deviation/mean] is greater than 5 percent; see the sketch following this list);
5. Data reporting and quality checks included in plan;
6. Analysis methods with indicated checks, if necessary, to insure
valid results.
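The duplicate-measurement trigger in item 4 can be sketched as follows (a minimal illustration; the example values are hypothetical):

    # Flag results whose coefficient of variation exceeds 5 percent.
    import statistics

    def needs_duplicates(values, cv_limit_pct=5.0):
        """True if CV = 100 * standard deviation / mean exceeds the limit."""
        cv = 100.0 * statistics.stdev(values) / statistics.fmean(values)
        return cv > cv_limit_pct

    print(needs_duplicates([49.0, 51.0, 50.5]))  # False (CV about 2 percent)
    print(needs_duplicates([40.0, 55.0, 60.0]))  # True (CV about 20 percent)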
The ultimate benefit of a well-designed experimental program should be the provision of the greatest information per unit cost of the experiment. Because the cost of a single run of an experiment of the type required at this stage of development is high, the greatest importance attaches to making the best possible use of the data, e.g., utilizing the preferred analysis methods and producing valid interpretations with attached measures of confidence/precision.
Project Implementation
Throughout the conduct of the project work, the Quality Assurance Group
should be made aware of any quality control problems through appropriate
reports, e.g., monthly progress reports and quarterly reports. The project
officer should provide for on-site checks of the quality control procedures.
This may be performed by personnel of the Quality Assurance Group or a
contractor so designated by the Group, with a written report provided to the
Quality Assurance Group. The Quality Assurance Group should advise the project
officer and project contractor on such matters of quality control as:
1. use of specific standards for checking pollutant measurement methods,
2. how often checks should be made,
3. calibration frequency,
4. use of quality control techniques as required, and
5. interlaboratory programs for pollutant measurement methods.
Final Report
The final report on the project should include the necessary information
on data quality. Raw data should be provided as an appendix to the report.
Only in circumstances of unusually large amounts of data should they be sepa-
rated from the report, e.g., provided in one copy by computer printout. The
analysis of the data should be illustrated by example, and a statement of the precision and accuracy of the data should be included, with the method of computation, particularly when it differs from well-documented methods. If relationships are presented (e.g., concentration of SO2 vs. selected control system parameters), raw data should be shown on the figures, if two-dimensional. Otherwise, the
raw data and corresponding predicted mean given by the relationship are to be
given in tabulated form. This allows more ready interpretation by IERL. The
raw data should be given by order of the experimental run along with remarks or
conditions surrounding the run which may be pertinent in some later evaluation
or analysis. The inability to obtain data for certain parameter values should
be noted. Pertinent raw data on the instrument checks should also be included
in the report with supporting data maintained in the lab notebooks, should
any questions arise. A statement of limitations and applicability of the data
should be included in the final report.
Based on the final report, the Quality Assurance Group should provide a
summary report to the director of IERL with a copy to the project officer
indicating the pertinent data aspects of the project. A checklist of consid-
erations should be employed for convenience in such evaluations.
8.6 DEMONSTRATION PROJECTS
8.6.1 Description of Project Category
These projects have as their goal the demonstration of control systems,
models, and methods in full-scale systems, i.e., under plant operational
conditions. The control systems, methods, and models will have been developed
under previous research and pilot-scale operations. The demonstration projects
usually will be of at least 1-year duration. The values of the control system
parameters and plant operation variables will be changed very little, offering
little opportunity for experimental research. There will be the opportunity
for the use of both quality control and quality assurance techniques. The data
quality will be an important aspect of this type of study. For example, quality
control chart data should be maintained on the critical parameters and response
measurements and these data provided with the final report in an appropriate
summary form. There will be the need for using well-documented measurement
methods of desired precision/accuracy. Standards should be used to calibrate
and maintain control of quality of the measurement methods. The data quality
aspects should be described in reference to the steps of the procurement and
project implementation stages of the control system demonstration and evaluation.
8.6.2 Application of Quality Control and Quality Assurance Procedures
Request for Proposal
The objectives of the demonstration program for the specified control
system and its associated models and methods of analysis should be clearly
indicated in the RFP. The contractor should be requested to give relatively
high priority to data quality, quality control and assessment techniques,
maintenance and calibration of pollutant measurement instruments, use of
appropriate standards, and other general quality control capabilities in the
proposal.
The Quality Assurance Group should provide inputs to the RFP based on the
previous studies of the control system and the requirements of data precision
and accuracy. An example of such a statement would be:
"Methods used for measuring the emissions of the pollutant
must include the standard reference methods of the Federal
Register—or an equivalent method, provided proof is given
that the method employed is equally precise and accurate."
Proposal Evaluation
The proposals should be reviewed and rated as to the discussion of the
data quality aspects of the demonstration study. This rating should be based
on the prospective contractor's familiarity with quality control procedures,
his current quality control activities, performance on previous studies if
available from the Quality Assurance Group files, and other general laboratory
practices, such as preventive maintenance of equipment. The Quality Assurance
Group should be responsible for review of this aspect of the proposals.
Work Plan Review
The selected contractor should prepare a work plan shortly after contract
award, detailing the project demonstration plan, which includes the quality
control aspects of the study. The work plan should contain much more detail
concerning quality control than the proposal. In cases where quality control
may have been a weak point of the selected contractor's proposal, this aspect
of the work plan will need to be reviewed carefully to assure that:
1. The measurement program will meet the objectives of the study if
properly implemented;
2. Means are presented for controlling and monitoring the plant and
control system parameters;
3. Measurement methods are sufficiently precise and accurate for
meeting the project needs; that standards will be used to calibrate
and maintain instruments, and interlaboratory and intralaboratory
measurements will be used to assure data quality;
4. Quality control charts will be maintained on selected parameters
deemed to be important;
5. Data reporting and quality checks will be included in the plan
and data quality will be assessed;
6. Problem identification, e.g., out-of-control operations, will be
indicated, with progress reports that give the problem and its
effect on data quality.
Project Implementation
During the implementation stage, the Quality Assurance Group should be
provided a copy of all progress reports that relate to any aspect of data
quality. At certain indicated milestones in the project work, the group
should check to see that the contractor is adhering to the work plan (data
quality aspects), and should advise the project officer and contractor of the
availability of certain standards, estimated precisions of specific methods
(if available and unknown to them), and uses of quality control charts. The
Quality Assurance Group is to serve for the duration of the project both as a
resource group, because of its records of similar information from several
projects that may use the same methods, and as an advisory source of information
on general aspects of quality assurance. The Quality Assurance Group should
design and coordinate through the project officer a program of performance
audits and on-site system inspections sufficient to assure, assess, and docu-
ment the data quality throughout the project's duration.
Final Report
The final report serves as a means of reporting and properly summarizing
the raw data, data quality, the precision and accuracy of measurement methods,
the methods of engineering and statistical analysis, and quality control chart
information. All raw data should be included in the final report, to the extent
possible. If the volume of data is excessive, then one computer printout might be filed with
IERL to back up the laboratory notebook maintained by the contractor, should
any problems arise in interpretation and analysis at a later date.
Methods of analysis of data quality should be provided unless they are
well-documented in the literature, in which case they may be referenced.
A statement of limitation, if any, and applicability of the results should
be given in the report. This may be part of an executive summary or in a
section entitled "Data Quality — Applicability of Methods and Results."
The results of the project demonstration as reported in the final report
should be assessed by the Quality Assurance Group, against both the data
quality objectives set forth in the RFP and the details provided in the work
plan. The contractor should be rated according to a checklist of items to be
used in such studies. A copy of this evaluation should be given to the director
of IERL, to the project officer, and perhaps to QAB. This file of project
evaluation closes the loop and provides useful information for future projects,
proposal evaluations, RFP statements, and alternate improvements of data
quality in all IERL-RTP projects in the area of activity.
SECTION IX REFERENCES
1. Glossary and Tables for Statistical Quality Control, American Society for Quality Control, Milwaukee, Wisconsin, 1973.
2. Quality Assurance Handbook for Air Pollution Measurement Systems,
Volume 1, Principles, EPA-600/9-76-005.
3. "Planning Document for a Control Systems Laboratory Quality Assurance
Program," Final Report for EPA Contract No. 68-02-1398, Task 8,
December 1974.
4. "Guidelines for Demonstration Project Quality Assurance Programs,"
Final Report for EPA Contract No. 68-02-1398, Task 20, January 1976.
5. "A Quality Assurance Program for the EPA Wet Limestone Scrubber Demon-
stration Project, Shawnee Steam-Electric Plant, Paducah, Kentucky,"
Final Report for EPA Contract No. 68-02-1398, Task 20, January 1976.
6. "Development and Trial Field Application of a Quality Assurance Program
for IERL Projects," Final Report for EPA Contract No. 68-02-1398, Task
20, January 1976.
APPENDIX A QUALITATIVE ON-SITE SYSTEMS
AUDIT CHECKLIST
This checklist gives three descriptions for each facet of a typical quality control system. In all cases, the "5" choice indicates the most desirable and effective mode of operation; "3" is marginal but tolerable; "1" is definitely unacceptable and ineffective as a mode of operation.
It is not always possible to describe accurately all options with only
three choices. Therefore, a "2" or "4" rating may be selected if the
evaluator feels that an in-between score is more descriptive of the actual
situation.
After all the applicable questions are answered, an average is computed
to give an overall indication of the quality control system effectiveness.
Generally, a rating of 3.8 or better is considered acceptable.
A rating between 2.5 and 3.8 indicates a need for improvement but no
imminent threat to current project performance.
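A minimal sketch of this scoring rule follows; the label attached to averages below 2.5 is inferred from the description of the "1" rating above rather than stated explicitly.

    # Average the applicable 1-5 item scores and classify the system.
    def classify(scores):
        avg = sum(scores) / len(scores)
        if avg >= 3.8:
            return avg, "acceptable"
        if avg >= 2.5:
            return avg, "needs improvement, no imminent threat"
        return avg, "unacceptable"   # inferred from the '1' description

    print(classify([5, 3, 4, 5, 3, 4]))  # (4.0, 'acceptable')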
A.1 QUALITY ORGANIZATION
SCORE
(1.1) Overall responsibility for quality assurance (or
quality control) for the organization is:
(a) Assigned to one individual by title (e.g.,
Quality Control Coordinator). 5
(b) Assigned to a specific group within the organization. 3
(c) Not specifically assigned but left to the discretion of the various operational, analytical, inspection, and testing personnel. 1
(1.2) The Quality Control Coordinator is located in the organization such that:
(a) He has direct access to the top management level
for the total operation, independent of others in-
volved in operational activities. 5
(b) He performs as a peer with others involved in
operational activities, with access to top manage-
ment through the normal chain of command. 3
(c) His primary responsibility is in operational
activities, with quality assurance as an extra or
part-time effort. 1
(1.3) Data reports on quality are distributed by the Quality Control Coordinator to:
(a) All levels of management.* 5
(b) One level of management only. 3
(c) The quality control group only. 1
(1.4) Data Quality Reports contain:
(a) Information on operational trends, required
actions, and danger spots. 5
(b) Information on suspected data/analyses and
their causes. 3
(c) Percent of valid data per month. 1
*Management at appropriate levels in all applicable organizations, such as subcontractors, prime contractor, and EPA.
A.2 THE QUALITY SYSTEM
SCORE
(2.1) The quality control system is:
(a) Formalized and documented by a set of procedures
which clearly describe the activities necessary
and sufficient to achieve desired quality objec-
tives, from procurement through to reporting data
to the EPA/RTP. 5
(b) Contained in methods procedures or is implicit in
those procedures. Experience with the materials,
product, and equipment is needed for continuity
of control. 3
(c) Undefined in any procedures and left to the current managers or supervisors to determine as the situation dictates. 1
(2.2) Support for quality goals and results is indicated by:
(a) A clear statement of quality objectives by the top
executive, with continuing visible evidence of its
sincerity, to all levels of the organization. 5
(b) Periodic meetings among operations personnel and the
individual(s) responsible for quality assurance, on
quality objectives and progress toward their achieve-
ment. 3
(c) A "one-shot" statement of the desire for product
quality by the top executive, after which the quality
assurance staff is on its own. 1
(2.3) Accountability for quality is:
(a) Clearly defined for all sections and operators/
analysts where their actions have an impact on
quality. 5
(b) Vested with the Quality Control Coordinator who
must use whatever means possible to achieve quality
goals. 3
(c) Not defined. 1
(2.4) The acceptance criteria for the level of quality of the demonstration project's routine performance are:
(a) Clearly defined in writing for all characteristics. 5
(b) Defined in writing for some characteristics, with others dependent on experience, memory, and/or verbal communication. 3
(c) Only defined by experience and verbal communication. 1
(2.5) Acceptance criteria for the level of quality of the project's routine performance are determined by:
(a) Monitoring the performance in a structured program of inter- and intralaboratory evaluations. 5
(b) Scientific determination of what is technically feasible. 3
(c) Laboratory determination of what can be done using currently available equipment, techniques, and manpower. 1
(2.6) Decisions on acceptability of questionable results are
made by:
(a) A review group consisting of the chief chemist or
engineer, quality control, and others who can render
expert judgment. 5
(b) An informal assessment by quality control. 3
(c) The operator/chemist. 1
(2.7) The quality control coordinator has the authority to:
(a) Affect the quality of analytical results by in-
serting controls to assure that the methods meet
the requirements for precision, accuracy, sensi-
tivity, and specificity. 5
(b) Reject suspected results and stop any method that
projects high levels of discrepancies. 3
(c) Submit suspected results to management for a
decision on disposition. 1
A.3 IN-PROCESS QUALITY ASSURANCE
(3.1) Measurement methods are checked:
(a) During operation for conformance to operating
conditions and to specifications, e.g., flow rates,
reasonableness of data, etc. 5
(b) During calibration to determine acceptability
of the results. 3
(c) Only when malfunctions are reported. 1
(3.2) The capability of the method to produce within specification limits is:
(a) Known through method capability analysis (X-R
Charts) to be able to produce consistently
acceptable results. 5
(b) Assumed to be able to produce a reasonably
acceptable result. 3
(c) Unknown. 1
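As a minimal illustration of the X-R (average and range) chart computation named in item (3.2), the sketch below derives control limits from rational subgroups using the standard constants for subgroups of size 4; the subgroup size and data are assumptions made for this sketch.

    # X-bar and R control limits for subgroups of size 4
    # (standard constants: A2 = 0.729, D3 = 0, D4 = 2.282).
    def xbar_r_limits(subgroups, a2=0.729, d3=0.0, d4=2.282):
        xbars = [sum(g) / len(g) for g in subgroups]
        ranges = [max(g) - min(g) for g in subgroups]
        grand_mean = sum(xbars) / len(xbars)
        r_bar = sum(ranges) / len(ranges)
        return ((grand_mean - a2 * r_bar, grand_mean + a2 * r_bar),  # X-bar
                (d3 * r_bar, d4 * r_bar))                            # R

    print(xbar_r_limits([[50.1, 49.8, 50.3, 50.0],
                         [49.9, 50.2, 50.4, 49.7]]))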
(3.3) Method determination discrepancies are:
(a) Analyzed immediately to seek out the causes and
apply corrective action. 5
(b) Checked out when time permits. 3
(c) Not detectable with present controls and procedures. 1
(3.4) The operating conditions (e.g., flow rate, range,
temperature, etc.) of the methods are:
(a) Clearly defined in writing in the method for each
significant variable. 5
(b) Controlled by supervision based on general guide-
lines. 3
(c) Left up to the operator/analyst. 1
(3.5) Auxiliary measuring, gaging, and analytical instruments are:
(a) Maintained operative, accurate, and precise by regular checks and calibrations against stable standards which are traceable to the U.S. Bureau of Standards. 5
(b) Periodically checked against a zero point or other reference and examined for evidence of physical damage, wear, or inadequate maintenance. 3
(c) Checked only when they stop working or when excessive defects are experienced which can be traced to inadequate instrumentation. 1
A.4 CONFIGURATION CONTROL
(4.1) Procedures for documenting, for the record, any design change in the system are:
(a) Written down and readily accessible to those
individuals responsible for configuration con-
trol. 5
(b) Written down but not in detail. 3
(c) Not documented. 1
(4.2) Engineering schematics are:
(a) Maintained current on the system and subsystem
levels. 5
(b) Maintained current on certain subsystems only. 3
(c) Not maintained current. 1
(4.3) All computer programs are:
(a) Documented and flow charted. 5
(b) Flow charted. 3
(c) Summarized. 1
(4.4) Procedures for transmitting significant design changes
in hardware and/or software to the EPA project officer
are:
(a) Documented in detail sufficient for implementation. 5
(b) Documented too briefly for implementation. 3
(c) Not documented. 1
A.5 DOCUMENTATION CONTROL
(5.1) Procedures for making revisions to technical documents
are:
(a) Clearly spelled out in written form with the line
of authority indicated and available to all involved
personnel. 5
(b) Recorded but not readily available to all personnel. 3
(c) Left to the discretion of present supervisors/mana-
gers. 1
(5.2) In revising technical documents, the revisions are:
(a) Clearly spelled out in written form and distributed to all parties affected, on a controlled basis which assures that the change will be implemented and permanent. 5
(b) Communicated through memoranda to key people who
are responsible for effecting the change through
whatever method they choose. 3
(c) Communicated verbally to operating personnel who
then depend on experience to maintain continuity
of the change. 1
(5.3) Changes to technical documents pertaining to opera-
tional activities are:
(a) Analyzed to make sure that any harmful side effects are known and controlled prior to revision effectivity. 5
(b) Installed on a trial or gradual basis, monitoring
the product to see if the revision has a net bene-
ficial effect. 3
(c) Installed immediately with action for correcting side
effects taken if they show up in the final results. 1
(5.4) Revisions to technical documents are:
(a) Recorded as to date, serial number, etc., when the revision becomes effective. 5
(b) Recorded as to the date the revision was made on
written specifications. 3
(c) Not recorded with any degree of precision. 1
(5.5) Procedures for making revisions to computer software
programs are:
(a) Clearly spelled out in written form with the line
of authority indicated. 5
(b) Not recorded but changes must be approved by the
present supervisor/manager. 3
(c) Not recorded and left to the discretion of the
programmer. 1
(5.6) In revising software program documentation, the re-
visions are:
(a) Clearly spelled out in written form, with reasons
for the change and the authority for making the
change distributed to all parties affected by the
change. 5
(b) Incorporated by the programmer and communicated
through memoranda to key people. 3
(c) Incorporated by the programmer at his will. 1
(5.7) Changes to software program documentation are:
(a) Analyzed to make sure that any harmful side effects are known and controlled prior to revision effectivity. 5
(b) Incorporated on a trial basis, monitoring the results to see if the revision has a net beneficial effect. 3
(c) Incorporated immediately with action for detecting and correcting side effects taken as necessary. 1
(5.8) Revisions to software program documentation are:
(a) Recorded as to date, program name or number, etc.,
when the revision becomes effective. 5
(b) Recorded as to the date the revision was made. 3
(c) Not recorded with any degree of precision. 1
A.6 PREVENTIVE MAINTENANCE
(6.1) Preventive maintenance procedures are:
(a) Clearly defined and written for all measurement
systems and support equipment. 5
(b) Clearly defined and written for most of the measure-
ment systems and support equipment. 3
(c) Defined and written for only a small fraction of the
total number of systems. 1
(6.2) Preventive maintenance activities are documented:
(a) On standard forms in station log books. 5
(b) Operator/analyst summary in log book. 3
(c) As operator/analyst notes. 1
(6.3) Preventive maintenance procedures as written appear adequate to insure proper equipment operation for:
(a) All measurement systems and support equipment. 5
(b) Most of the measurement systems and support equipment. 3
(c) Less than half of the measurement systems and support equipment. 1
(6.4) A review of the preventive maintenance records indicates
that:
(a) Preventive maintenance procedures have been carried
out on schedule and completely documented. 5
(b) The procedures were carried out on schedule but not
completely documented. 3
(c) The procedures were not carried out on schedule all
the time and not always documented. 1
(6.5) Preventive maintenance records (histories) are:
(a) Utilized in revising maintenance schedules, de-
veloping an optimum parts/reagents inventory and
development of scheduled replacements to minimize
wear-out failures. 5
(b) Utilized when specific questions arise and for
estimating future work loads. 3
(c) Utilized only when unusual problems occur. 1
A.7 DATA VALIDATION PROCEDURES
(7.1) Data validation procedures are:
(a) Clearly defined in writing for all measurement
systems. 5
(b) Defined in writing for some measurement systems,
some dependent on experience, memory, and/or
verbal communication. 3
(c) Only defined by experience and verbal communication. 1
(7.2) Data validation procedures are:
(a) A coordinated combination of computerized and
manual checks applied at different levels in the
measurement process. 5
(b) Applied with a degree of completeness at no more
than two levels of the measurement process. 3
(c) Applied at only one level of the measurement pro-
cess. 1
(7.3) Data validation criteria are documented and include:
(a) Limits on: (1) operational parameters such as flow rates; (2) calibration data; (3) special checks unique to each measurement, e.g., successive values/averages; (4) statistical tests, e.g., outliers; (5) manual checks such as hand calculations. 5
(b) Limits on the above type checks for most of the measurement systems. 3
(c) Limits on some of the above type checks for only
the high-priority measurements. 1
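A hedged sketch of the layered checks listed in item (7.3); the limit values and record fields below are hypothetical, chosen only to show the pattern of combining operational, calibration, and successive-value checks.

    # Return validation flags for one measurement record (hypothetical limits).
    def validate(record, flow_limits=(0.8, 1.2), max_step=25.0, history=()):
        flags = []
        lo, hi = flow_limits
        if not lo <= record["flow_rate"] <= hi:        # (1) operational limits
            flags.append("flow rate out of limits")
        if abs(record["cal_drift_pct"]) > 5.0:         # (2) calibration data
            flags.append("calibration drift excessive")
        if history and abs(record["value"] - history[-1]) > max_step:
            flags.append("successive-value jump")      # (3) special checks
        return flags

    print(validate({"flow_rate": 1.5, "cal_drift_pct": 1.0, "value": 52.0},
                   history=[50.0]))  # ['flow rate out of limits']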
(7.4) Acceptable limits as set are reasonable and adequate
to insure the detection of invalid data with a high
probability for:
(a) All measurement systems. 5
(b) At least 3/4 of the measurement systems. 3
(c) No more than 1/2 of the measurement systems. 1
(7.5) Data validation activities are:
(a) Recorded on standard forms at all levels of the
measurement process. 5
(b) Recorded in the operator's/analyst's log book. 3
(c) Not recorded in any prescribed manner. 1
(7.6) Examination of data validation records indicates that:
(a) Data validation activities have been carried out
as specified and completely documented. 5
(b) Data validation activities appear to have been
performed but not completely documented. 3
(c) Data validation activities, if performed, are not
formally documented. 1
(7.7) Data validation summaries are:
(a) Prepared at each level or critical point in the
measurement process and forwarded to the next level
with the applicable block of data. 5
(b) Prepared by and retained at each level. 3
(c) Not prepared at each level nor communicated between
levels. 1
(7.8) Procedures for deleting invalidated data are:
(a) Clearly defined in writing for all levels of the meas-
urement process, and invalid data are automatically
deleted when one of the computerized validation cri-
teria is exceeded. 5
(b) Programmed for automatic deletion when computerized
validation criteria are exceeded but procedures not
defined when manual checks detect invalid data. 3
(c) Not defined for all levels of the measurement pro-
cess. 1
(7.9) Quality audits (i.e., on-site system reviews and/or quantitative performance audits) independent of the normal operations are:
(a) Performed on a random but regular basis to ensure
and quantify data quality. 5
(b) Performed whenever a suspicion arises that there
are areas of ineffective performance. 3
(c) Never performed. 1
A.8 PROCUREMENT AND INVENTORY PROCEDURES
(8.1) Purchasing guidelines are established and documented
for:
(a) All equipment and reagents having an effect on data
quality. 5
(b) Major items of equipment and critical reagents. 3
(c) A very few items of equipment and reagents. 1
(8.2) Performance specifications are:
(a) Documented for all items of equipment which have
an effect on data quality. 5
(b) Documented for the most critical items only. 3
(c) Taken from the presently used items of equipment. 1
(8.3) Reagents and chemicals (critical items) are:
(a) Procured from suppliers who must submit samples
for test and approval prior to initial shipment. 5
(b) Procured from suppliers who certify they can meet
all applicable specifications. 3
(c) Procured from suppliers on the basis of price and
delivery only. 1
(8.4) Acceptance testing for incoming equipment is:
(a) An established and documented inspection procedure
to determine if procurements meet the quality assurance
and acceptance requirements. Results are document-
ed. 5
(b) A series of undocumented performance tests performed
by the operator before using the equipment. 3
(c) The receiving document is signed by the responsible
individual indicating either acceptance or rejection. 1
(8.5) Reagents and chemicals are:
(a) Checked 100 percent against specifications and quantity, with certification where required, and accepted only if they conform to all specifications. 5
(b) Spot-checked for proper quantity and for shipping
damage. 3
(c) Released to analyst by the receiving clerk without
being checked as above. 1
(8.6) Information on discrepant purchased materials is:
(a) Transmitted to the supplier with a request for
corrective action. 5
(b) Filed for future use. 3
(c) Not maintained. 1
(8.7) Discrepant purchased materials are:
(a) Submitted to a review by Quality Control and
Chief Chemist for disposition. 5
(b) Submitted to Service Section for determination
on acceptability. 3
(c) Used because of scheduling requirements. 1
(8.8) Inventories are maintained on:
(a) First-in, first-out basis. 5
(b) Random selection in stock room. 3
(c) Last-in, first-out basis. 1
(8.9) Receiving of materials is:
(a) Documented in a receiving record log, giving a
description of the material, the date of receipt,
results of acceptance test, and the signature
of the responsible individual. 5
(b) Documented in a receiving record log with material
title, receipt date, and initials of the individual
logging the material in. 3
(c) Documented by filing a signed copy of the requisi-
tion. 1
(8.10) Inventories are:
(a) Identified as to type, age, and acceptance status. 5
(b) Identified as to material only. 3
(c) Not identified in writing. 1
(8.11) Reagents and chemicals which have limited shelf life are:
(a) Identified as to shelf life expiration date and systematically issued from stock only if they are still within that date. 5
(b) Issued on a first-in, first-out basis, expecting
that there is enough safety factor so that the
expiration date is rarely exceeded. 3
(c) Issued at random from stock. 1
A.9 PERSONNEL TRAINING PROCEDURES
SCORE
(9.1) Training of new employees is accomplished by:
(a) A programmed system of training where elements of
training, including quality standards, are included
in a training checklist. The employee's work is
immediately rechecked by supervisors for errors or
defects and the information is fed back instanta-
neously for corrective action. 5
(b) On-the-job training by the supervisor who gives
an overview of quality standards. Details of
quality standards are learned as normal results
are fed back to the chemist. 3
(c) On-the-job learning with training on the rudi-
ments of the job by senior coworkers. 1
(9.2) When key personnel changes occur:
(a) Specialized knowledge and skills are retained in
the form of documented methods and descriptions. 5
(b) Replacement people can acquire the knowledge of
their predecessors from coworkers, supervisors,
and detailed study of the specifications and
memoranda. 3
(c) Knowledge is lost and must be regained through long
experience or trial-and-error. 1
(9.3) The people who have an impact on quality, e.g., cali-
bration personnel, maintenance personnel, bench chemists,
supervisors, etc., are:
(a) Trained in the reasons for and the benefits of
standards of quality and the methods by which
high quality can be achieved. 5
(b) Told about quality only when their work falls below
acceptable levels. 3
(c) Reprimanded when quality deficiencies are directly traceable to their work. 1
(9.4) The employee's history of training accomplishments
is maintained through:
(a) A written record maintained and periodically
reviewed by the supervisor. 5
(b) A written record maintained by the employee. 3
(c) The memory of the supervisor/employee. 1
(9.5) Employee proficiency is evaluated on a continuing
basis by:
(a) Periodic testing in some planned manner with the
results of such tests recorded. 5
(b) Testing when felt necessary by the supervisor. 3
(c) Observation of performance by the supervisor. 1
(9.6) Results of employee proficiency tests are:
(a) Used by management to establish the need for and
type of special training. 5
(b) Used by the employee for self-evaluation of needs. 3
(c) Used mostly during salary reviews. 1
A.10 FEEDBACK AND CORRECTIVE ACTION
(10.1) A feedback and corrective action mechanism to assure
that problems are reported to those who can correct them
and that a closed loop mechanism is established to assure
that appropriate corrective actions have been taken is:
(a) Clearly defined in writing with individuals assigned
specific areas of responsibility. 5
(b) Written in general terms with no assignment of
responsibilities. 3
(c) Not formalized but left to the present supervisors/
managers. 1
(10.2) Feedback and corrective action activities are:
(a) Documented on standard forms. 5
(b) Documented in the station log book. 3
(c) Documented in the operator's/analyst's notebook. 1
(10.3) A review of corrective action records indicates that:
(a) Corrective actions were systematic, timely, and
fully documented. 5
(b) Corrective actions were not always systematic,
timely, or fully documented. 3
(c) A closed loop mechanism did not exist. 1
(10.4) Periodic summary reports on the status of corrective
action are distributed by the responsible individual to:
(a) All levels of management. 5
(b) One level of management only. 3
(c) The group generating the report only. 1
(10.5) The reports include:
(a) A listing of major problems for the reporting
period; names of persons responsible for correc-
tive actions; criticality of problems; due dates;
present status; trend of quality performance (i.e.,
response time, etc.); listing of items still open
from previous reports. 5
(b) Most of the above items. 3
(c) Present status of problems and corrective actions. 1
A.11 CALIBRATION PROCEDURES
SCORE
(11.1) Calibration procedures are:
(a) Clearly defined and written out in step-by-step
fashion for each measurement system and support
device. 5
(b) Defined and summarized for each system and device. 3
(c) Defined but operational procedures developed by
the individual. 1
(11.2) Calibration procedures as written are:
(a) Judged to be technically sound and consistent with
data quality requirements. 5
(b) Technically sound but lacking in detail. 3
(c) Technically questionable and lacking in detail. 1
(11.3) Calibration standards are:
(a) Specified for all systems and measurement devices
with written procedures for assuring, on a con-
tinuing basis, traceability to primary standards. 5
(b) Specified for all major systems with written
procedures for assuring traceability to pri-
mary standards. 3
(c) Specified for all major systems but no procedures
for assuring traceability to primary standards. 1
(11.4) Calibration standards and traceability procedures as
specified and written are:
(a) Judged to be technically sound and consistent
with data quality requirements. 5
(b) Standards are satisfactory but traceability is
not verified frequently enough. 3
(c) Standards are questionable. 1
(11.5) Frequency of calibration is:
(a) Established and documented for each measurement
system and support measurement device. 5
(b) Established and documented for each major meas-
urement system. 3
(c) Established and documented for only certain
measurement systems. 1
(11.6) A review of calibration data indicates that the
frequency of calibration as implemented:
(a) Is adequate and consistent with data quality
requirements. 5
(b) Results in limits being exceeded a small frac-
tion of the time. 3
(c) Results in limits being exceeded frequently. 1
(11.7) A review of calibration history indicates that:
(a) Calibration schedules are adhered to and results
fully documented. 5
(b) Schedules are adhered to most of the time. 3
(c) Schedules are frequently not adhered to. 1
(11.8) A review of calibration history and data validation
records indicates that:
(a) Data are always invalidated and deleted when
calibration criteria are exceeded. 5
(b) Data are not always invalidated and/or deleted
when criteria are exceeded. 3
(c) Data are frequently not invalidated and/or deleted
when criteria are exceeded. 1
(11.9) Acceptability requirements for calibration results
are:
(a) Defined for each system and/or device requiring
calibration including elapsed time since the
last calibration as well as maximum allowable
change from the previous calibration. 5
(b) Defined for all major measurement systems. 3
(c) Defined for some major measurement systems only. 1
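A minimal sketch of the acceptability test described in item (11.9); the elapsed-time and allowable-change limits are hypothetical.

    # A calibration result is acceptable only if recent enough and not
    # shifted too far from the previous calibration (hypothetical limits).
    def calibration_acceptable(days_since_last, change_pct,
                               max_days=30, max_change_pct=2.0):
        return days_since_last <= max_days and abs(change_pct) <= max_change_pct

    print(calibration_acceptable(14, 0.8))  # True
    print(calibration_acceptable(45, 0.8))  # False: overdue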
(11.10) Acceptability requirements for calibration results as
written are:
(a) Adequate and consistent with data quality require-
ments. 5
(b) Adequate but others should be added. 3
(c) Inadequate to ensure data of acceptable quality. 1
(11.11) Calibration records (histories) are:
(a) Utilized in revising calibration schedules (i.e.,
frequency). 5
(b) Utilized when specific questions arise and re-
viewed periodically for trends, completeness,
etc. 3
(c) Utilized only when unusual problems occur. 1
A.12 FACILITIES/EQUIPMENT
(12.1) Facilities/Equipment are:
(a) Adequate to obtain acceptable results. 5
(b) Adequate to obtain acceptable results most of
the time. 3
(c) Additional facilities and space are needed. 1
(12.2) Facilities, equipment, and materials are:
(a) As specified in appropriate documentation and/or
standards. 5
(b) Generally as specified in appropriate standards. 3
(c) Frequently different from specifications. 1
(12.3) Housekeeping reflects an orderly, neat, and
effective attitude of attention to detail in:
(a) All of the facilities. 5
(b) Most of the facilities. 3
(c) Some of the facilities. 1
(12.4) Maintenance Manuals are:
(a) Complete and readily accessible to maintenance
personnel for all systems, components, and
devices. 5
(b) Complete and readily accessible to maintenance
personnel for all major systems, components, and
devices. 3
(c) Complete and accessible for only a few of the systems. 1
A.13 RELIABILITY
(13.1) Procedures for reliability data collection, processing,
and reporting are:
(a) Clearly defined and written for all system com-
ponents. 5
(b) Clearly defined and written for major components
of the system. 3
(c) Not defined. 1
(13.2) Reliability data are:
(a) Recorded on standard forms. 5
(b) Recorded as operator/analyst notes. 3
(c) Not recorded. 1
(13.3) Reliability data are:
(a) Utilized in revising maintenance and/or replace-
ment schedules. 5
(b) Utilized to determine optimum parts inventory. 3
(c) Not utilized in any organized fashion. 1
APPENDIX B STANDARD TECHNIQUES USED IN
QUANTITATIVE PERFORMANCE AUDITS
Table 1. Physical measurements

Property: Density
  Measurement methods: (a) Vibrating U-tube; (b) mass/flow meter; (c) bubble tube.
  Calibration method: Take sample, get weight and volume at process temperature, and calculate density.
  Audit techniques: Frequency - before start and at end of demonstration, monthly in between. Technique - use of appropriate laboratory weight and volume measures.

Property: Flow
  Measurement methods: (a) Orifice meter (manometer, differential pressure cell, mechanical gauges, or electrical cells); (b) pitot tube (manometer, mechanical gauges, electrical cells, or differential pressure cell); (c) venturi meter; (d) magnetic flow meter; (e) ultrasonic flow meter.
  Calibration methods: Primary method is to remove meter from process and calibrate on test stand. Secondary method is calibration of elements following sensor.
  Audit techniques: Frequency - before start and at end of demonstration, monthly in between. Technique - remove sensor element and inspect for corrosion or fouling; carry out manufacturer recommended calibration procedure; for transducer and output, apply substitute signal and calibrate.

Property: Humidity
  Measurement methods: (a) Wet bulb/dry bulb thermometers; (b) dewpoint meters; (c) electronic humidity cells; (d) fluidic.
  Calibration method: Calculation of humidity from wet and dry bulb measurements and psychrometric relations.
  Audit techniques: Frequency - before start and at end of demonstration, weekly for wet/dry bulb, monthly for others. Technique - remove sensor and subject to air stream having wet/dry apparatus for comparison.

Property: Level
  Measurement methods: (a) Bubble tube; (b) float; (c) conductivity cell; (d) capacitance cell; (e) differential pressure cell; (f) ultrasonic; (g) sight glass.
  Calibration method: Measure level with sight glass or dip stick.
  Audit techniques: Frequency - before start and at end of demonstration, monthly in between. Technique - measure level at several points in range and compare to readout.

Property: Pressure, differential pressure
  Measurement methods: (a) Mechanical gauge; (b) manometer; (c) electrical pressure cell; (d) differential pressure cell.
  Calibration methods: Use of dead weight tester is primary standard. Secondary standards are (a) manometer with known fluid; (b) precision mechanical gauges; (c) standard electrical pressure cells.
  Audit techniques: Frequency - before start and at end of demonstration, monthly in between. Technique - pressure sensors to be provided with test taps and valves, also tap for secondary source for d/p or electrical cells; manometer is preferred for calibration in field where possible.

Property: Temperature
  Measurement methods: (a) Thermocouple; (b) resistance temperature detector; (c) thermistor; (d) filled bulb; (e) mercury thermometer.
  Calibration method: Comparison to reference point - (a) ice point H2O; (b) boiling point H2O; (c) standard thermometer; (d) electronic standard.
  Audit techniques: Frequency - before start and at end of demonstration, monthly in between. Technique - remove sensor and insert in reference temperature. For non-removable sensor, measure output of sensor, and insert substitute signal into instrument input and check calibration.
Table 2. Gas effluent streams

Material: Carbon monoxide
  Measurement method: Method 10 (continuous).
  Calibration method: Standard calibration gas.
  Audit technique: Provide SRM* for measurement.

Material: Nitrogen oxides
  Measurement methods: Method 7 (grab); continuous.
  Calibration methods: Method 7 - calibrate sampling train components and use control samples for analysis phase. Continuous - (1) use standard calibration gases, plus (2) compare results to Method 7.
  Audit techniques: Method 7 - (1) independent duplicate sampling and analysis; (2) review and observe operating procedures, check calibration of train components, and prepare blind samples for field team to measure. Continuous - (1) provide NO/NO2 calibration gas: NBS-SRM (analysis phase); (2) compare to Method 7 (total measurement method).

Material: Particulates
  Measurement methods: Method 5 (sampling train); optical (transmissometer).
  Calibration methods: Method 5 - calibrate components of sampling train: pitot tube, dry gas meter, orifice meter, temperature measurement devices, probe heater, filter holder, temperature system. Transmissometer - filters.
  Audit techniques: Method 5 - (1) audit of total method by independent, simultaneous measurement from sampling through analysis; (2) calibration check on sample train components, per cent isokinetic rate, and visual observation of operating procedures. Transmissometer - calibration check with independent set of NBS filters.

Material: Sulfur dioxide
  Measurement methods: Method 6 (batch); Method 12 (continuous).
  Calibration methods: Method 6 - calibrate sampling train components and use standard samples for analysis phase. Method 12 - (1) use standard calibration gases, plus (2) use calibrated absorbance filter furnished by instrument manufacturer.
  Audit techniques: Method 6 - independent duplicate sampling and analysis; review/observe operating procedures, check calibration of train components, and analyze split samples and/or prepare blind samples for field team to measure. Method 12 - (1) compare to Method 6; (2) provide SRM for measurements.

* Standard Reference Material, from the National Bureau of Standards.
Table 3. Liquid streams, suspended solids

(Columns: Material; Measurement method; Calibration method; Audit techniques.)

Material: Liquid stream samples - per cent solids, ionic species
Material: Effluent solids; e.g., per cent water, CaO,
APPENDIX C DEFINITIONS AND STATISTICAL TECHNIQUES
USEFUL IN QUALITY ASSURANCE PROGRAMS
I. CENTRAL TENDENCY AND DISPERSION
A. The Arithmetic Mean.
The sum of all values in a measurement set, divided by the number
of values summed. Commonly called the "average." Often denoted
symbolically by a bar over the variable symbol, as "X̄".

    X̄ = (1/n) Σ Xᵢ ,  i = 1, ..., n
B. Range.
The difference between the maximum and minimum values of a set of
values.
    R = Xmax - Xmin
A rough indication of variability, particularly when the set of values
is small (<10).
C. Standard Deviation.
An indication of the dispersion of a set of numbers about the
mean value. Normal (and other) distributions are expressed as a func-
tion of the standard deviation.
For a given set of values, the defining equation is:
    s = [ Σ(Xᵢ - X̄)² / (n - 1) ]^(1/2)

For computational purposes, it is convenient to use:

    s = [ (ΣXᵢ² - nX̄²) / (n - 1) ]^(1/2)
D. Relative Standard Deviation, or Coefficient of Variation.
The dispersion of a set of values, expressed as a percentage of
the mean.
    CV = (s/X̄) x 100
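As a worked illustration of the quantities defined in paragraphs A through D, the short Python sketch below computes each one for a small hypothetical measurement set; the data values are invented for the example.

    # Mean, range, standard deviation, and coefficient of variation for a
    # hypothetical measurement set (values are illustrative only).
    import math

    x = [10.1, 9.8, 10.3, 10.0, 9.9]
    n = len(x)

    mean = sum(x) / n                          # arithmetic mean, X-bar
    r = max(x) - min(x)                        # range, R = Xmax - Xmin
    s = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))
    cv = (s / mean) * 100                      # coefficient of variation, percent

    print(round(mean, 2), round(r, 2), round(s, 4), round(cv, 2))
    # prints: 10.02 0.5 0.1924 1.92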
II. MEASURES OF VARIABILITY
A. Accuracy.
The difference (either on an absolute or percentage basis) between
a measured value and an assumed "true" value. The larger the difference,
the lower the accuracy.
    B = X̄ - T, or

    %B = [(X̄ - T) / T] x 100
(see "Bias")
B. Bias.
A nonrandom measurement error; a consistent difference either
between sets of results or between a measured value and a "true"
value. If the latter, the bias or percent bias is measured by the
relationships in A above. (See III. SIGNIFICANCE TESTS, A. t-test)
C. Precision.
A measure of agreement among individual measurements of a vari-
able, under identical or similar conditions. Precision may be ex-
pressed in several ways, and care must be exercised in the defini-
tion and use of precision measures.
One set of such measures follows:*
1. Within-laboratory: The within-laboratory standard deviation,
s, measures the dispersion in replicate single determinations
made by one laboratory team (same field operators, laboratory
analyst, and equipment) sampling the same true concentration.
2. Between-laboratory: The between-laboratory standard devia-
tion, s_b, measures the total variability in a concentration
determination due to determinations by different laboratories
sampling the same true stack concentration. The between-
laboratory variance, s_b², may be expressed as

    s_b² = s² + s_L²

and consists of a within-laboratory variance, s², plus a
laboratory bias variance, s_L².
3. Laboratory bias: The laboratory bias standard deviation,

    s_L = (s_b² - s²)^(1/2) ,

is that portion of the total variability that can be ascribed
to differences in the field operators, analysts, and instrumentation,
and due to different manners of performance of procedural details
left unspecified in a technique. This term measures that part of
the total variability in a determination which results from the use
of a technique by different laboratories, as well as from
modifications in usage by a single laboratory over a period of
time. The laboratory bias standard deviation is estimated from
the within- and between-laboratory estimates previously obtained.
A corresponding set of coefficients of variation would be CV,
CV_b, and CV_L. These are convenient to use if the precision is
proportional to the mean value of the variable.

* These definitions are taken from EPA collaborative test result
publications, and are applied to the various federal reference
sampling and analysis techniques. Since these techniques are
frequently used in evaluating emissions from IERL projects, they
are particularly appropriate for this guidelines document.
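To make the decomposition concrete, the sketch below computes the laboratory bias standard deviation from assumed within- and between-laboratory values (here, the Method 6 values of 3.9 and 5.5 reported in Appendix D); the arithmetic follows the relation s_L = (s_b² - s²)^(1/2) given above.

    # Laboratory bias component from within- and between-laboratory
    # standard deviations; inputs are the Method 6 values from Appendix D.
    import math

    s = 3.9      # within-laboratory standard deviation
    s_b = 5.5    # between-laboratory standard deviation

    s_L = math.sqrt(s_b ** 2 - s ** 2)
    print(round(s_L, 2))    # prints: 3.88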
III. SIGNIFICANCE TESTS
A. t-test.
If one has an assumed "true" value, however obtained, the
existence of a significant bias in other measurements of this value
can be defined by a t-test:
    t = (d̄ - 0) / (s_d / √n)
where t = a parameter, the magnitude of which is referenced to
tabulated values. A t-value which exceeds the tabulated
value for given specifications of probability and number
of degrees of freedom indicates the existence (within the
definition of probability specified) of a significant
bias. The more stringent the probability requirement,
i.e., the smaller the probability chosen, the larger the
tabulated t-value.
d̄ = the average difference between the true value and the
measured values; the average bias.
s_d = the standard deviation of the differences, dᵢ.
n = the number of measurements made.
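A minimal numerical sketch of this test follows; the differences dᵢ are hypothetical, and the critical value 2.776 is the tabulated two-sided 5% t-value for 4 degrees of freedom.

    # t-test for significant bias on a set of hypothetical differences
    # d_i = (measured value) - (true value).
    import math

    d = [0.8, 1.1, 0.5, 1.4, 0.7]
    n = len(d)
    d_bar = sum(d) / n                         # average bias
    s_d = math.sqrt(sum((di - d_bar) ** 2 for di in d) / (n - 1))

    t = d_bar / (s_d / math.sqrt(n))
    print(round(t, 2), abs(t) > 2.776)         # prints: 5.69 True (significant bias)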
B. Chi-square test.
If one has a reasonable estimate of the expected standard devia-
tion of a set of measurements, the existence of a defined "excess
variability" can be tested as follows:
    χ²/f = s² / σ²{x}
where χ²/f = a random variable with tabulated values (f = n - 1 =
number of degrees of freedom), and
σ²{x} = the expected variance of the measurements of x.
If χ²/f is larger than the chosen tabulated value (with specified
probability), it is concluded that the measurements are exhibiting
excess variability. The chi-square test is a measure of the validity
of a series of measurements based on an "expected" variability. The
test is worthwhile only when a measurement technique has been
tested thoroughly, so that a realistic expectation can be estimated.
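The sketch below illustrates the test with assumed numbers; the observed and expected standard deviations are invented, and the tabulated value used (16.92, the upper 5% chi-square point for 9 degrees of freedom) is divided by f for comparison, as in the text.

    # Chi-square test for excess variability with hypothetical inputs.
    n = 10
    s = 4.2        # observed standard deviation of the n measurements (assumed)
    sigma = 3.0    # expected standard deviation (assumed)

    f = n - 1
    chi2_over_f = s ** 2 / sigma ** 2          # chi-square divided by f
    critical = 16.92 / f                       # tabulated 5% point for f = 9, per f

    print(round(chi2_over_f, 2), chi2_over_f > critical)
    # prints: 1.96 True (excess variability at the 5% level)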
IV. CONFIDENCE LIMITS
Confidence limits take two forms. One form defines a numerical range
within which one has an (arbitrarily chosen) probability of finding the true
mean value of the measured variable. If the measurement variability is ex-
pressed as a standard deviation, the confidence limits as defined above can
be calculated as follows:

    CL = X̄ ± ts/√n
where all symbols have been previously defined. Note that as the number of
measurements, n, increases, the magnitude of CL decreases. Also, the higher
the probability of containing the true mean within CL, the larger the value
of t and therefore the larger the size of CL.
The second form of confidence limit defines an interval within which the
next individual measurement can be expected to fall with a given probability.
The calculation of this limit, sometimes called a tolerance limit, is by
the following relationship:
    TL = X̄ ± ts
While n, the number of measurements, does not explicitly appear in the
equation for TL, it does determine (along with the selected probability)
the value of t; i.e., as n increases, t decreases.
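For a hypothetical data set, the sketch below evaluates both limits; the t-value 2.776 is the tabulated two-sided 95% value for n - 1 = 4 degrees of freedom, and the mean and standard deviation are assumed.

    # Confidence limits on the mean (CL) and tolerance limits on the next
    # single measurement (TL), using the relations given above.
    import math

    x_bar, s, n = 10.02, 0.19, 5    # assumed mean, standard deviation, count
    t = 2.776                       # tabulated t, 95% two-sided, 4 d.f.

    half_cl = t * s / math.sqrt(n)
    half_tl = t * s

    print(round(x_bar - half_cl, 2), round(x_bar + half_cl, 2))   # 9.78 10.26
    print(round(x_bar - half_tl, 2), round(x_bar + half_tl, 2))   # 9.49 10.55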
V. TESTING FOR OUTLIERS
An outlier is an extreme value, either high or low, which has question-
able validity as a member of the measurement set with which it is associ-
ated.
Detection of outliers may be made on one of the following bases:
a) A known experimental aberration, such as an instrument failure or
a technique inconsistency.
b) A statistical test for significance, such as the Dixon ratio test.
This test is described below.
The Dixon criterion is based entirely on ratios of differences between
observations; it is useful where it is desirable to avoid calculation
of s or where quick judgment is called for. For the Dixon test, the
sample criterion or statistic changes with sample size. Critical values
of the statistic for various levels of significance are tabulated.
Table 1 below presents selected significance (probability) levels
for criteria over the n range 3 to 25. Note that the measurement values
are first arranged in order of ascending magnitude; i.e., xₙ is the
largest value.
Table 1. DIXON CRITERIA FOR TESTING OF EXTREME OBSERVATION
(SINGLE SAMPLE)a

                                                     Significance level
   n   Criterion                                      10%     5%      1%

   3   r10 = (x2 - x1)/(xn - x1) if the              0.886   0.941   0.988
   4        smallest value is suspected;             0.679   0.765   0.889
   5       = (xn - x(n-1))/(xn - x1) if the          0.557   0.642   0.780
   6        largest value is suspected.              0.482   0.560   0.698
   7                                                 0.434   0.507   0.637

   8   r11 = (x2 - x1)/(x(n-1) - x1) if the          0.479   0.554   0.683
   9        smallest value is suspected;             0.441   0.512   0.635
  10       = (xn - x(n-1))/(xn - x2) if the          0.409   0.477   0.597
            largest value is suspected.

  11   r21 = (x3 - x1)/(x(n-1) - x1) if the          0.517   0.576   0.679
  12        smallest value is suspected;             0.490   0.546   0.642
  13       = (xn - x(n-2))/(xn - x2) if the          0.467   0.521   0.615
            largest value is suspected.

  14   r22 = (x3 - x1)/(x(n-2) - x1) if the          0.492   0.546   0.641
  15        smallest value is suspected;             0.472   0.525   0.616
  16       = (xn - x(n-2))/(xn - x3) if the          0.454   0.507   0.595
  17        largest value is suspected.              0.438   0.490   0.577
  18                                                 0.424   0.475   0.561
  19                                                 0.412   0.462   0.547
  20                                                 0.401   0.450   0.535
  21                                                 0.391   0.440   0.524
  22                                                 0.382   0.430   0.514
  23                                                 0.374   0.421   0.505
  24                                                 0.367   0.413   0.497
  25                                                 0.360   0.406   0.489

a Taken from "Processing Data for Outliers," by W. J. Dixon, Biometrics,
Vol. 9, No. 1, 1953.
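A minimal sketch of the Dixon test, using the n = 3-7 criterion r10 with the largest value suspected, is given below; the data are hypothetical, and 0.642 is the tabulated 5% critical value for n = 5 from Table 1.

    # Dixon ratio test for a suspected high outlier, n = 5 (criterion r10).
    x = sorted([10.1, 9.9, 10.0, 10.2, 11.4])   # hypothetical values, ascending

    r10 = (x[-1] - x[-2]) / (x[-1] - x[0])      # (xn - x(n-1)) / (xn - x1)
    print(round(r10, 3), r10 > 0.642)           # prints: 0.8 True (11.4 is an outlier)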
APPENDIX D SOME STANDARD AMBIENT AIR AND SOURCE
SAMPLING TECHNIQUES
SOURCE SAMPLING TECHNIQUESa

Pollutant: SO2 (EPA Method 6)
  Bias (absolute, or percent of mean concentration): 0
  Precision (absolute, or coefficient of variation): within laboratory, 3.9; between laboratory, 5.5
  Comments: Major error source is difficulty of obtaining reproducible titration endpoints. Minimum detectable limit is 3 ppm.
  Reference: EPA-650/4-74-005e

Pollutant: SO2 and SO3/H2SO4 (EPA Method 8)
  Bias: -2% (analysis only); -2% (analysis only)
  Precision: within laboratory, 0.1 g/m3; 60%. Between laboratory, 0.11 g/m3; 65%.
  Comments: Same analysis technique as Method 6 above.
  Reference: EPA-650/4-74-005g

Pollutant: NOx (EPA Method 7)
  Bias: 0
  Precision: within laboratory, 7%; between laboratory, 10%
  Comments: Grab sample; largest error source is failure to recalibrate spectrophotometer.
  Reference: EPA-650/4-74-005f

Pollutant: CO (EPA Method 10)
  Bias: +7 ppm
  Precision: within laboratory, 13 ppm; between laboratory, 25 ppm
  Comments: Analyzer drift and CO2 interference are largest problems. Minimum detectable limit is 20 ppm.
  Reference: EPA-650/4-74-005h

Pollutant: Particulates (EPA Method 5)
  Bias: No information
  Precision: within laboratory, 10-30%; between laboratory, 20-40%
  Comments: Numerous small error sources associated with stack sampling.
  Reference: EPA-650/4-74-005d

Pollutant: Visible emissions (EPA Method 9)
  Bias: +1.4% opacity
  Precision: within laboratory, 2% opacity; between laboratory, 2.5%
  Comments: Good results depend to a great extent on the effective training of observers.
  Reference: EPA-650/4-74-005i

Pollutant: Be (EPA Method 104)
  Bias: -20%, average
  Precision: within laboratory, 44%; between laboratory, 58%
  Reference: EPA-650/4-74-005k

a This table is a summary of information contained in the cited references, all of which are Quality Assurance Guidelines Manuals.
AMBIENT AIR TECHNIQUESa

Pollutant: SO2 (method number 6)
  Bias (absolute, or percent of mean concentration): 0
  Precision (absolute, or coefficient of variation): within laboratory, 5-13 µg/m3 from x = 0-1000 µg/m3 b; between laboratory, 10-25 µg/m3 from x = 0-1000 µg/m3
  Comments: Lower limit of detection is 25 µg/m3. Flow rate changes, sampling train leakage are primary error sources.
  Reference: EPA-R4-73-028d

Pollutant: NO, NO2, NOx (chemiluminescent)
  Bias: 0
  Precision: within laboratory, 7-8% at 100 µg/m3 (0.05 ppm)
  Comments: Lower limit of detection is 10 µg/m3 (0.005 ppm). Errors are associated with calibration and instrument drift (from zero and span settings).
  Reference: d

Pollutant: Photochemical oxidants (chemiluminescent)
  Bias: -35 to -15% from 0.05 to 0.50 ppm
  Precision: within laboratory, 0.0033 + 0.0255x (0-0.5 ppm); between laboratory, 0.0008 + 0.0355x (0-0.15 ppm) and -0.0051 + 0.0690x (0.15-0.5 ppm)
  Comments: Lower detection limit is 0.0065 ppm.
  Reference: EPA-R4-73-028c

Pollutant: CO (NDIR)
  Bias: +2.5
  Precision: within laboratory, 0.6 mg/m3; between laboratory, 0.8-1.6 mg/m3 (nonlinear variation) over 0-60 mg/m3
  Comments: Lower detection limit is 0.3 mg/m3. Interference of water vapor is significant.
  Reference: EPA-R4-73-028a

Pollutant: Particulates (high volume)
  Bias: No information
  Precision: within laboratory, 3%; between laboratory, 3.7%
  Comments: Minimum detectable limit is 3 mg. Shorter sampling periods give less precise results, biased high.
  Reference: EPA-R4-73-028b

Pollutant: NO2 (arsenite)
  Bias: -3% (50-300 µg/m3)
  Precision: within laboratory, 8 µg/m3 (50-300 µg/m3); between laboratory, 11 µg/m3 (50-300 µg/m3)
  Comments: A tentative method. Lower detectable limit is 9 µg/m3.
  Reference: EPA-R4-73-2800

a This table is a summary of information contained in the cited references, all of which are Quality Assurance Guideline Manuals published by EPA. Collaborative test results are cited, if available, in the manuals.
b x = pollutant concentration.
c EPA-650/4-75/016.
d Guidelines for Development of a Quality Assurance Program for The Continuous Measurement of Nitrogen Dioxide in the Ambient Air (Chemiluminescent), Smith & Nelson, Research Triangle Institute, Research Triangle Park, N.C. 27709.
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
EPA-600/2-76-159
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
IERL-RTP Data Quality Manual
5. REPORT DATE
June 1976
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
Franklin Smith and James Buchanan
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Research Triangle Institute
P.O. Box 12194
Research Triangle Park, North Carolina 27709
10. PROGRAM ELEMENT NO.
EHB-524
11. CONTRACT/GRANT NO.
68-02-1398, Task 35
12. SPONSORING AGENCY NAME AND ADDRESS
EPA, Office of Research and Development
Industrial Environmental Research Laboratory
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED
Task Final; 3-5/76
14. SPONSORING AGENCY CODE
EPA-ORD
15. SUPPLEMENTARY NOTES Project officer for this manual is L. D. Johnson, Mail Drop 62,
(919) 549-8411, Ext 2557.
16. ABSTRACT
The manual gives guidelines for the establishment and maintenance of an
integrated data quality program for EPA's Industrial Environmental Research Labora-
tory - Research Triangle Park (IERL-RTP). Administrative systems dedicated to
the data quality program are delineated. These systems include quality policies and
objectives, organizational structure and key quality personnel, and a schedule for
implementation. Components of both quality control programs and quality assurance
programs are given. IERL-RTP projects are divided into six categories. Projects
within a given category have common characteristics (e.g., size, duration, objec-
tives, and data quality requirements), making them amenable to the same general
set of quality control and quality assurance practices and procedures. Quality control
and quality assurance procedures applicable to each of the six categories are given
for each phase of the project's life cycle. These phases include: Request for
Proposal preparation, proposal evaluation, work plan evaluation, project implemen-
tation and maintenance, and reports.
17.
KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS
b. IDENTIFIERS/OPEN ENDED TERMS
c. COSATI Field/Group
Pollution
Data
Quality
Quality Control
Quality Assurance
Management Systems
Evaluation
Proposals
Reporting
Pollution Control
Stationary Sources
IERL-RTP
Work Plans
13B
13H,14D
05A
18. DISTRIBUTION STATEMENT
Unlimited
19. SECURITY CLASS (This Report)
Unclassified
21. NO. OF PAGES
95
20. SECURITY CLASS (This page)
Unclassified
22. PRICE
EPA Form 2220-1 (9-73)