EPA-600/2-76-083
March 1976
Environmental Protection Technology Series
DEVELOPMENT AND TRIAL FIELD
APPLICATION OF A QUALITY ASSURANCE
PROGRAM FOR DEMONSTRATION PROJECTS
Industrial Environmental Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
-------
RESEARCH REPORTING SERIES
Research reports of the Office of Research and Development, U.S. Environmental
Protection Agency, have been grouped into five series. These five broad
categories were established to facilitate further development and application of
environmental technology. Elimination of traditional grouping was consciously
planned to foster technology transfer and a maximum interface in related fields.
The five series are:
1. Environmental Health Effects Research
2. Environmental Protection Technology
3. Ecological Research
4. Environmental Monitoring
5. Socioeconomic Environmental Studies
This report has been assigned to the ENVIRONMENTAL PROTECTION
TECHNOLOGY series. This series describes research performed to develop and
demonstrate instrumentation, equipment, and methodology to repair or prevent
environmental degradation from point and non-point sources of pollution. This
work provides the new or improved technology required for the control and
treatment of pollution sources to meet environmental quality standards.
EPA REVIEW NOTICE
This report has been reviewed by the U. S. Environmental
Protection Agency, and approved for publication. Approval
does not signify that the contents necessarily reflect the
views and policy of the Agency, nor does mention of trade
names or commercial products constitute endorsement or
recommendation for use.
This document is available to the public through the National Technical Informa-
tion Service, Springfield, Virginia 22161.
-------
EPA-600/2-76-083
March 1976
DEVELOPMENT AND TRIAL FIELD APPLICATION
OF A QUALITY ASSURANCE PROGRAM
FOR DEMONSTRATION PROJECTS
by
James Buchanan
Research Triangle Institute
P.O. Box 12194
Research Triangle Park, NC 27709
Contract No. 68-02-1398, Task 20
ROAP No. ABA-011
Program Element No. EHB-557
EPA Task Officer: L. D. Johnson
Industrial Environmental Research Laboratory
Office of Energy, Minerals, and Industry
Research Triangle Park, NC 27711
Prepared for
U.S. ENVIRONMENTAL PROTECTION AGENCY
Office of Research and Development
Washington, DC 20460
-------
ACKNOWLEDGMENTS
The work on this project was performed by the Systems and Measurements
Division of the Research Triangle Institute. Mr. Frank Smith, Supervisor,
Quality Assurance Section, served as the project leader. Dr. James Buchanan
of the Quality Assurance Section was responsible for the coordination of the
program. Institute staff members Dr. D. E. Wagoner and Mr. Larry Hackworth,
analytical chemists, Mr. Leon Bissette, an electrical engineer, and
Dr. Buchanan, a physical chemist, were major contributors to the program.
Project officer for the Environmental Protection Agency was Dr. L. D. Johnson
of the Process Measurements Branch of the Industrial Environmental Research
Laboratory. The Research Triangle Institute acknowledges the cooperation and
assistance of the project officer and Dr. R. Statnick of the Process Measure-
ments Branch. The Institute also appreciates the assistance and guidance
provided by Mr. John Williams, the EPA project officer for the Shawnee wet
limestone scrubber demonstration. Finally, gratitude is extended to
Mr. Joe Barkley and Mr. Ken Metcalf of TVA and to Mr. Dewey Burbank of Bechtel
Corporation for their cooperation at the Shawnee test site.
iii
-------
TABLE OF CONTENTS
SECTION
1.0 INTRODUCTION 1
2.0 MAJOR COMPONENTS OF A QUALITY CONTROL PROGRAM 3
2.1 QUALITY ASSURANCE ASPECTS OF THE RFP 3
2.2 EVALUATION OF QUALITY CONTROL IN THE PROPOSAL 3
2.3 EVALUATION OF QUALITY CONTROL IN THE WORK PLAN 4
2.4 MANAGEMENT COMMITMENT TO QUALITY CONTROL 4
2.5 QUALITY CONTROL IN THE ORGANIZATIONAL STRUCTURE 4
2.6 ASSESSMENT OF QUALITY CONTROL REQUIREMENTS 5
2.7 SPECIFIC AREAS OF CONCERN FOR DEMONSTRATION
PROJECT QUALITY CONTROL PROGRAMS 5
2.7.1 FACILITIES AND EQUIPMENT 6
2.7.2 CONFIGURATION CONTROL 6
2.7.3 PERSONNEL TRAINING 6
2.7.4 DOCUMENTATION CONTROL 7
2.7.5 CONTROL CHARTS 7
2.7.6 IN-PROCESS QUALITY CONTROL 7
2.7.7 PROCUREMENT AND INVENTORY PROCEDURES 9
2.7.8 PREVENTIVE MAINTENANCE 9
2.7.9 RELIABILITY 9
2.7.10 DATA VALIDATION 9
2.7.11 FEEDBACK AND CORRECTIVE ACTION 10
2.7.12 CALIBRATION PROCEDURES 10
iv
-------
TABLE OF CONTENTS (CONT.)
SECTION PAGE
3.0 GUIDELINES FOR DEMONSTRATION PROJECT QUALITY ASSURANCE
PROGRAMS 11
3.1 GENERAL STATEMENTS 11
3.2 THE ON-SITE QUALITATIVE SYSTEMS REVIEW 11
3.3 THE PERFORMANCE AUDIT 12
3.4 MATERIAL BALANCES 12
3.5 ASSESSMENT OF DATA QUALITY 12
3.6 ASSESSMENT AND MODIFICATION OF THE ONGOING QUALITY
ASSURANCE PROGRAM 14
4.0 A SHORT-TERM QUALITY ASSURANCE PROGRAM IMPLEMENTED AT
THE SHAWNEE SCRUBBER FACILITY 17
4.1 THE CONTROL LABORATORY 17
4.1.1 MEASUREMENT OF pH 17
4.1.2 SLURRY ANALYSIS 19
4.1.3 OVERALL LABORATORY EVALUATION 24
4.2 GAS STREAM SAMPLING 24
4.2.1 PARTICULATE MASS LOADING 24
4.2.1.1 PITOT TUBE COMPARISON 25
4.2.1.2 TEMPERATURE MEASUREMENT 25
4.2.1.3 MOISTURE MEASUREMENT 25
4.2.1.4 VOLUME MEASUREMENT 26
4.2.2 SULFUR DIOXIDE CONCENTRATION DETERMINATIONS 26
4.3 PROCESS INSTRUMENTATION 27
4.4 RECOMMENDATIONS 29
v
v
-------
TABLE OF CONTENTS (CONT.)
SECTION PAGE
5.0 EVALUATION OF THE SHORT-TERM QUALITY ASSURANCE PROGRAM
AT SHAWNEE 31
5.1 QUALITATIVE SYSTEMS REVIEW 31
5.2 QUANTITATIVE PERFORMANCE AUDIT 31
5.2.1 SCHEDULING 31
5.2.2 EQUIPMENT AND INSTRUMENTATION 32
5.2.3 PERSONNEL SELECTION 33
APPENDIX A QUALITY AUDIT CHECKLIST FOR DEMONSTRATION
PROJECTS 36
APPENDIX B STANDARD TECHNIQUES USED IN QUANTITATIVE
PERFORMANCE AUDITS 61
APPENDIX C COMPARISON OF ANALYSIS OF LIMESTONE SLURRY 65
-------
LIST OF TABLES
TABLE NO. PAGE
1 COMPARISONS OF pH 18
2 MEAN VALUES FOR SLURRY ANALYSES, BY LABORATORY 21
3 COMPARISON OF SHAWNEE RESULTS WITH MEAN VALUE OF
COOPERATING LABORATORIES 23
4 COMPARISON OF SO2 DETERMINATIONS 28
LIST OF FIGURES
FIGURE NO. PAGE
1 STANDARD QUALITY CONTROL CHART 8
vii
-------
1.0 INTRODUCTION
The major objective of this project was to develop a general quality
assurance (QA) program for EPA demonstration projects, using the wet limestone
scrubber facility at the Shawnee steam plant, Paducah, Kentucky, as an example
project. A second objective was to field test the QA program at the Shawnee
facility and carry out whatever modifications were necessary in light of that
field trial.
Two concurrent final reports deal separately with these objectives and
should be consulted for further treatment of areas of interest mentioned here.
These reports are:
1. Guidelines for Demonstration Project Quality Assurance Programs,
EPA-600/2-76-081.
2. A Quality Assurance Program for the Environmental Protection Agency
Wet Limestone Scrubber Demonstration Project, Shawnee Steam-Electric
Plant, Paducah, Kentucky, EPA-600/2-76-080.
Reference will occasionally be made to these reports. For conciseness, reports
1. and 2. will be referred to as the guidelines report and the Shawnee report,
respectively.
Organization of this report is as follows: section 2.0 discusses signifi-
cant areas for quality control (QC) from the RFP to the day-to-day QC program;
section 3.0 outlines a quality assurance program for demonstration projects;
section 4.0 treats the specific project studied, the wet limestone scrubber at
the Shawnee steam plant; and section 5.0 is an evaluation of the QA program
implemented at Shawnee. The major areas investigated at Shawnee were the
analytical (control) laboratory, the gas stream measurement systems, and the
process monitoring and control instrumentation.
To facilitate the reading of this report, two terms should be carefully
defined. These terms are quality control and quality assurance. Concise
definitions are given herewith.
Quality control: the overall system of activities the purpose of which
is to provide a quality of product or service that meets the needs of users.
Quality assurance: a system of activities the purpose of which is to
provide assurance that the overall quality control job is being done effectively.
-------
Recommendation of a quality control system as such does not fall within
the scope of this project. It is important, however, to be aware of the
elements of such a system if one is to act as a monitor of QC by means of QA
work. For this reason, the first major area addressed by this report is the
QC system itself.
-------
2.0 MAJOR COMPONENTS OF A QUALITY CONTROL PROGRAM
2.1 Quality Assurance Aspects of the RFP
The design of the RFP is predicated on stating as clearly as possible what
the objectives of the project are; e.g., to design, construct, and maintain a
given control system, systematically examining the interaction of appropriate
system parameters. The quality of the data obtained from the project will
depend upon numerous factors—instrumentation, personnel, sampling technique,
sampling size, statistical expertise. It is therefore critical that the RFP
be as explicit as possible* in delineating two things—what quality data are
expected, and how that quality is to be insured.
Since most RFP's are limited in length, it would usually be inappropriate
to include more than a brief (one- or two-paragraph) statement of QC require-
ments. Nevertheless, it is most important that the bid solicitation be as
explicit as possible concerning QC.
2.2 Evaluation of Quality Control in the Proposal
The proposal should contain a statement of the precise position the
bidder's company takes regarding quality control programs. This should include
past projects and the effectiveness of the quality control program in each.
In particular, there should be a clear and explicit response to the QC require-
ments stated in the RFP. This response must be compared directly, item-by-item,
with other proposals submitted against the RFP. The evaluation should result
in a determination of a "figure of merit" for the bidder's quality control
organization and the competence of the staff.
If a contractor has a good proposal but is unclear on some phases of data
quality, it would seem worthwhile to have him clarify his proposal by asking
him to answer specific questions. If the answers to these questions are still
vague, it is a good indication that the quality for these phases of the project
may be questionable if this contractor carries out the project.
*It is understood that, because of the nature of the proposed work, it may not
be possible to specify either the expected data quality or the way in which
data quality is to be insured.
-------
2.3 Evaluation of Quality Control in the Work Plan
The work plan should be a detailed accounting of the actual steps to be
taken to complete the work delineated in the proposal and should be in direct
accord with the requirements of the RFP and other agreements with the project
officer. Particular attention should be paid to mutually agreed-upon critical
areas in order to realize the collection of data having acceptable precision,
accuracy, representativeness, and completeness.
In cases where the submitted proposal has been accepted but lacks the
completeness required by the project officer, the negotiated resolution of the
problem areas should be directly addressed in the work plan, showing the
details of the work to be done.
The work plan must be submitted to the project officer before any work is
begun by the contractor. The plan can be accepted in draft form, which will
allow for minor changes prior to the final plan's acceptance and approval.
2.4 Management Commitment to Quality Control
No quality control program, regardless of the amount of planning or level
of effort expended, will be effective without the explicitly visible support
of top management. The support should be expressed initially as the project
gets underway and periodically throughout the duration of the program. The
support of top management then filters down through middle and lower management
to the operators, resulting in a program where QC is practiced on a day-to-day
basis, rather than being an additional program or nuisance. Quality control
must be a built-in, functional area within the total program, and this is not
possible without continuing obvious management support.
2.5 Quality Control in the Organizational Structure
Support for quality control is most visible when the organizational struc-
ture has provision for personnel whose authority and responsibilities lie in
the area; i.e., a quality control coordinator (QCC) and/or any other staff
appropriate to the program. The QCC is responsible for the organization's
entire QC program, and this person's judgment determines the effectiveness of
the program. The basic function of the QCC should be the fulfillment of the
QC objective of management in the most efficient and economical manner
-------
commensurate with insuring continuing completeness, accuracy, and precision of
the data produced. The responsibilities and authority of the QCC are detailed
in the Quality Assurance Handbook for Air Pollution Measurement Systems, Vol. 1,
Principles, EPA-600/9-76-005.*
The QCC should have, within the main organizational structure, a subordi-
nate organization for QC activities (auditing, calibration, quality control).
He should have authority for assignment of QC duties and for coordination of
the entire program, and must not be directly subordinate to operational person-
nel in the project.
2.6 Assessment of Quality Control Requirements
The establishment of a QC program for a demonstration project requires
first of all the setting, in as quantitative a manner as possible, of project
objectives. The desired precision and accuracy of each measurement should be
specified, as well as the technical means of attaining this degree of data
quality; i.e., the tasks to be performed. Once this is done, it is efficient
to group the tasks organizationally and assign responsibility for the QC function
to each task group. It is inevitable that problems will be incurred in each step
of the planning and establishment of the QC program. Some of these cannot be
resolved until the program enters the functional stage. What is important
initially is that these problems be identified and clearly stated, so that they
can be resolved as quickly as possible once the program gets underway.
2.7 Specific Areas of Concern for Demonstration Project Quality Control
Programs
A quality control program for a demonstration project serves to:
1. Evaluate the overall adequacy of the project insofar as
data quality is concerned;
2. Identify potential as well as existing problems in the
data-producing system, from measurement to data reduction;
*Some of the general discussion of QC programs in this report has been taken
from this document.
-------
3. Stimulate research into and discussion of alternative
methods for obtaining data of the required quality.
It is advisable to delineate a number of important aspects of the project
which have direct bearing on data quality, and to discuss each of these in
some detail.
2.7.1 Facilities and Equipment
An obvious beginning point in the assessment of an ongoing program is a
general survey of the facilities and equipment available for day-to-day opera-
tion of the project. Are they adequate for the job at hand? Do standards
exist for evaluation of facilities, equipment, and materials?
The laboratories, data processing sections, and other operation areas
should be neat and orderly, within common-sense limits imposed by the nature
of the facility. A neat, well-organized laboratory area serves to inspire
neatness and organization among the laboratory workers.
Good laboratory maintenance, particularly for certain types of instru-
mentation, requires complete manuals, kept in a convenient place so that they
are readily available to appropriate personnel. Responsibility for keeping
up with all necessary manuals should be given to an individual, with the under-
standing that he must devise a system (checkin-checkout) for quick location
of each document.
2.7.2 Configuration Control
The documentation of design changes in the system must be carried out
unfailingly. Procedures for such documentation should be written, and be
accessible to any individual responsible for configuration control.
2.7.3 Personnel Training
It is highly desirable that there be a programmed training system for new
employees. This system should include motivation toward producing data of
acceptable quality. This is to be preferred to on-the-job training,
which may be excellent or slipshod, depending upon a number of circumstances.
-------
A thorough personnel training program should focus particular attention
on those people whose work directly affects data quality (calibration personnel,
bench chemists, etc.). These people must be cognizant of the quality standards
fixed for the project and the reasons for those standards. They must be made
aware of the various ways of achieving and maintaining quality data. As these
people progress to higher degrees of proficiency, their accomplishments should
be reviewed and then documented.
2.7.4 Documentation Control
Procedures for making revisions to technical documents must be clearly
written out, with the lines of authority indicated. The revisions themselves
should be written and distributed to all affected parties, thus insuring that
the change will be implemented and will become permanent.
2.7.5 Control Charts
Control charts are essential as a routine day-to-day check on the consis-
tency or "sameness" of the data precision. A control chart should be kept for
each measurement that directly affects the quality of the data. Typically,
control charts are maintained for duplicate analyses, percent isokinetic
sampling rate, calibration constants, and the like. An example control chart
is given as figure 1. The symbol σ (sigma) represents a difference, d, of one
standard deviation unit in two duplicate measurements, one of which is taken
as a standard, or audit, value. Two σ is taken as a warning limit and 3σ as
a control limit.
2.7.6 In-Process Quality Control
During routine operation, critical measurement methods should be checked
for conformance to standard operating conditions (flow rates, reasonableness
of data being produced, and the like). The capability of each method to pro-
duce data within specification limits should be ascertained by means of appro-
priate control charts. When a discrepancy appears in a measurement method, it
should be analyzed and corrected as soon as possible.
-------
[Figure: a control chart of the difference d plotted against check number,
with a center line (CL), warning limits at ±2σ, and action limits at ±3σ
(UCL and LCL), and columns for recording date/time, operator, and problem
and corrective action.]
Figure 1. Standard quality control chart.
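The use of such limits can be sketched in code. The following minimal example (Python) estimates σ from historical duplicate-analysis differences and classifies new checks against the 2σ warning and 3σ action limits; all numbers are illustrative, not data from the report:

```python
import statistics

# Differences, d, between duplicate analyses, gathered while the
# method was known to be in control (illustrative values).
historical_d = [0.02, -0.01, 0.03, -0.02, 0.00, 0.01, -0.03, 0.02]
sigma = statistics.stdev(historical_d)

def classify(d, sigma):
    """Classify a new duplicate difference against control-chart limits."""
    if abs(d) > 3 * sigma:
        return "action"      # beyond the 3-sigma control (action) limit
    if abs(d) > 2 * sigma:
        return "warning"     # beyond the 2-sigma warning limit
    return "in control"

for d in [0.01, 0.05, 0.09]:
    print(d, classify(d, sigma))
```

In practice σ would be re-estimated periodically from in-control data, and any flagged point would be entered on the chart along with the problem and corrective action taken.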
-------
2.7.7 Procurement and Inventory Procedures
There should be well-defined and documented purchasing guidelines for all
equipment and reagents having an effect on data quality. Performance specifi-
cations should be documented for all items of equipment having an effect on
data quality. In the case of incoming equipment, there should be an established
and documented inspection procedure to determine if procurements meet the quality
assurance and acceptance requirements. The results of this inspection procedure
should be documented.
Once an item has been received and accepted, it should be documented in a
receiving record log giving a description of the material, the date of
receipt, results of the acceptance test, and the signature of the responsible
individual. It is then placed in inventory, which should be maintained on a
first-in, first-out basis.
2.7.8 Preventive Maintenance
It is most desirable that preventive maintenance procedures be clearly
defined and written for each measurement system and its support equipment.
When maintenance activity is necessary, it should be documented on standard
forms maintained in log books. A history of the maintenance record of each
system serves to throw light on the adequacy of its maintenance schedule and
parts inventory.
2.7.9 Reliability
The reliability of each component of a measurement system relates directly
to the probability of obtaining valid data from that system. It follows that
procedures for reliability data collection, processing, and reporting should be
clearly defined and in written form for each system component. Reliability
data should be recorded on standard forms and kept in a log book. If this
procedure is followed, the data can be utilized in revising maintenance and/or
replacement schedules.
2.7.10 Data Validation
Data validation procedures, defined ideally as a set of computerized and
manual checks applied at various appropriate levels of the measurement process,
-------
should be clearly defined, in written form, for all measurement systems. Cri-
teria for data validation must be documented. The required data validation
activities (flow-rate checks, analytical precision, etc.) must be recorded on
standard forms in a log book.
Any demonstration project should, on a random but regular basis, have
quality audits performed by in-house personnel. These audits must be independ-
ent of normal project operations, preferably performed by the QCC or appointees
of the QCC. The audits should be both qualitative and quantitative (i.e., they
should include both system reviews and independent measurement checks). For
the system review, a checklist is desirable to serve as a guide for the reviewer.
Such a checklist is included as appendix A of this report.
The quantitative aspect of the audit will vary depending on the nature of
the project. Some guidelines for quantitative audits are given in appendix B.
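A minimal sketch of such data validation checks (Python) is given below; the range and step-change limits are assumed for illustration and are not values from the report:

```python
def validate(readings, low, high, max_step):
    """Flag readings outside an allowed range or changing too fast.

    Returns a list of (index, value, reason) tuples suitable for
    recording in a log book. The limits are illustrative assumptions.
    """
    flags = []
    prev = None
    for i, x in enumerate(readings):
        if not (low <= x <= high):
            flags.append((i, x, "out of range"))
        elif prev is not None and abs(x - prev) > max_step:
            flags.append((i, x, "excessive step change"))
        prev = x
    return flags

# Example: pH readings checked against assumed limits.
flags = validate([5.2, 5.3, 9.8, 5.1, 4.2], low=4.5, high=6.5, max_step=0.5)
for f in flags:
    print(f)
```

Equivalent checks can be applied at each level of the measurement process, manually or by computer, with the flagged records routed to the feedback and corrective action mechanism.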
2.7.11 Feedback and Corrective Action
Closely tied to the detection of invalid data is the problem of establish-
ment of a closed loop mechanism for problem detection, reporting, and correction.
Here it is important that the problems are reported to those personnel who can
take appropriate action. A feedback and corrective action mechanism should be
written out, with individuals assigned specific areas of responsibility.
2.7.12 Calibration Procedures
Calibration procedures are the crux of any attempt to produce quality data
from a measurement system. For this reason it is extremely important that the
procedures be technically sound and consistent with whatever data quality re-
quirements exist for that system. Calibration standards must be specified for
all systems and measurement devices, with written procedures for assuring, on
a continuing basis, traceability to primary standards. Since calibration
personnel change from time to time, the procedures must be, in each instance,
clearly written in step-by-step fashion. Frequency of calibration should be
set and documented, subject to rescheduling as the data are reviewed. Full
documentation of each calibration and a complete history of calibrations per-
formed on each system are absolutely essential. This permits a systematic
review of each system's reliability.
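As an illustration of such a calibration history review, the sketch below (Python) flags calibrations whose drift from a reference standard exceeds an acceptance limit; the reference value, readings, and tolerance are hypothetical:

```python
from datetime import date

# Calibration history for one instrument: (date, reading of a
# reference standard whose accepted value is 5.00). Illustrative data.
history = [
    (date(1975, 9, 1), 5.01),
    (date(1975, 10, 1), 5.03),
    (date(1975, 11, 1), 5.06),
]
TRUE_VALUE = 5.00
TOLERANCE = 0.05   # assumed acceptance limit for this device

for when, reading in history:
    drift = reading - TRUE_VALUE
    status = "recalibrate" if abs(drift) > TOLERANCE else "ok"
    print(when.isoformat(), f"drift={drift:+.2f}", status)
```

A steadily growing drift in such a record is the kind of evidence that would justify rescheduling the calibration frequency.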
10
-------
3.0 GUIDELINES FOR DEMONSTRATION PROJECT QUALITY ASSURANCE PROGRAMS
3.1 General Statements
The objective of quality assurance is to independently assess the quality
control program of the project. This assessment should normally take two major
forms: (1) a qualitative audit (systems review), and (2) a quantitative per-
formance audit. These are discussed in detail below, as sections 3.2 and 3.3
respectively.
The frequency of a qualitative and/or a performance audit obviously should
be dictated by the specific project. It is recommended that a minimum frequency
be once each calendar year. The initial systems review and performance audit
should take place within the first quarter of the first project year. Subse-
quent scheduling should be dependent on the requirements of management and the
apparent quality of the day-to-day data being obtained. More frequent auditing
may be necessary in the initial stages of the project.
3.2 The Qualitative Audit
The objective of the qualitative audit is to assess and document facilities;
equipment; systems; recordkeeping; data validation; operation, maintenance, and
calibration procedures; and reporting aspects of the total quality control pro-
gram for demonstration projects. The review should accomplish the following:
1. Identify existing system documentation—i.e., maintenance
manuals, organizational structure, operating procedures, etc;
2. Evaluate the adequacy of the procedures as documented;
3. Evaluate the degree of use of and adherence to the documented
procedures in day-to-day operations based on observed conditions
(auditor) and a review of applicable records on file.
To aid the auditor in performing the review, a checklist is included as
appendix A. This checklist will allow for systematic appraisal of the areas
mentioned above.
11
-------
3.3 The Performance Audit
In addition to a thorough on-site qualitative audit, quantitative perform-
ance audits should be periodically undertaken at each demonstration project.
The objective of these audits is to evaluate the quality of project data by
independent measurement techniques. It is convenient to classify the major
measurement methods into three areas: physical measurements, gas stream
measurements, and liquid stream measurements (the latter including analysis of any
suspended solids). Appendix B lists in matrix form a number of standard tech-
niques for auditing in the three major areas just mentioned. Table 1 of
appendix B is a compilation of commonly measured physical properties, with a
selection of possible measurement, calibration, and audit techniques. Table 2,
concentrating on analysis of gas effluent streams, lists the material to be
analyzed and measurement, calibration, and audit techniques for that material.
Finally, table 3 very briefly and generally deals with measurement methods
appropriate to liquids and solids. The specific techniques vary widely from
project to project, but the audit technique generally involves use of control
(reference) samples of known composition and/or splitting a sample among
several laboratories for independent analyses.
3.4 Material Balances
Material balances serve as a gross indication of the quality of the total
measurement system complex of the project. The extent of closure will be
directly related to the precision and bias of each measurement taken. In
general, both physical measurements of flow rates, temperatures, pressures
(and so on), and chemical analysis of material composition will bear on the
degree of closure attained. The achievable extent of closure must be estimated
for each project and used as a target figure. The frequency with which material
balances are run is related to how successful one is in attaining the estimated
closure over a significant period of time.
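A minimal sketch of the closure computation (Python) follows; the stream values are hypothetical:

```python
def percent_closure(inputs_kg, outputs_kg):
    """Percent closure of a material balance: 100 * total out / total in.

    Values near 100 indicate good closure; the achievable figure must be
    estimated separately for each project and used as a target.
    """
    total_in = sum(inputs_kg)
    total_out = sum(outputs_kg)
    return 100.0 * total_out / total_in

# Hypothetical sulfur balance around a scrubber (kg/hr):
inputs = [120.0]            # sulfur entering with the flue gas
outputs = [98.0, 19.0]      # sulfur leaving in slurry solids and outlet gas
print(round(percent_closure(inputs, outputs), 1))
```

Since every term carries the precision and bias of its own measurement method, a shortfall from the target closure points back to the individual flow and composition measurements.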
3.5 Assessment of Data Quality
Standard methods exist for estimation of the precision and accuracy of
measurement data. Efficient usage of the audit data requires that a rationale
12
-------
be followed which gives the best possible estimates of precision and accuracy
within the limits imposed by timing, sample size, etc.
For a given measurement, the difference between the field (or plant) result
and the audited result,

    d_i = (field value)_i − (audit value)_i ,

is used to calculate a mean and standard deviation as follows:

    d̄ = Σ_{i=1}^{n} d_i / n ,

    s_d = [ Σ_{i=1}^{n} (d_i − d̄)² / (n − 1) ]^{1/2} ,

where d̄ is an estimate of the bias in the measurements (i.e., relative to the
audited value). Assuming the audited data to be unbiased, the existence of a
bias in the field data can be checked by the appropriate t-test, i.e.,

    t = d̄ / (s_d / √n) .
If t is significantly large, say greater than the tabulated value of t
with n - 1 degrees of freedom, which is exceeded by chance only 5 percent of
the time, then the bias is considered to be real and some check should be made
for a possible cause of the bias. If t is not significantly large, then the
bias should be considered zero, and the accuracy of the data is acceptable.
The standard deviation, s_d, is a function of both the standard deviation
of the field measurements and of the audit measurements. Assuming the audit
values to be much more accurate than the field measurements, then s_d is an
estimate of σ{X}, which may be compared with an assumed value σ_o{X} by using
the statistical test procedure
13
-------
    χ²/f = s_d² / σ_o²{X} ,

where χ²/f is the value of a random variable having the chi-square distribution
with f = n − 1 degrees of freedom. If χ²/f is larger than the tabulated value
exceeded only 5 percent of the time, then it would be concluded that the test
procedure is yielding more variable results due to faulty equipment or opera-
tional procedure.
The measured values should be reported along with the estimated biases,
standard deviations, the number of audits, n, and the total number of field
tests, N, sampled (n ≤ N). Estimates (such as s_d and d̄) which are significantly
different from the assumed population parameters should be identified on the
data sheet.
The t-test and χ²-test described above are used to check on the biases and
standard deviations separately.
Other statistical techniques exist which may apply to specific projects
(or to highly specialized areas of a given project). It is usually worthwhile
to acquire the services of a statistical consultant in order to more effectively
treat the available data.
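The t-test and chi-square test described above can be sketched numerically as follows (Python). The audit differences, the assumed σ_o, and the choice of n = 8 are illustrative; the critical values are the standard tabulated 5-percent points for 7 degrees of freedom:

```python
import math
import statistics

# Field-minus-audit differences, d_i, for n audits (illustrative data).
d = [0.4, 0.1, 0.3, 0.5, 0.2, 0.4, 0.3, 0.2]
n = len(d)

d_bar = statistics.mean(d)            # estimated bias, d-bar
s_d = statistics.stdev(d)             # standard deviation of the differences
t = d_bar / (s_d / math.sqrt(n))      # t-statistic for the bias check

sigma0 = 0.3                          # assumed sigma_o{X} (illustrative)
chi2_over_f = s_d ** 2 / sigma0 ** 2  # chi-square statistic divided by f

T_CRIT = 1.895                 # t exceeded by chance 5% of the time, f = 7
CHI2_OVER_F_CRIT = 14.067 / 7  # chi-square 5% point for f = 7, divided by f

print("bias significant:", t > T_CRIT)
print("variability excessive:", chi2_over_f > CHI2_OVER_F_CRIT)
```

In this illustration the bias would be judged real (and its cause investigated), while the variability would be judged acceptable relative to the assumed σ_o{X}.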
3.6 Assessment and Modification of the Ongoing Data Quality Program
The guidelines put forth in the preceding sections serve as a basis for
development of a data quality program specific to the needs of a particular
project. A program should not be attempted without a thorough study of the
entire facility, supplemented by at least one site visit. It is to be desired
that provision for QC be made from the project's inception. The EPA project
officer's responsibility is then to see that a program of adequate QC practices
is incorporated into the day-to-day project operations, along with periodic
QA audits conducted by outside organizations.
Implementation of the program, at whatever point in the lifetime of the
project, exposes weaknesses of approach and problems that were not anticipated
in the planning stages. Certainly it is necessary to maintain maximum flexi-
bility of approach as the interface with project realities is made. The
14
-------
Shawnee report documents several problems of implementation and suggests pro-
cedures for avoiding those problems in future efforts. They are also discussed
in the final section of this report.
Generally, one should expect that a degree of modification would be re-
quired in the areas listed below:
1. audit instrumentation and general equipment requirements,
2. sampling frequencies, and
3. audit personnel requirements.
Experience must always be the final judge of the effectiveness of a data quality
program, as one monitors data quality on a continuing basis.
15
-------
4.0 A SHORT-TERM QUALITY ASSURANCE PROGRAM IMPLEMENTED AT THE SHAWNEE
SCRUBBER FACILITY
A quality assurance program was implemented at the wet scrubber facility,
Shawnee Steam Plant, Paducah, Kentucky. This program was carried out by per-
sonnel of the Research Triangle Institute, Research Triangle Park, North
Carolina. The program consisted of two site visits of approximately 1 week
each, occurring October 28-31 and November 17-21, 1975. The first visit was
largely occupied with the qualitative audit, using as a guide the checklist
provided as appendix A. An evaluation was made of equipment needs for the
second visit, when the quantitative performance audit was conducted. This
section presents the results of both the qualitative review and the performance
audit, with recommendations for a QA program at the facility.
4.1 The Control Laboratory
4.1.1 Measurement of pH
One of the major elements for control of the scrubber is pH measurement.
For this reason a significant part of the RTI effort was spent in observing and
verifying the TVA techniques for pH determination at scrubber inlet and outlet.
A portable pH system is used several times a day by TVA operators to check
the control room readings obtained from inline pH sensors. These devices are
permanently situated in pots through which slurry continuously circulates when
the scrubber is operating. Four such sensors monitor the inlet and outlet pH
for the TCA and Venturi scrubbers.
A series of direct comparison pH measurements was made on November 19-20,
1975. RTI and TVA personnel made simultaneous measurements of slurry pH. The
operators were instructed to carry out their measurements routinely, from
standardization to cleanup. Also during this period the inline probes were
removed from their pots and immersed in pH 5 and 6 buffers. Both RTI and TVA
long-lead probes were put into these same buffers, after independent standard-
ization. The values obtained in the buffers and in slurry are summarized in
table 1. Measurements to the nearest 0.01 pH unit were made using the RTI
Accumet pH meter. Readings of such precision were not possible with the TVA
Table 1. Comparisons* of pH

Test points: 1S1 = venturi effluent hold tank; 1S2 = venturi outlet;
2S1 = TCA effluent hold tank; 2S2 = TCA outlet.

 pH, RTI      pH, TVA      pH, TVA    Temperature   Date         Time
 (portable)   (portable)   (inline)   (°C)          (mo/day/yr)  (hours)
 5.25         5.3          5.16       54            11/19/75     11:15
 5.13         5.2          5.29       54            11/19/75     15:30
 5.05(5)**    4.9          5.04       16            11/20/75     08:45
 5.24         5.2          5.34       54            11/20/75     09:10
 5.28         5.2          5.38       50            11/20/75     09:45
 5.34         5.3          5.34       51            11/20/75     11:15
 5.06         5.1          5.25       50            11/20/75     15:15
 4.92         4.9          5.17       53            11/19/75     11:30
 4.83         4.9          5.02       54            11/19/75     15:30
 4.77         4.8          5.08       50            11/20/75     15:15
 5.03(5)      --           4.82       21            11/19/75     10:00
 6.02(6)      6.0          5.77       21            11/19/75     10:30
 5.03(5)      5.0          4.99       21            11/19/75     10:50
 6.01(6)      6.0          6.12       21            11/19/75     11:00

*Unless otherwise noted, measurements are on slurry in inline probe pots.
**Parenthesized number indicates measurement of a buffer solution of pH 5 or 6.
Orion pH meter, where estimates of 0.1 pH unit were made. Actual scale mark-
ings on the TVA meter were at 0.2-pH unit intervals.
Two observations are in order, after study of table 1:
1. The RTI and TVA portable (long-lead) pH systems agreed within 0.1
unit or better in every comparison made. This verifies the accuracy of the
TVA portable system, since the RTI system was standardized against certified
buffer. Operator reading errors are probably the largest error source.
2. The inline system readings differed from RTI readings from 0 to 0.3
pH unit, with the mean difference being 0.14 pH unit over 14 readings. Certain-
ly it would be unwise to dwell on the significance of the statistics of such
a brief study. One point can be made, however, with respect to the confidence
placed in the inline pH readings, as follows: it appears unlikely that, using
the present system, pH measurements on the slurry can be made to better than
0.1 pH unit. There are a number of factors which militate against greater
accuracy, the major one probably being the nature of the slurry itself. This
viscous, highly abrasive suspension tends to clog lines, coat out on probe
surfaces, etc., making reproducible measurements quite difficult. The non-
equilibrium mixture of reactive chemicals has a pH that will change on removal
from the scrubber proper; i.e., as it flows into the pots within which measure-
ments are made.
Another factor is the difficulty of standardization of inline probes.
The present system calls for probe removal, cleaning, and standardization
roughly each 2 days. The accuracy of the pH reading is surely dependent on
the condition of the probe surface, and restandardizing is ideally done shortly
before each measurement.
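The inline-versus-portable comparison statistic quoted above (a mean difference of 0.14 pH unit over 14 readings) is simply the mean of the paired absolute differences. A minimal sketch in Python, using the first five paired readings as recovered from table 1 (the row-by-row pairing is an assumption of this sketch):

```python
# Sketch of the inline-vs-portable comparison statistic: the mean of
# the absolute differences between paired pH readings.
rti_portable = [5.25, 5.13, 5.05, 5.24, 5.28]  # first five RTI readings, table 1
tva_inline = [5.16, 5.29, 5.04, 5.34, 5.38]    # inline readings, same rows

diffs = [abs(a - b) for a, b in zip(rti_portable, tva_inline)]
mean_diff = sum(diffs) / len(diffs)
print(f"mean absolute difference: {mean_diff:.2f} pH unit")
```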
4.1.2 Slurry Analysis
As a check on the reliability of the chemical analysis phase of the
scrubber operation, a series of slurry samples was collected* and sent to
several other laboratories for independent analysis of both the liquid phase
and suspended solids.
*All samples were taken from the venturi effluent hold tank.
The laboratories originally selected for participation in this phase of
the audit were two TVA laboratories (Chattanooga and Muscle Shoals), E/ETB
(EPA-RTP), and RTI. The E/ETB laboratory later declined to participate in the
project.
Each laboratory was given five 1-liter samples of slurry, which were taken
concurrently with control laboratory samples. The solid was filtered, dried,
and analyzed for calcium, magnesium, and total sulfur. The filtrate
was analyzed for calcium, magnesium, sodium, potassium, and chloride.
Complete results of the analyses are given as appendix C. These results
are given in matrix form, both by laboratory and by element. This report will
present a limited statistical analysis of the data and will comment on the
techniques used by each laboratory.
The five slurry samples were taken over a 36-hour period, all from the
effluent hold tank. There was little apparent change in the slurry composition
over this period of time. Taking a simple numerical average of the five analy-
ses for each element yields a number which itself has little significance, since
it represents the combined effect of analytical uncertainty and slurry composi-
tion change over 36 hours. The rationale for obtaining such an average is that,
if the analytical technique exhibits a bias, this will result in a number that
is correspondingly biased. Thus a comparison can be made among the participating
laboratories and the various analytical techniques. Table 2 is a matrix of
these averages, with the analytical technique used by the laboratory given
underneath the number. A few observations are in order:
1. Analysis results for calcium in the solid were extremely close
among the Shawnee, Chattanooga, and Muscle Shoals laboratories,
but RTI obtained a considerably lower value by AA. The Shawnee
XRF standard value for calcium was established by sending por-
tions of the standard to various TVA laboratories, including
the Chattanooga and Muscle Shoals facilities. Also, note that
these two laboratories both used the same technique, EDTA
titration. Further work would be required to determine which
technique, EDTA or AA, is inherently more accurate.
2. Results for magnesium show the expected large variation for an
element present in low concentration.
Table 2. Mean values for slurry analyses, by laboratory

                               SHAWNEE       CHATTANOOGA    MUSCLE SHOALS   RTI
 Ca (CaO), wt % in solid       23.19         22.93          23.16           19.72
                               XRF           EDTA           EDTA            AA
 Mg (MgO), wt % in solid       0.29          0.60           0.25            0.42
                               XRF           EDTA           AA              AA
 Total sulfur (SO3),           31.99         30.00          30.74           33.01
   wt % in solid               XRF           BaSO4 ppt.     BaSO4 ppt.      BaSO4 ppt.
 Ca, ppm in liquid             1929          1700           1759            1800
                               AA            EDTA           EDTA            AA
 Mg, ppm in liquid             697           772            736             932
                               AA            EDTA           AA              AA
 Na, ppm in liquid             71            65             39              74
                               AA            FE             FE              AA
 K, ppm in liquid              128           108            56*             106
                               AA            FE             FE              AA
 Cl, ppm in liquid             3580          3601           3680            3780
                               Pot.          Volhard        AgNO3           Volhard
                               titration                    titration

 XRF = X-ray fluorescence                AA = atomic absorption
 EDTA = ethylenediaminetetraacetic       FE = flame emission
        acid titration                   BaSO4 ppt. = precipitation as
                                                      barium sulfate

* Value discarded; second RTI value (different RTI laboratory) of 111 ppm
used for statistical purposes.
3. Total sulfur determinations were consistent between Chattanooga
and Muscle Shoals; RTI's value was high and Shawnee's came in
between, with no large discrepancies.
4. Results in the liquid phase were relatively consistent, although
the Muscle Shoals laboratory obtained extremely low numbers for
sodium and potassium. Chattanooga, which also used flame
emission, got results consistent with the AA determinations of
Shawnee and RTI.
5. Chloride determinations showed good consistency across all
laboratories, although an interesting sidelight is that some
preliminary determinations at RTI, using a chloride-sensing
electrode, gave results that were high by roughly 100 percent.
This result was duplicated by a second, non-RTI laboratory,
indicating that the electrical environment of the slurry
liquid phase was unsuitable for chloride determination by
ion-selective electrode.
Table 3 singles out the Shawnee control laboratory results for comparison
with the mean results from the other three laboratories. Close agreement exists
for the most critical elements—calcium (both liquid and solid), total sulfur
(solid), and chloride (liquid). Magnesium in the solid showed the greatest
variation among the comparison laboratories (43 percent) and between Shawnee
and the mean of the comparison laboratories (31 percent). Across the board,
these results indicate the Shawnee control laboratory is performing routine
analyses at about the ±20 percent level, referenced to comparable laboratories.
In particular, it appears to be obtaining rather good accuracy (±5 to 10 per-
cent) in its calcium, sulfur, and chloride analyses. These results are con-
sistent with material balance closures of 5 to 15 percent that have been
repeatedly obtained by Bechtel Corporation.
It should be stressed that the percentages quoted above are simple esti-
mates based upon direct comparisons between the Shawnee results and the mean
of the cooperating laboratories' results. They are not confidence levels. It
would be appropriate to carry out detailed statistical work only if consider-
ably more data were obtained.
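The statistics in table 3 follow directly from the individual laboratory means. A sketch for the liquid-phase calcium row, using the Chattanooga, Muscle Shoals, and RTI means from table 2:

```python
from statistics import mean, stdev

# Liquid-phase calcium means (ppm) from table 2 for the three
# cooperating laboratories, with the Shawnee mean for comparison.
comparison = [1700, 1759, 1800]
shawnee = 1929

m = mean(comparison)                  # mean of cooperating laboratories
s = stdev(comparison)                 # standard deviation about that mean
cv = 100 * s / m                      # coefficient of variation, percent
rel_diff = 100 * shawnee / m - 100    # (Shawnee/mean) x 100 - 100, percent

print(f"mean {m:.0f}, s.d. {s:.0f}, CV {cv:.1f}%, Shawnee difference {rel_diff:+.1f}%")
```

The printed values agree: mean 1753, standard deviation 50, coefficient of variation 2.9 percent, and a +10.0 percent Shawnee difference, matching the calcium-in-liquid row of table 3.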
Table 3. Comparison of Shawnee results with
mean value of cooperating laboratories

                                Mean of        Standard    Coefficient    (Shawnee/mean)
                     Shawnee    Chattanooga,   deviation   of variation   x 100 - 100
 Element             results    Muscle Shoals  about mean  about mean (%) (%)
                                & RTI results
 Ca (CaO),
   wt % in solid     23.19      21.94          1.92        8.8            + 5.7
 Mg (MgO),
   wt % in solid     0.29       0.42           0.18        43             -31
 Total sulfur (SO3),
   wt % in solid     31.99      31.25          1.57        5.0            + 2.4
 Ca, ppm in liquid   1929       1753           50          2.9            +10.0
 Mg, ppm in liquid   697        813            104         12.8           -14.3
 Na, ppm in liquid   71         59             18.2        31             +19.7
 K, ppm in liquid    128        108*           2.5         2.3            +18.5
 Cl, ppm in liquid   3580       3687           90          2.4            - 2.9

* Low Muscle Shoals result thrown out; second RTI result (from another RTI
laboratory) of 111 ppm used in averaging.
-------
A final point is that the Shawnee ionic imbalances for solid and liquid
were typically 2 to 3 percent and 10 to 15 percent respectively, both biased
negatively, during the sampling period. The close solid ionic balance indi-
cates either accurate analyses or balancing positive and negative ion analytical
errors. Results of this study appear to validate the accuracy of the analyti-
cal procedures used on the slurry solid.
The negative ionic imbalance in the liquid phase analyses cannot be
rationalized by the data obtained from this comparison study.
4.1.3 Overall Laboratory Evaluation
The control laboratory operation appears to be adequate for the routine
analytical work it performs. It has no formal quality control program, but
bad data may be flagged by either TVA or Bechtel personnel. Acceptance limits
on data are not formalized, but "reasonableness" is the experience-based
criterion.
There are problems associated with the lack of operator training programs,
incentives for superior performance and the like, but so long as the laboratory
operations remain strictly routine these problems are not likely to seriously
hamper the program.
Equipment and instrumentation are appropriate for the type of work done,
and it is maintained on a regular basis (largely by service contracts).
4.2 Gas Stream Sampling
4.2.1 Particulate Mass Loading
Side-by-side duplicate runs were not attempted. The entire sampling
procedure was observed, with critical techniques checked repeatedly, during the
site visits. Overall performance was evaluated using a checklist. On a scale
of 1 to 5, ranging from unacceptable to excellent, the Shawnee particulate
loading technique was rated 3 (acceptable). A major problem appeared to be the
failure of Shawnee personnel to carry out adequate leak-checking of the sampling
train. Specific comments are given below.
4.2.1.1 Pitot tube comparison.
A comparison of TVA and RTI pitot tubes was performed at the Venturi inlet
only. A check of the outlet tube was not carried out due to the outlet tube
misalignment (> 30°) along its roll axis.
Side-by-side measurements were performed. Based upon comparison with the
RTI (NBS calibrated) pitot tube, the TVA tube Cp factor was 0.879. The assumed
value was 0.850. The difference was considered to be negligible.
4.2.1.2 Temperature measurement.
A system capable of measuring the stack gas temperature to within 1.5 per-
cent of the minimum absolute stack temperature is required. The temperature-
measuring system (inlet sampler) was checked versus a calibrated thermocouple
and was found to be within 1 percent.
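Because the 1.5 percent criterion is relative to the minimum absolute stack temperature, the allowed error in kelvin follows directly. A sketch with a hypothetical minimum stack temperature of 50° C (a value chosen for illustration, not taken from the audit):

```python
# The 1.5 percent tolerance is relative to the minimum absolute stack
# temperature; the 50 C minimum here is a hypothetical value.
t_min_k = 50.0 + 273.15              # minimum stack temperature, kelvin
allowed_error = 0.015 * t_min_k      # 1.5 percent of minimum absolute temp
print(f"allowed temperature error: {allowed_error:.1f} K")
```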
4.2.1.3 Moisture measurement.
The impinger section of the EPA sampling train is intended to collect
moisture from the sample gases for determination of moisture content. The last
impinger contains silica gel to adsorb the water vapor not condensed in the
first two impingers. The moisture content of the sample gas leaving the silica
gel impinger increases as the exit gas temperature rises. Also, the exit gas
moisture content will increase as the sample train vacuum increases at any one
sample temperature. Moisture not collected by the condensation system is
incorrectly measured as dry gas by the dry test meter and the error is carried
through the isokinetic and grain loading calculations. However, if the exit
gas temperature is held below 25° C and the train vacuum is held below 380 mm
of Hg, the resulting error in the sample volume will be less than 2 percent.
A single RTI reading of exit gas temperature was 22° C.
There was evidence of significant moisture accumulation in the silica gel,
indicating the presence of some water vapor in the total gas volume measured.
This does not likely introduce a large error into the technique, although it
would be advisable to make quantitative or semiquantitative checks on the
actual water volume collected versus water content of the stack gas. This is
not presently being done at Shawnee.
4.2.1.4 Volume measurement.
The sampling train was checked for accuracy of volumetric measurement with
a dry test meter (1 cf/revolution) which had been previously cali-
brated versus a 1-cf wet test meter. The RTI meter was connected directly to
the Shawnee probe tip, so that the actual volume intake at the probe was
measured. RTI volume was 15.8 percent lower than TVA volume, indicating a
rather large positive bias in the TVA measurement. Critical examination of the
TVA sampling system led to the conclusion that the bias could be attributed to
leaks in the system (broken or cracked polycarbonate impinger tubes, loose
probe tip, etc.). Leakage rate was estimated to be 0.36 cfm at 380 mm of Hg
vacuum.* Inaccuracies in volume measurements appear directly in the concentra-
tion and particulate mass emission rate determinations.
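The volume bias itself is a simple relative difference between the two integrated volumes. A sketch with hypothetical meter readings, chosen only so that the audit volume is 15.8 percent below the site volume as reported above:

```python
# Sketch of the volume comparison described above. The readings are
# hypothetical, chosen only to reproduce a 15.8 percent difference.
v_audit = 42.1   # audit (reference) dry-gas volume, cubic feet
v_site = 50.0    # site dry-gas meter volume, cubic feet

# Percent by which the audit volume falls below the site meter volume
pct_lower = 100 * (v_site - v_audit) / v_site
print(f"audit volume lower by {pct_lower:.1f}%")
```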
A probe tip diameter check was made with a micrometer. The range of the
diameter measurements was 0.7 mm, indicating a severely out-of-round nozzle
which should be repaired or replaced. The estimated nozzle area was calculated
to be roughly 20 percent lower than the assumed area (0.583 cm² calculated,
0.7125 cm² assumed). An error in the nozzle diameter is quadrupled in the
process of determining isokinetic sampling rates and is doubled in the percent
of isokinetic sampling calculation. The percent isokinetic, as calculated with
respect to the above errors in volume measurement and nozzle diameter could
result in either a positive or negative bias, depending upon which factor
predominates.
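The quadrupling and doubling noted above follow from the fourth-power and second-power dependence on nozzle diameter that the report describes (the rate-setting term varies as d⁴, the nozzle area as d²). A sketch with a hypothetical diameter and a 5 percent diameter error:

```python
# A diameter error is roughly doubled in the nozzle-area (d^2) term and
# quadrupled in the rate-setting (d^4) term. Hypothetical values only.
d_true = 0.953                 # hypothetical nozzle diameter, cm
d_meas = d_true * 1.05         # diameter measured 5 percent high

area_err = (d_meas**2 - d_true**2) / d_true**2   # roughly 2 x 5 percent
rate_err = (d_meas**4 - d_true**4) / d_true**4   # roughly 4 x 5 percent
print(f"area term error {100 * area_err:.1f}%, rate term error {100 * rate_err:.1f}%")
```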
4.2.2 Sulfur Dioxide Concentration Determinations
Sulfur dioxide concentrations at the wet limestone scrubber facility are
determined by means of du Pont Model 400 photometric analyzers. The analyzers
continuously monitor inlet and outlet gas streams of the venturi and TCA units.
The RTI audit team collected a total of 23 gas samples, all collected at
the venturi inlet. Thirteen of these samples were analyzed by a modified
barium chloranilate (colorimetric) method, the remaining 10 by sodium hydroxide
titration.

*TVA leak-checking was not observed by the audit team. A thorough leak-check
would surely have detected such a significant leak-rate.

Results are given in table 4. The average bias of the photometric
method with respect to the wet chemical methods was +6.9 percent, with a stand-
ard deviation of 8.7 percent. These results indicate that the du Pont analyzer
at the venturi inlet is yielding data of high quality. At the 95 percent con-
fidence level, an individual photometric determination should have a precision
of ±18 percent of the mean concentration, biased 7 percent high on the average.*
Due to the time limitation of the audit team, it was not possible to run checks
on the other three analyzers.
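The bias statistics quoted above (mean, standard deviation, and a t-based 95 percent interval for an individual determination) can be sketched as follows; the five bias values are the first entries of table 4's bias column and the t factor is for 4 degrees of freedom, so the printed output is illustrative only (the audit's own figures used all 21 comparisons):

```python
from statistics import mean, stdev

# Sketch of the bias statistics: mean, standard deviation, and a
# t-based 95 percent interval for an individual determination.
biases = [7.3, 9.0, -23.3, 5.9, -0.1]   # first five percent biases, table 4

avg = mean(biases)
sd = stdev(biases)
t_975 = 2.776                            # t(0.975) for 4 degrees of freedom
interval = t_975 * sd                    # 95% interval, individual determination

print(f"average bias {avg:+.1f}%, s.d. {sd:.1f}%, 95% interval ±{interval:.1f}%")
```

Applied to all 21 comparisons with the appropriate t factor, the same calculation yields the +6.9 percent average bias and ±18 percent interval quoted above.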
4.3 Process Instrumentation
A series of calibration checks on the electronic instrumentation was
scheduled. Some checks and observations were not carried out because of TVA
personnel work schedules, but enough was accomplished for a judgment to be made
as to the quality of the instrumentation facilities.
Three types of sensors (temperature, differential pressure, and flow rate)
are the primary sources of measurement information being recorded and used for
the scrubber's mechanical operation control, and four readout devices are
employed for visual display of the output signals. The methods of test and
calibration are simple, using rudimentary sources of stimuli for sensor exam-
ination. Straightforward electrical current measuring instruments are used to
monitor currents produced by the transmitters. As performed, the tests are
sufficient to maintain the quality of measurement to the degree established by
the manufacturers in their design specifications.
The performance of equipment over time can best be judged by a review of
accurate records which clearly show a life history of each item having a
functional part in the operation of a system. The Shawnee facility was judged
deficient in recordkeeping for its instrumentation. In spite of this, it was
felt that the electronic devices used for physical measurements were being
maintained sufficiently to provide pressure, level, and flow information to a
±2 percent tolerance of desired nominal values, and temperature information
to a ±10 percent tolerance of desired information (temperature sensors can be
*This assumes no bias in the wet methods.
Table 4. Comparison of SOx determinations, Sampling Train No. 1

                                 SO2 by        SOx in
                                 barium        isopropanol          SO2 by      SO2 by TVA  (TVA-RTI)/RTI
 Date       Sample  Sample       chloranilate  scrubber     Total   NaOH        (du Pont    x 100
 (mo/da/yr) number  time         (ppm)         (ppm)        (ppm)   titration*  analyzer)   (%)

 11/18/75     1     17:15-17:25  4187          543          4730                2750        + 7.3
 11/18/75     2     17:35-17:45  2736          650          3386                2750        + 9.0
 11/19/75     3     10:15-10:23  Sample voided                                  3625
 11/19/75     4     10:42-10:50  Sample voided                                  3585
 11/19/75     5     11:07-11:21  2261          530          2791                3366        -23.3
 11/19/75     6     12:08-12:17  2172          563          2735                3167        + 5.9
 11/19/75     7     12:43-12:49  2309          530          2839                3051        - 0.1
 11/19/75     8     12:53-13:02  1958          551          2509                3046        + 9.1
 11/19/75     9     13:39-13:47  1582          523          2105                3016        + 6.5
 11/19/75    10     13:50-13:59  1827          643          2470                2985        + 9.1
 11/19/75    11     14:22-14:31  1750          610          2360                2998        + 6.6
 11/19/75    12     14:32-14:41  1724          541          2265                2947        + 3.8
 11/19/75    13     15:06-15:15  1690          637          2327                2893        + 4.9
 11/19/75    14     15:17-15:27                                     2562        2909        +15.9
 11/20/75    15     09:16-09:27                                     2252        2506        +19.0
 11/20/75    16     09:30-09:40                                     3053        2509        + 3.0
 11/20/75    17     09:53-10:04                                     2832        2505        + 1.4
 11/20/75    18     10:05-10:15                                     2812        2508        + 5.2
 11/20/75    19     10:22-10:33                                     2757        2570        + 8.9
 11/20/75    20     10:41-10:52                                     2435        2600        + 5.7
 11/20/75    21     10:58-11:09                                     2385        2606        +15.1
 11/20/75    22     11:11-11:22                                     2459        2648        +15.4
 11/20/75    23     11:37-11:48                                     2294        2719        +16.8

 Average bias: +6.9%     Std. dev.: 8.69%     95% confidence interval: ±18.08%

*SO2 accepted on a total acid determination (TVA analysis).
calibrated to a ±2 percent tolerance of a known temperature—the inaccuracies
are estimated to be high because of the lack of knowledge of the thermodynamics
of the stack gases being measured).
4.4 Recommendations
Specification of a quality control program for the Shawnee scrubber project
does not fall within the scope of this report. The recommendations made in the
following paragraphs apply to the implementation of a (qualitative and perform-
ance audit) QA program.* This type of program normally should be carried out
by an organization which has no special interest in the data; i.e., no self-
interest to protect and no preconceptions as to the quality of the information
forthcoming. On the other hand, the organization should be reputable and well
qualified to carry out the type of auditing program desired.
In the case of EPA demonstration projects, EPA may wish to contract a
third party to handle the audit program, or it may handle the program by means
of its own QA staff. In either case, it is quite important that the auditing
be done competently and objectively.
It is recommended that the wet limestone scrubber operation located at
the Shawnee steam-electric plant be externally audited twice each calendar
year. Timing of the audit program, which normally should take 1 work week,
should be coordinated among the auditing team, EPA, TVA, and the Bechtel
Corporation. Some advance notice is necessary in order to insure cooperation
of operational personnel. It is not recommended that the audits be scheduled
on a regular basis, since by definition an audit is conducted without extensive
"preparation" at the project being audited. Advance notice to EPA and Bechtel
supervisory staff should be at least 2 weeks, so that the audit team can be
apprised of special test and analysis schedules which may alter its audit
procedure or cause postponement of the audit itself. Advance notice to TVA
staff (senior chemist, instrumentation foreman) should be at least 1 week.
*It is important that the Shawnee project continue to develop its own QC
program internally.
It is recommended that the audit team concentrate its efforts in the
following major areas:
1. Verification of pH measurements at inlet and outlet, on both
TCA and venturi scrubbers. An accurate pH meter brought in by
the audit team, with appropriate buffer solutions, should be
used. Measurement of pH is critical to efficient process
control at this facility.
2. Independent chemical analysis of slurry samples by several
laboratories. A continuing audit program can aid in
establishing acceptance limits and method biases.
3. Verification of particulate mass loading and sulfur dioxide
measurement systems. If possible, side-by-side operation of
TVA and audit team sampling trains should be carried out,
with independent analyses of the collected samples. A wet
chemical technique such as total acid titration should be
used to check the SO2 analyzer response. If a duplicate
sampling train cannot be used by the audit team, then
critical measurement parameters should be identified and
checked. For stack sampling procedure, this includes (at
a minimum):
a. Sample volume measurement check by means of a calibrated
wet test or dry gas meter;*
b. Pitot tube (Cp factor) check by means of an NBS calibrated
pitot tube;
c. Thermometer and thermocouple checks with a calibrated
temperature measurement system;
d. Stack gas moisture content check by means of an absorbing
impinger train.
4. Electronic checks on process instrumentation and physical
measurement techniques.
*For suggestions as to techniques available, see "Process Stream Volumetric Flow
Measurement and Gas Sample Extraction Methodology," by Brooks and Williams.
This manual (TRW Document No. 24916-6028-RU-00) was prepared under EPA Contract
No. 68-02-1412, for the Process Measurements Branch of IERL.
5.0 EVALUATION OF THE SHORT-TERM QUALITY ASSURANCE PROGRAM AT SHAWNEE
The objective of this project was to devise, implement, and modify a
general quality assurance program for IERL demonstration projects. It is ap-
propriate then to evaluate the procedures which RTI used in its qualitative
and performance audit at the Shawnee scrubber facility.
5.1 Qualitative Audit
The audit checklist (appendix A) is a valuable device for assuring a
balanced review of each major area. For the control systems laboratory opera-
tions, the checklist was quite useful. It is recommended that either this
checklist or a similar one be used each time a review is done. One note of
caution is in order. No checklist can substitute for experience and common
sense. Each facility will require a somewhat different approach by a QA team.
The checklist alone will not suffice. It will be particularly appropriate with
certain phases of a project, but in other areas a good part of the questionnaire
will not be applicable.
5.2 Quantitative Performance Audit
The quantitative performance audit at Shawnee emphasized several potential
problem areas of which an audit team should be cognizant. These are discussed
in the subsections below.
5.2.1 Scheduling
If samples are to be taken for round robin purposes, it is important that
the laboratory being audited perform complete analyses on each sample to be
distributed to outside laboratories. This requires close supervision by the
member of the auditing team responsible for sample collection. At Shawnee,
the sampling schedule (at the time of the audit) called for analysis of the
solid at certain times of the day, and both solid and liquid at other times.
Because of the preoccupation of the senior chemist with routine operations,
this schedule was not made clear and several samples (which required complete
analyses for comparison purposes) were collected at times calling for solid
analysis only. When this was discovered, it was necessary to ship some surplus
samples (which RTI was fortunate to have kept) back to the Shawnee laboratory
for liquid analysis. Had the surplus samples not been available, it would have
been necessary to collect a second series of samples at another time. This
example serves to point out the need for complete understanding, on the part of
both senior laboratory personnel and audit team personnel, of the requirements
of the audit.
A related scheduling problem occurred when the Shawnee instrumentation
foreman was unable to provide RTI with the assistance needed to complete cali-
bration and testing of the facility's electronic process monitoring equipment.
It is of overriding importance that efficient scheduling of audit activities
occur, since the total time allowed is normally a few days to 1 week. Un-
scheduled operational problems will occur which could not have been anticipated,
and these must be handled as expeditiously as possible.
5.2.2 Equipment and Instrumentation
It should be unnecessary to emphasize the desirability of having reliable,
high-quality audit equipment and instrumentation. Where feasible, the audit
team should have duplicate (or at least equivalent) backup items of that equip-
ment which is crucial to the audit. For example, if pH measurements are to be
made, two probes should be on hand. If possible, two meters should also be
available. During the Shawnee audit, a measurement problem developed due to a
shielded probe lead making electrical contact with the meter chassis. This
problem had not been discovered earlier because the meter had previously been
used only in nongrounded environments (such as glass beakers). The metal
slurry pots were directly grounded to the scrubber framework and a relatively
large extraneous signal was being read by the meter. This signal, by its magni-
tude, swamped the circuit and pegged the meter dial each time a pH reading was
attempted. Fortunately, the problem was diagnosed as a missing insulating
washer, and a washer was fabricated in time to take a series of readings. It
would have been much more desirable to have had a second meter immediately
available, since pH comparisons were certainly an important aspect of this work.
Spare glassware (and other fragile items) is obviously desirable, since
in an unfamiliar environment laboratory workers are more prone to make mistakes
resulting in breakage.
5,2.3 Personnel Selection
An audit team must be competent and versatile. The RTI team assigned to
the Shawnee project consisted of a physical chemist, an analytical chemist, and
an electrical engineer. Each man knew his responsibility. An audit visit,
because of its short duration, does not allow for inefficient use of personnel
time or audit equipment. The services of the electrical engineer were useful
when the pH meter malfunction was discovered, pointing up the advantages of a
diversity of technical talent. A team composed entirely of chemists might not
have been able to repair the meter in time to make the required number of
measurements.
APPENDIX A QUALITATIVE AUDIT CHECKLIST FOR DEMONSTRATION PROJECTS
This checklist is designed to:
1. Identify existing system documentation; i.e., maintenance manuals,
organizational structure, operating procedures, etc.
2. Evaluate the adequacy of the procedures as documented.
3. Evaluate the degree of use of and adherence to the documented
procedures in day-to-day operations based on observed conditions
(auditor) and a review of applicable records on file.
The checklist gives three descriptions to each facet of a quality control
system. In all cases the "5" choice is the most desirable and effective mode
of operation; "3" is marginal and tolerable; "1" is definitely unacceptable
and ineffective as a mode of operation.
It is not always possible to describe accurately all options with only
three choices. Therefore, a "2" or "4" rating may be selected if the evaluator
feels that an in-between score is more descriptive of the actual situation.
After all the applicable questions are answered, an average is computed
to give an overall indication of the quality system effectiveness.
Generally, a rating of 3.8 or better is considered acceptable.
A rating between 2.5 and 3.8 indicates a need for improvement but no
imminent threat to project performance as it stands.
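The scoring rule described above reduces to an average over the answered questions, compared against the 3.8 and 2.5 thresholds. A sketch with hypothetical scores (the label for an average below 2.5 is inferred from the checklist's description of a "1" rating as unacceptable):

```python
# Sketch of the checklist scoring rule. Scores are hypothetical
# answers on the 1-to-5 scale; unanswered items are omitted.
scores = [5, 3, 3, 4, 2, 3, 5, 3]

average = sum(scores) / len(scores)

if average >= 3.8:
    verdict = "acceptable"
elif average >= 2.5:
    verdict = "needs improvement, no imminent threat"
else:
    verdict = "unacceptable"

print(f"average {average:.1f}: {verdict}")
```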
For the control laboratory, the results are as follows:
1. Of 82 check questions, 65 were answered on site;
2. Average score was 3.0 (5.0 maximum), indicating a satisfactory but
not outstanding program as presently operated;
3. The control laboratory was judged weak in its quality control
organization, procurement, and inventory procedures, and in its
personnel training policy;
4. Strong points were its day-to-day "in-process" quality assurance,
its calibration procedures, and its facilities and equipment.
The completed questionnaire, with indicated judgments in specific areas,
is given herewith. These judgments are for the control laboratory operation
only.
A.1 QUALITY ORGANIZATION
SCORE
(1.1) Overall responsibility for quality assurance (or
quality control) for the organization is:
(a) Assigned to one individual by title (e.g.,
Quality Control Coordinator). 5
(b) Assigned to a specific group within the organization. 3
(c) Not specifically assigned but left to the discre-
tion of the various operational, analytical, inspec-
tion, and testing personnel. 1
(1.2) The Quality Control Coordinator is located in the
organization such that:
(a) He has direct access to the top management level
for the total operation, independent of others in-
volved in operational activities. 5
(b) He performs as a peer with others involved in
operational activities, with access to top manage-
ment through the normal chain of command. 3
(c) His primary responsibility is in operational
activities, with quality assurance as an extra or
part-time effort. 1
(1.3) Data reports on quality are distributed by the Quality
Control Coordinator to:
(a) All levels of management.* 5
(b) One level of management only. 3
(c) The quality control group only. 1
(1.4) Data Quality Reports contain:
(a) Information on operational trends, required
actions, and danger spots. 5
(b) Information on suspected data/analyses and
their causes. 3
(c) Percent of valid data per month. 1
*Management at appropriate levels in all applicable organizations such
as subcontractors, prime contractor, EPA.
37
-------
A.2 THE QUALITY SYSTEM
SCORE
(2.1) The quality control system is:
(a) Formalized and documented by a set of procedures
which clearly describe the activities necessary
and sufficient to achieve desired quality objec-
tives, from procurement through to reporting data
to the EPA/RTP. 5
(b) Contained in methods procedures or is implicit in
those procedures. Experience with the materials,
product, and equipment is needed for continuity
of control. 3
(c) Undefined in any procedures and is left to the cur-
rent managers or supervisors to determine as the
situation dictates. 1
(2.2) Support for quality goals and results is indicated by:
(a) A clear statement of quality objectives by the top
executive, with continuing visible evidence of its
sincerity, to all levels of the organization. 5
(b) Periodic meetings among operations personnel and the
individual(s) responsible for quality assurance, on
quality objectives and progress toward their achieve-
ment. 3
(c) A "one-shot" statement of the desire for product
quality by the top executive, after which the quality
assurance staff is on its own. 1
(2.3) Accountability for quality is:
(a) Clearly defined for all sections and operators/
analysts where their actions have an impact on
quality. 5
(b) Vested with the Quality Control Coordinator who
must use whatever means possible to achieve quality
goals. 3
(c) Not defined. 1
38
-------
A.2 THE QUALITY SYSTEM (continued)
SCORE
(2.4) The acceptance criteria for the level of quality
of the demonstration project's routine performance are:
(a) Clearly defined in writing for all characteris-
tics. 5
(b) Defined in writing for some characteristics
and some are dependent on experience, memory
and/or verbal communication. 3
(c) Only defined by experience and verbal communica-
tion. 1
(2.5) Acceptance criteria for the level of quality of the
project's routine performance are determined by:
(a) Monitoring the performance in a structured pro-
gram of inter- and intralaboratory evaluations. 5
(b) Scientific determination of what is technically
feasible. 3
(c) Laboratory determination of what can be done using
currently available equipment, techniques, and
manpower. 1
(2.6) Decisions on acceptability of questionable results are
made by:
(a) A review group consisting of the chief chemist or
engineer, quality control, and others who can render
expert judgment. 5
(b) An informal assessment by quality control. 3
(c) The operator/chemist. 1
39
-------
A.2 THE QUALITY SYSTEM (continued)
SCORE
(2.7) The quality control coordinator has the authority to:
(a) Affect the quality of analytical results by in-
serting controls to assure that the methods meet
the requirements for precision, accuracy, sensi-
tivity, and specificity. 5
(b) Reject suspected results and stop any method that
projects high levels of discrepancies. 3
(c) Submit suspected results to management for a
decision on disposition. 1
A.3 IN-PROCESS QUALITY ASSURANCE
(3.1) Measurement methods are checked:
(a) During operation for conformance to operating
conditions and to specifications, e.g., flow rates,
reasonableness of data, etc. 5
(b) During calibration to determine acceptability
of the results. 3
(c) Only when malfunctions are reported. 1
(3.2) The capability of the method to produce within
specification limits is:
(a) Known through method capability analysis (X-R
Charts) to be able to produce consistently
acceptable results. 5
(b) Assumed to be able to produce a reasonably
acceptable result. 3
(c) Unknown. 1
(3.3) Method determination discrepancies are:
(a) Analyzed immediately to seek out the causes and
apply corrective action. 5
(b) Checked out when time permits. 3
(c) Not detectable with present controls and procedures. 1
40
-------
A.3 IN-PROCESS QUALITY ASSURANCE (continued)
SCORE
(3.4) The operating conditions (e.g., flow rate, range,
temperature, etc.) of the methods are:
(a) Clearly defined in writing in the method for each
significant variable. 5
(b) Controlled by supervision based on general guide-
lines. 3
(c) Left up to the operator/analyst. 1
(3.5) Auxiliary measuring, gaging, and analytical
instruments are:
(a) Maintained operative, accurate, and precise
by regular checks and calibrations against
stable standards which are traceable to the
U.S. Bureau of Standards. 5
(b) Periodically checked against a zero point or
other reference and examined for evidence of
physical damage, wear or inadequate maintenance. 3
(c) Checked only when they stop working or when ex-
cessive defects are experienced which can be
traced to inadequate instrumentation. 1
A.4 CONFIGURATION CONTROL
(4.1) Procedures for documenting, for the record, any design
change in the system are:
(a) Written down and readily accessible to those
individuals responsible for configuration con-
trol. 5
(b) Written down but not in detail. 3
(c) Not documented. 1
41
-------
A.4 CONFIGURATION CONTROL (continued)
SCORE
(4.2) Engineering schematics are:
(a) Maintained current on the system and subsystem
levels. 5
(b) Maintained current on certain subsystems only. 3
(c) Not maintained current. 1
(4.3) All computer programs are:
(a) Documented and flow charted. 5
(b) Flow charted. 3
(c) Summarized. 1
(4.4) Procedures for transmitting significant design changes
in hardware and/or software to the EPA project officer
are:
(a) Documented in detail sufficient for implementation. 5
(b) Documented too briefly for implementation. 3
(c) Not documented. 1
A.5 DOCUMENTATION CONTROL
(5.1) Procedures for making revisions to technical documents
are:
(a) Clearly spelled out in written form with the line
of authority indicated and available to all involved
personnel. 5
(b) Recorded but not readily available to all personnel. 3
(c) Left to the discretion of present supervisors/mana-
gers. 1
42
-------
A.5 DOCUMENTATION CONTROL (continued)
SCORE
(5.2) In revising technical documents, the revisions are:
(a) Clearly spelled out in written form and distrib-
uted to all parties affected, on a controlled basis
which assures that the change will be implemented
and permanent. 5
(b) Communicated through memoranda to key people who
are responsible for effecting the change through
whatever method they choose. 3
(c) Communicated verbally to operating personnel who
then depend on experience to maintain continuity
of the change. 1
(5.3) Changes to technical documents pertaining to opera-
tional activities are:
(a) Analyzed to make sure that any harmful side effects
are known and controlled prior to revision effectiv-
ity. 5
(b) Installed on a trial or gradual basis, monitoring
the product to see if the revision has a net bene-
ficial effect. 3
(c) Installed immediately with action for correcting side
effects taken if they show up in the final results. 1
(5.4) Revisions to technical documents are:
(a) Recorded as to date, serial number, etc. when the
revision becomes effective. 5
(b) Recorded as to the date the revision was made on
written specifications. 3
(c) Not recorded with any degree of precision. 1
43
-------
A.5 DOCUMENTATION CONTROL (continued)
SCORE
(5.5) Procedures for making revisions to computer software
programs are:
(a) Clearly spelled out in written form with the line
of authority indicated. 5
(b) Not recorded but changes must be approved by the
present supervisor/manager. 3
(c) Not recorded and left to the discretion of the
programmer. 1
(5.6) In revising software program documentation, the re-
visions are:
(a) Clearly spelled out in written form, with reasons
for the change and the authority for making the
change distributed to all parties affected by the
change. 5
(b) Incorporated by the programmer and communicated
through memoranda to key people. 3
(c) Incorporated by the programmer at his will. 1
(5.7) Changes to software program documentation are:
(a) Analyzed to make sure that any harmful side
effects are known and controlled prior to
revision effectivity. 5
(b) Incorporated on a trial basis, monitoring the
results to see if the revision has a net bene-
ficial effect. 3
(c) Incorporated immediately with action for detecting
and correcting side effects taken as necessary. 1
44
-------
A.5 DOCUMENTATION CONTROL (continued)
SCORE
(5.8) Revisions to software program documentation are:
(a) Recorded as to date, program name or number, etc.,
when the revision becomes effective. 5
(b) Recorded as to the date the revision was made. 3
(c) Not recorded with any degree of precision. 1
A.6 PREVENTIVE MAINTENANCE
(6.1) Preventive maintenance procedures are:
(a) Clearly defined and written for all measurement
systems and support equipment. 5
(b) Clearly defined and written for most of the measure-
ment systems and support equipment. 3
(c) Defined and written for only a small fraction of the
total number of systems. 1
(6.2) Preventive maintenance activities are documented:
(a) On standard forms in station log books. 5
(b) Operator/analyst summary in log book. 3
(c) As operator/analyst notes. 1
(6.3) Preventive maintenance procedures as written appear
adequate to insure proper equipment operation for:
(a) All measurement systems and support equipment. 5
(b) Most of the measurement systems and support equip-
ment. 3
(c) Less than half of the measurement systems and sup-
port equipment. 1
45
-------
A.6 PREVENTIVE MAINTENANCE (continued)
SCORE
(6.4) A review of the preventive maintenance records indicates
that:
(a) Preventive maintenance procedures have been carried
out on schedule and completely documented. 5
(b) The procedures were carried out on schedule but not
completely documented. 3
(c) The procedures were not carried out on schedule all
the time and not always documented. 1
(6.5) Preventive maintenance records (histories) are:
(a) Utilized in revising maintenance schedules, de-
veloping an optimum parts/reagents inventory and
development of scheduled replacements to minimize
wear-out failures. 5
(b) Utilized when specific questions arise and for
estimating future work loads. 3
(c) Utilized only when unusual problems occur. 1
A.7 DATA VALIDATION PROCEDURES
(7.1) Data validation procedures are:
(a) Clearly defined in writing for all measurement
systems. 5
(b) Defined in writing for some measurement systems,
some dependent on experience, memory, and/or
verbal communication. 3
(c) Only defined by experience and verbal communica-
tion. 1
46
-------
A.7 DATA VALIDATION PROCEDURES (continued)
SCORE
(7.2) Data validation procedures are:
(a) A coordinated combination of computerized and
manual checks applied at different levels in the
measurement process. 5
(b) Applied with a degree of completeness at no more
than two levels of the measurement process. 3
(c) Applied at only one level of the measurement pro-
cess. 1
(7.3) Data validation criteria are documented and include:
(a) Limits on: (1) operational parameters such as
flow rates; (2) calibration data; (3) special
checks unique to each measurement, e.g., succes-
sive values/averages; (4) statistical tests, e.g.,
outliers; (5) manual checks such as hand calcula-
tions. 5
(b) Limits on the above type checks for most of the
measurement systems. 3
(c) Limits on some of the above type checks for only
the high-priority measurements. 1
(7.4) Acceptable limits as set are reasonable and adequate
to insure the detection of invalid data with a high
probability for:
(a) All measurement systems. 5
(b) At least 3/4 of the measurement systems. 3
(c) No more than 1/2 of the measurement systems. 1
47
-------
A.7 DATA VALIDATION PROCEDURES (continued)
SCORE
(7.5) Data validation activities are:
(a) Recorded on standard forms at all levels of the
measurement process. 5
(b) Recorded in the operator's/analyst's log book. 3
(c) Not recorded in any prescribed manner. 1
(7.6) Examination of data validation records indicates that:
(a) Data validation activities have been carried out
as specified and completely documented. 5
(b) Data validation activities appear to have been
performed but not completely documented. 3
(c) Data validation activities, if performed, are not
formally documented. 1
(7.7) Data validation summaries are:
(a) Prepared at each level or critical point in the
measurement process and forwarded to the next level
with the applicable block of data. 5
(b) Prepared by and retained at each level. 3
(c) Not prepared at each level nor communicated between
levels. 1
(7.8) Procedures for deleting invalidated data are:
(a) Clearly defined in writing for all levels of the meas-
urement process, and invalid data are automatically
deleted when one of the computerized validation cri-
teria is exceeded. 5
(b) Programmed for automatic deletion when computerized
validation criteria are exceeded but procedures not
defined when manual checks detect invalid data. 3
(c) Not defined for all levels of the measurement pro-
cess. 1
48
-------
A.7 DATA VALIDATION PROCEDURES (continued)
SCORE
(7.9) Quality audits (i.e., on-site system reviews and/or
quantitative performance audits) independent of the normal
operations are:
(a) Performed on a random but regular basis to ensure
and quantify data quality. 5
(b) Performed whenever a suspicion arises that there
are areas of ineffective performance. 3
(c) Never performed. 1
A.8 PROCUREMENT AND INVENTORY PROCEDURES
(8.1) Purchasing guidelines are established and documented
for:
(a) All equipment and reagents having an effect on data
quality. 5
(b) Major items of equipment and critical reagents. 3
(c) A very few items of equipment and reagents. 1
(8.2) Performance specifications are:
(a) Documented for all items of equipment which have
an effect on data quality. 5
(b) Documented for the most critical items only. 3
(c) Taken from the presently used items of equipment. 1
(8.3) Reagents and chemicals (critical items) are:
(a) Procured from suppliers who must submit samples
for test and approval prior to initial shipment. 5
(b) Procured from suppliers who certify they can meet
all applicable specifications. 3
(c) Procured from suppliers on the basis of price and
delivery only. 1
49
-------
A.8 PROCUREMENT AND INVENTORY PROCEDURES (continued)
SCORE
(8.4) Acceptance testing for incoming equipment is:
(a) An established and documented inspection procedure
to determine if procurements meet the quality assurance
and acceptance requirements. Results are document-
ed. 5
(b) A series of undocumented performance tests performed
by the operator before using the equipment. 3
(c) The receiving document is signed by the responsible
individual indicating either acceptance or rejection. 1
(8.5) Reagents and chemicals are:
(a) Checked 100% against specification and quantity,
and for certification where required, and accepted
only if they conform to all specifications. 5
(b) Spot-checked for proper quantity and for shipping
damage. 3
(c) Released to analyst by the receiving clerk without
being checked as above. 1
(8.6) Information on discrepant purchased materials is:
(a) Transmitted to the supplier with a request for
corrective action. 5
(b) Filed for future use. 3
(c) Not maintained. 1
(8.7) Discrepant purchased materials are:
(a) Submitted to a review by Quality Control and
Chief Chemist for disposition. 5
(b) Submitted to Service Section for determination
on acceptability. 3
(c) Used because of scheduling requirements. 1
50
-------
A.8 PROCUREMENT AND INVENTORY PROCEDURES (continued)
SCORE
(8.8) Inventories are maintained on:
(a) First-in, first-out basis. 5
(b) Random selection in stock room. 3
(c) Last-in, first-out basis. 1
(8.9) Receiving of materials is:
(a) Documented in a receiving record log, giving a
description of the material, the date of receipt,
results of acceptance test, and the signature
of the responsible individual. 5
(b) Documented in a receiving record log with material
title, receipt date, and initials of the individual
logging the material in. 3
(c) Documented by filing a signed copy of the requisi-
tion. 1
(8.10) Inventories are:
(a) Identified as to type, age, and acceptance status. 5
(b) Identified as to material only. 3
(c) Not identified in writing. 1
(8.11) Reagents and chemicals which have limited shelf life are:
(a) Identified as to shelf life expiration date and
systematically issued from stock only if they
are still within that date. 5
(b) Issued on a first-in, first-out basis, expecting
that there is enough safety factor so that the
expiration date is rarely exceeded. 3
(c) Issued at random from stock. 1
51
-------
A.9 PERSONNEL TRAINING PROCEDURES
SCORE
(9.1) Training of new employees is accomplished by:
(a) A programmed system of training where elements of
training, including quality standards, are included
in a training checklist. The employee's work is
immediately rechecked by supervisors for errors or
defects and the information is fed back instanta-
neously for corrective action. 5
(b) On-the-job training by the supervisor who gives
an overview of quality standards. Details of
quality standards are learned as normal results
are fed back to the chemist. 3
(c) On-the-job learning with training on the rudi-
ments of the job by senior coworkers. 1
(9.2) When key personnel changes occur:
(a) Specialized knowledge and skills are retained in
the form of documented methods and descriptions. 5
(b) Replacement people can acquire the knowledge of
their predecessors from coworkers, supervisors,
and detailed study of the specifications and
memoranda. 3
(c) Knowledge is lost and must be regained through long
experience or trial-and-error. 1
(9.3) The people who have an impact on quality, e.g., cali-
bration personnel, maintenance personnel, bench chemists,
supervisors, etc., are:
(a) Trained in the reasons for and the benefits of
standards of quality and the methods by which
high quality can be achieved. 5
(b) Told about quality only when their work falls below
acceptable levels. 3
(c) Reprimanded when quality deficiencies are
directly traceable to their work. 1
52
-------
A.9 PERSONNEL TRAINING PROCEDURES (continued)
SCORE
(9.4) The employee's history of training accomplishments
is maintained through:
(a) A written record maintained and periodically
reviewed by the supervisor. 5
(b) A written record maintained by the employee. 3
(c) The memory of the supervisor/employee. 1
(9.5) Employee proficiency is evaluated on a continuing
basis by:
(a) Periodic testing in some planned manner with the
results of such tests recorded. 5
(b) Testing when felt necessary by the supervisor. 3
(c) Observation of performance by the supervisor. 1
(9.6) Results of employee proficiency tests are:
(a) Used by management to establish the need for and
type of special training. 5
(b) Used by the employee for self-evaluation of needs. 3
(c) Used mostly during salary reviews. 1
A.10 FEEDBACK AND CORRECTIVE ACTION
(10.1) A feedback and corrective action mechanism to assure
that problems are reported to those who can correct them
and that a closed loop mechanism is established to assure
that appropriate corrective actions have been taken is:
(a) Clearly defined in writing with individuals assigned
specific areas of responsibility. 5
(b) Written in general terms with no assignment of
responsibilities. 3
(c) Not formalized but left to the present supervisors/
managers. 1
53
-------
A.10 FEEDBACK AND CORRECTIVE ACTION (continued)
SCORE
(10.2) Feedback and corrective action activities are:
(a) Documented on standard forms. 5
(b) Documented in the station log book. 3
(c) Documented in the operator's/analyst's notebook. 1
(10.3) A review of corrective action records indicates that:
(a) Corrective actions were systematic, timely, and
fully documented. 5
(b) Corrective actions were not always systematic,
timely, or fully documented. 3
(c) A closed loop mechanism did not exist. 1
(10.4) Periodic summary reports on the status of corrective
action are distributed by the responsible individual to:
(a) All levels of management. 5
(b) One level of management only. 3
(c) The group generating the report only. 1
(10.5) The reports include:
(a) A listing of major problems for the reporting
period; names of persons responsible for correc-
tive actions; criticality of problems; due dates;
present status; trend of quality performance (i.e.,
response time, etc.); listing of items still open
from previous reports. 5
(b) Most of the above items. 3
(c) Present status of problems and corrective actions. 1
54
-------
A.11 CALIBRATION PROCEDURES
SCORE
(11.1) Calibration procedures are:
(a) Clearly defined and written out in step-by-step
fashion for each measurement system and support
device. 5
(b) Defined and summarized for each system and device. 3
(c) Defined but operational procedures developed by
the individual. 1
(11.2) Calibration procedures as written are:
(a) Judged to be technically sound and consistent with
data quality requirements. 5
(b) Technically sound but lacking in detail. 3
(c) Technically questionable and lacking in detail. 1
(11.3) Calibration standards are:
(a) Specified for all systems and measurement devices
with written procedures for assuring, on a con-
tinuing basis, traceability to primary standards. 5
(b) Specified for all major systems with written
procedures for assuring traceability to pri-
mary standards. 3
(c) Specified for all major systems but no procedures
for assuring traceability to primary standards. 1
(11.4) Calibration standards and traceability procedures as
specified and written are:
(a) Judged to be technically sound and consistent
with data quality requirements. 5
(b) Standards are satisfactory but traceability is
not verified frequently enough. 3
(c) Standards are questionable. 1
55
-------
A.11 CALIBRATION PROCEDURES (continued)
SCORE
(11.5) Frequency of calibration is:
(a) Established and documented for each measurement
system and support measurement device. 5
(b) Established and documented for each major meas-
urement system. 3
(c) Established and documented for only certain
measurement systems. 1
(11.6) A review of calibration data indicates that the
frequency of calibration as implemented:
(a) Is adequate and consistent with data quality
requirements. 5
(b) Results in limits being exceeded a small frac-
tion of the time. 3
(c) Results in limits being exceeded frequently. 1
(11.7) A review of calibration history indicates that:
(a) Calibration schedules are adhered to and results
fully documented. 5
(b) Schedules are adhered to most of the time. 3
(c) Schedules are frequently not adhered to. 1
(11.8) A review of calibration history and data validation
records indicates that:
(a) Data are always invalidated and deleted when
calibration criteria are exceeded. 5
(b) Data are not always invalidated and/or deleted
when criteria are exceeded. 3
(c) Data are frequently not invalidated and/or deleted
when criteria are exceeded. 1
56
-------
A.11 CALIBRATION PROCEDURES (continued)
SCORE
(11.9) Acceptability requirements for calibration results
are:
(a) Defined for each system and/or device requiring
calibration including elapsed time since the
last calibration as well as maximum allowable
change from the previous calibration. 5
(b) Defined for all major measurement systems. 3
(c) Defined for some major measurement systems only. 1
(11.10) Acceptability requirements for calibration results as
written are:
(a) Adequate and consistent with data quality require-
ments. 5
(b) Adequate but others should be added. 3
(c) Inadequate to ensure data of acceptable quality. 1
(11.11) Calibration records (histories) are:
(a) Utilized in revising calibration schedules (i.e.,
frequency). 5
(b) Utilized when specific questions arise and re-
viewed periodically for trends, completeness,
etc. 3
(c) Utilized only when unusual problems occur. 1
A.12 FACILITIES/EQUIPMENT
(12.1) Facilities/Equipment are:
(a) Adequate to obtain acceptable results. 5
(b) Adequate to obtain acceptable results most of
the time. 3
(c) Additional facilities and space are needed. 1
57
-------
A.12 FACILITIES/EQUIPMENT (continued)
SCORE
(12.2) Facilities, equipment, and materials are:
(a) As specified in appropriate documentation and/or
standards. 5
(b) Generally as specified in appropriate standards. 3
(c) Frequently different from specifications. 1
(12.3) Housekeeping reflects an orderly, neat, and
effective attitude of attention to detail in:
(a) All of the facilities. 5
(b) Most of the facilities. 3
(c) Some of the facilities. 1
(12.4) Maintenance Manuals are:
(a) Complete and readily accessible to maintenance
personnel for all systems, components, and
devices. 5
(b) Complete and readily accessible to maintenance
personnel for all major systems, components, and
devices. 3
(c) Complete and accessible for only a few of the
systems. 1
A.13 RELIABILITY
(13.1) Procedures for reliability data collection, processing,
and reporting are:
(a) Clearly defined and written for all system com-
ponents. 5
(b) Clearly defined and written for major components
of the system. 3
(c) Not defined. 1
58
-------
A.13 RELIABILITY (continued)
SCORE
(13.2) Reliability data are:
(a) Recorded on standard forms. 5
(b) Recorded as operator/analyst notes. 3
(c) Not recorded. 1
(13.3) Reliability data are:
(a) Utilized in revising maintenance and/or replace-
ment schedules. 5
(b) Utilized to determine optimum parts inventory. 3
(c) Not utilized in any organized fashion. 1
59
-------
APPENDIX B
STANDARD TECHNIQUES USED IN
QUANTITATIVE PERFORMANCE AUDITS
61
-------
Ambient air techniques (a)

Pollutant: SO2
  EPA method or number: 6
  Bias (absolute, or percent of mean concentration): 0
  Precision, within-laboratory: 5-13 µg/m3, from x = 0-1000 µg/m3
  Precision, between-laboratory: 10-25 µg/m3, from x = 0-1000 µg/m3
  Comments: Lower limit of detection is 25 µg/m3. Flow rate changes and
    sampling train leakage are primary error sources.
  Reference: EPA-R4-73-028d

Pollutant: NO2 (NO, NOx)
  EPA method or number: Chemiluminescent
  Bias: 0
  Precision, within-laboratory: 7-8% at 100 µg/m3 (0.05 ppm)
  Precision, between-laboratory: not given
  Comments: Lower limit of detection is 10 µg/m3 (0.005 ppm). Errors are
    associated with calibration and instrument drift (from zero and span
    settings).
  Reference: (d)

Pollutant: Photochemical oxidants
  EPA method or number: Chemiluminescent
  Bias: -35 to -15% from 0.05 to 0.50 ppm (c)
  Precision, within-laboratory: 0.0033 + 0.0255x (0-0.5 ppm) (c)
  Precision, between-laboratory: 0.0008 + 0.0355x (0-0.5 ppm);
    -0.0051 + 0.0690x (0.15-0.5 ppm)
  Comments: Lower detection limit is 0.0065 ppm.
  Reference: EPA-R4-73-028c

Pollutant: CO
  EPA method or number: NDIR
  Bias: +2.5
  Precision, within-laboratory: 0.6 mg/m3
  Precision, between-laboratory: 0.8-1.6 mg/m3 (nonlinear variation)
    over 0-60 mg/m3
  Comments: Lower detection limit is 0.3 mg/m3. Interference of water
    vapor is significant.
  Reference: EPA-R4-73-028a

62
-------
Ambient air techniques (continued)

Pollutant: Particulates
  EPA method or number: High-Volume
  Bias (absolute, or percent of mean concentration): No information
  Precision, within-laboratory: 3%
  Precision, between-laboratory: 3.7%
  Comments: Minimum detectable limit is 3 mg. Shorter sampling periods
    give less precise results, biased high.
  Reference: EPA-R4-73-028b

Pollutant: NO2
  EPA method or number: Arsenite
  Bias: -3% (50-300 µg/m3)
  Precision, within-laboratory: 8 µg/m3 (50-300 µg/m3)
  Precision, between-laboratory: 11 µg/m3 (50-300 µg/m3)
  Comments: A tentative method. Lower detectable limit is 9 µg/m3.
  Reference: EPA-R4-73-028o

(a) This table is a summary of information contained in the cited
references, all of which are quality assurance guideline manuals
published by EPA. Collaborative test results are cited, if available,
in the manuals.
(b) x = pollutant concentration.
(c) EPA-650/4-75/016.
(d) Guidelines for Development of a Quality Assurance Program for the
Continuous Measurement of Nitrogen Dioxide in the Ambient Air
(Chemiluminescent), Smith & Nelson, Research Triangle Institute,
Research Triangle Park, N.C. 27709.
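The oxidant precision entries in the table above are linear functions of the mean concentration x. A short sketch of how such an entry is evaluated; the example concentration of 0.4 ppm is our illustrative choice, not a value from the manuals:

```python
# Standard deviations (ppm) for the chemiluminescent oxidant method,
# expressed as linear functions of the mean concentration x (ppm),
# per the table entries (valid over 0-0.5 ppm).
def within_lab_sd(x):
    return 0.0033 + 0.0255 * x

def between_lab_sd(x):
    return 0.0008 + 0.0355 * x

x = 0.4  # example mean concentration (ppm); our choice for illustration
print(f"x = {x} ppm: within-lab sd {within_lab_sd(x):.4f} ppm, "
      f"between-lab sd {between_lab_sd(x):.4f} ppm")
```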
-------
Source Sampling Techniques (a)

Pollutant: SO2
  EPA method or number: 6
  Bias (absolute, or percent of mean concentration): 0
  Precision, within-laboratory: 3.9
  Precision, between-laboratory: 5.5
  Comments: Major error source is difficulty of obtaining reproducible
    titration endpoints. Minimum detectable limit is 3 ppm.
  Reference: EPA-650/4-74-005-e

Pollutant: SO2 and SO3/H2SO4
  EPA method or number: 8
  Bias: -2% (analysis only); -2% (analysis only)
  Precision, within-laboratory: 0.1 g/m3; 60%
  Precision, between-laboratory: 0.11 g/m3; 65%
  Comments: Same analysis technique as Method 6 above.
  Reference: EPA-650/4-74-005-g

Pollutant: NOx
  EPA method or number: 7
  Bias: 0
  Precision, within-laboratory: 7%
  Precision, between-laboratory: 10%
  Comments: Grab sample; largest error source is failure to recalibrate
    spectrophotometer.
  Reference: EPA-650/4-74-005-f

Pollutant: CO
  EPA method or number: 10
  Bias: +7 ppm
  Precision, within-laboratory: 13 ppm
  Precision, between-laboratory: 25 ppm
  Comments: Analyzer drift and CO2 interference are largest problems.
    Minimum detectable limit is 20 ppm.
  Reference: EPA-650/4-74-005-h

Pollutant: Particulates
  EPA method or number: 5
  Bias: No information
  Precision, within-laboratory: 10-30%
  Precision, between-laboratory: 20-40%
  Comments: Numerous small error sources associated with stack sampling.
  Reference: EPA-650/4-74-005-d

Pollutant: Visible emissions
  EPA method or number: 9
  Bias: +1.4% opacity
  Precision, within-laboratory: 2% opacity
  Precision, between-laboratory: 2.5%
  Comments: Good results depend to a great extent on the effective
    training of observers.
  Reference: EPA-650/4-74-005-i

Pollutant: Be
  EPA method or number: 104
  Bias: -20%, average
  Precision, within-laboratory: 44%
  Precision, between-laboratory: 58%
  Reference: EPA-650/4-74-005-k

(a) This table is a summary of information contained in the cited
references, all of which are quality assurance guidelines manuals.

64
-------
APPENDIX C
COMPARISON OF ANALYSES
OF LIMESTONE SLURRY
65
-------
APPENDIX C COMPARISON OF ANALYSES OF LIMESTONE SLURRY
Cooperating laboratories were:
1. TVA Power Service Center Laboratory, Chattanooga, Tennessee -
Mr. John Rose, contact
2. TVA Power Service Center Laboratory, Muscle Shoals, Alabama -
Dr. Guerry McClellon, contact
3. Research Triangle Institute, Research Triangle Park, North
Carolina - Dr. D. E. Wagoner, contact
Results from RTI laboratories are presented in two sections. One set of
data was obtained on slurry which was filtered at the Shawnee Laboratory. The
second set of data results from analysis of samples filtered in the RTI
laboratory.
The first eight matrixes present results for each element: calcium, mag-
nesium, and total sulfur in the solid; and calcium, magnesium, sodium, potas-
sium, and chloride in the liquid. The next five matrixes give results of all
analyses for each laboratory, with Shawnee results listed first. The last
four matrixes break down the total sulfur and calcium analyses into results by
a standard "wet" technique, by X-ray fluorescence using Shawnee standard values,
and by X-ray fluorescence using RTI-derived standard values on the Shawnee
standard material.
66
-------
Table 1. Analysis for calcium in slurry solid
Sample:
Ca (as CaO)
Wt %
Laboratory
RTI
(Shawnee filtered)
RTI
(RTI filtered)
Chattanooga
(TVA)
Muscle Shoals
(TVA)
-------
Table 2. Analysis for magnesium in slurry solid
Sample: Mg (as MgO), Wt %

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                   0.30    0.29    0.28    0.27    0.29
RTI (Shawnee filtered)    0.43    0.48    0.39    0.36    0.42
RTI (RTI filtered)        0.61    0.52    0.34    0.39    0.35
Chattanooga (TVA)         0.65    0.56    0.56    0.61    0.61
Muscle Shoals (TVA)       0.25    0.25    0.24    0.25    0.24

68
-------
Table 3. Analysis for total sulfur in slurry solid
Sample: TS (as SO3), Wt %

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                  34.78   34.16   31.22   28.17   31.64
RTI (Shawnee filtered)   36.70   32.00   33.03   30.53   32.80
RTI (RTI filtered)       28.60   27.00   31.13   28.78   32.03
Chattanooga (TVA)        30.5    30.7    30.0    29.2    29.6
Muscle Shoals (TVA)      31.3    30.9    31.4    29.7    30.4

69
-------
Table 4. Analysis for calcium in slurry filtrate
Sample: Ca (ppm)

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                   1720    1710    1810    2090    2315
RTI (Shawnee filtered)    1775    1825    1810    1708    1885
RTI (RTI filtered)        1700    1810    1730    1720    1825
Chattanooga (TVA)         1756    1740    1676    1596    1732
Muscle Shoals (TVA)       1787    1787    1716    1787    1716
-------
Table 5. Analysis for magnesium in slurry filtrate
Sample: Mg (ppm)

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                    733     699     662     691     698
RTI (Shawnee filtered)     785     945     813    1000    1115
RTI (RTI filtered)         730     805     805     795     770
Chattanooga (TVA)          768     734     763     780     816
Muscle Shoals (TVA)        724     724     724     724     784
-------
Table 6. Analysis for sodium in slurry filtrate
Sample: Na (ppm)

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                     71      57      70      73      82
RTI (Shawnee filtered)      71      69      79      75      77
RTI (RTI filtered)         153     161     180     176     145
Chattanooga (TVA)           66      62      64      66      69
Muscle Shoals (TVA)         41      37      37      41      41

72
-------
Table 7. Analysis for potassium in slurry filtrate
Sample: K (ppm)

                              11/18/75              11/19/75
Laboratory                1100    1500    2300    1100    2300
Shawnee                    118     121     123     126     153
RTI (Shawnee filtered)     103     101     105     108     111
RTI (RTI filtered)         116     102     114     118     116
Chattanooga (TVA)          107      96     107     116     116
Muscle Shoals (TVA)         58      54      50      58      58

73
-------
Table 8. Analysis for chloride in slurry filtrate

Cl (ppm)

                              11/18/75              11/19/75
Laboratory                1100   1500   2300      1100   2300
Shawnee                   3651   3580   3545      3545   3580
RTI (Shawnee filtered)    3697   3700   3855      3660   3987
RTI (RTI filtered)        3621   3638   3754      3519   3566
Chattanooga (TVA)         3692   3543   3571      3571   3628
Muscle Shoals (TVA)       3800   3700   3600      3600   3700
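The interlaboratory spread in tables like these is typically summarized as a mean and percent relative standard deviation (%RSD) per sampling time. As an illustrative sketch (not a computation taken from the report itself), the chloride results of Table 8 can be summarized in a few lines of Python:

```python
# Illustrative sketch: mean and %RSD across the five laboratories for
# each sampling time, using the chloride (ppm) results of Table 8.
from statistics import mean, stdev

# Rows: Shawnee; RTI (Shawnee filtered); RTI (RTI filtered);
# Chattanooga (TVA); Muscle Shoals (TVA).
# Columns: 11/18/75 1100, 1500, 2300; 11/19/75 1100, 2300.
chloride_ppm = [
    [3651, 3580, 3545, 3545, 3580],
    [3697, 3700, 3855, 3660, 3987],
    [3621, 3638, 3754, 3519, 3566],
    [3692, 3543, 3571, 3571, 3628],
    [3800, 3700, 3600, 3600, 3700],
]

for i, column in enumerate(zip(*chloride_ppm), start=1):
    m = mean(column)
    rsd = 100 * stdev(column) / m  # sample standard deviation
    print(f"sampling time {i}: mean = {m:.0f} ppm, %RSD = {rsd:.1f}")
```

For the first sampling time this gives a mean near 3692 ppm with a %RSD under 2 percent, which is the kind of agreement figure a quality assurance program would track.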
Table 9. Laboratory: Shawnee (TVA)

                     11/18/75                11/19/75
                 1100    1500    2300      1100    2300
SOLID (Wt. %)
Ca (CaO)        24.73   19.03   22.78     21.05   22.87
Mg (MgO)         0.30    0.48    0.28      0.27    0.29
TS (SO3)        34.78   32.00   31.22     28.17   31.64
LIQUID (ppm)
Ca               1720    1710    1810      2090    2315
Mg                733     699     662       691     698
Na                 71      57      70        73      82
K                 118     121     123       126     153
Cl               3651    3580    3545      3545    3580
Table 10. Laboratory: RTI (Shawnee filtered)

                     11/18/75                11/19/75
                 1100    1500    2300      1100    2300
SOLID (Wt. %)
Ca (CaO)        22.12   19.03   19.10     18.41   19.46
Mg (MgO)         0.43    0.48    0.39      0.36    0.42
TS (SO3)        36.7    32.00   33.03     30.53   32.80
LIQUID (ppm)
Ca               1775    1825    1810      1708    1885
Mg                785     945     913      1000    1115
Na                 71      69      79        75      77
K                 103     101     105       108     111
Cl               3697    3700    3855      3660    3987
Table 11. Laboratory: RTI (RTI filtered)

                     11/18/75                11/19/75
                 1100    1500    2300      1100    2300
SOLID (Wt. %)
Ca (CaO)        17.99   19.96   18.63     18.13   19.32
Mg (MgO)         0.61    0.52    0.34      0.39    0.35
TS (SO3)        28.6    27.0    31.13     28.78   32.03
LIQUID (ppm)
Ca               1700    1810    1730      1720    1825
Mg                730     805     805       795     770
Na                153     101     180       176     145
K                 116     102     114       118     116
Cl               3621    3638    3754      3519    3566
Table 12. Laboratory: Muscle Shoals (TVA)

                     11/18/75                11/19/75
                 1100    1500    2300      1100    2300
SOLID (Wt. %)
Ca (CaO)        23.6    23.1    23.6      22.6    22.9
Mg (MgO)         0.25    0.25    0.24      0.25    0.24
TS (SO3)        31.3    30.9    31.4      29.7    36.4
LIQUID (ppm)
Ca               1787    1787    1716      1787    1716
Mg                724     724     724       724     784
Na                 41      37      37        41      41
K                  58      54      50        58      58
Cl               3800    3700    3600      3600    3700
Table 13. Laboratory: Chattanooga (TVA)

                     11/18/75                11/19/75
                 1100    1500    2300      1100    2300
SOLID (Wt. %)
Ca (CaO)        23.45   23.24   23.02     22.46   22.46
Mg (MgO)         0.65    0.56    0.56      0.61    0.61
TS (SO3)        30.5    30.7    30.0      29.2    29.6
LIQUID (ppm)
Ca               1756    1740    1676      1596    1732
Mg                768     734     763       780     816
Na                 66      62      64        66      67
K                 107      96     107       116     116
Cl               3692    3543    3571      3571    3628
Table 14. Total sulfur determinations, Shawnee filtered samples analyzed at RTI

wt % (as SO3)

                                    11/18/75                11/19/75
                                1100    1500    2300      1100    2300
By BaCl2 precipitation          36.7    32.00   33.03     30.53   32.80
By X-ray fluorescence*          22.08   21.25   23.15     27.75   22.95

Shawnee X-ray standard (as SO3)*: Shawnee given, 28.45; RTI determined, 28.38.

* Shawnee and RTI TS determinations on the XRF standard were virtually identical, so the Shawnee TS value only was used in calculating wt % TS in each sample.
Table 15. Total sulfur determinations, RTI filtered and analyzed samples

wt % (as SO3)

                                    11/18/75                11/19/75
                                1100    1500    2300      1100    2300
By BaCl2 precipitation          28.6    27.00   31.13     28.78   32.03
By X-ray fluorescence           30.28   29.8    34.46
Table 16. Calcium determinations, Shawnee filtered samples analyzed at RTI

wt % (as CaO)

                                        11/18/75                11/19/75
                                    1100    1500    2300      1100    2300
By AA                               22.12   19.03   19.60     18.41   19.46
By X-ray fluorescence
  Using Shawnee standard for CaO    23.13   25.41   22.25     25.76   22.89
  Using RTI-determined CaO          20.09   22.06   19.32     22.37   19.88

Shawnee X-ray standard (as CaO): Shawnee given, 25.41; RTI determined, 22.06.
Table 17. Calcium determinations, RTI filtered and analyzed samples

wt % (as CaO)

                                        11/18/75                11/19/75
                                    1100    1500    2300      1100    2300
By AA                               17.99   17.96   18.63     18.13   19.32
By X-ray fluorescence
  Using Shawnee standard for CaO    28.35   25.52   26.17     27.68   26.32
  Using RTI-determined CaO          24.61   22.16   22.72     24.04   22.85
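In Tables 16 and 17, each "RTI-determined" XRF entry is a constant multiple (about 0.868) of the paired "Shawnee standard" entry, the same ratio as the two assigned standard values (22.06/25.41). That is what single-point XRF calibration predicts: the reported concentration scales linearly with the value assigned to the calibration standard. A minimal sketch, assuming simple proportional (single-point) calibration, which the report does not spell out:

```python
# Assumed single-point XRF calibration (illustrative; the report does not
# state the calculation): concentration is taken as proportional to the
# measured intensity, scaled by the value assigned to the one standard.

def xrf_wt_percent(sample_intensity, standard_intensity, standard_wt_pct):
    """Single-point calibration: wt % proportional to intensity ratio."""
    return sample_intensity / standard_intensity * standard_wt_pct

SHAWNEE_STD_CAO = 25.41  # wt % CaO assigned by Shawnee (Table 16)
RTI_STD_CAO = 22.06      # wt % CaO determined by RTI (Table 16)

# Hypothetical intensity ratio, chosen so the Shawnee-standard result
# reproduces the first XRF entry of Table 17 (28.35 wt %).
intensity_ratio = 28.35 / SHAWNEE_STD_CAO

using_shawnee = xrf_wt_percent(intensity_ratio, 1.0, SHAWNEE_STD_CAO)
using_rti = xrf_wt_percent(intensity_ratio, 1.0, RTI_STD_CAO)
print(round(using_shawnee, 2), round(using_rti, 2))  # 28.35 24.61
```

The second printed value matches the first "RTI-determined" entry of Table 17 (24.61), showing that the two XRF rows differ only by the rescaling of the standard.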
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
EPA-600/2-76-083
2.
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
Development and Trial Field Application of a Quality
Assurance Program for Demonstration Projects
5. REPORT DATE
March 1976
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
8. PERFORMING ORGANIZATION REPORT NO.
James Buchanan
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Research Triangle Institute
P.O. Box 12194
Research Triangle Park, NC 27709
10. PROGRAM ELEMENT NO.
EHB-557; ROAP ABA-011
11. CONTRACT/GRANT NO.
68-02-1398, Task 20
12. SPONSORING AGENCY NAME AND ADDRESS
EPA, Office of Research and Development
Industrial Environmental Research Laboratory
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED
Task Final: 7-12/75
14. SPONSORING AGENCY CODE
EPA-ORD
15. SUPPLEMENTARY NOTES Project officer for this report is L.D. Johnson, Mail Drop 62,
Ext 2557.
16. ABSTRACT
The report outlines results of a project: to develop a set of quality assurance guidelines for EPA demonstration projects; to implement a short-term quality assurance program at the EPA wet limestone scrubber facility at the Shawnee steam/electric plant; and to modify the guidelines in light of the Shawnee operating experience. The set of quality assurance guidelines and detailed results of the Shawnee program are included in two other reports prepared during the project.
17.
KEY WORDS AND DOCUMENT ANALYSIS
DESCRIPTORS
Air Pollution
Quality Assurance
Quality Control
Scrubbers
Limestone
Flue Gases
Industrial Processes
Instruments
Sulfur Dioxide
Dust
Sampling
Weight Measurement
b. IDENTIFIERS/OPEN ENDED TERMS
Air Pollution Control
Stationary Sources
Field Application
Control Laboratory
Particulate
c. COSATI Field/Group
13B
13H, 14D, 14B
07B
07A, 11G
08G
21B
18. DISTRIBUTION STATEMENT
Unlimited
19. SECURITY CLASS (This Report)
Unclassified
20. SECURITY CLASS (This page)
Unclassified
21. NO. OF PAGES
84
22. PRICE
EPA Form 2220-1 (9-73)