EPA 600/2-76-081
March 1976
Environmental Protection Technology Series
                      GUIDELINES  FOR  DEMONSTRATION
             PROJECT  QUALITY ASSURANCE  PROGRAMS
                                    Industrial Environmental Research Laboratory
                                         Office of Research and Development
                                        U.S. Environmental Protection Agency
                                   Research Triangle Park, North Carolina 27711

-------
                RESEARCH REPORTING SERIES

 Research reports of the Office of Research and Development, U.S. Environmental
 Protection Agency, have been grouped into five series. These five broad
 categories were established to facilitate further development and application of
 environmental technology. Elimination of traditional grouping was consciously
 planned to foster technology transfer and  a maximum interface in related fields.
 The five series are:

     1.    Environmental Health Effects Research
     2.    Environmental Protection Technology
     3.    Ecological Research
     4.    Environmental Monitoring
     5.    Socioeconomic Environmental Studies

 This report has  been assigned to  the ENVIRONMENTAL PROTECTION
 TECHNOLOGY series. This series describes research performed to develop and
 demonstrate instrumentation,  equipment, and methodology to repair or prevent
 environmental degradation from point and non-point sources of pollution.  This
 work provides the new or improved technology required for the control and
 treatment of pollution sources to meet environmental quality standards.
                    EPA REVIEW NOTICE

This report has been reviewed by  the U.S.  Environmental
Protection Agency, and  approved for publication.  Approval
does not signify that the contents necessarily reflect the
views and policy of the Agency, nor does mention of trade
names or commercial products constitute endorsement or
recommendation for use.
This document is available to the public through the National Technical Informa-
tion Service, Springfield, Virginia 22161.

-------
                                EPA-600/2-76-081
                                March 1976
    GUIDELINES FOR DEMONSTRATION  PROJECT

         QUALITY  ASSURANCE  PROGRAMS
                     by

               James  Buchanan
        Research  Triangle  Institute
               P.O.  Box  12194
     Research Triangle Park, NC  27709
      Contract No.  68-02-1398, Task 20
              ROAP  No.  ABA-011
        Program Element No.  EHB-557
   EPA Project Officer:  Larry  D. Johnson

Industrial  Environmental  Research  Laboratory
  Office of Energy,  Minerals,  and  Industry
     Research Triangle  Park, NC  27711
                Prepared  for


    U.S.  ENVIRONMENTAL  PROTECTION AGENCY
     Office  of Research and Development
          Washington,  DC 20460

-------
                         ACKNOWLEDGMENTS

     The work on this project was performed by the Systems and Measure-
ments Division of the Research Triangle Institute.  Mr.  Frank Smith,
Supervisor, Quality Assurance Section, served as the Project Leader.
Dr. James Buchanan of the Quality Assurance Section was  responsible for
the coordination of the program.  Institute staff members Dr. D.  E.
Wagoner and Mr. Larry Hackworth, analytical chemists, Mr. Leon Bissette,
an electrical engineer, and Dr. Buchanan, a physical chemist, were
major contributors to the program.  Project Officer for  the Environ-
mental Protection Agency was Dr. L. D. Johnson of the Process Measure-
ments Branch of the Industrial Environmental Research Laboratory.  The
Research Triangle Institute acknowledges the cooperation and assistance
of the Project Officer and Dr. R. Statnick of the Process Measurements
Branch.  The Institute also appreciates the assistance and guidance
provided by Mr. John Williams, the EPA Project Officer for the Shawnee
wet limestone scrubber demonstration.  Finally, gratitude is extended
to Mr. Joe Barkley and Mr. Ken Metcalf of TVA and Mr. Dewey Burbank of
Bechtel Corporation for their cooperation at the Shawnee test site.
                                 iii

-------
                        TABLE OF CONTENTS

SECTION                                                         PAGE
1.0  INTRODUCTION                                                 1
2.0  MAJOR COMPONENTS OF A QUALITY CONTROL PROGRAM                3
     2.1  QUALITY ASSURANCE ASPECTS OF THE RFP                    3
     2.2  EVALUATION OF QUALITY CONTROL IN THE PROPOSAL           4
     2.3  EVALUATION OF QUALITY CONTROL IN THE WORK PLAN          6
     2.4  MANAGEMENT COMMITMENT TO QUALITY CONTROL                7
     2.5  QUALITY CONTROL IN THE ORGANIZATIONAL STRUCTURE         8
     2.6  ASSESSMENT OF QUALITY CONTROL REQUIREMENTS              8
     2.7  SPECIFIC AREAS OF CONCERN FOR DEMONSTRATION             9
          PROJECT QUALITY CONTROL PROGRAMS
          2.7.1  FACILITIES AND EQUIPMENT                         9
          2.7.2  CONFIGURATION CONTROL                           10
          2.7.3  PERSONNEL TRAINING                              10
          2.7.4  DOCUMENTATION CONTROL                           11
          2.7.5  CONTROL CHARTS                                  12
          2.7.6  IN-PROCESS QUALITY CONTROL                      13
          2.7.7  PROCUREMENT AND INVENTORY PROCEDURES            13
          2.7.8  PREVENTIVE MAINTENANCE                          14
          2.7.9  RELIABILITY                                     14
          2.7.10 DATA VALIDATION                                 14
          2.7.11 FEEDBACK AND CORRECTIVE ACTION                  15
          2.7.12 CALIBRATION PROCEDURES                          16
                                iv

-------
                    TABLE OF CONTENTS (CONT.)

SECTION                                                        PAGE
3.0  GUIDELINES FOR DEMONSTRATION PROJECT QUALITY ASSUR-         17
     ANCE PROGRAMS
     3.1   GENERAL STATEMENTS                                     17
     3.2   THE ON-SITE QUALITATIVE SYSTEMS REVIEW                 17
     3.3   THE PERFORMANCE AUDIT                                  17
     3.4   MATERIAL BALANCES                                      18
     3.5   ASSESSMENT OF DATA QUALITY                             18
4.0  ASSESSMENT AND MODIFICATION OF THE ONGOING QUALITY          21
     ASSURANCE PROGRAM
     APPENDIX A - QUALITATIVE AUDIT CHECKLIST FOR DEMON-         22
                  STRATION PROJECTS
     APPENDIX B - STANDARD TECHNIQUES USED IN QUANTITA-          46
                  TIVE PERFORMANCE AUDITS

-------
                        LIST OF FIGURES

    1  STANDARD QUALITY CONTROL CHART.                          12
                  LIST OF TABLES IN APPENDIX B

TABLE NO.
    1  PHYSICAL MEASUREMENTS                                    47
    2  GAS EFFLUENT STREAMS                                     49
    3  LIQUID STREAMS, SUSPENDED SOLIDS                         51
                               vi

-------
1.0  INTRODUCTION
     The major objective of this project was to develop a general quality
assurance (QA) program for EPA demonstration projects, using the wet lime-
stone scrubber facility at the Shawnee steam plant, Paducah, Kentucky, as an
example project.  A second objective was to field test the QA program at the
Shawnee facility and carry out whatever modifications were necessary in light
of that field trial.
     In order to facilitate the following discussion it is useful to define
three terms; namely, quality, quality control (QC) and quality assurance.  It
is sometimes difficult to distinguish QC from QA unless the proper definitions
are kept in mind.  The following is taken from the handbook, Glossary and
Tables for Statistical Quality Control.*
     1.   Quality:  The totality of features and characteristics of a product
or service (such as measurement data) that bear on its ability to satisfy a
given need.
     2.   Quality control:  The overall system of activities whose purpose is
to provide a quality of product or service that meets the needs of users; also,
the use of such a system.
          The aim of quality control is to provide quality that is satisfactory,
adequate, dependable, and economic.  The overall system involves integrating
the quality aspects of several related steps, including the proper specification
of what is wanted; production to meet the full intent of the specification;
inspection to determine whether the resulting product or service is in accordance
with the specification; and review of usage to provide for revision of specifica-
tion.
     3.   Quality assurance:  A system of activities whose purpose is to provide
assurance that the overall quality control job is in fact being done effectively.
The system involves a continuing evaluation of the adequacy and effectiveness of
the overall quality control program with a view to having corrective measures
initiated where necessary.  For a specific product or service, this involves
verifications, audits, and the evaluation of the quality factors that affect
the specification, production, inspection, and use of the product or service.
*American Society for Quality Control, Milwaukee, Wisconsin 53203 (1973).

-------
     In summation, the purpose of QA is to independently assess QC.  This
assessment of QC should take place in two ways.  Reviews and performance
audits should be conducted by the QC organization itself (an internal inspec-
tion program), and in addition there should be periodic assessments by an
independent, outside organization.
     It is appropriate to delineate the elements of a comprehensive demonstra-
tion project QC program.  These elements must be evaluated, point-by-point,
when QA is integrated into the project.
     The organization of this report, then, is as follows:  section 2.0 dis-
cusses the major components of a QC program, including EPA's responsibility
with respect to the project RFP and the contractor's responsibility with
respect to the proposal and the work plan; section 3.0 covers the recommended
QA program, including the on-site systems review and the quantitative perform-
ance audit.

-------
2.0  MAJOR COMPONENTS OF A QUALITY CONTROL PROGRAM

2.1  Quality Assurance Aspects of the RFP
     The design of the RFP is predicated on stating as clearly as possible
what the objectives of the project are; e.g., to design, construct and main-
tain a given control system, systematically examining the interaction of appro-
priate system parameters.  The quality of the data obtained from the project
will depend upon numerous factors—instrumentation, personnel, sampling tech-
nique, sample size, statistical expertise.  It is therefore critical that the
RFP be as explicit as possible* in delineating two things—what quality data
are expected, and how that quality is to be insured.
     The following kinds of questions should be answered in the RFP:  Is the
contractor required to estimate just precision or both precision and bias of
his estimates?  What number of samples is required for statistical analysis?
If confidence limits or tests of significance are required, it may be appro-
priate at this point to state the significance level he is to use.  Are control
charts to be used?  If an experimental design is required, what factors is the
contractor to consider (at a minimum) in designing his experiment and what
other factors might be considered if time and funds are sufficient?  What
instruments should the contractor use (if it is reasonable to specify these)
and what time unit(s) should be used for analysis?  When is the contractor
required to analyze his data and give the results of this analysis (e.g., only
at the end of the project or throughout the project)?  How exactly is the
contractor to express precision and/or bias and for which estimates?  Some or
all of these items may, of necessity, be suggested by the bidder.
     Since RFPs are limited in length, it would, in most cases, be inappropriate
to include more than a brief (one or two paragraph) statement of QC require-
ments.  Nevertheless, it is most important that the bid solicitation be as
explicit as possible concerning QC.  For illustrative purposes, a sample
statement appropriate to a wet limestone sulfur dioxide scrubbing demonstra-
tion project is given below:
*It is understood that, because of the nature of the proposed work, it may
 not be possible to specify either the expected data quality or the way in
 which data quality is to be assured.

-------
     "Precision and bias estimates, based on statistically significant sample
sizes, must be provided for such critical process parameters as temperature,
pressure, flow rate and chemical composition in both the liquid slurry and
gas effluent streams, measured at scrubber inlet and outlet.  These estimates
should generally be set at the 90 per cent confidence level.  Control charts
must be used continuously for each critical parameter.
     Interaction of major process parameters must be determined by a factorial
or partial factorial method.  Instrumentation must be appropriate for each
measurement; e.g., determination of calcium and magnesium in slurry solids
must be by a method yielding precision and accuracy comparable to that of
x-ray fluorescence.  Standards must be analyzed by at least two independent
outside laboratories, and agree within 5 per cent of the mean value.  Control
samples must be used at appropriate (cost-effective) intervals to serve as
internal audits of the analytical procedures.  Proposals must provide a
statistically sound plan, in detail, for implementing the quality assurance
program as outlined above, and must provide for monthly data reports which
include statistical treatment of current data."
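     The quantitative requirements in the sample statement above can be sketched
numerically.  The following Python fragment is a minimal illustration, using
hypothetical assay values, of the 5 per cent inter-laboratory agreement rule and
of a two-sided 90 per cent confidence interval for a measured parameter; the
Student's t critical value for n - 1 degrees of freedom is taken from a standard
table.

```python
import statistics
from math import sqrt

def within_percent_of_mean(values, tolerance_pct=5.0):
    """Check that every laboratory result lies within a given
    percentage of the mean of all results (the 5 per cent rule)."""
    mean = statistics.mean(values)
    return all(abs(v - mean) <= tolerance_pct / 100.0 * mean for v in values)

def confidence_interval_90(values, t_value):
    """Two-sided 90 per cent confidence interval for the mean.
    t_value is the Student's t critical value for n - 1 degrees
    of freedom at the 90 per cent level (from a standard table)."""
    n = len(values)
    mean = statistics.mean(values)
    s = statistics.stdev(values)        # sample standard deviation
    half_width = t_value * s / sqrt(n)
    return mean - half_width, mean + half_width

# Hypothetical calcium assays (per cent by weight) from outside labs:
assays = [12.1, 12.4, 11.9, 12.3, 12.2]
print(within_percent_of_mean(assays))                  # prints True
print(confidence_interval_90(assays, t_value=2.132))   # n = 5, df = 4
```

The numbers above are illustrative only; in practice the tolerance, confidence
level, and sample values would come from the RFP and the laboratory data.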
2.2  Evaluation of Quality Control in the Proposal
     The proposal should contain a statement as to the precise position the
bidder's company takes regarding quality control programs.  This should cover
past projects and the effectiveness of the quality control program in each.
In particular, there should be a clear and explicit response to the QC require-
ments stated in the RFP.  This response must be compared directly, item-by-
item, with other proposals submitted against the RFP.  The evaluation should
result in a determination of a "figure of merit" for the bidder's quality
control organization and the competence of the staff.  For a project requiring
a high degree of professionalism, this may require an on-site audit of the
facility and personnel as part of EPA's QA program for that contract/project.
Such audits should enhance the overall understanding of the submitted proposal
as it relates to the requirements of the RFP.
     The bid should contain a list of reports on previous work which will allow,
if desired, an investigation of how the contractor has performed in producing
good data quality and presenting the data in a precise and meaningful manner.
Investigation of the above reports should include an examination of the use
of appropriate statistical techniques, such as analysis of variance and regres-
sion.  Past experience of the contractor in designing and analyzing experiments
will be an indication of his expertise in carrying out a test program.

-------
     The bidder's proposal should directly address the data accuracy require-
ments contained in the RFP.  It should provide qualitative evidence as to his
ability to meet the accuracy requirements.  These statements should include
methods of statistical analysis to be employed to provide the degree of confi-
dence necessary for the work.  His statement should also include formats for
the presentation of raw data, report data, and data derived from data analysis.
     The contractor should indicate how he intends to insure good quality data
(control charts, calibration procedures, etc.).  What sample sizes are his
statistics to be based upon and what is his justification for these sample
sizes?
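     One way a bidder might justify a sample size is the normal-theory relation
n = (z s / E)**2, which gives the smallest number of observations for which the
confidence-interval half-width z s / sqrt(n) does not exceed a target error E.
The sketch below is illustrative only; it assumes a pilot estimate of the
standard deviation and uses z = 1.645 for the 90 per cent two-sided level.

```python
from math import ceil

def required_sample_size(sigma_estimate, max_error, z=1.645):
    """Smallest n such that the normal-theory confidence-interval
    half-width, z * sigma / sqrt(n), does not exceed max_error.
    z = 1.645 corresponds to the two-sided 90 per cent level;
    sigma_estimate would come from prior or pilot data."""
    n = (z * sigma_estimate / max_error) ** 2
    return ceil(n)

# e.g. pilot sigma of 2.0 units, desired half-width of 0.5 units:
print(required_sample_size(2.0, 0.5))   # prints 44
```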
     Procedures should be defined for day-to-day recordkeeping and data analy-
sis which will permit prompt discovery of out-of-tolerance data, investigation
into the cause of the problem and formal reporting to the Project Officer.
     Document control, covering test specifications and procedures, sampling
devices, sampling instrumentation, calibration equipment, maintenance and
calibration procedures, and other related documents, should be indicated in
the proposal.  The control should include a center for issuing new documents
and retrieving out-of-date documents, with spot checks on the documents in
current usage.
     The bidder should furnish a statement regarding the procurement procedures
to be followed in purchasing equipment and supplies directly relating to those
items required for the work to be done.  He should also specify the controls to
be used to insure receipt, storage and use of the purchased items.
     The proposal should include a personnel program outline for the training
and evaluation of skills required to complete the work.  The above program
should be based on a periodic review and evaluation of performance.
     There should be provision for changes in procedures when it is evident
that data being obtained is not sufficiently accurate or appropriate for the
intent of the project as outlined by the Project Officer.
     If a contractor has a good proposal but is unclear on some phases of data
quality, it would seem worthwhile to have him clarify his proposal by asking
him to answer specific questions.  If the answers to these questions are still
vague, it is a good indication that the quality for these phases of the project
may be questionable were this contractor to carry out the project.

-------
2.3  Evaluation of Quality Control in the Work Plan
     The work plan should be a detailed accounting of the actual steps to be
taken to complete the work delineated in the proposal and should be in direct
accord with the requirements of the RFP and other agreements with the Project
Officer.  Particular attention should be placed on the areas discussed below
in order to realize the collection of data having acceptable precision, accu-
racy, representativeness and completeness.
     In cases where the submitted proposal has been accepted but lacks the
completeness required by the Project Officer, the results of final negotiations
to resolve the problem areas should be directly addressed in the work plan,
which shows the details of the work to be done.
     The work plan must be submitted to the Project Officer before any work is
begun by the contractor.  The plan can be accepted in draft form, which will
allow for minor changes prior to the final plan's acceptance and approval
(see RFP criteria).
     There should be a description of the layout of all equipment and instru-
mentation to be used for measurement and calibration and their specific use in
the program.  The layout should show the positions of measurement probes and
sensors with a discussion of the rationale used to warrant the positions chosen.
Operating instructions (test procedures) for each of the above items and their
supporting equipment should be included.  These procedures must show expected
and acceptable limits of measurement accuracies including the bias of each
item and its own built-in tolerances and errors resulting from calibration
deficiencies.  Maintenance and calibration instructions should also be
included.  Calibration techniques must be in agreement with nationally
accepted practices, using standards that are traceable to primary standards.
     The Project Officer* should examine all test methods to determine their
concurrence with standard practices.  Innovative measurement techniques must
be evaluated for applicability and reliability.  He should examine the environ-
ment around the sampling equipment for error-producing situations.  He should
examine all operating procedures for possible measurement inaccuracies due to
sampling methods, use of equipment, calibration of equipment, ineffective
*Or his designate.

-------
maintenance procedures or lack of proper personnel.  There should also be an
examination of statistical analysis techniques for appropriateness, and state-
ments concerning the confidence levels of data to be reported should be care-
fully scrutinized.
     The work plan should include a systems analysis program to provide early
detection of measurement problems based on periodic data reports, as well as
results of all audits.  The analysis program should include provisions for
transmission of such problem information to the Project Officer and the ini-
tiation of the appropriate corrective action.  There should also be provisions
for alternate measurement techniques to be substituted for procedures that
may prove to be faulty, in that the kind of data and/or its accuracy are not
appropriate for the project goals.  The provisions should also allow for
program changes to increase the accuracy and reliability of data which prove
to be of more importance than originally called for.  The provisions should
contain adequate documentation, including the necessary approval for the
Project Officer.
     Finally, the work plan should provide an effective procurement program
for all equipment, materials, and supplies being used for the taking of data.
This should include plans for controlled receipt, storage, and distribution
of the above items.

2.4  Management Commitment to Quality Control
     No QC program, regardless of the amount of planning or level of effort
expended, will be effective without the explicitly visible support of top
management.  The support should be expressed initially as the project gets
under way, and periodically throughout the duration of the program.  The sup-
port of top management then filters down through middle and lower management
to the operators, resulting in a program where QC is practiced on a day-to-
day basis, rather than being an additional problem or nuisance.  Quality
Control must be a built-in, functional area within the total program, and this
is not possible without continuing, obvious management support.

-------
2.5  Quality Control in the Organizational Structure
     Support for quality control is most visible when the organizational struc-
ture has provision for personnel whose authority and responsibilities lie in
the area; i.e., a Quality Control Coordinator (QCC) and/or any other staff
appropriate to the program.  The QCC is responsible for the organization's
entire QC program, and his judgment determines the effectiveness of the pro-
gram.  The basic function of the QCC should be the fulfillment of the QC
objectives of management, in the most efficient and economical manner commen-
surate with insuring continuing completeness, accuracy, and precision of the
data produced.  The responsibilities and authority of the QCC are detailed in
the Quality Assurance Handbook for Air Pollution Measurement Systems, Vol. 1,
Principles,* now being prepared by the Environmental Protection Agency.
     The QCC should have, within the main organizational structure, a subord-
inate organization for QC activities (auditing, calibration, quality control).
He should have authority for assignment of QC duties and for coordination of
the entire program, and must not be directly subordinate to operational
personnel in the project.

2.6  Assessment of Quality Control Requirements
     The establishment of a quality control program for a demonstration project
requires, first of all, the setting, in as quantitative a manner as possible, of
project objectives.  The desired precision and accuracy of each measurement
should be specified, as well as the technical means of attaining this degree of
data quality; i.e., the tasks to be performed.  Once this is done, it is effi-
cient to group the tasks organizationally and assign responsibility for the QC
function in each task-group.  It is inevitable that problems will arise in
each step of the planning and establishment of the QC program.  Some of these
cannot be resolved until the program enters the functional stage.  What is
important initially is that these problems be identified and clearly stated,
so that they can be resolved as quickly as possible once the program gets under
way.
*Some of the general discussion of QC programs in this report has been taken
 from this document.

-------
2.7  Specific Areas of Concern for Demonstration Project Quality Control
     Programs
     A Quality Control program for a demonstration project serves to:
          1.   Evaluate the overall adequacy of the project insofar as data
               quality is concerned.
          2.   Identify potential as well as existing problems in the data-
               producing system, from measurement to data reduction.
          3.   Stimulate research into and discussion of alternative methods
               for obtaining data of the required quality.
     It is advisable to delineate a number of important aspects of the project
which have direct bearing on data quality, and to discuss each of these in
some detail.

2.7.1  Facilities and Equipment
     An obvious beginning point in the assessment of an ongoing program is a
general survey of the facilities and equipment available for day-to-day oper-
ation of the project.  Are they adequate for the job at hand?  Do standards
exist for evaluation of facilities, equipment and materials?
     The laboratories, data processing and other operational areas should be
neat and orderly, within common-sense limits imposed by the nature of the
facility.  Laboratory benches, particularly areas where critical operations
such as weighing are carried out, should be kept clear of all but necessary
tools, glassware, etc.  Personal items (coats, hats, lunch boxes) should not
be left in the working area.  Provision must be made for storage of these
items in personal lockers or the like.  A neat, well-organized laboratory
area serves to inspire neatness and organization among the laboratory workers.
     Good laboratory maintenance, particularly for certain types of instru-
mentation, requires complete manuals, kept in a convenient place so that they
are readily available to appropriate personnel.  Responsibility for keeping
up with all necessary manuals should be given to an individual, with the
understanding that he must devise a system (check-in/check-out) for quick loca-
tion of each document.

-------
2.7.2  Configuration Control
     The documentation of design changes in the system must be carried out
unfailingly.  Procedures for such documentation should be written, and be
accessible to any individual responsible for configuration control.  It is
all too easy, as the system is modified repeatedly, to allow one key man to
hold, largely by memory only, great amounts of vital information.  This
information would be virtually totally lost if this key man were no longer
available.  Engineering schematics should be maintained current on both the
system and subsystem level, and all computer programs listed and flow charted.
Changes in computer hardware and software must be documented, even when such
changes are apparently trivial.  Significant design changes must be documented
and forwarded to the EPA Project Officer, by way of established procedure.

2.7.3  Personnel Training
     It is highly desirable that there be a programmed training system for new
employees.  This system should include motivation toward producing data of
acceptable quality standards.  A part of the program should involve "practice
work" by the new employee.  The quality of the work can be immediately veri-
fied and discussed by the supervisor, with appropriate corrective action taken.
This system is to be preferred to on-the-job training, which may be excellent
or slipshod, depending upon a number of circumstances.
     Key personnel (laboratory supervisors, senior engineers) should be re-
quired to document their specialized knowledge and techniques, so far as is
possible.  They should each be required to develop an assistant, if the pro-
gram personnel situation allows, who could take responsibility when the senior
man is unavailable.  A most undesirable situation arises when replacement
personnel must be brought in and forced to gain knowledge of the program
through the experience of trial and error.  This is not an infrequent occur-
rence, however, when budgeting constraints override other priorities.
     A thorough personnel training program should focus particular attention
on those people whose work directly affects data quality (calibration personnel,
bench chemists, etc.).  These people must be cognizant of the quality standards
fixed for the project and the reasons for those standards.   They must be made
aware of the various ways of achieving and maintaining quality data.  As these
                                        10

-------
people progress to higher degrees of proficiency, their accomplishments should
be reviewed and then documented.  A motivating factor for high performance could
be direct and obvious rewards (monetary, status or both), awarded in a manner
visible to other comparable personnel.

2.7.4  Documentation Control
     Procedures for making revisions to technical documents must be clearly
written out, with the lines of authority indicated.  The revisions themselves
should be written and distributed to all affected parties, thus insuring that
the change will be implemented and will become permanent.  If a technical docu-
ment change pertains to an operational activity, that change should be analyzed
for side effects.  The change should not be rendered permanent until any harm-
ful side effects have been controlled.
     Revisions to computer software should be written, with lines of authority
indicated.  Reasons for the changes must be clearly spelled out.  The revisions
should be written and distributed to all affected parties.

2.7.5  Control Charts
     Control charts are essential as a routine day-to-day check on the consis-
tency or "sameness" of the data precision.  A control chart should be kept for
each measurement that directly affects the quality of the data.  Typically,
control charts are maintained for duplicate analyses, percent isokinetic
sampling rate, calibration constants, and the like.  An example control chart
is given as figure 1.  The symbol σ (sigma) represents a difference, d, of one
standard deviation unit in two duplicate measurements, one of which is taken
as a standard, or audit value.  Two σ is taken as a warning limit and 3σ as
a control limit.  If a laboratory measurement differs from the audit value by
more than 3σ, the technique is considered out-of-control.  Control charts are
dealt with in depth in a number of standard texts on quality control of engin-
eering processes.*

*Quality Assurance Handbook for Air Pollution Measurement Systems, Volume 1,
 Principles, being prepared by the Environmental Protection Agency.
                                        11
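     The warning and control limits described above can be computed directly from
a series of duplicate-measurement differences.  The sketch below is a minimal
illustration, assuming hypothetical past differences, and classifies a new
difference against the ±2σ and ±3σ limits.

```python
import statistics

def control_limits(differences):
    """Compute warning (±2σ) and action (±3σ) limits from a series
    of differences d between duplicate measurements, one of which
    is treated as the audit value (as in figure 1)."""
    sigma = statistics.stdev(differences)
    return {"warning": (-2 * sigma, 2 * sigma),
            "action":  (-3 * sigma, 3 * sigma)}

def classify(d, limits):
    """Classify a new difference against the chart limits."""
    lo_a, hi_a = limits["action"]
    lo_w, hi_w = limits["warning"]
    if d < lo_a or d > hi_a:
        return "out-of-control"
    if d < lo_w or d > hi_w:
        return "warning"
    return "in-control"

# Hypothetical duplicate-analysis differences from past checks:
past = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 0.2, -0.05]
limits = control_limits(past)
print(classify(0.1, limits))   # prints in-control
```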

-------
[Chart:  check numbers 1 through 10 on the horizontal axis; warning limits
at ±2σ and action limits at ±3σ, with UCL, CL, and LCL marked.  Rows beneath
the chart record date/time, operator, and problem and corrective action.]
                Figure 1.   Standard quality control  chart.
                                    12

-------
2.7.6  In-Process Quality Control
     During routine operation, critical measurement methods should be checked
for conformance to standard operating conditions (flow rates, reasonableness
of data being produced and the like).   The capability of each method to pro-
duce data within specification limits should be ascertained by means of appro-
priate control charts.  When a discrepancy appears in a measurement method, it
should be analyzed and corrected as soon as possible.
     For all standard methods, the operating conditions must be clearly
defined in writing, with specific reference to each significant variable.
Auxiliary measuring, gaging, and analytical instruments should be maintained
operative, accurate, and precise by regular checks and calibrations against
stable standards which are traceable to a primary standard, preferably fur-
nished by the National Bureau of Standards (if available).

2.7.7  Procurement and Inventory Procedures
     There should be well-defined and documented purchasing guidelines for
all equipment and reagents having an effect on data quality.  Performance spec-
ifications should be documented for all items of equipment having an effect on
data quality.  Chemical reagents considered critical to an analytical proce-
dure are best procured from suppliers who agree to submit samples for testing
and approval prior to initial shipment.  In the case of incoming equipment,
there should be an established and documented inspection procedure to deter-
mine if procurements meet the quality assurance and acceptance requirements.
The results of this inspection procedure should be documented.
     Whenever discrepant materials are detected, this information should be
transmitted immediately to the supplier for resolution.  The materials are
either returned or disposed of, at the discretion of the quality control super-
visor.
     Once an item has been received and accepted, it should be documented in
a receiving record log giving a description of the material, the date of
receipt, results of the acceptance test, and the signature of the responsible
individual.  It is then placed in inventory, which should be maintained on a
first-in, first-out basis.  It should be identified as to type, age and

-------
acceptance status.  In particular, reagents and chemicals which have limited
shelf life should be identified as to shelf life expiration date and system-
atically issued from stock only if they are still within that date.
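The first-in, first-out issuing rule with a shelf-life screen might be sketched as follows; the class, lot identifiers, and dates are all hypothetical.

```python
# Sketch of a first-in, first-out reagent inventory with shelf-life
# screening, as described above.  Names and dates are illustrative.
from collections import deque
from datetime import date

class ReagentStock:
    def __init__(self):
        self.lots = deque()          # oldest lot at the left (FIFO)

    def receive(self, lot_id, expires):
        self.lots.append((lot_id, expires))

    def issue(self, today):
        """Issue the oldest lot still within its shelf-life date."""
        while self.lots:
            lot_id, expires = self.lots.popleft()
            if expires >= today:     # still within its expiration date
                return lot_id
            # expired lot: remove from stock and try the next oldest
        return None

stock = ReagentStock()
stock.receive("lot-A", date(1976, 1, 15))
stock.receive("lot-B", date(1976, 6, 30))
issued = stock.issue(date(1976, 3, 1))   # lot-A has expired, so lot-B
```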

2.7.8  Preventive Maintenance
     It is most desirable that preventive maintenance procedures be clearly
defined and written for each measurement system and its support equipment.
When maintenance activity is necessary, it should be documented on standard
forms maintained in log books.  A history of the maintenance record of each
system serves to throw light on the adequacy of its maintenance schedule and
parts inventory.

2.7.9  Reliability
     The reliability of each component of a measurement system relates directly
to the probability of obtaining valid data from that system.  It follows that
procedures for reliability data collection, processing and reporting should be
clearly defined and in written form for each system component.  Reliability
data should be recorded on standard forms and kept in a log book.  If this
procedure is followed, the data can be utilized in revising maintenance and/
or replacement schedules.

2.7.10  Data Validation
     Data validation procedures, defined ideally as a set of computerized
and manual checks applied at various appropriate levels of the measurement
process, should be clearly defined, in written form, for all measurement sys-
tems.  Criteria for data validation must be documented and include limits on:
(1) operational parameters such as flow rates;  (2) calibration data;  (3)
special checks unique to each measurement; e.g., successive values/averages;
(4) statistical tests, e.g., outliers; and (5) manual checks such as hand cal-
culations.  The limits must be adequate to ensure the detection of invalid
data with a high probability, for all or certainly most of the measurement
systems.  The required data validation activities (flow-rate checks, analy-
tical precision, etc.) must be recorded on standard forms in a log book.

-------
     A summary of validation procedures should be prepared at each level or
critical point in the measurement process and forwarded to the next level with
the applicable block of data.  Any invalidated data should be so indicated.
The procedures for considering such data invalid must be clearly understood
at all levels where such data are to be used, and thus these procedures must
be written and available for scrutiny and comment.  Where possible, valida-
tion criteria should be computerized so that invalid data are flagged auto-
matically.
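A fragment of such a computerized validation check might look like the following; the parameter names and limits are hypothetical, standing in for the documented criteria of a particular project.

```python
# Hypothetical computerized validation: flag a record invalid when any
# monitored parameter falls outside its documented limits.

LIMITS = {                       # illustrative criteria (section 2.7.10)
    "flow_rate": (0.9, 1.1),     # e.g., fraction of nominal flow
    "calibration_drift": (-0.05, 0.05),
}

def validate(record):
    """Return (parameter, value) pairs that violate the limits."""
    violations = []
    for name, (lo, hi) in LIMITS.items():
        value = record.get(name)
        if value is None or not (lo <= value <= hi):
            violations.append((name, value))
    return violations

rec = {"flow_rate": 1.15, "calibration_drift": 0.01}
bad = validate(rec)              # flow_rate exceeds its upper limit
```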
     Any demonstration project should, on a random but regular basis, have
quality audits performed by in-house personnel.  These audits must be inde-
pendent of normal project operations, preferably performed by the QCC or his
appointees.  The audits should be both qualitative and quantitative (i.e.,
they should include both system reviews and independent measurement checks).
For the system review, a checklist is desirable to serve as a guide for the
reviewer.  Such a checklist is included as appendix A of this report.
     The quantitative aspect of the audit will vary depending on the nature of
the project.  Some guidelines for quantitative audits are given in appendix B.

2.7.11  Feedback and Corrective Action
     Closely tied to the detection of invalid data is the problem of establish-
ing a closed-loop mechanism for problem detection, reporting, and correction.
Here it is important that the problems are reported to those personnel who can
take appropriate action.  A feedback and corrective action mechanism should be
written out, with individuals assigned specific areas of responsibility.
Again, documentation of problems encountered and actions taken is most impor-
tant.  Standard forms, kept in a log book, are recommended.  A periodic sum-
mary report on problems and corrective action should be prepared and distribu-
ted to all levels of management.  This report should include:  a listing of
major problems for the reporting period; names of persons responsible for
corrective actions; criticality of problems; due dates; present status;
trend of quality performance (i.e., response time, etc.); a listing of items
still open from previous reports.

-------
2.7.12  Calibration Procedures
     Calibration procedures are the crux of any attempt to produce quality
data from a measurement system.  For this reason it is extremely important
that the procedures be technically sound and consistent with whatever data
quality requirements exist for that system.  Calibration standards must be
specified for all systems and measurement devices, with written procedures
for assuring, on a continuing basis, traceability to primary standards.  Since
calibration personnel change from time to time, the procedures must be, in
each instance, clearly written in step-by-step fashion.  Frequency of calibra-
tion should be set and documented, subject to rescheduling as the data are
reviewed.  Full documentation of each calibration and a complete history of
calibrations performed on each system are absolutely essential.  This permits
a systematic review of each system's reliability.

-------
3.0  GUIDELINES FOR DEMONSTRATION PROJECT QUALITY ASSURANCE PROGRAMS

3.1  General Statements
     The objective of quality assurance is to independently assess the quality
control program of the project.  This assessment should normally take two major
forms:  (1) a qualitative on-site systems review, and (2) a quantitative per-
formance audit.  These are discussed in detail below, as sections 3.2 and 3.3,
respectively.
     The frequency of a systems review and/or a performance audit obviously
should be dictated by the specific project.   It is recommended that a minimum
frequency be once each calendar year.  The initial systems review and perform-
ance audit should take place within the first quarter of the first project year.
Subsequent scheduling should be dependent on the requirements of management
and the apparent quality of the day-to-day data being obtained.  More frequent
auditing may be necessary in the initial stages of the project.

3.2  The On-Site Qualitative System Review
     The objective of the on-site qualitative systems review is to assess and
document facilities; equipment; systems documentation; operation, maintenance,
and calibration procedures; recordkeeping; data validation; and reporting
aspects of the total quality control program for a demonstration project.  The
review should accomplish the following:
     1.   Identify existing system documentation, i.e., maintenance manuals,
          organizational structure, operating procedures, etc.
     2.   Evaluate the adequacy of the procedures as documented.
     3.   Evaluate the degree of use of and adherence to the documented pro-
          cedures in day-to-day operations based on observed conditions
          (auditor) and a review of applicable records on file.
     To aid the auditor in performing the review, a checklist is included as
appendix A.

3.3  The Performance Audit
     In addition to a thorough on-site systems review, quantitative performance
audits should be periodically undertaken at each demonstration project.  The

-------
objective of these audits is to evaluate the validity of project data by inde-
pendent measurement techniques.  It is convenient to classify the major measure-
ment methods into three areas:  physical measurements, gas stream measurements
and liquid stream measurements  (the latter including analysis of any suspended
solids).  Appendix B lists in matrix form a number of standard techniques for
auditing in the three major areas just mentioned.  Table 1 is a compilation of
commonly measured physical properties, with a selection of possible measure-
ment, calibration and audit techniques.  Table 2, concentrating on analysis of
gas effluent streams, lists the material to be analyzed, and measurement, cali-
bration and audit techniques for that material.  Finally table 3 very briefly
and generally deals with measurement methods appropriate to liquids and solids.
The specific techniques vary widely from project to project, but the audit
technique generally involves use of control (reference) samples of known com-
position and/or splitting a sample among several laboratories for independent
analyses.

3.4  Material Balances
     Material balance serves as a gross indication of the validity of the total
measurement system complex of the project.  The extent of closure will be
directly related to the precision and bias of each measurement taken.  In
general, both physical measurements of flow rates, temperatures, pressures
(and so on), and chemical analyses of material composition will bear on the
degree of closure attained.  The achievable extent of closure must be estimated
for each project and used as a target figure.   The frequency with which material
balances are run is related to how successful one is in attaining the estimated
closure over a significant period of time.
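The closure computation itself is simple; the sketch below uses hypothetical stream totals, where in practice each term would combine a measured flow rate with a measured composition.

```python
# Illustrative material balance closure check.  Stream values are
# hypothetical; closure is compared against the project's target.

def percent_closure(inputs, outputs):
    """Closure = total mass out / total mass in, as a percentage."""
    return 100.0 * sum(outputs) / sum(inputs)

mass_in = [1200.0, 300.0]     # e.g., feed streams, kg/h
mass_out = [950.0, 480.0]     # e.g., product and waste streams, kg/h
closure = percent_closure(mass_in, mass_out)
# 1430 / 1500 gives about 95.3%; compare with the estimated achievable
# closure adopted as the target figure for the project
```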

3.5  Assessment of Data Quality
     Standard methods exist for estimation of the precision and accuracy of
measurement data.   Efficient usage of the audit data requires that a rationale
be followed which gives the best possible estimates of precision and accuracy
within the limits imposed by timing, sample size, etc.

-------
     For a given measurement, the difference between the field (or plant)
result and the audit value,

                              d_j = X_j - X_aj ,

is used to calculate a mean and standard deviation as follows:

                              d-bar = Σ d_j / n ,

                              s_d = [ Σ (d_j - d-bar)^2 / (n - 1) ]^(1/2) ,

where the sums are taken over j = 1, ..., n.
Now d-bar is an estimate of the bias in the measurements (i.e., relative to the
audited value).  Assuming the audited data to be unbiased, the existence of
a bias in the field data can be checked by the appropriate t-test, i.e.,

                              t = (d-bar - 0) / (s_d / √n) .

     If t is significantly large, say greater than the tabulated value of t
with n - 1 degrees of freedom, which is exceeded by chance only 5 percent of
the time, then the bias is considered to be real, and some check should be made
for a possible cause of the bias.  If t is not significantly large, then the
bias should be considered zero, and the accuracy of the data is acceptable.
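The bias check can be carried out numerically as sketched below.  The field and audit values are hypothetical, and the 5-percent critical value (here the two-sided point 2.776 for n - 1 = 4 degrees of freedom) is taken from a standard t-table.

```python
# Sketch of the bias t-test described above, using the standard
# library's statistics module.  All data are hypothetical.
import math
import statistics

field = [10.2, 9.8, 10.5, 10.1, 10.4]     # field (plant) results
audit = [10.0, 10.0, 10.0, 10.0, 10.0]    # audit values

d = [x - xa for x, xa in zip(field, audit)]
n = len(d)
d_bar = statistics.mean(d)                # estimated bias
s_d = statistics.stdev(d)                 # std. deviation of differences
t = (d_bar - 0) / (s_d / math.sqrt(n))

T_CRIT = 2.776        # two-sided 5% point of t, 4 d.f. (from a t-table)
bias_is_real = abs(t) > T_CRIT
# here t is about 1.63 < 2.776, so the bias is considered zero
```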
     The standard deviation s_d is a function of both the standard deviation of
the field measurements and of the audit measurements.  Assuming the audit values
to be much more accurate than the field measurements, then s_d is an estimate of
σ{X}, the population standard deviation for the measurements.  For example, the
estimated standard deviation, s_d, may be directly checked against the assumed
value, σ{X}, by using the statistical test procedure

                              χ^2/f = s_d^2 / σ^2{X} ,

-------
where χ^2/f is the value of a random variable having the chi-square distribu-
tion with f = n - 1 degrees of freedom.  If χ^2/f is larger than the tabulated
value exceeded only 5 percent of the time, then it would be concluded that
the test procedure is yielding more variable results due to faulty equipment
or operational procedure.
     The measured values should be reported along with the estimated biases,
standard deviations, the number of audits, n, and the total number of field
tests, N, sampled (n < N).  Estimates (such as s_d and d-bar) which are signifi-
cantly different from the assumed population parameters should be identified
on the data sheet.
     The t-test and χ^2-test described above are used to check on the biases
and standard deviations separately.
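The variability check can likewise be sketched numerically, continuing the hypothetical differences from the t-test illustration; the assumed σ{X} and the tabulated 5-percent chi-square point (9.488 for 4 degrees of freedom) are illustrative.

```python
# Sketch of the chi-square variability check described above.
import statistics

d = [0.2, -0.2, 0.5, 0.1, 0.4]    # hypothetical field-minus-audit values
f = len(d) - 1                     # degrees of freedom, f = n - 1
s_d = statistics.stdev(d)
sigma_assumed = 0.25               # assumed population sigma{X}

chi2_over_f = s_d**2 / sigma_assumed**2
CHI2_CRIT_OVER_F = 9.488 / 4       # 5% point of chi-square, 4 d.f. (table)
too_variable = chi2_over_f > CHI2_CRIT_OVER_F
# here chi2/f = 0.075 / 0.0625 = 1.2 < 2.372, so the observed
# variability is consistent with the assumed sigma{X}
```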
     Other statistical techniques exist which may apply to specific projects
(or to highly specialized areas of a given project).  It is usually well
worthwhile to acquire the services of a statistical consultant in order to
treat the available data most effectively.

-------
4.0  ASSESSMENT AND MODIFICATION OF THE ONGOING QUALITY ASSURANCE PROGRAM
     The guidelines put forth in the preceding sections serve as a basis for
development of a Quality Assurance program specific to the needs of a particu-
lar project.  A program should not be attempted without a thorough study of
the entire facility, supplemented by at least one site visit.  It is to be
desired that provision for QA be made from the project's inception.  The EPA
Project Officer's responsibility is then to see that a program of adequate QA
practices is incorporated into the day-to-day project operations, along with
periodic QA audits conducted by outside organizations.
     Implementation of the program, at whatever point in the lifetime of the
project, exposes weaknesses of approach and problems which were not anticipated
in the planning stages.  Certainly it is necessary to maintain maximum flexi-
bility of approach as the interface with project realities is made.  The report
on the short-term QA program conducted at the EPA wet limestone SO2 scrubber
facility at the Shawnee Steam Plant (Paducah, Kentucky) documents several problems
of implementation and suggests procedures for avoiding those problems in
future efforts.
     Generally, one should expect that a degree of modification would be
required in the areas listed below:
     1.   audit instrumentation and general equipment requirements
     2.   sampling frequencies
     3.   audit personnel requirements.
Experience must always be the final judge of the effectiveness of a Quality
Assurance program, as one monitors data quality on a continuing basis.

-------
APPENDIX A   QUALITATIVE AUDIT CHECKLIST FOR DEMONSTRATION PROJECTS
     This checklist gives three descriptions to each facet of a typical quality
control system.  In all cases the "5" choice indicates the most desirable and
effective mode of operation; "3" is marginal and tolerable; "1" is definitely
unacceptable and ineffective as a mode of operation.
     It is not always possible to describe accurately all options with only
three choices.  Therefore, a "2" or "4" rating may be selected if the evalua-
tor feels that an in-between score is more descriptive of the actual situation.
     After all the applicable questions are answered, an average is computed
to give an overall indication of the quality control system effectiveness.
     Generally, a rating of 3.8 or better is considered acceptable.
     A rating between 2.5 and 3.8 indicates a need for improvement, but there
is no imminent threat to project performance as it now stands.
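The scoring rule above amounts to a simple average and threshold test, sketched below with a hypothetical set of item ratings.

```python
# Illustrative scoring of the appendix A checklist: average the item
# ratings (1-5) and classify against the 3.8 and 2.5 thresholds.

def classify(ratings):
    avg = sum(ratings) / len(ratings)
    if avg >= 3.8:
        return avg, "acceptable"
    if avg >= 2.5:
        return avg, "needs improvement"
    return avg, "unacceptable"

scores = [5, 3, 4, 3, 5, 3, 4]      # hypothetical answers to the items
avg, verdict = classify(scores)     # 27/7 is about 3.86: "acceptable"
```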

-------
A.I  QUALITY ORGANIZATION
                                                                     SCORE

     (1.1)  Overall responsibility for quality assurance (or
            quality control) for the organization is:

            (a)  Assigned to one individual by title (e.g.,
                 Quality Control Coordinator).                          5

            (b)  Assigned to a specific group within the organization. 3
            (c)  Not specifically assigned but left to the discre-
                 tion of the various operational, analytical, inspec-
                 tion, and testing personnel.                          1
     (1.2)  The Quality Control Coordinator is located in the
            organization such that:

            (a)  He has direct access to the top management level
                 for the total operation, independent of others in-
                 volved in operational activities.                     5

            (b)  He performs as a peer with others involved in
                 operational activities, with access to top manage-
                 ment through the normal chain of command.             3

            (c)  His primary responsibility is in operational
                 activities, with quality assurance as an extra or
                 part-time effort.                                     1
     (1.3)  Data reports on quality are distributed by the Quality
            Control Coordinator to:

            (a)  All levels of management.*               .             5

            (b)  One level of management only.                         3

            (c)  The quality control group only.                       1


     (1.4)  Data Quality Reports contain:

            (a)  Information on operational trends, required
                 actions, and danger spots.                            5

            (b)  Information on suspected data/analyses and
                 their causes.                                         3

            (c)  Percent of valid data per month.                      1
     *Management at appropriate levels in all applicable organizations such
as subcontractors, prime contractor, EPA.

-------
A.2  THE QUALITY SYSTEM
                                                                     SCORE

     (2.1)  The quality control system is:

            (a)  Formalized and documented by a set of procedures
                 which clearly describe the activities necessary
                 and sufficient to achieve desired quality objec-
                 tives, from procurement through to reporting data
                 to the EPA/RTP.                                        5

            (b)  Contained in methods procedures or is implicit in
                 those procedures.  Experience with the materials,
                 product, and equipment is needed for continuity
                 of control.                                           3

            (c)  Undefined in any procedures and is left to the cur-
                 rent managers or supervisors to determine as the
                 situation dictates.                                   1
     (2.2)  Support for quality goals and results is indicated by:

            (a)  A clear statement of quality objectives by the top
                 executive, with continuing visible evidence of its
                 sincerity, to all levels of the organization.         5

            (b)  Periodic meetings among operations personnel and the
                 individual(s) responsible for quality assurance, on
                 quality objectives and progress toward their achieve-
                 ment.                                                 3

            (c)  A "one-shot" statement of the desire for product
                 quality by the top executive, after which the quality
                 assurance staff is on its own.                        1
     (2.3)  Accountability for quality is:

            (a)  Clearly defined for all sections and operators/
                 analysts where their actions have an impact on
                 quality.                                              5

            (b)  Vested with the Quality Control Coordinator who
                 must use whatever means possible to achieve quality
                 goals.                                                3

            (c)  Not defined.                                          1

-------
A.2  THE QUALITY SYSTEM (continued)
                                                                     SCORE

     (2.4)  The acceptance criteria for the level of quality
            of the demonstration project's routine performance are:

            (a)  Clearly defined in writing for all characteris-
                 tics.                                                  5

            (b)  Defined in writing for some characteristics
                 and some are dependent on experience,  memory
                 and/or verbal communication.                           3

            (c)  Only defined by experience and verbal  communica-
                 tion.                                                  1
     (2.5)   Acceptance criteria for the level of quality of the
            project's routine performance are determined by:

            (a)   Monitoring the performance in a structured pro-
                  gram of inter- and intralaboratory evaluations.       5

            (b)   Scientific determination of what is technically
                  feasible.                                             3

            (c)   Laboratory determination of what can be done using
                  currently available equipment, techniques, and
                  manpower.                                             1
     (2.6)  Decisions on acceptability of questionable results are
            made by:

            (a)   A review group consisting of the chief chemist or
                 engineer, quality control, and others who can render
                 expert judgment.                                       5

            (b)   An informal assessment by quality control.             3

            (c)   The  operator/chemist.                                 1

-------
A.2  THE QUALITY SYSTEM (continued)
                                                                     SCORE

     (2.7)  The quality control coordinator has the authority to:

            (a)  Affect the quality of analytical results by in-
                 serting controls to assure that the methods meet
                 the requirements for precision, accuracy, sensi-
                 tivity, and specificity.                              5

            (b)  Reject suspected results and stop any method that
                 projects high levels of discrepancies.                3

            (c)  Submit suspected results to management for a
                 decision on disposition.                              1
A.3  IN-PROCESS QUALITY ASSURANCE


     (3.1)  Measurement methods are checked:

            (a)  During operation for conformance to operating
                 conditions and to specifications, e.g., flow rates,
                 reasonableness of data, etc.                          5

            (b)  During calibration to determine acceptability
                 of the results.                                       3

            (c)  Only when malfunctions are reported.                  1
     (3.2)  The capability of the method to produce within
            specification limits is:

            (a)  Known through method capability analysis (X-R
                 Charts) to be able to produce consistently
                 acceptable results.                                   5

            (b)  Assumed to be able to produce a reasonably
                 acceptable result.                                    3

            (c)  Unknown.                                              1
     (3.3)  Method determination discrepancies are:

            (a)  Analyzed immediately to seek out the causes and
                 apply corrective action.                              5

            (b)  Checked out when time permits.                        3

            (c)  Not detectable with present controls and procedures.  1

-------
A.3  IN-PROCESS QUALITY ASSURANCE (continued)
                                                                     SCORE

     (3.4)  The operating conditions (e.g., flow rate, range,
            temperature, etc.) of the methods are:

            (a)  Clearly defined in writing in the method for each
                 significant variable.                                 5

            (b)  Controlled by supervision based on general guide-
                 lines.                                                3

            (c)  Left up to the operator/analyst.                      1


     (3.5)  Auxiliary measuring, gaging, and analytical
            instruments are:

            (a)  Maintained operative, accurate, and precise
                 by regular checks and calibrations against
                 stable standards which are traceable to the
                 U.S. Bureau of Standards.                            5

            (b)  Periodically checked against a zero point or
                 other reference and examined for evidence of
                 physical damage, wear or inadequate maintenance.      3

            (c)  Checked only when they stop working or when ex-
                 cessive defects are experienced which can be
                 traced to inadequate instrumentation.                 1
A.4  CONFIGURATION CONTROL
     (4.1)  Procedures for documenting, for the record, any design
            change in the system are:
            (a)  Written down and readily accessible to those
                 individuals responsible for configuration con-
                 trol.                                                 5

            (b)  Written down but not in detail.                       3

            (c)  Not documented.                                       1

-------
A.4  CONFIGURATION CONTROL  (continued)
                                                                     SCORE

     (4.2)  Engineering schematics are:

            (a)  Maintained current on the system and subsystem
                 levels.                                               5

            (b)  Maintained current on certain subsystems only.        3

            (c)  Not maintained current.                               1


     (4.3)  All computer programs are:

            (a)  Documented and flow charted.                          5

            (b)  Flow charted.                                         3

            (c)  Summarized.                                           1


     (4.4)  Procedures for transmitting significant design changes
            in hardware and/or software to the EPA project officer
            are:

            (a)  Documented in detail sufficient for implementation.   5

            (b)  Documented too briefly for implementation.            3

            (c)  Not documented.                                       1


A.5  DOCUMENTATION CONTROL
     (5.1)  Procedures for making revisions to technical documents
            are:

            (a)  Clearly spelled out in written form with the line
                 of authority indicated and available to all involved
                 personnel.                                            5

            (b)  Recorded but not readily available to all personnel.  3

            (c)  Left to the discretion of present supervisors/mana-
                 gers.                                                 1

-------
A.5  DOCUMENTATION CONTROL (continued)
                                                                     SCORE

     (5.2)  In revising technical documents, the revisions are:

            (a)  Clearly spelled out in written form and distrib-
                 uted to all parties affected,  on a controlled basis
                 which assures that the change  will be implemented
                 and permanent.                                         5

            (b)  Communicated through memoranda to key people who
                 are responsible for effecting  the change through
                 whatever method they choose.                          3

            (c)  Communicated verbally to operating personnel who
                 then depend on experience to maintain continuity
                 of the change.                                         1
     (5.3)  Changes to technical documents pertaining to opera-
            tional activities are:

            (a)  Analyzed to make sure that any harmful side effects
                 are known and controlled prior to revision effectiv-
                 ity.                                                  5

            (b)  Installed on a trial or gradual basis, monitoring
                 the product to see if the revision has a net bene-
                 ficial effect.                                        3

            (c)  Installed immediately with action for correcting side
                 effects taken if they show up in the final results.   1
     (5.4)  Revisions to technical documents are:

            (a)  Recorded as to date, serial number, etc.  when the
                 revision becomes effective.                           5

            (b)  Recorded as to the date the revision was  made on
                 written specifications.                               3

            (c)  Not recorded with any degree of precision.             1

-------
A.5  DOCUMENTATION CONTROL (continued)
                                                                     SCORE

     (5.5)  Procedures for making revisions to computer software
            programs are:

            (a)  Clearly spelled out in written form with the line
                 of authority indicated.                               5

            (b)  Not recorded but changes must be approved by the
                 present supervisor/manager.                           3

            (c)  Not recorded and left to the discretion of the
                 programmer.                                            1
     (5.6)  In revising software program documentation, the re-
            visions are:

            (a)  Clearly spelled out in written form, with reasons
                 for the change and the authority for making the
                 change distributed to all parties affected by the
                 change.                                               5

            (b)  Incorporated by the programmer and communicated
                 through memoranda to key people.                      3

            (c)  Incorporated by the programmer at his will.           1
     (5.7)  Changes to software program documentation are:

            (a)  Analyzed to make sure that any harmful side
                 effects are known and controlled prior to
                 revision effectivity.                                 5

            (b)  Incorporated on a trial basis, monitoring the
                 results to see if the revision has a net bene-
                 ficial effect.                                        3

            (c)  Incorporated immediately with action for detecting
                 and correcting side effects taken as necessary.       1
                                      30

-------
A.5  DOCUMENTATION CONTROL (continued)
                                                                     SCORE

     (5.8)  Revisions to software program documentation are:

            (a)  Recorded as to date, program name or number, etc.,
                 when the revision becomes effective.                  5

            (b)  Recorded as to the date the revision was made.        3

            (c)  Not recorded with any degree of precision.            1


A.6  PREVENTIVE MAINTENANCE


     (6.1)  Preventive maintenance procedures are:

            (a)  Clearly defined and written for all measurement
                 systems and support equipment.                        5

            (b)  Clearly defined and written for most of the measure-
                 ment systems and support equipment.                   3

            (c)  Defined and written for only a small fraction of the
                 total number of systems.                              1


     (6.2)  Preventive maintenance activities are documented:

            (a)  On standard forms in station log books.               5

            (b)  Operator/analyst summary in log book.                 3

            (c)  As operator/analyst notes.                            1
     (6.3)  Preventive maintenance procedures as written appear
            adequate to insure proper equipment operation for:

            (a)  All measurement systems and support equipment.        5

            (b)  Most of the measurement systems and support equip-
                 ment.                                                 3

            (c)  Less than half of the measurement systems and sup-
                 port equipment.                                       1
                                       31

-------
A.6  PREVENTIVE MAINTENANCE (continued)
                                                                     SCORE

     (6.4)  A review of the preventive maintenance records indicates
            that:

            (a)  Preventive maintenance procedures have been carried
                 out on schedule and completely documented.            5

            (b)  The procedures were carried out on schedule but not
                 completely documented.                                3

            (c)  The procedures were not carried out on schedule all
                 the time and not always documented.                   1
     (6.5)  Preventive maintenance records (histories) are:

            (a)  Utilized in revising maintenance schedules, de-
                 veloping an optimum parts/reagents inventory, and
                 developing scheduled replacements to minimize
                 wear-out failures.                                    5

            (b)  Utilized when specific questions arise and for
                 estimating future work loads.                         3

            (c)  Utilized only when unusual problems occur.            1


A.7  DATA VALIDATION PROCEDURES


     (7.1)  Data validation procedures are:

            (a)  Clearly defined in writing for all measurement
                 systems.                                              5

            (b)  Defined in writing for some measurement systems,
                 some dependent on experience, memory, and/or
                 verbal communication.                                 3

            (c)  Only defined by experience and verbal communica-
                 tion.                                                 1
                                      32

-------
A.7  DATA VALIDATION PROCEDURES (continued)
                                                                     SCORE

     (7.2)  Data validation procedures are:

            (a)  A coordinated combination of computerized and
                 manual checks applied at different levels in the
                 measurement process.                                  5

            (b)  Applied with a degree of completeness at no more
                 than two levels of the measurement process.           3

            (c)  Applied at only one level of the measurement pro-
                 cess.                                                  1
     (7.3)  Data validation criteria are documented and include:

            (a)  Limits on:  (1) operational parameters such as
                 flow rates; (2) calibration data; (3) special
                 checks unique to each measurement, e.g., succes-
                 sive values/averages; (4) statistical tests, e.g.,
                 outliers; (5) manual checks such as hand calcula-
                 tions.                                                5

            (b)  Limits on the above type checks for most of the
                 measurement systems.                                  3

            (c)  Limits on some of the above type checks for only
                 the high-priority measurements.                       1
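The computerized checks contemplated in item 7.3(a) can be sketched as a small screening routine. The following is an illustrative sketch only, not taken from this report; the function name, limit values, and readings are hypothetical, and it implements three of the five check types (operational limits, successive-value steps, statistical outliers):

```python
# Illustrative sketch (not from the report): three of the five check
# types listed in item 7.3(a).  Function name, limits, and readings
# are hypothetical examples.

def validate(readings, flow_rate, flow_limits=(0.9, 1.1),
             max_step=5.0, outlier_sigma=3.0):
    """Return a list of flag strings; an empty list means the data pass."""
    flags = []

    # (1) Operational parameter limit, e.g. flow rate within tolerance.
    lo, hi = flow_limits
    if not lo <= flow_rate <= hi:
        flags.append("flow rate out of limits")

    # (3) Check unique to the measurement: successive-value step test.
    for prev, curr in zip(readings, readings[1:]):
        if abs(curr - prev) > max_step:
            flags.append("successive-value step too large")
            break

    # (4) Statistical test: flag values beyond k sigma of the block mean.
    n = len(readings)
    mean = sum(readings) / n
    sigma = (sum((x - mean) ** 2 for x in readings) / n) ** 0.5
    if sigma > 0 and any(abs(x - mean) > outlier_sigma * sigma
                         for x in readings):
        flags.append("statistical outlier")

    return flags
```

A passing block of readings returns an empty flag list; what happens to flagged data is then governed by the deletion procedures of item 7.8.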
     (7.4)  Acceptable limits as set are reasonable and adequate
            to insure the detection of invalid data with a high
            probability for:

            (a)  All measurement systems.                              5

            (b)  At least 3/4 of the measurement systems.              3

            (c)  No more than 1/2 of the measurement systems.          1
                                      33

-------
A.7  DATA VALIDATION PROCEDURES (continued)
                                                                     SCORE

     (7.5)  Data validation activities are:

            (a)  Recorded on standard forms at all levels of the
                 measurement process.                                  5

            (b)  Recorded in the operator's/analyst's log book.         3

            (c)  Not recorded in any prescribed manner.                 1


     (7.6)  Examination of data validation records indicates that:

            (a)  Data validation activities have been carried out
                 as specified and completely documented.                5

            (b)  Data validation activities appear to have been
                 performed but not completely documented.              3

            (c)  Data validation activities, if performed, are not
                 formally documented.                                  1


     (7.7)  Data validation summaries are:

            (a)  Prepared at each level or critical point in the
                 measurement process and forwarded to the next level
                 with the applicable block of data.                     5

            (b)  Prepared by and retained at each level.                3

            (c)  Not prepared at each level nor communicated between
                 levels.                                                1
     (7.8)   Procedures for deleting invalidated data are:

            (a)   Clearly defined in writing for all levels  of  the  meas-
                 urement process,  and invalid data are  automatically
                 deleted when one  of the computerized validation cri-
                 teria is exceeded.                                    5

            (b)   Programmed for automatic deletion when computerized
                 validation criteria are exceeded  but procedures not
                 defined when manual checks detect invalid  data.       3

            (c)   Not  defined for all levels of the measurement  pro-
                 cess.                                                1
                                     34

-------
A.7  DATA VALIDATION PROCEDURES (continued)
                                                                     SCORE

     (7.9)  Quality audits (i.e., both on-site system reviews and/or
            quantitative performance audits) independent of the normal
            operations are:

            (a)  Performed on a random but regular basis to ensure
                 and quantify data quality.                            5

            (b)  Performed whenever a suspicion arises that there
                 are areas of ineffective performance.                 3

            (c)  Never performed.                                      1


A.8  PROCUREMENT AND INVENTORY PROCEDURES


     (8.1)  Purchasing guidelines are established and documented
            for:

            (a)  All equipment and reagents having an effect on data
                 quality.                                              5

            (b)  Major items of equipment and critical reagents.       3

            (c)  A very few items of equipment and reagents.           1


     (8.2)  Performance specifications are:

            (a)  Documented for all items of equipment which have
                 an effect on data quality.                            5

            (b)  Documented for the most critical items only.          3

            (c)  Taken from the presently used items of equipment.     1


     (8.3)  Reagents and chemicals (critical items) are:

            (a)  Procured from suppliers who must submit samples
                 for test and approval prior to initial shipment.      5

            (b)  Procured from suppliers who certify they can meet
                 all applicable specifications.                        3

            (c)  Procured from suppliers on the basis of price and
                 delivery only.                                        1
                                      35

-------
A.8  PROCUREMENT AND INVENTORY PROCEDURES (continued)
                                                                     SCORE

     (8.4)  Acceptance testing for incoming equipment is:

            (a)  An established and documented inspection procedure
                 to determine if procurements meet the quality assurance
                 and acceptance requirements.  Results are document-
                 ed.                                                   5

            (b)  A series of undocumented performance tests performed
                 by the operator before using the equipment.           3

            (c)  Limited to the responsible individual signing the re-
                 ceiving document to indicate acceptance or rejection.  1
     (8.5)  Reagents and chemicals are:

            (a)  Checked 100% against specification, quantity, and
                 for certification where required and accepted
                 only if they conform to all specifications.            5

            (b)  Spot-checked for proper quantity and for shipping
                 damage.                                               3

            (c)  Released to analyst by the receiving clerk without
                 being checked as above.                               1

     (8.6)  Information on discrepant purchased materials is:

            (a)  Transmitted to the supplier with a request for
                 corrective action.                                    5

            (b)  Filed for future use.                                  3

            (c)  Not maintained.                                        1
     (8.7)   Discrepant purchased materials are:

            (a)   Submitted to a review by Quality Control and
                 Chief Chemist for disposition.                         5

            (b)   Submitted to Service Section for determination
                  on acceptability.                                     3

            (c)   Used because of scheduling requirements.              1
                                      36

-------
A.8  PROCUREMENT AND INVENTORY PROCEDURES (continued)
                                                                     SCORE

     (8.8)  Inventories are maintained on:

            (a)  First-in, first-out basis.                             5

            (b)  Random selection in stock room.                       3

            (c)  Last-in, first-out basis.                             1


     (8.9)  Receiving of materials is:

            (a)  Documented in a receiving record log, giving a
                 description of the material, the date of receipt,
                 results of acceptance test, and the signature
                 of the responsible individual.                         5

            (b)  Documented in a receiving record log with material
                 title, receipt date, and initials of the individual
                 logging the material in.                              3

            (c)  Documented by filing a signed copy of the requisi-
                 tion.                                                  1


    (8.10)  Inventories are:

            (a)  Identified as to type, age, and acceptance status.    5

            (b)  Identified as to material only.                       3

            (c)  Not identified in writing.                             1


    (8.11)  Reagents and chemicals which have limited shelf life are:

            (a)  Identified as to shelf life expiration data and
                 systematically issued from stock only if they
                 are still within that date.                           5

            (b)  Issued on a first-in, first-out basis, expecting
                 that there is enough safety factor so that the
                 expiration date is rarely exceeded.                   3

            (c)  Issued at random from stock.                          1
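The first-in, first-out issue rule of item 8.11(a) can be sketched as a simple stock routine. The function, lot structure, and dates below are hypothetical illustrations, not part of this report:

```python
# Illustrative sketch (not from the report): issue a limited-shelf-life
# reagent first-in, first-out, and only if still within its expiration
# date, as in item 8.11(a).  Dates are hypothetical.
from datetime import date

def issue_reagent(stock, today):
    """stock: list of (received, expires) date pairs, oldest first.
    Returns the oldest unexpired lot and removes it from stock;
    expired lots are pulled from stock rather than issued."""
    while stock:
        received, expires = stock[0]
        if expires >= today:
            return stock.pop(0)   # oldest acceptable lot (FIFO)
        stock.pop(0)              # past expiration: remove, do not issue
    return None                   # nothing acceptable on the shelf

stock = [(date(2024, 1, 5), date(2024, 3, 1)),     # expired lot
         (date(2024, 2, 10), date(2024, 6, 1))]
lot = issue_reagent(stock, today=date(2024, 4, 1))
```

Note that the expired lot is removed from stock as a side effect, which keeps the shelf consistent with item 8.10's acceptance-status identification.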
                                      37

-------
A.9  PERSONNEL TRAINING PROCEDURES
                                                                     SCORE

     (9.1)  Training of new employees is accomplished by:

            (a)  A programmed system of training where elements of
                 training, including quality standards, are included
                 in a training checklist.  The employee's work is
                 immediately rechecked by supervisors for errors or
                 defects and the information is fed back instanta-
                 neously for corrective action.                        5

            (b)  On-the-job training by the supervisor who gives
                 an overview of quality standards.  Details of
                 quality standards are learned as normal results
                 are fed back to the chemist.                          3

            (c)  On-the-job learning with training on the rudi-
                 ments of the job by senior coworkers.                 1
     (9.2)  When key personnel changes occur:

            (a)  Specialized knowledge and skills are retained in
                 the form of documented methods and descriptions.      5

            (b)  Replacement people can acquire the knowledge of
                 their predecessors from coworkers, supervisors,
                 and detailed study of the specifications and
                 memoranda.                                             3

            (c)  Knowledge is lost and must be regained through long
                 experience or trial-and-error.                         1
     (9.3)  The people who have an impact on quality, e.g., cali-
            bration personnel, maintenance personnel, bench chemists,
            supervisors, etc., are:

            (a)  Trained in the reasons for and the benefits of
                 standards of quality and the methods by which
                 high quality can be achieved.                         5

            (b)  Told about quality only when their work falls below
                 acceptable levels.                                    3

            (c)  Reprimanded when quality deficiencies are
                 directly traceable to their work.                     1
                                      38

-------
A.9  PERSONNEL TRAINING PROCEDURES (continued)
                                                                     SCORE

     (9.4)  The employee's history of training accomplishments
            is maintained through:

            (a)  A written record maintained and periodically
                 reviewed by the supervisor.                           5

            (b)  A written record maintained by the employee.          3

            (c)  The memory of the supervisor/employee.                1


     (9.5)  Employee proficiency is evaluated on a continuing
            basis by:

            (a)  Periodic testing in some planned manner with the
                 results of such tests recorded.                       5

            (b)  Testing when felt necessary by the supervisor.        3

            (c)  Observation of performance by the supervisor.         1


     (9.6)  Results of employee proficiency tests are:

            (a)  Used by management to establish the need for and
                 type of special training.                             5

            (b)  Used by the employee for self-evaluation of needs.    3

            (c)  Used mostly during salary reviews.                    1


A.10  FEEDBACK AND CORRECTIVE ACTION

    (10.1)  A feedback and corrective action mechanism, to assure
            that problems are reported to those who can correct them
            and that a closed loop exists to verify that appropriate
            corrective actions have been taken, is:

            (a)  Clearly defined in writing with individuals assigned
                 specific areas of responsibility.                     5

            (b)  Written in general terms with no assignment of
                 responsibilities.                                     3

            (c)  Not formalized but left to the present supervisors/
                 managers.                                             1
                                      39

-------
A.10  FEEDBACK AND CORRECTIVE ACTION (continued)
                                                                     SCORE

    (10.2)  Feedback and corrective action activities are:

            (a)  Documented on standard forms.                         5

            (b)  Documented in the station log book.                   3

            (c)  Documented in the operator's/analyst's notebook.      1


    (10.3)  A review of corrective action records indicates that:

            (a)  Corrective actions were systematic, timely, and
                 fully documented.                                     5

            (b)  Corrective actions were not always systematic,
                 timely, or fully documented.                          3

            (c)  A closed loop mechanism did not exist.                1


    (10.4)  Periodic summary reports on the status of corrective
            action are distributed by the responsible individual to:

            (a)  All levels of management.                             5

            (b)  One level of management only.                         3

            (c)  The group generating the report only.                  1


    (10.5)  The reports include:

            (a)  A listing of major problems for the reporting
                 period; names of persons responsible for correc-
                 tive actions; criticality of problems; due dates;
                 present status;  trend of quality performance (i.e.,
                 response time, etc.);  listing of items still open
                 from previous reports.                                 5

            (b)  Most of the above items.                              3

            (c)  Present status of problems and corrective actions.     1
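The report content listed in item 10.5(a) amounts to a standard record layout. The sketch below shows one hypothetical way such records might be structured; the class, field names, and sample entries are invented for illustration and are not from this report:

```python
# Illustrative sketch (not from the report): one way to hold the
# fields item 10.5(a) asks a corrective action summary report to
# carry.  Field names and sample entries are hypothetical.
from dataclasses import dataclass

@dataclass
class CorrectiveAction:
    problem: str          # major problem for the reporting period
    responsible: str      # person responsible for corrective action
    criticality: str      # e.g. "high" / "medium" / "low"
    due_date: str         # date the fix is due
    status: str = "open"  # present status

def open_items(actions):
    """Items still open, to be carried into the next report."""
    return [a for a in actions if a.status != "closed"]

report = [
    CorrectiveAction("flow meter drift", "J. Smith", "high", "1976-04-01"),
    CorrectiveAction("late log entries", "R. Jones", "low", "1976-03-15",
                     status="closed"),
]
carried = open_items(report)
```

Carrying `open_items` forward from report to report is what makes the loop of item 10.1 "closed": nothing drops out of sight until its status is closed.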
                                      40

-------
A.11  CALIBRATION PROCEDURES
                                                                     SCORE

    (11.1)  Calibration procedures are:

            (a)  Clearly defined and written out in step-by-step
                 fashion for each measurement system and support
                 device.                                               5

            (b)  Defined and summarized for each system and device.    3

            (c)  Defined but operational procedures developed by
                 the individual.                                       1
    (11.2)  Calibration procedures as written are:

            (a)  Judged to be technically sound and consistent with
                 data quality requirements.                            5

            (b)  Technically sound but lacking in detail.              3

            (c)  Technically questionable and lacking in detail.       1


    (11.3)  Calibration standards are:

            (a)  Specified for all systems and measurement devices
                 with written procedures for assuring, on a con-
                 tinuing basis, traceability to primary standards.     5

            (b)  Specified for all major systems with written
                 procedures for assuring traceability to pri-
                 mary standards.                                       3

            (c)  Specified for all major systems but no procedures
                 for assuring traceability to primary standards.       1
    (11.4)  Calibration standards and traceability procedures as
            specified and written are:

            (a)  Judged to be technically sound and consistent
                 with data quality requirements.                       5

            (b)  Standards are satisfactory but traceability is
                 not verified frequently enough.                       3

            (c)  Standards are questionable.                           1
                                     41

-------
A.11  CALIBRATION PROCEDURES (continued)
                                                                     SCORE

    (11.5)  Frequency of calibration is:

            (a)  Established and documented for each measurement
                 system and support measurement device.                5

            (b)  Established and documented for each major meas-
                 urement system.                                       3

            (c)  Established and documented for only certain
                 measurement systems.                                  1
    (11.6)  A review of calibration data indicates that the
            frequency of calibration as implemented:

            (a)  Is adequate and consistent with data quality
                 requirements.                                         5

            (b)  Results in limits being exceeded a small frac-
                 tion of the time.                                     3

            (c)  Results in limits being exceeded frequently.          1
    (11.7)  A review of calibration history indicates that:

            (a)  Calibration schedules are adhered to and results
                 fully documented.                                      5

            (b)  Schedules are adhered to most of the time.             3

            (c)  Schedules are frequently not adhered to.              1
    (11.8)   A review of calibration history and data validation
            records indicates that:

            (a)   Data are always invalidated and deleted when
                  calibration criteria are exceeded.                   5

            (b)   Data are not always invalidated and/or deleted
                  when criteria are exceeded.                          3

            (c)   Data are frequently not invalidated and/or deleted
                  when criteria are exceeded.                          1
                                      42

-------
A.11  CALIBRATION PROCEDURES (continued)
                                                                     SCORE

    (11.9)  Acceptability requirements for calibration results
            are:

            (a)  Defined for each system and/or device requiring
                 calibration including elapsed time since the
                 last calibration as well as maximum allowable
                 change from the previous calibration.                 5

            (b)  Defined for all major measurement systems.             3

            (c)  Defined for some major measurement systems only.       1
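The two acceptability checks named in item 11.9(a) can be sketched as a single test. This is an illustrative sketch only; the function name and the limits (30 days, 0.02 units) are hypothetical, not values from this report:

```python
# Illustrative sketch (not from the report): the two acceptability
# checks item 11.9(a) names -- elapsed time since the last calibration
# and maximum allowable change from the previous one.  The limits
# (30 days, 0.02 units) are hypothetical.

def calibration_acceptable(days_since_last, prev_value, new_value,
                           max_days=30, max_change=0.02):
    """True only if the calibration is recent enough and has not
    shifted more than the allowable amount since the previous one."""
    if days_since_last > max_days:
        return False              # calibration overdue
    if abs(new_value - prev_value) > max_change:
        return False              # drift beyond the allowable change
    return True

ok = calibration_acceptable(days_since_last=14,
                            prev_value=1.000, new_value=1.012)
```

A failed check would feed the data invalidation procedures of item 11.8 and, through the calibration histories of item 11.11, any revision of the calibration frequency.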


   (11.10)  Acceptability requirements for calibration results as
            written are:

            (a)  Adequate and consistent with data quality require-
                 ments.                                                5

            (b)  Adequate but others should be added.                  3

            (c)  Inadequate to ensure data of acceptable quality.      1


   (11.11)  Calibration records (histories) are:

            (a)  Utilized in revising calibration schedules (i.e.,
                 frequency).                                           5

            (b)  Utilized when specific questions arise and re-
                 viewed periodically for trends, completeness,
                 etc.                                                  3

            (c)  Utilized only when unusual problems occur.             1


A.12  FACILITIES/EQUIPMENT


    (12.1)  Facilities/Equipment are:

            (a)  Adequate to obtain acceptable results.                5

            (b)  Adequate to obtain acceptable results most of
                 the time.                                             3

            (c)  Inadequate; additional facilities and space are
                 needed.                                               1
                                      43

-------
A.12  FACILITIES/EQUIPMENT (continued)
                                                                     SCORE

    (12.2)  Facilities, equipment, and materials are:

            (a)  As specified in appropriate documentation and/or
                 standards.                                            5

            (b)  Generally as specified in appropriate standards.      3

            (c)  Frequently different from specifications.             1


    (12.3)  Housekeeping reflects an orderly, neat, and
            effective attitude of attention to detail in:

            (a)  All of the facilities.                                5

            (b)  Most of the facilities.                               3

            (c)  Some of the facilities.                               1


    (12.4)  Maintenance Manuals are:

            (a)  Complete and readily accessible to maintenance
                 personnel for all systems, components, and
                 devices.                                              5

            (b)  Complete and readily accessible to maintenance
                 personnel for all major systems, components, and
                 devices.                                              3

            (c)  Complete and accessible for only a few of the
                 systems.                                              1


A.13  RELIABILITY
    (13.1)  Procedures for reliability data collection, processing,
            and reporting are:

            (a)  Clearly defined and written for all system com-
                 ponents.                                              5

            (b)  Clearly defined and written for major components
                 of the system.                                        3

            (c)  Not defined.                                          1
                                      44

-------
A.13  RELIABILITY (continued)
                                                                     SCORE

    (13.2)  Reliability data are:

            (a)  Recorded on standard forms.                           5

            (b)  Recorded as operator/analyst notes.                   3

            (c)  Not recorded.                                         1


    (13.3)  Reliability data are:

            (a)  Utilized in revising maintenance and/or replace-
                 ment schedules.                                       5

            (b)  Utilized to determine optimum parts inventory.        3

            (c)  Not utilized in any organized fashion.                1
                                      45

-------
APPENDIX B   STANDARD TECHNIQUES USED IN
             QUANTITATIVE PERFORMANCE AUDITS
                     46

-------
                              Table 1.  Physical measurements

Property    Measurement methods          Calibration methods             Audit techniques

Density     a.  Vibrating U-tube         Take sample, get weight and     Frequency:  Before start and at
            b.  Mass/flow meter          volume at process temperature,    end of demonstration,
            c.  Bubble tube              and calculate density.            monthly in between.
                                                                         Technique:  Use of appropriate
                                                                           laboratory weight and volume
                                                                           measures.

Flow        a.  Orifice meter            Primary method is to remove     Frequency:  Before start and at
                i.   manometer           meter from process and cali-      end of demonstration,
                ii.  differential        brate on test stand.              monthly in between.
                     pressure cell       Secondary method is calibra-    Remove sensor element and in-
                iii. mechanical gauges   tion of elements following        spect for corrosion or fouling.
                iv.  electrical cells    sensor.                         Carry out manufacturer recom-
            b.  Pitot tube                                                 mended calibration procedure.
                i.   manometer                                           For transducer and output,
                ii.  mechanical gauges                                     apply substitute signal and
                iii. electrical cells                                      calibrate.
                iv.  differential
                     pressure cell
            c.  Venturi meter
            d.  Magnetic flow meter
            e.  Ultrasonic flow meter

Humidity    a.  Wet bulb/dry bulb        Calculation of humidity from    Frequency:  Before start and at
                thermometers             wet and dry bulb measurements     end of demonstration, weekly
            b.  Dewpoint meters          and psychrometric relations.      for wet/dry bulb, monthly
            c.  Electronic humidity                                        for others.
                cells                                                    Technique:  Remove sensor and
            d.  Fluidic                                                    subject to air stream having
                                                                           wet/dry apparatus for com-
                                                                           parison.

-------
                              Table 1.  Physical measurements (con.)

Property      Measurement methods          Calibration methods            Audit techniques

Level         a.  Bubble tube              Measure level with sight       Frequency:  Before start and at
              b.  Float                    glass or dip stick.              end of demonstration,
              c.  Conductivity cell                                         monthly in between.
              d.  Capacitance cell                                        Technique:  Measure level at
              e.  Differential                                              several points in range and
                  pressure cell                                             compare to readout.
              f.  Ultrasonic
              g.  Sight glass

Pressure,     a.  Mechanical gauge         Use of dead weight tester is   Frequency:  Before start and at
differential  b.  Manometer                primary standard.  Secondary     end of demonstration,
pressure      c.  Electrical pressure      standards are                    monthly in between.
                  cell                     a.  Manometer with known       Technique:  Pressure sensors to
              d.  Differential pressure        fluid                        be provided with test taps
                  cell                     b.  Precision mechanical         and valves, also tap for
                                               gauges                       secondary source for d/p
                                           c.  Standard electrical          or electrical cells.
                                               pressure cells             Manometer is preferred for
                                                                            calibration in field where
                                                                            possible.

Temperature   a.  Thermocouple             Comparison to reference point  Frequency:  Before start and at
              b.  Resistance tempera-      a.  Ice point H2O                end of demonstration,
                  ture detector            b.  Boiling point H2O            monthly in between.
              c.  Thermistor               c.  Standard thermometer       Techniques:  Remove sensor,
              d.  Filled bulb              d.  Electronic standard          insert in reference tempera-
              e.  Mercury thermometer                                       ture.  For nonremovable
                                                                            sensor, measure output of
                                                                            sensor, and insert substitute
                                                                            signal into instrument input
                                                                            and check calibration.

-------
                       Table 2.  Gas effluent streams

Material:  Carbon monoxide
Measurement method:  Method 10 (continuous)
Calibration method:  Standard calibration gas.
Audit techniques:  Provide SRM* for measurement.

Material:  Nitrogen oxide
Measurement method:  Method 7 (grab)
Calibration method:  Calibrate sampling train components and use
  control samples for the analysis phase.
Audit techniques:
  1. Independent duplicate sampling and analysis.
  2. Review and observe operating procedures, check calibration of
     train components, and prepare blind samples for the field team
     to measure.

Measurement method:  Continuous
Calibration method:
  1. Use standard calibration gases, plus
  2. Compare results to Method 7.
Audit techniques:
  1. Provide NO/NO2 calibration gas:  NBS-SRM (analysis phase).
  2. Compare to Method 7 (total measurement method).

Material:  Particulates
Measurement method:  Method 5 (sampling train)
Calibration method:  Calibrate components of sampling train:  pitot
  tube, dry gas meter, orifice meter, temperature measurement
  devices, probe heater, filter holder, temperature system.
Audit techniques:
  1. Audit of total method by independent, simultaneous measurement
     from sampling through analysis.
  2. Calibration check on sample train components, percent isokinetic
     rate, and visual observation of operating procedures.

Measurement method:  Optical (transmissometer)
Calibration method:  Filters
Audit techniques:  Calibration check with an independent set of NBS
  filters.

     *Standard Reference Material, from the National Bureau of Standards.
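The Method 5 audit above includes a calibration check on the percent isokinetic rate. As an illustrative sketch only (the report does not give the equation), the simplified definition, gas velocity into the sampling nozzle divided by stack gas velocity, can be computed as follows. The full Method 5 equation adds moisture and standard-condition corrections; the function name and example numbers here are hypothetical.

```python
# Sketch: percent-isokinetic check for a Method 5-type sampling train.
# Simplified definition (nozzle gas velocity / stack gas velocity); the
# full Method 5 equation adds moisture and temperature corrections.
import math

def percent_isokinetic(sample_vol_m3, sample_time_s,
                       nozzle_diam_m, stack_velocity_m_s):
    """Percent isokinetic rate: 100 x (velocity into nozzle / stack velocity).

    sample_vol_m3 is the gas volume drawn through the nozzle,
    corrected to stack conditions.
    """
    nozzle_area = math.pi * (nozzle_diam_m / 2.0) ** 2
    nozzle_velocity = sample_vol_m3 / (sample_time_s * nozzle_area)
    return 100.0 * nozzle_velocity / stack_velocity_m_s

# Example: 1.2 m3 drawn in 60 min through a 12.7 mm nozzle,
# stack gas moving at 2.6 m/s
rate = percent_isokinetic(1.2, 3600.0, 0.0127, 2.6)
print(round(rate, 1))  # about 101 %
```

Method 5 treats a run as acceptable when the isokinetic rate falls roughly within 90 to 110 percent, which is the sort of limit an auditor would check here.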

-------
                    Table 2.  Gas effluent streams (con.)

Material:  Sulfur dioxide
Measurement method:  Method 6 (batch)
Calibration method:  Calibrate sampling train components and use
  standard samples for the analysis phase.
Audit techniques:
  1. Independent duplicate sampling and analysis.
  2. Review/observe operating procedures, check calibration of train
     components, and analyze split samples and/or prepare blind
     samples for the field team to measure.

Measurement method:  Method 12 (continuous)
Calibration method:
  1. Use standard calibration gases, plus
  2. Use calibrated absorbance filter furnished by the instrument
     manufacturer.
Audit techniques:
  1. Compare to Method 6.
  2. Provide SRM for measurements.

-------
                Table 3.  Liquid streams, suspended solids

Material:  Liquid stream samples:  percent solids, ionic species.
Measurement method:  Standard chemical and instrumental techniques
  (atomic absorption spectrometry, potentiometry, amperometry, etc.)
Calibration method:  Calibrate glassware, balances, instruments.
Audit techniques:  Control samples, split samples.

Material:  Effluent solids; e.g., percent water, CaO, SO3-2, SO4-2,
  CO2, inerts, etc.*
Measurement method:  Standard gravimetric, volumetric, instrumental
  techniques (such as X-ray fluorescence).
Calibration method:  Calibrate volumetric, gravimetric, instrumental
  measurement devices (glassware, balances, meters, etc.)
Audit techniques:  Control samples, split samples.

Material:  pH
Measurement method:
  a. pH cell
  b. Indicators (test papers)
  c. Wet chemical
Calibration method:
  a. Subject to known pH buffer solution.
  b. Sample stream and measure pH with independent method/instrument.
Audit techniques:
  Frequency:  Daily to weekly.
  Technique:  Remove pH cell from process and insert in buffered
    solutions in range of process pH.

  *Typical of solids composition in the effluent of a wet limestone
   SO2 scrubber.
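The audits in the table above rely on control samples and split samples, where a field result is compared with an independent laboratory result on the same material. One simple way to screen such pairs is by relative percent difference against a control limit, sketched below; the 10 percent limit and the example values are illustrative assumptions, not taken from this report.

```python
# Sketch: a simple split-sample audit screen.  For each pair
# (field result, audit-lab result) compute the relative percent
# difference and flag pairs exceeding a control limit.
# The 10 % limit is illustrative, not from this report.

def relative_percent_difference(x, y):
    """RPD = 100 x |x - y| / mean(x, y)."""
    return 100.0 * abs(x - y) / ((x + y) / 2.0)

def audit_split_samples(pairs, limit_pct=10.0):
    """Return the pairs whose RPD exceeds limit_pct."""
    return [(x, y) for x, y in pairs
            if relative_percent_difference(x, y) > limit_pct]

# Hypothetical field vs. audit-lab percent-solids results
pairs = [(12.1, 12.4), (8.9, 9.1), (15.2, 17.6), (10.0, 10.2)]
print(audit_split_samples(pairs))  # -> [(15.2, 17.6)]
```

Flagged pairs would then trigger the follow-up actions the report describes, such as reviewing operating procedures and rechecking calibration.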

-------
                               TECHNICAL REPORT DATA
                          (Please read Instructions on the reverse before completing)
 1. REPORT NO.
 EPA-600/2-76-081
                            2.
                                                        3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
 Guidelines for Demonstration Project Quality
 Assurance Programs
                                                      5. REPORT DATE
                                                      March 1976
                                                      6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)

 James Buchanan
                                                       8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
 Research Triangle Institute
 P.O. Box 12194
 Research Triangle Park, NC 27709
                                                       10. PROGRAM ELEMENT NO.
                                                      EHB-557: ROAP ABA-011
                                                      11. CONTRACT/GRANT NO.

                                                      68-02-1398, Task 20
 12. SPONSORING AGENCY NAME AND ADDRESS
 EPA, Office of Research and Development
 Industrial Environmental Research Laboratory
 Research Triangle Park, NC 27711
                                                      13. TYPE OF REPORT AND PERIOD COVERED
                                                      Task Final; 7-12/75
                                                      14. SPONSORING AGENCY CODE
                                                       EPA-ORD
15. SUPPLEMENTARY NOTES  Project officer for this report is L.D. Johnson, Mail Drop 62,
 Ext 2557.
16. ABSTRACT
          The report presents general guidelines for planning and implementing
 quality assurance programs at EPA/IERL-RTP demonstration projects.  Because
 quality assurance, a system of activities whose purpose is to assure that overall
 quality is being controlled effectively, requires a thorough understanding of quality
 control,  the report initially addresses the major components of a project quality
 control program, including a discussion of quality control in the request for proposal,
 the proposal,  and the work plan.  The two major functional areas of quality assurance
 are the qualitative systems review and the quantitative performance audit. A
 detailed  checklist is provided to aid in the systems review,  and three tables provide
 general information on available techniques for the performance audit.  These tables
 cover the auditing of physical measurements, gas effluent streams, and liquid pro-
 cess streams.  The report is designed for project officers, contractors,  and  others
 concerned with quality assurance programs at IERL-RTP demonstration projects.
17.                KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS:  Air Pollution, Quality Assurance, Quality Control,
   Systems, Reviewing, Performance, Auditing, Measurement, Flue Gases,
   Industrial Processes
b. IDENTIFIERS/OPEN-ENDED TERMS:  Air Pollution Control, Stationary
   Sources, Liquid Process Streams
c. COSATI Field/Group:  13B, 13H, 14D, 21B, 14B, 05A
18. DISTRIBUTION STATEMENT:  Unlimited
19. SECURITY CLASS (This Report):  Unclassified
20. SECURITY CLASS (This page):  Unclassified
21. NO. OF PAGES:  58
22. PRICE
EPA Form 2220-1 (9-73)

-------