EPA

United States
Environmental Protection
Agency

Industrial Environmental Research
Laboratory
Cincinnati OH 45268

EPA-600/9-79-046
December 1979

Research and Development

Quality Assurance
Guidelines for IERL-Ci
Project Officers
EPA-600/9-79-046
December 1979
QUALITY ASSURANCE GUIDELINES
FOR IERL-CI PROJECT OFFICERS
by
C.L. Stratton and J.D. Bonds
Environmental Science and Engineering, Inc.
Post Office Box 13454
Gainesville, Florida 32604
Contract No. 68-03-2656
Project Officer
Paul E. Mills
Quality Assurance Branch
Industrial Environmental Research Laboratory
Cincinnati, Ohio 45268
INDUSTRIAL ENVIRONMENTAL RESEARCH LABORATORY
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
CINCINNATI, OHIO 45268
DISCLAIMER
This report has been reviewed by the Industrial Environmental Research
Laboratory, U.S. Environmental Protection Agency, and approved for publica-
tion. Approval does not signify that the contents necessarily reflect the
views and policies of the U.S. Environmental Protection Agency, nor does
mention of trade names or commercial products constitute endorsement or
recommendation for use.
FOREWORD
When energy and material resources are extracted, processed, converted,
and used, the related pollutional impacts on our environment and even on our
health often require that new and increasingly more efficient pollution
control methods be used. The Industrial Environmental Research Laboratory-
Cincinnati (IERL-Ci) assists in developing and demonstrating new and improved
methodologies that will meet these needs both efficiently and economically.
These quality assurance guidelines consist of three sections:
(1) Quality assurance guidelines for procurement of projects requiring
sampling and analysis
(2) Quality assurance guidelines for monitoring of projects requiring
sampling and analysis
(3) Quality assurance guidelines for auditing of projects requiring
sampling and analysis
The first section provides guidelines and checklists to assist the
Project Officer in project conception, preparation of procurement requests,
evaluation of proposals, and recommendations for selection to the Contracting
Officer. The second section presents quality assurance aspects and
checklists to assist in monitoring projects from project initiation through
the final report; and the third section presents information in relation to
planning and conducting project audits.
Further information may be obtained through the Industrial Environmental
Research Laboratory's Quality Assurance Officer, Cincinnati, Ohio.
David G. Stephan
Director
Industrial Environmental Research Laboratory
Cincinnati, Ohio
ABSTRACT
This report presents quality assurance guidelines to assist Project
Officers in the procurement, monitoring, and auditing phases of extramural
projects requiring sampling and analysis.
The first section presents guidelines to insure that quality assurance
(QA) is adequately addressed during project conception and solicitation and
that prospective grantees are informed of QA requirements. A technical eval-
uation system is presented that should disqualify those offerers who do not
provide adequate sampling and analysis QA for the purposes of the program. A
checklist is provided for the evaluation of the quality assurance aspects of
proposals and grant applications.
In the second section, the Project Officer's responsibilities are
described for the initiation, monitoring, and satisfactory conclusion of
contracts, research and demonstration grants, and cooperative agreements of
the type normally funded by IERL-Ci. The three basic foundations of project
quality assurance are described: (1) the contractor's/grantee's quality
assurance program, (2) the Project Work Plan, and (3) quality assurance
monitoring of the performance of the contractor/grantee. Checklists are
included to assist the Project Officer in assessing the completeness of a
contractor's or grantee's overall QA program and his Project Work Plan.
The third section provides information concerning the scheduling and
performance of laboratory audits. Checklists are provided to assist in
performing the audit.
These quality assurance guidelines were submitted in fulfillment of
Contract No. 68-03-2656 by Environmental Science and Engineering, Inc. under
the sponsorship of the U.S. Environmental Protection Agency. This report
covers the period August 1978 to December 1979, and work was completed as of
December 1979.
CONTENTS
Disclaimer ii
Foreword iii
Abstract iv
Figures vii
Tables viii
1. QUALITY ASSURANCE GUIDELINES FOR PROCUREMENT OF PROJECTS REQUIRING
SAMPLING AND ANALYSIS
Introduction 1
Importance of Quality Assurance 1
Purpose of These Guidelines 2
Definitions of Quality Assurance and
Quality Control 3
QA in the Procurement Process 5
QA Criteria in Project Conception and Solicitation 8
Project Conception (Contracts) 8
Contract Solicitation 12
Project Conception and Application for
Research and Demonstration Grants 19
Use of Performance Test Samples in
Contractor/Grantee Selection 20
Cost Considerations 21
Preproposal Conference 23
QA Criteria in Technical Evaluation of Proposals and Grant
Applications 23
Technical Evaluation of Contract Proposals 23
QA Evaluation Criteria and Scoring Procedures
for Contracts 24
Written/Oral Discussions with Offerers 25
QA Evaluation of Grant Applications 28
Evaluation of Performance Test Samples 29
Evaluation of Previous Performance History 29
Cost Evaluation 30
Best and Final Offer Evaluation 31
Pre-Award Surveys 31
CONTENTS (Continued)
2. QUALITY ASSURANCE GUIDELINES FOR MONITORING OF
PROJECTS REQUIRING SAMPLING AND ANALYSIS
Introduction 35
Project Officer's QA Responsibilities 37
Contracts 37
Grants and Cooperative Agreements 40
Summary of the Project Officer's Role in
Achieving QA 42
The Contractor's/Grantee's QA Program 43
Elements of a QA Program 43
QA Program Checklist 47
The Project Work Plan 48
Elements of a Project QA Plan 48
Project Work Plan Checklist 50
Project QA Monitoring 50
Methods of Monitoring Contractor/Grantee QA 50
3. QUALITY ASSURANCE GUIDELINES FOR AUDITING OF
PROJECTS REQUIRING SAMPLING AND ANALYSIS
Introduction 53
Quality Assurance Audits 53
Quality Assurance in Audits 53
Quality Assurance Audit Guidelines 54
When to Conduct a Quality Assurance Audit 54
Audit Worksheet and Checklist 54
Use of Performance Test Samples 58
Conducting the Quality Assurance Audit 58
Review of Worksheet 58
Site Visit 60
Quality Assurance Audit Report 62
4. CONCLUSIONS
Procurement 67
Monitoring 67
Auditing 67
BIBLIOGRAPHY 68
CONTENTS (Continued)
APPENDICES
A. Quality Assurance Evaluation Criteria Checklist for
Proposals and Grant Applications Offering Sampling
and Analysis Services 71
B. Quality Control Performance/Reference Test Samples 75
C. EPA-Accepted Analytical Methods 79
D. Quality Assurance Program Checklist 93
E. Project Quality Assurance Plan Checklist 105
F. Quality Assurance Pre-Audit Worksheet 113
G. Instrumentation, Equipment, and Personnel Skill
Rating for Specific Methods 126
H. Sample Preservation Methods and Recommended
Holding Times 146
I. Quality Assurance Audit Checklist 150
GLOSSARY 179
FIGURES

Number                                                                   Page

 1  Quality assurance (QA) and quality control (QC) at IERL-Ci  . . . .    4
 2  Processing sequence for contract source evaluation and selection  .    6
 3  Processing sequence for research and demonstration grants and
    cooperative agreements  . . . . . . . . . . . . . . . . . . . . . .    9
 4  Example of use of QA evaluation criteria checklist for scoring
    proposals . . . . . . . . . . . . . . . . . . . . . . . . . . . . .   26
 5  Example of technical evaluation scoring system  . . . . . . . . . .   27
 6  Foundations of project quality assurance  . . . . . . . . . . . . .   36
 7  Contracts administration  . . . . . . . . . . . . . . . . . . . . .   38
 8  Grants and cooperative agreements administration  . . . . . . . . .   41
 9  Quality assurance audit report  . . . . . . . . . . . . . . . . . .   63
TABLES

Number                                                                   Page

 1  Assignment of Evaluation Points for QA  . . . . . . . . . . . . . .   11
 2  Evaluation of the Importance of Sampling and Analysis QA  . . . . .   13
 3  Example of the Assessment of the Importance of Sampling and
    Analysis QA . . . . . . . . . . . . . . . . . . . . . . . . . . . .   15
 4  Example Technical Evaluation Criteria Incorporating Quality
    Assurance Requirements  . . . . . . . . . . . . . . . . . . . . . .   17
 5  Suggested Criteria for Using Performance Test Samples for
    Offeror Evaluation  . . . . . . . . . . . . . . . . . . . . . . . .   22
 6  Criteria for Conducting a Pre-Award Survey  . . . . . . . . . . . .   32
 7  Suggested Pre-Award Survey Agenda . . . . . . . . . . . . . . . . .   34
 8  Available Quality Assurance Audit Procedures  . . . . . . . . . . .   56
 9  Recommended Agenda for QA Audit Site Visit  . . . . . . . . . . . .   61
SECTION 1
QUALITY ASSURANCE GUIDELINES FOR PROCUREMENT OF
PROJECTS REQUIRING SAMPLING AND ANALYSIS
INTRODUCTION
Importance of Quality Assurance
Quality assurance (QA) encompasses all actions taken by an organization
to achieve accurate and reliable results for programs undertaken. An estab-
lished QA program is essential for any organization to produce valid sampling
and analytical data to support research, demonstration, and monitoring
efforts. QA requirements are becoming increasingly more important to the
Environmental Protection Agency (EPA) since:
1. The number of commercial laboratories and research institutions
participating in sampling and analysis programs is increasing,
2. The sampling and analytical equipment and procedures used are
becoming more varied and complex,
3. There is an increasing movement toward consolidated data bases,
4. More and more data must withstand legal scrutiny, and
5. Policy decisions of national economic importance must be made on the
basis of reliable data.
It is the responsibility of IERL-Ci to assist in developing and demon-
strating new and improved pollution control technology that will meet our
nation's needs both efficiently and economically. To accomplish this, our
research must conform to the highest practicable quality assurance standards.
Sampling and analytical programs are an important aspect of most research and
demonstration projects conducted by IERL-Ci. Decisions of substantial
technical and economic importance are often based on the data generated by
IERL-Ci internal sampling and analytical programs, by programs conducted by
contractors, and by grantees.*
The Contracting Officer and the Project Officer bear the responsibility
of procuring and directing sampling and analysis services for IERL-Ci
research and demonstration projects. To fulfill this responsibility, they
must be knowledgeable about the basic principles underlying currently
acceptable QA practices. These guidelines have been prepared to assist the
Project Officer and the Contracting Officer in procuring high quality sam-
pling and analytical services. They establish standards for acceptable QA
practices on IERL-Ci research and demonstration projects. Project Officers
and Contracting Officers should use these guidelines during preparation of
procurement requests and in the technical evaluation of offers and grant
applications.

* Throughout this document, the definition of the word "grant" includes
cooperative agreements.
This is the first in a series of three sections describing:
1. Quality assurance guidelines for procurement of projects requiring
sampling and analysis,
2. Quality assurance guidelines for monitoring projects requiring
sampling and analysis, and
3. Quality assurance guidelines for auditing projects requiring
sampling and analysis.
Together, these three sections should assist IERL-Ci Project Officers in
applying stringent, but equitable, quality assurance requirements throughout
the life of all research and demonstration projects that require sampling and
analysis. The requirements should apply to contractor, grantee, and in-house
programs equally.
This report is intended as a working document and will be revised as
necessary to more accurately describe processes, legal obligations, and
current policy.
The Director has assigned to the IERL-Ci Quality Assurance Officer the
responsibility of overseeing the quality assurance aspects of all IERL-Ci
contracts, research and demonstration grants, and in-house research and
demonstration projects. The Quality Assurance Officer has developed this
guideline document to assist in fulfilling that responsibility. He is
available to work with Project Officers, Contracting Officers, and in-house
program managers in any way possible to assure the quality of IERL-Ci
projects.
Purpose of These Guidelines
This section has been prepared to insure adequate consideration of QA
requirements in the procurement stage of IERL-Ci research and demonstration
projects. It provides specific guidelines to: (1) accurately reflect QA
requirements in the proposal solicitation stages of procurement; (2) solicit
the necessary QA information from prospective contractors, grantees, and
bidders to allow accurate evaluation of capabilities; (3) evaluate proposals
received with respect to QA; (4) evaluate past performance history of the
prospective contractor or grantee, if available; and (5) determine the
overall acceptability of an offerer's QA program with respect to the research
or demonstration project under consideration.
These guidelines are intended to be used by IERL-Ci Project Officers
responsible for procuring sampling and analysis services in support of
IERL-Ci research and demonstration projects. They should also be used in the
evaluation of research and demonstration grant applications when sampling and
analysis aspects are involved. Contracting Officers will find these guide-
lines of value in preparing Request for Proposal (RFP) packages when
soliciting service contracts that include sampling and analysis.
Definitions of Quality Assurance and Quality Control
The terms quality assurance (QA) and quality control (QC) are often used
synonymously, although they represent two distinct concepts. The following
definitions have been established by the Director, IERL-Ci.
Quality Control (QC): Actions taken by the Laboratory (IERL-Ci)
organization (on in-house projects) and by
contractors/grantees (on extramural projects)
in day-to-day activities to achieve desired
accuracy, reliability, and comparability in the
results obtained from sampling and analysis
activities. Review by contractors/grantees of
their overall quality control activities is
"quality assurance" to them, but "quality
control" from the Laboratory's viewpoint.
Quality Assurance (QA): Actions taken by the Laboratory (IERL-Ci) line
organization under the specific auspices of the
Office of the Director, to assure that quality
control policies and procedures are being
properly implemented and appropriate levels of
accuracy, reliability, and comparability are
being achieved in the sampling and analysis
activities (including data reduction and
handling) of the Laboratory to fulfill the
Laboratory's assigned mission.
In broad terms, QA is the overall program that specifies the quality
control practices applied to the many individual aspects of a program.
Figure 1 graphically presents QA and QC responsibilities at IERL-Ci for both
in-house and contractor/grantee sampling and analytical programs.
Simplified definitions of quality assurance and quality control have
been adopted by the Office of Monitoring and Technical Support, Office of
Research and Development to aid in eliminating confusion. "Quality assurance
is the total program for assuring the reliability of monitoring data.
Quality control is the routine application of procedures for controlling the
measurement process."
[Figure 1 is an organization chart relating the following elements: Office
of the Director; Quality Assurance Officer; Laboratory quality assurance;
divisional quality assurance; divisional in-house quality control of
in-house sampling and analysis; and external quality control of
contractor/grantee quality assurance, contractor/grantee quality control,
and contractor/grantee sampling and analysis.]

SOURCE: In-house memorandum from IERL-Ci Director, March 1978.

Figure 1. Quality assurance (QA) and quality control (QC) at IERL-Ci.
QA in the Procurement Process
Quality assurance begins with, and is most directly controlled in, the
procurement process. If the Contracting Officer and the Project Officer fail
to procure high quality sampling and analytical services, then successful
project accomplishment is seriously jeopardized from the beginning. Hence,
prime emphasis must be placed on giving adequate attention to QA during pro-
curement of sampling and analysis services in support of IERL-Ci programs.
Overview of the Contracts Procurement Process—
Many IERL-Ci research and demonstration projects are awarded on a
competitive contract basis to qualified commercial contractors. The
procurement process for contracts is described in Procurement Information
Notice (PIN) 77-15, which is entitled "Source Evaluation and Selection
Procedures." It is EPA policy that source selection and evaluation shall be
conducted in accordance with standards and procedures that insure fair and
impartial treatment of all offerers, and further insure the selection of
sources whose performance is expected to best meet EPA objectives.
The contract procurement processing sequence is illustrated in Figure 2.
This figure also indicates those points in the procurement process where QA
considerations should apply and the page of this section where each step in
the processing sequence is discussed.
The contract procurement process begins with conception of the work to
be performed. This conception is translated into a solicitation by comple-
tion of a procurement request rationale document with accompanying EPA Form
No. 1900-8 (Procurement Request/Requisition). QA requirements must be clear-
ly specified within the scope of work sections of the procurement request
rationale document and must be given a proper weighting in the evaluation
criteria. After advertisement for interested contractors, a Request for Pro-
posal (RFP) is prepared for the Project Officer by the Contracts Management
Division. The QA requirements delineated in the procurement request ration-
ale document are described in the RFP and will eventually become a part of
the contract. A preproposal conference may be requested by the Project
Officer and called by the Contracting Officer to further discuss procurement
requirements with prospective offerers. If this is the case, prospective
offerers should be advised of the level of importance attached to QA. After
receipt of offers, a series of technical reviews is conducted by the
Technical Evaluation Panel. QA aspects must be given proper consideration
during each of the technical reviews. The initial objective at this stage is
to disqualify those offerers who do not propose the minimally required QA,
provided the evaluation requirements have been clearly and adequately stated
in the RFP. Beyond that, it is of value to rank the responsive offerers with
respect to QA and all other technical aspects as stated in the proposal
evaluation criteria. A checklist has been provided in these guidelines to
assist in the ranking of offerers in this regard. The Technical Evaluation
Panel may elect to provide performance test samples to the responsive
offerers if this provision was stated in the RFP.
*PROJECT CONCEPTION (Page 8)
*PROCUREMENT REQUEST (Pages 8, 12, 21)
*DEVELOP EVALUATION CRITERIA (Pages 8, 12, 20)
PREPARE AND ISSUE SOLICITATION
*PREPROPOSAL CONFERENCE (OPTIONAL) (Page 23)
RECEIVE OFFERS
*PRELIMINARY TECHNICAL REVIEW (technically unacceptable offers are rejected)
*TECHNICAL AND COST EVALUATION (Pages 23, 24, 30)
DETERMINE COMPETITIVE RANGE (offers not in the competitive range are eliminated)
*CONDUCT WRITTEN/ORAL DISCUSSIONS (Page 25)
REQUEST "BEST AND FINAL" OFFERS
*FINAL EVALUATION (Pages 29, 31)
*PRE-AWARD SURVEY (OPTIONAL) (Page 31)
SELECT SOURCE FOR NEGOTIATIONS
CONDUCT NEGOTIATIONS
AWARD CONTRACT

*QA considerations are important at these points in the process.

Figure 2. Processing sequence for contract source evaluation and selection.
Qualified offerers are identified on the basis of technical review. A
business and cost evaluation of the offer is conducted at the same time.
Since the cost to conduct an effort requiring sampling and analysis tasks is
clearly affected by the QA methods employed, QA must be given proper consid-
eration in the business evaluation. The costs of the necessary QA program
for the sampling and analysis aspects are a legitimate charge to the project,
and the costs are covered in the contract by EPA. After both the technical
and business evaluation, the competitive range is established, and offerers
are notified by the Contracting Officer whether or not they are in the
competitive range.
According to agency regulations, written or oral discussions must be
conducted with all responsive offerers who have submitted proposals within
the competitive range. During these discussions, any uncertainties should be
resolved concerning the offerer's compliance with the specified QA aspects of
the effort. Following the receipt of "best and final" offers, it is fre-
quently of value to conduct a pre-award survey of the offerers in the final
competitive range. The pre-award survey provides an ideal opportunity for
the Project Officer to gain an in-depth knowledge of the QA programs of the
various offerers. For this reason, pre-award surveys should be encouraged
for projects in which sampling and analysis constitute a large portion of the
effort or are, in themselves, a major cost factor.
After "best and final" offers are received, a final evaluation, which
may include a pre-award survey, is conducted, and a determination made of the
source to be selected for negotiations. Negotiations are conducted, and the
business evaluation is completed by performance of a financial audit or a
cost advisory. When this step is successfully concluded, a contract is
awarded to the successful bidder.
Overview of the Research and Demonstration Grant or Cooperative Agreement
Procurement Process—
IERL-Ci research and demonstration projects are also conducted by the
granting of funds to qualified research institutions for the entire program
or for various portions of a program. The Project Officer's role in the
procurement process for research and demonstration grants is described in the
"EPA Project Officer's Guide (Research & Demonstration Grants)" (EPA,
undated). The procurement process for research and demonstration grants is
entirely different from that for contracts. Unlike contractors, grantees are not
normally selected on a competitive basis. The assigned Project Officer or
the prospective Project Officer for a research and demonstration grant has a
much more interactive role with the prospective grantee. He, in fact, may be
the principal EPA contact with the prospective grantee during the application
and processing stages, whereas this is specifically prohibited for contracts.
He has major authority in the selection of a grantee and the awarding of the
grant. Assistance is provided to the Project Officer by the Grants
Administration Division (GAD) in Headquarters. A technician in the Grants
Operations Branch of GAD is assigned to assist the Project Officer with
administrative matters during the procurement process and throughout the life
of the project.
The processing sequence for research and demonstration grants and
cooperative agreements is shown in Figure 3. QA considerations enter the
procurement process during preparation and technical review of the
application.
After preliminary discussions with the appropriate program office, a
research or demonstration grant application is normally submitted in an area
of particular interest to or expertise of the applicant. The Project Officer
may advise the applicant in the preparation of the formal grant application.
One objective at this point in the process is to assure that the applicant
gives adequate regard to the QA aspects of his work. The applicant should be
provided guidance on the QA requirements of the Laboratory (IERL-Ci) and, in
his application, should commit to meeting these requirements as stated in
Section 2, "Quality Assurance Guidelines for Monitoring of Projects Requiring
Sampling and Analysis."
QA aspects should be given an appropriate and equitable level of atten-
tion during the technical review of grant applications. Even though the
evaluation criteria are more subjective than for contracts, similar QA cri-
teria can be used for guidance as discussed in this section under "QA Eval-
uation of Grant Applications" on page 28. Pre-award surveys of prospective
grantees may be conducted and, in fact, are strongly encouraged when possi-
ble. Performance testing may also be used to evaluate an applicant.
During the decision process on whether or not to fund a grant, negotia-
tions with the applicant may occur. If this is the case, the Project Officer
works with the applicant to modify the proposal to meet the objectives of the
program office. For example, if the proposed QA procedures were found to be
inadequate or inappropriate, they may be adjusted by the applicant at this
time.
QA CRITERIA IN PROJECT CONCEPTION AND SOLICITATION
Project Conception (Contracts)
QA must be given proper consideration in the conceptual stages of a
project. Regardless of the magnitude of the sampling and analytical effort, a
project's sampling and analysis program may range from nonexistent to
critically important to the outcome of the work. Correspondingly, the
resulting data may be ancillary or vital to the final outcome of the effort.
Since cost effectiveness is another criterion to be considered, and since QA
programs affect cost, judgment must be used in stipulating QA requirements.
It is not categorically true that the more QA applied to a given sampling and
analysis program, the more effective that program will be.
The level of importance ascribed to QA in any given project must be
clearly transmitted through the Contracting Officer to those organizations
that will propose to accomplish the work. QA requirements are to be
specified in the procurement request rationale document with accompanying EPA
Form 1900-8 (Procurement Request/Requisition) and, ultimately, in the RFP
package prepared by the Contracting Officer. The QA requirements must be
adequately defined either in the scope of work section of the procurement
request document or by attachments to this document. The proper level of
importance must be assigned in the scoring system of the proposal evaluation
criteria.

PRE-APPLICATION
*PRE-APPLICATION DISCUSSIONS (Page 19)
*FORMAL GRANT APPLICATION (Page 19)
INITIAL ADMINISTRATIVE PROCESSING
RELEVANCE REVIEW (non-relevant applications are rejected and the applicant notified)
*TECHNICAL REVIEWS (Pages 28, 29)
*PRE-AWARD SURVEY (OPTIONAL) (Page 31)
FUNDING DECISION (rejected: applicant is notified; funded: negotiate, modify application, and award)

*QA considerations are important at these points in the process.

Figure 3. Processing sequence for research and demonstration grants and
cooperative agreements.
It is recommended that specific evaluation criteria and points be
assigned to the offerer's QA program when sampling and analysis is required
to accomplish the project. Guidance is provided in this section for assign-
ing evaluation points. Many other criteria routinely applied in the evalua-
tion of proposals are closely allied with QA considerations and should be
viewed in this regard, however. For example, proposal evaluation criteria
and points are generally assigned to demonstrated qualifications and experi-
ence of key project personnel. The availability of highly competent person-
nel on the offerer's staff is clearly a benefit to overall QA. Furthermore,
an offerer must have or must obtain suitable sampling equipment and analy-
tical instrumentation in order to accomplish high quality work. Since
facilities and personnel qualifications are generally awarded evaluation
points, QA aspects will normally enter into the evaluation criteria under
several headings.
As general guidance, when QA of the sampling and analytical program is
of critical importance to the success of a project, up to 30 percent of the
evaluation points should be assigned specifically to this area. If an
offerer fails to show competence in QA, he should be technically disqualified
overall and should not be privileged to receive further technical review. An
offerer may be disqualified by this method only if it is clearly stated in
the RFP. Conversely, if the sampling and analytical effort is only of
ancillary importance to the project, as few as 5 percent of the total
evaluation points should be assigned to QA. This should be a minimum point
assignment. If QA of the sampling and analysis effort is not worth at least
5 percent of the total technical evaluation points, then the necessity of the
sampling and analysis effort should be questioned. The individual preparing
the procurement request must make a judgment of the relative importance of QA
efforts in order to assign the proper weighting within the range of 5 to
30 percent of the total technical evaluation points. The point range should
be adjusted so as to disqualify offers where the proposed QA program is
totally inadequate for the purposes and objectives of the project.
The relative importance of QA efforts should be assessed by determin-
ation of the ultimate use of the sampling and analytical data, the methods
required, and the magnitude of the effort. Table 1 is provided to assist the
Project Officer, who prepares the procurement request, in assigning technical
evaluation points for the QA program offered by each RFP respondent. The
point ranges in Table 1 are broad and overlapping. The Project Officer must,
therefore, use his judgment in selecting the appropriate point assignment,
taking into consideration all other evaluation criteria and technical aspects
of the project.
TABLE 1. ASSIGNMENT OF EVALUATION POINTS FOR QA

Relative importance of                                 Suggested assignment
sampling and analysis data                             of points (% of total)

Sampling and analysis data of only ancillary
importance to the overall objectives of the
project or sampling and analysis are only a
very small portion of the total effort                      5 (minimum)

Semi-quantitative data is adequate                          5-10

Data will be used for screening purposes
and will probably be validated by other data                10-15

Data will be used to make economically-
important decisions on equipment design                     15-25

Sampling and analysis is a substantial
portion of the total effort                                 20-25

Data will be used for regulatory support                    25-30

All data will clearly be subject to legal
scrutiny and defense                                        30
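For illustration only, Table 1 can be read as a simple lookup from the
dominant use of the data to a suggested point range. The sketch below is one
way to express it; the abbreviated category labels and the function name are
not part of these guidelines.

TABLE_1_RANGES = {
    "ancillary data or very small portion of effort": (5, 5),   # minimum
    "semi-quantitative data adequate": (5, 10),
    "screening, probably validated by other data": (10, 15),
    "economically important equipment design decisions": (15, 25),
    "substantial portion of total effort": (20, 25),
    "regulatory support": (25, 30),
    "legal scrutiny and defense": (30, 30),
}

def suggested_qa_points(category):
    # Suggested (low, high) percent of total technical evaluation points.
    return TABLE_1_RANGES[category]

print(suggested_qa_points("regulatory support"))  # prints (25, 30)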
Table 2 may be of further assistance to the Project Officer. This table
is a checklist that can be used to assess the relative level of importance of
the sampling and analysis QA program by rating the requirements of the pro-
gram based on the questions listed. Ten questions are provided, with a pos-
sible total score of 30 points. The Project Officer should rate his response
to each question and sum the points. The total corresponds to the suggested
approximate assignment of points to QA in the technical evaluation criteria,
expressed as a percent of the maximum evaluation point score for all
technical criteria. The suggested
point assignment arrived at by use of Table 2 should also correspond with the
appropriate ranges suggested in Table 1. Hence, either approach can be used
to determine the approximate value of technical evaluation points for Quality
Assurance.
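As an illustration of the Table 2 arithmetic, the following sketch sums ten
0-to-3 ratings and reads the total directly as the approximate percent of
technical evaluation points to assign to QA. The example ratings shown are
hypothetical, chosen only to reproduce the 23-point total of the example that
follows; the actual ratings behind Table 3 are not reproduced here.

def qa_point_percentage(ratings):
    # One rating of 0 (low) to 3 (high) per Table 2 question; the sum is
    # the approximate percent of total technical evaluation points to
    # assign to QA, subject to the 5 percent minimum noted in the text.
    if len(ratings) != 10:
        raise ValueError("one rating per Table 2 question is required")
    if any(r not in (0, 1, 2, 3) for r in ratings):
        raise ValueError("ratings must be 0, 1, 2, or 3")
    return max(5, sum(ratings))

# Hypothetical ratings totaling 23, as in the shale oil example below.
print(qa_point_percentage([1, 2, 3, 2, 3, 2, 3, 3, 2, 2]))  # prints 23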
Example: Requirement to demonstrate the feasibility of a new type of
waste treatment process in the removal of numerous contaminants from waste-
waters produced by shale oil recovery processes. Bench scale tests have
shown the treatment process works exceptionally well, and it has been decided
by EPA to fund a pilot plant study. If the results from the pilot treatment
plant prove successful, further scale-up will be attempted, and the shale oil
industry may be required to install this type of treatment facility at each
location at a cost of millions of dollars. The approved funding for this
contract is $450,000, and the Project Officer estimates that $50,000 will be
expended for analytical laboratory services. During pilot plant tests,
wastewaters typical of the shale oil industry will be sampled and analyzed.
The Project Officer, in this example, may evaluate the importance of
sampling and analytical QA as shown in Table 3. A total of 23 points has
been assigned. Hence, the Project Officer should assign points within the
range of 20-25 percent of the total evaluation points to sampling and
analytical QA.
Alternatively, the Project Officer may have drawn the same conclusion by
assessing the evaluation points in Table 1. Since the data clearly will be
used to make economically-important decisions and to design major equipment,
the point total for QA should lie in the high end of the 15-25 percent
range.
Once the proper relative weighting for QA is established, the evaluation
criteria for this solicitation may appear as shown in Table 4. It should
also be stated in the RFP that a bidder must show adequate qualifications in
each Quality Assurance area to be qualified for consideration with regard to
the other technical areas.
Contract Solicitation
The Contracting Officer will prepare a Request for Proposal (RFP) pack-
age based on the information submitted in the procurement request rationale
document with accompanying EPA Form 1900-8 (Procurement Request/Requisition).
TABLE 2. EVALUATION OF THE IMPORTANCE OF SAMPLING AND ANALYSIS QA
                                      Low                              High
Evaluation considerations        0 (<10%)   1 (10-50%)   2 (50-90%)   3 (>90%)
What percent of the total project effort
is related to sampling and analysis?
What is the probability the resulting
data or conclusions drawn from the data
will be subject to legal scrutiny?
What is the probability the data will
be used to make economically-important or
policy decisions?
What is the likelihood the data will be
used to set compliance standards or
performance standards?
What is the probability the data will be
used for equipment or facilities design?
What possibility exists that the data will
be used for regulatory support functions?
Is the data to be used only for screening
purposes or for validation purposes? If
for screening only, assign relatively low
importance (0 to 1). If for validation,
assign relatively high importance (2 or 3).
Is semi-quantitative data adequate for the
needs of the project, or is a high degree
of accuracy required? Assign importance
relative to the degree of quantitative
accuracy required: 0 if semi-quantitative
data is adequate; 3 if the highest possible
accuracy is necessary.
(continued)
TABLE 2 (continued)
                                      Low                              High
Evaluation considerations        0 (<10%)   1 (10-50%)   2 (50-90%)   3 (>90%)
Are a variety of sampling and analytical
methods required, including complex
methods? Assign relative importance,
compared to other IERL-Ci projects,
based on the degree of variety and
complexity of the methods required.
What is the possibility that the sampling
and analytical data will be included in a
consolidated data base?
TOTAL POINTS
Approximate percent of proposal
technical evaluation points to
be assigned to QA procedures.
TABLE 3. EXAMPLE OF THE ASSESSMENT OF THE IMPORTANCE
OF SAMPLING AND ANALYSIS QA
                                      Low                              High
Evaluation considerations        0 (<10%)   1 (10-50%)   2 (50-90%)   3 (>90%)
What percent of the total project effort
is related to sampling and analysis?
What is the probability the resulting
data or conclusions drawn from the data
will be subject to legal scrutiny?
What is the probability the data will
be used to make economically-important or
policy decisions?
What is the likelihood the data will be
used to set compliance standards or
performance standards?
What is the probability the data will be
used for equipment or facilities design?
What possibility exists that the data will
be used for regulatory support functions?
Is the data to be used only for screening
purposes or for validation purposes? If
for screening only, assign relatively low
importance (0 to 1). If for validation,
assign relatively high importance (2 or 3).
Is semi-quantitative data adequate for the
needs of the project, or is a high degree
of accuracy required? Assign importance
relative to the degree of quantitative
accuracy required: 0 if semi-quantitative
data is adequate; 3 if the highest possible
accuracy is necessary.
(continued)
TABLE 3 (continued)
                                      Low                              High
Evaluation considerations        0 (<10%)   1 (10-50%)   2 (50-90%)   3 (>90%)
Are a variety of sampling and analytical
methods required, including complex
methods? Assign relative importance,
compared to other IERL-Ci projects,
based on the degree of variety and
complexity of the methods required.
What is the possibility that the sampling
and analytical data will be included in a
consolidated data base?
TOTAL POINTS                                                              23
Approximate percent of proposal
technical evaluation points to
be assigned to QA procedures.
TABLE 4. EXAMPLE TECHNICAL EVALUATION CRITERIA
INCORPORATING QUALITY ASSURANCE REQUIREMENTS
Evaluation criteria Numerical weight
I. Adequacy of the Technical Proposal 60
a. Logic of approach to the study 20
b. Proposed pilot plant design 20
c. Presentation of findings 20
II. Project Management 50
a. Previous experience the Project 15
Manager has had in this type of
effort
b. Company resources available to the 25
Project Manager
c. Project management organization and 10
plan
III. Quality Assurance 50
a. Quality assurance management policy/ 10
written procedures
b. Quality assurance procedures for 15
sampling
c. Quality assurance procedures for 15
analysis
d. Quality assurance procedures for 10
data management
IV. Personnel Qualifications 40
a. Technical experience of the principal 20
project staff related to the project
b. Educational background of principal 20
project staff
TOTAL 200
The RFP must be written in such a manner as to elicit adequate information
from the offerers so the Technical Evaluation Panel can evaluate each
offerer's technical proposal as well as the level of QA. In addition, the
evaluation criteria must be clearly stated so that all offerers will attach
the appropriate level of importance to QA aspects of the effort.
There are two conceptual approaches to defining QA requirements in the
RFP. One is to state clearly and unequivocally in the scope of work section
the QA and QC steps required of the contractor. It may be stated, for
example, that all samples are to be collected in duplicate using a specified
technique and that certain analytical precision must be achieved and
demonstrated. Another approach is to state clearly that QA and QC are of
importance and that the offerer is to explain his proposed procedures.
Either approach is acceptable, but both have shortcomings.
If the required QA and QC steps are specifically delineated, it is
likely that proposals will be submitted that simply "parrot" those require-
ments. The offeror cannot very well do otherwise. As a result, technical
evaluation of proposals is simplified and cost comparisons are more easily
made. The Project Officer and the Contracting Officer will know exactly the
level of QC that will be applied, since it has been specified. This approach
is recommended for selection of contractors when the scope of work is clearly
defined, and selection can be made on costs proposed by basically qualified
bidders. It is generally impractical, however, to apply this approach to
less well-defined efforts. In such cases, statements should be made as to
the importance of QA to the overall success of the program, and the offeror
should be free to propose QA procedures appropriate to the nature of the
sampling and analytical effort. This will allow a better evaluation of the
offerer's knowledge of QA procedures. The offeror should be asked to provide
a description of his QA administration and procedures, to show an awareness
of proper sampling and analytical techniques, or to describe his methods of
documenting data quality and of handling data. The Project Officer may
choose to restate the applicable questions listed in Table 2 as statements of
fact so that the offeror will understand lERL-Ci's interest in QA on the
project.
In the case of the pilot plant example previously discussed, a statement
may be made in the RFP to the effect that: "The performance results of the
pilot plant to be constructed and evaluated under this contract may be used,
in part, to determine EPA policy on the applicability of this method of waste
treatment for the shale oil industry. For this reason, considerable impor-
tance is attached to quality assurance of the sampling and analytical aspects
of the scope of work. Offerers are requested to describe, in detail, their
quality assurance procedures as they would apply to this program."
Specific provisions may also be used in a solicitation to define the
level of QA required and to reduce the number of non-responsive proposals.
Some typical provisions are as follows:
1. "If subcontractors are to be used in this effort, the prime
contractor has full responsibility for quality assurance of the
subcontractor's efforts as well as his own efforts."
2. "Offerers whose proposals are determined to be technically accep-
table under the initial evaluation criteria stated herein shall
(may) be required to demonstrate acceptable performance by analyzing
not more than ____ unknown samples for ____ parameters per sample.
The cost for analysis of these samples shall be at the offerer's
expense. Sample analysis results shall be compared to known EPA
results for scoring purposes."
3. "Split sample analysis will be required from time to time during the
course of the project to validate the quality of the data. Split
samples shall not exceed a maximum of one percent of the total
samples analyzed."
4. "lERL-Ci has established basic quality assurance guidelines relating
to the sampling and analysis required under this contract. These
guidelines are attached. The successful offerer's quality assurance
procedures should generally conform to these guidelines or
significant deviations must be justified and approved by the Project
Officer."
The Quality Assurance Officer is familiar with the applicability of
typical provisions that may be used in any particular case. He should be
contacted for guidance in this regard.
Project Conception and Application for Research and Demonstration Grants
As discussed previously, the Project Officer may be the key contact
person between EPA and a prospective grantee. The Agency, through the Pro-
ject Officer, should make sure that each applicant for grant funds receives
full and impartial consideration of his application and early notification of
its disposition. Although grant applications may be submitted at any time on
any research and development project, EPA encourages pre-application contact,
which can benefit both the applicant and the Agency by avoiding time-
consuming preparation and review of applications that cannot be funded and by
sharpening the focus of proposed projects in terms of EPA needs. It is the
responsibility of the Project Officer to assure that grant allocations are
provided only to those organizations where work of acceptable quality will be
performed.
QA should be given as much consideration in the allocation of grant
funding as in contracts or in-house research projects. It is somewhat more
difficult, however, to accord sufficient QA consideration for grants, since
competitive selection based on defined evaluation criteria is not generally
practiced. The Project Officer, therefore, must assure that QA is adequately
addressed in the application stages of the grant process (see Figure 3).
In those cases in which the Project Officer is involved in pre-
application activities, he may be called upon to work with the applicant in
the development of an adequate application that meets the requirements of EPA
and falls within general EPA objectives. This would include assisting the
applicant in addressing the necessary QA aspects of sampling and analysis
efforts as required by IERL-Ci. It is appropriate, if possible, for the
Project Officer to make site visits or otherwise assist the applicant in
developing and documenting the application in accordance with available
criteria for application evaluation.
When the Project Officer is involved in pre-application activities, he
should provide the prospective grantee with the QA guidelines that will be
used to evaluate the formal grant application. These guidelines are
described on page 28. The applicant should also be made aware of IERL-Ci
requirements as described in Section 2 of this document, "Quality Assurance
Guidelines for Monitoring of Projects Requiring Sampling and Analysis." If
an ongoing project is under consideration for further grant funding, the
Project Officer can conveniently review QA requirements with the applicant at
that time.
The prospective grantee's pre-application should be reviewed with regard
to his proposed QA program, using essentially the same criteria as for con-
tracts. In reviewing the pre-application and responding to the applicant, any
QA shortcomings should be noted and guidance provided on the general level of
QA expected. It may be of value to inform the applicant that a pre-award
survey will be conducted as part of the formal grant review process and that
the applicant's QA procedures will be evaluated during this survey.
Use of Performance Test Samples in Contractor/Grantee Selection
Performance test samples can be used by the Project Officer to assist in
selecting qualified contractors or grantees for projects requiring analytical
services. These are samples of known concentration (known only to EPA) that
are provided to the prospective contractor or grantee for analysis. Analysis
is conducted and the analytical results are reported within a specified time.
The results are then compared with the EPA known values to determine the
accuracy of the laboratory's results.
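The comparison can be illustrated with a short sketch. The function name and
the 10-percent acceptance tolerance used here are assumptions for
illustration, not an EPA criterion.

def evaluate_test_sample(reported, known, tolerance=0.10):
    # Percent error of a reported result against the EPA known value,
    # flagged against a relative acceptance tolerance (illustrative only).
    error = (reported - known) / known
    return {"percent_error": round(100.0 * error, 1),
            "acceptable": abs(error) <= tolerance}

# A laboratory reports 4.6 mg/L for a sample known to contain 5.0 mg/L.
print(evaluate_test_sample(4.6, 5.0))
# prints {'percent_error': -8.0, 'acceptable': True}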
If it has been decided to use performance test samples to evaluate
offerers, the IERL-Ci Quality Assurance Officer should be notified. He will
arrange for delivery of the samples and evaluation of the results. The
procedure for obtaining test samples is described on page 29.
Performance test samples have been prepared and made available to
Project Officers by the EPA Environmental Monitoring Systems Laboratories
(Cincinnati, Las Vegas, and Research Triangle Park) through the IERL-Ci
Quality Assurance Officer. Reference standards can also be obtained from the
National Bureau of Standards (NBS) through the IERL-Ci Quality Assurance
Officer. Appendix B contains information concerning the availability of
performance test samples.
Indiscriminate or excessive use of performance test samples for the
evaluation of offerers is discouraged. The samples are costly to both EPA
and to the offerer if he must conduct many such tests. As general guidance,
performance test samples should be considered for use only when two or more
of the criteria listed in Table 5 apply. Some of these criteria cannot be
assessed, of course, until all responsive offers have undergone technical
evaluation and have been scored according to the stated evaluation criteria.
Hence, it is suggested that if the Project Officer is uncertain about the
need for performance test samples when preparing the procurement request, he
state the requirement for analyzing performance test samples in the RFP and
choose later whether to exercise the option. This can be accomplished by
inserting the following provision in the procurement request:
"Offerers whose proposals are determined to be technically acceptable
under the initial evaluation criteria stated herein shall (may) be
required to demonstrate acceptable performance by analyzing not more
than ____ unknown samples for ____ parameters per sample. The cost
for analysis of these samples shall be at the offerer's expense. Sample
analysis results shall be compared to known EPA results for scoring
purposes."
The final determination of whether or not performance test samples would
be of sufficient value in the assessment of offeror(s) performance to warrant
their cost must be made by the Project Officer. If the Project Officer
chooses to use performance test samples, it must be made clear at what point
in the selection process these samples will be used. Will they be sent to
all offerers, or just those in the running at the final evaluation stage?
Furthermore, the proposal evaluation criteria must allow for the scoring of
test results. The Project Officer is encouraged to seek the advice of the
IERL-Ci Quality Assurance Officer on the proper use of performance test
samples.
Cost Considerations
To complete the procurement request rationale document with accompanying
EPA Form 1900-8 (Procurement Request/Requisition) for a project, the Project
Officer must prepare an estimate of the anticipated cost of the required
scope of work. For cost estimating, proper consideration should be given to
the cost incurred for QA procedures on sampling and analysis programs. It is
recommended by EPA in "Handbook for Analytical Quality Control in Water and
Wastewater Laboratories" (EPA, 1979) that daily control of analytical per-
formance in the laboratory requires approximately 15 to 20 percent of the
analyst's time. A good laboratory QA program will also call for the main-
tenance of analytical control charts, training of personnel, the frequent
analysis of performance test samples, and extensive recordkeeping. Hence, an
adequate QA program will normally require a minimum of 20 percent of the
total analytical budget. When sampling is required, a minimum of 10 percent
of the samples should normally be collected in duplicate. Samples can be
split after collection, or several aliquots can be taken from the same
sample. This increases both the sampling and analytical costs by 2 to
3 percent, depending, of course, on the scope and complexity of the sampling
effort.
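These allowances can be combined into a rough estimating sketch. The
function name is an assumption, and the $30,000 sampling budget in the
example is a hypothetical figure for illustration; only the 20 percent QA
minimum and the 2-to-3 percent duplicate allowance come from the text above.

def estimate_qa_allowance(analytical_budget, sampling_budget,
                          duplicate_fraction=0.025):
    # Minimum QA program cost: 20 percent of the analytical budget.
    qa_program = 0.20 * analytical_budget
    # Duplicates on at least 10 percent of samples add roughly 2 to
    # 3 percent to combined sampling and analytical costs.
    duplicates = duplicate_fraction * (analytical_budget + sampling_budget)
    return qa_program + duplicates

# Shale oil example: $50,000 analytical services, hypothetical $30,000
# sampling budget.
print(round(estimate_qa_allowance(50_000, 30_000)))  # prints 12000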
TABLE 5. SUGGESTED CRITERIA FOR USING
PERFORMANCE TEST SAMPLES FOR OFFEROR EVALUATION*

_____ The sampling and analysis effort will exceed $25,000 in value.

_____ Analytical QA is very important to the overall success of the
      project. See Table 2.

_____ There is some doubt concerning the performance capability of the
      offerers.

_____ The offeror(s) cannot provide documented results of previous
      performance tests.

*Assuming appropriate test samples are available (see Appendix B).
Preproposal Conference
Preproposal conferences are an optional but important part of the
solicitation process for contracts. They can serve an information function
similar to pre-application discussions with prospective grantees. However,
they must be conducted in a fair and impartial manner that will not give any
prospective offerer an unfair advantage over another. The determination to
conduct a preproposal conference may be made by the Project Officer through
the Contracting Officer for any of the following reasons:
1. To clarify or explain complex specifications, statements of work, or
proposed contractual provisions (e.g., QA program requirements),
2. To discuss or emphasize the importance of any qualification
requirements, such as QA requirements,
3. To provide additional background material to prospective offerers,
such as documents that are too voluminous to include with the
solicitation package, a site tour, or visits to the place of
performance,
4. To respond to numerous questions of potential offerers regarding the
solicitations, or
5. To comply with the request of an important segment of the industry.
If the QA aspects of the proposed research effort are particularly
extensive, difficult to convey, or important to the project, the Project
Officer may elect to have the Contracting Officer call a preproposal con-
ference to provide further explanation of the QA requirements to potential
offerors. The preproposal conference must be conducted in accordance with
PIN 77-15. A record must be kept on the proceedings of the preproposal
conference; amendments to the RFP may be necessary as a result of the con-
ference. Proceedings of the preproposal conference may be made available to
all potential offerors.
QA CRITERIA IN TECHNICAL EVALUATION OF PROPOSALS AND GRANT APPLICATIONS
Technical Evaluation of Contract Proposals
Technical evaluation of proposals must be made by using only the
criteria set forth in the RFP in accordance with "Source Evaluation and
Selection Procedures" (PIN 77-15). In order to assist in applying QA in the
technical evaluation of proposals, the proposal evaluation checklist in
Appendix A provides a mechanism for the Technical Evaluation Panel to deter-
mine if an offerer meets basic QA program requirements. This checklist
describes the basic requirements for any adequate QA program for sampling and
analysis. Questions on the checklist are grouped into the following four
areas:
1. QA management policy and written procedures,
2. QA procedures for sampling,
3. QA procedures for analysis, and
4. QA procedures for data management.
Several questions are provided in each of these areas with which to test
the offerer's proposal. The questions are weighted according to their impor-
tance to a basic QA program. Proper use of this checklist when evaluating
proposals should allow the Technical Evaluation Panel to identify organi-
zations that do not offer an acceptable level of QA and those that do.
Furthermore, the checklist should assist in the technical ranking of offerers
who meet and surpass the basic QA program requirements.
QA Evaluation Criteria and Scoring Procedures for Contracts
The technical reviewers should evaluate each proposal with respect to
the offerer's demonstrated understanding of the basic requirements of a QA
program. Assuming the guidelines for project conception and solicitation on
page 10 have been followed in preparation of the procurement request, and it
has been indicated in the RFP that QA is important to the technical
reviewers, then they can expect adequate discussion of the proposed QA
program from responsive offerors. Each proposal should be evaluated with the
checklist (see Appendix A). This checklist can be copied and used by the
Technical Evaluation Panel. The following scoring plan (PIN 77-15) should be
used:
SCORING PLAN
Scoring value    Percent of available points    Evaluation
0 0 Not addressed in the offer.
1 20 Addressed, but totally deficient.
2a 40 Deficient, but appears to be capable of
improvements to adequate or better
without adopting a new approach.
2b — Appears to be deficient; however, final
scores will be determined subsequent
to written questions and/or oral
discussions.
3 60 Adequate; overall it meets the
specifications.
4 80 Good; has some superior features.
5 100 Generally superior in most features.
The relationship of the scoring plan to written or oral discussions and
to subsequent negotiations is as follows:
1. Value of "0," "1," or "2a"—The element or sub-element clearly is
deficient and is not to be questioned or discussed during written or
oral discussion. Such values are solely for the purpose of scoring,
ranking, and determination of the technical competitive range. If,
however, the offer attains an overall score, because of other fac-
tors, that places it in a sufficiently high position to be selected
for negotiations, the offerer shall be allowed to correct these
deficiencies during negotiations.
2. Value of "2b"—The element or sub-element contains uncertainties
which must be resolved before the offer is fully understood. Such
uncertainties are to be resolved during written or oral discussions,
and the offer is to be given a final score that is based on the
offerer's clarifications.
3. Values of "3", "4," or "5"—The element or sub-element is fully
understood and there is no need for clarification by the offerer.
However, discussions involving any such elements or sub-elements are
not precluded.
Each offerer's proposal is evaluated in the four QA areas which have
been identified in the technical evaluation criteria included in the RFP (see
Table 4). In the area of QA procedures for analysis, one of the criteria is
an evaluation of the use of standard analytical methods (a listing of these
methods is presented in Appendix C).
If there is insufficient information provided in the proposal to assess
any particular question(s) in the checklist, the reviewer should assign a
score of 2b on that question. Those questions assigned a value of 2b should
be specifically addressed during subsequent written or oral discussions with
technically competitive offerers in accordance with PIN 77-15 procedures.
Example: To return to the previous example of a wastewater treatment
pilot plant for the shale oil industry, one offerer may have scored as shown
in Figure 4 on QA procedures for data management, one sub-element of the
proposal evaluation criteria. This offerer scored a total of 50 out of a
possible 75 points on this sub-element. Hence, he was awarded 66 percent of
the total points. Since this is closest to 60 percent of the points avail-
able for this sub-element, this offerer is assigned a score of 3 for sub-
element Hid (Table 4) of the proposal evaluation criteria. This sub-element
is weighted according to the original proposal evaluation criteria. The
offerer, therefore, may have achieved a total technical score as shown in
Figure 5. In this way, the scoring of the QA evaluation criteria checklist
is integrated with the proposal evaluation criteria.
This offeror was awarded a 2b on one question on the checklist
(Figure 4). It was not clear in the proposal whether data and other records
associated with the project would be retained for a minimum of three years.
This point should be resolved during written or oral discussions with this
offeror if he is in the technical competitive range.
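The arithmetic of this scoring procedure can also be summarized in a short
computational sketch. The following Python fragment is illustrative only;
checklists are scored manually in practice, and the weights and point totals
shown are those of the example above (scoring value 2b is omitted because it
carries no percentage until rescored).

    # Illustrative sketch of the PIN 77-15 scoring arithmetic described above.
    # Scoring values and the percent of available points each represents
    # (value 2b is excluded; it is rescored after discussions).
    SCORING_PLAN = {0: 0, 1: 20, "2a": 40, 3: 60, 4: 80, 5: 100}

    def sub_element_value(points_awarded, points_possible):
        """Convert raw checklist points to the closest scoring-plan value."""
        percent = 100.0 * points_awarded / points_possible
        return min(SCORING_PLAN, key=lambda v: abs(SCORING_PLAN[v] - percent))

    def individual_score(scoring_value, weight):
        """Individual score = numerical weight x percent of available points."""
        return weight * SCORING_PLAN[scoring_value] / 100.0

    # The example: 50 of 75 checklist points -> 66.7 percent -> closest to
    # 60 percent -> scoring value 3; sub-element IIId carries a weight of 10.
    value = sub_element_value(50, 75)           # -> 3
    print(value, individual_score(value, 10))   # -> 3 6.0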
Written/Oral Discussions with Offerors
Public law requires written or oral discussions in negotiated procure-
ments with all responsible offerors who submit proposals within the competi-
tive range (PIN 77-15). The Project Officer and the Contracting Officer will
participate in these discussions. The purposes of these discussions are to:
25
-------
QUALITY ASSURANCE EVALUATION CRITERIA CHECKLIST FOR
PROPOSALS OFFERING SAMPLING AND ANALYSIS SERVICES

                                             Scoring     Numerical     Individual
Criteria                                      Value   x    Weight   =     Score

D. Quality assurance procedures for data management.

   1d. Does the offeror possess appropriate
       data handling, processing, and
       retrieval capabilities?

   2d. Will QC data (e.g., standard curves,
       duplicate results, spike sample
       results) be maintained and be
       accessible to the Project Officer?

   3d. Does the organization routinely
       maintain analytical performance
       records such as quality control
       charts?

   4d. Are all laboratory results and QC
       data reviewed by laboratory
       supervisory personnel?

   5d. Are all data and records retained
       for a minimum of 3 years?

   6d. Are field notebooks used to record
       sampling and engineering data
       (e.g., sample number, date/time of
       collection, flow, operating
       conditions, etc.)?

                         Total Score for Sub-Element D    50
                         Maximum Possible Score           75

Percent of maximum possible score awarded for sub-element D (circle
closest value):

     0%     20%     40%     (60%)     80%     100%

Score for this sub-element of the proposal evaluation criteria (circle
corresponding value):

     0     1     2a     2b     (3)     4     5

Figure 4. Example use of QA evaluation criteria checklist for scoring
          proposals. Parenthesized entries are the values circled in
          this example.
26
-------
     Evaluation Criteria                          Numerical   Scoring   Individual
                                                    Weight      Plan      Score

I.   Adequacy of the Technical Proposal               60                    40
     a. Logic of approach to the study                20          3         12
     b. Proposed pilot plant design                   20          2b         8
     c. Presentation of findings                      20          5         20

II.  Project Management                               50                    37
     a. Previous experience the Project
        Manager has had in this type of
        effort                                        15          2          6
     b. Company resources available to the
        Project Manager                               25          5         25
     c. Project management organization
        and plan                                      10          3          6

III. Quality Assurance                                50                    35
     a. Quality assurance management
        policy/written procedures                     10          4          8
     b. Quality assurance procedures for
        sampling                                      15          3          9
     c. Quality assurance procedures for
        analysis                                      15          4         12
     d. Quality assurance procedures for
        data management                               10          3          6

IV.  Personnel Qualifications                         40                    28
     a. Technical experience of the
        principal project staff related
        to the project                                20          4         16
     b. Educational background of
        principal project staff                       20          3         12

                                        TOTAL SCORE  140

Figure 5. Example of technical evaluation scoring system.
27
-------
1. Provide offerors an opportunity to further explain their offers,
2. Afford the Contracting Officer and the Project Officer an oppor-
tunity to understand fully what is being offered,
3. Arrive at preliminary agreements regarding price, cost, performance,
contract terms and conditions, and
4. Resolve minor technicalities in offers.
This is the point in the procurement process where any of the offerors
may be asked to clarify their proposed QA procedures. If, during the course
of the scoring of an offeror's proposal, a score of 2b is given for any of
the QA Evaluation Criteria Checklist questions, clarification or more
information should be requested during the written or oral discussions. The
responses to these questions will be used in the final review (Figure 4).
Upon conclusion of the written or oral discussions, the Project Officer and
the Contracting Officer will determine from whom a "best and final" offer is
to be requested.
QA Evaluation of Grant Applications
If a grant application is judged relevant to the IERL-Ci program mission
and funding can be provided, then arrangements are made for technical
reviews. A minimum of one intramural and two extramural reviews of tech-
nical and scientific merit are required for new grant applications.
The technical evaluation criteria for research and demonstration grant
applications are much more subjective than those for competitive procure-
ments. There is no standard set of criteria that can apply in each case
because of the diversity of the programs, the sampling and analytical
requirements, and the organizations performing the work. This should not,
however, be used as an excuse for failing to recognize the importance of a
sound QA program.
When the project involves sampling and analysis activities, each tech-
nical reviewer should be requested to address specifically the applicant's
proposed QA program or plan during his evaluation of the technical and scien-
tific merit of the proposal. To aid Project Officers and reviewers in this
regard, the checklist in Appendix A should be used. It is not necessary,
however, to score the checklist results since grants are not awarded on a
competitive basis. For grants, the scoring plan should be disregarded and
each checklist question answered in the affirmative or negative. The Project
Officer should assure that a copy of this checklist, which may be reproduced
from Appendix A, is forwarded along with the grant application in the review
package provided to each reviewer. The transmittal letter that accompanies
the review package should request that the reviewer consider use of this
checklist to evaluate the technical and scientific merit of the proposed
study. Each reviewer's comments should be used to support the funding
decision (see Figure 3) and/or to improve the technical or scientific merit
of the proposed study.
28
-------
Evaluation of Performance Test Samples
If it has been decided to use performance test samples to help in selec-
tion of a contractor, in accordance with the guidelines provided on page 20,
then they must be provided to all offerors whose proposals are technically
acceptable and in the competitive range. Select from the list of available
samples (Appendix B) those performance test samples that exercise the analy-
tical capabilities that will be required on the project. Inform the IERL-Ci
Quality Assurance Officer of your needs for performance test samples, and he
will arrange to have them sent to the competitive offerors. Instructions
will be provided for return of the analytical results to the Quality
Assurance Officer.
Allow each offeror sufficient time to receive the test samples from EPA,
to conduct the analyses, and to report the results along with his normal lab-
oratory routine. A minimum of one month will normally be required. It may
be advisable to ask the offeror(s) what period of time is required to com-
plete the analyses and report the results. The IERL-Ci Quality Assurance
Officer can also provide recommendations in this regard. If insufficient
analysis time is allotted, special handling of these samples will be
required, and this would result in an increased cost to the offeror(s).
After receiving all results from responsive offerors within the
specified period of time, the Quality Assurance Officer will assist in
interpreting the results.
Performance test samples become part of the technical evaluation and may
be discussed during post-award debriefings. The Project Officer is to inform
the Contracting Officer of the results of the performance test sample analy-
sis, and the Contracting Officer will convey the results to the offeror(s).
Each parameter should be defined as acceptable or unacceptable, and the
evaluation criteria should be provided. Any unacceptable results should be
described as either high or low, as appropriate, since true values cannot be
provided. Since offerors must bear the cost of performance test sample
analysis, they should be informed of their performance as soon as possible so
that any necessary corrective action may be taken.
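As a sketch of how such results might be classified, the following fragment
compares a reported result with an acceptance window. It is illustrative
only: the acceptance limits and the result shown are hypothetical, since
actual acceptance criteria are supplied with each performance test sample.

    # Hypothetical sketch: classify a reported performance test result as
    # acceptable, high, or low. True values are not released, so only the
    # direction of an unacceptable result is reported back to the offeror.
    def classify(reported, lower_limit, upper_limit):
        if reported < lower_limit:
            return "unacceptable (low)"
        if reported > upper_limit:
            return "unacceptable (high)"
        return "acceptable"

    # Made-up numbers: a result of 212 ug/L against a 180-205 ug/L window.
    print(classify(212.0, 180.0, 205.0))   # -> unacceptable (high)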
Performance test samples can be used to evaluate research and demon-
stration grant applicants in the same manner as prospective contractors.
EVALUATION OF PREVIOUS PERFORMANCE HISTORY
One indication of the probability of successful project accomplishment
on the part of a prospective contractor or grantee is his performance record
on previous projects. The offeror is required to list current and recently
completed EPA contracts in his proposal. The interested Project Officer can
request a list of the Project Officers on these contracts. As part of the
technical evaluation of the offerors, Project Officers on other government
projects can be contacted. These Project Officers could provide useful
information concerning their level of satisfaction with the prospective
29
-------
contractor's sampling and analytical QA, if the project calls for these
services. Likewise, a prospective grantee may be evaluated by contacting
Project Officers on past or present grants he holds.
If the offeror has performed on a previous EPA contract, the Contracting
Officer should have access to records regarding his performance. If the
offeror has performed past work for other agencies of the government, the
Contracting Officer may be able to obtain this information from the
Contractor Relations Section, Contracts Management Division, Washington,
D.C., 20460.
Each Project Officer is required to complete a Project Officer's Eval-
uation of Contractor Performance (EPA Form 1900-27) at the conclusion of a
contract. Review of this form may provide valuable information to help
assess the likelihood that a given offeror may perform well on your project.
Several points must be remembered, however, when reviewing these forms
reporting on past project performance:
1. The work performed should be similar to that proposed in the current
scope of work,
2. Only performance in the recent past should be considered, since
changes in the organization may invalidate previous performance
information, and
3. Past performance, whether good or poor, may not be used as a
pass/fail criterion.
COST EVALUATION
The cost evaluation of a proposal is conducted by the Contracting
Officer concurrently with the evaluation of the technical aspects by the
Project Officer.
The Contracting Officer reviews each element of the offeror's business
and management proposal for the following:
1. Reasonableness of price or estimated cost with respect to the
requirement,
2. Investigation of unreasonably high or low cost elements,
3. Evaluation of the proposed management structure to be utilized for
performance,
4. Indirect cost management, and
5. Analysis of manhours and materials.
The Project Officer is notified of those cost proposals considered to be
acceptable and receives copies of the cost proposals to be compared against
those offerors' technical proposals which have been judged to be within the
acceptable technical range.
The Project Officer compares the cost and business proposals and decides
which proposals merit additional consideration. QA cost considerations are
presented on page 21.
30
-------
After a thorough review of the five specific items mentioned previously,
the Contracting Officer, with input from the Project Officer, makes the
decision as to which cost proposals are considered to be in the acceptable
range.
BEST AND FINAL OFFER EVALUATION
The evaluation of cost and technical proposals results in the selection
of candidates who are asked to present a best and final offer. Negotiations
are begun with the successful offeror(s) based on their "best and final"
offers. The funding decision serves the same purpose for non-competitive
grants. The extent of this final evaluation is determined by the Contracting
Officer. All scores of 2b must be reassessed and rescored. The Project
Officer may choose to conduct a pre-award survey of one or more of the
offerors following the "best and final" offer evaluation. The Project Offi-
cer may be accompanied by the Contracting Officer during pre-award surveys.
Following completion of final evaluation, negotiations, and award, all
records pertaining to the selection process (e.g., checklists) are provided
to and retained by the Contracting Officer. The contractor or the grantee
should be informed upon award that quality assurance procedures will be moni-
tored throughout the life of the project by the Project Officer. (Section 2
of this document, "Quality Assurance Guidelines for Monitoring Projects
Requiring Sampling and Analysis", describes project monitoring procedures.)
PRE-AWARD SURVEYS
It was stated on page 25 that written or oral discussions with each of
the offerors who submitted proposals in the competitive range are mandatory.
At the option of the Contracting Officer and the Project Officer, these
discussions may include a visit to the prospective contractor's or grantee's
facilities, but only after the "best and final" offer has been received.
Such visits offer an excellent way to assess thoroughly an offeror's QA
program and awareness. Some of the criteria that may justify a pre-award
survey are listed in Table 6.
Pre-award surveys must be conducted in such a manner as to avoid giving
an unfair advantage to any offeror on competitive procurements. All com-
munications must be conducted in accordance with the guidelines stated in
PIN 77-15. The Contracting Officer must accompany the Project Officer on
pre-award surveys. At the request of the Project Officer, the IERL-Ci
Quality Assurance Officer or an IERL-Ci contractor may participate in the
pre-award survey.
The pre-award survey is a good time to resolve any questions that remain
unresolved from the quality assurance evaluation criteria checklist. Any
scores of 2b on this checklist should, in particular, be addressed in the
pre-award survey if they have not been resolved in previous discussions.
Each of the QA criteria should be rescored during or after the visit.
Table 7 is a suggested agenda for a pre-award survey.
31
-------
TABLE 6. CRITERIA FOR CONDUCTING A PRE-AWARD SURVEY*
After technical evaluation of the proposal(s), there remain suffi-
cient unresolved questions to merit a face-to-face meeting with the
prospective contractor or grantee after submission of the "best and
final" offer.
A greater in-depth knowledge is required of the prospective
contractor's or grantee's QA procedures.
An inspection of the prospective contractor's or grantee's facil-
ities would assist in the evaluation process after submission of
the "best and final" offer.
Face-to-face discussions with key project personnel would assist in
the evaluation process.
Past performance history of the prospective contractor or grantee
has not been particularly good; however, there is reason to believe
improvements have been made.
A decision among the top contenders cannot be conclusively justi-
fied without further information.
*If any of these conditions are met, then a pre-award survey should be
strongly considered.
32
-------
A pre-award survey may also precede the funding decision for a research
and demonstration grant application. The Project Officer is encouraged to
invite the participation of the IERL-Ci Quality Assurance Officer in this
survey. During the survey, any unresolved questions that remain from the
recommended quality assurance criteria discussed on page 28 pertaining to the
evaluation of grant applications should be addressed and settled.
33
-------
TABLE 7. SUGGESTED PRE-AWARD SURVEY AGENDA
A. Discussions with key project personnel
1. Project management organization
2. Administrative management/project management interface
3. Approach to accomplishing the scope of work
4. QA aspects of the program
5. Establish communications
B. Review prospective contractor's/grantee's facilities
1. Discuss equipment/instrumentation that will be used
2. Evaluate overall appearance, resources, and scope of capabilities
C. Review prospective contractor's/grantee's QA program
1. Ask for copy of written QA plan
2. Discuss how this plan will apply to the project under discussion
3. Review QA documentation on typical work
4. Discuss QA procedures for the sampling effort*
5. Discuss QA procedures for analysis*
*To assist in this regard, the Project Officer is encouraged to use the QA
Evaluation Criteria Checklist (Appendix A) and the QA Audit Checklist
(Appendix I).
34
-------
SECTION 2
QUALITY ASSURANCE GUIDELINES FOR MONITORING OF
PROJECTS REQUIRING SAMPLING AND ANALYSIS
INTRODUCTION
These guidelines have been prepared to assist the IERL-Ci Project
Officer in directing a contractor's efforts and in overseeing research and
demonstration grants which incorporate sampling and analysis activities. The
Project Officer's QA responsibilities are described beginning on page 37 for
both contracts and grants. These responsibilities begin with project
initiation and extend through the monitoring of all sampling and analysis
activities. The Project Officer's QA responsibilities do not end with the
field and the laboratory work, however; they continue through close review of
the results of the sampling and analysis program, the interpretation of the
quality control data, and the use to which the sampling and analytical data
are applied, including the conclusions and recommendations of the project.
There are three basic foundations upon which project quality assurance
is built. They are:
1. The contractor's or grantee's QA program,
2. The Project QA Plan, and
3. QA monitoring of project activities and results.
These foundations of project quality assurance are depicted in Figure 6,
along with the elements of each.
All contractors and grantees should have a QA program. This QA program
describes the in-house procedures used by the contractor/grantee to guaran-
tee, to the extent possible, the quality of his work. The elements of a
suitable and acceptable QA program are described under Contractor's/Grantee's
QA Program on page 43. A checklist is provided in Appendix D to assist the
Project Officer in reviewing a contractor's or a grantee's QA program. This
checklist can also be used to help the contractor or the grantee to develop
an acceptable QA program if his present program does not meet IERL-Ci
requirements.
Each project should begin with preparation of a Project Work Plan which
describes how the effort will be accomplished. A key part of any Project
Work Plan is a discussion of the sampling and analysis program and the
35
-------
PROJECT QUALITY ASSURANCE rests on three foundations, each with its
own elements:

CONTRACTOR'S/GRANTEE'S QUALITY ASSURANCE PROGRAM (page 43)
    * Organization and Personnel
    * Facilities and Equipment
    * Analytical Methodology
    * Sampling and Sample Handling Procedures
    * Quality Control
    * Data Handling

PROJECT WORK PLAN (page 48)
    * Project Objective
    * Project Staff
    * Facilities/Equipment
    * Sampling Plan/Methods
    * Analytical Methods
    * Quality Control
    * Data Management
    * Project Schedule

EXTERNAL QUALITY ASSURANCE MONITORING (page 50)
    * Review of Project Reports
    * Conferences/Project Reviews
    * Site Visits/QA Audits
    * Performance Tests
    * Sub-Contracts

Figure 6. Foundations of project quality assurance.
36
-------
procedures that will be followed to validate the quality of the resulting
data. A checklist is provided in Appendix E to assist the Project Officer in
reviewing project work plans and to assure that the QA aspects of the plan
are given adequate attention.
Project QA monitoring procedures are discussed on page 50. The Project
Officer has full responsibility for monitoring of project QA during the
course of project accomplishment. Several techniques are described for
fulfilling this key project responsibility, including: (1) the effective use
of project conferences and project reviews, (2) a number of independent
performance tests (e.g., QA audits, performance test samples, split samples),
and (3) review of project reports.
PROJECT OFFICER'S QA RESPONSIBILITIES
This section briefly describes the QA-related responsibilities of the
Project Officer and discusses how he can influence the quality of the work
during the course of the project.
Contracts
Figure 7 is a diagram of the major aspects of contract administration.
After award of the contract, the Project Officer assumes technical
responsibility for project initiation, project monitoring, project conclu-
sion, and project closeout. The Project Officer fulfills his QA role by
interacting with the contractor in each of these phases of the work.
Project Initiation—
After completion of all negotiations and award of a contract, the
Project Officer should discuss with the contractor their respective roles in
project quality assurance and the necessity for cooperative efforts to
achieve a high-quality result.
The Project Officer should discuss with the contractor how quality
assurance will be maintained on this project commensurate with the objectives
of the effort. He should discuss with the contractor the three basic founda-
tions of project QA: (1) the contractor's QA program, (2) the Project Work
Plan, and (3) QA monitoring by the Project Officer or the IERL-Ci Quality
Assurance Officer.
At this time, if not provided during procurement, the Project Officer
should request a review copy of the contractor's QA program manual. Guide-
lines for reviewing the contractor's QA program are discussed under "The
Contractor's/Grantee's QA Program" (page 43); Appendix D is a checklist to
assist the Project Officer in this review. The Project Officer and the con-
tractor should also discuss quality assurance procedures to be addressed in
the Project Work Plan, which will be prepared by the contractor before the
work begins. The elements of a Project Work Plan are described on page 48. A
checklist is also provided in Appendix E to assist the Project Officer in
review of the QA aspects of the plan.
37
-------
Contract Procurement
        |
   Contract Award
        |
   Project Initiation ------ Initial Contact
        |                    Project Plan
        |
   Project Monitoring ------ Reports
        |                    Conferences and Project Reviews
        |                    Site Visits/QA Audit
        |                    Performance Tests
        |                    Sub-Contracts
        |
   Project Conclusion ------ Final Project Report
        |
   Project Closeout -------- Project Evaluation

Figure 7. Contracts administration.
38
-------
Project Monitoring—
The Project Officer serves in a key position in QA monitoring of the
contractor's project work. There are a number of ways in which he can
fulfill this responsibility (Figure 7). He can review all project reports,
conduct project conferences and reviews, conduct site visits and QA audits,
use performance tests when appropriate, and assure that the same level of QA
is applied to all subcontracted work. The IERL-Ci Quality Assurance Officer
can assist the Project Officer in accomplishing these responsibilities.
The Project Officer is responsible for the technical content and QA
review of all project reports. This includes periodic progress reports,
interim reports, task reports, QA reports, and final project reports. Since
the Project Officer must approve the Project Work Plan, he should be aware of
all sampling, analysis, and QA procedures that are to be employed in develop-
ment of the data. There remain two areas, then, where the Project Officer
should focus his attention in the review of reports: (1) Has all the QA
information been documented in the report?, and (2) Are the conclusions which
are drawn from the work supported by the sampling and analysis data? Each
task report, interim report, and certainly the final report should be
reviewed by one or more persons who are technically qualified to assess the
quality of the work and the conclusions which are drawn.
In his role as technical representative for EPA, the Project Officer
must insure that the contractor is complying with the procedures delineated
in his QA program and in the Project Work Plan. One of the most effective
ways of doing this is to conduct occasional visits to the study site and to
the contractor's laboratories. The contractor should be contacted as early
in advance of the visit as practicable and advised of any specific matters to
be discussed. The Project Officer must be knowledgeable of the status of the
work and of the procedures being used. The visit and observations from the
visit should be documented in the project file.
At least one site visit, preferably the first one, should incorporate a
QA audit. Procedures and checklists for conducting a QA audit may be found
in Section 3 of this document, "Quality Assurance Guidelines for Auditing of
Projects Requiring Sampling and Analysis." The Quality Assurance Officer
should participate in site visits whenever possible and, in particular, when
a QA audit is to be conducted.
The Project Officer may call upon the contractor to conduct one or more
performance tests. Performance test samples for many parameters are made
available by EPA. Procedures for obtaining these materials are found on
page 50. The Project Officer may request the contractor to analyze reference
materials in order to establish analytical accuracy and precision of the
analytical methods, or he may have split samples analyzed by another labora-
tory for purposes of comparison. Independent sampling by lERL-Ci or a third
party may also be called for if it is important to document certain param-
eters upon which critical decisions are to be made.
39
-------
When sampling and/or analytical activities are sub-contracted, it must
be made clear that the prime contractor bears full responsibility for the
application of QA procedures and for the quality of the final product. The
Project Officer should exert the same control as when all of the work is
done by the prime contractor. All parties should
understand the QA requirements of the program and agree to cooperate in the
QA effort. The Project Officer should have direct access to the sub-
contractor or, at least, he should feel confident that someone on the prime
contractor's project staff is knowledgeable of the technical aspects of the
sampling and analysis activities and can represent this area of project
activity.
Project Conclusion and Closeout—
Most projects conclude with a final project report. A draft final
report is submitted to the Project Officer for review and approval. This is
the culmination of the Project Officer's QA responsibilities. He should
assure all sampling and analytical data are thoroughly documented and sub-
stantiated. Independent review of the final report by other qualified Agency
personnel is required.
Each Project Officer is required to complete a Project Officer's Evalu-
ation of Contractor Performance (EPA Form 1900-27) at the conclusion of a
contract. The degree to which the contractor has complied with the con-
tracted work scope and the required QA should be reflected in this evaluation
so that the Agency may develop an accurate historical performance record of
contractors.
Grants and Cooperative Agreements
In grants management, unlike contracts management, the Project Officer's
role is not one of directing the activities of the grantee. He still has the
responsibility, however, of assuring that a quality product is delivered to
the Agency upon project
completion. As shown in Figure 8, he has many of the same responsibilities
for project initiation, project monitoring, and project conclusion as he does
for a contract (see Figure 2). His basic methods of fulfilling these respon-
sibilities would be the same as described above for contracts, but his
approach would be somewhat different, since he is not expected to "direct"
the grantee but to "assist" him in achieving the quality requirements of the
Agency.
Project Initiation—
Upon grantee acceptance of the grant or cooperative agreement, the Pro-
ject Officer should inform the grantee that the Agency expects the quality
of all sampling and analysis to be documented. He should describe
his role in monitoring project QA procedures.
Monitoring of Project QA—
The key responsibility of the Project Officer is the technical
monitoring of the grant, basically as a means of assuring that the grantee
40
-------
Grant Application
        |
   Grant Award ------------- Study Plan
        |
   Project Initiation ------ Initial Contact
        |
   Project Monitoring ------ Continuing Communication
        |                    Encourage Grantee Performance
        |                    Site Visits/QA Audit
        |                    Sub-Agreements
        |                    Progress Reports
        |
   Project Conclusion ------ Final Project Report
        |
   Project Closeout

Figure 8. Grants and cooperative agreements administration.
41
-------
carries out the approved scope of work on schedule and in conformance with
applicable rules, regulations, and any special conditions that may have been
imposed by the grant agreement. The grantee should use currently accepted QA
procedures commensurate with the objectives of the effort.
As shown in Figure 8, the study plan is normally approved before project
initiation as a result of funding of the grantee's project. Hence, it is
critically important that QA be a major consideration in the technical evalu-
ation of grant and cooperative agreement applications. Guidance is provided
in the first section of this document, "Quality Assurance Guidelines for
Procurement of Projects Requiring Sampling and Analysis."
In carrying out the project monitoring function for a grant or a cooper-
ative agreement, the Project Officer should: (1) conduct site visits and a
QA audit as appropriate, (2) monitor the QA practices on any sub-agreements,
and (3) review the QA aspects of all progress reports and the final report
submitted by the grantee.
The Project Officer may have access to grantee records and to records of
contractors under the grant, as well as to the performance site, and should
use such access as one tool in monitoring grant QA activities. The frequency
of visits is a matter of Project Officer judgment, although it may be
affected by the availability of travel funds. As a general rule, a project
QA audit should be conducted once during each project. The QA audit is most
effectively accomplished early in the project when the first sampling and
analysis activities are begun. Recommendations for conducting a QA audit are
discussed beginning on page 50. QA audit procedures and checklists may be
found in Section 3, "Quality Assurance Guidelines for Auditing of Projects
Requiring Sampling and Analysis."
The Project Officer reviews any sub-agreements or sub-contracts under a
grant or cooperative agreement for technical content. He should make sure
that all sub-agreements are written to include necessary and reasonable QA
requirements. The Project Officer should not approve the award of a sub-
agreement until he is satisfied that QA requirements equivalent to those
expected of the prime grantee have been accepted.
Summary of the Project Officer's Role in Achieving QA
The role of the Project Officer in achieving QA can be best fulfilled by
keeping in mind a few working principles.
1. Have a good basic understanding of conventional QA practices and the
principles behind project QA. Work with the Quality Assurance
Officer whose job it is to be thoroughly knowledgeable of ways to
achieve and document good quality work.
2. Be knowledgeable of the sampling and analytical procedures used or
ask for assistance from someone who is. Understand the precision
42
-------
and accuracy which are achievable with the methods and what can most
affect the quality of the work.
3. Know the level of QA required to achieve the objectives of the
project. Is qualitative or screening information adequate or must
the highest possible accuracy be achieved?
4. Be certain the contractor/grantee has adequate funds budgeted to
accomplish the work at the level of quality desired without taking
unjustified shortcuts to the detriment of QA.
5. Use the IERL-Ci Quality Assurance Officer for independent review of
the QA procedures, quality control data, and project reports. Ask
him to point out any QA weaknesses in the program.
6. Recognize that the purpose of a QA program is to discover and
correct errors. Use well-founded QA practices to identify any
sampling or analytical problems before effort is lost in producing
unreliable data.
THE CONTRACTOR'S/GRANTEE'S QA PROGRAM
Each contractor or grantee should have a written QA program which
describes the in-house procedures that he employs to guarantee, to the extent
possible, the quality of all sampling and analysis activities. The
contractor's/grantee's QA program is not project specific. It describes the
quality assurance and the quality control procedures used on any project
which requires sampling and analysis.
There is no standard, universally-accepted QA program. Each contractor
and each grantee prepares his own to suit the needs of his organization as
he determines best. There are, however, a number of publications which
discuss the recommended contents of a comprehensive QA program. These
publications are listed in the Bibliography. Although each author describes
the essential elements of a QA program in his own manner and with his own
perspective, there is general agreement on what a QA program should contain.
These essential elements are described in this chapter. In addition, a
checklist has been prepared to aid the Project Officer in reviewing a QA
program and to facilitate his feedback to the contractor or grantee on the
suitability of his program with respect to lERL-Ci requirements.
Elements of a QA Program
A comprehensive QA program will address the six major elements upon
which the final quality of the laboratory's work depends. Generally, these
include:
1. Organization and personnel,
2. Facilities and equipment,
3. Analytical methodology,
43
-------
4. Sampling and sample handling procedures,
5. Quality control, and
6. Data handling.
A comprehensive, well-written QA program will address each of these
essential aspects of QA. In the following descriptions these six major areas
have been divided into sub-elements, where applicable.
Organization and Personnel—
QA policy and objectives—Each organization should have a written quality
assurance policy that should be made known to all organization personnel.
Objectives should be established to produce data that meet project require-
ments in terms of completeness, precision, accuracy, representativeness,
documentation, and comparability. The QA program should require the prepara-
tion of a project-specific QA plan for each major project.
QA organization—The organization and management of the QA function should be
described in the contractor's/grantee's QA document. Reporting relationships
and responsibilities should be clearly defined. A QA Coordinator or Super-
visor should be appointed and his responsibilities established. A descrip-
tion of the QC paperwork flow should be available. There should be a clear
designation of those who are authorized to approve data and results. Respon-
sibilities for taking corrective action should be assigned to appropriate
management personnel.
Personnel training—It is highly desirable that there be a programmed
training system for employees. This system should include motivation toward
producing data of acceptable quality and should involve "practice work" by
the new employee. The quality of this work can be immediately verified and
discussed by the supervisor, with appropriate corrective action taken.
Document control and revisions—A QA program should include a system for
documenting: (1) sampling procedures, (2) calibration procedures, (3) analy-
tical procedures, (4) computational procedures, (5) quality control proce-
dures, (6) field data, and (7) operating procedures, as well as any changes
to these procedures.
Procedures for making revisions to technical documents must be clearly
defined, with the lines of authority indicated. Procedural revisions should
be written and distributed to all affected individuals, thus insuring imple-
mentation of changes.
Documentation control becomes increasingly important in field studies,
since procedures must often be adapted to the particular situation, tested,
and approved by project management authority. Any revisions to the sampling
program must be strictly documented and approved by the Project Officer.
Undocumented changes in either the sampling or analysis program can seriously
affect the substantiation of the final project conclusions.
44
-------
Facilities and Equipment—
Procurement and inventory procedures—Purchasing guidelines for all equipment
and reagents having an effect on data quality should be well-defined and
documented. Similarly, performance specifications should be documented for
all items of equipment having an effect on data quality. Once any item which
is critical to the sampling and analysis program, such as a flowmeter, in
situ instrument, or reagent, is received and accepted by the organization,
documentation should be retained of the type, age, and acceptance status of
the
item. Reagents should be dated upon receipt in order to establish their
order of use and to minimize the possibility of exceeding their useful shelf
life.
Preventive maintenance—Preventive maintenance procedures should be clearly
defined and written for each measurement system and required support equip-
ment. When maintenance activity is necessary, it should be documented on
standard forms maintained in logbooks. A history of the maintenance record
of each system serves as an indication of the adequacy of maintenance sched-
ules and parts inventory.
Analytical Methodology—
Calibration and operating procedures—Calibration is the process of estab-
lishing the relationship of a measurement system output to a known stimulus.
In essence, calibration is a reproducible reference point to which all sample
measurements can be correlated. A sound calibration program should include
provisions for documentation of frequency, conditions, standards, and records
reflecting the calibration history of a measurement system.
The accuracy of the calibration standards is an important point to con-
sider since all data will be in reference to the standards used. A program
for verifying the accuracy of all working standards against primary grade
standards should be routinely followed.
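As an illustration of these calibration concepts (not a prescribed
procedure), the sketch below fits a working curve to hypothetical standard
readings by ordinary least squares and then interprets a sample reading
against the fitted relationship. All concentrations and responses shown are
made up for the illustration.

    # Illustrative least-squares calibration sketch using hypothetical data.
    def fit_line(x, y):
        """Ordinary least-squares slope and intercept for y = m*x + b."""
        n = len(x)
        mean_x = sum(x) / n
        mean_y = sum(y) / n
        m = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
        b = mean_y - m * mean_x
        return m, b

    # Hypothetical standards (mg/L) and instrument responses (absorbance).
    standards = [0.0, 1.0, 2.0, 5.0, 10.0]
    responses = [0.002, 0.101, 0.198, 0.495, 1.004]

    m, b = fit_line(standards, responses)
    sample_response = 0.352
    concentration = (sample_response - b) / m   # invert the working curve
    print(round(concentration, 2), "mg/L")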
Feedback and corrective action—The QA program should specify the corrective
action that is to be taken when an analytical or sampling error is dis-
covered. The program should require documentation of the corrective action
and notification of the analyst or sample collector of the error and correct
procedures.
Sampling and Sample Handling Procedures—
Configuration control—Some sampling and analysis programs require a more or
less elaborate array of sampling equipment, sampling systems, or in situ
instrumentation. It is important for quality assurance that once such an
array has been established, the configuration of the array should be docu-
mented. Furthermore, any changes in the configuration must be made only
after due consideration of the effects on the data which are gathered. Any
45
-------
changes in configuration or design changes in the sampling or analysis system
must be thoroughly documented.
Reliability—The reliability of each component of a measurement system is
related directly to the probability of obtaining valid data from that system.
It follows that procedures for reliably collecting, processing, and reporting
data should be clearly defined and in written form for each system component.
Reliability data should be recorded on standard forms and kept in a logbook
for easy reference. If this procedure is followed, the data can be used in
revising maintenance and/or replacement schedules.
Quality Control—
Quality control procedures—The quality control procedures used during samp-
ling and analysis should be described. The quality control checks routinely
performed during sample analysis include reagent blank analysis to establish
background absorbance, duplicate analysis to establish analytical precision,
and spiked and blank sample analysis to determine analytical accuracy. The
frequency of these quality control checks should be specified. Limits of
acceptance or rejection should be defined for analysis, and control charts
should be used where practicable. Gas chromatography confirmation procedures
should be discussed.
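These routine checks reduce to simple arithmetic. The sketch below computes
the relative percent difference of a duplicate pair and the percent recovery
of a spiked sample; all of the results and acceptance limits shown are
hypothetical, and actual limits should come from the approved QA plan.

    # Sketch of routine QC arithmetic (hypothetical data and limits).
    def relative_percent_difference(a, b):
        """Precision check from duplicate analyses."""
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Accuracy check from a spiked sample."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    rpd = relative_percent_difference(4.1, 4.4)     # about 7.1 percent
    recovery = percent_recovery(9.3, 4.2, 5.0)      # about 102 percent

    # Hypothetical acceptance limits; out-of-limit results trigger review.
    print("duplicates within limits:", rpd <= 20.0)
    print("spike recovery within limits:", 80.0 <= recovery <= 120.0)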
Control checks and internal audits—A good QA program will make provision for
periodic control checks and internal audits by the performing organization.
Several approaches are commonly used for control checks. These include:
1. Reference material analysis. Analytical reference materials are
available from several commercial and government sources, or they
may be prepared in-house. The National Bureau of Standards (NBS)
has made available a series of standard reference materials which
may be purchased. The chemical analysis of these materials has been
well established. Such materials can be analyzed alongside routine
samples and the results used to check the accuracy of analytical
procedures.
2. Split sample analysis. The analysis by two or more analysts of a
single sample that has been well mixed and divided is useful for
establishing precision of the analytical techniques and the perfor-
mance of the analysts.
3. Spiked sample analysis. The analysis of a routine sample which has
been spiked with a known amount of the measured material is
commonly employed to establish the recovery of an analytical
method.
4. Side-by-side analysis. Under particularly intractable conditions
where it is important to acquire useful data but difficult to con-
trol all important variables, it may be useful to conduct
46
-------
side-by-side analysis by two analysts. These analysts may use the
same or perhaps different analytical methods to acquire comparable
data.
5. Reference devices. Some measurement systems, in particular flow
measurement systems, can be checked by the use of available refer-
ence devices. The use of these devices is generally disruptive to
the operation, and they can seldom be employed without the knowledge
of the operator or the analyst.
Internal audits should be periodically conducted to evaluate the
functioning of the QA program. This involves an independent check of the
performance of the field crew or the laboratory analysts to determine if
prescribed procedures are closely followed.
Data Handling—
Data handling, reporting, and recordkeeping—Data handling, reporting, and
recordkeeping procedures should be described. Data handling and reporting
includes all procedures used to record data on standard forms and in labora-
tory or field notebooks. This includes bench data and field data. The reporting
format for different types of bench data should be discussed and the forms
provided. The contents of field notebooks should be specified.
Recordkeeping of this type serves at least two useful functions: (1) it
makes possible the reanalysis of a set of data at a future time, and (2) it
may be used in support of the experimental conclusions if various aspects of
the study are called into question.
Data validation—Data validation procedures, defined ideally as a set of
computerized and manual checks applied at various appropriate levels of the
measurement process, should be in written form and clearly defined for all
measurement systems. Criteria for data validation must be documented and
include limits on: (1) operational parameters such as flow rates; (2) cali-
bration data; (3) special checks unique to each measurement, e.g., successive
values/averages; (4) statistical tests, e.g., outliers; and (5) manual checks
such as hand calculations. The limits must be adequate to insure a high
probability of detecting invalid data for either all or the majority of the
measurement systems. The required data validation activities (flow rate
checks, analytical precision, etc.) must be recorded on standard forms in a
logbook.
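A minimal sketch of such checks, assuming hypothetical limits, is given
below; an operational system would record each failed check on the standard
forms described above.

    # Minimal data validation sketch (hypothetical limits): a range check on
    # an operational parameter, a successive-value check, and an outlier flag.
    def in_range(value, low, high):
        return low <= value <= high

    def successive_change_ok(previous, current, max_ratio=3.0):
        """Flag a value that jumps more than max_ratio times the prior value."""
        return previous == 0 or current <= max_ratio * previous

    def is_outlier(values, candidate, k=3.0):
        """Crude check: candidate beyond k standard deviations of the rest."""
        n = len(values)
        mean = sum(values) / n
        std = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
        return abs(candidate - mean) > k * std

    print(in_range(12.4, 10.0, 15.0))               # flow rate within limits
    print(successive_change_ok(4.2, 4.9))           # no suspicious jump
    print(is_outlier([4.1, 4.3, 4.2, 4.4], 9.7))    # flagged for manual check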
QA Program Checklist
An important responsibility of the Project Officer, as described on
page 37 for contracts and page 40 for grants and cooperative agreements, is
to review the QA program of the contractor or grantee before project work
begins. A checklist has been provided in Appendix D to assist the Project
Officer in reviewing a contractor's or grantee's QA program. In the event a
contractor/grantee does not have a written QA program, this checklist may
47
-------
still be used to evaluate the QA procedures employed or to assist the
contractor/grantee to develop a suitable QA program.
This checklist applies to a "model" QA program; hence, many programs
will have deficiencies. It is up to the judgment of the Project Officer,
after the review of the QA program and possibly clarifying discussions with
the contractor/grantee, whether to accept the program or require upgrading of
the program before project work begins. The Quality Assurance Officer can
help the Project Officer in applying standards that conform with those used
on other IERL-Ci projects of a similar nature.
THE PROJECT WORK PLAN
A work plan should be prepared before commencement of each project. The
work plan normally consists of:
1. A statement of project objectives,
2. Description of the project staff,
3. Facilities and equipment,
4. A sampling plan,
5. An analytical plan,
6. The Project QA Plan, and
7. The project schedule.
It is Agency policy that all extramural projects involving environmental
measurements must have a Project QA Plan (Costle, 1979). Although the Proj-
ect QA Plan, which is generally a section of the Project Work Plan, specifi-
cally addresses quality assurance of the sampling and analysis effort, in the
broader sense the total work plan encompasses aspects of project quality
assurance. For contracts, the QA Plan is an integral part of the Project
Work Plan, which is prepared after award of the contract (see Figure 2) and
submitted to the Project Officer for review and approval before sampling and
analysis begins. For research and demonstration grants or cooperative
agreements, the QA Plan should be a key section of the Study Plan which
accompanies the grantee's formal application. If this is not the case, the
Project Officer should request that the grantee prepare a QA Plan for the
project and submit it for review in the initial stage of the effort.
Elements of a Project QA Plan
Project Objective—
The objective of the project and of the sampling and analysis effort
should be stated in clear and concise terms. From the statement of the
objective, one should derive the level of QA to be applied to sampling and
analysis activities and this, also, should be clearly stated.
The Project Officer should review the recommendations of the contractor
or grantee and assure that he is in agreement with the stated use of the data
and the appropriateness of the level of QA to be applied to the data collec-
tion activities.
48
-------
Project Staff—
Project reporting relationships should be shown. Normally on projects
requiring sampling and analysis, a Quality Assurance Supervisor should be
assigned. His responsibility should be to monitor the quality (internal
audit) of the sampling and analysis program and to assure that stated proce-
dures are, in fact, being employed. He should initiate corrective action
when problems are detected.
Facilities and Equipment—
The facilities, instrumentation, and equipment that are unique or cri-
tical to the success of the effort should be described. This would include:
(1) on-site and in situ instrumentation, (2) sampling equipment, (3) mobile
facilities, (4) temporary facilities, (5) subcontractor- or vendor-supplied
equipment, (6) government equipment or facilities, and (7) special safety
equipment. Operation and maintenance procedures should be described.
Sampling Plan and Methods—
The sampling plan should be discussed in the detail necessary. Sample
points should be precisely located on a site or system diagram and the meth-
ods of collecting the sample(s) should be described. The sampling schedule
should be established, as should the procedures to guarantee representative
samples. The containers and preservatives that are to be used should be
described as well as methods to avoid sample contamination. The sampling
plan should describe how maximum sample holding time limitations can be met
and how samples will be transported to the laboratory.
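One element of the sampling plan lends itself to a simple computational
check. The sketch below verifies holding-time compliance; the 48-hour limit
and the dates shown are hypothetical, since actual limits depend on the
parameter and the methods approved for the project.

    # Sketch of a sample holding-time check (hypothetical 48-hour limit).
    from datetime import datetime

    def within_holding_time(collected, analyzed, limit_hours):
        elapsed = (analyzed - collected).total_seconds() / 3600.0
        return elapsed <= limit_hours

    collected = datetime(1979, 6, 4, 9, 30)    # date/time of collection
    analyzed = datetime(1979, 6, 5, 16, 0)     # date/time analysis began
    print(within_holding_time(collected, analyzed, 48.0))   # True (30.5 hours)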
Analytical Methods—
Standard analytical methods should be employed when applicable. When
standard methods are not available, the methods employed should be documented
in step-by-step detail. For example, method validation would be required for
adaptation of a standard method to a sample matrix other than that for which
it was designed, or for development of a non-standard method. Analytical
detection
limits and an assessment of the anticipated variance, precision, and accuracy
should be stated.
Project QA Plan—
The Project QA Plan should describe the application of the grantee's
general QA program to the project. It should be project-specific and should
describe what steps are to be taken to assure that the resultant sampling and
analytical data are reliable and suitable to project needs. The Project QA
Plan should specify such items as:
1. The steps taken to avoid sample contamination,
2. The method chosen to assure each sample is representative of the
source,
3. The collection of background or baseline samples,
4. The frequency of duplicate sample collection,
5. The use of blank samples and field spiked samples,
6. Split sample analysis,
49
-------
7. The method of establishing data precision, and
8. The method of establishing data accuracy.
Project Schedule—
All project plans require a schedule of accomplishment and milestones.
The schedule should allow adequate time for the sampling and analysis pro-
gram, for QA review of the results, and for corrective actions.
Project Work Plan Checklist
Appendix E is a checklist which has been developed to assist the Project
Officer in reviewing the Project Work Plan submitted by the contractor or
grantee. This checklist is intended to represent the degree of detail that
may be expected of a "model" Project Work Plan. The results of the checklist
responses should help the Project Officer to specify improvements that are
necessary or to have confidence that the project plan adequately addresses QA
for the purposes of the effort. Before use of this checklist, the Project
Officer should have a clear understanding of the level of QA required to meet
project requirements. The IERL-Ci Quality Assurance Officer can assist the
Project Officer in this review and can help him apply standards that conform
with those used on other IERL-Ci projects of a similar nature.
PROJECT QA MONITORING
The third foundation of project quality assurance is external QA
monitoring. To be thoroughly effective, a QA program and a Project QA Plan
should be tested from time to time. This is a responsibility of the Project
Officer. He is encouraged, however, to request the assistance of the lERL-Ci
Quality Assurance Officer.
External QA monitoring may be accomplished by use of some of the same
techniques used for internal audits (page 43), namely: (1) reference
materials analysis, (2) split sample analysis, (3) spiked sample analysis,
(4) side-by-side sample analysis, and (5) reference devices. In addition to
these, the Project Officer may make use of available performance test
samples, or he may conduct a QA audit of the project.
The IERL-Ci Quality Assurance Officer is prepared to assist the Project
Officer in using any of these methods to monitor the quality of the sampling
and analysis aspects of the project. At the request of the Project Officer,
he will arrange for reference or performance test samples to be sent to the
contractor/grantee, he will arrange for a project QA audit, or he will
arrange for split or side-by-side sample analysis.
Methods of Monitoring Contractor/Grantee QA
The QA performance of a contractor or grantee can be monitored from time
to time throughout the project by:
50
-------
1. Review of project reports,
2. Conferences and project reviews,
3. Site visits,
4. Performance tests, and
5. Monitoring any sub-contracts.
The Project Officer should apply these methods throughout the course of a
project.
Review of Project Reports—
The Project Officer has the responsibility of reviewing all technical
project reports submitted by the contractor or grantee. This would include
progress reports, interim reports, task reports, QA reports, and final
project reports. In reviewing project technical reports, the Project Officer
should expect to see a summary of the quality control data in accordance with
the approved Project QA Plan. Any conclusions resulting from the project
should be supported by the sampling and analysis results when one takes the
quality control data into consideration. For example, the author should not
attempt to interpret as different the values for a given parameter measured
at two points in a system unless, with a high level of confidence, the
difference between the values is greater than the variability of the
measurements.
The analytical methods should be discussed in the final project report.
If standard methods have been used, references should be provided to their
source. Non-standard methods should be described in step-by-step detail.
The data should be tabulated or displayed in a logical and understand-
able manner with the appropriate number of significant figures and appropri-
ate units. Detection limits should be indicated for parameters where values
are below the detection limit. Mean values should be reported, but supported
by the range and standard deviation determined by replicate analysis.
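These reporting conventions, and the caution above about interpreting small
differences, can be illustrated with a short sketch; the replicate data are
hypothetical.

    # Sketch: summarize replicate analyses and check whether two reported
    # means differ by more than their combined variability (hypothetical data).
    def summarize(replicates):
        n = len(replicates)
        mean = sum(replicates) / n
        value_range = max(replicates) - min(replicates)
        std = (sum((v - mean) ** 2 for v in replicates) / (n - 1)) ** 0.5
        return mean, value_range, std

    point_1 = [12.1, 12.6, 12.3]    # replicates at sampling point 1 (mg/L)
    point_2 = [13.0, 12.5, 12.8]    # replicates at sampling point 2 (mg/L)

    mean_1, range_1, std_1 = summarize(point_1)
    mean_2, range_2, std_2 = summarize(point_2)

    # A difference smaller than roughly twice the combined standard deviation
    # should not be reported as a real change between the two points.
    combined = (std_1 ** 2 + std_2 ** 2) ** 0.5
    print(abs(mean_2 - mean_1) > 2.0 * combined)   # False: not a defensible difference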
The final report should describe the sampling program and methods.
Sampling points should be shown on an appropriate schematic diagram. Any
specialized sampling techniques should be discussed in detail while standard
methods should be properly referenced.
Conferences and Project Reviews—
Whenever a conference or project review meeting is held, the Project
Officer should make a special effort to review project QA. He should review
field notebooks or laboratory notebooks for completeness. A summary should
be provided of the quality control data and the results reviewed. Are accu-
racy and precision adequate to accomplish the objectives of the program or
must changes be made in either sampling or analytical techniques? Are sample
points properly located to collect representative samples? Are short holding
time parameters being analyzed quickly enough after sample collection? Is
the Project QA Plan being followed or have undocumented changes been made?
Questions of this type should be asked by the Project Officer in order to
test the efficacy of both the contractor's/grantee's QA program and the
Project QA Plan.
51
-------
Site Visits and QA Audits—
Site visits to the facilities of the contractor or grantee and to the
sampling or demonstration site can be conducted as often as practicable and
necessary to keep abreast of technical progress. Such visits can serve as a
major QA function. At least one site visit during the course of the techni-
cal work on a project should incorporate a QA audit.
QA audit procedures and checklists are provided in Section 3, "Quality
Assurance Guidelines for Auditing of Projects Requiring Sampling and
Analysis." The Project Officer is referred to this document for further
information on this very important tool for project QA monitoring.
Performance Tests—
Performance tests may consist of: (1) analysis of performance test
samples provided by the Agency, (2) analysis of split samples by another
laboratory, (3) independent sampling and analysis by another laboratory, or
(4) reference sample analysis.
Performance test samples are prepared samples of known concentration.
They have been prepared and are made available to Project Officers by the EPA
Environmental Monitoring Systems Laboratories (Cincinnati, Las Vegas, and
Research Triangle Park) through the lERL-Ci Quality Assurance Officer.
Reference standards can also be obtained from the National Bureau of
Standards (NBS) through the lERL-Ci Quality Assurance Officer.
If it has been decided to use performance test samples, inform the
lERL-Ci Quality Assurance Officer of your needs for test samples, and he
will arrange to have them sent to the contractor or grantee.
If the Project Officer wishes to have one or more samples split and
analyzed by another laboratory, he should contact the IERL-Ci Quality
Assurance Officer, who can make arrangements for the independent analysis.
In a similar manner, the Project Officer may arrange for independent
sampling and analysis if he considers it necessary as a QA check on the
work of the contractor/grantee.
Sub-Contracts and Sub-Agreements—
In the event sub-contracted services are used for either sampling or
analysis, the Project Officer should review the sub-contractor's QA program
using the checklist in Appendix D. It should meet the same requirements that
would be expected of the prime contractor/grantee if he were doing the work
in-house. The prime contractor/grantee has full responsibility for the
quality of the work just as if it were conducted within his own organization.
When a substantial amount of the effort or a particularly critical part of
the sampling or analysis is to be accomplished by a sub-contractor or by a
sub-agreement, a QA audit should be conducted.
SECTION 3
QUALITY ASSURANCE GUIDELINES FOR AUDITING OF
PROJECTS REQUIRING SAMPLING AND ANALYSIS
INTRODUCTION
Quality Assurance Audits
This section provides guidelines for the evaluation of the performance
of a contractor, grantee, or agency sampling and analysis program and
describes the QA audit process. Specific guidelines are provided to:
1. Determine when to conduct a QA audit,
2. Assist Project Officers in conducting QA audits of grants and
contracts requiring sampling and analysis,
3. Assist those who conduct the QA audit in evaluation of the results,
and
4. Outline the procedures for the reporting of information to the con-
tractor, grantee, or agency doing the work, thereby allowing them to
correct any significant QA deficiencies.
In addition, the Quality Assurance Officer is available to conduct or to
assist in conducting quality assurance audits. A checklist is provided to
facilitate the QA auditing process.
Quality Assurance in Audits
In the procurement phase of contracting, the EPA Project Officer uses
quality assurance criteria to aid in the selection of the most qualified
contractor. Selecting a contractor or specifying QA criteria for
in-house projects and grants, however, does not guarantee that the overall
performance on the project will meet the QA requirements. Therefore, it is
necessary at times to perform laboratory quality assurance audits. These
audits help to assure the Project Officer that all the necessary quality
assurance is being applied by the project team in order to deliver a quality
product. Quality assurance audits allow the Project Officer to determine
that:
1. The Organization and Personnel are qualified to perform assigned
tasks;
2. Adequate Facilities are available;
3. Proper Analytical Methodology is being used;
4. Proper Sampling and Sample Handling Procedures, including chain-
of-custody of samples, are being used;
5. Adequate analytical Quality Control, including reference samples,
control charts, and documented corrective action measures, is being
provided; and
6. Acceptable Data Handling and documentation techniques are being
used.
QUALITY ASSURANCE AUDIT GUIDELINES
When to Conduct a Quality Assurance Audit
A QA audit should be conducted on all projects requiring sampling and
analysis services, especially those which provide data that are:
1. Used to make economically important decisions,
2. Used for regulatory monitoring,
3. Used for regulation promulgation, or
4. Used for enforcement activities.
A QA audit of an on-going project can assure the Project Officer that
adequate QA measures are being taken to yield data of acceptable quality.
It will also convey to the laboratory organization audited that the Project
Officer places a high degree of importance on the quality of sampling and
analytical effort. A QA audit should not be used as a punitive device to
express displeasure with the performance of an organization, but should be
used to jointly establish acceptable QA procedures for the given project.
The Project Officer is dependent on reliable sampling and analytical data to
accomplish the objectives of the project. The contractor, grantee, or gov-
ernment organization conducting the sampling and analysis is also interested
in providing reliable data. It should be remembered that both have the same
objectives. Hence, cooperative efforts toward this end are likely to be
highly successful.
Ideally, a QA audit should be conducted early in a project, preferably
before the first sampling and analysis effort is completed. This is the best
time to influence the outcome of the entire project. If a project plan calls
for periodic sampling and analysis, it is recommended that the QA audit be
scheduled during the first or second phase of the sampling and analytical
effort. For example, if a project is of 12-month duration and requires
monthly sampling, the QA audit should be conducted during the first or second
month of the project. Of course, QA audits may be conducted more than once
during a project or may be repeated when necessary to affirm that acceptable
QA procedures are in use.
Audit Worksheet and Checklist
There are a large number of quality control procedures which can be
observed during the course of a laboratory audit. Since an audit rarely
lasts longer than one day and can often be as short as a few hours, an
auditing procedure must be used which can provide the Project Officer or his
designee with the greatest amount of relevant information in the shortest
amount of time.
The approach taken by these guidelines is to combine all applicable
aspects of a laboratory quality assurance program into the general categories
listed below.
1. Organization and Personnel
2. Facilities
3. Sampling and Sample Handling Procedures
4. Analytical Methodology
5. Quality Control
6. Data Handling
Two forms incorporating the preceding information have been prepared to
assist the Project Officer in performing a laboratory audit. The first form
is a worksheet which is sent to the laboratory scheduled to be audited. This
form should be mailed to the project manager of the laboratory approximately
one month before the anticipated audit.
The laboratory should complete the worksheet and return it to the EPA
Project Officer within two weeks of the anticipated audit. The Project Offi-
cer will then screen the worksheet for areas which may require more detailed
explanations. This screening process should be limited to those areas of
importance in the sampling and analysis efforts of the project.
Some information which is requested in the worksheet may be available in
the original proposal or grant application. The use of this information in
completing the worksheet is encouraged to minimize the QA audit cost.
The second form which has been developed is a checklist for use by the
Project Officer or his representative in performing the audit. The checklist
has been developed to:
1. Verify a representative number of responses received in the
worksheet,
2. Elicit more project-specific information, and
3. Provide the Project Officer additional information to use in
evaluating the laboratory.
The worksheet and checklist have been developed for performing QA audits
on projects with either (1) a scope of work encompassing several scientific
areas, or (2) a limited scope of work for which specific audit procedures
are not available. Specific QA elements for all IERL-Ci projects
are addressed in Section 2. Available auditing procedures are listed in
Table 8. Since auditing procedures are project-specific and can be used to
perform an audit in much greater detail, they should be employed where
available. The laboratory worksheet and the checklist are appended.
TABLE 8. AVAILABLE QUALITY ASSURANCE AUDIT PROCEDURES

Parameter                            Reference

Ambient Sulfur Dioxide               U.S. Environmental Protection
(Pararosaniline), Lab and Field      Agency. 1976. Criteria and
                                     Procedures for the Evaluation of
                                     Ambient Air Monitoring Programs
                                     —Laboratory and Field.

Ambient Nitrogen Dioxide             (same as above)
(Sodium Arsenite), Lab and Field

Total Suspended Particulate          (same as above)
(Hi-Volume Method)

Photochemical Oxidants               (same as above)
(Chemiluminescent)

Carbon Monoxide                      (same as above)
(Non-Dispersive Infrared)

Metals, Organics, and Inorganics     U.S. Environmental Protection
in Drinking Water                    Agency. 1977. Manual for the
                                     Interim Certification of Labora-
                                     tories Involved in Analyzing
                                     Public Drinking Water Supplies.
                                     Washington, D.C.
                                     EPA-600/8-78-008.

Radiation in Drinking Water          U.S. Environmental Protection
                                     Agency. 1979. Handbook for
                                     Analytical Quality Control.
                                     Cincinnati, Ohio.

Bacteria in Drinking Water           (same as above)

                                     U.S. Environmental Protection
                                     Agency. National Environmental
                                     Research Laboratory. 1975.
                                     Handbook for Evaluating Water
                                     Bacteriological Laboratories.
                                     Cincinnati, Ohio.
                                     EPA-670/9-75-006.

                                     U.S. Environmental Protection
                                     Agency. Environmental Monitor-
                                     ing and Support Laboratory.
                                     1978. Microbiological Methods
                                     for Monitoring the Environment.
                                     Cincinnati, Ohio.
                                     EPA-600/8-78-017.

Biological Sampling                  U.S. Environmental Protection
                                     Agency. Environmental Monitor-
                                     ing and Support Laboratory.
                                     1978. Quality Assurance Guide-
                                     lines for Biological Testing.
                                     Las Vegas, Nevada.
                                     EPA-600/4-78-043.
Use of Performance Test Samples
Performance test samples, when available, present the Project Officer
with an alternate method of performing a laboratory audit on a project which
requires sampling and analysis. EPA has prepared and has available test
samples for numerous parameters in many different matrices.
Certain projects will not warrant a full-scale, on-site audit even
though the data produced by the sampling and analysis effort meet the
criteria established above under "When to Conduct a Quality Assurance
Audit." Table 5, page 22, establishes the criteria to be used in
determining the necessity of requiring performance test samples. In
these cases, test samples can be used to audit the analytical
performance of the laboratory.
Other projects may be of sufficient importance to require both an
on-site audit and performance test samples. When problems in analytical
procedures are identified during an on-site audit, performance test samples
can be used to verify that discrepancies have been corrected by the
laboratory.
Performance test samples should not be used indiscriminately. They are
generally expensive for EPA and the laboratory to prepare, analyze, and
evaluate. The supply of test samples is not unlimited; however, in cases
where their value can be established, they should be used. Performance test
samples for audit purposes can be obtained through the Quality Assurance
Officer (QAO), IERL-Ci. The QAO can also provide guidance on the inter-
pretation of the results. A listing of currently available performance test
samples is found in Appendix B.
After performance test samples are analyzed and the results evaluated,
the performing laboratory should be notified of the true value of the samples
and the range of acceptance. In cases where a limited set of samples must
remain unknown to laboratories, it may not be possible to provide the
absolute true values. When it is inadvisable to inform the laboratory of
the true value, the EMSL, through the IERL-Ci Quality Assurance Officer,
should provide information, in writing, as to the acceptability or
unacceptability of the results and whether they tended to be high or low.
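The evaluation itself amounts to comparing the reported result against the true value and its acceptance range. A minimal sketch of that comparison (illustrative values and limits only, not an Agency procedure):

    # Evaluate a performance test result against the known true value
    # and the acceptance range supplied with the sample.
    def evaluate_result(reported, true_value, low, high):
        acceptable = low <= reported <= high
        if reported > true_value:
            bias = "high"
        elif reported < true_value:
            bias = "low"
        else:
            bias = "none"
        return acceptable, bias

    # Example: true value 50.0, acceptance range 45.0 to 55.0.
    ok, bias = evaluate_result(reported=57.2, true_value=50.0, low=45.0, high=55.0)
    print("acceptable" if ok else "unacceptable", "- results tended to be", bias)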
CONDUCTING THE QUALITY ASSURANCE AUDIT
Review of Worksheet
This section describes the three basic steps for performing a QA audit.
These steps are: (1) evaluating the returned worksheet to determine areas to
be examined during the site visit, (2) conducting the site visit, and
(3) evaluating the QA audit results, including preparation of an audit
report. The discussion below describes how to evaluate the worksheet
returned by the organization to be audited; the worksheet itself appears
in Appendix F.
Organization and Personnel—
This section of the worksheet is designed to: (1) familiarize the
person performing the audit with the laboratory's organizational structure,
(2) identify the key personnel involved in the project, and (3) acquaint the
QA auditor with the skills and training of personnel actually involved in the
sampling and analysis effort. The QA auditor should identify the Project
Manager, the person responsible for field sampling (if required on the
project), and the person responsible for the overall analytical program.
Appendix G has been prepared to aid the QA auditor in evaluating the
skills and training of the personnel responsible for sample collection and
analyses. Individuals with assigned responsibilities, as indicated in the
worksheet, should be compared with the qualifications noted in Appendix G
ascertain that the laboratory is placing proper emphasis on quality assur-
ance. Any areas which may appear weak should be noted for discussion during
performance of the QA audit.
Facilities and Equipment—
The instrumentation necessary for the successful performance of a labo-
ratory on a given project is highly dependent on the nature of the project.
The laboratory has been selected based on skills in the project field and,
therefore, will provide a comprehensive list of the necessary instrumenta-
tion. The QA auditor should review the list of major instrumentation in
order to: (1) determine the type of instrumentation he will encounter during
his site visit, (2) note any instrumentation which may have been overlooked
in the preparation of the worksheet, and (3) acquaint himself with any unfam-
iliar instrumentation in order to perform a proper evaluation during the QA
audit. A listing of instrumentation and equipment needed for various
sampling and analysis functions is provided in Appendix G.
Other equipment and facilities, such as analytical balances, lighting,
etc., which were not covered by the worksheet will be evaluated on the check-
list during the QA audit. As a general rule, laboratory space for analytical
requirements should include approximately 120 square feet of working space
and 6 linear feet of unencumbered bench space for each analyst.
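Applied to the staffing level reported in the worksheet, this rule of thumb gives a quick benchmark. A one-step illustration (the staff size is assumed for the example):

    # Rule of thumb from the text: ~120 sq ft of working space and
    # 6 linear ft of unencumbered bench space per analyst.
    analysts = 5                                    # assumed staffing level
    working_space_sqft = 120 * analysts             # 600 sq ft
    bench_space_ft = 6 * analysts                   # 30 linear ft
    print(working_space_sqft, bench_space_ft)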
Analytical Methodology—
Methods of analysis vary according to the sample matrix (e.g., air,
soil, sediment, wastewater, drinking water, seawater, etc.). A list of
accepted methods for various media and scientific fields is presented in
Appendix C. These accepted methods should be compared with the analytical
methodologies listed in the completed worksheet. Any differences between
these methods and the methods listed by the contractor/grantee should be
noted and discussed during the on-site QA audit.
Sampling and Sample Handling Procedures—
If sample collection constitutes a major portion of the work effort, the
QA auditor should consider it as part of the QA audit program and should
study the sampling plan, sample handling, and chain-of-custody procedures
proposed by the laboratory. The QA auditor should make every attempt to meet
with laboratory personnel at the sampling location and to observe the quality
control procedures practiced during sampling. Appendix H has been compiled
to aid the auditor in determining correct sample preservation methods.
Quality Control—
The worksheet poses many basic quality control questions. These
questions should be screened by the QA auditor to note: (1) any questions
answered no, and (2) whether QA manuals, standard operating procedures,
etc., are available for inspection during the on-site survey. Each of the
questions
answered yes is a potential area where the depth of the QA applied by the
laboratory can be probed to determine the degree of commitment to quality
assurance on the project.
Data Handling—
This area of the worksheet allows the QA auditor to determine the
availability of such items as field sampling notebooks and past data. It
also gives some idea of the degree of quality assurance practiced by the
laboratory in: (1) analyzing samples within recommended holding times,
(2) rechecking calculations, (3) rejecting data, and (4) cross-checking
data to reduce the possibility of confusing data and sample numbers.
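The cross-check against sample numbers is easy to mechanize. The sketch below (with hypothetical sample identifiers) compares the numbers in the field notebook with those in the reported results and lists any mismatches:

    # Sample numbers recorded in the field notebook vs. those reported
    # by the laboratory (hypothetical identifiers).
    field_log = {"S-001", "S-002", "S-003", "S-004"}
    lab_results = {"S-001", "S-002", "S-004", "S-005"}

    print("Collected but not reported:", sorted(field_log - lab_results))
    print("Reported but not logged:  ", sorted(lab_results - field_log))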
Site Visit
QA audits should be planned to minimize the amount of time on site and
to maximize the amount of information which can be obtained. Implementation
of these guidelines should assure that the QA auditor's portion of the on-
site inspection can be completed in a maximum of six hours. In meeting this
goal, the following procedures should be used:
1. Contact the Project Manager and arrange a mutually agreeable time
and date for the site visit. If field sampling will be included in
the audit, the necessary arrangements should be made at this time.
2. Study the worksheet completed by the laboratory and identify the key
personnel. Note any questions unresolved by the pre-audit worksheet
on the QA audit form. Ask that the key project personnel be
available for discussions.
3. Prepare an agenda for the meeting which includes: (a) meeting with
key personnel in the laboratory organization, (b) a brief discussion
of the purpose of the QA audit, (c) a tour of the facilities, and
(d) discussions with all personnel who can assist in completing the
QA checklist form (Appendix I). Table 9 is a proposed agenda.
4. If a field audit is to be performed, an agenda should be arranged
with the laboratory to minimize any interference that the audit
might have with the field sampling effort.
TABLE 9. RECOMMENDED AGENDA FOR QA AUDIT SITE VISIT*
A. Meet with Key Project Personnel
1. Introductions
2. Describe purpose of visit
B. Description of the Sampling and Analytical Effort
1. Sampling
2. Analysis
3. Data presentation and interpretation
C. Quality Assurance Program
1. Description
2. Discussions with key project personnel
3. Discussion of worksheet review
D. Tour of Facilities
1. Checklist
2. Discussions with staff personnel
E. Debriefing
1. Preliminary evaluation
2. Serious problems requiring immediate corrective action
3. Schedule for QA audit
*A minimum of one hour should be devoted to each of sections B through E.
Quality Assurance Audit Report
A written report of the QA audit results must be prepared. This report
should include:
1. An assessment of the performing organization's status in each of the
major areas addressed, including personnel, facilities, procedures,
and quality control.
2. A clear statement of areas requiring improvement or problems to be
corrected.
3. A timetable for correction of problem areas.
To effectively achieve QA audit objectives, the report should be sent to
the performing organization as soon as possible, preferably within 15 working
days of completion of the audit evaluation. Corrective action, when
required, should be scheduled to meet the timetable of the project.
Figure 9 is a suggested format for the QA audit report. Both the Pre-
Audit Worksheet and the Audit Checklist should be appended to this report.
The original QA Audit Report is maintained by the IERL-Ci Quality
Assurance Officer, and copies should be distributed as follows:
Copies
1    Project Officer
1    Extramural Laboratory Project Manager
If the QA audit was satisfactory and no specific followup action is
required, the QA audit is complete. If deficiencies have been noted and cor-
rective actions specified, then the Project Officer must assure that correc-
tions have been made in accordance with the specified timetable. Written
communications should be entered into the project file to document the cor-
rection of noted deficiencies. The principal investigator should be invited
to submit any response he desires for inclusion in the project file.
QUALITY ASSURANCE AUDIT REPORT
Project:
Contract/Grant No.: Project Officer:
Laboratory Audited:
City: State:
Laboratory Director:
Audit Conducted By:
Agency:
Date:
PURPOSE AND OBJECTIVES OF THE PROJECT:
BRIEF DESCRIPTION OF THE SAMPLING AND ANALYTICAL REQUIREMENTS:
Figure 9. Quality Assurance Audit Report.
RESULTS OF THE QUALITY ASSURANCE AUDIT
Organization and Personnel
Facilities
Analytical Methodology
Sampling and Sample Handling
Quality Control
Data Handling
Figure 9 (continued)
QUALITY ASSURANCE DEFICIENCIES:
RECOMMENDED CORRECTIVE ACTIONS AND TIMETABLE:
Signed Date
Title
Distribution:
Project Officer
Quality Assurance Officer
Contractor/Grantee
Figure 9 (continued)
SECTION 4
CONCLUSIONS
PROCUREMENT
Section 1 of this document provides suggestions, recommendations, and
procedures whereby Project Officers can introduce QA considerations into the
procurement process when sampling and analysis is involved in the work.
Specific recommendations are given for determining the appropriate weighting
to apply to QA in the proposal evaluation criteria for contracts. Contrac-
tors and grantees must comply with IERL-Ci quality assurance requirements if
the Laboratory is to achieve its assigned mission of developing new and
improved pollution control technology. These quality assurance program
requirements are intended to be stringent but justifiable. Conscientious
application by IERL-Ci Project Officers of the procedures described in
Section 1 should result in equitable treatment of all offerors and grant
applicants.
MONITORING
Section 2 of this document has been prepared by the IERL-Ci Quality
Assurance Officer to describe procedures to maintain the quality of sampling
and analysis activities conducted by contractors and grantees under the
direction or funding of IERL-Ci. Of paramount importance in achieving and
maintaining quality work are the role and responsibilities of the Project
Officer. These responsibilities have been defined and discussed for
contracts, grants, and cooperative agreements.
The three foundations of project QA, namely (1) the contractor's/grantee's
QA program, (2) the Project Work Plan, and (3) project QA monitoring, have
been discussed from the standpoint of the review responsibilities of the
Project
Officer. Checklists are provided to assist the Project Officer in reviewing
contractor/grantee QA programs and project work plans.
If the Laboratory is to achieve its assigned mission of developing new
and improved pollution control technology in a cost-effective manner, then
each project must incorporate sound QA practices. Contractors and grantees
must be informed of and comply with IERL-Ci quality assurance requirements.
AUDITING
Procedures and a checklist are provided in Section 3 for conducting
quality assurance audits of on-going IERL-Ci projects which require sampling
and analysis services. It is recommended that a QA audit be conducted during
the beginning stages of all projects which require significant amounts of
sampling and analysis.
BIBLIOGRAPHY
American Public Health Association, American Water Works Association, and
Water Pollution Control Federation. 1975. Standard Methods for the
Examination of Water and Wastewater. 14th Edition. Washington, D.C.
American Society for Testing and Materials. 1978. Annual Book of ASTM
Standards, Part 31: Water. Philadelphia, Pennsylvania.
Bicking, C., Olin, S., and King, P. 1978. Procedure for the Evaluation of
Environmental Monitoring Laboratories. U.S. Environmental Protection
Agency, Environmental Monitoring and Support Laboratory, Office of
Research and Development, Cincinnati, Ohio. EPA-600/4-78-017.
Costle, D. June 14, 1979. Memorandum—Quality Assurance Requirements for
All EPA Extramural Projects Involving Environmental Measurements.
Environmental Science and Engineering, Inc. 1977. Water Quality Field
Sampling Manual. Gainesville, Florida.
Fairless, B. 1977. Quality Assurance Practices and Procedures.
U.S. Environmental Protection Agency, Region V, Surveillance and
Analysis Division, Chicago, Illinois. EPA-905/4-77-004.
Geldreich, E.E. 1975. Handbook for Evaluating Water Bacteriological
Laboratories. Second Edition. U.S. Environmental Protection Agency,
Office of Research and Development, Municipal Environmental Research
Laboratory, Cincinnati, Ohio. EPA-670/9-75-006.
Kittrell, F.W. 1969. A Practical Guide to Water Quality Studies of Streams.
U.S. Department of Interior, Federal Water Pollution Control
Administration. CWR-5.
Krawczyk, D.F. n.d. Analytical Quality Control. U.S. Environmental
Protection Agency, Pacific Water Laboratory, Corvallis, Oregon.
Linch, A.L. 1973. Quality Control for Sampling and Laboratory Analysis.
In: The Industrial Environment—Its Evaluation and Control,
pp. 277-297. U.S. Department of Health, Education, and Welfare, Public
Health Service, Center for Disease Control, National Institute for
Occupational Safety and Health.
Sherma, J. 1976. Manual of Analytical Quality Control for Pesticides and
Related Compounds in Human and Environmental Samples: A Compendium of
Systematic Procedures Designed to Assist in the Prevention and Control
of Analytical Problems. Prepared for U.S. Environmental Protection
Agency, Office of Research and Development, Health Effects Research
Laboratory, Research Triangle Park, North Carolina. EPA-600/1-76-017.
U.S. Environmental Protection Agency. n.d. EPA Project Officer's Guide
(Research & Demonstration Grants). U.S. Environmental Protection
Agency, Office of Planning and Management, Office of Administra-
tion, Grants Administration Division, Washington, D.C.
U.S. Environmental Protection Agency. n.d. Guidance Package for Evaluation
of State Laboratories (Source Sampling)—Draft. Cincinnati, Ohio.
U.S. Environmental Protection Agency. 1976. Minimal Criteria and Procedures
for the Evaluation of Ambient Air Monitoring Programs—Laboratory and
Field. Draft III.
U.S. Environmental Protection Agency. 1977. Materials from Course 470—
Quality Assurance for Air Pollution Measurement Systems. Research
Triangle Park, North Carolina.
U.S. Environmental Protection Agency. 1977. Procurement Information
Notice PIN 77-15—Source Evaluation and Selection Procedures.
U.S. Environmental Protection Agency. 1978. Project Management and the
Procurement Process: A Seminar Workshop for Project Officers and
Other Technical Personnel. Washington, D.C. 203 pp.
U.S. Environmental Protection Agency. Health Effects Research Laboratory.
Environmental Toxicology Division. 1974, 1977 rev. ed. Analysis of
Pesticide Residues in Human and Environmental Samples: A Compilation
of Methods Selected for Use in Pesticide Monitoring Programs. Edited by
J.F. Thompson. Research Triangle Park, North Carolina.
U.S. Environmental Protection Agency. Office of Research and Development.
Environmental Monitoring and Support Laboratory. 1976. Quality
Assurance Handbook for Air Pollution Measurement Systems: Volume I—
Principles. Research Triangle Park, North Carolina. EPA-600/9-76-005.
U.S. Environmental Protection Agency. Office of Research and Development.
Environmental Monitoring and Support Laboratory. 1977. Quality
Assurance Handbook for Air Pollution Measurement Systems: Volume II—
Ambient Air Specific Methods. Research Triangle Park, North Carolina.
EPA-600/4-77-027a.
U.S. Environmental Protection Agency. Office of Research and Development.
Environmental Monitoring and Support Laboratory. 1978. Environmental
Radioactivity Laboratory Intercomparison Studies Program, 1978-1979.
Las Vegas, Nevada. EPA-600/4-78-032.
U.S. Environmental Protection Agency. Office of Research and Development.
Environmental Monitoring and Support Laboratory. 1979. Handbook for
Analytical Quality Control in Water and Wastewater Laboratories.
EPA-600/4-79-019.
U.S. Environmental Protection Agency. Office of Research and Development.
Environmental Monitoring and Support Laboratory. 1979. Methods for
Chemical Analysis of Water and Wastes. Cincinnati, Ohio.
EPA-600/4-79-020.
U.S. Environmental Protection Agency. Office of Water Planning and
Standards. Monitoring and Data Support Division and Environmental
Monitoring and Support Laboratory. 1975. Minimal Requirements for a
Water Quality Assurance Program. Cincinnati, Ohio. EPA-440/9-75-010.
U.S. Geological Survey. Office of Water Data Coordination. 1977. National
Handbook of Recommended Methods for Water-Data Acquisition. Reston,
Virginia.
Water Supply Quality Assurance Work Group. 1977. Manual for the Interim
Certification of Laboratories Involved in Analyzing Public Drinking
Water Supplies—Criteria and Procedures. Prepared for the
U.S. Environmental Protection Agency, Washington, D.C.
Weber, C.I., ed. 1973. Biological Field and Laboratory Methods for Meas-
uring the Quality of Surface Waters and Effluents. U.S. Environmental
Protection Agency, National Environmental Research Center, Office of
Research and Development, Cincinnati, Ohio. EPA-670/4-73-001.
APPENDIX A
QUALITY ASSURANCE EVALUATION CRITERIA CHECKLIST FOR
PROPOSALS AND GRANT APPLICATIONS OFFERING
SAMPLING AND ANALYSIS SERVICES
                                            Scoring     Numerical      Individual
Criteria                                     Value    x   Weight    =    Score

A. Quality assurance management policy/written procedures.

1a. Does the offeror have an                          x 5 =
    on-going QA program?

2a. Does the offeror have a written                   x 4 =
    QA manual that he will make
    available for review?

3a. Has the offeror designated                        x 3 =
    a QA coordinator or a QA
    supervisor who reports to senior
    management levels?

4a. Does the proposed project manage-                 x 2 =
    ment structure provide for
    adequate QA?

5a. Will a project-specific QC plan                   x 1 =
    be prepared before commencement
    of sampling and analysis?

Total Score for Sub-element A.
Maximum Possible Score: 75

Percent of maximum possible score awarded for Sub-element A (circle
closest value).

    0%      20%      40%      60%      80%      100%

Score for this sub-element of the proposal evaluation criteria (circle
corresponding value).

    0    1    2a    2b    3    4    5
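The arithmetic behind the form is simple: each circled scoring value (0 through 5) is multiplied by the fixed weight for that criterion, the products are summed, and the total is expressed as a percent of the maximum possible score (the weights above sum to 15, so the maximum is 15 x 5 = 75). A minimal sketch of the computation, with illustrative scoring values:

    # Weights for criteria 1a-5a, taken from the form above.
    weights = [5, 4, 3, 2, 1]
    # Illustrative scoring values (0-5) circled by the evaluator.
    values = [4, 3, 5, 4, 2]

    individual = [v * w for v, w in zip(values, weights)]
    total = sum(individual)                 # total score for Sub-element A
    maximum = 5 * sum(weights)              # 75
    print(total, maximum, round(100 * total / maximum))   # 57 75 76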
                                            Scoring     Numerical      Individual
Criteria                                     Value    x   Weight    =    Score
B. Quality assurance procedures for sampling.

1b. Are sampling locations chosen
    to assure representative samples
    will be taken?

2b. Will the proposed sampling
    program yield data of statisti-
    cal significance as appropriate
    to the objectives of the project
    (e.g., replicate samples,
    background samples, etc., should
    be discussed)?

3b. Does the offeror show an under-
    standing of the proper techniques
    used to collect representative
    samples while avoiding sample
    contamination?

4b. Does the offeror have access to
    the appropriate sampling
    equipment?

5b. Are samples to be shipped
    promptly to the laboratory to
    meet maximum sample holding
    time limitations?

6b. Are appropriate sample preserva-
    tion methods proposed?

7b. Are sample chain-of-custody
    procedures described?

Total Score for Sub-element B.
Maximum Possible Score: 75

Percent of maximum possible score awarded for Sub-element B (circle
closest value).

    0%      20%      40%      60%      80%      100%

Score for this sub-element of the proposal evaluation criteria (circle
corresponding value).

    0    1    2a    2b    3    4    5
                                            Scoring     Numerical      Individual
Criteria                                     Value    x   Weight    =    Score
C. Quality assurance procedures for analysis.

1c. Does the offeror intend to use
    standard analytical methods*
    where available? If standard
    methods are not available, will
    the methods used be documented?

2c. Does the offeror have a labora-
    tory QC program which specifies
    at least 5-10 percent sample
    replication and 5 percent spiked
    sample analysis?

3c. Is high quality analytical
    instrumentation available for
    use on the project?

4c. Are laboratory facilities                         x 2 =
    adequate?

5c. Are analytical detection limits                   x 2 =
    adequate for the purposes of
    the project?

6c. Does the offeror participate                      x 2 =
    in EPA and/or other interlabora-
    tory QC programs?

Total Score for Sub-element C.
Maximum Possible Score: 75

Percent of maximum possible score awarded for Sub-element C (circle
closest value).

    0%      20%      40%      60%      80%      100%

Score for this sub-element of the proposal evaluation criteria (circle
corresponding value).

    0    1    2a    2b    3    4    5

*Appendix C provides a listing of acceptable EPA analytical methods.
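Criterion 2c sets minimum frequencies for replicates and spikes; what those percentages imply for a given batch is a one-line calculation (the batch size below is assumed for illustration):

    import math

    batch_size = 40                                      # assumed batch of samples
    replicates_at_5pct = math.ceil(0.05 * batch_size)    # at least 2 replicates
    replicates_at_10pct = math.ceil(0.10 * batch_size)   # up to 4 at the 10% level
    spiked_samples = math.ceil(0.05 * batch_size)        # at least 2 spiked samples
    print(replicates_at_5pct, replicates_at_10pct, spiked_samples)  # 2 4 2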
                                            Scoring     Numerical      Individual
Criteria                                     Value    x   Weight    =    Score
D. Quality assurance procedures for data management.

1d. Does the offeror possess                          x __ =
    appropriate data handling,
    processing, and retrieval
    capabilities?

2d. Will QC data (e.g., standard                      x __ =
    curves, duplicate results, spike
    sample results) be maintained and
    be accessible to the Project
    Officer?

3d. Does the organization routinely                   x 2 =
    maintain analytical performance
    records such as quality control
    charts?

4d. Are all laboratory results and                    x 3 =
    QC data reviewed by laboratory
    supervisory personnel?

5d. Are all data and records retained                 x 1 =
    for a minimum of 3 years?

6d. Are field notebooks used to                       x 3 =
    record sampling and engineering
    data (e.g., sample number, date/
    time of collection, flow,
    operating conditions, etc.)?

Total Score for Sub-element D.
Maximum Possible Score: 75

Percent of maximum possible score awarded for Sub-element D (circle
closest value).

    0%      20%      40%      60%      80%      100%

Score for this sub-element of the proposal evaluation criteria (circle
corresponding value).

    0    1    2a    2b    3    4    5
APPENDIX B
QUALITY CONTROL PERFORMANCE/REFERENCE TEST SAMPLES
The information contained in this Appendix consists of a listing of
performance/reference test samples and their sources. In addition to those
samples currently available, new materials expected to be available in FY 79
are also listed.
WATER QUALITY PARAMETERS (Available from EMSL-Ci through the QA Officer,
IERL-Ci)
Mineral/Physical Analyses
Calcium
Magnesium
Sodium
Potassium
Alkalinity
Sulfate
Chloride/Fluoride
Residue, Total Filterable
Hardness, Total
pH
Conductance
Demand Analyses
Organic Carbon
Chemical Oxygen Demand
Biochemical Oxygen Demand
Nutrients
NH3 - N
NO3 - N
PO4 - P
Kjeldahl - N
Phosphorus, Total
Nitrilotriacetic Acid (NTA)
Trace Metals
Aluminum
Arsenic
Beryllium
Cadmium
Cobalt
Chromium
Copper
Iron
Lead
Manganese
Mercury
Nickel
Selenium
Vanadium
Zinc
Mercury (Organic & Inorganic)
Mercury, Total
Linear Alkylate Sulfonate
EPA/SDA Standard Solution,
5% Active
Residue Analysis
Residue, Total Non-Filterable—
fine particles
Residue, Total Non-Filterable—
fibrous material
Residue, Total Non-Filterable—
coarse particles
Chlorophyll
Chlorophyll, a, b, c, and
pheophytin for Spectrophoto-
metric Analyses
Chlorophyll, a and pheophytin
for Fluorometric Analyses
Volatile Organics
1,2-Dichloroethane
Chloroform
1,1,1-Trichloroethane
1,1,2-Trichloroethylene
Carbon Tetrachloride
1,1,2,2-Tetrachloroethylene
Bromodichloromethane
Dibromochloromethane
Bromoform
Polychlorobiphenyls
Aroclor 1016
Aroclor 1254
Petroleum Hydrocarbons for
Characterization
Two Crude Oils
#2 Fuel Oil
Bunker C
Pesticides, Chlorinated Hydrocarbon
Aldrin
Chlordane
Dieldrin
Heptachlor
DDT
DDD
DDE
Acid Extracts (Phenolics)
Municipal Digested Sludge
Cyanide
Oil/Grease
Trihalomethanes
Turbidity
Free Chlorine
Base/Neutral Extracts
Phthalates
Nitro- and Chloro-benzenes
PNAs
Benzidine (Standard Only)
2,4-Dinitrotoluene
2,6-Dinitrotoluene
Isophorone
Nitrobenzene
Acenaphthene
1,2,4-Trichlorobenzene
Hexachlorobenzene
Hexachloroethane
2-Chloronaphthalene
1,2-Dichlorobenzene
1,3-Dichlorobenzene
1,4-Dichlorobenzene
Fluoranthene
Hexachlorobutadiene
Hexachlorocyclopentadiene
Naphthalene
Dimethylphthalate
Diethylphthalate
Di-n-butylphthalate
Butylbenzylphthalate
bis(2-ethylhexyl)phthalate
1,2-Benzanthracene
Benzo(a)pyrene
Chrysene
Acenaphthylene
Anthracene
1,12-Benzoperylene
Fluorene
Phenanthrene
1,2,5,6-Dibenzanthracene
Indeno(1,2,3-c,d)pyrene
Pyrene
WATER SUPPLY QC SAMPLES (Available from QA Coordinators through the QA
Officer, IERL-Ci)
Metals
Arsenic
Barium
Cadmium
Chromium
Lead
Mercury
Selenium
Silver
Nitrate/Fluoride
Trihalomethanes
Pesticides
Endrin
Lindane
Methoxychlor
Toxaphene
Herbicides
2,4-D
2,4,5-TP (Silvex)
PRIORITY POLLUTANTS (Available from EMSL through the QA Officer,
IERL-Ci)
Pesticides
Toxaphene
Chlordane
Endrin
Heptachlor
Aldrin
Dieldrin
4,4-DDT
4,4-DDE
4,4-DDD
BHC
Aroclors
PCB-1016
PCB-1254
Purgeable Compounds
Benzene
Carbon Tetrachloride
Chlorobenzene
1,2-Dichloroethane
1,1,1-Trichloroethane
1,1-Dichloroethane
1,1,2-Trichloroethane
1,1,2,2-Tetrachloroethane
Chloroform
1,1-Dichloroethylene
1,2-trans-Dichloroethylene
1,2-Dichloropropane
1,3-Dichloropropylene
Ethylbenzene
Methylene chloride
Bromoform
Dichlorobromomethane
Trichlorofluoromethane
Tetrachloroethylene
Toluene
Trichloroethylene
o-, m-, and p-Xylene
RADIATION (Available from EMSL-Las Vegas
through the QA Officer, IERL-Ci)

Water
gross alpha, gross beta
Cs-134, -137
H-3
Pu-239
Ra-226, -228
Sr-89, -90

Water Laboratory Certification Samples
Blind Sample

Milk
Sr-89, -90
Cs-137, Ba-140

Air
gross alpha, gross beta

Soil
Pu-238, -239
Th-228, -230, -232

Diet
Sr-89, -90
Ba-140

Urine

Gas
AIR PARAMETERS (Available from EMSL-RTP
through the QA Officer, IERL-Ci)
NO2 (ambient)
SO2 (ambient)
SO4/NO3 (glass-fiber filter)
NH4SO4 (glass-fiber and teflon filter)
Pb (glass-fiber and membrane filter)
As (glass-fiber filter)
NO (cylinders, 3 levels)
NO2 (cylinders)
SO2 (cylinders)
CO (cylinders, 3 levels)
CH4 (cylinders, 9 levels)
Zero Air (for HC and NOx)
SO2 (source)
NOx (source)
Benzene (cylinders, 2 levels)
Ethylene (cylinders, 3 levels)
Methane/Ethane (cylinders, 4 levels)
Propane (cylinders, 2 levels)
Propylene (cylinders, 2 levels)
Toluene (cylinders, 2 levels)
Methyl Acetate (cylinders, 2 levels)
Vinyl Chloride (cylinders, 2 levels)
Hydrogen Sulfide (cylinders, 3 levels)
m-Xylene (cylinders, 2 levels)
Chloroform (cylinders, 2 levels)
Perchloroethylene (cylinders, 2 levels)
Butadiene (cylinders)
Carbonyl Sulfide (cylinders, 4 levels)
Hexane (cylinders, 4 levels)
Methyl Mercaptan (cylinders, 4 levels)
Methyl Ethyl Ketone (cylinders)
Trichloroethylene (cylinders, 2 levels)
Vinylidene Chloride (cylinders, 2 levels)
1,2-Dichloroethane (cylinders, 2 levels)
Acetaldehyde (cylinders, 2 levels)
Propylene Dichloride (cylinders, 2 levels)
1,2-Dibromoethylene (cylinders, 2 levels)
Acrylonitrile (cylinders, 2 levels)
Cyclohexane (cylinders)
Methanol (cylinders)
APPENDIX C
EPA ACCEPTED ANALYTICAL METHODS
TABLE C-1. WATER QUALITY MEASUREMENT METHODS

Parameter                           Method                                    Reference*

Acidity, as CaCO3, mg/l             Electrometric end point (pH of 8.2)       1, 2, 3, 4
                                    or phenolphthalein end point

Alkalinity, as CaCO3, mg/l          Electrometric titration (only to          1, 2, 3, 4
                                    pH 4.5), manual or automated, or
                                    equivalent automated methods

Ammonia (as N), mg/l                Manual distillation (at pH 9.5)           1, 2, 3, 4
                                    followed by nesslerization, titration,
                                    or electrode; automated phenolate

Bacteria
  Coliform (fecal), no./100 ml      MPN; membrane filter                      2, 4
  Coliform (total), no./100 ml      MPN; membrane filter                      2, 4
  Fecal streptococci, no./100 ml    MPN; membrane filter; plate count         2, 4

Benzidine, mg/l                     Oxidation—colorimetric                    5

Biochemical oxygen demand 5-d       Winkler (Azide modification) or           2, 4
(BOD5), mg/l                        electrode method

Bromide, mg/l                       Titrimetric, iodine-iodate                1, 3, 4

Chemical oxygen demand (COD),       Dichromate reflux                         1, 2, 3, 4
mg/l

Chloride, mg/l                      Silver nitrate; mercuric nitrate; or      1, 2, 3, 4
                                    automated colorimetric-ferricyanide

Chlorinated organic compounds       Gas chromatography                        6
(except pesticides), mg/l

Chlorine—total residual, mg/l       Iodometric titration, amperometric        1, 2, 3
                                    or starch-iodine end point; DPD
                                    colorimetric or titrimetric methods
                                    (these last 2 are interim methods
                                    pending laboratory testing)

Color, platinum cobalt units        Colorimetric; spectrophotometric; or      1, 2, 4
or dominant wavelength, hue,        ADMI procedure
luminance, purity
Cyanide, total, mg/l                Distillation followed by silver           1, 2, 3, 4
                                    nitrate titration or pyridine
                                    pyrazolone (or barbituric acid)
                                    colorimetric

Dissolved oxygen, mg/l              Winkler (Azide modification) or           1, 2, 3, 4
                                    electrode method

Fluoride, mg/l                      Distillation followed by ion              1, 2, 3, 4
                                    electrode; SPADNS; or automated
                                    complexone

Hardness—total, as CaCO3, mg/l      EDTA titration; automated                 1, 2, 3, 4
                                    colorimetric; or atomic absorption
                                    (sum of Ca and Mg as their
                                    respective carbonates)

Hydrogen ion (pH), pH units         Electrometric measurement                 1, 2, 3, 4

Kjeldahl nitrogen (as N), mg/l      Digestion and distillation followed       1, 2, 4
                                    by nesslerization, titration, or
                                    electrode; automated digestion,
                                    automated phenolate

Metals
  All dissolved metals              0.45 micron filtration(7) followed        1, 2, 4
                                    by referenced method for total metal

  Aluminum—total, mg/l              Digestion(8) followed by atomic           1, 2, 4
                                    absorption(9) or by colorimetric
                                    (Eriochrome Cyanine R)

  Antimony—total, mg/l              Digestion(8) followed by atomic           1
                                    absorption(9)

  Arsenic—total, mg/l               Digestion followed by silver              1, 2, 4
                                    diethyldithiocarbamate; or atomic
                                    absorption(9)

  Barium—total, mg/l                Digestion(8) followed by atomic           1, 2, 4
                                    absorption(9)

  Beryllium—total, mg/l             Digestion(8) followed by atomic           1, 2, 4
                                    absorption(9) or by colorimetric
                                    (Aluminon)

  Boron—total, mg/l                 Colorimetric (Curcumin)                   1, 2
  Cadmium—total, mg/l               Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Dithizone)

  Calcium—total, mg/l               Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption; or EDTA titration

  Chromium VI, mg/l                 Extraction and atomic absorption;         1, 2, 4
                                    colorimetric (Diphenylcarbazide)

  Chromium—total, mg/l              Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Diphenylcarbazide)

  Cobalt—total, mg/l                Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9)

  Copper—total, mg/l                Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Neocuproine)

  Gold—total, mg/l                  Digestion(8) followed by atomic
                                    absorption(9)

  Iridium—total, mg/l               Digestion(8) followed by atomic
                                    absorption(9)

  Iron—total, mg/l                  Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Phenanthroline)

  Lead—total, mg/l                  Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Dithizone)

  Magnesium—total, mg/l             Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or gravimetric

  Manganese—total, mg/l             Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Persulfate or periodate)

  Mercury—total, mg/l               Flameless atomic absorption               1, 2, 3, 4

  Molybdenum—total, mg/l            Digestion(8) followed by atomic           1, 3
                                    absorption(9)

  Nickel—total, mg/l                Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Heptoxime)

  Osmium—total, mg/l                Digestion(8) followed by atomic
                                    absorption(9)
  Palladium—total, mg/l             Digestion(8) followed by atomic
                                    absorption(9)

  Platinum—total, mg/l              Digestion(8) followed by atomic
                                    absorption(9)

  Potassium—total, mg/l             Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption, colorimetric
                                    (Cobaltinitrite), or by flame
                                    photometric

  Rhodium—total, mg/l               Digestion(8) followed by atomic
                                    absorption(9)

  Ruthenium—total, mg/l             Digestion(8) followed by atomic
                                    absorption(9)

  Selenium—total, mg/l              Digestion(8) followed by atomic           1, 2
                                    absorption(9)

  Silica—dissolved, mg/l            0.45 micron filtration(7) followed        1, 2, 3, 4
                                    by colorimetric (Molybdosilicate)

  Silver—total, mg/l                Digestion(8) followed by atomic           1, 2, 4
                                    absorption(9) or by colorimetric
                                    (Dithizone)

  Sodium—total, mg/l                Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption or by flame photometric

  Thallium—total, mg/l              Digestion(8) followed by atomic           1
                                    absorption(9)

  Tin—total, mg/l                   Digestion(8) followed by atomic           1, 4
                                    absorption(9)

  Titanium—total, mg/l              Digestion(8) followed by atomic           1
                                    absorption(9)

  Vanadium—total, mg/l              Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Gallic acid)

  Zinc—total, mg/l                  Digestion(8) followed by atomic           1, 2, 3, 4
                                    absorption(9) or by colorimetric
                                    (Dithizone)

Nitrate (as N), mg/l                Cadmium reduction; brucine sulfate;       1, 2, 3, 4
                                    automated cadmium or hydrazine
                                    reduction(10)

Nitrite (as N), mg/l                Manual or automated colorimetric          1, 2, 4
                                    (Diazotization)

Oil and grease, mg/l                Liquid-liquid extraction with             1, 2
                                    trichloro-trifluoroethane;
                                    gravimetric
Organic carbon—total (TOC),         Combustion—infrared method                1, 2, 3
mg/l

Organic nitrogen (as N), mg/l       Kjeldahl nitrogen minus ammonia           1, 2, 4
                                    nitrogen

Orthophosphate (as P), mg/l         Manual or automated ascorbic acid         1, 2, 3, 4
                                    reduction

Pentachlorophenol, mg/l             Gas chromatography(6)                     2, 3

Pesticides, mg/l                    (see note 6)

Phenols, mg/l                       Distillation followed by                  1, 2, 3
                                    colorimetric (4AAP)

Phosphorus (elemental), mg/l        Gas chromatography                        12

Phosphorus—total (as P), mg/l       Persulfate digestion followed by          1, 2, 3, 4
                                    manual or automated ascorbic acid
                                    reduction

Radiological
  Alpha—total, pCi per liter        Proportional or scintillation             2, 3, 4
                                    counter
  Alpha—counting error,             (same as above)                           2, 3, 4
  pCi per liter
  Beta—total, pCi per liter         Proportional counter                      2, 3, 4
  Beta—counting error,              (same as above)                           2, 3, 4
  pCi per liter
  Radium—total, pCi per liter       Scintillation counter                     2, 3
  Ra-226, pCi per liter             (same as above)                           1, 4

Residue
  Total, mg/l                       Gravimetric, 103 to 105°C                 1, 2
  Total dissolved (filterable),     Glass fiber filtration, 180°C             1, 2
  mg/l
  Total suspended                   Glass fiber filtration,                   1, 2
  (nonfilterable), mg/l             103 to 105°C
  Settleable, ml/l or mg/l          Volumetric or gravimetric                 2
  Total volatile, mg/l              Gravimetric, 550°C                        1, 2

Specific conductance, micromhos     Wheatstone bridge conductimetry           1, 2, 3, 4
per centimeter at 25°C
Sulfate (as SO4), mg/l              Gravimetric; turbidimetric; or            2, 3
                                    automated colorimetric (barium
                                    chloranilate)

Sulfide (as S), mg/l                Titrimetric—iodine for levels             1, 3, 4
                                    greater than 1 mg per liter;
                                    methylene blue photometric

Sulfite (as SO3), mg/l              Titrimetric, iodine-iodate                1, 2, 3

Surfactants, mg/l                   Colorimetric (Methylene blue)             1, 2, 3, 4

Temperature, °C                     Calibrated glass or electrometric         1, 2, 4
                                    thermometer

Turbidity, NTU                      Nephelometric                             1, 2, 3, 4

*References:

1  Methods for Chemical Analysis of Water and Wastes. 1979. U.S. Environmental
   Protection Agency, Office of Research and Development, Environmental Monitoring
   and Support Laboratory. Cincinnati, Ohio.

2  Standard Methods for the Examination of Water and Wastewater, 14th edition.
   1975. American Public Health Association, American Water Works Association,
   and Water Pollution Control Federation. Washington, D.C.

3  Annual Book of ASTM Standards, Part 31: Water. 1978. American Society for
   Testing and Materials. Philadelphia, Pennsylvania.

4  All references for USGS methods, unless otherwise noted, are to Brown, E.,
   Skougstad, M.W., and Fishman, M.J., "Methods for Collection and Analysis of
   Water Samples for Dissolved Minerals and Gases," U.S. Geological Survey
   Techniques of Water-Resources Inv., Book 5, Ch. A1 (1970).
5  Adequately tested methods for benzidine are not available. Until approved
   methods are available, the following interim method can be used for the
   estimation of benzidine: "Method for Benzidine and Its Salts in Wastewaters,"
   available from the Environmental Monitoring and Support Laboratory,
   U.S. Environmental Protection Agency, Cincinnati, Ohio 45268.

6  Procedures for pentachlorophenol, chlorinated organic compounds, and
   pesticides can be obtained from the Environmental Monitoring and Support
   Laboratory, U.S. Environmental Protection Agency, Cincinnati, Ohio 45268.

7  Dissolved metals are defined as those constituents which will pass through a
   0.45 µm membrane filter. A prefiltration is permissible to free the sample
   from larger suspended solids. Filter the sample as soon as practical after
   collection, using the first 50 to 100 ml to rinse the filter flask. (Glass or
   plastic filtering apparatus are recommended to avoid possible contamination.)
   Discard the portion used to rinse the flask and collect the required volume
   of filtrate. Acidify the filtrate with 1:1 redistilled HNO3 to a pH of 2.
   Normally, 3 ml of (1:1) acid per liter should be sufficient to preserve the
   samples.

8  For the determination of total metals the sample is not filtered before
   processing. Because vigorous digestion procedures may result in a loss of
   certain metals through precipitation, a less vigorous treatment is
   recommended, as given on page 83 (4.1.4) of "Methods for Chemical Analysis of
   Water and Wastes" (1974). In those instances where a more vigorous digestion
   is desired, the procedure on page 82 (4.1.3) should be followed. For the
   measurement of the noble metal series (gold, iridium, osmium, palladium,
   platinum, rhodium, and ruthenium), an aqua regia digestion is to be
   substituted as follows: Transfer a representative aliquot of the well-mixed
   sample to a Griffin beaker and add 3 ml of concentrated redistilled HNO3.
   Place the beaker on a steam bath and evaporate to dryness. Cool the beaker
   and cautiously add a 5 ml portion of aqua regia. (Aqua regia is prepared
   immediately before use by carefully adding 3 volumes of concentrated HCl to
   one volume of concentrated HNO3.) Cover the beaker with a watch glass and
   return to the steam bath. Continue heating the covered beaker for 50 min.
   Remove the cover and evaporate to dryness. Cool and take up the residue in a
   small quantity of 1:1 HCl. Wash down the beaker walls and watch glass with
   distilled water and filter the sample to remove silicates and other insoluble
   material that could clog the atomizer. Adjust the volume to some
   predetermined value based on the expected metal concentration. The sample is
   now ready for analysis.

9  Since the various furnace devices (flameless AA) are essentially atomic
   absorption techniques, they are considered to be approved test methods.
   Methods of standard addition are to be followed as noted on page 78 of
   "Methods for Chemical Analysis of Water and Wastes," 1974.

10 The automated hydrazine reduction method is available from the Environmental
   Monitoring and Support Laboratory, U.S. Environmental Protection Agency,
   Cincinnati, Ohio 45268.

11 Goerlitz, D., and Brown, E., "Methods for Analysis of Organic Substances in
   Water," U.S. Geological Survey Techniques of Water-Resources Inv., Book 5,
   Ch. A3 (1972).

12 Addison, R.F., and Ackman, R.G., "Direct Determination of Elemental
   Phosphorus by Gas-Liquid Chromatography," Journal of Chromatography, Vol. 47,
   No. 3, pp. 421-426, 1970.
TABLE C-2. DRINKING WATER MEASUREMENT METHODS

Parameter                          Method                       Reference

Organics                           Gas chromatography with
                                   electron capture detector

(a) Chlorinated Hydrocarbons:                                   Methods for Organochlorine
    Endrin                                                      Pesticides in Industrial
    Lindane                                                     Effluents, MDQARL, EPA,
    Methoxychlor                                                Cincinnati, Ohio, 1973
    Toxaphene

(b) Chlorophenoxys:                                             Methods for Chlorinated
    2,4-Dichlorophenoxyacetic                                   Phenoxy Acid Herbicides in
    acid                                                        Industrial Effluents,
    2,4,5-Trichloro-                                            MDQARL, EPA, Cincinnati,
    phenoxypropionic acid                                       Ohio, 1973

Radiation                                                       Code of Federal Regulations,
                                                                40 (Parts 100 to 399):
                                                                169-197

Inorganic Chemicals                                             (same as above)

Physical Measurements                                           (same as above)

Microbiological Measurements                                    (same as above)
TABLE C-3. AMBIENT AIR MEASUREMENT METHODS

Pollutant                    Measurement method or principle    Reference

Suspended Particulates       High volume sampler;               CFR 40, Part 50, Appendix B,
                             tape sampler                       July 1, 1979

Sulfur Dioxide               Pararosaniline or equivalent       CFR 40, Part 50, Appendix A,
                                                                July 1, 1979

Carbon Monoxide              Nondispersive infrared             CFR 40, Part 50, Appendix C,
                             or equivalent                      July 1, 1979

Photochemical Oxidants       Gas phase chemiluminescence        CFR 40, Part 50, Appendix D,
                             or equivalent                      July 1, 1979

Nitrogen Dioxide             Gas phase chemiluminescence        CFR 40, Part 50, Appendix F,
                             or equivalent                      July 1, 1979
TABLE C-4. SOURCE AIR METHODS

Determination                      Description of method             Reference

Sample and Velocity Traverses                                        EPA Method 1, Environmental
for Stationary Sources                                               Protection Agency Performance
                                                                     Test Methods, page 1-1,
                                                                     EPA-340/1-78-011

Stack Gas Velocity                 Pitot                             EPA Method 2, 1-13

Dry Molecular Weight of Gas        Orsat                             EPA Method 3, 1-45

Stack Gas Moisture                 Volumetric and gravimetric        EPA Method 4, 1-61

Particulate Emissions              Gravimetric                       EPA Method 5, 1-79

Sulfur Dioxide                     Collection by impinger,           EPA Method 6, 1-119
                                   analysis by barium
                                   perchlorate titration

Nitrogen Oxide                     Collection by evacuated           EPA Method 7, 1-135
                                   flask, colorimetric
                                   analysis

Sulfuric Acid Mist and             Collection by impinger,           EPA Method 8, 1-157
Sulfur Dioxide                     analysis by barium
                                   perchlorate titration

Visible Emissions                  Certified observer                EPA Method 9

Carbon Monoxide                    Non-dispersive infrared           EPA Method 10

Hydrogen Sulfide                   Collection by impinger,           EPA Method 11
                                   iodimetric titration
Fluoride                           Collection by impinger,           EPA Method 13A (Colorimetric)
                                   colorimetric, or specific         or 13B (Specific Ion
                                   ion electrode                     Electrode)

Sulfur Compounds                   Gas chromatographic               EPA Method 15
                                   determination of sulfur
                                   gases emitted by a
                                   Claus Sulfur Recovery Unit

Sulfur Compounds                   Gas chromatographic               EPA Method 16
                                   determination of reduced
                                   sulfur compounds emitted
                                   by paper mills

Particulate Matter                 In-stack filter                   EPA Method 17
                                   determination of parti-
                                   culate matter
TABLE C-5. PRIORITY POLLUTANT MEASUREMENT METHODS
Recommended analytical methods for priority pollutants are described
in "Sampling and Analysis Procedures for Screening of Industrial Effluents
for Priority Pollutants".
These guidelines for sampling and analysis of industrial wastes have
been prepared by the staff of the Environmental Monitoring and Support
Laboratory - Cincinnati, at the request of the Effluent Guidelines
Division, Office of Water and Hazardous Wastes, and with the cooperation
of the Environmental Research Laboratory, Athens, Georgia. The procedures
represent the current state of the art, but improvements are anticipated
as more experience with a wide variety of industrial wastes is obtained.
Users of these methods are encouraged to identify problems encountered and
to assist in updating the test procedures by contacting the Environmental
Monitoring and Support Laboratory, EPA, Cincinnati, Ohio 45268. These
methods were first made available in March 1977 and were revised in April
1977.
TABLE C-6. RADIATION MEASUREMENT METHODS

Parameter and units             Method            Sample matrix   Reference

Alpha - total, pCi per liter    Proportional or   Water           Interim Radiochemical
                                scintillation                     Methodology for Drinking
                                counter                           Water, EPA-600/4-75-008;
                                                                  Standard Methods for the
                                                                  Examination of Water and
                                                                  Wastewater, 14th Ed.

Beta - total, pCi per liter     Proportional      Water           (same as above)
                                counter

Radium-226 - pCi per liter                        Water           (same as above)

Strontium 89, 90 - pCi                            Water           (same as above)
per liter

Tritium - pCi per liter         Scintillation     Water           (same as above)
                                counter

Cesium 134 - pCi per liter                        Water           (same as above);
                                                                  ASTM D-2459, Gamma
                                                                  Spectroscopy in Water, 1975

Uranium - pCi per liter         Fluorometric      Water           ASTM D-2907, Micro
                                                                  Quantities of Uranium in
                                                                  Water by Fluorimetry, 1975

Others (various units                             Various         HASL Procedure Manual,
depending on media)                                               HASL 300, ERDA Health and
                                                                  Safety Laboratory, New York,
                                                                  NY, 1973
APPENDIX D
QUALITY ASSURANCE PROGRAM CHECKLIST
Contractors and grantees must follow accepted quality assurance procedures
to establish the quality of sampling and analytical data which are used in
IERL-Ci programs. Each organization should have written QA procedures which
describe the routine steps taken to guarantee, to the extent possible, the
quality of all sampling and analytical data reported by the laboratory. The
purpose of this checklist is to provide guidance to the Project Officer in
reviewing the QA program of a contractor or grantee in order to determine if
it is in general conformance with IERL-Ci requirements. The questions in
this checklist address each major area of quality assurance that should be
encompassed by a comprehensive or "model" QA program.
This checklist applies to a contractor's general Quality Assurance manual
which encompasses all aspects of work performed by the contractor. It
should be used to check the contractor's overall QA program to establish
that all QA aspects are discussed. This checklist should be completed
before project work begins.
This checklist differs from Appendix E in that it covers the contractor's
overall QA program and philosophy. Appendix E is the checklist to be used
in determining whether any weakness exists in the Quality Assurance as
applied to any specific work plan the contractor may submit.
This checklist should be completed by the Project Officer.
GENERAL INFORMATION
Laboratory:
Street Address:
Mailing Address (If Different):
City State Zip
Laboratory Telephone No.: Area Code No.
Laboratory Director:
Project Manager:
Contract/Grant Number:
Contract/Grant Title:
Project Officer:
Review Conducted By:
Agency and Address:
Telephone No.: Area Code No.
Reviewer Signature Date
QA manual returned to contractor/grantee .
Date
94
-------
APPENDIX D (Continued)
QA POLICY AND OBJECTIVES
Item                                                        Yes   No   Comment
Does the organization maintain written QA
procedures which describe the routine steps
taken to assure the quality of sampling and
analytical data?
Is there a clear statement of quality
objectives by top management levels?
Does the organization have a stated QA policy
consistent with the program requirements of
IERL-Ci?
Are QA plans required for all major projects or
programs requiring extensive sampling and/or
analysis?
Does the appearance, format, and content of the
QA program manual(s) reflect a conscientious
concern for the quality of data produced by
the organization?
QA ORGANIZATION
Item                                                        Yes   No   Comment
Are QA responsibilities and reporting relation-
ships within the organization clearly defined?
Does the QA program provide for a QA Supervisor
and are his responsibilities and authority
defined?
Does the QA Supervisor have access to senior
levels of management?
Are the QA-related responsibilities of
laboratory and field sampling personnel
clearly defined?
95
-------
APPENDIX D (Continued)
PERSONNEL TRAINING
Item                                                        Yes   No   Comment
Has the designated QA Supervisor had formal
training in QA procedures (e.g., the EPA train-
ing course offered by EPA-Ci or EPA-Athens)?
Is there a training program for sampling
personnel?
Is there a training program for laboratory
personnel?
Does the training program go beyond "on the
job training" to include practice tasks,
seminars, instruction sessions, etc.?
DOCUMENT CONTROL AND REVISION
Item                                                        Yes   No   Comment
Are there established procedures for documenting
sampling methods and for updating these methods
when required?
Are there established procedures for documenting
analytical methods and for updating these
methods when required?
Is there a system for documenting instrument
calibration procedures?
Does the QA program require documentation of
all computer programs that are used to calculate
or process data?
Are "standard" analytical methods used whenever
available and appropriate?
96
-------
APPENDIX D (Continued)
FACILITIES AND EQUIPMENT
PROCUREMENT AND INVENTORY PROCEDURES
Item                                                        Yes   No   Comment
Does the organization have procedures for
procurement quality control for reagents,
glassware, etc.?
Are analytical reagents dated upon receipt?
Are reagent inventories maintained on a
first-in, first-out basis?
Are analytical reagents checked out before use?
Is new analytical instrumentation tested
before use?
PREVENTIVE MAINTENANCE
Item                                                        Yes   No   Comment
Does the laboratory have a preventive
maintenance program or schedule for
laboratory instrumentation?
Does the QA program require the maintenance of
instrument log books and specify their general
content?
Are preventive maintenance activities
documented in instrument log books?
97
-------
APPENDIX D (Continued)
ANALYTICAL METHODOLOGY
CALIBRATION AND OPERATION PROCEDURES
Item                                                        Yes   No   Comment
Are general instrument calibration requirements
described in the QA program?
Does the QA program specify that calibration
standards should be traceable to primary (e.g.,
NBS) standards that are available?
Are instrument calibration data recorded in
instrument log books?
Are acceptability requirements established
for instrumentation?
Have instrument calibration schedules been
established?
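The calibration items above can be illustrated numerically. The sketch below, a minimal illustration and not part of the original guidelines, fits a least-squares line to three or more calibration standards (which the preceding items would have traceable to primary standards) and converts a sample response to concentration, checking that the response falls within the calibrated range. All concentrations and responses are hypothetical.

    # Least-squares calibration line from three or more standards, with a
    # check that the sample response falls within the calibrated range.
    # All concentrations and instrument responses are hypothetical.

    def fit_line(x, y):
        """Return slope and intercept of the least-squares line y = m*x + b."""
        n = len(x)
        mean_x = sum(x) / n
        mean_y = sum(y) / n
        slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
                 / sum((xi - mean_x) ** 2 for xi in x))
        return slope, mean_y - slope * mean_x

    standards = [1.0, 5.0, 10.0]          # mg/l; at least three standards
    responses = [0.021, 0.103, 0.208]     # e.g., absorbance readings

    slope, intercept = fit_line(standards, responses)
    sample_response = 0.150
    concentration = (sample_response - intercept) / slope

    # A response outside the range of the standards calls for dilution or
    # additional standards rather than extrapolation.
    in_range = min(responses) <= sample_response <= max(responses)
    print("sample = %.2f mg/l, within calibrated range: %s"
          % (concentration, in_range))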
FEEDBACK AND CORRECTIVE ACTION
Item                                                        Yes   No   Comment
Does the QA program clearly state who is
responsible for taking corrective action
when analytical problems are encountered?
Are corrective action follow-up procedures
described?
Are corrective actions documented?
98
-------
APPENDIX D (Continued)
SAMPLING AND SAMPLE HANDLING PROCEDURES
CONFIGURATION CONTROL
Item                                                        Yes   No   Comment
Are procedures described for configuration con-
trol of monitoring systems such as air monitors,
water quality monitors, flow monitors?
If an air or water pollution monitor is relo-
cated, would this be recorded in project files?
SYSTEMS RELIABILITY
Item                                                        Yes   No   Comment
Are procedures described for maintaining the
reliability of data generating equipment and
instrumentation?
Do QA procedures require the documentation of
system reliability (e.g., percent uptime,
percent reliable data, etc.)?
Are reliability data used in revising
maintenance and/or instrument replacement
schedules?
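The reliability statistics named in the items above reduce to simple ratios. A minimal sketch of that arithmetic follows; the hour counts are hypothetical figures for a continuous monitor, not values from any IERL-Ci project.

    # Percent uptime and percent reliable data for a continuous monitoring
    # system over one reporting period. Hour counts are hypothetical.
    hours_in_period = 720.0       # a 30-day month
    hours_operating = 655.0       # time the system was collecting data
    hours_valid_data = 630.0      # time the collected data passed validation

    percent_uptime = 100.0 * hours_operating / hours_in_period
    percent_reliable_data = 100.0 * hours_valid_data / hours_in_period
    print("uptime = %.1f%%, reliable data = %.1f%%"
          % (percent_uptime, percent_reliable_data))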
99
-------
APPENDIX D (Continued)
QUALITY CONTROL
QUALITY CONTROL PROCEDURES
Item                                                        Yes   No   Comment
Is duplicate analysis required on a minimum
of 5 to 15 percent of all samples?
Have acceptance criteria been established for
the precision of duplicate analysis results?
Are spiked sample analyses required on a
routine basis (e.g., 5 percent of samples) to
determine analytical recoveries?
Have acceptance criteria been established for
spiked sample results?
Are quality control charts used to monitor
analyst performance on routine analyses?
Are reagent blank analyses run with each
set of samples?
Are split sample analyses used as part of
the quality control program?
Are a minimum of three and preferably more
standards required for standard curves?
Do routine procedures require that standard
curves bracket sample concentrations?
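The duplicate and spiked-sample items above are usually applied through two statistics: the relative percent difference (RPD) of a duplicate pair and the percent recovery of a spiked sample. The sketch below computes both; the measurements and the acceptance limits shown are hypothetical examples, not IERL-Ci requirements.

    # Relative percent difference for duplicate analyses and percent
    # recovery for spiked samples. Values and limits are hypothetical.

    def rpd(x1, x2):
        """Relative percent difference of a duplicate pair."""
        return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

    def percent_recovery(spiked_result, sample_result, spike_added):
        """Recovery of a known spike added to a sample aliquot."""
        return (spiked_result - sample_result) / spike_added * 100.0

    dup = rpd(4.8, 5.2)                    # duplicate results, mg/l
    rec = percent_recovery(9.7, 5.0, 5.0)  # spiked, unspiked, spike added, mg/l

    print("RPD = %.1f%% (example limit 20%%: %s)"
          % (dup, "pass" if dup <= 20.0 else "fail"))
    print("recovery = %.0f%% (example window 80-120%%: %s)"
          % (rec, "pass" if 80.0 <= rec <= 120.0 else "fail"))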
100
-------
APPENDIX D (Continued)
CONTROL CHECKS AND INTERNAL AUDITS
Item                                                        Yes   No   Comment
Does the QA Supervisor or other supervisory
personnel audit laboratory procedures on a
routine, periodic basis?
If keypunching is required, have procedures
been established to identify keypunch errors?
Does the laboratory routinely use available EPA
reference test samples and/or standard reference
materials available from other sources (e.g.,
NBS) to evaluate analytical performance?
Are reference test samples used to evaluate
analytical performance at least semi-annually?
Are the results of intralaboratory performance
tests maintained by the QA Supervisor and used
to improve analytical performance?
Are reference test samples routinely prepared
and submitted to the analyst as "blind"
samples?
Does the laboratory participate in inter-
laboratory performance tests?
Are field sampling procedures audited at
periodic intervals by the QA Supervisor or
other supervisory personnel?
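Where the items above call for control charts and periodic reference-sample analyses, chart limits are conventionally set from the mean and standard deviation of repeated reference results, with warning limits at two standard deviations and control limits at three. A minimal sketch with hypothetical reference-sample data:

    # Shewhart-type control chart limits from repeated reference-sample
    # results. All results are hypothetical.
    import statistics

    results = [10.1, 9.8, 10.3, 9.9, 10.0, 10.4, 9.7, 10.2]  # mg/l

    mean = statistics.mean(results)
    s = statistics.stdev(results)
    print("central line   : %.2f" % mean)
    print("warning limits : %.2f to %.2f" % (mean - 2 * s, mean + 2 * s))
    print("control limits : %.2f to %.2f" % (mean - 3 * s, mean + 3 * s))

    # A "blind" reference result outside the control limits would trigger
    # the corrective action described in the preceding sections.
    new_result = 10.9
    print("in control:", mean - 3 * s <= new_result <= mean + 3 * s)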
101
-------
APPENDIX D (Continued)
DATA HANDLING
DATA HANDLING, REPORTING, AND RECORDKEEPING
Item                                                        Yes   No   Comment
Are procedures described for the proper
handling, routing, and reviewing of field
and laboratory data?
Is each sample assigned a unique identification
number upon collection?
Are laboratory notebooks used to record vital
operational information?
Are laboratory bench data reported in a
clear, logical format which permits review
and confirmation of calculations?
Are procedures described for the accurate and
complete labeling of strip charts?
Are field notebooks required for all field
data and observations?
Is the general content of field notebooks
specified?
Are field notebook entries reviewed by
supervisory personnel?
Are field notebooks signed by the individual
recording the information and by reviewers?
Are sample chain-of-custody procedures
described?
Are QA reports routinely prepared for
management review?
Are all data and records retained a minimum
of three years?
102
-------
APPENDIX D (Continued)
DATA VALIDATION
Item                                                        Yes   No   Comment
Does the QA program require routine cross-
checking of manual data calculations?
Are supervisory personnel required to review
all laboratory data?
Are acceptance criteria established for
laboratory data?
Are data collected by on-line or in situ systems
routinely checked for consistency and accuracy?
Are computer programs validated before use?
Are data validation activities recorded for
future reference?
Does the QA program describe the handling
of invalid data?
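The validation items above amount to systematic consistency checks applied before data are accepted. A minimal sketch of one such screen follows, flagging results that fall outside plausible bounds; the parameters and bounds are hypothetical examples, not prescribed limits.

    # Screen reported results against plausibility bounds; anything outside
    # is flagged for supervisory review. Bounds are hypothetical examples.
    bounds = {
        "pH": (0.0, 14.0),                # pH units
        "dissolved oxygen": (0.0, 20.0),  # mg/l
        "temperature": (-5.0, 45.0),      # degrees C
    }

    reported = [
        ("pH", 7.2),
        ("dissolved oxygen", 23.4),   # suspect: above any plausible value
        ("temperature", 18.5),
    ]

    for parameter, value in reported:
        low, high = bounds[parameter]
        status = "accept" if low <= value <= high else "flag for review"
        print("%s: %s -> %s" % (parameter, value, status))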
103
-------
APPENDIX D (Continued)
Overall, does the QA program described in the document reviewed meet the QA
program requirements of IERL-Ci? Yes No
Describe where the QA program departs from IERL-Ci requirements and explain
what corrective action, if any, is necessary.
104
-------
APPENDIX E
PROJECT QUALITY ASSURANCE CHECKLIST
This checklist has been prepared to assist Project Officers in reviewing
project work plans or study plans with respect to sampling and analytical
quality assurance. The checklist is not comprehensive; it does not address
all factors that would affect the quality of the sampling and analytical
data on all projects. It does, however, represent the type of detail that
should be included in a well-written Project Work Plan. The use of this
checklist in reviewing project work plans should allow the Project Officer
to judge the adequacy of proposed quality assurance procedures.
This checklist applies to a contractor's specific work plan as applied to a
project or task. It should be used to insure that the contractor has
prepared a work plan which includes sufficient quality assurance to complete
the project in a satisfactory manner.
Appendix E differs from Appendix D in that it applies specifically to a
project or task, whereas the checklist in Appendix D applies to the
contractor's general Quality Assurance Program.
This checklist should be completed by the Project Officer.
105
-------
APPENDIX E (Continued)
PROJECT QUALITY ASSURANCE CHECKLIST
GENERAL INFORMATION
Laboratory:
Street Address:
Mailing Address (If Different):
City State Zip
Laboratory Telephone No.: Area Code No.
Laboratory Director:
Project Manager:
Contract/Grant Number:
Contract/Grant Title:
Project Officer:
Review Conducted By:
Agency and Address:
Telephone No.: Area Code No.
Reviewer Signature Date
QA manual returned to contractor/grantee.
Date
106
-------
APPENDIX E (Continued)
PROJECT OBJECTIVES
Item                                                        Yes   No   Comment
Is a clear statement made of the objectives of
the project and the use of the sampling and
analysis data?
Has a statement been made, or can the level of
importance to be attached to the QA considera-
tions be derived from stated project objectives?
PROJECT STAFFING
Item                                                        Yes   No   Comment
Has a project QA Supervisor been assigned
to the project team?
Is the project organization structure
appropriate to accomplish the QA objectives
of the project?
Do personnel assigned to this project have
the appropriate educational background to
successfully accomplish project objectives?
If any special training or experience is
required, is it represented on the project
staff?
Will the training of personnel be required
specifically for this project? If so, is it
covered in the project plan?
Is there adequate staffing to accomplish the
planned work in a high-quality manner within
the project schedule?
107
-------
APPENDIX E (Continued)
FACILITIES, EQUIPMENT, AND INSTRUMENTATION
Item                                                        Yes   No   Comment
Is appropriate and adequate sampling
equipment available?
Will appropriate sample containers be used
for the parameters measured?
If in situ, on-line, or monitoring instrumentation
is to be used, is it clearly specified as to make,
model, and performance specifications?
Are the performance specifications of all on-
line or in situ instrumentation adequate to
meet project reliability and data quality
requirements?
Has a plan been made to optimize system relia-
bility by requiring periodic performance checks,
calibration, and preventive maintenance?
Are procedures described for documenting and
controlling the configuration of all systems?
Is laboratory instrumentation and equipment
suitable to meet the data quality needs of
the project?
SAMPLING PLAN AND METHODS
Item                                                        Yes   No   Comment
Are sampling procedures described in detail?
Will standard sampling methods be used where
available?
Are precautions described to avoid sample
contamination?
108
-------
APPENDIX E (Continued)
SAMPLING PLAN AND METHODS (continued)
Item                                                        Yes   No   Comment
Are site or system diagrams included which show
clearly and precisely the location of sample
collection?
Will any samples be taken in duplicate in
order to define sampling variability?
Are background samples to be taken?
Is sampling frequency adequate and appropriate
to the purposes of the project?
Is the sampling program designed to assure all
samples are representative of the source?
Will field notebooks be used to record
important observations and on-site data?
Will field data or system data be verified
by supervisory personnel on a routine basis?
Do sampling plans allow for delivery of the
samples to the laboratory in time to meet
maximum holding time limitations?
Will the samples be preserved? If so, will
EPA-accepted preservation methods be used?
If any new sampling methods are to be used,
will they be adequately tested before use?
109
-------
APPENDIX E (Continued)
ANALYTICAL PLAN AND METHODS
Item                                                        Yes   No   Comment
Will standard analytical (EPA-approved)
procedures be used where appropriate and
available?
Does the project plan include a copy of all
non-standard analytical procedures?
If any new analytical procedures are to be used,
will they be adequately tested before use?
Will use of the analytical methods specified
result in data of adequate detection limit,
accuracy, and precision to meet the require-
ments of the project?
Will duplicate analyses be conducted on at
least 10 percent of the samples?
Will spike sample analyses be conducted on
at least 5 percent of the samples?
Will reagent blank samples be run?
Will split sample analysis be conducted?
Will any field spiked samples be processed?
Will instruments and measurement systems be
calibrated with adequate frequency (at least
daily)?
Will calibration materials that are traceable
to NBS standards be used where available?
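The duplicate and spike frequencies in the items above translate directly into a QC workload for a planned sample load. A minimal sketch of that arithmetic follows, using the 10 percent duplicate and 5 percent spike figures from the checklist; the sample count and blank allowance are hypothetical.

    # Analyses implied by the checklist QC frequencies for a planned
    # sample load. Sample count and blank allowance are hypothetical.
    import math

    samples = 240
    duplicates = math.ceil(0.10 * samples)  # at least 10 percent duplicates
    spikes = math.ceil(0.05 * samples)      # at least 5 percent spikes
    blanks = 12                             # e.g., one reagent blank per batch

    print("duplicates = %d, spikes = %d, blanks = %d"
          % (duplicates, spikes, blanks))
    print("total analyses to schedule: %d"
          % (samples + duplicates + spikes + blanks))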
110
-------
APPENDIX E (Continued)
DATA MANAGEMENT
Item                                                        Yes   No   Comment
Will data be validated before entering
into automated data systems?
Will automated data handling programs or
computer models be adequately documented
and verified before use?
Will mathematical and computer models be
verified by actual data?
Is the statistical treatment of the data
described and does it meet project
requirements?
Will a project QA report be prepared to
summarize all quality control data?
PROJECT SCHEDULE
Item                                                        Yes   No   Comment
Does the project plan show adequate time to
accomplish the sampling program, and does it
allow for uncontrollable delays, such as bad
weather?
Will interim sampling and analysis program
results be reported to the Project Officer
for review and comment?
Does the project schedule allow sufficient time
between sample collection and reporting of the
data to apply adequate analytical quality
control, including supervisory review?
111
-------
APPENDIX E (Continued)
Overall, does the Project Work Plan meet the QA requirements of IERL-Ci?
Yes No
Describe changes or improvements that should be incorporated into a revised
work plan.
112
-------
APPENDIX F
QUALITY ASSURANCE PRE-AUDIT WORKSHEET
The purpose of the Quality Assurance Pre-Audit Worksheet is to provide the
auditing agency with information with which to familiarize itself with the
project. It should be sent to the contractor, completed by the contractor,
and returned before any audit trip is scheduled.
This form is to be used when the decision has been made to perform a project
audit.
This checklist should be completed by the contractor's project personnel.
INSTRUCTIONS
1. It is not necessary to type your replies. Completion of the form in ink
will suffice. Alternatively, appropriate sections of the project plan
may be attached where appropriate.
2. You are not limited to yes and no answers. Feel free to elaborate at
any point in the form.
3. Careful attention should be given to proper completion of this form
since the information supplied will have a direct bearing on conclusions
drawn and recommendations made concerning the evaluation of your
laboratory.
4. Answer only those questions contained in the worksheet which are project
specific. Indicate non-applicable questions.
113
-------
APPENDIX F (Continued)
QUALITY ASSURANCE PRE-AUDIT WORKSHEET
Date
Laboratory:
Street Address:
Mailing Address (If Different):
City State Zip
Laboratory Telephone No.: Area Code No.
Laboratory Director:
Quality Assurance Supervisor:
Pre-Audit Worksheet Completed By (if more than one, please indicate which
sections):
Name Title
No. of Contract or Grant for Which Audit Is Being Performed:
Title of Contract:
Audit to be Conducted By:
Agency and Address:
-Do Not Write Below This Line-
Telephone No.: Area Code No.
Quality Assurance Audit Scheduled On:
114
-------
APPENDIX F (continued)
A. ORGANIZATION AND PERSONNEL
A.1. Please use a simple block diagram to illustrate the structure of your
organization and how the laboratory functions within it. Identify key
management personnel.
ORGANIZATIONAL CHART
115
-------
APPENDIX F (continued)
A.2. Please use a simple block diagram to illustrate the organization of the
project sampling and sample analysis functions. Identify: key project
management personnel, the quality assurance supervisor, the person
directly responsible for sampling, and the person directly responsible
for analyses.
PROJECT ORGANIZATIONAL CHART
116
-------
APPENDIX F (continued)
A.3. PROJECT PERSONNEL
Laboratory                         Contract No.                  Date
Position    Name    Academic    Years         Special      Responsibilities
                    Training    Experience    Training     On This Project
-------
APPENDIX F (continued)
A.4. Describe or illustrate the planned schedule for completion of the
project. Show the schedule for all sampling activities and the period
of time scheduled for sample analysis.
PROJECT SCHEDULE
118
-------
APPENDIX F (continued)
B. FACILITIES
Instrumentation: List major instrumentation (including field instrumen-
tation and equipment) required for the performance of this project.
Prepare separate lists for radiation, chemistry, air, biology, micro-
biology, etc.
Item            Manufacturer            Model            Age            Condition
119
-------
APPENDIX F (continued)
C. ANALYTICAL METHODOLOGY
List all laboratory methods used in the performance of this project.
Use separate pages for chemistry, biology, radiation, air, microbiology, etc.
Scientific Area

Parameter    Name or                  Reference               Latest Reference Sample Check
             Description of Method    (Cite Page and Year)    (Cite Agency of Origin and Date)
-------
APPENDIX F (continued)
D. SAMPLING AND SAMPLE HANDLING PROCEDURES
D.1. Describe the sampling plan and draw a diagram indicating sampling
locations, how to locate the sampling points and other pertinent
information. Please indicate the scientific area (biology, radiation,
chemistry, air, etc.) and use separate sheets if addressing more than
one field.
Scientific Area
121
-------
APPENDIX F (continued)
D.2. Describe the information obtained in field notebooks or worksheets.
D.3. Describe sample chain-of-custody procedures employed by your
laboratory.
122
-------
APPENDIX F (continued)
D.4. List sample types collected, methods of preservation, etc. Please use separate pages for chemistry,
biology, radiation, air, microbiology, etc.
SAMPLE COLLECTING, HANDLING, AND PRESERVATION
Scientific Area
Parameter    Container Used    Preservative Used    Normal Maximum    Method of Transport
                                                    Holding Time
-------
APPENDIX F (continued)
E. QUALITY CONTROL
E.1. Are analytical quality control records available for review?
E.2. Are duplicate sample analyses conducted?
If so, at what frequency?
E.3. Are spiked (recovery) sample analyses conducted?
If so, at what frequency?
E.4. Are performance test samples analyzed?
If so, how often?
E.5. Are quality control charts maintained?
E.6. Does the laboratory have a Quality Assurance Manual?
If so, is it available for review?
E.7. Does each analyst have access to approved and documented analytical
procedures?
E.8. Has a quality assurance plan been prepared for the sampling required by
this project?
E.9. How are quality control data used?
124
-------
APPENDIX F (continued)
F. DATA HANDLING
F.I. How long are laboratory records and field data retained after
completion of a project?
F.2. Have procedures been established for cross-checking laboratory
calculations?
F.3. Have procedures been established for cross-checking reported data?
F.4. Are laboratory data reviewed by supervisory personnel?
125
-------
APPENDIX G
INSTRUMENTATION, EQUIPMENT, AND PERSONNEL
SKILL RATING FOR SPECIFIC METHODS
The arbitrary rating numbers used in this appendix for the degree of
skills required are:
Rating 1—A semi-skilled sub-professional with limited background
Rating 2—An experienced aide (sub-professional) with a background in
general laboratory techniques and some knowledge of chemistry, or a
professional with modest training or experience
Rating 3—Requires a good background in analytical techniques
Rating 4—Requires an individual with experience on complex
instrumentation, some degree of specialization, and the ability to
interpret results
Specific analytical methods are presented as follows.
Page
WATER
GENERAL ANALYTICAL METHODS 127
TRACE METALS 130
OTHER REFERENCE METHODS FOR METALS 131
NUTRIENTS, ANIONS, AND ORGANICS 133
OTHER PARAMETERS 139
AMBIENT AIR 139
SOURCE AIR 141
BIOLOGY 142
RADIATION 145
126
-------
APPENDIX G (continued)
Skill
Rating No.
WATER
GENERAL ANALYTICAL METHODS
1. Alkalinity
(a) Electrometric Titration, Manual: pH meter, Type I or II
as defined in ASTM D1293 1
(b) Electrometric Titration, Automated: An automatic
titrimeter meeting the pH meter specifications in (a) 1
(c) Automated, Methyl Orange: AutoAnalyzer with appropriate
analytical manifold and 550-nm filters 2, 3
2. Biochemical Oxygen Demand (BOD), 5-day, 20°C
(a) Modified Winkler with Full-Bottle: BOD incubation bottles;
BOD incubator 2, 3
(b) Probe Method: No specific probe is recommended as superior
in the 1974 EPA Methods Manual 2, 3
3. Chemical Oxygen Demand (COD)
Reflux Apparatus 2, 3
4. Residue, Total
Gravimetric, dried at 103-105°C: Blender (if samples contain oil
or grease); Porcelain, vycor, or platinum evaporating dishes;
Muffle furnace, 550°C; Steam bath or 98°C oven; Drying oven,
103-105°C; Desiccators; Analytical balance, 200-g capacity,
weighing to 0.1 mg 1, 2
5. Residue, Total Filterable
Glass Fiber Filtration, dried at 180°C: Glass fiber filter discs
(0.45-um glass fiber filter); Filter holder, membrane filter
funnel, or Gooch crucibles and adapter; Suction flask; Porcelain,
vycor, or platinum evaporating dishes; Muffle furnace, 550°C;
Steam bath; Drying oven, 180°C; Desiccators; Analytical balance,
200-g capacity, weighing to 0.1 mg 1, 2
127
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
6. Residue, Total Non-Filterable
Glass Fiber Filtration, Dried at 103-105°C: Same as (5), except
drying oven is at 103-105°C and steam bath, muffle furnace, and
evaporating dishes are not required 1, 2
7. Residue, Total Volatile
Gravimetric, Dried at 550°C: Same as (5) 1, 2
8. Ammonia (as N)
(a) Distillation and Titration: All glass distillation
apparatus (Kjeldahl); Standard titration apparatus 2, 3
(b) Distillation and Nesslerization: All-glass distillation
apparatus (Kjeldahl); Nessler tubes, 50 ml, matched set,
APHA standard; Spectrophotometer or filter photometer for
use at 425 nm with light path ≥ 1 cm 2, 3
(c) Distillation and Ammonia Electrode: All-glass distillation
apparatus (Kjeldahl); Electrometer (pH meter) with expanded
mV scale or specific ion meter; Ammonia-selective electrode;
Magnetic stirrer, thermally-insulated, and Teflon-coated
stirring bar 2, 3
(d) Automated Colorimetric Phenate Method: AutoAnalyzer with
appropriate analytical manifold and 630-660 nm filter 2, 3
9. Total Kjeldahl Nitrogen (as N)
(a) Digestion, Distillation, and Titration: Same as 8(a) with
suction takeoff to remove SO3 fumes during digestion 2, 3
(b) Digestion, Distillation, and Nesslerization: Same as 8(b)
with suction takeoff to remove SO3 during digestion 2, 3
(c) Digestion, Distillation, and Ammonia Electrode: Same as
8(c) with suction takeoff to remove SO3 fumes during
digestion 2, 3
(d) Automated Phenate Method: AutoAnalyzer with appropriate
analytical manifold and 630-nm filter 2, 3
128
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
(e) Automated Selenium Method: AutoAnalyzer with appropriate
analytical manifold and 630- or 650-nm filters 2, 3
10. Nitrate (as N)
(a) Cadmium Reduction Method (Nitrate-Nitrite): Glass fiber
or membrane filters and associated apparatus; Copper/
cadmium reduction column; Spectrophotometer or filter
photometer for use at 540 nm with light path > 1 cm 2, 3
(b) Automated Cadmium Reduction Method (Nitrate-Nitrite):
Glass fiber or membrane filters and associated apparatus;
Copper/cadmium reduction column; AutoAnalyzer with
appropriate analytical manifold and 540-nm filter 2, 3
(c) Brucine Method: Spectrophotometer or filter photometer
for use at 410 nm; Water bath at 100°C (Temperature
control is critical: all sample tubes must be held at the
same temperature, and temperature must not drop
significantly when tubes are immersed in bath); Water bath
at 10-15°C; Neoprene-coated wire rack for holding sample
tubes in baths; Glass sample tubes (40-50 ml) 2, 3
11. Phosphorus, Total as Ortho (as P)
(a) Single Reagent (Ascorbic Acid Reduction Method):
Spectrophotometer or filter photometer for use at 650 nm
(less sensitive) or 880 nm; Acid-washed, detergent-free
glassware; Hotplate or autoclave (for persulfate
digestion) 2, 3
(b) Automated Colorimetric Ascorbic Acid Reduction Method:
Acid-washed, detergent-free glassware; Hotplate or
autoclave (for persulfate digestion); AutoAnalyzer with
appropriate analytical manifold and 650-660 nm or 880-nm
filter 2, 3
12. Acidity
(a) Hydrogen Peroxide Digestion and Electrometric Titration:
pH meter, Type I or II as defined in ASTM D1293 1, 2
(b) Hydrogen Peroxide Digestion and Phenolphthalein End-Point
Titration: No special equipment other than standard
laboratory glassware 1, 2
129
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
13. Organic Carbon, Total (TOC)
Combustion and infrared method (CO2) or flame ionization
method (CH4); Blender; Apparatus for total and dissolved
organic carbon 2, 3
14. Hardness, Total
(a) EDTA Titration: No special equipment other than standard
laboratory glassware 1
(b) Automated Colorimeter: AutoAnalyzer with appropriate
analytical manifold and 520-nm filter 2, 3
15. Nitrite (as N)
(a) Manual Colorimetric Diazotization: Spectrophotometer for
use at 540 nm with cells > 1 cm; Nessler tubes or
volumetric flasks, 50 ml 2
(b) Automated Colorimetric Diazotization: Glass fiber or
membrane filters and associated apparatus; AutoAnalyzer
with appropriate analytical manifold and 540-nm filter 2, 3
TRACE METALS
EPA specifies atomic absorption as at least one of the reference
methods for many metals. The required equipment in each case will
include: (1) an atomic absorption spectrophotometer, (2) the hollow
cathode (or electrode-less discharge) lamp for each metal, and (3) the
fuels and other apparatus specified below. Design features of some
common atomic absorption spectrophotometers (as of March 1979) are
discussed in the EPA Handbook for Analytical Quality Control in
Water and Wastewater Laboratories. If extraction procedures are to
be used, special reagents are required, but no special equipment
other than standard laboratory glassware.
16. Metal by Atomic Absorption 2, 3
130
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
OTHER REFERENCE METHODS FOR METALS
17. Aluminum
Eriochrome Cyanine R Colorimetric Method: Spectrophotometer
for use at 535 nm, or Filter photometer with 525-535 nm
filters (green), or Nessler tubes, 50 ml 2, 3
18. Arsenic
Gaseous Hydride—Silver Diethyldithiocarbamate Colorimetric
Method: Arsine generator and absorption tube; Spectrophoto-
meter for use at 535 nm, or Filter photometer with 530-540 nm
filter (green) 2, 3
19. Beryllium
Aluminon Method: Spectrophotometer or filter photometer for
use at 515 nm with 5-cm cells 2, 3
20. Boron
Curcumin Method: Spectrophotometer or filter photometer for
use at 540 nm with cells > 1 cm; Vycor or platinum evaporating
dishes, 100-150 ml; Water bath, 55 ± 2°C; Ion exchange column,
50 cm x 1.3 cm (diameter) 2, 3
21. Cadmium
Dithizone Colorimetric Method: Spectrophotometer or filter
photometer for use at 515 nm 2, 3
22. Calcium
EDTA Titration: No special equipment 1
23. Chromium VI
Diphenylcarbazide Colorimetric: Membrane or sintered glass
filter; Spectrophotometer or filter photometer for use at
540 nm with cells > 1 cm 2, 3
131
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
24. Chromium, Total
Oxidation and Diphenylcarbazide Colorimetric: Membrane
or sintered glass filter; Spectrophotometer or filter
photometer for use at 540 nm with cells _>. 1 cra 2, 3
25. Copper
Neocuproine Colorimetric: Spectrophotometer for use at
457 nm with cells ≥ 1 cm, or filter photometer with narrow-band
violet filter (max. transmittance at 450-460 nm) and cells
≥ 1 cm, or Nessler tubes, 50 ml 2, 3
26. Iron
o-Phenanthroline Colorimetric: Spectrophotometer or filter
photometer for use at 510 nm with cells > 1 cm, or Nessler
tubes, 100 ml 2, 3
27. Lead
Dithizone Colorimetric: Spectrophotometer or filter photometer
for use at 520 nm with cells > 1 cm; pH meter 2, 3
28. Magnesium
Gravimetric: No special equipment 2
29. Mercury
Manual Cold Vapor Technique (Water or Sediment): Commercially
available mercury analyzer employing this technique, or atomic
absorption Spectrophotometer with open sample presentation area
for mounting 10-cm absorption cell; Mercury hollow cathode lamp;
Recorder, multi-range, variable speed, compatible with UV
detection system; Absorption cell, 10 cm, quartz end windows,
vapor inlet and outlet ports; Air pump, peristaltic, 1 liter/
minute; Flowmeter; Aeration tubing and drying tube (or
incandescent lamp to warm cell); Autoclave (optional, for
digestion procedure) 2, 3
30. Nickel
Heptoxime Colorimetric Method: Spectrophotometer or filter
photometer for use at 445 nm with cells ≥ 1 cm 2, 3
132
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
31. Potassium
(a) Colorimetric: Spectrophotometer for use at 425 nm with
cells ≥ 1 cm, or Filter photometer with violet filter
(maximum transmittance near 425 nm) and ≥ 1 cm cells,
or Nessler tubes, 100 ml; Centrifuge and 25-ml centrifuge
tubes 2, 3
(b) Flame Photometric: Flame photometer, direct-reading or
internal-standard, and associated equipment for measure-
ment at 768 nm 2, 3
32. Sodium
Flame Photometric: Flame photometer, direct-reading or
internal-standard, and associated equipment for measurement
at 589 nm; For low-solids water, air filter and blower for
burner housing, oxyhydrogen flame, and polyethylene or
Teflon cups, bottles, etc 2, 3
33. Vanadium
Colorimetric (Catalysis of Gallic Acid Oxidation): Spectro-
photometer or filter photometer for use at 415 nm with 1-5 cm
cells; Water bath, 25 ± 0.5°C 2, 3
34. Zinc
Dithizone Colorimetric Method: Spectrophotometer or filter
photometer for use at 535 or 620 nm with 2-cm cells, or
Nessler tubes, matched; pH meter 2, 3
NUTRIENTS, ANIONS, AND ORGANICS
35. Organic Nitrogen (as N)
Kjeldahl Nitrogen Minus Ammonia Nitrogen: See (8) and (9)
above 2, 3
36. Sulfate (as SO4)
(a) Gravimetric: Analytical balance, weighing to 0.1 mg;
Steam bath; Drying oven, 180°C; Muffle furnace, 800°C;
Appropriate filters or crucibles 2
133
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
(b) Turbidimetric: Nephelometer, or Spectrophotometer or
filter photometer for use at 420 nm with 4-5 cm cells;
Magnetic stirrer with timer or stopwatch 2
(c) Automated Colorimetric Barium Chloroanilate: Auto-
Analyzer with appropriate analytical manifold and
520-nm filter; Magnetic stirrer 2, 3
37. Sulfide (as S)
Titrimetric Iodine: No special equipment other than standard
laboratory glassware 2
38. Sulfite (as S03)
Titrimetric lodide-Iodate: No special equipment other than
standard laboratory glassware 2
39. Bromide
Titrimetric lodide-Iodate: No special equipment other than
standard laboratory glassware 2
40. Chloride
(a) Silver Nitrate: No special equipment other than standard
laboratory glassware 1
(b) Mercuric Nitrate: No special equipment other than
standard laboratory glassware 1
(c) Automated Colorimetric Ferricyanide: AutoAnalyzer with
appropriate analytical manifold and 480-nm filter 2, 3
41. Cyanide, Total
(a) Distillation and Silver Nitrate Titration: Cyanide
distillation apparatus; Koch microburet, 5 ml 2, 3
(b) Distillation and Pyridine-Pyrazolone (or Pyridine-
Barbituric Acid) Colorimetric: Cyanide distillation
apparatus; Spectrophotometer or filter photometer for
use at 578 or 620 nm with > 1-cm cells 2, 3
134
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
42. Fluoride
(a) Distillation—SPADNS: Bellack distillation apparatus;
Spectrophotometer for use at 570 nm with > 1-cm cells,
or Filter photometer with green-yellow filter (max.
transmittance 550-580 nm) and > 1-cm cells 2, 3
(b) Automated Complexone Method: AutoAnalyzer with
appropriate analytical manifold and 650-nm filter 2, 3
(c) Fluoride Electrode: Electrometer; Fluoride ion activity
electrode; Reference electrode, single junction, sleeve-
type; Magnetic mixer 2
43. Chlorine, Total Residual
(a) Starch-Iodide Titration: No special equipment other than
standard laboratory glassware 2
(b) Amperometric Titration: Amperometric end-point detection
apparatus consisting of noble metal electrode, salt bridge,
and silver-silver chloride reference electrode cell unit
connected to microammeter with appropriate electrical
accessories; Agitator 2
44. Oil and Grease
(a) Gravimetric: Separatory funnels or soxhlet apparatus;
Vacuum 2, 3
(b) Infrared: Separatory funnels; Infrared Spectrophotometer,
double beam, with 1-, 5-, and 10-cm cells 2, 3
45. Phenols
(a) Colorimetric (4-AAP Method with Distillation): Phenols
distillation apparatus; Spectrophotometer or filter
photometer for use at 460 nm (following chloroform
extraction) or 510 nm and 1- to 10-cm cells; pH meter 2, 3
(b) Automated 4-AAP Method: AutoAnalyzer with appropriate
analytical manifold and 505- or 520-nm filter 2, 3
135
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
46. Surfactants
Methylene Blue Colorimetric: Spectrophotometer or filter
photometer for use at 625 nm with > 1-cm cells 2
47. Algicides
Gas Chromatography: There is no reference procedure for
algicides as a class, and, therefore, detailed equipment
requirements cannot be specified. For general discussions of
gas chromatography and its application in environmental
monitoring, see the EPA Training Manual for Pesticide Residue
Analysis in Water and the EPA Methods Manual for Analysis of
Pesticide Residues in Human and Environmental Samples 3, 4
48. Benzidine
Diazotization and Colorimetric: Spectrophotometer, scanning,
510-370 nm; Cells, 1- to 5-cm pathlength, 20-ml max. volume 3
49. Chlorinated Organic Compounds (Except Pesticides)
Gas Chromatography: There is no reference procedure for
chlorinated organic compounds as a class, and, therefore,
detailed equipment requirements cannot be specified. Gas
chromatography with electron capture, microcoulometry, or
electrolytic conductivity detection may be appropriate for
individual compounds or groups of compounds. For general
discussions of gas chromatography and its application in
environmental monitoring, see the EPA Training Manual for
Pesticide Residue Analysis in Water and the EPA Methods Manual
for Analysis of Pesticide Residues in Human and
Environmental Samples 3, 4
50. Pesticides
There is no single reference procedure for pesticides as a
class. However, specific reference procedures for several
sub-classes are available from EMSL, U.S. EPA, Cincinnati,
Ohio. To be qualified in this parameter, the laboratory should
be equipped to analyze for all specified sub-classes. The
analysis of pesticides at the levels normally found in waste-
water and other environmental sources requires special expertise
and experience, in addition to up-to-date, well-maintained,
136
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
calibrated instrumentation and apparatus. The equipment
lists below are based on the EMSL methods; for further
information on the equipment and methodology of pesticide
analysis, see the EPA Training Manual for Pesticide Residue
Analysis in Water and the EPA Methods Manual for Analysis of
Pesticide Residues in Human and Environmental Samples
(a) Organochlorine Pesticides: Gas chromatograph with
(1) Glass-lined injection port, (2) One or more of the
following detectors: electron capture, radioactive
(H3 or Ni63), microcoulometric titration,
electrolytic conductivity, (3) Recorder, potentio-
metric, 10" strip chart, and (4) Appropriate Pyrex
gas chromatographic columns; Snyder columns, 3-ball
(macro) and 2-ball (micro), and other K-D glassware;
Appropriate columns for liquid-solid partition
chromatography; Blender; and Special materials, such
as PR Grade Florisil and pesticide standards 3, 4
(b) Organophosphorus Pesticides: Gas chromatograph with
(1) Glass-lined injection port, (2) One or more of
the following detectors: flame photometric, 526-nm
phosphorus filter; electron capture, radioactive
(H3 or Ni63), (3) Recorder, potentiometric, 10"
strip chart, and (4) Appropriate Pyrex gas chromato-
graphic columns; Snyder columns, 3-ball (macro) and
2-ball (micro), and Other K-D glassware; Appropriate
columns for liquid-solid partition chromatography;
Blender; Special materials, such as PR Grade Florisil,
Woelm neutral alumina, and pesticide standards 3, 4
(c) Polychlorinated Biphenyls (PCBs): Gas chromatograph
with (1) Glass-lined injection port, (2) One or more
of the following detectors: electron capture,
radioactive (H3 or Ni63), microcoulometric
titration, electrolytic conductivity, (3) Recorder,
potentiometric, 10" strip chart, and (4) Appropriate
Pyrex gas chromatographic columns; Snyder column,
3-ball (macro); Appropriate columns for liquid-solid
partition chromatography; Low-pressure regulator
(0-5 psig) with low-flow needle valve; Blender;
Special materials, such as PR Grade Florisil, high-
quality silica gel, and Aroclor (PCB) standards 3, 4
137
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
(d) Triazine Pesticides: Gas chromatograph with (1) Glass-
lined injection port, (2) Electrolytic conductivity
detector, (3) Recorder, potentiometric, 10" strip
chart, and (4) Appropriate Pyrex gas chromatographic
column; Snyder columns, 3-ball (macro) and 2-ball
(micro), and Other K-D glassware; Appropriate
columns for liquid-solid partition chromatography;
Blender; Special material, such as PR Grade Florisil
and pesticide standards 3, 4
(e) O-Aryl Carbamate Pesticides: Thin layer chromatography
plates, 200 x 200 mm, coated with Silica Gel G, 0.25 mm;
Associated TLC apparatus, including spotting template,
developing chamber, and sprayer (20 ml) 3, 4
51. Organics
Gas chromatography/mass spectrometry 4
52. Organics
High-pressure liquid chromatography 4
53. Specific Conductance (mho/cm @ 25°C)
Wheatstone bridge: Commercial conductivity meter, or
apparatus consisting of (1) Wheatstone bridge (reading to
1% accuracy or better), (2) Appropriate source of electrical
current, (3) Specific conductance cell, (4) Water bath, 25°C,
with racks 1
54. Turbidity (Jackson Units)
Turbidimeter Method: Nephelometric turbidimeter 1
55. Streptococci Bacteria, Fecal (Number/100 ml)
(a) Membrane Filter: Autoclave (to 121°C); Filter membranes;
Petri culture dishes; Incubator, 35 ± 0.5°C, ca. 90%
relative humidity; Low-power (10-15X), binocular,
wide-field, dissecting microscope and light source 2, 3
(b) MPN: Autoclave (to 121°C); Inoculation tubes; Incubator,
35 ± 0.5°C 2, 3
138
-------
APPENDIX G (continued)
Skill
Rating No.
WATER (Continued)
(c) Plate Count: Autoclave (to 121°C); Petri culture dishes;
Incubator, 35 ± 0.5°C; Microscope and light source, or
colony counter 2, 3
56. Coliform Bacteria, Fecal (Number/100 ml)
(a) MPN: Autoclave (to 121°C); Inoculation tubes; Incubator,
35 ± 0.5°C; Water bath, 44.5 ± 0.2°C 2, 3
(b) Membrane Filter: Autoclave (to 121°C); Filter membranes;
Petri culture dishes; Water bath, 44.5 ± 0.2°C; Low-power
(10-15X), binocular, wide-field, dissecting microscope
and light source 2, 3
57. Coliform Bacteria, Total (Number/100 ml)
(a) MPN: Same as 56 (a) 2, 3
(b) Membrane Filter: Same as 56 (b) 2, 3
OTHER PARAMETERS
58. Temperature: Good quality mercury-filled or dial-type
centigrade thermometer, or a thermistor 1
59. pH: pH meter (electrometer using either glass electrode and
reference, such as saturated calomel, or a combination glass
and reference electrode) 1
AMBIENT AIR
60. Sulfur Dioxide (ug/m3 or ppm)
(a) Pararosaniline Method: Absorber; Pump; Air flowmeter
or critical orifice; Spectrophotometer for use at 548 nm,
band width < 15 nm, with 1-cm cells 2, 3
(b) Automated Pararosaniline: AutoAnalyzer with appropriate
manifold and 548-nm filter 2, 3
(c) Continuous Analyzer: EPA-designated equivalent method
instrumentation 2, 3
139
-------
APPENDIX G (continued)
Skill
Rating No.
AMBIENT AIR (continued)
61. Suspended Particulates (ug/m3)
(a) High-Volume Sampler: High-volume sampler; Shelter for
sampler; Flow measurement equipment, including (1) Rota-
meter, (2) Orifice calibration unit, (3) Differential
manometer, (4) Positive displacement meter; Barometer;
Environment for conditioning filters; Analytical balance:
chamber to hold unfolded 8" x 10" filters, sensitivity
= 0.1 mg; Glass fiber filters; Acceptable alternative
equipment for flow measurement (3-6): Exhaust orifice
meter, interfaced with a circular chart recorder 1, 2
(b) Continuous High-Volume: EPA-designated equivalent
instrumentation 2, 3
62. Carbon Monoxide (ug/m3 or ppm)
Non-Dispersive Infrared Spectrometry: Carbon monoxide
analyzer; Pump, flow control valve, and flowmeter; In-line
filter for particles (2-10 um); Moisture control (refriger-
ation unit or drying tube) 2, 3
63. Photochemical Oxidant (O3) (ug/m3 or ppm)
Chemiluminescence, Continuous: Commercial photochemical
oxidant (O3) analyzer, or apparatus consisting of
(1) Detector cell, (2) Flowmeters (air and ethylene),
(3) Air inlet filter (Teflon, 5 um), (4) Photomultiplier tube,
(5) High voltage power supply, (6) Direct current amplifier,
(7) Recorder, (8) Ozone source (low pressure Hg lamp/quartz
tube) and dilution system; Apparatus for calibration
(KI -> I2 Spectrophotometric Method) 2, 3
64. Total Hydrocarbons (Corrected for Methane) GC - FID
Method: Commercially-available THC, CH4, and CO Analyzer;
Pump, flow control valves, automatic switching valves, and
flowmeter; In-line filter (3-5 um); Stripper or precolumn;
Oven (for column and catalytic converter) 2, 3
140
-------
APPENDIX G (continued)
Skill
Rating No.
AMBIENT AIR (continued)
65. Nitrogen Dioxide (ug/m3 or ppm)
(a) Arsenite 24-Hour Sampling Method: Sampling train
(bubbler, trap, membrane filter, 27-gauge hypodermic
needle, air pump, calibration equipment); Standard
glassware (volumetrics, pipets, graduated cylinders,
etc.); Spectrophotometer or colorimeter for use at
540 nm 2, 3
(b) Continuous Chemiluminescent Method: Commercial chemi-
luminescent analyzer [generally including particulate
filter, thermal converter (NO2 -> NO), ozone generator,
reaction chamber, optical filter, photomultiplier tube,
and vacuum pump; instrument will be specified as EPA
equivalent method]; Calibration apparatus (Gas-Phase
Titration Method) [generally including air flow con-
troller, air flowmeters, pressure regulator for NO
cylinder, NO flowmeters, capillary restriction, ozone
generator, reaction chamber and mixing bulb, sample
manifold, NO detector, iodometric calibration apparatus]... 2, 3
(c) Griess-Saltzman Colorimetric, Continuous: Sampling
train; Colorimeter for use at 550 nm 2, 3
SOURCE AIR
66. Stack Gas Velocity (EPA Method 2): Pitot tube 2
67. Dry Molecular Weight of Gas (EPA Method 3)
(a) Orsat 2
(b) Gas chromatograph with thermal conductivity detector 2
68. Stack Gas Moisture (EPA Method 4): Midget impingers;
Sample metering pump 2
69. Particulates (EPA Method 5): Heated filter holder;
Impingers; Sample metering pump; Analytical balance;
Heated probe 2, 3
70. Sulfur Dioxide (EPA Method 6): Impingers; Sample
metering pump; Burettes 2, 3
141
-------
APPENDIX G (continued)
Skill
Rating No.
SOURCE AIR (Continued)
71. Nitrogen Oxide (EPA Method 7): Round-bottom flasks; Vacuum
pump; Hot plate; Spectrophotometer 2, 3
72. Sulfuric Acid Mist (EPA Method 8): Source sampling train;
Burettes 2, 3
73. Visible Emissions (EPA Method 9): Stopwatch 2
74. Carbon Monoxide (EPA Method 10): Non-dispersive infrared
analyzer 2, 3
75. Hydrogen Sulfide (EPA Method 11): Same as 70 2, 3
76. Fluoride
(a) EPA Method 13A: Spectrophotometer 2, 3
(b) EPA Method 13B: Fluoride-specific ion electrodes;
Specific ion meter 2, 3
77. Sulfur Compounds (EPA Method 15): Gas chromatograph with
flame photometric detector; All Teflon dilution system; Mass
flowmeter; Heated sampling line; Permeation system 3, 4
78. Sulfur Compounds (EPA Method 16): Two gas chromatographs
with flame photometric detectors; All Teflon dilution system;
Mass flowmeter; Heated sampling line; Permeation system 3, 4
79. Particulates (EPA Method 17): In-stack filter holder;
Impingers; Sample metering pump; Analytical balance 2, 3
BIOLOGY
80. Wildlife and Terrestrial Methods
(a) Cover and Habitat Assessment: Ocular instrumentation 3, 4
(b) Mapping and Vegetation Analysis: Planimeter;
Statistical instrumentation 2, 3, 4
(c) Photo Interpretation: Stereoscope 3, 4
(d) Biomass Determination: General laboratory
instrumentation 1, 2, 3
142
-------
APPENDIX G (continued)
Skill
Rating No.
BIOLOGY (Continued)
(e) Species Composition/Density: Statistical
instrumentation 2, 3, 4
(f) Ecosystem Analysis: Statistical instrumentation;
Graphical instrumentation 3, 4
(g) Community Analysis: Same as (f) 3, 4
(h) Systems Analysis: Computer 4
(i) Inventory of Mammals: Ocular instrumentation;
Mechanical traps 1, 2, 3, 4
(j) Inventory of Birds: Same as (i) 2, 3, 4
(k) Inventory of Reptiles and Amphibians: Same as (i) 2, 3, 4
(l) Inventory of Terrestrial Insects: Same as (i) 2, 3, 4
81. Bioassay
(a) Culture Maintenance: General laboratory equipment 2
(b) Routine Static Assays: General laboratory equipment 2
(c) Chronic Assays: Proportional diluters; Metering
pumps; Photometers 3
(d) Non-Routine Assays: Dependent on type of assay 3, 4
(e) Physiological/Biochemical: Multi-channel recorder;
Photometer 3, 4
82. Fish: Age and Growth
(a) Collection: Electrofisher; Nets; Seines; Trawls;
Traps; Toxicants 1, 2
(b) Scale Analysis: Scale press; Scale reader 2
83. Fish: Population Dynamics
(a) Collection: Same as 82(a) 1, 2
143
-------
APPENDIX G (continued)
Skill
Rating No.
BIOLOGY (Continued)
(b) Mark and Recapture Methods: Method-specific materials 1, 2, 3
(c) Predictive Models: Statistical instrumentation 3, 4
84. Fish: Feeding Habits
(a) Collection: Same as 82(a) 1, 2
(b) Stomach Content Analysis 2, 3
85. Phytoplankton
(a) Collection: Plankton net 1, 2, 3
(b) Analysis: Microscope with Sedgwick-Rafter Cell 3, 4
86. Zooplankton
(a) Collection: Zooplankton net 1, 2, 3
(b) Analysis: Microscope
87. Periphyton
(a) Collection: Glass slides attached to floats or other
sampling apparatus 1, 2, 3
(b) Analysis: Microscope 3, 4
88. Aquatic and Terrestrial
(a) Specific Macrophyte Collection: Macrophyte standard
sampler 2, 3
(b) Macrophyte Taxonomy: Ocular instrumentation 2, 3, 4
89. Chlorophyll
(a) Spectrophotometric: Variable wavelength spectro-
photometer 2, 3
(b) Fluorometric: Fluorometer 2, 3
144
-------
APPENDIX G (continued)
Skill
Rating No.
RADIATION
The analysis of radiological parameters requires special expertise
and experience, in addition to up-to-date, well-maintained,
calibrated instrumentation and apparatus.
90. Alpha, Total (pCi/liter): Windowless Gas-Flow Proportional
Counter and associated equipment, or Thin Window Gas-Flow
Proportional Counter and associated equipment, or Alpha
Scintillation Counter and associated equipment, or Alpha
Spectrometer System (Surface Barrier Type) and associated
equipment 2, 3, 4
91. Alpha Counting Error (pCi/liter): Same as 90 2, 3, 4
92. Beta, Total (pCi/liter): Windowless Gas-Flow Proportional
Counter and associated equipment, or Thin Window Gas-Flow
Proportional Counter and associated equipment, or Beta
Scintillation Counter and associated equipment, or Liquid
Scintillation Counter and associated equipment 2, 3, 4
93. Beta Counting Error (pCi/liter): Same as 92 2, 3, 4
94. Radium, Total (pCi/liter): Windowless gas-flow proportional
counter and associated equipment, or Thin window gas-flow
proportional counter and associated equipment, or Alpha
scintillation counter and associated equipment, or Alpha
spectrometer (surface barrier type) system and associated
equipment, or Radon gas counting system and associated
equipment 2, 3, 4
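Items 91 and 93 report a counting error alongside each gross alpha or beta result. For any counting measurement, a two-sigma (approximately 95 percent) error follows from Poisson statistics of the gross and background counts; the sketch below uses that conventional propagation, not a formula quoted from the methods referenced above, and the count data are hypothetical.

    # Two-sigma counting error for a gross alpha or beta count rate, from
    # Poisson counting statistics. All count data are hypothetical.
    import math

    gross_counts = 1250.0      # counts observed in the sample interval
    t_sample = 100.0           # minutes
    background_counts = 400.0  # counts observed in the background interval
    t_background = 200.0       # minutes

    net_rate = gross_counts / t_sample - background_counts / t_background
    two_sigma = 2.0 * math.sqrt(gross_counts / t_sample ** 2
                                + background_counts / t_background ** 2)
    print("net rate = %.2f +/- %.2f cpm (2 sigma)" % (net_rate, two_sigma))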
145
-------
APPENDIX H
SAMPLE PRESERVATION METHODS AND RECOMMENDED HOLDING TIMES
TABLE H-1. WATER PARAMETERS(1,2)

                               Vol. Req.
Measurement                    (ml)       Container   Preservative          Holding Time(7)

Acidity                        100        P, G(3)     Cool, 4°C             24 Hours
Alkalinity                     100        P, G        Cool, 4°C             24 Hours
Arsenic                        100        P, G        HNO3 to pH<2          6 Months
Biological Oxygen
  Demand (BOD)                 1,000      P, G        Cool, 4°C             6 Hours(4)
Bromide                        100        P, G        Cool, 4°C             24 Hours
Chemical Oxygen
  Demand (COD)                 50         P, G        H2SO4 to pH<2         7 Days
Chloride                       50         P, G        None Req.             7 Days
Chlorine Req.                  50         P, G        Cool, 4°C             24 Hours
Color                          50         P, G        Cool, 4°C             24 Hours
Cyanides                       500        P, G        Cool, 4°C;            24 Hours
                                                      NaOH to pH 12
Dissolved Oxygen
  Probe                        300        G only      Det. on site          No Holding
  Winkler                      300        G only      Fix on site           No Holding
Fluoride                       300        P, G        Cool, 4°C             7 Days
Hardness                       100        P, G        Cool, 4°C             7 Days
Iodide                         100        P, G        Cool, 4°C             24 Hours
MBAS                           250        P, G        Cool, 4°C             24 Hours
Metals
  Dissolved                    200        P, G        Filter on site;       6 Months
                                                      HNO3 to pH<2
  Suspended                    100        P, G        Filter on site        6 Months
  Total                        100        P, G        HNO3 to pH<2          6 Months
Mercury
  Dissolved                               P, G        Filter;               38 Days (Glass)
                                                      HNO3 to pH<2          13 Days (Hard Plastic)

(continued)
146
-------
TABLE H-1 (continued)

                               Vol. Req.
Measurement                    (ml)       Container   Preservative          Holding Time(7)

Mercury
  Total                        100        P, G        HNO3 to pH<2          38 Days (Glass)
                                                                            13 Days (Hard Plastic)
Nitrogen
  Ammonia                      400        P, G        Cool, 4°C;            24 Hours(5)
                                                      H2SO4 to pH<2
  Kjeldahl                     500        P, G        Cool, 4°C;            24 Hours(5)
                                                      H2SO4 to pH<2
  Nitrate                      100        P, G        Cool, 4°C;            24 Hours(5)
                                                      H2SO4 to pH<2
  Nitrite                      50         P, G        Cool, 4°C             24 Hours(5)
NTA                            50         P, G        Cool, 4°C             24 Hours
Oil & Grease                   1,000      G only      Cool, 4°C;            24 Hours
                                                      H2SO4 to pH<2
Organic Carbon                 25         P, G        Cool, 4°C;            24 Hours
                                                      H2SO4 to pH<2
pH                             25         P, G        Cool, 4°C;            6 Hours(4)
                                                      Det. on site
Phenolics                      500        G only      Cool, 4°C;            24 Hours
                                                      H3PO4 to pH<4;
                                                      1.0 g CuSO4/l
Phosphorus
  Orthophosphate,              50         P, G        Filter on site;       24 Hours(5)
    Dissolved                                         Cool, 4°C
  Hydrolyzable                 50         P, G        Cool, 4°C;            24 Hours(5)
                                                      H2SO4 to pH<2
  Total                        50         P, G        Cool, 4°C             24 Hours(5)
  Total, Dissolved             50         P, G        Filter on site;       24 Hours(5)
                                                      Cool, 4°C
Residue
  Filterable                   100        P, G        Cool, 4°C             7 Days
  Non-Filterable               100        P, G        Cool, 4°C             7 Days
  Total                        100        P, G        Cool, 4°C             7 Days
  Volatile                     100        P, G        Cool, 4°C             7 Days
Settleable Matter              1,000      P, G        None Req.             24 Hours
Selenium                       50         P, G        HNO3 to pH<2          6 Months
Silica                         50         P only      Cool, 4°C             7 Days
Specific Conductance           100        P, G        Cool, 4°C             24 Hours(6)

(continued)
147
-------
TABLE H-1 (continued)

                               Vol. Req.
Measurement                    (ml)       Container   Preservative          Holding Time(7)

Sulfate                        50         P, G        Cool, 4°C             7 Days
Sulfide                        50         P, G        2 ml zinc acetate     24 Hours
Sulfite                        50         P, G        Cool, 4°C             24 Hours
Temperature                    1,000      P, G        Det. on site          No Holding
Threshold Odor                 200        G only      Cool, 4°C             24 Hours
Turbidity                      100        P, G        Cool, 4°C             7 Days
1. More specific instructions for preservation and sampling are found
with each procedure as detailed in this manual. A general discus-
sion on sampling water and industrial wastewater may be found in
ASTM, Part 31, p. 68-78 (1978).
2. U.S. Environmental Protection Agency Office of Research and Develop-
ment. Environmental Monitoring and Support Laboratory. 1979.
Methods for Chemical Analyses of Water and Wastes. Cincinnati,
Ohio. EPA-625/6-74-003.
3. Plastic or Glass.
4. If samples cannot be returned to the laboratory in less than 6 hours
and holding time exceeds this limit, the final reported data should
indicate the actual holding time.
5. Mercuric chloride may be used as an alternate preservative at a
concentration of 40 mg/1, especially if a longer holding time is
required. However, the use of mercuric chloride is discouraged
whenever possible.
6. If the sample is stabilized by cooling, it should be warmed to 25°C
for reading, or temperature correction should be made and results
reported at 25°C.
7. It has been shown that samples properly preserved may be held for
extended periods beyond the recommended holding time.
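Footnote 6 calls for a temperature correction when specific conductance is read at other than 25°C. A common linear correction assumes roughly a 2 percent change per degree; the sketch below uses a typical textbook coefficient of 1.91 percent per degree, which is an assumption and not a value taken from this manual.

    # Correct a specific conductance reading to 25 degrees C with a linear
    # temperature coefficient. Coefficient and readings are assumed values.
    measured_umho_per_cm = 512.0
    temperature_c = 18.0
    coefficient_per_degree = 0.0191   # about 1.91 percent per degree C

    k25 = measured_umho_per_cm / (1.0 + coefficient_per_degree
                                  * (temperature_c - 25.0))
    print("specific conductance at 25 C = %.0f umho/cm" % k25)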
148
-------
TABLE H-2. AMBIENT AIR SAMPLES
Recommended
Parameter holding time Preservation method
Particulate Filters Indefinite Store in controlled
atmosphere of <50%
relative humidity
Sulfur Dioxide 30 days, if Store at <4°C after
(Pararosaniline Method) properly stored collection, during
transport, and
before analysis
Nitrogen Oxides 6 weeks Samples are stable
(Sodium-Arsenite Method) for 6 weeks at room
temperature
Fluoride None Collect and store
in plastic
containers
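Tables H-1 and H-2 are easiest to apply as a lookup at sample check-in. The sketch below compares elapsed time against recommended holding times drawn from Table H-1 for a few example parameters; footnotes 4 and 7, on exceeded limits and properly preserved samples, are noted in comments.

    # Check elapsed holding time against recommended limits (in hours)
    # taken from Table H-1 for a few example parameters.
    from datetime import datetime

    holding_limit_hours = {
        "Acidity": 24,
        "Chemical Oxygen Demand": 7 * 24,
        "Arsenic": 6 * 30 * 24,   # 6 months, approximated in hours
    }

    collected = datetime(1979, 6, 4, 9, 30)
    analyzed = datetime(1979, 6, 5, 14, 0)
    elapsed = (analyzed - collected).total_seconds() / 3600.0

    for parameter, limit in holding_limit_hours.items():
        status = "within limit" if elapsed <= limit else "exceeded"
        # Footnote 4: if a limit is exceeded, report the actual holding time
        # with the data. Footnote 7: properly preserved samples may be held
        # beyond the recommended time.
        print("%s: %.1f h elapsed, limit %d h -> %s"
              % (parameter, elapsed, limit, status))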
149
-------
APPENDIX I
QUALITY ASSURANCE AUDIT CHECKLIST
This form has been prepared to be used in the on-site audit of projects.
Personnel performing the audit should list in this checklist any questions
which were not clear from the information provided by the Quality Assurance
Pre-Audit Worksheet and obtain answers during the actual site visit.
This form is to be used in the actual performance of an on-site project
audit.
This checklist is to be completed by the personnel involved in the on-site
audit.
150
-------
APPENDIX I (Continued)
QUALITY ASSURANCE AUDIT CHECKLIST
Laboratory:
Street Address:
Mailing Address (If Different):
City State Zip
Laboratory Telephone No.: Area Code No.
Laboratory Director:
Quality Assurance Supervisor:
Personnel Contacted During Audit:
Name Title
Contract Number:
Contract Title:
Project Officer:
Audit Conducted By:
Agency and Address:
Telephone No.: Area Code No.
Date Audit Performed:
151
-------
APPENDIX I (continued)
A. ORGANIZATION AND PERSONNEL
A.1. Review the Pre-Audit Worksheet and list questions from the Organ-
ization and Personnel section of the Pre-Audit Worksheet to be
discussed during the QA audit.
Q1.
A1.
Q2.
A2.
Q3.
A3.
Q4.
A4.
Q5.
A5.
152
-------
APPENDIX I (continued)
A.2. Organization and Personnel Checklist

Item                                                        Yes   No   Comment
Do personnel assigned to this project have the
appropriate educational background to success-
fully accomplish the objectives of the program?
Do personnel assigned to this project have the
appropriate level and type of experience to
successfully accomplish the objectives of this
program?
Is project organization appropriate to
accomplish the objectives of this program?
Is the project adequately staffed to meet
project commitments in a timely manner?
Are project reporting relationships clear?
If any special training or experience is
required, is it represented on the project staff?
Does the laboratory have a Quality Assurance
Supervisor who reports to senior management
levels?
Was the Project Manager available during the
QA audit?
Was the Quality Assurance Supervisor available
during the QA audit?
Does the project schedule show adequate time to
accomplish the sampling program and does it allow
for uncontrollable delays, such as bad weather?
Does the project schedule allow sufficient time
between sample collection and reporting of the
data to apply adequate analytical quality control,
including supervisory review of the data?
153
-------
APPENDIX I (continued)
A. 3. Does the project organization plan and schedule give adequate attention
and time to the sampling and analysis effort?
Comments on project organization and schedule:
A.4. Are the personnel assigned to this project generally qualified to
accomplish the objectives of the program?
Comments on personnel:
154
-------
APPENDIX I (continued)
B. FACILITIES
When touring the facilities, give special attention to: (a) the overall
appearance of organization and neatness, (b) the proper maintenance of
facilities and instrumentation, (c) the general adequacy of the facilities
to accomplish the required work, and (d) sampling equipment required for the
project.
B.1. General Facilities Checklist

Item                                                        Yes   No   Comment
Does the laboratory appear to have adequate
workspace (120 sq. feet, 6 linear feet of
unencumbered bench space per analyst)?
Are voltage control devices used on major
instrumentation (e.g., GC/MS, spectropho-
tometers)?
Does the laboratory have a source of distilled/
demineralized water?
Is the conductivity of distilled/demineralized
water routinely checked and recorded?
Is the analytical balance located away from draft
and areas subject to rapid temperature changes?
Has the balance been calibrated within one year?
Are exhaust hoods provided to allow
organized work with volatile materials?
Is the laboratory maintained in a clean and
organized manner?
Are safe and contamination-free work areas
provided for the handling of toxic or radio-
active materials?
Are the radioactive and/or toxic chemical
handling areas either a stainless steel bench
or an impervious material covered with absorbent
material?
Are adequate facilities provided for storage of
samples, including cold storage?
Are chemical waste disposal facilities adequate?
Are contamination-free areas provided for trace
level analytical work?
Can the laboratory supervisor document that trace
metals-free water is available for preparation of
standards and blanks?
Is organic-free water available for preparation of
standards and blanks?
If biotesting is to be conducted, are adequate
environment-controlled facilities available (e.g.,
light, temperature control)?
Is the required field instrumentation and sampling
equipment properly maintained?
Is adequate safety equipment (fire extinguishers,
showers, eyewash stations) located throughout the
laboratory?
If bacteriological analyses are to be conducted,
is an aseptic work area provided?
Are bacteriological incubators maintained at the
proper temperature (35 ± 0.5°C for total coliform
and fecal streptococcus; 44.5 ± 0.2°C for fecal
coliform)?
Are boats, motors, vehicles, and other mobile
facilities available as required?
B.2. Instruments. List the major laboratory and in situ analytical
instruments that will be used. Complete an instrument evaluation form
on each one.
Instrument                                        Analysis
INSTRUMENT EVALUATION
Instrument:
Instrument Mfg.
Model:
Year of Acquisition:
Condition:
Calibration Frequency:
Service Maintenance Frequency:
Other Pertinent Information:
Are Manufacturer's operating manuals readily
available to the operator?
Is there a calibration protocol available to the
operator?
Are calibrations kept in a permanent record?
Is a permanent service record maintained?
Has the instrument been modified in any way?
SATISFACTORY?     yes     no
Comments:
INSTRUMENT EVALUATION
Instrument:
Instrument Mfg.
Model:
Year of Acquisition:
Condition:
Calibration Frequency:
Service Maintenance Frequency:
Other Pertinent Information:
Are Manufacturer's operating manuals readily
available to the operator?
Is there a calibration protocol available to the
operator?
Are calibrations kept in a permanent record?
Is a permanent service record maintained?
Has the instrument been modified in any way?
SATISFACTORY?     yes     no
Comments:
INSTRUMENT EVALUATION
Instrument:
Instrument Mfg.
Model:
Year of Acquisition:
Condition:
Calibration Frequency:
Service Maintenance Frequency:
Other Pertinent Information:
Are Manufacturer's operating manuals readily
available to the operator?
Is there a calibration protocol available to the
operator?
Are calibrations kept in a permanent record?
Is a permanent service record maintained?
Has the instrument been modified in any way?
SATISFACTORY?     yes     no
Comments:
C. ANALYTICAL METHODOLOGY
C.1. Review the Pre-Audit Worksheet and list items from the Analytical
Methodology section of the Pre-Audit Worksheet to be discussed
during the QA audit.
Q1.
A1.
Q2.
A2.
Q3.
A3.
Q4.
A4.
Q5.
A5.
C.2. Conduct discussions with two or more individuals who have
analytical responsibilities in connection with the project. The
following points should be addressed to determine each individ-
ual's awareness and application of QA/QC procedures:
1. Specific project responsibilities,
2. Level of knowledge of the analytical methods used,
3. Awareness of and adherence to the laboratory's QC procedures,
and
4. Appearance and accuracy of the work records.
Analyst
Name Responsibility
Comments:
Analyst
Name Responsibility
Comments:
Analyst
Name                                  Responsibility
Comments:

Analyst
Name                                  Responsibility
Comments:
C.3. Analytical Methodology Checklist.
Item                                                      Yes   No   Comment
Are standard methods (e.g., EPA, ASTM, Standard
Methods for the Examination of Water and
Wastewater) used when available?
Have standard methods been altered in any way?
If so, is it justified?
Are written analytical procedures provided to
the analyst?
Are reagent grade or higher purity chemicals
used to prepare standards?
Are samples analyzed within the linear range
of the method in all cases?
Does the standard curve bracket the concen-
tration of the samples on each sample run?
Are fresh analytical standards prepared at
the required frequency?
Are standards run periodically during a
long sample run?
Are reference standards properly labeled with
concentrations, date of preparation, and the
identity of the person who prepared them?
Do the analysts record bench data in a neat
and accurate manner?
Is the appropriate instrumentation used in
accordance with standard procedures?
Are methods used which are appropriate to the
sample matrix (e.g., saline waters, wastewaters)?
Item                                                      Yes   No   Comment
Are analytical detection limits adequate for
the purposes of the project?
Are the analytical procedures used adequately
documented? For example, if a standard method
is not available, is a written procedure
incorporated into the project plan?
Are all strip charts properly labeled with
instrument conditions, date, and sample
numbers?
Are samples properly handled (e.g., organized,
chilled as necessary, appropriate containers)
before, during, and after analysis?
C.4. Are the analytical methods used satisfactory to accomplish the objectives
of the program? Are laboratory practices acceptable?
Comments on analytical methods and practices:
D. SAMPLING AND SAMPLE HANDLING
D.1. Review the Pre-Audit Worksheet and list the items from the Sampling
and Sample Handling section of the Pre-Audit Worksheet to be
discussed.
Q1.
A1.
Q2.
A2.
Q3.
A3.
Q4.
A4.
Q5.
A5.
D.2. Conduct discussions with two or more individuals who have sampling
responsibilities in connection with the project. The following
points should be addressed to determine each individual's aware-
ness and application of appropriate sampling procedures:
1. Specific project responsibility,
2. Level of knowledge of acceptable sampling procedures,
3. Adherence to the project sampling plan, and
4. Neatness and accuracy of field records.
Field Technician
Name Responsibility
Comments:
Field Technician
Name Responsibility
Comments:
D.3. Sampling Equipment and Procedures Checklist
Item                                                      Yes   No   Comment
Are the sampling procedures specifically
defined in the project QA plan or other
referenced document?
Is the sampling program well organized?
Have appropriate techniques been used in
selecting sampling sites?
Are proper containers used for sample
collection, transport, and storage?
Are sample containers properly prepared before
sample collection to avoid sample contamination?
Containers for organics should be solvent rinsed;
for trace metals, acid rinsed.
Are the proper preservatives used in the samples
for each parameter? (See Appendix H)
Are permanent labels affixed to sample containers?
Do the sample labels contain adequate information
(date, time, sample location, sampler) and a
unique sample identification number?
Are proper techniques used to collect representa-
tive samples while avoiding sample contamination?
Are duplicate samples collected? At what frequency?
Are samples shipped promptly to the laboratory in
order to meet recommended holding time deadlines?
(See Appendix H)
Are chain-of-custody records available for inspec-
tion? Are they neat and understandable? Have the
required custody signatures been obtained?
D.4. Field Notebooks. Review one or more field notebooks and determine if
the following information is recorded.
Item                                                      Yes   No   Comment
Is a permanent bound notebook used to
record all field data and observations?
Is the notebook reasonably neat and
organized, considering the use under
adverse field conditions?
Are field instrument and in situ instru-
ment calibration data recorded daily?
Are sample location, time, and number
accurately and completely recorded?
Are in situ data neatly recorded in an
understandable manner?
Are ambient data (i.e., weather)
recorded when appropriate?
Are the necessary engineering data (e.g.,
flow, operating conditions) recorded?
Have supervisory personnel reviewed the field
notebook and so indicated by their signature?
D.5. Is the sampling program adequate to accomplish the objectives of the
project?
Comments on the sampling program and sample collection:
E. QUALITY CONTROL
E.1. Review the Pre-Audit Worksheet and list items from the Quality
Control section of the Pre-Audit Worksheet to be discussed.
Q1.
A1.
Q2.
A2.
Q3.
A3.
Q4.
A4.
Q5.
A5.
E.2. Quality Control Manual Checklist
Item                                                      Yes   No   Comment
Does the laboratory maintain a
Quality Control Manual?
Does the manual address the important
elements of a QC program, including the
following:
a. Personnel?
b. Facilities and equipment?
c. Configuration control of instruments?
d. Documentation of procedures?
e. Procurement and inventory practices?
f. Preventive maintenance?
g. Reliability of data?
h. Data validation?
i. Feedback and corrective action?
j. Instrument calibration?
k. Record keeping?
l. Internal audits?
Does the QC Manual specify the frequency of
duplication and spiked sample analysis?
Is at least 10 percent sample duplication
required?
Are QC responsibilities and reporting
relationships clearly defined?
E.3. Quality Control Procedures Checklist
Item                                                      Yes   No   Comment
Select a representative number of
analyses from the project list and review
historical quality control data for these
parameters. Are QC records adequate for
the purposes of the project?
Are reference standards analyzed with
each set of samples?
Have standard curves been adequately
documented?
Are laboratory standards traceable to
the National Bureau of Standards, where
appropriate?
Have standards been analyzed every
20 or fewer samples to verify that the
analytical method is in control?
Have the prescribed number (QC Manual or
Project Plan) of duplicate and spiked
samples been analyzed?
Do duplicate data fall within
acceptable limits of precision?
Are recoveries, calculated from spiked
sample data, acceptable?
Are quality control charts maintained
for each routine analysis?
Do QC records show corrective action
when analytical results fail to meet
QC criteria?
Do supervisory personnel review the
data and QC results?
E.4. Are quality control procedures and records generally adequate to
accomplish the objectives of the project?
Comments on quality control procedures and records:
F. DATA HANDLING
F.1. Review the Pre-Audit Worksheet and list items from the Data
Handling section of the Pre-Audit Worksheet to be discussed.
Q1.
A1.
Q2.
A2.
Q3.
A3.
Q4.
A4.
Q5.
A5.
F.2. Data Handling Checklist
Item                                                      Yes   No   Comment
Ask for a demonstration of data handling
procedures from initial sample check-in
to reporting of the final data. Are
these procedures clear and adequate to
avoid data errors?
Are data calculations checked by a
second person?
Are data calculations documented?
Do records indicate corrective action
that has been taken on rejected data?
Are limits of detection determined and
reported properly?
Are results which are below the analytical
detection limit reported as such?
Are the data reported to a justifiable
number of significant figures?
Are all data and records retained at least
3 years beyond completion of the project?
Are quality control data (e.g., standard
curve, results of duplication and spikes)
accessible for all analytical results?
Are data reported in the appropriate units
(e.g., ppm, mg/l, dry weight, metric
measure)?
F.3. Are data handling procedures adequate to accomplish the objectives
of the project and to trace the accompanying quality control
results?
Comments on data handling:
G. SUMMARY
G.I. Summary Checklist
Item                                                      Yes   No   Comment
Do responses to audit questions indicate that
project and supervisory personnel are aware
of QA and its application to the project?
Do project and supervisory personnel place
positive emphasis on QA/QC?
Have responses with respect to QA/QC aspects
of the project been open and direct?
Has a cooperative attitude been displayed
by all project and supervisory personnel?
Are the personnel assigned to the
project qualified?
Does the organization place the proper
emphasis on quality assurance?
Have any QA/QC deficiencies been
discussed before leaving?
Is the overall quality assurance
adequate to accomplish the objectives
of the project?
Are any corrective actions required?
If so, list the necessary actions below.
G.2. Summary Comments and Corrective Actions
GLOSSARY
agency: the United States Environmental Protection Agency (EPA).
analytical or reagent blank: a blank used as a baseline for the analytical
portion of a method. For example, a blank consisting of a sample from a
batch of absorbing solution used for normal samples, but processed
through the analytical system only, and used to adjust or correct
routine analytical results.
blank or sample blank: a sample of a carrying agent (gas, liquid, or solid)
that is normally used to selectively capture a material of interest and
that is subjected to the usual analytical or measurement process to
establish a zero baseline or background value, which is used to adjust
or correct routine analytical results.
confidence interval: a value interval that has a designated probability (the
confidence coefficient) of including some defined parameter of the
population.
confidence limits: the outer boundaries of a confidence interval.
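     As a numerical illustration (the values are invented and are not part
     of the original glossary), a two-sided 95 percent confidence interval
     for a population mean, assuming approximately normal data, is

     $$\bar{X} \pm t_{0.975,\,n-1}\frac{s}{\sqrt{n}}$$

     so that for $n = 10$, $\bar{X} = 10.0$, and $s = 2.0$, the interval is
     $10.0 \pm 2.262 \times 2.0/\sqrt{10} \approx 10.0 \pm 1.4$; the
     confidence limits are approximately 8.6 and 11.4.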
contract: the legal instrument reflecting a relationship between the Federal
Government and a State or local government or other recipient:
(1) whenever the principal purpose of the instrument is the acquisition,
by purchase, lease, or barter, of property or services for the direct
benefit or use of the Federal Government; or (2) whenever an executive
agency determines in a specific instance that the use of a type of
procurement contract is appropriate.
cooperative agreement: the legal instrument reflecting the relationship
between the Federal Government and a State or local government or other
recipient whenever: (1) the principal purpose of the relationship is
the transfer of money, property, services, or anything of value to the
State or local government or other recipient to accomplish a public
purpose of support or stimulation authorized by Federal statute, rather
than acquisition, by purchase, lease, or barter, of property or services
for the direct benefit or use of the Federal Government; and (2) sub-
stantial involvement is anticipated between the executive agency acting
for the Federal Government and the State or local government or other
recipient during performance of the contemplated activity.
data validation: a systematic effort to review data to identify any outliers
or errors and thereby cause deletion or flagging of suspect values to
assure the validity of the data to the user. This "screening" process
may be done by manual and/or computer methods, and it may utilize any
consistent technique, from simple limits that screen out impossible
values to more complicated checks of acceptable relationships of the
data with other data.
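     As an illustration of the simple-limit screening described above, the
     following minimal Python sketch (not part of the original guidelines;
     the parameter names and limits are invented) flags impossible values:

         # Hypothetical plausibility limits for two parameters (illustrative only).
         LIMITS = {"pH": (0.0, 14.0), "temperature_C": (-5.0, 45.0)}

         def screen(records):
             """Return the (parameter, value) pairs falling outside plausible limits."""
             flagged = []
             for parameter, value in records:
                 low, high = LIMITS[parameter]
                 if not low <= value <= high:
                     flagged.append((parameter, value))
             return flagged

         # A pH of 16.3 is physically impossible and would be flagged for review.
         print(screen([("pH", 7.2), ("temperature_C", 21.5), ("pH", 16.3)]))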
extramural review: technical and scientific review of a research or
demonstration proposal by a qualified individual not an employee of the
Environmental Protection Agency, such as an employee of industry or an
academic institution.
grant: the legal instrument reflecting the relationship between the Federal
Government and a State or local government or other recipient in which:
(1) the principal purpose of the relationship is the transfer of money,
property, services, or anything of value to the State or local govern-
ment or other recipient in order to accomplish a public purpose of
support or stimulation authorized by Federal statute, rather than
acquisition, by purchase, lease, or barter, of property or services for
the direct benefit or use of the Federal Government; and (2) no sub-
stantial involvement is anticipated between the executive agency, acting
for the Federal Government, and the State or local government or other
recipient during performance of the contemplated activity.
grantee: any individual, agency, or entity that has been awarded a grant
pursuant to grant regulations or has received a cooperative agreement.
in-house project: a project carried out by EPA staff in EPA facilities.
intramural review: technical and scientific review of a research or
demonstration proposal by a qualified employee of the Environmental
Protection Agency.
measures of dispersion or variability: measures of the differences, scatter,
or variability of values of a set of numbers. Measures of the disper-
sion or variability are the range, the standard deviation, the variance,
and the coefficient of variation.
performance audit: planned independent (duplicate) sample checks of actual
output made on a random basis to arrive at a quantitative measure of the
quality of the output. These independent checks are made by an auditor
subsequent to the routine checks by a field technician or laboratory
analyst.
performance test sample: a sample or sample concentrate (to be diluted to a
specified volume before analysis) of known (to EPA only) true value
which has been statistically established by interlaboratory tests.
These samples are commonly provided to laboratories to test analytical
performance. Analytical results are reported to EPA for evaluation.
pre-application: a preliminary proposal outlining the intent of a proposed
project. Letter format is normally used, in which case the program
office to which the pre-application is referred responds directly to the
submitter to encourage or discourage followup.
pre-award survey: on-site inspection, review, and discussions with a
prospective grantee or prospective contractor at his/her facilities.
Discussions would normally include, but not be limited to, the proposed
project plan, personnel, procedures, schedule, and facilities. Normally
conducted after receipt of a "best and final" offer, but prior to final
selection of a contractor.
proficiency testing: special series of planned tests to determine the
proficiency of field technicians or laboratory analysts who normally
perform routine analyses. The results may be used for comparison against
established criteria, or for relative comparisons among the data from a
group of technicians or analysts.
program: the technical office or staff that has responsibility for a part of
the Agency's operations. For R&D grants, the "programs" are the Office
of Research and Development, the Office of Air Quality Planning and
Standards, the Office of Solid Waste Management Programs, and the Office
of Mobile Sources Air Pollution Control.
project officer: the EPA official designated in the grant or contract agree-
ment as the Agency's principal contact with the grantee on a particular
grant. This person is the individual responsible for project monitoring
and for recommendations on or approval of proposed project changes.
quality: the totality of features and characteristics of a product or service
that bears on its ability to satisfy a given purpose. For pollution
measurement systems, the product is pollution measurement data, and the
characteristics of major importance are accuracy, precision, and
completeness. For monitoring systems, "completeness," or the amount of
valid measurements obtained relative to the amount expected to have been
obtained, is usually a very important measure of quality. The relative
importance of accuracy, precision, and completeness depends upon the
particular purpose of the user.
quality assurance: actions taken by the Laboratory (IERL-Ci) line organi-
zation under the specific auspices of the Office of the Director, to
assure that quality control policies and procedures are being properly
implemented and appropriate levels of accuracy, reliability, and com-
parability are being achieved in the sampling and analysis activities
(including data reduction and handling) of the Laboratory to fulfill the
Laboratory's assigned mission.
quality assurance manual: an orderly assembly of management policies,
objectives, principles, and general procedures by which an agency or
laboratory outlines how it intends to produce quality data.
quality assurance plan: an orderly assembly of detailed and specific
procedures by which an agency or laboratory delineates how it produces
quality data for a specific project or measurement method. A given
agency or laboratory would have only one quality assurance manual, but
would have a quality assurance plan for each of its projects or programs
(group of projects using the same measurement methods; for example, a
laboratory service group might develop a plan by analytical instrument
since the service is provided to a number of projects).
quality control: actions taken by the Laboratory (IERL-Ci) organization (on
in-house projects) and by contractors/grantees (on extramural projects)
in day-to-day activities to achieve desired accuracy, reliability, and
comparability in the results obtained from sampling and analysis
activities. Review by contractors/grantees of their overall quality
control activities is "quality assurance" to them, but "quality control"
from the Laboratory's viewpoint.
range: the difference between the maximum and minimum values of a set of
values. When the number of values is small (i.e., 12 or less), the
range is a relatively sensitive (efficient) measure of variability.
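     For example (values invented for illustration), the set {8, 10, 12, 15}
     has range $15 - 8 = 7$.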
reliability (general): the ability of an item or system to perform a
required function under stated conditions for a stated period of time.
reliability (specific): the probability that an item will perform a required
function under stated conditions for a stated period of time.
representative sample: a sample taken to represent a lot or population as
accurately and precisely as possible. A representative sample may be
either a completely random sample or a stratified sample, depending upon
the objective of the sampling and the conceptual population for a given
situation.
spiked sample: a normal sample of material (gas, solid, or liquid) to which
is added a known amount of some substance of interest. The extent of
the spiking is unknown to those analyzing the sample. Spiked samples
are used to check on the performance of a routine analysis or the
recovery efficiency of a method.
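     A commonly used formulation of recovery efficiency (supplied here for
     illustration; the glossary itself does not give one) is

     $$\%R = 100 \times \frac{C_{\text{spiked}} - C_{\text{unspiked}}}{C_{\text{added}}}$$

     where $C_{\text{spiked}}$ and $C_{\text{unspiked}}$ are the measured
     concentrations with and without the spike, and $C_{\text{added}}$ is the
     known concentration added.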
standard deviation: the square root of the variance of a set of values. For
a sample of n values with arithmetic mean $\bar{X}$:

     $$s = \sqrt{\frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}}$$

If the values represent the entire population of N values:

     $$\sigma = \sqrt{\frac{\sum_{i=1}^{N} (X_i - \mu)^2}{N}}$$

where $\mu$ is the true arithmetic mean of the population. The property of
the standard deviation that makes it most practically meaningful is that
it is in the same units as the values of the set, and universal
statistical tables for the normal (and other) distributions are
expressed as a function of the standard deviation. Mathematically, the
tables could just as easily be expressed as a function of the variance.
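     As a worked illustration (values invented), for the sample {8, 10, 12},
     $\bar{X} = 10$ and

     $$s = \sqrt{\frac{(8-10)^2 + (10-10)^2 + (12-10)^2}{3-1}} = \sqrt{\frac{8}{2}} = 2$$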
standard reference material (SRM): a material produced in quantity, of which
certain properties have been certified by the National Bureau of
Standards (NBS) or other agencies to the extent possible to satisfy its
intended use. The material should be in a matrix similar to actual
samples to be measured by a measurement system or be used directly in
preparing such a matrix. Intended uses include: (1) standardization of
solutions, (2) calibration of equipment, and (3) monitoring the accuracy
and precision of measurement systems.
standard reference sample (SRS): a carefully prepared material produced from
or compared against an SRM (or other equally well characterized
material) such that there is little loss of accuracy. The sample should
have a matrix similar to actual samples used in the measurement system.
These samples are intended for use primarily as reference standards to:
(1) determine the precision and accuracy of measurement systems,
(2) evaluate calibration standards, and (3) evaluate quality control
reference samples. They may be used "as is" or as a component of a
calibration or quality control measurement system. Examples: an
NBS-certified sulfur dioxide permeation device is an SRM. When used in
conjunction with an air dilution device, the resulting gas becomes an
SRS. An NBS-certified nitric oxide gas is an SRM. When diluted with
air, the resulting gas is an SRS.
standardization: a physical or mathematical adjustment or correction of a
measurement system to make the measurements conform to predetermined
values. The adjustments or corrections are usually based on a
single-point calibration level.
calibration standard: a standard prepared by the analyst for the pur-
pose of calibrating an instrument. Laboratory control standards
are prepared independently from calibration standards for most
methods.
detection limit: that number obtained by adding two standard deviations
to the average value obtained for a series of reagent blanks that
are analyzed over a long time period (several weeks or months).
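     Restated as a formula, with hypothetical numbers for illustration:

     $$DL = \bar{X}_{\text{blank}} + 2\,s_{\text{blank}}$$

     e.g., a long-term blank average of 0.05 mg/l with a standard deviation
     of 0.01 mg/l gives a detection limit of 0.07 mg/l.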
duplicate analyses: the collection of two samples from the same field
site which are analyzed at different times but usually on the
same day.
laboratory control standard: a standard of known concentration prepared
by the analyst.
reference standard: a solution obtained from an outside source having a
known value and analyzed as a blind sample.
relative percent error for duplicate analyses: the difference between
the measured concentrations of the duplicate pair, times 100,
divided by the average of the two concentrations.
relative percent error for laboratory control standards: the difference
between the measured value and the theoretically correct value
times 100 and divided by the correct value.
relative percent error of a reference sample analysis: the difference
between the correct and measured values times 100 and divided by
the correct concentration.
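     The three relative percent error statistics above reduce to simple
     arithmetic. The following minimal Python sketch (illustrative only,
     with invented data; it is not part of the original glossary) computes
     the duplicate and known-value forms:

         def rpe_duplicates(c1, c2):
             # Difference between the duplicate measurements, times 100,
             # divided by the average of the two concentrations.
             return abs(c1 - c2) * 100.0 / ((c1 + c2) / 2.0)

         def rpe_against_known(measured, correct):
             # Difference between measured and correct values, times 100,
             # divided by the correct value; this form serves both the
             # laboratory control standard and reference sample definitions.
             return abs(measured - correct) * 100.0 / correct

         print(rpe_duplicates(9.8, 10.2))      # 4.0 (percent)
         print(rpe_against_known(9.5, 10.0))   # 5.0 (percent)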
Standards Based Upon Usage
calibration standard: a standard used to quantitate the relationship
between the output of a sensor and a property to be measured.
Calibration standards should be traceable to standard reference
materials or primary standards.
quality control reference sample (or working standard): a material used
to assess the performance of a measurement or portions thereof. It
is intended primarily for routine intralaboratory use in maintain-
ing control of accuracy and would be prepared from or traceable to
a calibration standard.
Standards Depending Upon "Purity" or Established Physical or Chemical
Constants
primary standard: a material having a known property that is stable,
that can be accurately measured or derived from established
physical or chemical constants, and that is readily reproducible.
secondary standard: a material having a property that is calibrated
against a primary standard.
standards in naturally-occurring matrix: standards relating to the pollutant
measurement portions of air pollution measurement systems may be cate-
gorized according to matrix, purity, or use. Standards in a naturally-
occurring matrix include Standard Reference Materials and Standard
Reference Samples.
statistical control chart (also Shewhart control chart): a graphical chart
with statistical control limits and plotted values (usually in chrono-
logical order) of some measured parameter for a series of samples. Use
of the charts provides a visual display of the pattern of the data,
enabling the early detection of time trends and shifts in level. For
maximum usefulness in control, such charts should be plotted in a timely
manner, i.e., as soon as the data are available.
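     A minimal sketch of the control-limit calculation behind such a chart
     (assuming the common mean ± 3 standard deviation limits for an
     individuals chart; the data are invented):

         import statistics

         def shewhart_limits(values, sigma_mult=3.0):
             # Center line and control limits for an individuals control chart.
             center = statistics.mean(values)
             spread = statistics.stdev(values)  # sample standard deviation
             return center, center - sigma_mult * spread, center + sigma_mult * spread

         def out_of_control(values, lower, upper):
             # Indices of plotted points falling outside the control limits.
             return [i for i, v in enumerate(values) if v < lower or v > upper]

         history = [98.2, 101.5, 99.8, 100.4, 97.9, 102.1, 100.0, 99.3]
         center, lower, upper = shewhart_limits(history)
         print(out_of_control(history + [112.0], lower, upper))  # flags index 8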
system audit: a systematic on-site qualitative review of facilities, equip-
ment, training, procedures, recordkeeping, validation, and reporting
aspects of the total (quality assurance) system to arrive at a measure of
the capability and ability of the system. Even though each element of
the system audit is qualitative in nature, the evaluation of each ele-
ment and the total may be quantified and scored on some subjective
basis.
Test Variability
accuracy: the degree of agreement of a measurement (or an average of
measurements of the same thing), X, with an accepted reference or
true value, T, usually expressed as the difference between the two
values, X-T, or the difference as a percentage of the reference or
true value, 100(X-T)/T, and sometimes expressed as a ratio, X/T.
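     For instance (invented values), a measurement X = 9.5 against a true
     value T = 10.0 gives X - T = -0.5, 100(X-T)/T = -5 percent, and
     X/T = 0.95.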
bias: a systematic (consistent) error in test results. Bias can exist
between test results and the true value (absolute bias, or lack of
accuracy), or between results from different sources (relative
bias). For example, if different laboratories analyze a homo-
geneous and stable blind sample, the relative biases among the
laboratories would be measured by the differences existing among
the results from the different laboratories. However, if the true
value of the blind sample were known, the absolute bias or lack of
accuracy from the true value would be known for each laboratory.
precision: a measure of mutual agreement among individual measurements
of the same property, usually under prescribed similar conditions.
Precision is most desirably expressed in terms of the standard
deviation but can be expressed in terms of the variance, range, or
other statistics. Various measures of precision exist depending
upon the "prescribed similar conditions."
replicates: repeated but independent determinations of the same sample,
by the same analyst, at essentially the same time and same condi-
tions. Care should be exercised in considering replicates of a
portion of an analysis and replicates of a complete analysis. For
example, duplicate titrations of the same digestion are not valid
replicate analyses, although they may be valid replicate titra-
tions. Replicates may be performed to any degree, e.g., dupli-
cates, triplicates, etc.
reproducibility: the precision, usually expressed as a standard devia-
tion, measuring the variability among results of measurements of
the same sample at different laboratories.
variance: mathematically, for a sample, the sum of squares of the differ-
ences between the individual values of a set and the arithmetic mean of
the set, divided by one less than the number of values.
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
EPA-600/9-79-046
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
Quality Assurance Guidelines for
IERL-Ci Project Officers
5. REPORT DATE
December 1979 (issuing date)
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
Charles L. Stratton and John D. Bonds
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Environmental Science and Engineering, Inc.
P. O. Box 13454
Gainesville, Florida 32604
10. PROGRAM ELEMENT NO.
C2HN1E
11. CONTRACT/GRANT NO.
68-03-2656
12. SPONSORING AGENCY NAME AND ADDRESS
Industrial Environmental Research Laboratory
Office of Research and Development
U. S. Environmental Protection Agency
Cincinnati, Ohio 45268
13. TYPE OF REPORT AND PERIOD COVERED
Final Report 9/78-12/79
14. SPONSORING AGENCY CODE
EPA/600/12
15. SUPPLEMENTARY NOTES
16. ABSTRACT
This document provides guidelines to Industrial Environmental Research Laboratory-
Cincinnati (IERL-Ci) Project Officers for (1) incorporating quality assurance (QA)
criteria in contract procurement and grant awards, (2) monitoring quality assurance
of extramural projects, and (3) conducting QA audits for projects involving sampling
and analysis activities. The Project Officer's responsibilities are described for
the initiation, monitoring, and satisfactory conclusion of contracts, research and
demonstration grants, and cooperative agreements of the type normally funded by
IERL-Ci. Guidance is provided to assure QA is adequately addressed during project conception
and solicitation and that prospective grantees are informed of QA requirements. A
technical evaluation system is presented for the evaluation of the QA aspects of pro-
posals and grant applications. The basic elements of an acceptable QA program and of
a project QA plan are described, and the Project Officer's role in QA monitoring is
discussed. A procedure is described for conducting QA audits of active projects.
Checklists are included to assist the Project Officer.
17. KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS
b.IDENTIFIERS/OPEN ENDED TERMS
c. COSATI Field/Group
Quality Assurance
Sampling and Analysis
Contract Procurement
QA
Quality Control
18. DISTRIBUTION STATEMENT
Release to Public
19. SECURITY CLASS (This Report)
Unclassified
21. NO. OF PAGES
196
20. SECURITY CLASS (This page)
Unclassified
22. PRICE
EPA Form 2220-1 (9-73)
U.S. GOVERNMENT PRINTING OFFICE: 1980-657-146/3599