US EPA REGION 9 GUIDANCE FOR
PREPARING QUALITY ASSURANCE PROJECT PLANS
FOR SUPERFUND REMEDIAL PROJECTS
(Document Control No. 9QA-03-89)
September, 1989
Quality Assurance Management Section
Environmental Services Branch
Office of Policy and Management
USEPA Region 9
-------
TABLE OF CONTENTS
Section                                             Page  Pages  Revision  Date
Introduction                                          1     3       0      Sept, 1989
General Guidelines                                    4     2       0      Sept, 1989
Preparation of a Quality Assurance Project Plan
I. Plan Identification                                      1
    Title and Signature Pages                         6             0      Sept, 1989
    Table of Contents                                 6             0      Sept, 1989
II. Quality Assurance Elements                             10
    Project Objectives and Organization
    1. Project Description                            7             0      Sept, 1989
    2. Data Quality Objectives                        8             0      Sept, 1989
    3. Project Organization                           9             0      Sept, 1989
    Measurement Procedures
    4. Sample Collection and Quality Control         10             0      Sept, 1989
    5. Sample Custody                                11             0      Sept, 1989
    6. Analytical and Quality Control Procedures     12             0      Sept, 1989
    Quality Assurance Management
    7. Data Quality Management                       14             0      Sept, 1989
    8. Quality Assurance Oversight                   15             0      Sept, 1989
Definitions                                          17     1       0      Sept, 1989
References                                           18     1       0      Sept, 1989
-------
Section No. Introduction
Revision No. 0
Date Sept 1989
Page 1 of 3
INTRODUCTION
Quality Assurance Policy
Environmental measurements are conducted with the goal of producing data which
are scientifically valid, are of known quality which meets the established objectives, and
are legally defensible if necessary. Environmental measurements include field or
laboratory work involving any of the following (6):
* measurement of chemical, physical, or biological parameters in the
environment;
* measurement of pollutants in waste streams;
* health- and ecological-effect studies;
* clinical and epidemiological investigations;
* laboratory simulation of environmental events;
* studies or measurements on pollution transport, including diffusion models.
Quality assurance arises from the attitude of doing a job right the first time. Although
sometimes perceived as an activity apart from the mainstream, QA is inherent in
the measurement process; i.e., determining the needs to be met ... thinking through the
operations ... anticipating the potholes ... making contingency plans ... and demonstrating
the quality of the result. Upfront planning is important for controlling or accounting for
the variables that influence the quality of the measurement data. All members of a
project planning team have QA activities in their domains whether or not they are aware
of it.
The Agency quality assurance policy states that a Quality Assurance Project Plan
must be developed and approved prior to every monitoring and measurement project or
group of similar projects (5).
Quality Assurance Project Plan Guidelines
In 1980, the Quality Assurance Management Staff at EPA Headquarters (QAMS-HQ),
which oversees and supports the Agency's QA activities, identified the elements of a
Quality Assurance Project Plan for environmental data collection. The Plan consists of
the specific organization, quality assurance objectives, methodologies and operating
procedures, and QA/QC measures designed to achieve and document the data quality.
The elements are contained in the document entitled "Interim Guidelines and
Specifications for Preparing Quality Assurance Project Plans", QAMS-005/80 (6). These
guidelines are the foundation upon which the EPA Regions may develop regional
requirements for Quality Assurance Project Plans.
Data Quality Objectives Guidelines
QAMS-HQ issued "Development of Data Quality Objectives" (4) and a "Data
Quality Objectives Checklist" (3), to illustrate a process for determining the appropriate
level of effort to reach a measurement goal. Data quality objectives (DQOs) are
quantitative and qualitative statements of the type of data needed to support a decision,
based on the level of uncertainty that a decision-maker is willing to accept and the
resources available.
-------
Section No.Introduction
Revision No. 0
Date Sept 1989
Page 2 of 3
The involvement of data-users and decision-makers early in the planning process is
emphasized. The product of the DQO process should be quantitative statements of the
precision, accuracy, detection or quantitation level, and completeness which are the goals
of the measurement effort, and qualitative statements about the representativeness and
comparability. This information is similar to but more extensive than what was required
by the QAMS 005/80 guidance for QA project plans.
Quality Assurance Program Plan and Sampling and Analysis Plan
In addition to the Quality Assurance Project Plan, two other documents are used in
planning quality assurance. These are the Quality Assurance Program Plan and the
Sampling and Analysis Plan.
The Quality Assurance Program Plan describes the QA policies of an organization
which performs environmental measurements. A Quality Assurance Program Plan is
required from each organization which performs environmental measurements under
contract to EPA. The QA policies represent a commitment by management to allocate
the time and resources necessary to produce environmental data of the quality needed.
The mechanisms for carrying out the QA policies are described in the Plan, and include
the following elements (7):
* QA personnel and responsibilities;
* Selection, inspection, and maintenance of facilities, equipment, and services;
* Qualifications and training of technical personnel;
* Data management and data quality management;
* Audits;
* Corrective action;
* Assessment of the QA program and reports to management.
Some QA activities which are the same for all sites can be described in the Quality
Assurance Program Plan.
The Sampling and Analysis Plan ("Sample Plan") is a document which is submitted
for each discrete sampling event. The Sample Plan functions as an operating manual for
field personnel, as well as for requesting laboratory services. Where the Quality
Assurance Project Plan identifies the anticipated methodologies in the project, the Sample
Plan states in detail the specific procedures selected. The Sample Plan (1) contains the
following elements:
* Objective;
* Background;
* Maps;
* Rationale;
* Request for Analysis;
* Field Methods and Procedures;
* Health and Safety Plan.
-------
Section No. Introduction
Revision No. 0
Date Sept 1989
Page 3 of 3
Regional Quality Assurance Project Plan Guidance
The QA Project Plan is the written product of the upfront planning and thought
process, providing all project participants with the same clear goals and guidelines.
The following is the Region 9 Guidance for developing QA Project Plans for
Superfund projects. It contains the elements of the QAMS-005/80 guidance, and the
products from the DQO process. Therefore, only this Guidance need be consulted in
writing the Quality Assurance Project Plan.
Acknowledgment
In addition to the QAMS-005/80 guidance, ideas (and in some cases, actual words)
used previously by other Agency personnel to explain quality assurance are credited for
shaping this guidance.
-------
Section No. Guidelines
Revision No. 0
Date Sept 1989
Page 1 of 2
GENERAL GUIDELINES
Throughout this document, guidelines intended for EPA contractors are denoted "For
Fund-Lead Projects." Guidelines intended for Potentially Responsible Parties and Federal
Facilities are denoted "For Enforcement-Lead Projects." This distinction is used to
minimize documentation efforts for EPA contractors following the standard QA planning
practices in the Region.
Format. Organize the elements of the Plan according to the sequence listed in the Table
of Contents. This format generally follows a logical train of thought for planning the
measurement operations.
Document Control Information. Display the following information on each page:
Section No.
Revision No.
Date:
Page of
This is a tool for indexing pages and posting revisions to an approved plan (therefore, its
use is optional during plan preparation.) The "Revision Number" represents the most
current version, i. e., upon approval, the first version is "0". The other entries are
self-explanatory. If the Plan is subsequently revised, the index works this way:
Page Revisions. Change the Revision Number, and revise the date. Assign a
page number to any new pages; a new page inserted between pages 5 and 6,
for example, could be numbered page 5a.
New Document. If major alterations result in a new document, return the
Revision Number to "0" and revise the Date.
Standard Operating Procedures (SOPs). Many field and laboratory operations can be
standardized and written as Standard Operating Procedures, for incorporation by reference
into the Quality Assurance Project Plan. Examples are: sampling site selection, sampling
and analytical methodology, storage containers, sample preservatives, special precautions,
instrument selection and use, calibration, maintenance, QC procedures, documentation,
document control, sample custody procedures, data handling procedures, and measurement
of precision, accuracy, and completeness. The use of SOPs by reference is described
below.
Documentation of Procedures. Include complete descriptions of all anticipated procedures
— sampling, analysis, data reduction and validation, etc. — either directly or by
reference. If a reference contains several alternative procedures, specify which one(s)
apply, or how a selection will be made.
Generally-recognized reference procedures (e. g., EPA reference methods)
should be cited by number or name. Provide the complete citation in a
footnote or on a separate reference list.
-------
Section No. Guidelines
Revision No. 0
Date Sept 1989
Page 2 of 2
Standard operating procedures (SOPs) and published procedures not approved
by EPA should be briefly summarized. The sources should be cited, and the
referenced section of the document submitted with the Plan.
New or unpublished procedures, and modifications to published procedures or
SOPs should be submitted, with the rationale for use.
Procedures in preparation are procedures which depend on information,
perhaps initial sampling results, which does not exist at the time the Plan is
prepared. In the Plan, describe the situation, and the conditions under which
the procedure will be submitted for review.
Fund-Lead Projects: state the procedures in the Sample Plan.
Enforcement-Lead Projects: when a procedure is finally developed,
submit it for EPA review. After approval, attach it to the Plan as
an addendum.
A laboratory QA manual may be used to address a given Plan element only if
it contains sufficiently detailed and specific information which can be
applied definitively to the project. Reference and provide the specific
excerpt, and state how it addresses that element for the analyses pertinent
to the project.
Fund-Lead Projects: when the Contract Laboratory Program
Routine Analytical Services (CLP-RAS) are utilized, the following
procedures are pre-established by the CLP, and should be so cited:
* laboratory sample custody;
* analytical procedures;
* laboratory instrument calibration & maintenance;
* laboratory QC checks and criteria;
* laboratory data reduction and reporting;
* laboratory audits;
* sample documentation forms.
To avoid transcription errors from re-typing a list, table, or excerpt from an
existing document, simply cite the reference if it is widely-distributed, or
photocopy the information.
Plan Content. If a particular QA element, or portion thereof, is not relevant to the
project, include in its place a brief explanation of why this is so.
-------
Section No. I
Revision No. 0
Date Sept 1989
Page 1 of 1
PREPARATION OF A QUALITY ASSURANCE PROJECT PLAN
I. PLAN IDENTIFICATION
TITLE AND SIGNATURE PAGE(S)
Include, at minimum:
* Title of the Plan
* Name of organization(s) implementing the project
* Names, titles, signatures of approving officials and approval dates,
for:
Organization's Project Manager
Organization's Quality Assurance Officer
EPA Remedial Project Manager
EPA Regional Quality Assurance Officer
Others, as needed
TABLE OF CONTENTS
List the sections, figures, tables, appendices, the number of pages in each section, the
revision number, and the date (the table of contents in this guidance illustrates the
format.) Following the table of contents, provide a distribution list of the individuals
who will receive copies of the plan and any subsequent revisions. Include all managers
responsible for implementing the plan as well as the EPA Regional QAO, on this list.
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 1 of 10
II. QUALITY ASSURANCE ELEMENTS
PROJECT OBJECTIVES AND ORGANIZATION
Before a QA project plan is written, interaction must occur among the decision-makers,
data users, and technical staff. Laboratory personnel are consulted regarding analytical
method options. The result is an understanding of the need for new data and the
expectations associated with it.
This understanding is defined in terms of data quality objectives, which are
quantitative statements of the precision, accuracy, detection or quantitation level, and
completeness goals of the measurement effort, and qualitative statements about the
representativeness and comparability. The DQO statements contain all of the information
required by the project technical staff to unambiguously proceed with designing the
project. The lines of communication set up at the onset of the project should be
maintained throughout the project, so the initial DQO estimates may be adjusted to keep
pace with incoming information.
The sections of the QA Project Plan entitled Project Description, Project
Organization, and Data Quality Objectives contain the DQO statements, which determine
the suitable sampling, analytical, and QA/QC protocols described later in the Plan.
1. PROJECT DESCRIPTION
The Project Description lays the groundwork for the DQOs by establishing the
objectives of the measurement project, the data needed, the intended uses and the data
users, and the strategy for achieving the objectives.
A. Objective and Scope
* Why is the project needed?
Briefly summarize the site background -- history of site use, reason for
environmental concern, and general conclusions of any relevant previous
studies, including matrices and substances of interest and approximate
concentration levels. Describe the adequacy of the existing data and the
resulting need for the collection of new data.
B. Data Usage
* Delineate the scope of the project, i. e., the domain (geographical locale,
environmental medium, time period, etc.) over which conclusions and decisions will
apply.
* State the time, resource, or other constraints on the measurement project.
-------
Section No. II_
Revision No. 0
Date Sept 1989
Page 2 of 1_0_
* What data are needed, and how will they be used? List or explain the following:
** the intended uses of the data, in order of importance;
** the decisions to be made for which data are needed;
** the users of the data and the decision-makers.
C. Experimental Design and Rationale
* What is the design of the project?
** Outline in general terms the experimental design of the project and the
anticipated project activities, including the sampling network design,
sampling frequencies, sample matrices, measurement parameters of interest,
and the rationale for the design. The measurement parameters include field
measurements and any hydrogeological investigations (such as particle-size
analysis.)
** Provide a project schedule or a sequence of milestones and their expected
durations. If individual sampling plans will be developed for discrete project
phases, include their preparation schedule.
2. DATA QUALITY OBJECTIVES (DQOs) FOR MEASUREMENT DATA
An environmental measurement effort is worth doing only if it produces useful
information. The quality of the data needed to meet the project objectives determines the
choice of sampling and analytical methods, and quality assurance and quality control
procedures. Therefore, DQOs must be clearly defined prior to defining the remaining
elements of the QA project plan. Without first defining DQOs, a QA program can only be
used to document the quality of data obtained, not to ensure that the quality is
sufficient (4).
One approach for developing DQOs is suggested in EPA guidance (3). However, the
level of effort devoted to developing DQOs should be appropriate to the size of the data
collection activity. It is important that a cooperative effort be undertaken by the project
manager and sampling and analytical personnel, so that DQOs are developed based on the
intended data uses as well as the sampling and analytical capabilities.
* Consider the prioritized data uses and decisions stated in the Project Description. If
possible, prepare tables and lists of the following information, from the combined inputs
of decision-makers, data users, and project design staff:
** The data needed: measurement parameters, compounds, and sample
matrices.
** The action levels or standards upon which decisions will be made,
including the data reporting units. Cite the source(s) of this information.
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 3 of 10
** The summary statistic(s), e. g., mean, maximum, range, etc., which specify
the form the data will be in when compared against action levels or
standards, and the reason for the selection.
** The acceptable level of confidence in the data needed for the stated
purposes; or the acceptable amount of uncertainty. One way of estimating
uncertainty is to sum the probabilities of committing the major types of
measurement errors (a brief sketch of this idea follows).
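As a minimal sketch of the summation idea (an illustration, not a prescribed formula):
let alpha be the probability of a false positive (e.g., concluding an action level is
exceeded when it is not) and beta the probability of a false negative. The overall chance
of a decision error is then bounded by their sum:

    P(decision error) <= alpha + beta

so stating acceptable values for alpha and beta places a ceiling on the total uncertainty
accepted by the decision-maker.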
* Tabulate the quantitative precision, accuracy, and completeness goals for each major
measurement parameter (including all pollutant measurements), based on the DQO
statements (an illustrative table follows this list):
** The numerical goals should be for the total measurement, if possible, or
the field and laboratory components separately. In the event there is no basis
for defining data quality goals for the project, goals may be estimated based
on prior knowledge of the measurement system, and on method validation
studies (using replicates, spikes, standards, recovery studies, etc.) Explain the
circumstances under which these goals were established.
** If defining numerical goals is not relevant for certain measurements,
indicate this and state the reason.
** Identify any sample types, such as control or background samples, which
require 100% completeness.
** State the units of expression of the precision and accuracy goals; these
should correspond to the methods selected to assess data precision and
accuracy described later in the Plan.
* State the goals of achieving data representativeness and comparability, and the
planning considerations for attaining these goals (some examples follow.) Unlike
precision, accuracy, and completeness, these objectives are not expressed or assessed
quantitatively. Data representativeness is reflected in the site sampling layout (sampling
locations, frequencies, and timing) and the field and laboratory sampling and analytical
scheme. Data comparability is dependent upon consistency in sampling conditions,
selection of sampling procedures, sample preservation methods, analytical methods, and
data reporting units, throughout the project.
3. PROJECT ORGANIZATION
A project is more likely to succeed if its operations are coordinated. It is essential that
all individuals be clearly aware of the entire project organization, not just their own
functional areas.
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 4 of 10
* Identify the individuals or organizations, including EPA managers, who are directly
responsible for the following areas of the project. Include a brief description of duties.
** project management;
** overall quality assurance;
** field activities (including training of field personnel, sample
collection and field measurements, and quality control);
** laboratory analyses;
** database management;
** data validation;
** audits;
** corrective actions.
* Identify the individuals or organizations who are the data users and the
decision-makers. Be sure to include data users who are outside of the organization
generating the data, but for whom the data are nevertheless intended, e. g., toxicologists,
community relations specialists, etc.
* Provide a concise organization chart showing the relationships and the lines of
communication among the preceding entities. If possible, the project quality assurance
manager should be independent of the unit generating the data. Do not include senior
officials, such as corporate managers or agency administrators, who are nominally but
not functionally involved in data generation, data use, or decision-making. Where direct
contact between project managers and data users does not occur (e.g., between a project
consultant for a Potentially-Responsible Party and EPA risk assessment staff) the
organization chart should show the route by which information is exchanged.
MEASUREMENT PROCEDURES
4. SAMPLE COLLECTION
The defensibility of data is dependent on the use of well-defined, accepted sampling
procedures. Data comparability is ensured when each sampling event in the project is carried
out in the same manner by all sampling personnel.
* Describe the following aspects of the project sampling design:
** techniques or guidelines to be followed in selecting sampling points and
frequencies, well installation design when applicable, and sampling
equipment. When field screening techniques will be used to identify samples
for laboratory analysis, describe the criteria for sample selection;
** preparation and decontamination of sampling equipment, including
disposal of decontamination by-products;
10
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 5 of 10
*"* selection and preparation of sample containers, sample volumes, preserva-
tion methods, and maximum sample holding times to sample extraction
and/or analysis. A tabular presentation format is recommended;
00 procedures for collecting samples;
00 provisions for sample handling and shipment, taking into account the na-
ture of the samples and the maximum allowable sample holding times before
extraction or analysis.
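An illustrative layout for the recommended table is sketched below. The entries are
typical of EPA requirements of this period and are shown only as an example; they must
be confirmed against the methods actually selected for the project.

    Parameter            Container                  Preservation        Max. Holding Time
    Volatile organics,   40-mL glass vial with      Cool, 4 C; HCl      14 days to analysis
    water                Teflon-lined septum        to pH < 2
    Metals (except Hg),  1-L polyethylene bottle    HNO3 to pH < 2      6 months to analysis
    water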
Fund-Lead Projects: this information is contained in the site sample plan(s), and
may be addressed by reference.
5. SAMPLE CUSTODY
The defensibility of data, especially those which may be used as legal evidence, requires
proof that they were properly generated. Implementing proper sample chain-of-custody
procedures should ensure that custody is documented for every step in the handling of
the sample, from collection through analysis. Samples and evidence files (including
original laboratory reports) must be maintained in the custody of authorized personnel,
or under documented control in a secure area. When legal chain-of-custody is needed,
procedures should be consistent with NEIC guidelines (8). A sample is considered to be
in custody if:
* it is in one's actual physical possession or view,
* it is in one's physical possession and has not been tampered with, i. e., under
lock or official seal,
* it is retained in a secured area with restricted access, or
* it is placed in a container and secured with an official seal such that the sample
cannot be reached without breaking the seal.
* Describe the following provisions for sample custody, in both the field and the
laboratory:
** Forms, notebooks and procedures to record the exact location and ambient
conditions associated with sample collection, possession and analysis. In the
laboratory, a sample custody log, consisting of serially-numbered
sample-tracking report forms, should be maintained.
** Examples of sample documentation forms, such as sample labels, custody
seals, and chain-of-custody forms.
** Labeling procedures and information entered on the forms, including
sample preservation, if any, and dates and times of sample transfer and
analysis.
** Procedures for transferring and maintaining custody of samples. Designate
a laboratory sample custodian who is authorized to sign for incoming
samples, obtain shipping documents, and verify the data and sample custody
records.
11
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 6 of 10
6. ANALYTICAL AND QUALITY CONTROL PROCEDURES
Appropriate field and laboratory analytical procedures and quality control checks are
selected to meet the DQO specifications stated in the Plan, and to demonstrate the data quality.
The data quality can then be measured against the previously-established DQOs. The following
are the minimum considerations:
Analytical Procedures and Detection or Quantitation Limits:
* EPA-approved procedures must be used whenever possible; these procedures
contain documented method performance information and assessment criteria.
Methods should contain the following information: sample preparation
procedures, analytical and QC procedures and criteria, verification of results,
method performance data (precision and accuracy), and operator qualifications.
* The achievable detection limits or quantitation limits stated in the selected
methods must be adequate for valid comparisons of analytical results against any
action levels or standards.
* To maintain data comparability, each analytical procedure, once selected from
among the acceptable options, should be used throughout the project, barring
difficulties which endanger the validity of the data.
Calibration and Preventive Maintenance. The accuracy of scientific measurements
requires that instruments function properly. This is verified by regular
calibration and maintenance. Logbooks should be maintained for the major field
and laboratory instrumentation, to document servicing, maintenance, and
instrument modifications.
Internal Quality Control Checks and Corrective Actions. Quality control checks of
field and laboratory sampling and analysis serve two purposes: to document the
data quality, and to identify areas of weakness within the measurement process
which need correction. A program of periodic internal quality control checks is
needed to support the field and laboratory measurements. The extent of the
program should reflect the data quality needs and intended data uses.
Data Calculations and Reporting Units. A data reduction scheme states the equa-
tions used to calculate the value of the measured parameters and the reporting
units. These must be compatible with the intended data uses.
Documentation and Deliverables. Laboratory documentation and reporting
deliverables are specified so that information is available to determine the quality
and usability of the data.
12
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 7 of 10
* For each field or laboratory measurement, or group of analytes to be measured by a
single analytical method, outline the analytical and quality control procedures using the
following format:
Analytes: List the specific analytes to be measured.
Sample Matrices: State the sample matrices and anticipated sample
concentrations.
Analytical Procedure and Detection or Quantitation Limits: Provide the
method reference number or attach a copy of the method. The method
selected must be directly applicable, as written, to all analytes and matrices;
if not, modifications to the method must be proposed. If the method includes
optional procedures, those selected should be identified. Provide or cite
documentation of the method precision and accuracy. Identify any potential
analytical interferences, or other method limitations, and describe how these
will be treated by the laboratory. Specify the required quantitation or
detection limits for each analyte.
Calibration Procedure and Criteria: Reference the sections of the method
describing the minimum instrument calibration (including tuning of the mass
spectrometers), or provide the appropriate procedures. For both initial and
continuing calibrations, state the frequency, number of calibration points,
and the calibration range and traceability of standards. Also state the
quality control criteria and acceptance limits which indicate the system is
calibrated.
Preventive Maintenance: Document the measures, including inspection,
testing, and preventive maintenance procedures and critical spare parts, to
assure that field and laboratory equipment function optimally with minimal
downtime. Describe any contingency plans, e. g., equipment backups, in case
of equipment failure. For each major piece of field and laboratory
equipment, summarize the preventive maintenance program in a table.
Internal Quality Control Checks and Corrective Action: List the required
quality control (QC) checks, such as matrix spikes, duplicates, blanks,
laboratory control samples, surrogates, second column confirmation, etc. State
the frequency of analysis for each type of QC check, and the spike
compounds and levels. State or reference the required control limits for each
QC check and corrective action required when control limits are exceeded.
Calculations and Reporting Units: State the required reporting units,
and state or reference the required calculations. For solid sample analyses,
indicate whether results are reported on a dry or wet weight basis (a sample
dry-weight conversion is shown below). Also indicate whether moisture or
solids content is needed.
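As an illustration of such a calculation (hypothetical values): a result of 5.0 mg/kg
reported on a wet-weight basis, for a sample containing 80% solids, converts to a
dry-weight basis as

    C(dry) = C(wet) x 100 / (% solids) = 5.0 x 100 / 80 = 6.25 mg/kg

which is why the moisture or solids content must be reported along with the result.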
13
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 8 of 10
Documentation and Deliverables: Itemize the information and records which
must be included in a data report package, and specify the reporting format,
if desired. Documentation can include raw data, instrument printouts, and
results of calibration and QC checks. Specify the laboratory data reporting
turnaround time.
Fund-Lead Projects: preventive maintenance policies stated in the QA Program
Plan may be cited. If no additional requirements apply, this should be stated.
Any site-specific requirements should be specified in the Plan. The remaining
components of this element are also required in the sample plans, and may be
addressed by reference to the plan(s).
Enforcement-Lead Projects: consult EPA guidance (EPA, 1988) before preparing
this element.
QUALITY ASSURANCE MANAGEMENT
The value of data for achieving well-founded decisions rests upon two components:
scientific validity, and integrity. The degree of validity of data is characterized by
comparing the analytical and QC results to defined scientific criteria. The integrity of
the data is maintained by observing procedures designed to prevent errors and loss of
data during manipulation and transfer.
7. DATA QUALITY MANAGEMENT
* Outline the project data management scheme, tracing the path of the data, beginning
from receipt from the field or laboratory, to the use or storage of the final reported form.
Describe the standard record-keeping procedures, document control system, and the means
of data storage and retrieval. Include the control mechanism for detecting and correcting
paperwork errors, and preventing loss of data, during data reduction (i.e., calculations),
data reporting, and data entry to forms, reports, and databases. Provide examples of any
forms or checklists to be used.
* State the criteria used by the project team to review and validate -- that is, accept,
reject, or qualify -- data, in an objective and consistent manner. Provide examples of
any forms or checklists to be used. Describe how the results are conveyed to data users.
The review of data can include checks of the following: transmittal errors, field and
laboratory quality control data, detection limits, instrument calibration, special sampling
or analysis conditions, performance and system audits, and statistical data treatments,
such as tests for outliers.
Fund-Lead Projects: If this element is addressed in a QA Program Plan, cite the
applicable section, and state any site-specific requirements.
14
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 9 of 10
* Identify the procedures used to assess precision, accuracy, and completeness for the
project data. For each major measurement parameter, state the equations for calculating
precision, accuracy, and completeness, and the methods to be used to gather data for the
precision and accuracy calculations. Describe any statistical or other treatments to be
used. (Commonly-used forms of these equations are illustrated below.)
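For illustration, commonly-used forms of these equations are sketched here; the forms
actually adopted should match the QC checks and units specified earlier in the Plan.

    Precision (relative percent difference, for duplicate results x1 and x2):
        RPD = |x1 - x2| / [(x1 + x2)/2] x 100%

    Accuracy (percent recovery, for a matrix spike):
        %R = (spiked sample result - unspiked sample result) / (amount spiked) x 100%

    Completeness:
        C = (number of valid measurements / number of measurements planned) x 100%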
8. QA OVERSIGHT
A. Performance and System Audits
An audit assesses the capability and performance of a measurement system, or its
components, and identifies problems which warrant correction. Two types of audits may
be conducted: the systems audit, which verifies adherence to standard operating
procedures and quality assurance policies, and the performance audit, which measures
the ability to achieve measurement data which are comparable to a standard of
reference. The audit is conducted by individuals who are not directly involved in the
measurement process.
The systems audit consists of on-site evaluation of the physical facilities, equipment,
and personnel of a measurement system, to determine their proper selection and use, and
observation of the measurement, quality control, and documentation procedures. A
systems audit is recommended prior to or shortly after a system is operational, to
confirm the system's readiness. During the lifetime of the project, systems audits, or
technical audits of system components (field or laboratory), are conducted on a
regularly-scheduled basis.
The performance audit is conducted periodically to determine the accuracy of the
measurement system or its components. Laboratory analysis of performance evaluation
samples and participation in inter-laboratory performance evaluation studies may be part of
the performance audit process.
In support of performance audits, EPA provides audit materials and devices, conducts
regularly-scheduled inter-laboratory performance evaluation studies, and provides guidance
and assistance in the conduct of systems audits. The Regional QAO may be contacted to make
arrangements for assistance in these areas.
* Include a schedule or frequency for conducting systems and performance audits for
each major measurement parameter. Describe the auditing protocols and criteria, and the
provisions for reporting and follow-up. Provide examples of any forms or checklists to
be used.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains this information. Describe any site-specific
requirements in the Plan.
15
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 10 of 10
B. Corrective Action
The ability to quickly detect and correct a problem may lessen the potential impact of
the problem on the project.
* Describe the mechanism for identifying any system deficiencies, tracing the source,
planning and implementing corrective actions, and documenting problem resolution.
Identify the chain of command through which corrective actions and follow-ups are
initiated and approved.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains this information, and describe any site-specific
requirements in the Plan.
C. Quality Assurance Reports to Management
A basis for timely and effective response to problems is established by developing and
maintaining QA reporting and feedback channels to management.
* Identify the frequency, content, and distribution of reports issued to inform
management of the following:
** status of the project;
** results of performance and system audits;
** results of periodic data quality assessments;
** significant quality assurance problems and recommended solutions.
* Identify the responsible unit which will prepare the report, and the recipients of the
report.
* Include a provision for summarizing data quality information in a separate QA section
in the final project report.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains the information, and describe any site-specific
requirements in the Plan.
16
-------
Section No. Definitions
Revision No. 0
Date Sept 1989
Page 1 of 1
DEFINITIONS
Accuracy: The degree of agreement of a measured value with an accepted reference or
true value. Accuracy can be expressed numerically as the absolute value of the difference
between a measured and a reference or true value, or as the ratio of the difference
expressed as a percentage of this value. (Although "accuracy" is a misnomer as defined
here, these are the commonly-accepted definitions.)
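For example (illustrative values): a measured result of 95 mg/kg against a true value of
100 mg/kg corresponds to a difference of 5 mg/kg, or 5% of the true value.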
Comparability: The confidence with which one set of data can be compared to another.
Comparability is dependent upon consistency in sampling conditions and selection of
sampling procedures, sample preservation methods, analytical methods, and units of data
expression.
Completeness: Comparison of the number of valid data obtained from a measurement
effort to the total number needed to meet the project goals. Data completeness
incorporates the factors of sample loss and data acceptability, i.e., the data quality.
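For example (illustrative values): if 47 of the 50 measurements planned for a project
yield valid data, completeness is 47/50 x 100% = 94%.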
Data Reduction: The mathematical and/or statistical calculations used to convert raw data
to the reported data.
Data Validation: A systematic process for reviewing a body of data against a
pre-established set of criteria to determine the quality of the data.
Out-of-Control Data: Data which fall outside pre-established acceptance limits.
Performance Audit: Procedure used to independently collect measurement data and
quantitatively determine the accuracy of measurement data through the use of
performance evaluation samples.
Precision: A measure of agreement (reproducibility) among replicate measurements.
Precision can be expressed as the standard deviation, or when duplicate measurements
are performed, as the percent difference or relative percent difference.
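For example (illustrative values): duplicate results of 10.0 and 12.0 mg/L have a mean
of 11.0 mg/L, giving a relative percent difference of |10.0 - 12.0| / 11.0 x 100% = 18%.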
Quality Assurance: The total integrated program for the planning, acquisition, and review
of monitoring and measurement data, to meet user requirements.
Quality Control: The routine application of procedures for obtaining prescribed
standards of performance in the monitoring and measurement process.
Representativeness: Reliability with which a measurement or measurement system
reflects the true conditions under investigation. Representativeness is influenced by the
number and location of the sampling points, sampling timing and frequency in
monitoring efforts, and by the field and laboratory sampling procedures.
Systems Audit: A review of the data generation process, including on-site audits of the
field and laboratory operations.
17
-------
Section No. References
Revision No. 0
Date Sept 1989
Page 1 of 1
REFERENCES
1. EPA, 1989. Preparation of a U.S. EPA Region 9 Sample Plan for EPA-Lead Superfund
Projects. (9QA-5-89) Available from QAO, Region 9.
2. EPA, 1988. Documentation Requirements for Data Validation of Non-CLP Laboratory
Data for Organic and Inorganic Analyses. (9QA-7-89) Available from QAO, Region 9.
3. EPA, 1987. Data Quality Objectives for Remedial Response Activities. 2 Parts.
Development Process (EPA-540/G-87/003). Example Scenario (EPA-540/G-87/004).
Available in EPA Region 9 Library.
4. EPA, 1986. Development of Data Quality Objectives, Description of Stages I and II.
Quality Assurance Management Staff, EPA Headquarters. Available from QAO, Region
9.
5. EPA, 1984. Policy and Program Requirements to Implement the Mandatory Quality
Assurance Program. EPA Order 5360.1.
6. EPA, 1980. Interim Guidelines and Specifications for Preparing Quality Assurance
Project Plans, QAMS-005/80 (EPA 600/4-83-004). Quality Assurance Management Staff,
EPA Headquarters. Available from QAO, Region 9.
7. EPA, 1980. Guidelines and Specifications for Preparing Quality Assurance Program
Plans, QAMS-004/80. Quality Assurance Management Staff, EPA Headquarters. Available
from QAO, Region 9.
8. NEIC, 1986. NEIC Policies and Procedures Manual (EPA-330/9-78-001-R). NEIC
Office of Enforcement. Available in EPA Region 9 Library.
18
-------
US EPA REGION 9 GUIDANCE FOR
PREPARING QUALITY ASSURANCE PROJECT PLANS
FOR SUPERFUND REMEDIAL PROJECTS
(Document Control No. 9QA-03-89)
September, 1989
Quality Assurance Management Section
Environmental Services Branch
Office of Policy and Management
USEPA Region 9
-------
TABLE OF CONTENTS
Section Page Pages Revision Date
Introduction 1 3 0 Sept, 1989
General Guidelines 420 Sept, 1989
Preparation of a Quality Assurance Project Plan
I. Plan Identification 1
Title and Signature Pages 6 0 Sept, 1989
Table of Contents 6 0 Sept, 1989
II. Quality Assurance Elements . 11
Project Objectives and Organization
1. Project Description 7 0 Sept, 1989
2. Data Quality Objectives 8 0 Sept, 1989
3. Project Organization 10 0 Sept, 1989
Measurement Procedures
4. Sample Collection and Quality Control 10 0 Sept, 1989
5. Sample Custody 11 0 Sept, 1989
6. Analytical and Quality Control Procedures.. 12 . 0 Sept, 1989
Quality Assurance Management
7. Data Quality Management 15 0 Sept, 1989
8. Quality Assurance Oversight 15 0 Sept, 1989
Definitions 18 1 0 Sept, 1989
References 19 1 0 Sept, 1989
-------
Section No. Introduction
Revision No. 0
Date Sept 1989
Page 1 of 3
INTRODUCTION
Quality Assurance Policy
Environmental measurements are conducted with the goal of producing data which are
scientifically valid, are of known quality which meets the established objectives, and are legally
defensible if necessary. Environmental measurements include field or laboratory work
involving any of the following (6):
* measurement of chemical, physical, or biological parameters in the environment;
0 measurement of pollutants in waste streams;
0 health- and ecological-effect studies;
0 clinical and epidemiological investigations;
0 laboratory simulation of environmental events;
* studies or measurements on pollution transport, including diffusion models.
Quality assurance arises from the attitude of doing a job right the first time. Although
sometimes perceived as an activity apart from the mainstream, QA is inherent in the
measurement process; i.e., determining the needs to be met... thinking through the operations
... anticipating the potholes ... making contingency plans ... and demonstrating the quality of
the result. Upfront planning is important for controlling or accounting for the variables that
influence the quality of the measurement data. All members of a project planning team have
QA activities in their domains whether or not they are aware of it.
The Agency quality assurance policy states that a Quality Assurance Project Plan must
be developed and approved prior to every monitoring and measurement project or group of
similar projects (5.)
Quality Assurance Project Plan Guidelines
In 1980, the Quality Assurance Management Staff at EPA Headquarters (QAMS- HQ),
which oversees and supports the Agency's QA activities, identified the elements of a Quality
Assurance Project Plan for environmental data collection. The Plan consists of the specific
organization, quality assurance objectives, methodologies and operating procedures, and
QA/QC measures designed to achieve and document the data quality. The elements are
contained in the document entitled "Interim Guidelines and Specifications for Preparing
Quality Assurance Project Plans", QAMS-005/80 (6). These guidelines are the foundation
upon which the EPA Regions may develop regional requirements for Quality Assurance
Project Plans.
-------
Section No. Introduction
Revision No. 0
Date Sept 1989
Page 2 of 3
Data Quality Objectives Guidelines
QAMS-HQ issued "Development of Data Quality Objectives " (4) and a "Data Quality
Objectives Checklist (3), to illustrate a process for determining the appropriate level of effort
to reach a measurement goal. Data quality objectives (DQOs) are quantitative and qualitative
statements of the type of data needed to support a decision, based on the level of uncertainty
that a decision-maker is willing to accept and the resources available.
The involvement of data-users and decision-makers early in the planning process is
emphasized. The product of the DQO process should be quantitative statements of the
precision, accuracy, detection or quantitation level, and completeness which are the goals of
the measurement effort, and qualitative statements about the representativeness and
comparability. This information is similar to but more extensive than what was required by
the QAMS 005/80 guidance for QA project plans.
Quality Assurance Program Plan and Sampling and Analysis Plan
In addition to the Quality Assurance Project Plan, two other documents are used in
planning quality assurance. These are the Quality Assurance Program Plan and the Sampling
and Analysis Plan.
The Quality Assurance Program Plan describes the QA policies of an organization
which performs environmental measurements. A Quality Assurance Program Plan is required
from each organization which performs environmental measurements under contract to EPA.
The QA policies represent a commitment by the management, to allocate the time and
resources necessary to produce environmental data of the quality needed. The mechanisms
for carrying out the QA policies are described in the Plan, and include the following elements
(7):
* QA personnel and responsibilities;
* Selection, inspection, and maintenance of facilities, equipment, and services;
* Qualifications and training of technical personnel;
* Data management and data quality management;
* Audits;
0 Corrective action;
* Assessment of the QA program and reports to management.
Some QA activities which are the same for all sites can be described in the Quality
Assurance Program Plan.
-------
Section No. Introduction
Revision No. 0
Date Sept 1989
Page 3 of 3
The Sampling and Analysis Plan ("Sample Plan") is a document which is submitted for
each discrete sampling event. The Sample Plan functions as an operating manual for field
personnel, as well as for requesting laboratory services. Where the Quality Assurance Project
Plan identifies the anticipated methodologies in the project, the Sample Plan states in detail
the specific procedures selected. The Sample Plan (1) contains the following elements:
0 Objective;
* Background;
0 Maps;
* Rationale;
* Request for Analysis;
0 Field Methods and Procedures;
* Health and Safety Plan.
Regional Quality Assurance Project Plan Guidance
The QA Project Plan is the written product of the upfront planning and thought
process, providing all project participants with the same clear goals and guidelines.
The following is the Region 9 Guidance for developing QA Project Plans for Superfund
projects. It contains the elements of the QAMS-005/80 guidance, and the products from the
DQO process. Therefore, only this Guidance need be consulted in writing the Quality
Assurance Project Plan.
Acknowledgment
In addition to the QAMS-005/80 guidance, ideas (and in some cases, actual words)
used previously by other Agency personnel to explain quality assurance are credited for
shaping this guidance.
3
-------
Section No. Guidelines
Revision No. 0
Date Sept 1989
Page 1 of 2
GENERAL GUIDELINES
Throughout this document, guidelines intended for EPA contractors are denoted "For Fund- Lead
Projects." Guidelines intended for Potentially Responsible Parties and Federal Facilities are
denoted "For Enforcement-Lead Projects." This distinction is used to minimize documentation
efforts for EPA contractors following the standard QA planning practices in the Region.
Format. Organize the elements of the Plan according to the sequence listed in the Table of
Contents. This format generally follows a logical train of thought for planning the
measurement operations.
Document Control Information. Display the following information on each page:
Section No.
Revision No.
Date:
Page of
This is a tool for indexing pages and posting revisions to an approved plan (therefore, its use
is optional during plan preparation.) The "Revision Number" represents the most current
version, i. e., upon approval, the first version is "0". The other entries are self-explanatory. If
the Plan is subsequently revised, the index works this way:
Page Revisions. Change the Revision Number, and revise the date. Assign a page
number to any new pages; a new page inserted between pages 5 and 6, for example,
could be numbered page 5a.
New Document, If major alterations result in a new document, return the Revision
Number to "0" and revise the Date.
Standard Operating Procedures (SOPsX Many field and laboratory operations can be stan-
dardized and written as Standard Operating Procedures, for incorporation by reference into
the Quality Assurance Project Plan. Examples are: sampling site selection, sampling and
analytical methodology, storage containers, sample preservatives, special precautions,
instrument selection and use, calibration, maintenance, QC procedures, documentation,
document control, sample custody procedures, data handling procedures, and measurement of
precision, accuracy, and completeness. The use of SOPs by reference is described below.
Documentation of Procedures. Include complete descriptions of all anticipated procedures -
sampling, analysis, data reduction and validation, etc. •- either directly or by reference. If a
reference contains several alternative procedures, specify which one(s) apply, or how a
selection will be made.
4
-------
Section No. Guidelines
Revision No. 0
Date Sept 1989
Page 2 of 2
Generally-recognized reference procedures (e. g., EPA reference methods) should be
cited by number or name. Provide the complete citation in a footnote or on a separate
reference list.
Standard operating procedures (SOPs) and published procedures not approved by EPA
should be briefly summarized. The sources should be cited, and the referenced section
of the document submitted with the Plan.
New or unpublished procedures, and modifications to published procedures or SOPs
should be submitted, with the rationale for use.
Procedures in preparation are procedures which depend on information, perhaps initial
sampling results, which does not exist at the time the Plan is prepared. In the Plan,
describe the situation, and the conditions under which the procedure will be submitted
for review.
Fund-Lead Projects: state the procedures in the Sample Plan.
Enforcement-Lead Projects: when a procedure is finally developed, submit it for
EPA review. After approval, attach it to the Plan as an addendum.
A laboratory QA manual may be used to address a given Plan element only if it contains
sufficiently detailed and specific
information which can be applied definitively to the project, provide the specific excerpt, and
state how it addresses that element for the analyses pertinent to the project.
Fund-Lead Projects: when the Contract Laboratory Program Routine Analytical Services
(CLP-RAS) are utilized, the following procedures are pre- established by the CLP, and
should be so cited:
* laboratory sample custody;
* analytical procedures;
* laboratory instrument calibration & maintenance;
* laboratory QC checks and criteria;
* laboratory data reduction and reporting;
* laboratory audits;
* sample documentation forms
To avoid transcription errors from re-typing a list, table, or excerpt from an existing
document, simply cite the reference if it is widely-distributed, or photocopy the
information.
Plan Content. If a particular QA element, or portion thereof, is not relevant to the project,
include in its place a brief explanation of why this is so.
-------
Section No. I
Revision No. 0
Date Sept 1989
Page 1 of
PREPARATION OF A QUALITY ASSURANCE PROJECT PLAN
I. PLAN IDENTIFICATION
TITLE AND SIGNATURE PAGE(S)
Include, at minimum:
* Title of the Plan
* Name of organization(s) implementing the project
0 Names, titles, signatures of approving officials and approval dates, for:
Organization's Project Manager
Organization's Quality Assurance Officer
EPA Remedial Project Manager
EPA Regional Quality Assurance Officer
Others, as needed
TABLE OF CONTENTS
List the sections, figures, tables, appendices, the number of pages in each section, the revision
number, and the date (the table of contents in this guidance illustrates the format.) Following
the table of contents, provide a distribution list of the individuals who will receive copies of
the plan and any subsequent revisions. Include all managers responsible for implementing the
plan as well as the EPA Regional QAO, on this list.
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 1 of 11
II. QUALITY ASSURANCE ELEMENTS
PROJECT OBJECTIVES AND ORGANIZATION
Before a QA project plan is written, interaction must occur among the decision-makers,
data users, and technical staff. Laboratory personnel are consulted regarding analytical method
options. The result is an understanding of the need for new data and the expectations associated
with it.
This understanding is defined in terms of data quality objectives, which are quantitative
statements of the precision, accuracy, detection or quantitation level, and completeness goals of the
measurement effort, and qualitative statements about the representativeness and comparability.
The DQO statements contain all of the information required by the project technical staff to
unambiguously proceed with designing the project. The lines of communication set up at the onset
of the project should be maintained throughout the project, so the initial DQO estimates may be
adjusted to keep pace with incoming information.
The sections of the QA Project Plan entitled Project Description, Project Organization, and
Data Quality Objectives contain the DQO statements, which determine the suitable sampling,
analytical, and QA/QC protocols described later in the Plan.
1. PROJECT DESCRIPTION
The Project Description lays the groundwork for the DQOs by establishing the objectives of
the measurement project, the data needed, the intended uses and the data users, and the strategy
for achieving the objectives.
A. Objective and Scope
* Why is the project needed?
Briefly summarize the site background -- history of site use, reason for environmental
concern, and general conclusions of any relevant previous studies, including matrices
and substances of interest and approximate concentration levels. Describe the
adequacy of the existing data, which requires the collection of new data.
B. Data Usage
* Delineate the scope of the project, i. e., the domain (geographical locale, environmental
medium, time period, etc.) over which conclusions and decisions will apply.
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 2 of 11
* State the time, resource, or other constraints on the measurement project.
* What data are needed, and how will they be used? List or explain the following:
08 the intended uses of the data, in order of importance;
** the decisions to be made for which data are needed;
08 the users of the data and the decision-makers.
C. Experimental Design and Rationale
0 What is the design of the project?
*"> Outline in general terms the experimental design of the project and the
anticipated project activities, including the sampling network design, sampling
frequencies, sample matrices, measurement parameters of interest, and the
rationale for the design. The measurement parameters include field
measurements and any hydrogeological investigations (such as particle-size
analysis.)
00 Provide a project schedule or a sequence of milestones and their expected
durations. If individual sampling plans will be developed for discrete project
phases, include their preparation schedule.
2. DATA QUALITY OBJECTIVES (DOOS^ FOR MEASUREMENT DATA
An environmental measurement effort is worth doing only if it produces useful information.
The quality of the data needed to meet the project objectives determines the choice of sampling
and analytical methods, and quality assurance and quality control procedures. Therefore, DQOs
must be clearly defined prior to defining the remaining elements of the QA project plan. Without
first defining DQOs, a QA program can only be used to document the quality of data obtained,
not to ensure that the quality is sufficient (4.)
One approach for developing DQOs is suggested in EPA guidance (3.) However, the level
of effort devoted to developing DQOs should be appropriate to the size of the data collection
activity. It is important that a cooperative effort be undertaken by the project manager and
sampling and analytical personnel, so that DQOs are developed based on the intended data uses
as well as the sampling and analytical capabilities.
* Consider the prioritized data uses and decisions stated in the Project Description. If
possible, prepare tables and lists of the following information, from the combined inputs of
decision-makers, data users, and project design staff:
8
-------
Section No. II
Revision No. 0
Date Sept 1989
Page 3 of 11
tttf The data needed: measurement parameters, compounds,'and sample matrices.
** The action levels or standards upon which decisions will be made, including the data
reporting units. Cite the source(s) of this information.
** The summary statistic(s), e.g., mean, maximum, range, etc., which specify the form
the data will be in when compared against action levels or standards, and the reason
for the selection.
** The acceptable level of confidence in the data needed for the stated purposes, or
the acceptable amount of uncertainty. One way of estimating uncertainty is to sum the
probabilities of committing the major types of measurement errors.
* Tabulate the quantitative precision, accuracy, and completeness goals for each major
measurement parameter (including all pollutant measurements), based on the DQO statements
(an illustrative tabulation appears at the end of this section).
** The numerical goals should be for the total measurement, if possible, or for the field
and laboratory components separately. In the event there is no basis for defining data
quality goals for the project, goals may be estimated based on prior knowledge of the
measurement system, and on method validation studies (using replicates, spikes,
standards, recovery studies, etc.). Explain the circumstances under which these goals
were established.
** If defining numerical goals is not relevant for certain measurements, indicate this
and state the reason.
** Identify any sample types, such as control or background samples, which require
100% completeness.
** State the units of expression of the precision and accuracy goals; these should
correspond to the methods selected to assess data precision and accuracy, described later
in the Plan.
* State the goals of achieving data representativeness and comparability, and the planning
considerations for attaining these goals (some examples follow). Unlike precision, accuracy,
and completeness, these objectives are not expressed or assessed quantitatively. Data
representativeness is reflected in the site sampling layout (sampling locations, frequencies, and
timing) and the field and laboratory sampling and analytical scheme. Data comparability is
dependent upon consistency in sampling conditions, selection of sampling procedures, sample
preservation methods, analytical methods, and data reporting units, throughout the project.
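For illustration only, such a tabulation might take the following form. The parameters,
limits, and percentages shown here are hypothetical placeholders, not Regional requirements;
actual goals must be derived from the project DQOs and the selected methods:

Measurement Parameter (Matrix)    Precision          Accuracy             Completeness
Volatile organics (water)         RPD <= 30%         70 - 130% recovery   90%
Metals (soil)                     RPD <= 35%         75 - 125% recovery   90%
pH, field measurement (water)     +/- 0.1 pH unit    +/- 0.1 vs. buffer   95%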
3. PROJECT ORGANIZATION
A project is more likely to succeed if its operations are coordinated. It is essential that all
individuals be clearly aware of the entire project organization, not just their own functional areas.
* Identify the individuals or organizations, including EPA managers, who are directly
responsible for the following areas of the project. Include a brief description of duties.
** project management;
** overall quality assurance;
** field activities (including training of field personnel, sample collection and field
measurements, and quality control);
** laboratory analyses;
** database management;
** data validation;
** audits;
** corrective actions.
* Identify the individuals or organizations who are the data users and the decision-makers.
Be sure to include data users who are outside of the organization generating the data, but for
whom the data are nevertheless intended, e.g., toxicologists, community relations specialists,
etc.
* Provide a concise organization chart showing the relationships and the lines of com-
munication among the preceding entities. If possible, the project quality assurance manager
should be independent of the unit generating the data. Do not include senior officials, such as
corporate managers or agency administrators, who are nominally but not functionally involved
in data generation, data use, or decision-making. Where direct contact between project
managers and data users does not occur (e.g., between a project consultant for a Potentially
Responsible Party and EPA risk assessment staff), the organization chart should show the
route by which information is exchanged.
MEASUREMENT PROCEDURES
4. SAMPLE COLLECTION
The defensibility of data is dependent on the use of well-defined, accepted sampling
procedures. Data comparability is ensured when each sampling event in the project is carried out
in the same manner by all sampling personnel.
Describe the following aspects of the project sampling design:
** techniques or guidelines to be followed in selecting sampling points and frequencies,
well installation design when applicable, and sampling equipment. When field
screening techniques will be used to identify samples for laboratory analysis, describe
the criteria for sample selection;
** preparation and decontamination of sampling equipment, including disposal of
decontamination by-products;
** selection and preparation of sample containers, sample volumes, preservation
methods, and maximum sample holding times prior to sample extraction and/or analysis.
A tabular presentation format is recommended (an illustrative layout follows this list);
** procedures for collecting samples;
** provisions for sample handling and shipment, taking into account the nature of the
samples and the maximum allowable sample holding times before extraction or analysis.
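By way of illustration, such a table might be organized as shown below. The entries are
hypothetical placeholders; actual containers, preservatives, and holding times must be taken
from the analytical methods selected for the project:

Parameter (Matrix)           Container              Preservation       Max. Holding Time
Volatile organics (water)    40-mL glass vial       Cool, 4 deg C      (per method)
Metals (water)               1-L polyethylene       HNO3 to pH < 2     (per method)
Semivolatile organics (soil) 250-mL amber glass     Cool, 4 deg C      (per method)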
Fund-Lead Projects: this information is contained in the site sample plan(s), and may be
addressed by reference.
5. SAMPLE CUSTODY
The defensibility of data, especially those which may be used as legal evidence, requires
proof that they were properly generated. Implementing proper sample chain-of-custody procedures
should ensure that custody is documented for every step in the handling of the sample, from
collection through analysis. Samples and evidence files (including original laboratory reports) must
be maintained in the custody of authorized personnel, or under documented control in a secure
area. When legal chain-of-custody is needed, procedures should be consistent with NEIC
guidelines (8). A sample is considered to be in custody if:
* it is in one's actual physical possession or view;
* it is in one's physical possession and has not been tampered with, i.e., under lock or
official seal;
* it is retained in a secured area with restricted access; or
* it is placed in a container and secured with an official seal such that the sample cannot
be reached without breaking the seal.
Describe the following provisions for sample custody, in both the field and the laboratory:
** Forms, notebooks, and procedures to record the exact location and ambient
conditions associated with sample collection, possession, and analysis. In the laboratory,
a sample custody log, consisting of serially-numbered sample-tracking report forms,
should be maintained (a sketch of such a record follows this element).
** Examples of sample documentation forms, such as sample labels, custody seals, and
chain-of-custody forms.
** Labeling procedures and information entered on the forms, including sample
preservation, if any, and dates and times of sample transfer and analysis.
** Procedures for transferring and maintaining custody of samples. Designate a
laboratory sample custodian who is authorized to sign for incoming samples, obtain
shipping documents, and verify the data and sample custody records.
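As a minimal sketch, the information such a serially-numbered sample-tracking record might
capture is outlined below in Python; the field names and structure are illustrative
assumptions, not a prescribed format:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class CustodyTransfer:
        # One documented hand-off in the chain of custody.
        relinquished_by: str
        received_by: str
        date_time: str        # date and time of transfer
        seal_intact: bool     # condition of the custody seal at transfer

    @dataclass
    class SampleRecord:
        # Serially-numbered sample-tracking record (illustrative fields).
        tracking_number: int          # serial number of the report form
        sample_id: str
        collection_location: str
        collection_date_time: str
        collector: str
        preservation: str             # preservative used, if any
        ambient_conditions: str       # conditions at sample collection
        transfers: List[CustodyTransfer] = field(default_factory=list)

        def custody_documented(self) -> bool:
            # Custody is complete only if every hand-off was recorded.
            return all(t.relinquished_by and t.received_by and t.date_time
                       for t in self.transfers)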
6. ANALYTICAL AND QUALITY CONTROL PROCEDURES
Appropriate field and laboratory analytical procedures and quality control checks are
selected to meet the DQO specifications stated in the Plan, and to demonstrate the data quality.
The data quality can then be measured against the previously-established DQOs. The following
are the minimum considerations:
Analytical Procedures and Detection or Quantitation Limits:
* EPA-approved procedures must be used whenever possible; these procedures contain
documented method performance information and assessment criteria. Methods should
contain the following information: sample preparation procedures, analytical and QC
procedures and criteria, verification of results, method performance data (precision and
accuracy), and operator qualifications.
* The achievable detection limits or quantitation limits stated in the selected methods
must be adequate for valid comparisons of analytical results against any action levels or
standards.
* To maintain data comparability, each analytical procedure, once selected from among
the acceptable options, should be used throughout the project, barring difficulties which
endanger the validity of the data.
Calibration and Preventive Maintenance. The accuracy of scientific measurements requires
that instruments function properly. This is verified by regular calibration and maintenance.
Logbooks should be maintained for the major field and laboratory instrumentation, to
document servicing, maintenance, and instrument modifications.
Internal Quality Control Checks and Corrective Actions. Quality control checks of field
and laboratory sampling and analysis serve two purposes: to document the data quality,
and to identify areas of weakness within the measurement process which need correction.
A program of periodic internal quality control checks is needed to support the field and
laboratory measurements. The extent of the program should reflect the data quality needs
and intended data uses.
Data Calculations and Reporting Units. A data reduction scheme states the equations used
to calculate the value of the measured parameters and the reporting units. These must be
compatible with the intended data uses.
Documentation and Deliverables. Laboratory documentation and reporting deliverables are
specified so that information is available to determine the quality and usability of the data.
* For each field or laboratory measurement, or group of analytes to be measured by a single
analytical method, outline the analytical and quality control procedures using the following
format:
Analytes: List the specific analytes to be measured.
Sample Matrices: State the sample matrices and anticipated sample concentrations.
Analytical Procedure and Detection or Quantitation Limits: Provide the method
reference number or attach a copy of the method. The method selected must be
directly applicable, as written, to all analytes and matrices; if not, modifications to the
method must be proposed. If the method includes optional procedures, those selected
should be identified. Provide or cite documentation of the method precision and
accuracy. Identify any potential analytical interferences, or other method limitations,
and describe how these will be treated by the laboratory. Specify the required
quantitation or detection limits for each analyte.
Calibration Procedure and Criteria: Reference the sections of the method describing
the minimum instrument calibration (including tuning of the mass spectrometers), or
provide the appropriate procedures. For both initial and continuing calibrations, state
the frequency, number of calibration points, and the calibration range and traceability
of standards. Also state the quality control criteria and acceptance limits which
indicate the system is calibrated.
Preventive Maintenance: Document the measures, including inspection, testing, and
preventive maintenance procedures and critical spare parts, to assure that field and
laboratory equipment function optimally with minimal downtime. Describe any
contingency plans, e.g., equipment backups, in case of equipment failure. For each
major piece of field and laboratory equipment, summarize the preventive maintenance
program in a table.
Internal Quality Control Checks and Corrective Action: List the required quality
control (QC) checks, such as matrix spikes, duplicates, blanks, laboratory control
samples, surrogates, second column confirmation, etc. State the frequency of analysis
for each type of QC check, and the spike compounds and levels. State or reference the
required control limits for each QC check, and the corrective action required when control
limits are exceeded (a brief sketch of such a check follows this outline).
Data Calculations and Reporting Units: State the required reporting units, and state or
reference the required calculations. For solid sample analyses, indicate whether results
are reported on a dry or wet weight basis, and whether moisture or solids content is
needed (an illustrative calculation appears at the end of this section).
Documentation and Deliverables: Itemize the information and records which must be
included in a data report package, and specify the reporting format, if desired.
Documentation can include raw data, instrument printouts, and results of calibration
and QC checks. Specify the laboratory data reporting turnaround time.
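The brief sketch below illustrates the kind of control-limit check described under Internal
Quality Control Checks above: a matrix spike recovery and a duplicate RPD are compared
against control limits, and exceedances are flagged for corrective action. The function
names and the limits shown are illustrative assumptions, not prescribed values:

    # Hypothetical control limits -- actual limits come from the analytical
    # method or are stated in the Plan.
    RECOVERY_LIMITS = (75.0, 125.0)   # acceptable matrix spike recovery, %
    RPD_LIMIT = 30.0                  # maximum acceptable duplicate RPD, %

    def percent_recovery(spiked, unspiked, spike_added):
        # Matrix spike recovery as a percentage of the amount spiked.
        return 100.0 * (spiked - unspiked) / spike_added

    def relative_percent_difference(x1, x2):
        # RPD between duplicate results, as a percentage of their mean.
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    def qc_exceedances(spiked, unspiked, spike_added, dup1, dup2):
        # Return the list of QC failures that require corrective action.
        problems = []
        recovery = percent_recovery(spiked, unspiked, spike_added)
        if not RECOVERY_LIMITS[0] <= recovery <= RECOVERY_LIMITS[1]:
            problems.append("Spike recovery %.1f%% outside control limits" % recovery)
        rpd = relative_percent_difference(dup1, dup2)
        if rpd > RPD_LIMIT:
            problems.append("Duplicate RPD %.1f%% exceeds limit" % rpd)
        return problems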
Fund-Lead Projects: preventive maintenance policies stated in the QA Program Plan may
be cited. If no additional requirements apply, this should be stated. Any site-specific
requirements should be specified in the Plan. The remaining components of this element
are also required in the sample plans, and may be addressed by reference to the plan(s).
Enforcement-Lead Projects: consult EPA guidance (EPA, 1988) before preparing this
element.
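As a further illustration of the data calculations element, the conversion of a solid-sample
result to a dry-weight basis might take the following form. This is a sketch with an assumed
function name; the specific equation used should be stated or referenced in the Plan:

    def dry_weight_concentration(wet_weight_conc, percent_solids):
        # Convert a concentration reported per wet weight of sample
        # (e.g., mg/kg wet) to a dry-weight basis, using the measured
        # percent solids of the sample.
        return wet_weight_conc / (percent_solids / 100.0)

    # Example: 5.0 mg/kg (wet) at 80% solids reports as 6.25 mg/kg (dry).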
QUALITY ASSURANCE MANAGEMENT
The value of data for achieving well-founded decisions rests upon two components:
scientific validity, and integrity. The degree of validity of data is characterized by comparing the
analytical and QC results to defined scientific criteria. The integrity of the data is maintained by
observing procedures designed to prevent errors and loss of data during manipulation and transfer.
7. DATA QUALITY MANAGEMENT
* Outline the project data management scheme, tracing the path of the data from receipt
from the field or laboratory through use or storage in the final reported form. Describe
the standard record-keeping procedures, document control system, and the means of data
storage and retrieval. Include the control mechanisms for detecting and correcting paperwork
errors, and preventing loss of data, during data reduction (i.e., calculations), data reporting,
and data entry to forms, reports, and databases. Provide examples of any forms or checklists
to be used.
* State the criteria used by the project team to review and validate - that is, accept, reject, or
qualify - data, in an objective and consistent manner. Provide examples of any forms or
checklists to be used. Describe how the results are conveyed to data users. The review of
data can include checks of the following: transmittal errors, field and laboratory quality
control data, detection limits, instrument calibration, special sampling or analysis conditions,
performance and system audits, and statistical data treatments, such as tests for outliers.
Fund-Lead Projects: If this element is addressed in a QA Program Plan, cite the
applicable section, and state any site-specific requirements.
* Identify the procedures used to assess precision, accuracy, and completeness for the project
data. For each major measurement parameter, state the equations for calculating precision,
accuracy, and completeness, and the methods to be used to gather data for the precision and
accuracy calculations. Describe any statistical or other treatments to be used.
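A minimal sketch of such calculations, consistent with the Definitions section of this
guidance, follows; the function names are illustrative assumptions, and the specific
equations chosen for the project should be stated in the Plan:

    import statistics

    def percent_completeness(valid_results, planned_results):
        # Completeness: valid data obtained as a percentage of the data needed.
        return 100.0 * valid_results / planned_results

    def precision_stdev(replicates):
        # Precision: reproducibility among replicate measurements,
        # expressed here as the sample standard deviation.
        return statistics.stdev(replicates)

    def percent_accuracy(measured, true_value):
        # Accuracy: the measured value expressed as a percentage of the
        # accepted reference or true value.
        return 100.0 * measured / true_value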
8. QA OVERSIGHT
A. Performance and System Audits
An audit assesses the capability and performance of a measurement system, or its com-
ponents, and identifies problems which warrant correction. Two types of audits may be conducted:
the systems audit, which verifies adherence to standard operating procedures and quality assurance
policies, and the performance audit, which measures the ability to achieve measurement data
which are comparable to a standard of reference. The audit is conducted by individuals who are
not directly involved in the measurement process.
The systems audit consists of on-site evaluation of the physical facilities, equipment, and
personnel of a measurement system, to determine their proper selection and use, and observation
of the measurement, quality control, and documentation procedures. A systems audit is
recommended prior to or shortly after a system is operational, to confirm the system's readiness.
During the lifetime of the project, systems audits, or technical audits of system components (field
or laboratory), are conducted on a regularly-scheduled basis.
The performance audit is conducted periodically to determine the accuracy of the
measurement system or its components. Laboratory analysis of performance evaluation samples
and participation in inter-laboratory performance evaluation studies may be part of the
performance audit process.
In support of performance audits, EPA provides audit materials and devices, conducts
regularly-scheduled inter-laboratory performance evaluation studies, and provides guidance and
assistance in the conduct of systems audits. The Regional QAO may be contacted to make
arrangements for assistance in these areas.
* Include a schedule or frequency for conducting systems and performance audits for each
major measurement parameter. Describe the auditing protocols and criteria, and the
provisions for reporting and follow-up. Provide examples of any forms or checklists to be
used.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains this information. Describe any site-specific requirements
in the Plan.
B. Corrective Action
The ability to quickly detect and correct a problem may lessen the potential impact of the
problem on the project.
* Describe the mechanism for identifying any system deficiencies, tracing the source, planning
and implementing corrective actions, and documenting problem resolution. Identify the chain
of command through which corrective actions and follow-ups are initiated and approved.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains this information, and describe any site-specific
requirements in the Plan.
C. Quality Assurance Reports to Management
A basis for timely and effective response to problems is established by developing and
maintaining QA reporting and feedback channels to management.
* Identify the frequency, content, and distribution of reports issued to inform management of
the following:
** status of the project;
** results of performance and system audits;
** results of periodic data quality assessments;
** significant quality assurance problems and recommended solutions.
* Identify the responsible unit which will prepare the report, and the recipients of the report.
* Include a provision for summarizing data quality information in a separate QA section in the
final project report.
Fund-Lead Projects: if this element is addressed in a QA Program Plan, cite the
appropriate section which contains the information, and describe any site-specific
requirements in the Plan.
DEFINITIONS
Accuracy: The degree of agreement of a measured value with an accepted reference or true
value. Accuracy can be expressed numerically as the absolute value of the difference between
a measured and a reference or true value, or as that difference expressed as a
percentage of the reference or true value. (Although "accuracy" is a misnomer as defined here, these are the
commonly-accepted definitions.)
Comparability: The confidence with which one set of data can be compared to another.
Comparability is dependent upon consistency in sampling conditions and selection of sampling
procedures, sample preservation methods, analytical methods, and units of data expression.
Completeness: Comparison of the number of valid data obtained from a measurement effort
to the total number needed to meet the project goals. Data completeness incorporates the
factors of sample loss and data acceptability, i.e., the data quality.
Data Reduction: The mathematical and/or statistical calculations used to convert raw data to
the reported data.
Data Validation: A systematic process for reviewing a body of data against a pre-established
set of criteria to determine the quality of the data.
Out-of-Control Data: Data which fall outside pre-established acceptance limits.
Performance Audit: Procedure used to independently collect measurement data and quan-
titatively determine the accuracy of measurement data through the use of performance
evaluation samples.
Precision: A measure of agreement (reproducibility) among replicate measurements. Precision
can be expressed as the standard deviation, or when duplicate measurements are performed,
as the percent difference or relative percent difference.
Quality Assurance: The total integrated program for the planning, acquisition, and review of
monitoring and measurement data, to meet user requirements.
Quality Control: The routine application of procedures for obtaining prescribed standards of
performance in the monitoring and measurement process.
Representativeness: Reliability with which a measurement or measurement system reflects the
true conditions under investigation. Representativeness is influenced by the number and
location of the sampling points, sampling timing and frequency in monitoring efforts, and by
the field and laboratory sampling procedures.
Systems Audit: A review of the data generation process, including on-site audits of the field
and laboratory operations.
REFERENCES
1. EPA, 1989. Preparation of a U.S. EPA Region 9 Sample Plan for EPA-Lead Superfund
Projects. (9QA-5-89) Available from QAO, Region 9.
2. EPA, 1988. Documentation Requirements for Data Validation of Non-CLP Laboratory
Data for Organic and Inorganic Analyses. (9QA-7-89) Available from QAO, Region 9.
3. EPA, 1987. Data Quality Objectives for Remedial Response Activities. 2 Parts.
Development Process (EPA-540/G-87/003). Example Scenario (EPA-540/G-87/004). Avail-
able in EPA Region 9 Library.
4. EPA, 1986. Development of Data Quality Objectives, Description of Stages I and II.
Quality Assurance Management Staff, EPA Headquarters. Available from QAO, Region 9.
5. EPA, 1984. Policy and Program Requirements to Implement the Mandatory Quality As-
surance Program. EPA Order 5360.1.
6. EPA, 1980. Interim Guidelines and Specifications for Preparing Quality Assurance Project
Plans, QAMS-005/80 (EPA 600/4-83-004). Quality Assurance Management Staff, EPA
Headquarters. Available from QAO, Region 9.
7. EPA, 1980. Guidelines and Specifications for Preparing Quality Assurance Program Plans,
QAMS-004/80. Quality Assurance Management Staff, EPA Headquarters. Available from
QAO, Region 9.
8. NEIC, 1986. NEIC Policies and Procedures Manual (EPA-330/9-78-001-R). NEIC Office
of Enforcement. Available in EPA Region 9 Library.