Publication Numbers:
EPA: EPA-505-B-04-900B
DoD: DTIC ADA426957
Intergovernmental Data Quality Task Force
Uniform Federal Policy for
Quality Assurance Project Plans
Part 2B, Quality Assurance/Quality Control
Compendium: Minimum QA/QC Activities
Final
Version 1
March 2005
TABLE OF CONTENTS
EXECUTIVE SUMMARY
1.0 INTRODUCTION
1.1 Purpose
1.2 Scope of the QA/QC Compendium
1.3 Background
1.3.1 IDQTF Workgroup
1.3.2 Findings from the Information Collection Process
1.4 QA/QC Compendium - Key Decisions
1.4.1 Relationship of Minimum QA/QC Activities to the Systematic Planning Process and Site-Specific Project Quality Objectives
1.4.2 Organization of the QA Matrix
1.4.3 Relationship of QC Samples to Data Quality Indicators
1.4.4 Data Review Definitions
1.5 Organization of the QA/QC Compendium
2.0 FOUNDATIONS OF THE QA MATRIX
2.1 Data Type: Definitive versus Screening Data
2.2 Role of Data Quality Indicators in Selecting QA/QC Samples
2.2.1 Introduction to Data Quality Indicators
2.2.2 Minimum QC Samples and Data Quality Indicators
2.2.3 Proficiency Testing Samples
2.2.3.1 Guidelines
2.2.3.2 Availability
2.2.3.3 Cost and Effectiveness Issues
2.3 Stages of Data Collection, Analysis, and Use
2.3.1 Planning
2.3.2 Sampling and Analysis
2.3.3 Data Review
2.3.3.1 Data Review - Definitions and Scope
2.3.3.2 Implementation of Data Review Activities
2.4 Organization of the QA Matrix Based on the CERCLA Process
2.4.1 Investigation Phases
2.4.2 Construction, Post-construction, and Post-ROD Phases
3.0 THE QA MATRIX: MINIMUM ACTIVITIES FOR QA/QC UNDER CERCLA
3.1 Reading the Matrix
3.2 Key Definitions
3.3 Applying the Matrix
3.4 QA Matrix
APPENDIX A QC SAMPLES AND DATA QUALITY INDICATORS
APPENDIX B ACRONYMS AND DEFINITIONS
LIST OF TABLES
Table 1. Evaluation of QC Samples by DQI
Table 2. Data Review Process Summary
Table 3. Steps of the Major Construction, Post-construction, and Post-ROD Phases
Table 4. QA Matrix: Investigation Phases
Table 5. QA Matrix: Post-ROD Phases
LIST OF FIGURES
Figure 1. Organization of the QA Matrix
Figure 2. Potential Participants in the Data Review Process
Figure 3. Example Remedy Scenarios and Comparable EPA/DoD Terminology
EXECUTIVE SUMMARY
Introduction
This Quality Assurance/Quality Control Compendium (QA/QC Compendium) has been prepared
by the Intergovernmental Data Quality Task Force (IDQTF) to supplement Part 1 of the Uniform
Federal Policy for Quality Assurance Project Plans (the UFP-QAPP Manual). The UFP-QAPP
Manual and the QA/QC Compendium, along with the UFP-QAPP Workbook and Example QAPPs,
serve as companion documents to the IDQTF's Uniform Federal Policy for Implementing
Environmental Quality Systems (UFP-QS). The UFP-QS and the UFP-QAPP were developed as
consensus initiatives by the U.S. Environmental Protection Agency (EPA), the Department of
Defense (DoD), and the Department of Energy (DOE).
The purpose of the UFP-QAPP is to serve as a single national consensus document for consistent
and systematic implementation of project-specific guidelines per Section 6 (Part B) of ANSI/ASQ
E4 (Quality Systems for Environmental Data and Technology Programs - Requirements with
Guidance for Use). The purpose of this QA/QC Compendium is to establish minimum specifications
for data quality activities for all phases and data uses in the CERCLA process.
Background
There are inconsistencies in QA/QC activities implemented as part of the CERCLA process
across EPA, DoD, and DOE. An IDQTF workgroup collected information on the written
policies of EPA Regions and DoD facilities regarding QA/QC activities for CERCLA
projects. The findings from that survey showed that less than 1 percent of the possible QA/QC
activities were required in written policy by more than 75 percent of the participants. The lack of
a consistent approach leads to costly negotiations between the Federal agencies and regulatory
agencies to establish operable unit, site, and/or facility-specific specifications on a case-by-case
basis. Such negotiated quality guidelines may be inconsistent with quality specifications negotiated
for similar CERCLA projects at other operable units, sites, or facilities within the same or different
states, EPA regions, or Federal facility field offices.

Applicability Beyond Hazardous Waste
The QA/QC Compendium was developed to address quality assurance for hazardous waste cleanups, but the document may be used as a model for other programs.
The IDQTF workgroup spent almost 2 years collecting and reviewing information and developing
a consensus on data quality specifications for each CERCLA phase (e.g., RI, FS), data use (e.g.,
human health risk assessment), project stage (e.g., planning), and data type (screening versus
definitive data). The workgroup assessed the current state of required QA/QC activities and
developed a value-added set of minimum QA/QC activities appropriate for all CERCLA phases.
This QA/QC Compendium presents the consensus workgroup product, including a table that
illustrates the consensus minimum QA/QC activities for each project stage, data use, and data type
of the individual CERCLA phases (a QA matrix).
Highlights of the QA/QC Compendium
The QA/QC activities identified in this document cannot be easily summarized in an executive
summary. However, the following outline of the principles that form the basis of the guidelines may
help the user place the more detailed specifications in context:
- Workgroup development of the QA matrix revealed that the differences between minimum QA/QC activities that support screening data versus definitive data are more significant than any differences between CERCLA phases or data uses.
- The specifications in the QA matrix do not substitute for the Systematic Planning Process (SPP) prescribed by E4, the UFP-QS, and the UFP-QAPP Manual. The implementation of a team-based SPP is one of the QA/QC activities in the project planning stage.
- QA/QC activities specified in the QA matrix represent a consensus minimum list of activities. QA/QC activities may be added, depending on project objectives and on site-specific conditions.
- The workgroup performed an analysis of value-added QC samples that provide information on data quality indicators (e.g., precision, accuracy, sensitivity). As a result of this analysis, a reduction was made in the minimum QC samples specified for CERCLA projects based on each sample's respective added value to the understanding of the quality of data.
- The data review project stage includes three types of data review processes: sampling and analysis verification, sampling and analysis validation, and data usability assessment. These three processes encompass sampling data quality (e.g., results from sample collection activities) as well as analytical data quality. In addition, the definitions and examples of activities for these three steps go beyond what is currently considered to be data review.
- Certain issues related to data review are addressed in Section 5.0 of the UFP-QAPP Manual. These include data review inputs, example data review activities, opportunities for streamlining, and documentation of data review activities in the project-specific QAPP.
Organization of This Document
This document is organized into three sections and two appendices:
Section 1: Introduction, including scope and background, as well as overview of key
decisions.
Section 2: Foundations of the QA matrix, summarizing IDQTF policy decisions regarding
key definitions and the minimum activities of the QA matrix itself.
Section 3: Introduction to the QA matrix, including appropriate matrix use in the
development of project-specific QAPPs.
Appendices: Appendix A provides an evaluation of QC samples and their contribution to
understanding specific data quality indicators. Appendix B provides a comprehensive list
of acronyms and definitions.
The QA/QC Compendium has been created as a stand-alone document to help the reviewer
understand its development and the decisions it embodies; however, it should be used alongside
the other parts of the UFP-QAPP.
Quality Assurance/Quality Control Compendium:
Minimum QA/QC Activities
1.0 INTRODUCTION
This document is a product of the Intergovernmental Data Quality Task Force (IDQTF), which
comprises three Federal agencies: the Environmental Protection Agency (EPA), Department of
Energy (DOE), and Department of Defense (DoD). The mission of the IDQTF is to develop a
quality assurance system for environmental data collection.
The goals of the IDQTF include the following:
Develop a written agreement that constitutes an adequate quality assurance (QA)/quality
control (QC) program.
Develop a guidance/framework that outlines the roles and responsibilities of the EPA
(Headquarters and Regions) and other Federal agencies with regard to QA/QC oversight.
Develop guidance for implementation of Federal agency-wide specifications and procedures
regarding data quality.
The IDQTF is in the process of developing several work products to promote the goals of the task
force. This compendium is one of several IDQTF products that are specifically designed to provide
Federal consensus policies for the implementation of the national quality standard developed by
ANSI/ASQ and known as E4 - Quality Systems for Environmental Data and Technology Programs
- Requirements with Guidance for Use (American National Standards Institute and American Society
for Quality, 2004).
Current Products of the IDQTF
Uniform Federal Policy for Implementing Environmental Quality Systems (Final, Version 2, March 2005, also
called UFP-QS) - A high-level policy based on E4, Section 5 (Part A), "Management Systems."
Part 1 of the Uniform Federal Policy for Quality Assurance Project Plans (Final, Version 1, March 2005, also
called UFP-QAPP Manual) - Designed to implement Section 6 (Part B) of E4, "Collection and Evaluation of
Environmental Data."
UFP-QAPP Workbook (Final, March 2005, Part 2A of the UFP-QAPP) - Blank worksheets to assist with the
preparation of QAPPs by addressing specific requirements of the Manual.
QA/QC Compendium (Final, March 2005, Part 2B of the UFP-QAPP) - Designed to supplement the UFP-QAPP
Manual with minimum QA/QC activities for investigations and cleanups under the Comprehensive Environmental
Response, Compensation, and Liability Act (CERCLA) and similar programs.
Example QAPPs (Part 2C of the UFP-QAPP) - Provides several example QAPPs that are based on the
requirements in the Manual.
1.1 Purpose
The purpose of this Quality Assurance/Quality Control Compendium: Minimum QA/QC Activities
(QA/QC Compendium) is to outline the minimum QA/QC activities that should be included in a
Quality Assurance Project Plan (QAPP) for sites undergoing investigation and cleanup under
CERCLA. These activities are listed in a QA matrix located in Section 3 of this document. The QA
matrix is designed to address the problem of inconsistency of QA/QC activities for CERCLA
projects. The goal is to ensure appropriate data quality at CERCLA sites by instituting a set of
uniform QA/QC activities common to all CERCLA data collection and use activities.

Applicability Beyond Hazardous Waste
The QA/QC Compendium was developed to address quality assurance for hazardous waste cleanups, but the document may be used as a model for other programs.

The current practice has been characterized by activities that vary among EPA Regions, facility
types, and sites. It is costly in terms of time and expense to negotiate and implement
different specifications for individual projects, and cumbersome for a manager who has no
guidelines to follow. Use of the minimum activities found in this QA/QC Compendium will save
time and money, as well as reduce redundant activities and/or activities that add little value to the
understanding of the quality of the data. It is anticipated that the members of the IDQTF (DoD,
DOE, and EPA) will adopt the specifications advocated in this document.
This document addresses several goals for the management of environmental data collection and use,
including:
Improve the quality of environmental data collection, analysis, and use by ensuring that an
appropriate minimum level of QA/QC takes place for every CERCLA phase, data type, and
use.
Reduce costs by:
- Minimizing project-specific conflicts on QA/QC activities, or
- Eliminating those QA/QC samples that are redundant or provide limited value to
understanding the true quality of data.
Establish a common understanding of required QA/QC activities across all Federal agencies.
This includes all branches of DoD, all EPA Regions, and all DOE facilities.
This QA/QC Compendium supplements Part 1 of the UFP-QAPP, the UFP-QAPP Manual. This
document contains:
Background on the major decisions made by the IDQTF in developing minimum QA/QC
activities.
Definitions of QA/QC activities.
A list of specific QA/QC activities, organized in a simple matrix format (called a QA
matrix), that can be applied to CERCLA data collection, analysis, and use.
The users of this QA/QC Compendium should include those persons charged with the task of
generating a QAPP. The QA matrix may be used by contractors, project managers, and others
dealing with Federal facilities engaged in the CERCLA process. It is designed as an outline of
QA/QC activities that should be performed and documents that should be generated during and
preceding CERCLA actions. These guidelines may be appropriate for non-CERCLA decisions that
are similar in nature, such as the corrective action program of the Resource Conservation and
Recovery Act (RCRA). The matrix is comprehensive in that it addresses the CERCLA process from
the planning stage to the site closeout stage. The activities are labeled as minimum because it is the
intent of the IDQTF that all activities specified in the QA matrix be included in the project-specific
QAPP. Additionally, project-specific QA/QC activities may be added by the project team.
1.2 Scope of the QA/QC Compendium
The scope of the QA/QC activities in this document includes the sample collection, sample analysis,
and data use components of the CERCLA process at Federal facilities. These specifications apply
to the collection and use of data for:
All phases of the CERCLA process (e.g., site investigation, remedial investigation).
The purpose of making specific decisions using primary (original) data collected during
CERCLA phases.
Both screening data, for intermediate decisions, and definitive data, for final decisions.
All stages of project management, from planning, to data collection and analysis, to data
review.
Secondary data (i.e., data generated for a purpose other than the project for which they are used) are not
addressed in this document (see Section 2.7 of the UFP-QAPP Manual). In addition, QA/QC
activities specific to radiochemical data collection, analysis, and use are not included in the matrix.
Radiation QA/QC guidance can be found in the Multi-Agency Radiation Survey and Site
Investigation Manual (MARSSIM) which provides a nationally consistent consensus approach to
conducting radiation surveys and investigations, and the Draft Multi-Agency Radiological
Laboratory Analytical Protocols (MARLAP) manual which addresses the need for a nationally
consistent approach to producing radioanalytical laboratory data that meet a project's or program's
data requirements.
Specific attention was paid to the value of
different types of QC samples and their role in
understanding three data quality indicators
(DQIs): precision, accuracy, and sensitivity.
Therefore, the final QA matrix listed certain
QC samples as minimum activities, but omitted
others that may be commonly used because
they were considered to be of limited value or duplicative of other QC samples. Some of those
omitted from the QA matrix may be reinstated by project teams making site-specific decisions.
Use of Minimum Activities in QA Matrix
Activities in the QA matrix should occur as specified.
Activities not specified in the matrix, but necessary to
the project, may be added.
After careful analysis of each CERCLA phase-data use combination, it became apparent that the
greatest commonality of minimum QA/QC activities was with regard to data type. The matrix
therefore divides the QA/QC activities into two types of data - screening data and definitive data.
Screening data can support an intermediate or preliminary decision but should eventually
be supported by definitive data before a project is complete.
Definitive data should be suitable for final decision-making (of the appropriate level of
precision and accuracy, as well as legally defensible).
1.3 Background
1.3.1 IDQTF Workgroup
In 1999, the IDQTF initiated a workgroup to gather information on Federal agencies' understanding
of current QA/QC activities under CERCLA. The workgroup was chaired by Robert Runyon, QA
Manager from EPA Region 2, and included chemists, other QA personnel, and remedial project
managers (RPMs) from the three IDQTF agencies (EPA, DoD, and DOE). The goal of this
information collection effort was to determine the level of agreement in written policy among EPA
Regions, DoD components, and DOE and to use that as a point of departure for recommending
minimum QA/QC activities for all Federal agencies.
During the course of workgroup discussions, a variety of specific issues were examined in depth,
including:
The minimum QA/QC activities to support DQIs (precision, accuracy, and sensitivity) for
each project stage.
Clarity of definitions for QA/QC activities and data types.
The nomenclature and organization of post-ROD and post-construction project stages.
The role of CERCLA phases versus data uses as the common denominator of QA/QC
activities.
Definitions of data review activities, and the differences between the current scope of such
activities and the desired future scope.
1.3.2 Findings from the Information Collection Process
Findings on Current QA/QC Activities
- There is little current consensus on QA/QC activities for most CERCLA phases and data uses.
- The broadest agreement for QA/QC activities is for definitive data used during a remedial investigation.
- Discrepancies exist between EPA Regions and other Federal agencies on definitions of key terms relating to environmental data QA/QC.

The information collection and analysis process showed that the most significant common
ground among EPA Regions and between EPA and other Federal agencies was the lack of
agreement on minimum activities for QA/QC. For each project stage under a given data use,
a list of potential QA/QC activities was provided from which those submitting the
information on QA/QC policies could pick (the pick list). Although there are 10,530 possible
combinations of QA/QC activity, data use, and project stage, the consolidated QA matrix (i.e.,
combining all information received from the three agencies) revealed that only 37 individual
activity-data use-project stage combinations fit in the "most agree" category (i.e., 75% or more of
respondents agreed). All 37 combinations were part of the remedial investigation (RI) phase using
definitive data. The areas of agreement covered all four primary uses of definitive data in the RI
phase: nature of contamination, extent of contamination, human health, and ecological risk
assessment.
Understanding the Terms Used in the QA Matrix
The QA matrix data collection instrument was organized around four key terms. Understanding these terms is
important to understanding the information presented in this document.
Project stage - refers to the stage of the project preparation, execution, or assessment. Five basic project stages
are used in the matrix: planning, field sampling, on-site field measurement, off-site/fixed lab measurement, and
data review. (Note: Not all stages are used in every project.)
CERCLA phase - refers to the regulation-mandated project phases prior to, during, and following remediation
of a site as defined in the National Contingency Plan (NCP). Phases include but are not limited to preliminary
assessment (PA), site investigation (SI), remedial investigation (RI), feasibility study (FS), and removal.
Data use - refers to the purpose of the data collected and analyzed under a given CERCLA phase. Examples
include nature of contamination, human health risk assessment, process control, and compliance determination.
Data type - (i.e., screening data and definitive data) refers to the general level of data quality, based on the
ultimate use of the data. Screening data can be used for intermediate decisions, whereas final decisions require
the use of definitive data.
1.4 QA/QC Compendium - Key Decisions
The information collection process was a point of departure for the workgroup to consider the
desired QA/QC activities in the future. This QA/QC Compendium, which is based on the
workgroup's information collection and analysis process, reflects the following key decisions:
The organization of the QA matrix is based on a combination of CERCLA phases and
functions. A team-based site-specific systematic planning process is specified in the project
planning stage.
Minimum specifications for QA/QC activities are typically consistent within a specific
CERCLA phase, but vary depending upon whether the data to be used are definitive data or
screening data.
QC samples are selected as minimum activities based on the information they provide on
specific data quality indicators.
Data review encompasses both sampling and analytical activities, beginning with a
completeness check, through to a data usability assessment based on the decision to be made.
1.4.1 Relationship of Minimum QA/QC Activities to the Systematic Planning Process and
Site-Specific Project Quality Objectives
In accordance with ANSI/ASQ E4 and the UFP-QS, all environmental data collection and use are
to take place in accordance with a site-specific systematic planning process (SPP). Using this
scientifically based, logical approach to planning for data collection and use at a site helps to ensure
that the amounts and types of data collected are appropriate to the decisions to be made at the site,
as well as to the special physical, environmental, and chemical characteristics of the site. The
minimum specifications documented in the QA matrix do not take the place of this site-specific SPP.
In fact, the development of a team-based SPP is one of the first QA/QC activities performed in the
project planning stage.
Although minimum QA/QC activities are specified for all environmental data collection and use,
a wide range of site-specific guidelines that relate to the ultimate use of the data should be
determined for those activities. These guidelines include, but are not limited to:
Types of decisions that will be supported by the data.
Project quality objectives.
Acceptance criteria for data quality indicators (also known as measurement performance
criteria).
Sampling plan, including location of environmental and QC samples.
Types of contaminants that require laboratory analysis (on-site, field, or fixed lab).
QA/QC activities specified in the QA matrix represent a minimum list of activities. Other QA/QC
activities may be added, depending on the decisions to be made and on site-specific conditions.
1.4.2 Organization of the QA Matrix
The workgroup's final product is a matrix of minimum QA/QC activities, which are organized as
follows:
By CERCLA phases (e.g., RI, FS), for investigation phases that occur prior to a Record of
Decision (ROD) and for removal actions
By data uses (e.g., confirmatory sampling), for post-ROD, construction, and post-
construction phases
By data type (i.e., screening versus definitive data)
By project stage (e.g., planning, field sampling)
Each QA/QC activity is listed on the vertical axis under the project stage to which it is related (see
Figure 1). Check marks across the matrix identify whether the activity applies to the particular
CERCLA phase or data use, and the specific data type. Dashes indicate that the specific activity is
not a minimum activity for that CERCLA phase or data use.
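To make the reading rule concrete, the following sketch models the matrix as a simple lookup structure in Python: a check mark corresponds to membership in the activity set for a given phase (or data use) and data type. The phases and activities shown here are hypothetical placeholders, not the actual matrix entries (the real entries are in Tables 4 and 5).

```python
# Minimal sketch of the QA matrix as a lookup structure. The entries
# below are hypothetical illustrations only, not the actual matrix.

qa_matrix = {
    ("site investigation", "definitive"): {
        "systematic planning process", "field blank", "co-located field duplicate",
    },
    ("site investigation", "screening"): {
        "systematic planning process", "confirmatory analysis",
    },
}

def is_minimum_activity(phase: str, data_type: str, activity: str) -> bool:
    """True if the activity carries a check mark for that phase and data type."""
    return activity in qa_matrix.get((phase, data_type), set())

print(is_minimum_activity("site investigation", "definitive", "field blank"))  # True
print(is_minimum_activity("site investigation", "screening", "field blank"))   # False (a dash)
```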
[Figure 1. Organization of the QA Matrix: QA/QC activities are listed on the vertical axis under five project stages (planning, field sampling, on-site field measurements, off-site/fixed lab measurements, and data review); the horizontal axis shows the CERCLA phase (e.g., site investigation) or data use, split by data type (screening and definitive).]
1.4.3 Relationship of QC Samples to Data Quality Indicators
Assurance of data quality is done, in part, through the use of quality control samples. There are
several types of QC samples, and each type is related to a set of data quality indicators (DQIs),
which are derived from the parameters of precision, accuracy, representativeness, comparability,
completeness, and sensitivity (known as PARCCS parameters). The utility of a specific QC sample
depends on its applicability to field and laboratory scenarios (e.g., analytical method, sample matrix)
as well as on what kind of information is derived from the result of the QC sample. Which QC
samples (e.g., matrix spike, laboratory control sample) should be specified for definitive and
screening data was determined by linking the contribution of each QC sample to the performance
of a specific DQI.
Key Definitions: Data Quality Indicators (DQIs) and QC Samples
Data Quality Indicators - refers to the elements that are used to characterize the quality of data. These are
reflected in quantitative statistics and qualitative descriptors that are used to interpret the degree of acceptability
or utility of data to the user. The principal data quality indicators are precision, accuracy/bias, representativeness,
comparability, completeness, and sensitivity.
QC Samples - refers to the types of control samples (collected at the site or created in the laboratory) that are
analyzed with field samples in order to evaluate the quality of the field results. Examples include matrix spikes,
field duplicates, surrogate spikes, laboratory control samples, and equipment blanks.
To determine the relationship of QC samples to data quality, the workgroup evaluated the function
of each type of QC sample on the basis of the DQI it was designed to support. Certain types of QC
samples were chosen to be minimum activities for the QA matrix based on their contribution to the
understanding of one or more DQIs. The following criteria were used during this evaluation:
Provides an overall measurement of a DQI.
Identifies specific sources of error (e.g., laboratory, field, transport).
Provides added value to understanding the data produced from the analysis.
QC samples that either provided the most reliable information on overall data quality or identified
specific sources of error were selected to be the minimum activities in the QA matrix. QC samples
that are typically operational requirements (e.g., calibration and instrument performance checks) and
are needed to run the analytical equipment are not generally included in the matrix because they do
not need to be specified as a minimum QA/QC activity. QC samples that were identified by the
workgroup as redundant or not informative were also removed from consideration (e.g., bottle blank,
water source blank, matrix spike for organics). Table 1 in Section 2.2.2 summarizes the manner in
which each QC sample was examined against criteria and the category into which each was placed.
1.4.4 Data Review Definitions
Data review activities encompass a wide range of assessment activities for verification, validation,
and usability assessment. Data review is defined by the workgroup as:
The process of examining and/or evaluating data to varying levels of detail and
specificity by a variety of personnel who have different responsibilities within the
data management process. It includes verification, validation, and usability
assessment.
The specifications for data review include two major elements:
Assessment of the sample design and collection process, in addition to assessment of
laboratory analytical data.
Expansion of the understanding of the three major elements of the data review process -
verification, validation, and usability assessment. This expanded understanding goes beyond
assessing the completeness of data and compliance with method, procedural, and contractual
requirements; it includes assessment of performance and review of project-specific
criteria found in the QAPP, and assessment of the usability of the data for the site-specific
decision for which it was collected.
The data review process is separated into three steps. Each of these three steps has specific activities
that should be completed by a data reviewer.
Verification - Review for completeness.
Validation - Review for compliance with methods, procedures, and contracts, as well as for
conformity with quality objectives of QAPP.
Usability Assessment - Assess results of previous data review steps to determine usability
of data for making required decisions.
1.5 Organization of the QA/QC Compendium
The remainder of this document is organized in two sections. Section 2 describes in more detail
the key policy decisions that are the foundation of the QA matrix and the issues that were addressed
by the workgroup. Section 3 describes the QA matrix and the minimum QA/QC activities, as well
as what they mean and how to use them. Appendix A contains tables and figures illustrating the
contribution of QC samples to DQIs. Appendix B contains a comprehensive list of definitions.
Some of these definitions are included in the text as well, in order to enhance reader understanding.
2.0 FOUNDATIONS OF THE QA MATRIX
As described in Section 1, the main components of the QA matrix are type of data, CERCLA phase,
data use, project stage, and QA/QC activity. Type of data is divided into definitive and screening,
depending on the decision to be made using that data. Within those two types of data, the level of
quality is organized by CERCLA phase and/or data use. The potential QA/QC activities are sorted
by their place in the project process, from planning to the data usability assessment. Selection of
QA/QC activities as minimum specifications is based on the value added to data quality.
This section describes in detail each of those components. The discussion of specific QA/QC
activities focuses on QC samples and data review. Many of the activities are self-explanatory, so
they will not be described in detail.
2.1 Data Type: Definitive versus Screening Data
After careful analysis of each CERCLA phase-data use combination, it became apparent during the
development of the matrix that the greatest commonality of minimum QA/QC activities was with
regard to data type:
Screening data can support an intermediate or preliminary decision but should eventually
be supported by definitive data before a project is complete.
Definitive data should be suitable for final decision-making (of the appropriate level of
precision and accuracy, as well as legally defensible).
Either data type can be effective for various decisions. The major differences in QA/QC activities
for the consolidated QA matrix are largely between definitive data and screening data, rather than
between CERCLA phases or data uses.
Screening data should not be confused with data of poor quality or with field screening technologies.
Field analyses may produce either screening or definitive data, depending on the nature of the
technology. Although definitive data are held to a more rigorous quality standard, screening data
should be of sufficient quality to support the intermediate decision in which they are used. For data
to be categorized as definitive, they should be accompanied by a series of quality control measures.
These QA/QC activities are outlined in the matrix. Screening data cannot be used to make final
decisions (such as no further action or response complete) or for risk assessment, site closure, or
listing (or delisting) on the National Priorities List (NPL); however, they can be used to make
intermediate decisions, even those that are significant, such as decisions regarding placement of
monitoring wells or estimates of extent of contamination.
Examples of Appropriate Use of Screening Data for Intermediate Decisions
- During project scoping to narrow down an analyte list.
- During soil removal to identify when the removal is getting close to cleanup objectives. Screening data can indicate when definitive samples should be taken to confirm achievement of cleanup goals.
- During process control functions when construction is underway and engineering adjustments are made to optimize treatment.
- During the remedial investigation to determine well placement.
Although screening data are used only for
preliminary or intermediate decisions, the
quality of the data is still very important. To
ensure that screening data meet project quality
objectives (PQOs), positive controls should be
used to verify that the analysis will detect
contaminants in samples when they are present.
The purpose of using positive controls is to
eliminate false negatives. Examples of positive
control samples include a laboratory-fortified
blank at the reporting limit, a proficiency test
(PT) sample, or a manufacturer-supplied
positive control. In addition, confirmatory
analyses using definitive data are a minimum
activity for screening data to confirm results.
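As an illustration of the positive-control logic described above, the sketch below checks a laboratory-fortified blank spiked at the reporting limit. The concentrations and the 50-150% recovery window are hypothetical project-specific values, not IDQTF specifications.

```python
# Minimal sketch: positive-control check for a screening-data batch.
# If the analysis cannot recover a control spiked at the reporting
# limit, the batch cannot be trusted to rule out contamination
# (false negatives). All values below are hypothetical.

spike_level = 5.0   # ug/L, fortified at the reporting limit
measured = 4.2      # ug/L, result reported for the fortified blank

recovery = measured / spike_level * 100.0
if 50.0 <= recovery <= 150.0:
    print(f"Positive control detected ({recovery:.0f}% recovery): screening batch usable.")
else:
    print(f"Positive control failed ({recovery:.0f}% recovery): potential false negatives.")
```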
Definitive data are held to a more rigorous
quality standard and are used to make final
decisions such as level or existence of risk,
response complete, or site closure. In general,
definitive data refers to analytical data of
known quality, concentration, and level of
uncertainty, and those levels of quality and
uncertainty are consistent with the
specifications for the decision to be made. In
assessing the usability of definitive data for
the decision to be made, it is important to
recognize that acceptable quality and certainty in the data points do not mean that the data
set can be used. For the data set to be usable, the data points (including sample location and
procedures) should meet criteria of representativeness, completeness, and comparability.
These criteria are set during the planning stage, when the PQOs are established and sampling design
rationale is developed.
2.2 Role of Data Quality Indicators in Selecting QA/QC Samples
The Uniform Federal Policy for Implementing Environmental Quality Systems (UFP-QS) defines
data quality indicators as:
...the quantitative statistics and qualitative descriptors that are used to interpret the
degree of acceptability or utility of data to the user. The principal data quality
indicators are precision, accuracy/bias, representativeness, comparability,
completeness, and sensitivity.
Examples of Appropriate Use of Definitive Data for Final Decisions
- For listing of a site on the National Priorities List.
- For a determination that no further action is required.
- To identify whether any unacceptable risk is present (risk assessment).
- To confirm achievement of cleanup goals.
- For compliance sampling of air or water discharges.

Under Superfund, numerous QC samples are typically specified during investigation and cleanup,
with limited knowledge of what the benefits to data quality are. It is the view of the IDQTF
workgroup that these QC samples are collected and analyzed at significant cost, and the information
they convey may be either repetitive, not used, or interpreted incorrectly. The workgroup analyzed
all the possible types of QC samples currently required and related their contributions to data quality
indicators.
Tables A-1 through A-4 in Appendix A outline the contributions that the types of QC samples make
to understanding each DQI. The accompanying charts assist in the evaluation of QC samples and
explain which activities measure specific indicators at various points in the analytical process. The
results of the evaluation of DQIs are reflected in the QA/QC activities identified in the matrix.
Appendix B defines each QC sample (as well as other terms), including those not identified as
minimum activities at CERCLA sites.
2.2.1 Introduction to Data Quality Indicators
The workgroup systematically considered the DQIs of precision, accuracy/bias, and sensitivity
when determining which QC samples should be minimum specifications. In addition, the
workgroup examined the DQI of comparability and identified the qualitative measures that can
be used to evaluate the achievement of this indicator. Finally, the workgroup recognized that
representativeness is generally a factor of the sampling decision, is qualitative in nature, and is
indicated by project-specific PQOs and associated statistical measures (e.g., confidence levels)
that do not translate into minimum QC samples.
Precision is the degree to which a set of observations or measurements of the same property,
obtained under similar conditions, conform to themselves. Precision is usually expressed
as standard deviation, variance, or range, in either absolute or relative terms (NELAC, 1999).
Examples of QC samples for precision include field duplicates, laboratory duplicates, matrix
spike duplicates, analytical replicates, and surrogates. (A brief worked example of precision
and accuracy statistics follows this list.)
Accuracy/Bias is the degree of agreement between an observed value and an accepted
reference value. Accuracy includes a combination of random error (precision) and
systematic error (bias) components, which are due to sampling and analytical operations
(EPA, 1993). Examples of QC samples for accuracy include PT samples, matrix spikes,
laboratory control samples (LCSs), and equipment blanks. The contamination subset of
accuracy refers to measurements that indicate contamination of a sample. These consist of
blanks, which indicate equipment contamination or method errors.
Representativeness is a measure of the degree to which data accurately and precisely
represent a characteristic of a population, a parameter variation at a sampling point, a process
condition, or an environmental condition (EPA, 1998).
Comparability is the degree to which different methods, data sets, and/or decisions agree
or can be represented as similar. Comparability describes the confidence (expressed
qualitatively or quantitatively) that two data sets can contribute to a common analysis and
interpolation.
Sensitivity is the capability of a test method or instrument to discriminate between
measurement responses representing different levels (e.g., concentrations) of a variable of
interest (NELAC, 1999). Examples of QC samples for determining sensitivity include
laboratory-fortified blanks, a method detection limit study, and initial calibration low
standards at the quantitation limit. Ultimately, sensitivity is a derived measure that
represents values that can be differentiated with some degree of statistical confidence.
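The quantitative indicators above are commonly summarized with simple statistics. As a minimal sketch, assuming the conventional relative percent difference and percent recovery formulas (the concentrations are hypothetical, and these calculations are illustrative rather than prescribed by this Compendium):

```python
# Illustrative DQI statistics: relative percent difference (RPD) for
# duplicate precision and percent recovery (%R) for spike accuracy.
# All sample values below are hypothetical.

def relative_percent_difference(result_1: float, result_2: float) -> float:
    """RPD between duplicate results, e.g., a co-located field duplicate pair."""
    mean = (result_1 + result_2) / 2.0
    return abs(result_1 - result_2) / mean * 100.0

def percent_recovery(spiked_result: float, native_result: float, spike_added: float) -> float:
    """%R for a spiked sample (e.g., a matrix spike); for an LCS, native_result is 0."""
    return (spiked_result - native_result) / spike_added * 100.0

if __name__ == "__main__":
    # Hypothetical duplicate pair: 10.0 and 12.0 ug/L -> RPD of about 18.2%
    print(f"RPD: {relative_percent_difference(10.0, 12.0):.1f}%")
    # Hypothetical matrix spike: 45 ug/L measured, 5 ug/L native, 50 ug/L added -> 80% recovery
    print(f"Recovery: {percent_recovery(45.0, 5.0, 50.0):.0f}%")
```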
2.2.2 Minimum QC Samples and Data Quality Indicators
The workgroup determined each QC sample's value using the known information concerning the
QC sample's contribution to measuring precision, accuracy, contamination subset (bias) of accuracy,
and sensitivity. Because quality control samples are currently required and impose significant costs,
the workgroup attempted to identify which of those samples are not cost-effective, that is, which
provide very little additional information on the quality of the data or only duplicate the information.
Each quality control sample found in Appendix A was evaluated using the following criteria:
Provides overall measurement of accuracy, precision, etc.
Identifies critical sources of error
Is an operational requirement
Is considered not useful compared to other QC samples, or is redundant
QC samples that provided the best overall measure of data quality and identified critical sources of
error were kept as minimum QA/QC activities. Other quality control check samples may be added
on a project-specific basis but are not minimum specifications for every project. Those QC samples
that were identified as an operational requirement were not listed as minimum activities in the
matrix. It was assumed that those samples would be collected and analyzed as part of standard
procedure. (The exception is the matrix spike for inorganics, which is listed in the QA matrix in
order to stress that the sample is for inorganics and not organics.) Table 1 presents the results of the
evaluation of QC samples using DQIs. (Definitions in Appendix B provide further explanation of
each QC sample.)
The issue of comparability can be addressed through several types of QC samples. Split samples
can contribute to the determination of comparability; however, it is the view of the workgroup that
a split sample (see definition in Appendix B) is useful only when accompanied by a batch-specific
proficiency testing (PT) sample. Without the associated PT sample, the only information obtained
from split samples is that the results of the samples are different; no acceptance criteria for that
difference are available. Therefore, split samples are not a minimum QA/QC activity. They can be
added on a project-specific basis and should be used only when accompanied by a PT sample for
proper evaluation of results.
2.2.3 Proficiency Testing Samples
In its examination of QC samples that measure DQIs, the workgroup felt that batch-specific PT
samples are an excellent measure of the overall accuracy of the data associated with a batch. The
comment column in Table 1 provides a description of the criteria by which the specification for a
batch-specific PT sample was evaluated. Table A-2 in Appendix A describes the manner in which
a PT sample contributes to an understanding of accuracy.
Table 1. Evaluation of QC Samples by DQI

PRECISION

QC samples evaluated for precision: co-located field duplicate, lab duplicate, internal standard, subsample field duplicate, matrix spike duplicate, and analytical replicate.

Co-located field duplicate - The definition of field duplicate was clarified by differentiating between subsample (in which one sample is collected and then split into two or more portions) and co-located samples (in which two different samples are collected from the same location). In the view of the workgroup, the co-located field duplicate contributes more information about the measurement precision of the sampling process, including the sampling equipment and heterogeneity of the site; therefore, it is a minimum activity in the matrix.

Lab duplicate - The workgroup felt that laboratory duplicates are usually useful only for inorganic compounds. In order to have a comparable measure of precision for organic compounds, surrogate spikes should be evaluated if no target analytes are detected.

Subsample field duplicate - See comment for co-located field duplicate.

Matrix spike duplicate - Matrix spike duplicates are a commonly used QC sample; however, the results are largely a function of the spiking procedure (e.g., number of analytes spiked, length of time between spiking and extraction). It is the view of the workgroup that they are not an effective measurement of precision in environmental media.
ACCURACY/BIAS

QC samples evaluated for accuracy/bias: batch-specific proficiency testing (PT) sample, matrix spike (inorganics), surrogate spike (organics), and laboratory control sample.

Batch-specific PT sample - There are several different types of proficiency testing samples: site-specific, ampulated, and full-volume. In the view of the workgroup, the site-specific PT sample provides the most complete measurement of accuracy. However, it is by far the most expensive and complicated to obtain, and acceptance criteria must be established. Therefore, site-specific PT samples are not a minimum activity. The ampulated PT sample is the least expensive, is readily available, and has known acceptance criteria. It can only be single blind; therefore, it provides a less complete measurement of accuracy. The full-volume PT sample is in between the two other types as far as both cost and measurement of accuracy are concerned. It is readily available, has known acceptance criteria, and has the possibility of being double blind. The specification for a batch-specific PT sample in the matrix is for either an ampulated or full-volume PT sample. The specific type should be determined for the project at the scoping meeting (see Section 2.2.3 for further explanation).

Matrix spike - The workgroup felt that a matrix spike is more appropriate for inorganic compounds than for organic compounds, while a surrogate spike can be used for organic compounds only. The surrogate spike can identify matrix effects as long as the surrogates properly mimic the analytes of concern.

Laboratory control sample - See comment for matrix spike.
Calibrations and instrument performance checks - Operational requirement needed to run the analytical equipment (see Section 1.4.3); not listed as a minimum activity in the matrix.

ACCURACY/BIAS (CONTAMINATION SUBSET)

QC samples evaluated for the contamination subset of accuracy/bias: equipment blank (non-dedicated equipment), field blank, method blank, instrument blank, bottle blank, storage blank, reagent blank, and VOA trip blank.

Equipment blank - The equipment blank is performed only once a day, and therefore cannot track contamination in every cooler. In addition, it cannot quantify effects for soil. If nondedicated equipment is used, then the equipment blank is useful to test the decontamination technique of those doing the sampling.

Field blank - Field blank is a new term created by the workgroup (see definition in Appendix B). It refers to a trip blank that is carried to the sampling site and is useful for all methods, not just volatile compounds as with the VOA trip blank. A field blank in every cooler will identify if contamination has occurred, whether the equipment is dedicated or not.

VOA trip blank - See comment for field blank.
Temperature blank - Temperature monitoring of samples is important; however, a temperature blank indicates the temperature of the samples only at the arrival time. There is no way to know if the samples were warmer at some time and then cooled down immediately before delivery. The workgroup felt that instead of a temperature blank, a temperature indicator is needed to notify the recipients when the samples have exceeded a maximum temperature. (See definition of shipping container temperature blank in Appendix B.) The exact nature of the device can be determined by the project team.

SENSITIVITY

QC samples evaluated for sensitivity: laboratory-fortified blank, initial calibration low standard, and method detection limit study.

Initial calibration low standard - At the quantitation limit.

Method detection limit study - Method detection limit studies are useful for prequalification purposes; however, in the view of the workgroup, they are not useful for interpreting data on a sample-by-sample basis. Therefore, they are not a minimum activity in the matrix.
A PT sample is:
A sample, the composition of which is unknown to the laboratory or analyst, which is
provided to that analyst or laboratory to assess capability to produce results within
acceptable criteria.
2.2.3.1 Guidelines
Batch-specific PT samples are a minimum activity for almost all definitive data uses in the CERCLA
program (see Tables 4 and 5 for details). There are different kinds of batch-specific PT samples that
can be used. The specification in the QA matrix is that the project team should decide whether a full
volume PT sample (one that comes ready for preparation and analysis with the other samples) or an
ampulated PT sample (one that comes in liquid form and must be diluted prior to preparation and
analysis) should be used for a specific project. (See Appendix B for more complete definitions.) The
full volume PT sample can be double blind (the laboratory does not know it is a PT sample and
therefore does not know the contaminants and concentrations) or single blind (known to be a PT
sample, but with unknown contaminants and concentrations). The ampulated PT sample can only
be single blind.
The IDQTF determined that it is not necessary for PT samples to be double blind samples (although
it is recommended). In addition, the IDQTF decided that there will be no specification that the PT
sample be a site-specific sample, made out of the environmental media present at the site. The cost
and time associated with characterizing the site matrix and establishing acceptance criteria were
thought to be too extensive, providing little added value for the purpose.[1]
The contaminants and concentrations to be included in the batch-specific PT sample and the
acceptance criteria by which the results will be evaluated should be established during the project
scoping stage.
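A minimal sketch of how batch-specific PT results might be compared against the acceptance criteria agreed to at scoping follows; the analytes, assigned values, and recovery limits below are hypothetical illustrations, not values prescribed by this Compendium.

```python
# Minimal sketch: screen batch-specific PT sample results against
# project acceptance criteria established at scoping. All analytes,
# concentrations, and limits are hypothetical.

# Acceptance criteria: analyte -> (assigned value ug/L, lower %R, upper %R)
criteria = {
    "benzene": (20.0, 70.0, 130.0),
    "lead": (50.0, 80.0, 120.0),
}

# Laboratory-reported PT sample results (ug/L)
reported = {"benzene": 17.0, "lead": 62.0}

for analyte, (assigned, low, high) in criteria.items():
    recovery = reported[analyte] / assigned * 100.0
    status = "PASS" if low <= recovery <= high else "FAIL"
    print(f"{analyte}: {recovery:.0f}% recovery ({status})")
```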
2.2.3.2 Availability
The workgroup felt that in order for a batch-specific PT sample to be useful, it should be in a medium
similar to that of the environmental samples being tested (e.g., a solid or aqueous medium). Ideally,
it should also contain contaminants of concern at the site and, if possible, concentrations of concern
at the site. The IDQTF recognizes that the number of PT samples available that fit each site's
specifications may be limited. Therefore, although the QA matrix lists batch-specific PT samples
as a minimum activity, full implementation of this may take some time. Project-specific
implementation of this activity should be conducted using common sense, recognizing that initially
it will not always be possible to meet the specifications.
[1] If the project team agrees to the use of site-specific PT samples (because they are
already in use and readily available, such as at DOE sites), it is not necessary to analyze an
additional ampulated or full-volume PT sample.
2.2.3.3 Cost and Effectiveness Issues
The IDQTF recognizes that the specification for batch-specific PT samples can add significant
additional cost to the project's analytical budget. Therefore, batch-specific PT samples
can be used as a replacement for step IIa of data validation of analytical laboratory results.
Discussion of the use of PT samples to streamline data validation is contained in Section 2.3.3.2.
2.3 Stages of Data Collection, Analysis, and Use
QA/QC activities in the matrix are grouped by five project stages: planning, field sampling, on-site
field measurements, off-site/fixed lab measurements, and data review. Although field sampling, on-
site field measurements, and off-site/fixed lab measurements have similar QA/QC activities, the
difference is that the first stage deals with collection of samples, while the latter two stages deal with
sample analysis. Thus, if a QA/QC activity is a minimum activity in one stage, it will also be a
minimum activity in the other two.
2.3.1 Planning
The planning stage in the QA matrix reflects the scoping and preparation stage of a project. This
stage has specifications for a systematic planning process, sampling design rationale, development
and approval of a QAPP, scheduling, and training. For the convenience of the matrix user, planning
activities involving definitive data and screening data are listed separately in the tables; however,
the matrix organization does not demand two different processes (meetings, QAPPs, etc.). The
planning stage for a single project should be one coordinated effort that addresses the use of both
screening data and definitive data.
2.3.2 Sampling and Analysis
The sampling and analysis phases of the QA matrix are separated into three sections. Field
sampling activities take place during sample collection. On-site field measurements occur when
analysis is performed on-site, such as in-situ testing (e.g., with a temperature probe), on-site analysis
(e.g., turbidity readings), and field trailer/mobile lab analysis. Off-site/fixed lab measurements
occur when analysis is performed in an off-site laboratory. The sampling and analysis stages include
minimum specifications for specific QC samples such as field blanks or matrix spikes. In addition,
these stages have guidelines for preparation, such as inspection and maintenance of supplies, and
guidelines for review or oversight (e.g., internal/external audits). The three stages have almost
identical QA/QC activities, but the CERCLA phase and data use activities will differ depending on
whether the data type is definitive or screening.
2.3.3 Data Review
The QA matrix outlines a variety of activities for data review. Because EPA, DoD, and DOE define
data review steps differently, the workgroup forged a common understanding of the components
of data review.
2.3.3.1 Data Review - Definitions and Scope
The workgroup defined data review as:
The process of examining and/or evaluating data to varying levels of detail and
specificity by a variety of personnel who have different responsibilities within the
data management process. It includes verification, validation, and usability assessment.
The data review process is separated into three steps. Each of these three steps has specific activities
that should be completed by a data reviewer.
Step I: Verification (completeness check) - Confirmation by examination and provision
of objective evidence that the specified requirements (sampling and analytical) have been
completed.

Step II: Validation (IIa - Compliance with methods, procedures, and contracts; IIb -
Comparison with quality objectives of QAPP) - Confirmation by examination and
provision of objective evidence that the particular requirements for a specific intended use
are fulfilled. Validation is a sampling and analytical process evaluation that includes
evaluating compliance with method, procedure, or contract requirements, and extends to
evaluation of criteria based upon the quality objectives (e.g., PQOs) developed in the QAPP.
The purpose of validation is to assess and document the performance associated with the
sampling and analysis to determine the quality of specified data.

Step III: Usability assessment - Determination of the adequacy of data, based on the results
of validation and verification, for the decisions being made. The usability step involves
assessing and documenting whether the process execution and resulting data meet project
quality objectives documented in the QAPP.
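As a minimal illustration of the Step I completeness check (one small part of verification, which also covers field records, chain-of-custody forms, and other inputs), the sketch below flags planned samples that have no reported result; the sample IDs are hypothetical.

```python
# Minimal sketch of a Step I (verification) completeness check:
# confirm that every sample planned in the QAPP has a reported
# result, and flag anything missing. Sample IDs are hypothetical.

planned_samples = {"SS-01", "SS-02", "SS-03", "GW-01"}
reported_samples = {"SS-01", "SS-02", "GW-01"}

missing = sorted(planned_samples - reported_samples)
if missing:
    print("Verification: incomplete; missing results for:", ", ".join(missing))
else:
    print("Verification: all planned samples reported.")
```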
Table 2 describes the objectives, scope, and steps of data review associated with each process term.
The table identifies where the scope of the terms or the steps involved in the process are expansions
of current practice. Those expansions encompass the following:
The terms verification and validation apply to field sampling activities as well as to the
analytical component of data generation.
Validation assesses not only compliance with method, procedure, and contract requirements,
but also assesses compliance with QAPP-specific requirements.
Usability assessments are one of the minimum activities of data review for all CERCLA
phases and data uses. This is the final step of data review, and as such, it assesses whether
the data are suitable as a basis for decisions.
Table 2. Data Review Process Summary

Verification
  Objective: Review to see if data required for the project are available.
  Scope: Sampling*; Analysis.
  Data Review Step: I. Completeness check.

Validation
  Objective: Assess and document performance of the field sample collection process; assess and document performance of the analytical process.
  Scope: Sampling*; Analysis.
  Data Review Steps: IIa. Check compliance with method, procedure, and contract requirements. IIb. Compare with measurement performance criteria from the QAPP*.

Usability Assessment*
  Objective: Assess and document usability to meet project quality objectives.
  Scope: Sampling; Analysis.
  Data Review Step: III. Assess usability of data by considering project quality objectives and the decision to be made*.

* The scope of the term or the step involved is an expansion of current practice.
2.3.3.2 Implementation of Data Review Activities
Specifications for the implementation of the data review process acknowledge two important issues:
Data review should take into account the relationship of the data reviewer to the entity that
originally performed the work.
Data review steps can be streamlined in a variety of ways.
Relationship of Data Reviewer to Generation of Data
Implementation of the data review process should take into account the relationship of the data
reviewer to the entity that performed the work (generated the data). This relationship requires a
balance between maintaining the integrity of the process (e.g., the entity that generates the
analytical or field data may have a conflict of interest in conducting the review and therefore may
be precluded from performing it) and ensuring that those with the appropriate expertise are
involved in each data review step. The relationship of the data reviewer to each step of the data review
process is described below:
Step I (verification): Both the data generator and client are expected to perform data
verification.
Step II (validation):
- Step IIa (Compliance with Methods, Procedures, and Contracts). Validation associated
with step IIa should be conducted by an entity at least one step removed from the entity
that generated the data (field or analytical). In general, this means that validation step
IIa of analytical data will be conducted outside the laboratory, while validation of the
field sampling activities will be conducted by entities working for the prime contractor
who are not responsible for the field sampling activities.
- Step IIb (Comparison to Quality Objectives of QAPP). Validation step IIb will usually
involve those who have been involved in the development of the QAPP and/or the
project, but may also include a reviewing entity that is separate from the entities
conducting the work.
Step III (usability assessment): The usability assessment should be performed by the full
project team, although it may also involve people not involved in project execution.
Figure 2 presents a flow chart of the data review process and the potential participants in each step.
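The "one step removed" rule for validation step IIa lends itself to a simple membership check. The
sketch below is a hypothetical illustration of that rule, not an implementation prescribed by this
compendium; the entity names and the responsibilities mapping are invented for the example.

# Minimal sketch of the step IIa independence rule (hypothetical names).
def may_perform_step_iia(reviewer, data_generator, responsibilities):
    """A reviewer may conduct step IIa validation only if it is at least one
    step removed from the entity that generated the data: it must not be the
    generator and must not be responsible for the generator's work."""
    if reviewer == data_generator:
        return False
    return data_generator not in responsibilities.get(reviewer, set())

# Example: the laboratory cannot validate its own analytical data, but a
# prime-contractor reviewer with no responsibility for the lab's work can.
responsibilities = {"lab_qc_group": {"laboratory"}, "prime_validator": set()}
assert may_perform_step_iia("prime_validator", "laboratory", responsibilities)
assert not may_perform_step_iia("laboratory", "laboratory", responsibilities)
assert not may_perform_step_iia("lab_qc_group", "laboratory", responsibilities)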
Streamlining Data Review
Streamlining of the data review process is meant to reduce time and costs while still confirming
the quality of the data. Thus, any streamlining option developed and documented by the project
team should recognize that:
The type and amount of data reviewed should be sufficient to develop a clear understanding
of the quality of the data.
The practice of reviewing a subset of data (or of a data indicator such as a successful PT
sample) as a substitute for review of all data should be reevaluated if problems are detected
that call into question the quality of a data set.
Streamlined data review occurs when efficiencies are created in the data review process by:
Looking at a subset of data that is representative of a larger universe.
Examining the data in an alternate manner (e.g., use of a successful batch of specific PT
samples as a substitute for validation of some compliance indicators).
Different EPA Regions, DoD components, and DOE facilities have negotiated a variety of
streamlining options with different projects. The decision as to the nature and type of streamlining
to be conducted will be made on a site-by-site or facility-by-facility basis and documented in the
QAPP. The QAPP should also contain decision criteria that allow for revision of the initial
streamlining plan. For example, decision criteria contained in the QAPP could specify that if
problems are identified in the investigation, then streamlining cannot occur. Other factors may also
lead to a revision of the initial streamlining decision, such as intense political interest and concern
on the part of the community. The QAPP should contain a clause that prohibits streamlining when
conditions no longer support it.
As noted in Section 2.2.3, Proficiency Testing Samples, the specification for batch-specific PT
samples was added to foster streamlining of validation. In order for this streamlining activity to be
implemented, the project team should agree, at the project scoping stage, on the contaminants and
concentrations to be included in the batch-specific PT sample (ampulated or full volume), as well
as the acceptance criteria by which the results will be evaluated. If the established criteria are
achieved, then it will not be necessary to conduct validation activities (step IIa) on analytical data.
Validation step IIa for sampling activities will still be needed, since batch-specific PT samples
cannot substitute for validation of those activities. Section 5.3 of the UFP-QAPP Manual contains
further criteria for and direction on streamlining opportunities.
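The batch-specific PT streamlining decision reduces to a small piece of logic: if the PT results for
a batch meet the acceptance criteria agreed on at scoping, step IIa validation of the analytical data
may be waived, while step IIa validation of sampling activities is always retained. The following
sketch assumes hypothetical analyte names and a simple target-plus-tolerance form of acceptance
criterion; the actual criteria are whatever the project team documents in the QAPP.

# Minimal sketch of the PT-based streamlining decision (hypothetical names).
def validation_plan(pt_results, acceptance_criteria):
    """Return which step IIa validation activities remain for a batch."""
    pt_ok = all(
        abs(pt_results[analyte] - target) <= tolerance
        for analyte, (target, tolerance) in acceptance_criteria.items()
    )
    return {
        "step_iia_analytical": not pt_ok,  # waived only when the PT sample passes
        "step_iia_sampling": True,         # never waived by a PT sample
    }

# Example criteria agreed at scoping: analyte -> (target conc., tolerance).
criteria = {"benzene": (50.0, 10.0), "lead": (100.0, 20.0)}
print(validation_plan({"benzene": 47.2, "lead": 108.5}, criteria))
# -> {'step_iia_analytical': False, 'step_iia_sampling': True}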
Figure 2. Potential Participants in the Data Review Process

[Flow chart, not reproducible in this rendering. It traces the data review steps - verification, validation IIa, validation IIb, and the data usability assessment - and the potential participants in each: field team members and the prime contractor; the laboratory and its laboratory QC group; prime contractor staff independent of the field crew; a validation subcontractor or independent review(3); key members of the project planning team or independent review(3); and the full project planning team. Each step poses a question ("Are all inputs complete?", "Do results comply with methods, procedures, and contracts?", "Are the quality objectives of the QAPP met?", "Are the data adequate to support the decisions being made?") and produces outputs (the laboratory analytical data verification report(2) and a data validation report that addresses the performance of the field and analytical data(2)) before data review is complete.]

1. Data review of field activity and laboratory analysis can occur separately or at the same time and by the same personnel, if appropriate.
2. Does not have to be a separate report; may be part of the RI/FS or other document.
3. Determined by the project team.
NOTE: A "no" answer to the questions will result in corrective action, flagging, or contact with the client.
2.4 Organization of the QA Matrix Based on the CERCLA Process
The QA matrix uses a combination of CERCLA phases and data uses as organizing principles:
CERCLA phases (e.g., remedial investigation) are used to organize investigation
activities of the CERCLA process.
Data uses organize the post-ROD or post-construction phases of the CERCLA process.
This was in part because DoD, DOE, and EPA use different terms to describe the same
CERCLA phase, and because the construction phases have unique activities associated
with them.
2.4.1 Investigation Phases
Below is a list of the CERCLA phases for investigation or preconstruction activities, a brief
explanation of the purpose of each phase, and the data uses that fall under each phase. In the phases
for which data use subcategories are not warranted, only the CERCLA phase appears in the matrix
(such as for site inspection).
Preliminary Assessment (PA) - uses only secondary data (interviews, observations,
historical data, etc.). These data are used to assess the likelihood of contamination in an
area and the probable nature of the contamination. This phase requires only screening
data. Since no new data are generated during this phase, it differs from the other phases
in that there are no QA/QC activities except under the data review and usability
assessment stages of the project.
Site Inspection (SI) - consists of on-site data gathering to determine whether there is
a release or potential release of contaminants and the nature of the associated threats to
human health and the environment. This phase uses both definitive and screening data
and is not divided by data uses. SI data may be used for the following outcomes or
decisions: removal, further investigation, hazard ranking system (HRS), or no further
action. Decisions for further investigation and removal can be made with screening data
if they are intermediate decisions; however, the other decisions should use definitive data
only.
Remedial Investigation (RI) - consists of data gathering undertaken by the lead agency
or responsible party to determine the nature and extent of risk to human health or the
environment due to a contaminant release or potential release. In addition, the data
collected during the RI phase may be used to assess various remediation alternatives.
The RI phase emphasizes data collection and site characterization, and both screening
and definitive data can be used. The QA/QC activities for screening data are not
differentiated by data use; for definitive data, however, the activities are differentiated
among four data uses: nature of contamination, extent of contamination, human health
risk assessment, and ecological risk assessment.
Feasibility Study (FS): Extent of Contamination - is undertaken by the lead agency
to develop and evaluate options for a remedial action. The only additional type of data
collection and use in the FS phase is for understanding the extent of contamination.
Otherwise the study depends on data collected during other CERCLA phases (generally
the RI).
Treatability Studies - are performed at bench or pilot scale. These studies model the
treatment system to simulate the remedial technology selected to treat the waste
encountered at the site. Bench-scale studies use a composition similar to the waste on
a site in order to demonstrate whether the system is effective. Pilot studies use real
waste found at a treatment site in order to model parameters for performance, design, and
operation. The two types of treatability studies have identical QA/QC activities.
Screening and definitive data may be used at different stages of the treatability study
process. Screening data can be used to provide daily operational information or when
the process is reaching steady state.
Non-Time-Critical (NTC) Removal - the phase that involves implementation and
evaluation of removal actions that can be initiated more than 6 months after
contamination is identified. An engineering evaluation/cost analysis (EE/CA) is used
to determine risk and select cleanup alternatives in this phase. (Note: Because of the
project-specific nature of emergency [time-critical] cleanups, it is not possible to define
QA/QC activities for time-critical removal actions that would allow for the necessary
amount of flexibility. Minimum QA/QC activities for time-critical removals should be
defined on a project-specific basis. For this reason, only NTC removals are addressed
in the matrix.)
2.4.2 Construction, Post-construction, and Post-ROD Phases
Construction is the project phase during which a remedial action is implemented. Some projects do
not require construction but may require monitoring or other activities using sampling and analysis.
The post-construction and post-ROD phases include monitoring and operation and maintenance
(O&M) of the remedial action. To examine the critical processes and QA/QC activities for these
phases, the workgroup established common nomenclature for the different stages and milestones in
the process. Once the proper correlation between EPA and DoD terms was made, all QA/QC needs
were identified for each phase of the process. (Note: In these phases, activities are differentiated by
data use, rather than CERCLA phase.)
Table 3 presents the major construction, post-construction, and post-ROD phases, with a brief
summary of the purpose and data uses that would be applicable to each phase. To demonstrate the
correlation in nomenclature between EPA and DoD, timelines of remedy scenarios (treatment and
off-site disposal, removal and off-site disposal, containment, and groundwater and surface water
restoration) were illustrated (see Figure 3).
Table 3. Steps of the Major Construction, Post-construction, and Post-ROD Phases

Purpose: Build the remedy (uses engineering data only).
Data Uses: Not applicable to the QA matrix
EPA Term: Remedial action (RA)
DoD Term: Remedial action-construction (RA-C)

Purpose: Determine if the remedy is working correctly.
Data Uses: Process control; Process analysis
EPA Term: Operational and functional, and construction complete
DoD Term: Operating properly and successfully (OPS), and remedy in place (RIP)

Purpose: Verify that the cleanup goal has been achieved.
Data Uses: Confirmation sampling
EPA Term: Final RA report
DoD Term: Response complete

Purpose: Implement and maintain the remedy over an extended period of time.
Data Uses: Compliance monitoring; Process control; Monitoring program effectiveness and optimization
EPA Term: Long-term remedial action (LTRA), and operation and maintenance (O&M)
DoD Term: Remedial action-operation (RA-O), and long-term monitoring (LTM)

Purpose: Ensure that the remedy remains effective.
Data Uses: Monitoring program effectiveness and optimization; Effectiveness monitoring; Compliance monitoring
EPA Term: Operation and maintenance (O&M) or five-year review
DoD Term: Long-term monitoring (LTM)

Purpose: Ensure that site cleanup goals and monitoring requirements have been met.
Data Uses: No new data collected; uses data from long-term monitoring or response-complete phases.
EPA Term: Deletion
DoD Term: Site closure
Figure 3. Example Remedy Scenarios and Comparable EPA/DoD Terminology

[Timelines, not reproducible in this rendering, for three example remedy scenarios: on-site soil treatment and off-site disposal, a containment remedy, and groundwater and surface water restoration. Each timeline aligns DoD terms (remedial action-construction (RA-C), operating properly and successfully (OPS)(b), remedy in place (RIP), remedial action-operation (RA-O), long-term monitoring (LTM)(a), response complete, final closeout report, site closure) with the comparable EPA terms (remedial action (RA), operational and functional, construction completion (CC), interim and final RA reports, preliminary closeout report, long-term remedial action (LTRA)(d), operation and maintenance (O&M), five-year review(a) (ongoing), deletion).]

a If level greater than that of unrestricted use.
b BRAC.
c If final RA at NPL installation.
d 10 years, if fund-financed.
From this identification exercise, the workgroup distilled six unique data uses: process analysis,
process control, confirmation sampling, monitoring program effectiveness and optimization,
compliance determination, and effectiveness monitoring. These data uses are the major categories
for the construction, post-construction, and post-ROD phases in the matrix. They are explained in
further detail below.
Construction: Process Analysis - a construction data use, falling under the
operational and functional phase, that involves testing to determine whether the
remedy is operating properly and as designed. Data generated here may eventually
lead to the commissioning decision of remedy in place (RIP). Both screening and
definitive data are acceptable for the process analysis data use. Definitive data have
the highest possible level of data quality and should be used for the final OPS
(operating properly and successfully, DoD) determination, as well as the RIP
decision. Screening data can be used for intermediate decisions.
Construction: Process Control - is a construction data use that takes place during
the RA-O/O&M and operational and functional phases. During the operational and
functional phase, sophisticated process control is used to prove that the selected
technology can operate properly to reach cleanup levels and to optimize the remedy.
In the RA-O/O&M phase, process control is used to monitor influent, effluent, and
secondary waste streams; optimize the remedial system; and troubleshoot and
monitor natural attenuation. Process control specifications apply to both constructed
and in-situ remedies. Process control does not demand the highest level of data
quality and is therefore not appropriate for compliance testing. During process control,
parameters may be measured in the field, but not necessarily at a fixed analytical
laboratory. Both screening and definitive data can be used for process control.
Construction and Post-construction: Confirmation Sampling - is both a
construction and post-construction data use that needs definitive data. The data are
used to substantiate a response-complete decision (cleanup goal achieved) and may
be used to confirm the results of process analysis. The highest level of data quality is
demanded. For example, confirmation sampling should be undertaken when
excavation is nearing completion to determine that cleanup levels have been reached.
It also should be used to confirm that treatment goals have been achieved.
Post-construction: Monitoring Program Effectiveness/Optimization - uses both
screening and definitive data and is used in the RA-O/O&M (including monitored
natural attenuation) and long-term monitoring phases. The data generated are used to
examine the sampling frequency, content, and location of long-term monitoring
programs. Data may be used to reduce the frequency of sampling or to refine the
analyte list. They may also be used to track trends and model the change in
contamination.
Post-construction: Compliance Determination - uses definitive data to confirm
that specific regulatory criteria are met. The data are used to measure contaminant
levels in effluent, secondary waste streams, disposed waste, backfill, and so forth
during the RA-O/O&M phase. The data are also used for permit compliance and
monitored natural attenuation. Data generated for compliance determination during
the long-term monitoring phase can be used for waste disposal, permit compliance,
and five-year reviews. The highest level of data quality is needed for this data use.
Post-construction and Post-ROD: Effectiveness Monitoring - uses definitive data
during the long-term monitoring phase to examine whether a no-action decision
remains appropriate and to model and evaluate trends in contamination levels
associated with both action and no-action decisions. The data are used to evaluate
whether the remedy is operating properly and efficiently and to determine if there is
any benefit to changing operational parameters. The highest level of data quality is
needed for this data use.
3.0 THE QA MATRIX: MINIMUM ACTIVITIES FOR QA/QC UNDER CERCLA
The QA matrix in this section lays out the minimum activities to be used on CERCLA projects. This
section identifies the QA/QC activities appropriate to the CERCLA phase, data use, project stage,
and type of data (screening or definitive) being generated. For a complete understanding of the
QA/QC activities specified, refer to Section 2 of this compendium. The minimum specifications
presented in the QA matrix should be used in conjunction with the development of a project-specific
QAPP and project quality objectives.
3.1 Reading the Matrix
The matrix is organized first according to investigation and construction/post-construction/ post-
ROD phases. The first set of tables is for the investigation phases, the second set is for post-
investigation (construction, post-construction, and post-ROD) phases. CERCLA phases (SI, RI,
construction, etc.) and appropriate data uses are listed across the top of the matrix (the columns of
the matrix). Those CERCLA phase-data use combinations are differentiated by data type, that is,
as screening or definitive. Each table is divided into project stages (planning, sampling, usability,
etc.). Each project stage has a list of specific QA/QC activities, which make up the rows of the
matrix. Therefore each cell in the matrix represents a CERCLA phase (and data use) and QA/QC
activity. For each CERCLA phase-data use combination, QA/QC activities that should be
performed in order to meet minimum data quality guidelines are marked with a check ( ^ ). Those
QA/QC activities that are not minimum specifications for a specific CERCLA phase-data use
combination are identified by a dash (-).
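Read this way, each cell of the matrix is addressed by four coordinates: CERCLA phase, data use,
data type, and QA/QC activity. The sketch below shows one hypothetical way to hold and query
such a structure programmatically; the few entries shown are invented examples, not a
transcription of Tables 4 and 5.

# Minimal sketch of a programmatic QA matrix lookup (hypothetical entries).
QA_MATRIX = {
    # (CERCLA phase, data use, data type, activity ID): required?
    ("RI", "extent of contamination", "definitive", "S12"): True,
    ("RI", "extent of contamination", "screening", "S16"): False,
    ("SI", None, "screening", "P4"): True,
}

def minimum_activities(phase, data_use, data_type):
    """List the activity IDs checked (required) for one matrix column."""
    return sorted(
        act
        for (p, use, dtype, act), required in QA_MATRIX.items()
        if required and (p, use, dtype) == (phase, data_use, data_type)
    )

print(minimum_activities("RI", "extent of contamination", "definitive"))
# -> ['S12']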
3.2 Key Definitions
During the development of the QA matrix, the workgroup created or refined definitions of terms
used in this compendium. Matrix users should become familiar with the glossary in Appendix B
to ensure consistent understanding and application of those terms. Definitions for specific QC
samples and other terminology used throughout the document can be found in the glossary and
occasionally in the text.
3.3 Applying the Matrix
The activities presented in the matrix are minimum QA/QC activities for the collection and analysis
of data at CERCLA sites. The purpose of a minimum set of activities is to streamline the planning
and QAPP-writing process. With the baseline specifications established by the QA matrix, a project
team can begin establishing project-specific quality objectives and identifying the specific
acceptance criteria that relate to the PQOs.
Data quality is a project-specific variable that can be defined using the systematic planning process
for a specific project. A project team may determine that, based on project-specific needs, other
QA/QC activities should be added. The QA matrix supplements the UFP-QAPP Manual and is
meant to complement the specifications of that document. Beyond the actual QA/QC activities in
the matrix, additional descriptive information appropriate to the project-specific quality objectives,
such as sampling frequency, sample location, and acceptance criteria, should be developed.
During the pre-QAPP coordination stage of the planning process (i.e., planning or scoping session),
stakeholders should determine which CERCLA phases and data uses are relevant to the site and
what type of data will be collected. Then they should refer to the matrix for the list of minimum
QA/QC activities appropriate for those CERCLA phases and data uses. At that point, other QA/QC
activities may be added, as appropriate, and the details of how to implement all the QA/QC activities
should be defined (i.e., what the specific procedures and criteria will be for each activity).
3.4 QA Matrix
In the QA matrix that follows, the first set of tables is for the investigation phase and includes the
following CERCLA phases:
Preliminary assessment
Site inspection
Remedial investigation
Feasibility study
Treatability study
Non-time-critical removals
Site inspections, non-time-critical removals, remedial investigations, feasibility studies, and
treatability studies may use both screening and definitive data; however, preliminary assessments
use only screening data.
The second set of tables is for post-investigation phases, that is, the construction, post-construction,
and post-ROD phases. The construction phase involves two different data uses: process analysis and
process control. The following data uses are possible during the post-construction phase:
Confirmation sampling
Compliance determination
Monitoring program effectiveness/optimization
Effectiveness monitoring
Both screening data and definitive data are allowed for construction (process analysis and process
control) and for monitoring program effectiveness/optimization during post-construction. Only
definitive data are acceptable for confirmation sampling, compliance determination, and
effectiveness monitoring during post-construction activities.
Table 4. QA Matrix: Investigation Phases

Data Use: [rotated column headings identifying the CERCLA phases and data uses did not survive extraction; see Section 3.4 for the phases covered]

QA/QC Activity | SCREENING | DEFINITIVE

[NOTE: The per-column entries (✓ = activity is required; - = activity is not required) are not recoverable from this rendering; only the QA/QC activity descriptions are reproduced below.]
PLANNING
P1. Requirement for a Systematic Planning Process (e.g.,
establishment of Project Quality Objectives [PQOs],
Measurement Quality Objectives [MQOs])
P2. Requirement for a Pre-QAPP Coordination with all
Stakeholders (e.g., planning or scoping session)
P3. Requirement for a Sampling Design Rationale (including
both rationale for sampling locations and techniques)
P4. Requirement for a Quality Assurance Project Plan
(QAPP) **
P5. Requirement for internal QAPP review and approval
procedures
P6. Requirement for QAPP modifications and/or change-
control procedures **
P7. Documented Schedule and Budget
P8. Documented Project Personnel Training, Education, and
Experience Criteria and Standard Operating Procedures
(SOPs) for verifying qualifications (Includes identification of
all project-specific appropriate Quality Assurance [QA]
Personnel, including QA Officer)
"
"
"
-
** Can be found in other documents (e.g., O&M plan, treatability plan). It does not need to be a
stand-alone document.
Applies to stand-alone QAPPs and those that are part of larger documents.
✓ = Activity is required; - = Activity is not required
FIELD SAMPLING
S1. Requirement for inspection and acceptance of supplies
S2. Criteria for field sampling supplies, preservation, and
sample container quality control (including verification
SOPs)
S3. Description of equipment/instrumentation and personnel
qualifications
S4. Criteria for acceptable field sampling equipment
performance
S5. Requirement for maintenance and verification of
ongoing acceptable field sampling equipment performance
(including SOPs, e.g., verification of testing, inspection,
calibration)
S6. Criteria for cleaning and decontamination of field
sampling equipment (including SOPs)
S7. Requirement for project-specific field sampling
performance criteria
S8. Requirement for documentation and record keeping (e.g.,
Field logbooks)
S9. Procedures (e.g., SOPs, workplan) for field sampling
management (Documentation to answer "Is the sample
traceable?", e.g., sample location, transport, storage, and
shipping procedures)
FIELD SAMPLING (CONT'D)
S10. Procedures (e.g., SOPs, workplan) for field sample
collection (Documentation to answer "Was the sample
collected properly?", e.g., sample collection methods)
S11. Documentation of data management, handling, and
tracking controls
S12. Requirement for Collocated Field duplicate samples
(including acceptance criteria)
S13. Requirement for Field blanks (for dedicated and non-
dedicated equipment)
S14. Requirement for Field equipment or rinse blanks (for
non-dedicated equipment)
S15. Requirement for Cooler Temperature Indicator
(**Perform as necessary**)
S16. Requirement for Matrix spike (MS) (inorganics only)
(including acceptance criteria)
S17. Requirement for Internal Pre-startup readiness review
S18. Requirement for Internal field sampling audits and/or
oversight
S19. Requirement for External field sampling audits and/or
oversight
S20. Requirement for Positive Control Sample, if required
by measurement criteria
ON-SITE FIELD MEASUREMENTS
F1. Requirement for inspection and acceptance of supplies
F2. Criteria for field supplies, calibration standards, and
sample container quality control
F3. Descriptions of equipment/instrumentation and personnel
qualifications
F4. Criteria for acceptable field measurement/analysis
equipment/instrument performance
F5. Requirement for maintenance/verification of field
measurement/analysis equipment performance
F6. Criteria for cleaning and decontamination of field
measurement/analysis equipment
F7. Requirement for project-specific measurement/analysis
performance criteria
F8. Documentation of measurement/analysis quality system
F9. Requirement for documentation and record keeping (e.g.,
analyst logs, field logs)
F10. Procedures (e.g., SOPs, workplan) related to sample
management (Documentation to answer "Is the measurement
traceable?")
F11. Procedures (e.g., SOPs, workplan) related to sample
measurement/analysis (including preparation and cleanup)
"
"
"
--
"
^
/
/
/
"
/
/
/
/
/
/
/
/
/
/
*
/
= Activity is required; = Activity is not required
QA/QC Compendium, March 2005
36
Final
-------
ON-SITE FIELD MEASUREMENTS (CONT'D)
F12. Description of data deliverables/data package content,
generation, and data review procedures
F13. Documentation of data management, handling, and
tracking controls
F14. Requirement for Laboratory Duplicate samples to
measure intralaboratory precision
F15. Requirement for Matrix Spike (MS) (inorganics only)
F16. Comparability criteria for confirmatory analyses
(compare screening and definitive data)
F17. Requirement for Confirmatory Analyses
F18. Requirement for Proficiency Testing samples - Batch-
Specific
F19. Requirement for pre-startup measurement/ analysis
readiness review
F20. Requirement for internal measurement/analysis audits
and/or oversight
F21. Requirement for external measurement/analysis audits
and/or oversight
F22. Requirement for Positive Control Sample, if required
by measurement criteria
"
--
"
"
v
/
*
/
*
/
--
*
/
V*
V*
yf
^
\s
y*
--
= Activity is required; = Activity is not required
QA/QC Compendium, March 2005
37
Final
-------
OFF-SITE/FIXED LAB MEASUREMENTS
L1. Requirement for inspection and acceptance of supplies
L2. Criteria for supplies, calibration standards, and sample
container quality control (including verification SOPs)
L3. Descriptions of equipment/instrumentation and
personnel requirements
L4. Criteria for acceptable laboratory equipment/ instrument
performance
L5. Requirement for maintenance and verification of
ongoing acceptable laboratory equipment/instrument
performance (including SOPs)
L6. Criteria for cleaning and decontamination of equipment
and instruments (including SOPs)
L7. Requirement for project-specific measurement/ analysis
performance criteria
L8. Documentation of a laboratory quality system (e.g.,
Laboratory QA Manual)
L9. Requirement for documentation and record keeping
(e.g., analyst logs)
L10. Procedures (e.g., SOPs, workplan) related to laboratory
sample management (Documentation to answer "Is
measurement traceable?")
"
"
"
= Activity is required; = Activity is not required
QA/QC Compendium, March 2005
38
Final
-------
OFF-SITE/FIXED LAB MEASUREMENTS (CONT'D)
L11. Procedures (e.g., SOPs, workplan) related to sample
analysis, including preparation and cleanup (Documentation
to answer "Was the measurement in control?")
L12. Description of data deliverables/data package content,
generation, and data review procedures
L13. Documentation of data management, handling, and
tracking controls
L14. Requirement for Laboratory Duplicate samples
(including acceptance criteria)
L15. Requirement for Matrix Spike (MS) (inorganics only)
(including acceptance criteria)
L16. Comparability criteria for confirmatory analyses
(Includes comparisons between screening and definitive data
and/or two definitive methods reflected through separate
columns of the matrix)
L17. Requirement for Confirmatory Analyses
L18. Requirement for Proficiency Testing samples - Pre-
qualification (including acceptance criteria)
L19. Requirement for Proficiency Testing samples - Batch-
specific (including acceptance criteria)
L20. Requirement for Pre-startup laboratory audits/
readiness reviews
OFF-SITE/FIXED LAB MEASUREMENTS (CONT'D)
L21. Requirement for Internal laboratory audits and/or
oversight
L22. Requirement for External laboratory audits and/or
oversight
L23. Requirement for Positive Control Sample, if required
by measurement criteria
"
*
DATA REVIEW
D1. Laboratory internal data review SOPs
D2. Laboratory data deliverable requirements (specifications
for hard copy and/or electronic data deliverables) - Tabular
sample results with QC results
D3. Laboratory data deliverable requirements (specifications
for hard copy and/or electronic data deliverables) - Tabular
sample results, QC results, and raw data
D4. Requirement for internal laboratory verification of
meeting data deliverable requirements and project-specific
MQO requirements
D5. Requirement for verification (completeness review) of
sampling and analytical data, and other data deliverables.
D6. Requirement for review of findings from verification
and preparation of report
DATA REVIEW (CONT'D)
D7. Requirement for validation (assessment of sampling and
analytical data against technical requirements)
D8. Criteria for validation
D9. Requirement for documentation of results of validation
(e.g., exceedances and exceptions)
D10. Requirement for (regulatory) review of data assessment
report
D11. Requirement to reconvene project team (see P2) to
perform usability assessment
D12. Requirement for usability assessment and
documentation of results by project team
D13. Requirement for preparation and review of final
usability report
Table 5. QA Matrix: Post-ROD Phases

Data Use: [rotated column headings identifying the post-ROD data uses did not survive extraction; see Section 3.4 for the data uses covered]

QA/QC Activity | SCREENING | DEFINITIVE

[NOTE: As in Table 4, the per-column entries (✓ = activity is required; - = activity is not required) are not recoverable from this rendering; only the QA/QC activity descriptions are reproduced below.]
PLANNING
P1. Requirement for a Systematic Planning Process (e.g., establishment
of Project Quality Objectives [PQOs], Measurement Quality Objectives
[MQOs])
P2. Requirement for a Pre-QAPP Coordination with all Stakeholders
(e.g., planning meeting, scoping meeting)
P3. Requirement for a Sampling Design Rationale (including both
rationale for sampling locations and techniques)
P4. Requirement for a Quality Assurance Project Plan (QAPP) **
P5. Requirement for internal/external QAPP review and approval
procedures
P6. Requirement for QAPP modifications and/or change-control
procedures **
P7. Documented Schedule and Budget
P8. Documented Project Personnel Training, Education, and Experience
Criteria and Standard Operating Procedures (SOPs) for verifying
qualifications (Includes identification of all project-specific appropriate
Quality Assurance [QA] Personnel, including QA Officer)
** Can be found in other documents (e.g., O&M plan, treatability plan). It does not need to be a
stand-alone document.
Process Control: Requirement for internal QAPP review and approval procedures only.
Applies to stand-alone QAPPs and those that are part of larger documents.
FIELD SAMPLING
S1. Requirement for inspection and acceptance of supplies
S2. Criteria for field sampling supplies, preservation, and sample
container quality control (including verification SOPs)
S3. Description of equipment/instrumentation and personnel
qualifications
S4. Criteria for acceptable field sampling equipment performance
S5. Requirement for maintenance and verification of ongoing acceptable
field sampling equipment performance (including SOPs, e.g., verification
of testing, inspection, calibration)
S6. Criteria for cleaning and decontamination of field sampling
equipment (including SOPs)
S7. Requirement for project-specific field sampling performance criteria
S8. Requirement for documentation and record keeping (e.g., Field
logbooks)
S9. Procedures (e.g., SOPs, workplan) for field sampling management
(Documentation to answer "Is the sample traceable?", e.g., sample
location, transport, storage, and shipping procedures)
S10. Procedures (e.g., SOPs, workplan) for field sample collection
(Documentation to answer "Was the sample collected properly?", e.g.,
sample collection methods)
FIELD SAMPLING (CONT'D)
S11. Documentation of data management, handling, and tracking
controls
S12. Requirement for Collocated Field duplicate samples (including
acceptance criteria)
S13. Requirement for Field blanks (for dedicated and non-dedicated
equipment)
S14. Requirement for Field equipment or rinse blanks (for non-dedicated
equipment)
S15. Requirement for Cooler Temperature Indicator (**Perform as
necessary**)
S16. Requirement for Matrix spike (MS) (inorganics only) (including
acceptance criteria)
S17. Requirement for Pre-startup readiness review
S18. Requirement for Internal field sampling audits and/or oversight
S19. Requirement for External field sampling audits and/or oversight
S20. Requirement for Positive Control Sample, if required by
measurement criteria
"
"
-
s
"
"
-
*
"
-
s
-------
ON-SITE FIELD MEASUREMENTS
F1. Requirement for inspection and acceptance of supplies
F2. Criteria for field supplies, calibration standards, and sample
container quality control
F3. Descriptions of equipment/instrumentation and personnel
qualifications
F4. Criteria for acceptable field measurement/ analysis
equipment/instrument performance
F5. Requirement for maintenance/verification of field measurement/
analysis equipment performance
F6. Criteria for cleaning and decontamination of field
measurement/analysis equipment
F7. Requirement for project-specific measurement/analysis performance
criteria
F8. Documentation of measurement/analysis quality system
F9. Requirement for documentation and record keeping (e.g., analyst
logs, field logs)
F10. Procedures (e.g., SOPs, workplans) related to sample management
(Documentation to answer "Is the measurement traceable?")
F11. Procedures (e.g., SOPs, workplans) related to sample
measurement/analysis (including preparation and cleanup)
ON-SITE FIELD MEASUREMENTS (CONT'D)
F12. Description of data deliverables/data package content, generation,
and data review procedures
F13. Documentation of data management, handling, and tracking
controls
F14. Requirement for Laboratory duplicate samples to measure
intralaboratory precision
F15. Requirement for Matrix spike (MS) (inorganics only)
F16. Comparability criteria for confirmatory analyses (compare
screening and definitive data)
F17. Requirement for Confirmatory Analyses
F18. Requirement for Proficiency Testing samples - Batch-Specific
F19. Requirement for pre-startup measurement/analysis readiness
review
F20. Requirement for internal measurement/analysis audits and/or
oversight
F21. Requirement for external measurement/analysis audits and/or
oversight
F22. Requirement for Positive Control Sample, if required by
measurement criteria
"
s
-
/
*
-
*/
s
"
/
"
"
-
"
-
-
"
"
/
*
"
-
/
/
-
s
-.
/
s
*
-------
OFF-SITE/FIXED LAB MEASUREMENTS
L1. Requirement for inspection and acceptance of supplies
L2. Criteria for supplies, calibration standards, and sample container
quality control (including verification SOPs)
L3. Descriptions of equipment/ instrumentation and personnel
requirements
L4. Criteria for acceptable laboratory equipment/ instrument
performance
L5. Requirement for maintenance and verification of ongoing acceptable
laboratory equipment/instrument performance (including SOPs)
L6. Criteria for cleaning and decontamination of equipment and
instruments (including SOPs)
L7. Requirement for project-specific measurement/analysis performance
criteria
L8. Documentation of a laboratory quality system (e.g., Laboratory QA
Manual)
L9. Requirement for documentation and record keeping (e.g., analyst
logs)
L10. Procedures (e.g., SOPs, workplan) related to laboratory sample
management (Documentation to answer "Is the measurement
traceable?")
OFF-SITE/FIXED LAB MEASUREMENTS (CONT'D)
L11. Procedures (e.g., SOPs, workplan) related to sample analysis,
including preparation and cleanup (Documentation to answer "Was the
measurement in control?")
L12. Description of data deliverables/data package content, generation,
and data review procedures
L13. Documentation of data management, handling, and tracking
controls
L14. Requirement for Laboratory duplicate samples (including
acceptance criteria)
L15. Requirement for Matrix Spike (MS) (inorganics only) (including
acceptance criteria)
L16. Comparability criteria for confirmatory analyses (Includes
comparisons between screening and definitive data and/or two definitive
methods - reflected through separate columns of the matrix)
L17. Requirement for Confirmatory Analyses
L18. Requirement for Proficiency Testing samples - Pre-qualification
(including acceptance criteria)
L19. Requirement for Proficiency Testing samples - Batch-specific
(including acceptance criteria)
OFF-SITE/FIXED LAB MEASUREMENTS (CONT'D)
L20. Requirement for Pre-startup laboratory audits/readiness reviews
L21. Requirement for Internal laboratory audits and/or oversight
L22. Requirement for External laboratory audits and/or oversight
L23. Requirement for Positive Control Sample, if required by
measurement criteria
DATA REVIEW
D1. Laboratory internal data review SOPs
D2. Laboratory data deliverable requirements (specifications for hard
copy and/or electronic data deliverables) - Tabular sample results with
QC results
D3. Laboratory data deliverable requirements (specifications for hard
copy and/or electronic data deliverables) - Tabular sample results, QC
results, and raw data
D4. Requirement for internal laboratory verification of meeting data
deliverable requirements and project-specific MQO requirements
D5. Requirement for verification (completeness review) of sampling and
analytical data, and other data deliverables
D6. Requirement for review of findings from verification and
preparation of report
DATA REVIEW (CONT'D)
D7. Requirement for validation (assessment of sampling and analytical
data against technical requirements)
D8. Criteria for validation
D9. Requirement for documentation of results of validation (e.g.,
exceedances and exceptions)
D10. Requirement for (regulatory) review of data assessment report
D11. Requirement to reconvene project team (see P2) to perform
usability assessment
D12. Requirement for usability assessment and documentation of results
by project team
D13. Requirement for preparation and review of final usability report
APPENDIX A
QC SAMPLES AND DATA QUALITY INDICATORS
TABLE A-1. QC SAMPLES THAT CONTRIBUTE TO DETERMINING PRECISION

QC Measurement | Frequency | Additional Cost?
Field Duplicates (FD) - Co-located | 5-10% | Y
Field Duplicates (FD) - Subsample | 5-10% | Y
Laboratory Duplicates (LD) | 5-10% (inorganics) | N
Matrix Spike Duplicates (MSD) | 5-10% (organics) | Y
Analytical Replicates (AR) | Variable | N
Internal Standards (IS) | 100% (GC/MS); Variable (others) | N

[The original table also indicates, for each QC measurement, which steps of the measurement process it covers - Sampling Equipment, Sample Container, Preservation, Env. Media, Transport, Storage at Lab, Preparation Reagents, Preparation Equipment, Analysis Reagents, and Analysis Equipment - via spans on an accompanying process diagram (order/prepare/verify sampling equipment, containers, and reagents; collect and preserve samples; store and transport samples; store samples at lab; order/prepare/verify sample prep and analysis reagents and equipment; prepare samples (extract, digest, distill, cleanup, etc.); analyze samples (instrumental response); produce data). The per-measurement spans are not recoverable from this rendering.]

Note: Abbreviations are given in table.
TABLE A-2. QC SAMPLES THAT CONTRIBUTE TO DETERMINING ACCURACY

QC Measurement | Frequency | Additional Cost?
Site-Specific PT Sample | Variable | Y
Full-Volume PT Sample | Variable | Y
Ampulated PT Sample | Variable | Y
Matrix Spike (MS) | 5-10% | Y
Surrogate Spikes (SS) | 100% (organics) | N
Laboratory Control Sample (LCS) | 5% | N
Initial Calibration/Continuing Calibration/Continuing Calibration Verification/Instrument Performance Check Sample (IC/CC/CCV/IPC) | Variable | N

[As in Table A-1, the original table maps each QC measurement onto the steps of the measurement process it covers (Sampling Equipment through Analysis Equipment) via spans on an accompanying process diagram; the per-measurement spans are not recoverable from this rendering. Row-to-value alignment for the Frequency and Additional Cost columns is reconstructed from the garbled source and may be imperfect.]

Note: PES is equivalent to PT (proficiency testing) sample. Other abbreviations are defined in table.
TABLE A-3. QC SAMPLES THAT CONTRIBUTE TO DETERMINING ACCURACY/BIAS (CONTAMINATION SUBSET)

QC Measurement | Frequency | Additional Cost?
Equipment Blank (EB) | 1 per day per type of sample equip. | Y
Volatile Trip Blank (TB) | 1 per cooler of VOA | Y
Bottle Blank (BB) | 1 per lot of bottles | Y
Storage Blank (SB) | 1 per SDG | Y
Method Blank/Reagent Blank (MB/RB) | 1 per batch (5%) | N
Instrument Blank (IB) | As needed | N
Shipping Container Temperature Blank | 1 per cooler | N

[As in Table A-1, the original table maps each blank onto the steps of the measurement process it covers (Sampling Equipment through Analysis Equipment) via spans on an accompanying process diagram - e.g., the storage blank covers storage conditions and the bottle blank covers bottle cleanliness. The full per-measurement spans are not recoverable from this rendering.]

Note: SDG = sample delivery group. Other abbreviations are defined in table.
TABLE A-4. QC SAMPLES THAT CONTRIBUTE TO DETERMINING METHOD SENSITIVITY

QC Measurement | Frequency | Additional Cost?
Laboratory-Fortified Blank (LFB) at Quantitation Limit | 1 per sample delivery group (SDG) | Y
Method Detection Limit (MDL) Study | Annual | N
Initial Calibration Low Standard at Quantitation Limit(a) | Whenever calibration is performed | N

[The original table also indicates which steps of the measurement process each QC measurement covers: Preparation Reagents, Preparation Equipment, Analysis Reagents, and Analysis Equipment; the per-measurement entries are not recoverable from this rendering.]

a Not run for ICP.
APPENDIX B
ACRONYMS AND DEFINITIONS
Acronyms
ANSI/ASQ American National Standards Institute/American Society for Quality
AR Analytical replicates
BB Bottle blank
CCV Continuing calibration verification
CERCLA Comprehensive Environmental Response, Compensation, and Liability Act
DoD Department of Defense
DOE Department of Energy
DQI Data quality indicator
DQO Data quality objective
EB Equipment blank
EPA Environmental Protection Agency
FS Feasibility study
HRS Hazard Ranking System
IB Instrument blank
IC Initial calibration
IDQTF Intergovernmental Data Quality Task Force
IS Internal standard
LCS Laboratory control sample
LD Laboratory duplicate
LFB Laboratory fortified blank
MB Method blank
MDL Method detection limit
MS Matrix spike
MSD Matrix spike duplicate
NPL National priorities list
NTC Non-time-critical
O&M Operation and maintenance
PA Preliminary assessment
PQO Project quality objective
PT Proficiency test
QA Quality assurance
QAPP Quality Assurance Project Plan
QC Quality control
RB Reagent blank
RI Remedial investigation
ROD Record of Decision
SB Storage blank
SI Site inspection
SOPs Standard operating procedures
SPP Systematic planning process
SS Surrogate spike
UFP Uniform Federal Policy
Definitions
Accuracy. The degree of agreement between an observed value and an accepted reference value.
Accuracy includes a combination of random error (precision) and systematic error (bias),
components which are due to sampling and analytical operations. Examples of QC measures for
accuracy include PT samples, matrix spikes, laboratory control samples (LCSs), and equipment
blanks.
Aliquot. A measured portion of a sample taken for analysis.
Analyte. A property which is to be measured.
Analytical replicates. Injecting multiple aliquots of the same sample extract or conducting multiple
measurements on the same sample using the same analytical system to evaluate analytical precision.
Audit (quality). A systematic and independent examination to determine whether QA/QC and
technical activities are being conducted as planned and whether these activities will effectively
achieve quality objectives.
Blank. A sample subjected to the usual analytical or measurement process to establish a zero
baseline or background value. A sample that is intended to contain none of the analytes of interest.
A blank is used to detect contamination during sample handling, preparation, and/or analysis.
Bottle blank. Sample designed to evaluate contamination introduced from the sample container(s)
in a particular lot.
Co-located samples. See field duplicates, co-located samples.
Comparability. The degree to which different methods or data agree or can be represented as
similar. Comparability describes the confidence that two data sets can contribute to a common
analysis and interpretation.
Confirmatory analysis. The process of generating sufficient evidence to ensure that a result for
a specific sample is valid. Analytes must be identified correctly in order to be quantified. The
identity and quantity of residues should be confirmed. Analytical methods which lack specificity
demand confirmation. This confirmation should be accomplished through an accompanying method
with greater specificity.
Continuing calibration verification. A check of the initial calibration that is performed during the
course of an analytical shift at periodic intervals using a calibration check standard. Continuing
calibration verification applies to both external standard and internal standard calibration techniques,
as well as to linear and nonlinear calibration models. The purpose is to assess the continued
capability of the measurement system to generate accurate and precise data over a period of time.
Contractor. Any organization or individual contracting to furnish services or items or to perform
work.
Cooler temperature indicator. A device that monitors the temperature inside the sample cooler.
Examples may include a continuously recording thermostat, a temperature strip that notes when a
maximum temperature has been exceeded, or a shipping container temperature blank.
Data deliverable. Reports of analytical results from the laboratory. There are three levels of data
deliverables, from most limited to most complete: (1) tabulated sample results; (2) tabulated sample
results with QC results; and (3) tabulated sample results, QC results, and raw data printouts.
Data quality indicators. The quantitative statistics and qualitative descriptors that are used to
interpret the degree of acceptability or utility of data to the user. The principal data quality indicators
are precision, accuracy/bias, comparability, completeness, representativeness, and sensitivity.2 Also
referred to as data quality attributes.
(2 The definition in the UFP-QS does not include sensitivity; however, sensitivity is considered a
principal DQI in the Compendium.)
Data quality objectives. Qualitative and quantitative statements derived from the DQO process,
as defined by EPA QA/G-4. DQOs can be used as the basis for establishing the quality and quantity
of data needed to support decisions.
Data quality objectives process. A systematic planning tool based on the scientific method that
clarifies study objectives, defines the appropriate type, quantity, and quality of data, and specifies
tolerable levels of potential decision errors needed to answer specific environmental questions and
to support proper environmental decisions. The DQO process is one type of systematic planning
process. See also systematic planning process.
Data review. The process of examining and/or evaluating data to varying levels of detail and
specificity by a variety of personnel who have different responsibilities within the data management
process. It includes verification, validation, and usability assessment.
Definitive data. Analytical data of known quality, concentration, and level of uncertainty. The
levels of quality and uncertainty of the analytical data are consistent with the requirements for the
decision to be made. Suitable for final decision-making.
Equipment blank. A sample of water free of measurable contaminants poured over or through
decontaminated field sampling equipment that is considered ready to collect or process an additional
sample. The purpose of this blank is to assess the adequacy of the decontamination process. Also
called rinse blank or rinsate blank.
Field blank. A blank used to provide information about contaminants that may be introduced
during sample collection, storage, and transport; also a clean sample, carried to the sampling site,
exposed to sampling conditions, transported to the laboratory, and treated as an environmental
sample.
Field duplicate (replicate) samples. 1) A generic term for two (or more) field samples taken at the
same time in the same location. They are intended to represent the same population and are taken
through all steps of the analytical procedure in an identical manner and provide precision
information for the data collection activity. 2) The UFP-QAPP recognizes two categories of field
duplicate samples defined by the collection method: co-located field duplicates and subsample field
duplicates. See also field duplicates, co-located and field duplicates, subsample.
Field duplicate, co-located. Two or more independent samples collected from side-by-side
locations at the same point in time and space so as to be considered identical. These separate
samples are said to represent the same population and are carried through all steps of the sampling
and analytical procedures in an identical manner. These samples are used to assess precision of the
total method, including sampling, analysis, and site heterogeneity. Examples of co-located field
duplicates include ambient air monitoring samples, surface water grab samples, and side-by-side
sample core soil samples.
Field duplicate, subsample. Duplicate (replicate) samples resulting from one sample collection at
one sample location. For example, duplicate subsamples may be taken from one soil boring or
sediment core.
Field measurements. Those activities associated with performing analyses or measurement in the
field. They include in-situ testing (e.g., with a temperature probe), on-site analyses (e.g., turbidity
readings), and field trailer/mobile lab analyses.
Field sampling. The set of procedures associated with the collection of environmental samples.
Holding time. The period of time a sample may be stored prior to its required analysis.
Initial calibration. Analysis of analytical standards at different concentrations that is used to define
the linearity and dynamic range of the response of the analytical detector or method.
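For illustration only (the following sketch is not part of the IDQTF definitions; the concentrations,
responses, and acceptance value shown are hypothetical), a linear initial calibration could be fit and
checked for linearity as follows:

    import numpy as np

    # Hypothetical five-point initial calibration: standard concentrations
    # (ug/L) and the corresponding instrument responses (e.g., peak areas).
    conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
    response = np.array([980.0, 5100.0, 10250.0, 49800.0, 101500.0])

    # Least-squares fit of response versus concentration defines the
    # calibration line over the demonstrated dynamic range.
    slope, intercept = np.polyfit(conc, response, 1)

    # Correlation coefficient; projects commonly require r >= 0.995 or
    # similar, but the governing criterion is set in the project QAPP.
    r = np.corrcoef(conc, response)[0, 1]

    print(f"slope = {slope:.1f}, intercept = {intercept:.1f}, r = {r:.4f}")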
Initial calibration low standard. Calibration standard whose concentration is at the lowest value
at which the analytical instrument is capable of producing acceptable qualitative and quantitative
data; the lowest part of the calibration curve (i.e., the quantitation limit).
Instrument blank. An aliquot of analyte-free water or solvent processed through the instrumental
steps of the measurement process to determine the presence of carryover from the previous analysis.
Analysis does not include any sample preparation.
Internal standard. A standard added to a test portion of a sample in a known amount and carried
through the entire determination procedure as a reference for calibrating and controlling the
precision and bias of the applied analytical method.
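As an illustrative sketch (a common formulation, not quoted from this document), an internal
standard supports quantitation through a relative response factor:

\[ RF = \frac{A_x \, C_{is}}{A_{is} \, C_x} \]

where $A_x$ and $A_{is}$ are the instrument responses of the analyte and internal standard, and
$C_x$ and $C_{is}$ are their concentrations; the stability of $RF$ across analyses is one way the
internal standard controls precision and bias.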
Laboratory control sample. A sample of known composition prepared using reagent-free water
or an inert solid that is spiked with analytes of interest at the midpoint of the calibration curve or at
the level of concern. It is analyzed using the same sample preparation, reagents, and analytical
methods employed for regular samples.
Laboratory duplicates/replicates. Two or more representative portions taken from one
homogeneous sample by the laboratory and analyzed in the same laboratory. Laboratory duplicate
samples are quality control samples that are used to assess intralaboratory preparatory and analytical
precision.
Laboratory fortified blank. A low-level LCS sample (e.g., at the quantitation limit) used to
evaluate laboratory preparatory and analytical sensitivity and bias for specific compounds.
Matrix spike. A sample prepared by adding a known concentration of a target analyte to an aliquot
of a specific homogenized environmental sample for which an independent estimate of the target
analyte concentration is available. The matrix spike is accompanied by an independent analysis of
the unspiked aliquot of the environmental sample. Spiked samples are used to determine the effect
of the matrix on a method's recovery efficiency.
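For illustration (a standard convention, not restated in the definition above), matrix spike recovery
is typically calculated as:

\[ \%R = \frac{SSR - SR}{SA} \times 100 \]

where $SSR$ is the spiked sample result, $SR$ is the result of the unspiked aliquot, and $SA$ is the
concentration of spike added.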
Matrix spike duplicate. A homogeneous sample used to determine the precision of the
intralaboratory analytical process for specific analytes (organics only) in a sample matrix. The
sample is prepared simultaneously as a split with the matrix spike sample, and each is spiked with
identical, known concentrations of the targeted analyte(s).
Method blank. A sample of a matrix similar to the batch of associated samples (when available)
in which no target analytes or interferences are present at concentrations that impact the analytical
results. It is processed simultaneously with samples of similar matrix and under the same conditions
as the samples.
Method detection limit studies. A statistical determination that defines the minimum concentration
of a substance that can be measured and reported with 99 percent confidence that the analyte
concentration is greater than zero.
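As a hedged illustration (the convention of 40 CFR Part 136, Appendix B, which this glossary does
not restate), the MDL is commonly computed from $n$ replicate analyses of a low-level spiked
sample as:

\[ MDL = t_{(n-1,\ 0.99)} \times s \]

where $s$ is the standard deviation of the replicate results and $t_{(n-1,\ 0.99)}$ is the Student's
$t$-value for $n-1$ degrees of freedom at the 99 percent confidence level (e.g., $t = 3.143$ for
$n = 7$).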
Oversight. The oversight process involves independent (outside of work process) internal and
external assessment of the quality system and projects for conformance with requirements,
effectiveness of requirements in maintaining quality, and taking (or ensuring or effecting)
appropriate corrective action.
Positive control sample. A prepared standard which undergoes an analytical procedure at a
specified frequency for the purpose of providing comparison with an unknown sample based on
specified criteria, thereby monitoring recovery to assure that a test and/or its components are
working properly and producing correct or expected results. This generic term can refer to a number
of different QC samples that can be used as a "positive control" (e.g., laboratory control sample,
matrix spike).
Precision. The degree to which a set of observations or measurements of the same property,
obtained under similar conditions, agree with one another. Precision is usually expressed as standard
deviation, variance, or range, in either absolute or relative terms. Examples of QC measures for
precision include field duplicates, laboratory duplicates, matrix spike duplicates, analytical
replicates, and internal standards.
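For illustration (standard expressions, not quoted from this glossary), precision for a duplicate pair
$x_1, x_2$ is often reported as relative percent difference, and precision for $n$ replicates as
relative standard deviation:

\[ RPD = \frac{|x_1 - x_2|}{(x_1 + x_2)/2} \times 100 \qquad RSD = \frac{s}{\bar{x}} \times 100 \]

where $s$ is the sample standard deviation and $\bar{x}$ is the mean of the replicate results.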
Proficiency testing sample (sometimes called a performance evaluation (PE) sample). A sample,
the composition of which is unknown to the laboratory or analyst, which is provided to that analyst
or laboratory to assess capability to produce results within acceptable criteria. PT samples can fall
into three categories: (1) prequalification, conducted prior to a laboratory beginning project work,
to establish initial proficiency; (2) periodic (e.g., quarterly, monthly, or episodic) to establish
ongoing laboratory proficiency; and (3) batch-specific, which is conducted simultaneously with
analysis of a sample batch.
PT sample, ampulated. A PT sample that is received as a concentrate and must be diluted to
volume before being treated as an analytical sample. It can only be single blind.
PT sample, full volume. A PT sample that is received by the laboratory ready to be treated as an
analytical sample. It does not require dilution and can therefore be single or double blind.
PT sample, site-specific. A PT sample created using well-characterized contaminated media that
is treated as an analytical sample by the laboratory to test its capabilities.
Project quality objectives. Qualitative and quantitative statements derived from a Systematic
Planning Process (e.g., EPA QA/G-4 DQO process) that clarify study objectives, define the
appropriate type of data, and specify tolerable levels of potential decision errors. PQOs will be used
as the basis for establishing the quality and quantity of data needed to support decisions.
Quality assurance. An integrated system of management activities involving planning,
implementation, assessment, reporting, and quality improvement to ensure that a process, item, or
service is of the type and quality needed and expected by the client.
Quality assurance project plan. A formal document describing in comprehensive detail the
necessary quality assurance, quality control, and other technical activities that must be implemented
to ensure that the results of the work performed will satisfy the stated performance criteria.
Quality control. The overall system of technical activities that measure the attributes and
performance of a process, item, or service against defined standards to verify that they meet the
stated requirements established by the customer; operational techniques and activities that are used
to fulfill requirements for quality; also the system of activities and checks used to ensure that
measurement systems are maintained within prescribed limits, providing protection against "out of
control" conditions and ensuring the results are of acceptable quality.
Quality system. A structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for ensuring quality in its work processes, products (items), and services.
The quality system provides the framework for planning, implementing, and assessing work
performed by the organization and for carrying out the required QA and QC.
Readiness review. A systematic, documented review of the readiness for the startup or continued
use of a facility, process, or activity. Readiness reviews are typically conducted before proceeding
beyond project milestones and prior to initiation of a major phase of work.
Reagent blank. An aliquot of water or solvent free of measurable contaminants analyzed with the
analytical batch and containing all the reagents in the same volume as used in the processing of the
samples. The method blank goes through preparatory steps; the reagent blank does not.
Representativeness. A measure of the degree to which data accurately and precisely represent a
characteristic of a population, a parameter variation at a sampling point, a process condition, or an
environmental condition.
Requirement. A formal statement of a need and the expected manner in which it is to be met;
documented statements that specify activities that must be done; the mandated activities.
Screening data. Analytical data of known quality, concentration, and level of uncertainty. The
levels of quality and uncertainty of the analytical data are consistent with the requirements for the
decision to be made. Screening data are of sufficient quality to support an intermediate or
preliminary decision but must eventually be supported by definitive data before a project is
complete.
Secondary data. Data not originally collected for the purpose for which they are now being used.
In addition, the level of QA/QC provided at the time of the original data collection may be unknown.
Sensitivity. The capability of a test method or instrument to discriminate between measurement
responses representing different levels (e.g., concentrations) of a variable of interest. Examples of
QC measures for determining sensitivity include laboratory fortified blanks, a method detection
limit study, and initial calibration low standards at the quantitation limit.
Shipping container temperature blank. A container of water designed to evaluate whether
samples were adequately cooled during sample shipment.
Split sample. Two or more representative portions taken from one sample in the field or laboratory,
analyzed by at least two different laboratories and/or methods. Prior to splitting, a sample is mixed
(except volatiles, oil and grease, or when otherwise directed) to minimize sample heterogeneity.
These are quality control samples used to assess precision, variability, and data comparability
between different laboratories. (Split samples should be used when accompanied by a PT sample.)
Stakeholders. Individuals or groups of individuals with a strong interest in the Agency's work and
policies. This includes "affected parties" (individuals or groups directly affected by EPA policies
or decisions).
Standard operating procedure. A written document that details the method for an operation,
analysis, or action with thoroughly prescribed techniques and steps and that is officially approved
as the method for performing certain routine or repetitive tasks.
Storage blank. Sample composed of water free of measurable contaminants and stored with a
sample set in the same kind of sample container. Storage begins upon receipt of sample shipment
at the laboratory. The storage blank is analyzed at the end of the sample storage period to assess
cross-contamination occurring during sample storage (typically analyzed only for volatile organic
compounds).
Surrogate spike or analyte. A pure substance with properties that mimic the analyte of interest
(organics only). Surrogates are brominated, fluorinated, or isotopically labeled compounds unlikely
to be found in environmental samples. These analytes are added to samples to evaluate analytical
efficiency by measuring recovery.
Systematic planning process. Systematic planning is a process that is based on the scientific
method and includes concepts such as objectivity of approach and acceptability of results.
Systematic planning is based on a common sense, graded approach to ensure that the level of detail
in planning is commensurate with the importance and intended use of the work and the available
resources. This framework promotes communication among all organizations and individuals
involved in an environmental program. Through a systematic planning process, a team can develop
acceptance or performance criteria for the quality of the data collected and for the quality of the
decision.
Usability assessment. Evaluation of data based upon the results of validation and verification for
the decisions being made. In the usability step, reviewers assess whether the process execution and
resulting data meet the quality objectives based on criteria established in the QAPP.
Validation. Confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use are fulfilled. Validation is a sampling and analytical
process evaluation that includes evaluating compliance with methods, procedures, or contracts, and
comparison with criteria based upon the quality objectives developed in the project QAPP. The
purpose of validation is to assess the performance associated with the sampling and analysis to
determine the quality of specified data. In short, validation checks compliance with method,
procedural, and contractual requirements and comparison against project quality criteria from the
QAPP.
Verification. Confirmation by examination and provision of objective evidence that the specified
requirements (sampling and analytical) have been completed. Verification is a completeness check.
Volatile organic compound trip blank. A clean sample of water free of measurable contaminants
that is taken to the sampling site and transported to the laboratory for analysis without having been
exposed to sampling procedures. Analyzed to assess the contamination introduced during sample
shipment. Typically analyzed only for volatile organic compounds.
Bibliography
Environmental Data Quality Workgroup, 2003. Quality Systems Manual for Environmental
Laboratories, Version 2.
EPA, 1993. Interim Final Guidance for Planning for Data Collection in Support of Environmental
Decision Making Using the Data Quality Objectives Process (QAMS, EPA QA/G-4).
EPA, February 1994. USEPA Contract Laboratory Program National Functional Guidelines for
Inorganic Data Review (EPA-540/R-94-013, PB94-963502).
EPA, February 1998. EPA Guidance for Quality Assurance Project Plans (EPA QA/G-5, EPA
600/R-98/018).
EPA, July 16, 1998. Policy and Program Requirements for the Mandatory Agency-Wide Quality
System (EPA Order 5360.0 CHG 1).
EPA, October 1999. USEPA Contract Laboratory Program National Functional Guidelines for
Organic Data Review (EPA-540/R-99-008, PB99-963506).
EPA, June 2001. Draft Final Contract Laboratory Program Guidance for Field Samplers (EPA-
540-R-00-003).
EPA, June 2001. USEPA Contract Laboratory Program National Functional Guidelines for Low
Concentration Organic Data Review (EPA-540-R-00-006).
EPA, Environmental Data Registry (EDR 6.0, 2001). http://www.epa.gov/edr/.
EPA Region 1, September 1999. Region 1, EPA-New England Compendium of Quality Assurance
Project Plan Guidance, Final.
EPA Region 3, September 1996. Quality Management Plan for EPA Region III (RQMP/96-00).
EPA Region 3. EPA Region III QA Directives.
European Commission on Co-ordinated Analytical Control, 2000. Quality Control Procedures for
Pesticide Residue Analysis.
Intergovernmental Data Quality Task Force, 2005. Uniform Federal Policy for Implementing
Environmental Quality Systems, Version 2.
Intergovernmental Data Quality Task Force, March 2005. Uniform Federal Policy for Quality
Assurance Project Plans.
National Environmental Laboratory Accreditation Conference, July 1999. NELAC Standards.
National Environmental Laboratory Accreditation Conference, June 2000. NELAC Standards.
Naval Facilities Engineering Command, September 1999. Navy Installation Restoration Chemical
Data Quality Manual (IR CDQM).
Wagner, R.E., and Kotas, W. (eds.), 1992. Guide to Environmental Analytical Methods, 2nd edition.
Genium Publishing Corporation: Schenectady, New York.