EPA
Office of Waste Programs Enforcement
Washington, DC 20460
EPA/540/G-87/003
Superfund
Data Quality Objectives for Remedial Response Activities
Development Process
U.S. Environmental Protection Agency
EPA/540/G-87/003
(OSWER Directive 9355.0-7B)
March 1987
DATA QUALITY OBJECTIVES
FOR REMEDIAL RESPONSE ACTIVITIES
Development Process
Prepared for:
Office of Emergency and Remedial Response
and
Office of Waste Programs Enforcement
Office of Solid Waste and Emergency Response
U.S. Environmental Protection Agency
Washington, DC 20460
Prepared by:
CDM Federal Programs Corporation
7611 Little River Turnpike, Suite 104
Annandale, VA 22003
EPA Contract No. 68-01-6939
March 1987
NOTICE
This document has been reviewed in accordance with
U.S. Environmental Protection Agency policy and
approved for publication. Mention of trade names
or commercial products does not constitute endorse-
ment or recommendation for use.
PREFACE
This document, Data Quality Objectives For Remedial Response Activities (Development Process), guides the user through the process of developing data quality objectives (DQOs) for site-specific remedial activities. Remedial response activities include remedial investigations (RI), feasibility studies (FS), remedial design (RD), and remedial action (RA). This guidance manual should be used in conjunction with the Data Quality Objectives For Remedial Response Activities (Example Scenario - RI/FS Activities at a Site With Contaminated Soils And Ground Water), which provides an outline of how the DQO process is applied to a hypothetical site situation.
These guidance documents will be updated in the future to focus on quantification of DQOs and other
statistical issues.
This is one of a series of guidance documents prepared in accordance with the National Oil and
Hazardous Substances Pollution Contingency Plan (NCP) final rule, published in the Federal Register on
November 20, 1985, and effective February 18, 1986. These guidance documents will be updated in the
near future to be consistent with provisions of the Superfund Amendments and Reauthorization Act
(SARA) and the new NCP. The guidance document series includes the following titles:
Guidance on Remedial Investigations Under CERCLA (EPA 540/G-85/002)
Guidance on Feasibility Studies Under CERCLA (EPA 540/G-85/003)
Superfund Remedial Design and Remedial Action Guidance (OSWER Directive 9355.0-4A)
Compendium of Field Operations Methods (planned June 1987)
Superfund Public Health Evaluation Manual (OSWER Directive 9285.4-1)
Superfund Exposure Assessment Manual (OSWER Directive 9285.5-1)
Collectively, these documents provide guidance for the development and performance of technically
sound and cost-effective remedial response activities which will support the program goals of both
the Office of Emergency and Remedial Response (OERR) and the Office of Waste Programs Enforcement
(OWPE). These documents are also available for use by state agencies and private parties conducting
remedial response activities to ensure consistency with the intent of CERCLA and SARA.
TABLE OF CONTENTS
Section
1.0 INTRODUCTION 1-1
1.1 PURPOSE 1-1
1.2 DATA QUALITY OBJECTIVE POLICY BACKGROUND 1-2
1.3 FORMAT 1-2
2.0 DATA QUALITY OBJECTIVE DEVELOPMENT PROCESS 2-1
2.1 DQO STAGES 2-1
2.1.1 STAGE 1 - IDENTIFY DECISION TYPES 2-1
2.1.2 STAGE 2 - IDENTIFY DATA USES/NEEDS 2-1
2.1.3 STAGE 3 - DESIGN DATA COLLECTION PROGRAM 2-1
2.2 RI/FS PROCESS 2-1
2.2.1 GENERAL APPROACH 2-3
2.2.2 PHASED RI/FS APPROACH 2-3
2.3 REMEDIAL DESIGN 2-3
2.4 REMEDIAL ACTION 2-5
2.5 DATA QUALITY OBJECTIVES DOCUMENTATION 2-5
2.6 REFERENCES 2-5
3.0 RI/FS DQO STAGE 1 - IDENTIFY DECISION TYPES 3-1
3.1 IDENTIFY AND INVOLVE DATA USERS 3-1
3.1.1 DECISION MAKER'S ROLE 3-1
3.1.2 DATA USERS' ROLE 3-3
3.2 EVALUATE AVAILABLE INFORMATION 3-3
3.2.1 DESCRIBE CURRENT SITUATION 3-3
3.2.2 REVIEW AVAILABLE DATA 3-5
3.2.3 ASSESS ADEQUACY OF DATA 3-6
3.3 DEVELOP CONCEPTUAL MODEL 3-6
3.3.1 EVALUATION OF THE CONCEPTUAL MODEL 3-6
3.3.2 COMPUTER MODELS 3-9
3.4 SPECIFY OBJECTIVES/DECISIONS 3-10
3.4.1 OBJECTIVES 3-10
3.4.2 DECISION TYPES 3-10
3.5 REFERENCES 3-12
4.0 RI/FS DQO STAGE 2 - IDENTIFY DATA USES/NEEDS 4-1
4.1 IDENTIFY DATA USES 4-3
4.1.1 DATA USE CATEGORIES 4-3
4.1.2 RI/FS USES 4-7
4.2 IDENTIFY DATA TYPES 4-7
4.3 IDENTIFY DATA QUALITY NEEDS 4-9
4.3.1 DATA QUALITY FACTORS 4-9
4.3.2 COST ANALYSIS OF ALTERNATIVES 4-13
4.4 IDENTIFY DATA QUANTITY NEEDS 4-13
4.5 EVALUATE SAMPLING AND ANALYSIS OPTIONS 4-14
TABLE OF CONTENTS
(Continued)
4.5.1 SAMPLING AND ANALYSIS APPROACH (PHASING) 4-14
4.5.2 RESOURCE CONSIDERATIONS 4-17
4.6 REVIEW PARCC PARAMETER INFORMATION 4-17
4.6.1 PRECISION 4-17
4.6.2 ACCURACY 4-18
4.6.3 REPRESENTATIVENESS 4-18
4.6.4 COMPLETENESS 4-19
4.6.5 COMPARABILITY 4-19
4.7 UTILIZING PARCC PARAMETER INFORMATION 4-19
4.8 REFERENCES 4-20
5.0 RI/FS DQO STAGE 3 - DESIGN DATA COLLECTION PROGRAM 5-1
5.1 ASSEMBLE DATA COLLECTION COMPONENTS 5-1
5.2 DEVELOP DATA COLLECTION DOCUMENTATION 5-1
5.2.1 SAMPLING AND ANALYSIS PLANS 5-1
5.2.2 WORK PLANS 5-4
5.2.3 ENFORCEMENT CONCERNS 5-4
5.3 REFERENCES 5-5
6.0 REMEDIAL DESIGN (reserved)
7.0 REMEDIAL ACTION (reserved)
APPENDIX A STATISTICAL CONSIDERATIONS
APPENDIX B ANALYTICAL CONSIDERATIONS
APPENDIX C SAMPLING CONSIDERATIONS
APPENDIX D REVIEW OF QAMS DQO CHECKLIST
APPENDIX E POTENTIALLY APPLICABLE OR RELEVANT AND APPROPRIATE REQUIREMENTS
APPENDIX F HISTORICAL PRECISION AND ACCURACY DATA CLASSIFIED BY MEDIA BY
ANALYTICAL LEVEL
APPENDIX G RCRA APPENDIX VIII CLP HSL COMPARISON
APPENDIX H CONTRACT REQUIRED DETECTION LIMITS FOR HSL ANALYSES USING CLP IFB
PROCEDURES
LIST OF FIGURES
Figure
2-1 DQO Three-Stage Process 2-2
2-2 Phased RI/FS Approach and the DQO Process 2-4
3-1 DQO Stage 1 Elements 3-2
3-2 Decision Maker/Data Users Interaction 3-4
3-3 Elements of a Conceptual Evaluation Model 3-7
3-4 Example Conceptual Model Illustration 3-8
3-5 Relationship of Risk and Data Quality/Quantity 3-13
4-1 DQO Stage 2 Elements 4-2
4-2 Sample Type Specification Logic Diagram 4-8
4-3 Integration of Analytical Support Levels 4-16
5-1 Stage 3 Elements - Design Data Collection Program 5-2
LIST OF TABLES
Table
3-1 Generic RI/FS Objectives 3-11
4-1 Data Uses 4-4
4-2 DQO Summary Form 4-5
4-3 Summary of Analytical Levels 4-11
4-4 Appropriate Analytical Levels 4-12
5-1 Quality Assurance Project Plan Elements 5-3
LIST OF ACRONYMS
ARAR Applicable or Relevant and Appropriate Requirements
ATSDR Agency for Toxic Substances and Disease Registry
CERCLA Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (Superfund)
CDC Centers for Disease Control
CLP Contract Laboratory Program
COE U.S. Army Corps of Engineers
DQO Data Quality Objective
EMSL-LV Environmental Monitoring and Support Laboratory - Las Vegas
ESD Environmental Services Division (of EPA)
FIT Field Investigation Team
FS Feasibility Study
GC/MS Gas Chromatograph/Mass Spectrograph
HSL Hazardous Substance List
MDL Method Detection Limit
NBS National Bureau of Standards
NCP National Contingency Plan
NEIC National Enforcement Investigation Center
NPL National Priorities List
ORC Office of Regional Counsel
PARCC Precision, Accuracy, Representativeness, Completeness, Comparability
PRP Potentially Responsible Party
QAMS Quality Assurance Management Staff
QAPP Quality Assurance Program Plan
QAPjP Quality Assurance Project Plan
QA/QC Quality Assurance/Quality Control
RA Remedial Action
RAS Routine Analytical Service
RD Remedial Design
RI Remedial Investigation
ROD Record of Decision
RPM Remedial Project Manager
RSCC Regional Sample Control Center
S&A Sampling and Analysis
SARA Superfund Amendments and Reauthorization Act of 1986
SAS Special Analytical Service
SMO Sample Management Office
SRM Standard Reference Material
TAC Technical Advisory Committee
TAT Technical Assistance Team
TIC Tentatively Identified Compounds
TSCA Toxic Substances Control Act
VOC Volatile Organic Compounds
ACKNOWLEDGMENTS
This document was developed for the Office of Solid Waste and Emergency Response (OSWER) by a task
force composed of the following individuals:
Randall Kaltreider (Hazardous Site Control Division, OERR)
Linda Boornazian (CERCLA Enforcement Division, OWPE)
Andrew Szilagyi (CDM Federal Programs Corporation)
Jeffery Sullivan (Camp Dresser & McKee Inc.)
RoseMary Ellersick (CDM Federal Programs Corporation)
Tom Pedersen (Camp Dresser & McKee Inc.)
James Occhialini (Camp Dresser & McKee Inc.)
Dennis Gagne (Region 1, Waste Management Division)
Bill Coakley (Region 2, Environmental Services Division)
Edward Shoener (Region 3, Hazardous Waste Management Division)
Diane Moshman (Region 5, Waste Management Division)
Steve Lemons (Region 6, Environmental Services Division)
Bill Bunn (Region 7, Environmental Services Division)
Mike Carter (Hazardous Response Support Division, OERR)
Duane Geuder (Hazardous Response Support Division, OERR)
Michael Kosakowski (CERCLA Enforcement Division, OWPE)
Dennisse Beauchamp (CERCLA Enforcement Division, OWPE)
Gary Liberson (Lloyd Associates)
Craig Zamuda (Policy Analysis Staff, OERR)
John Warren (Statistical Policy Branch, OPPE)
Wendy Sydow (CDM Federal Programs Corporation)
Paul Clay (NUS Corporation)
Helpful suggestions and comments on the draft document were provided by the following as well as
other EPA and contractor staff.
David F. Doyle (Camp Dresser & McKee Inc.)
Dean Neptune (QAMS)
Gene Brantly (RTI)
Daniel Michael (RTI)
1.0 INTRODUCTION
Data quality objectives (DQOs) are qualitative and quantitative statements which specify the quality
of the data required to support Agency decisions during remedial response activities. DQOs are
determined based on the end uses of the data to be collected. For example, depending on the project
phase, sufficient data may have to be collected to characterize the site, evaluate remedial
alternatives, determine design criteria, or monitor site conditions and/or remedial action
effectiveness. DQOs are applicable to all data collection activities, including those performed for
preliminary assessments/site investigations (PA/SI), remedial investigations (RI), feasibility
studies (FS), remedial design (RD), and remedial actions (RA). The level of detail and data quality
needed will vary based on the intended use of the data. The variability of site characteristics
makes it impossible to apply a generic set of DQOs to all CERCLA activities; however, investigators
are expected to take advantage of previous experience and data collected for similar sites.
DQOs are established prior to data collection and are not considered a separate deliverable.
Rather, the DQO development process is integrated with the project planning process, and the results
are incorporated into the sampling and analysis (S&A) plan, quality assurance project plan (QAPjP)
and, in general terms, into the work plan for the site. The DQO process results in a well-thought-out sampling and analysis plan that details the chosen sampling and analysis option and statements
of the confidence in decisions made during the remedial process. Confidence statements are possible
through the application of statistical techniques to the data.
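As a purely illustrative example of the kind of confidence statement a statistical treatment can support, the sketch below (written in Python for concreteness) computes a 95 percent confidence interval on a mean contaminant concentration and compares its upper limit to an action level. The concentration values and the action level are hypothetical and are not drawn from this guidance; the more complete statistical approaches are discussed in Appendix A.
    # Illustrative sketch: 95% confidence interval on a mean contaminant
    # concentration using a t-statistic.  All values are hypothetical.
    import math
    from statistics import mean, stdev
    from scipy.stats import t

    concentrations = [12.0, 15.5, 9.8, 14.2, 11.1, 13.7, 10.4, 16.0]  # e.g., mg/kg
    action_level = 20.0                                               # assumed, mg/kg

    n = len(concentrations)
    xbar = mean(concentrations)
    se = stdev(concentrations) / math.sqrt(n)      # standard error of the mean
    t_crit = t.ppf(0.975, df=n - 1)                # two-sided 95% critical value
    lower, upper = xbar - t_crit * se, xbar + t_crit * se

    print(f"mean = {xbar:.1f}, 95% CI = ({lower:.1f}, {upper:.1f})")
    print("upper confidence limit below action level:", upper < action_level)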
Data quality objectives should be specified for each data collection activity associated with a
remedial response. The majority of these data collection activities take place during a remedial
investigation (RI), but additional data needs may be identified during the feasibility study (FS),
remedial design (RD), and remedial action (RA).
All investigation activities should be conducted and documented in a manner that ensures that
sufficient data of known quality are collected to support sound decisions concerning remedial action
selection. This applies to fund-lead, federal or state enforcement-lead, and potentially
responsible party-lead projects.
1.1 PURPOSE
The purpose of this guidance document is to identify the framework and process by which DQOs are
developed and the individuals responsible for development of DQOs. This document is intended to
guide the user through the process of DQO development. Each site will have a unique history, data
availability, and other factors. Therefore, a unique set of DQOs must be developed for each site.
This DQO guidance acts as a supplement to existing remedial program guidance by providing procedures
for determining a quantifiable degree of certainty which can be used in making site-specific
decisions. In actual practice to date, projects conducted under CERCLA have complied with the
intent of the DQO process. DQOs have been incorporated as parts of sampling and analytical plans,
quality assurance project plans or work plans. The purpose of this guidance is to provide a more
formal approach to integration of DQO development with S&A plan development and to improve the
overall quality and cost effectiveness of data collection and analysis activities.
This guidance focuses specifically on the DQO process. RI/FS activities (planning and
implementation) are presented only as a framework for DQOs and as such are not fully developed in
accordance with RI/FS guidance. Similarly, this document is not meant to be guidance on overall
development of sampling and analysis plans, quality assurance project plans, or work plans. Future
documents will emphasize statistical considerations in the DQO process.
1.2 DATA QUALITY OBJECTIVE POLICY BACKGROUND
Mr. Alvin Alm, then Deputy Administrator of the EPA, in his memorandum of May 24, 1984 to the
Assistant Administrators (AAs), stated that one of the most important steps in assuring the quality
of environmental data is development of DQOs. He requested active participation of the AAs in the
development of DQOs during the stages in which policy and guidance are crucial, and asked for
identification of significant ongoing environmental data collection activities. The Quality
Assurance Management Staff (QAMS) issued guidance on development of DQOs in October 1984. A
checklist for DQO review was then issued in a memorandum from Stan Blacker on April 3, 1985.
Appendix D includes a comparison of this checklist with this DQO guidance document. Additional
guidance on the development of DQOs, specifically related to Stages 1 and 2 of the process, was
provided in a draft document issued by QAMS March 17, 1986.
The approach to developing and implementing DQOs for remedial response activities has been
established by a DQO Task Force comprising technical personnel from EPA Headquarters (OERR and
OWPE), Regions 1, 2, 3, 5, 6 and 7; and EPA remedial contractors. The methodology used by the DQO
Task Force was to apply the guidance provided by QAMS to the remedial response process. The efforts
of the Task Force included identifying the elements of the DQO process within existing planning
documents and organizing them into a formal implementation approach. The DQO development process
presented in this document is based on the best available information but may be revised as
additional information becomes available.
1.3 FORMAT
This document includes the following sections:
1.0 Introduction
2.0 DQO Development Process - the process for developing DQOs and how DQO development
relates to the remedial response program.
3.0 RI/FS-DQO Stage 1 - identification and involvement of data users, development of a
conceptual site model and definition of decision types that will be made during the
RI/FS process.
4.0 RI/FS-DQO Stage 2 - determining data needs and uses, establishing criteria for
decisions, and identifying and selecting analytical and sampling options.
5.0 RI/FS-DQO Stage 3 - assembling sampling and analytical components into an overall
sampling design and documentation required for a sampling and analytical program.
6.0 Remedial Design - Reserved
7.0 Remedial Action - Reserved
Appendix A Statistical Considerations - provides a description of some statistical approaches which
may be applied during a remedial action program.
Appendix B Analytical Considerations - describes the various options that are available for
analyzing samples from uncontrolled hazardous waste sites.
Appendix C Sampling Considerations - provides discussion of sampling rationale related to the DQO
development process.
Additional appendices to the DQO document provide information on the QAMS DQO checklist, established
criteria for RI/FS activities, and CLP performance criteria.
Sections of this manual are applicable to specific components of the remedial response process.
Sections 1 and 2 are applicable to all remedial response activities; Sections 3, 4, and 5 apply
specifically to the RI/FS process. Sections 6 and 7 are forthcoming and will provide guidance for
the application of DQOs to Remedial Design Activities (Section 6) and to Remedial Actions (Section
7).
A companion to this guidance is the Data Quality Objectives For Remedial Response Activities (Example Scenario) (EPA 1987), which provides an example case study of implementation of the DQO process.
2.0 DATA QUALITY OBJECTIVE DEVELOPMENT PROCESS
Data quality objectives are identified during project scoping and development of sampling and analysis
plans. DQOs are established to ensure that the data collected are sufficient and of adequate quality for
their intended uses. Data collected and analyzed in conformance with the DQO process described in this
document can be used in assessing the uncertainty associated with decisions related to remedial
response.
2.1 DQO STAGES
Data quality objectives are developed through a three-stage process, as illustrated in Figure 2-1.
Although the three stages are discussed sequentially in this guidance document, they should be undertaken
in an interactive and iterative manner, whereby all the DQO elements are continually reviewed and
reevaluated. As such, the DQO process is integrated with development of the S&A plan and is revised as
needed based upon the results of each data collection activity. This process is illustrated in the
example document.
2.1.1 STAGE 1 - IDENTIFY DECISION TYPES
Stage 1 of the DQO process defines the types of decisions which will be made regarding site remediation
through identifying data users, evaluating available data, developing a conceptual model, and specifying
objectives for the project. Available information is compiled and analyzed to develop a conceptual model
of the site. This model describes suspected sources, contaminant pathways, and potential receptors. The
model facilitates identification of decisions which must be made and deficiencies in the existing
information. Stage 1 results in the specification of the decision making process and identification of
why new data are needed.
2.1.2 STAGE 2 - IDENTIFY DATA USES/NEEDS
Stage 2 stipulates criteria for determining data adequacy. This stage involves specifying the data
necessary to meet the objectives set in Stage 1. Stage 2 includes selection of the sampling approaches
and the analytical options for the site, including evaluation of multiple-option approaches to effect
more timely or cost-effective data collection and evaluation.
2.1.3 STAGE 3 - DESIGN DATA COLLECTION PROGRAM
Stage 3 results in the specification of the methods by which data of acceptable quality and quantity will
be obtained to make decisions. This information is provided in documents such as the S&A plan, and is
summarized in the work plan.
2.2 RI/FS PROCESS
2.2.1 GENERAL APPROACH
The overall objective of an RI/FS is to determine the nature and extent of the threat posed by the
release of hazardous substances and to evaluate proposed remedies. The ultimate goal is to select a
cost-effective remedial alternative which mitigates threats to and provides protection of public health, welfare, and the environment, consistent with the NCP.
The term "uncertainty" is used as a catchall term to describe the likelihood of all types of errors associated with a particular decision. There is not a precise statistical definition of the term, since the precise definition varies from decision to decision; however, it can be stated that uncertainty is always a function of the distribution of the statistics used in making the decision.
[Figure 2-1. DQO Three-Stage Process. Stage 1 - Identify Decision Types: identify and involve data users; evaluate available data; develop conceptual model; specify objectives/decisions. Stage 2 - Identify Data Uses/Needs: identify data uses; identify data types; identify data quality needs; identify data quantity needs; evaluate sampling/analysis options; review PARCC parameters. Stage 3 - Design Data Collection Program: assemble data collection components; develop data collection documentation.]
RIs consist of data gathering activities undertaken to determine the degree and extent of contamination
at a site. The data are used in the identification, screening, and evaluation of remedial alternatives.
The objective of the RI is to collect the necessary data to determine the distribution and migration of contaminants; identify cleanup criteria; and support the identification and evaluation of remedial alternatives.
Feasibility studies entail development, screening, and evaluation of remedial alternatives. The
objectives of the FS are to develop and evaluate the remedial action alternatives with respect to
protection of public health and the environment, compliance with ARARs, and reduction of mobility and/or
toxicity. In order to ensure that adequate and sufficient data are collected for performance of the FS,
site managers must continually coordinate the evaluation and re-evaluation of data collected during the
RI.
The RI/FS typically addresses data collection and site characterization from the perspective of
contaminant source and contaminant migration pathways. Once pathways are established and human and
environmental receptors are identified, further data collection efforts can be directed toward evaluating
the potential impact upon receptors, and for use in evaluating potential remedial technologies and
alternatives.
Through the process of developing DQOs, a series of statements and definitions of the types, quantity and
quality of data required for specific uses will be developed.
2.2.2 PHASED RI/FS APPROACH
The amount and quality of data required to support selection of a remedial alternative will vary by site.
In many situations it may not be possible to identify all data needs during the initial scoping
activities. Rather, data needs will become more clearly defined as additional data are obtained and
evaluated. By separating the remedial investigation into phases, data can be collected and evaluated
sequentially, with a refinement or redefinition of data collection needs at the completion of each phase.
Figure 2-2 illustrates the phased RI/FS approach.
It is seldom possible to identify fully all the data needed to complete an RI/FS at the outset of the
scoping process. For complex sites, the phased approach provides more control of investigative
activities than a singular sampling/analysis event. Applying the DQO process to a phased investigation
improves the usability of the data and the cost effectiveness of the investigation.
2.3 REMEDIAL DESIGN
Following selection of a remedy (based on the RI/FS) and approval of the Record of Decision (ROD) or
Enforcement Decision Document (EDD), design activities are initiated. Additional field data collection
activities may be required during the remedial design phase to supplement the technical data available
from the RI/FS.
Cost estimates should be refined to the +15/-10 percent range based on data collected during the RD (EPA 1986). The type of data required during the RD varies depending on the type of remedy. For soil excavation, a good estimate of contaminated soil volume is needed; for treatment options, a refined estimate of the physical/chemical waste character may be required. If the RI is carefully planned with accurate foresight of FS and RD data needs, sampling activities during the RD phase should be minimized.
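As a simple worked illustration of the +15/-10 percent criterion (the base estimate below is hypothetical and not from this guidance), the expected range around a design-stage cost estimate can be computed directly:
    # Worked arithmetic for the +15/-10 percent cost estimate range (EPA 1986).
    # The base estimate is assumed for illustration only.
    estimate = 2_000_000                 # dollars, hypothetical RD-stage estimate
    low = estimate * (1 - 0.10)          # -10 percent
    high = estimate * (1 + 0.15)         # +15 percent
    print(f"expected range: ${low:,.0f} to ${high:,.0f}")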
The practical application of DQOs to RD activities will be described in future updates to this document.
[Figure 2-2. Phased RI/FS Approach and the DQO Process (beginning with initiation of the RI/FS).]
2.4 REMEDIAL ACTION
RA activities entail the actual implementation of the alternative selected in the ROD/EDD. As with the RD, additional data collection activities may have to be conducted during the RA, and the DQO process
utilized. Data collected during the RA are used to evaluate the progress of the RA and to verify that
the set performance criteria were achieved.
2.5 DATA QUALITY OBJECTIVES DOCUMENTATION
The DQO development process is initiated during project scoping and is completed in conjunction with the
development of an S&A plan for each project phase. The three stages of the DQO development process are
interactive in nature. As additional details regarding the site are discovered, the decisions which will
be made during the project are refined. This allows for further specification of data needs and for
design of the data collection program.
As the DQO process continues, the scoping of the project will become refined. Additional decision types
may be needed (Stage 1), or data collection activities may be modified (Stage 2 and Stage 3) based on
evaluation of data (Stage 1).
Development of DQOs in a formal manner ensures that the appropriate data are obtained to meet the
objectives of the RI/FS, RD or RA. Documentation of DQOs can be provided primarily in the S&A plan
(which includes QAPjP elements), and summarized in the work plan.
2.6 REFERENCES
U.S. Environmental Protection Agency. 1985a. Guidance on Remedial Investigations Under CERCLA. Office of Emergency and Remedial Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response, Washington, DC; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/002. June.
. 1985b. Guidance on Feasibility Studies Under CERCLA. Office of Emergency and Remedial Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response, Washington, DC; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/003. June.
. 1986. Superfund Remedial Design and Remedial Action Guidance. Office of Emergency and Remedial Response. OSWER Directive 9355.0-4A. June.
3.0 RI/FS DQO STAGE 1 - IDENTIFY DECISION TYPES
Stage 1 of the DQO process is undertaken to identify the individuals responsible for decisions, to
identify and involve data users, and to define the types of decisions which will be made as part of each
RI/FS. Decisions are made following evaluation of data at various points during the RI/FS. The general
decision types are identified early in Stage 1 to ensure that an investigative approach which will yield
data sufficient to support the decisions.
The major elements of Stage 1 include:
Identifying and involving data users
Evaluating available information
Developing a conceptual model
Specifying RI/FS objectives and decisions
Stage 1 of the DQO process is an inherent part of the project scoping process. The thought process by
which a work plan is developed naturally encompasses the Stage 1 DQO elements. Figure 3-1 illustrates
the Stage 1 elements. Although the elements of Stage 1 can be thought of as distinct steps, they are a
continuous thought process.
3.1 IDENTIFY AND INVOLVE DATA USERS
DQO development requires involving the data users during planning of remedial activities. Because of the
interdisciplinary nature of remedial activities, it is important that the appropriate technical expertise
is identified and obtained for the DQO development process.
3.1.1 DECISION MAKER'S ROLE
The key RI/FS decision is remedy selection (i.e., ROD/EDD signature). For the majority of RI/FS
projects, remedy selection is the responsibility of the Regional Administrator (RA). Program management
responsibilities are delegated to the Waste Management Division Director and managers, with project
specific management and oversight assigned to Remedial Project Managers (RPMs). Senior management staff
are likely to be involved primarily in scoping of the RI/FS and review and approval of the decision
document.
The EPA RPM is the designated decision maker for the DQO development process. In this role, the RPM is
responsible for coordinating the DQO development process and for overseeing remedial contractors, state
officials, or private parties conducting the RI/FS.
For federal lead projects, day-to-day decision making becomes the responsibility of the remedial
contractor's site manager under the direction of the RPM. Remedial contractors incorporate technical
review and oversight by senior level management and technical experts into their internal scoping and
project planning process. For state lead or private party lead projects, the state project manager or
private party project manager will be a key decision maker assisted by their contractor's site manager.
The RPM should be in close contact with the federal remedial contractor, state project manager, or
private party project manager to ensure that project activities are proceeding on track and are
consistent with EPA policy and guidance.
[Figure 3-1. DQO Stage 1 Elements: identify and involve data users; evaluate available data; develop conceptual model (contaminant sources, migration pathways, potential receptors, contaminants of concern); specify objectives/decisions.]
3.1.2 DATA USERS' ROLE
The interactions of decision makers and various data users during the DQO development process are illustrated in Figure 3-2 and discussed below.
Primary Data Users
Primary data users are those individuals involved in ongoing RI/FS activities. These activities include
RI/FS planning and implementation, project management and oversight, site specific decision making, and
DQO development. For federal lead projects, this includes the RPM and the remedial contractor's site
manager and staff. For state lead or private party lead projects, this includes the state or private
party manager and their contractor's site manager/staff, along with the RPM.
The contractor site manager must identify the appropriate contractor technical staff based upon the
overall problems at the site. For example, if ground water contamination is a concern,
geologists/hydrogeologists and water supply/treatment engineers may be involved, at a minimum. If
surface water contamination is a concern, aquatic biologists, limnologists and water resource engineers
may be involved. Analytical chemists can assist in specifying the types of analyses to be used and the
limitations of the particular techniques or methods. Individuals familiar with the interactions of
chemicals in the environment, such as geochemists, soil scientists, and chemists, must also be involved
to assess environmental impacts. Geostatisticians can provide assistance in evaluating spatially
distributed data. Toxicologists and individuals familiar with risk assessments should also be involved
early in the scoping process to ensure that appropriate consideration is given to potential migration
pathways, receptors and contaminants of concern.
Secondary Data Users
Secondary data users rely on RI/FS outputs to support their activities. Secondary data users provide
input to the decision maker and primary data users by communicating generic or site specific data needs.
Depending on project lead, secondary data users may include the state, enforcement personnel, ATSDR, U.S.
Army Corps of Engineers, and others. The level of involvement of secondary data users will vary
according to site specific requirements, program lead, or Agency policy.
Technical Support and Project Review/Audit
At the request of the RPM, technical specialists may provide support related to project specific sampling
and analytical activities, regulatory requirements, etc. Project review and audit personnel such as ESD,
Office of Regional Counsel, and EPA HQ help ensure QA program integrity and compliance with program
policy.
3.2 EVALUATE AVAILABLE INFORMATION
Available information is reviewed and evaluated as the initial step in the RI/FS process. This review
provides the foundation for additional on-site activities and serves as the database for RI/FS scoping.
The review and an initial site visit are used for a preliminary interpretation of site conditions.
3.2.1 DESCRIBE CURRENT SITUATION
The initial data review should be as thorough and accurate as possible. Information should be obtained
from EPA technical and enforcement files, state/local regulatory agency files, USGS files, and other
relevant sources. Files from potentially responsible parties (PRPs) should also be referred to when
available. A detailed list of potential data sources is contained in Section 2.0 of the Guidance on Remedial Investigations Under CERCLA (EPA 1985a).
[Figure 3-2. Decision Maker/Data Users Interaction. The DQO decision maker (the RPM) interacts with: division management (program and project oversight; ROD/settlement recommendations to the RA); the RA/AA (ROD/settlement decisions, enforcement negotiations); the state (remedy concurrence); ATSDR (health assessments); the Corps of Engineers (RD/RA activities on federal-lead projects); primary data users (RPM; contractor's site manager and staff; state project manager for state-lead projects; private party project manager for private party-lead projects); secondary users (enforcement/state/ATSDR/Corps), whose inputs include generic data needs (ATSDR papers, state standards, chain of custody) and site-specific data needs (special pathway information, PRP identification, RD/RA needs); technical support (ESD/others/TAC team), providing sampling/analytical support and regulatory requirements upon request by the RPM; and project review/audit (ESD/ORC/HQ), addressing QA integrity and compliance with policy.]
The preliminary data are confirmed by on-site observations. The goals of the initial site inspection are
as follows:
Utilizing field analytical procedures, obtain data on volatile chemical contaminants,
radioactivity, and explosivity hazards to determine appropriate health and safety levels.
Estimate if any conditions could pose an imminent danger to public health.
Confirm the information contained in previous documents.
Record observable data missing in previous documents.
Update site conditions if undocumented changes have occurred.
Perform an inventory of possible off-site sources of contamination.
Obtain data such as location of access routes, sampling points and the site organization
requirements for the field investigation.
Geophysical surveys, limited field screening, or limited field analysis may be performed during the
initial site inspection. This type of initial sampling may help determine the variability of the media,
provide background information, or determine if site conditions have changed.
3.2.2 REVIEW AVAILABLE DATA
For many sites, previous studies have provided useful information upon which further investigations can
be based. The quality of the data should be analyzed to determine its usability. These evaluations
determine the uncertainty associated with the conclusions drawn from the data.
A number of factors relate to the quality of data and its adequacy for use in the RI/FS process,
including the following considerations:
Age of the data
Analytic methods used
Detection limits of methods
QA/QC procedures and documentation
Methods used for sample collection are as important to consider as the methods used for sample analysis.
These considerations fall into two broad categories: statistical and standard operating procedures
(SOPs). The statistical considerations relate to the representativeness of the data and the level of
confidence that may be placed in conclusions drawn from the data (confidence levels are discussed in
Appendix A). Following SOPs ensures sample integrity and data comparability and reduces sampling and
analytical error. Typical issues to consider include the following:
Sampling objective and approach
Sample collection methods
Chain of custody documentation
Sample preservation techniques
Sample shipment methods
Holding times
If limited or no information exists on sample collection, preservation techniques, or holding times, the data should be interpreted with caution.
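The screening logic described above can be reduced to a simple checklist. The sketch below is one hypothetical way to flag existing records whose usability is questionable; the record fields, age cutoff, and holding-time limit are assumptions chosen for illustration and are not prescribed by this guidance.
    # Hypothetical usability screen for existing data records.
    # Thresholds and field names are illustrative only.
    from datetime import date

    MAX_AGE_YEARS = 5          # assumed cutoff for "aged" data
    MAX_HOLDING_DAYS = 14      # assumed holding-time limit for the analyte

    def usability_flags(record, action_level):
        """Return reasons a historical result should be used with caution."""
        flags = []
        age_years = (date.today() - record["sample_date"]).days / 365.25
        if age_years > MAX_AGE_YEARS:
            flags.append("data older than screening cutoff")
        if record["detection_limit"] > action_level:
            flags.append("detection limit above action level")
        if not record["qaqc_documented"]:
            flags.append("QA/QC procedures not documented")
        if record["holding_time_days"] is None or record["holding_time_days"] > MAX_HOLDING_DAYS:
            flags.append("holding time unknown or exceeded")
        return flags

    example = {"sample_date": date(1984, 6, 1), "detection_limit": 10.0,
               "qaqc_documented": False, "holding_time_days": None}
    print(usability_flags(example, action_level=5.0))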
3.2.3 ASSESS ADEQUACY OF DATA
The uncertainty associated with each data measurement activity should be considered when data are
evaluated. Although data may be validated analytically, the level of precision of a particular data
point may not provide sufficient certainty for use in a decision. (Precision and its use in decision
making are discussed in Appendix A.)
It is important to recognize the distinction between uncertainty associated with a measurement activity
and uncertainty associated with a decision during development of DQOs. The uncertainty associated with a
measurement activity is a function of the statistical distribution of errors for each reported
concentration value. At a typical site, many measurement activities are performed and many data are
obtained. Decisions are made after analyzing and summarizing the data. The uncertainty associated with
a decision is a function of the statistical distributions of the factors (statistics) which were used in
reaching the decision. Assessment of data adequacy, then, has two steps. The first step is data
validation. The second step is determining if the data is sufficient to reduce the uncertainty
surrounding a decision to an acceptable level.
Data validation identifies invalid data and qualifies the usability of the remaining data. The output of
data validation is qualitative or quantitative statements of data quality. Once the quality of
individual measurements is known, a compilation of all data points into a cohesive statement regarding,
for example, the areal extent of contamination can be made. Areas requiring remediation can then be
delineated based on specific action levels. The confidence associated with such a remediation decision
incorporates both the confidence in individual measurements as well as in the estimated area requiring
remediation. These types of confidence statements can only be made if a detailed statistical evaluation
of the data is undertaken. Details regarding establishment of criteria and action levels are discussed
in Section 4.0 of this document.
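As a simplified, concrete illustration of such a confidence statement, the fraction of sampled locations exceeding an action level can be reported with a confidence interval. The sketch below uses a normal approximation to the binomial and hypothetical data; it stands in for, and does not replace, the fuller statistical treatments discussed in Appendix A (including spatial and geostatistical approaches).
    # Simplified sketch: confidence interval on the proportion of sampled
    # locations exceeding an action level.  All values are hypothetical.
    import math

    results = [3.1, 8.4, 12.9, 2.2, 15.6, 7.7, 22.3, 4.5, 9.9, 18.2]  # mg/kg
    action_level = 10.0                                               # assumed

    n = len(results)
    exceedances = sum(1 for x in results if x > action_level)
    p_hat = exceedances / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)    # normal approximation
    z = 1.96                                   # ~95% two-sided
    lower, upper = max(0.0, p_hat - z * se), min(1.0, p_hat + z * se)

    print(f"{exceedances}/{n} locations exceed the action level")
    print(f"estimated exceedance proportion: {p_hat:.2f} (95% CI {lower:.2f}-{upper:.2f})")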
3.3 DEVELOP CONCEPTUAL MODEL
Conceptual models describe a site and its environs and present hypotheses regarding the contaminants present, their routes of migration, and their potential impact on sensitive receptors. The hypotheses are tested, refined and modified throughout the RI/FS. Figure 3-3 depicts the basic elements of a conceptual model for an uncontrolled hazardous waste site. The development of a conceptual model for a hypothetical site is presented in Section 3.4 of the Example Scenario document.
3.3.1 EVALUATION OF THE CONCEPTUAL MODEL
The conceptual model should be detailed enough to address potential or suspected sources, types and concentrations of contaminants, affected media, rates and routes of migration, and receptors. Figure 3-4 presents an illustration which supports a conceptual model.
[Figure 3-3. Elements of a Conceptual Evaluation Model. Source (contaminants, concentration, time, location); pathway/media (rate of migration, time, loss functions); receptors (type, sensitivity, time, concentration, number). Hypotheses to be tested: a source exists; the source can be contained, removed and disposed, or treated; a pathway exists; the pathway can be interrupted or eliminated; receptors are not impacted by migration of contaminants; receptors can be relocated; institutional controls can be applied; receptors can be protected.]
[Figure 3-4. Example Conceptual Model Illustration: potential sources (lagoon, drums, contaminated soils) with volatilization and surface runoff pathways, a perched water table above glacial till, an unconfined aquifer, and bedrock.]
The following are assessed during development of the conceptual model to determine appropriate remedial
and/or removal actions at a site:
Population, environmental, and welfare concerns at risk
Routes of exposure
Spatial distribution of contaminants
Atmospheric dispersion potential and proximity of targets
Amount, concentration, hazardous properties, environmental fate and form of the substance(s)
present
Hydrogeological factors
Climate
Extent to which the source can be adequately identified and characterized
Potential for reuse, recycling or treatment of substances at the site
Likelihood of future releases if the substances remain on-site
Extent to which natural or man-made barriers currently contain the substances and the
adequacy of the barriers
Assessment of the potential pathways of migration and a model of such
Extent to which the substances have migrated or are expected to migrate from their source and
whether migration poses a threat to public health, welfare, or the environment
Extent to which contamination levels exceed applicable or relevant and appropriate federal or
state requirements (ARARs) relating to public health or environmental standards and criteria
Data evaluation should be undertaken at the initiation of any remedial action program and at each point within the program that additional data are obtained. Additional data collected during the RI are used to expand the conceptual model and determine if sufficient data of adequate quality have been obtained to address the issues of concern.
3.3.2 COMPUTER MODELS
Common, but difficult, questions to be addressed during a remedial action program deal with defining the extent of contamination, setting action limits and establishing the acceptable likelihood of an incorrect decision. These types of questions generally require that data be evaluated utilizing tools such as ground water models, air quality models, and/or geostatistical methods. Ground water models include several levels of analysis: simple graphical techniques, analytical solution techniques, and numerical solution techniques. Using this broad definition of modeling, one of these techniques is almost always applied to examine a ground water contamination problem. Thus, the primary question becomes not when to use modeling, but what level of analysis is required to meet the objectives of the study.
The role of modeling must be evaluated with respect to the entire site investigation. The evaluation of small sites with relatively uniform geology may be accomplished by the use of simple analytical models.
Larger sites with complex stratigraphy, involving contamination in multiple layers and with variable
aquifer parameters, can only be represented by a sophisticated numerical model.
A common misconception about ground water modeling and geostatistical techniques is that they are applied
only during the final stages of an RI, after all the data are collected. Modeling techniques can be
applied throughout the RI. For example, during the early stages of an RI, modeling can be used to guide
the data collection program. Sensitivity analyses can help identify the types of data needed, as well as
critical sampling locations. As data collection proceeds during a phased RI, or when a large amount of
data exist from previous investigations, models can be used to provide a consistent framework for
organizing the data. During the latter stages of an FS, models can be applied to predict the future
behavior of a ground water system under natural or artificial stresses, such as varied pumping schemes.
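This guidance does not prescribe any particular model. Purely as an illustration of the "simple analytical model" end of the spectrum described above, the sketch below (in Python, with hypothetical parameter values) evaluates a one-dimensional analytical solution of the advection-dispersion equation for a continuous source, a form commonly attributed to Ogata and Banks; selection of the appropriate level of analysis for a given site remains a judgment for the data users.
    # One-dimensional advection-dispersion with a continuous source at x = 0
    # (Ogata-Banks analytical solution).  Parameters are hypothetical.
    import math

    def relative_concentration(x, t, v, D):
        """C/C0 at distance x (m) and time t (d); v = seepage velocity (m/d),
        D = longitudinal dispersion coefficient (m^2/d)."""
        a = (x - v * t) / (2.0 * math.sqrt(D * t))
        b = (x + v * t) / (2.0 * math.sqrt(D * t))
        # The second term is usually small; guard against overflow at large x.
        second = math.exp(v * x / D) * math.erfc(b) if v * x / D < 700 else 0.0
        return 0.5 * (math.erfc(a) + second)

    v, D, t = 0.1, 0.5, 365.0          # m/d, m^2/d, days (assumed values)
    for x in (10.0, 25.0, 50.0, 100.0):
        print(f"x = {x:5.1f} m   C/C0 = {relative_concentration(x, t, v, D):.3f}")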
3.4 SPECIFY OBJECTIVES/DECISIONS
In a broad sense, the objective of a remedial action program is to determine the nature and extent of the
release or threat of release of hazardous substances and to select a cost effective remedial action to
minimize or eliminate that threat. Achieving this broad objective requires that several complicated and
interrelated activities be performed, each having objectives, acceptable levels of uncertainty, and
attendant data quality requirements. The expression of these objectives in clear, precise decision
statements is the first step toward the development of a cost-effective data collection program.
3.4.1 OBJECTIVES
Project objectives should address major areas of the remedial process. These include characterizing the
site with respect to the environmental setting, proximity and size of human population, and nature of the
problem; identifying potential remedies; and determining specific performance levels of the potential
remedies.
Specifying the objectives can be thought of as identifying problems to be solved. Objectives tend to be
geared toward separate media or sources. However, these objectives should be consistent with the
ultimate objective of selecting a remedial alternative(s) to address the entire site. Table 3-1 lists
general RI/FS objectives.
Defining the types of decisions which will be made regarding remedial actions requires a clear
understanding of the problems posed by the site and awareness of the consequences of making a wrong
decision.
3.4.2 DECISION TYPES
The consequences of making a wrong decision regarding site remediation will vary depending on the
situation. For example, a decision may be made not to implement a remedial alternative designed to
mitigate the migration of contaminants in ground water because the data indicate that dispersion and
degradation of the contaminants will reduce concentrations to health-based levels. If the contaminants
actually migrated beyond the site and were encountered in the ground water system, it may be suggested
that a wrong decision was made. The consequences of this wrong decision at a site where residents derive
their drinking water from the contaminated aquifer would be different from the consequences of
contamination of an aquifer which was not used as a water supply.
The consequences of a wrong decision must be weighed for each major decision to be made during the
remedial action process. Where the consequences of a wrong decision carry significant public health, safety, or environmental impacts, greater attention must be paid to obtaining the data required to ensure
that the decision is sound.
TABLE 3-1
General RI/FS Objectives

Objective: Determine presence or absence of contaminants
  RI Activity: Establish presence/absence of contaminants at source and in all pathways
  FS Activity: Evaluate applicability of no action alternative for source areas/pathways

Objective: Determine types of contaminants
  RI Activity: Establish "nature" of contaminants at source and in pathways; relate contaminants to PRPs (cost recovery)
  FS Activity: Evaluate environmental/public health threat; identify applicable remedial technologies

Objective: Determine quantities (concentrations) of contaminants
  RI Activity: Establish concentration gradients
  FS Activity: Evaluate costs to achieve applicable or relevant and appropriate standards

Objective: Determine mechanism of contaminant release to pathways
  RI Activity: Establish mechanics of source/pathway(s) interface
  FS Activity: Evaluate effectiveness of containment technologies

Objective: Determine direction of pathway(s) transport
  RI Activity: Establish pathway(s)/transport route(s); identify potential receptor(s)
  FS Activity: Identify most effective points in pathway to control transport of contaminants

Objective: Determine boundaries of source(s) and pathways
  RI Activity: Establish horizontal/vertical boundaries of source(s) and pathway(s) of contamination
  FS Activity: Evaluate costs to achieve relevant/applicable standards; identify applicable remedial technologies

Objective: Determine environmental/public health factors
  RI Activity: Establish routes of exposure, and environmental and public health threat
  FS Activity: Evaluate applicable standards or risk; identify applicable remedial technologies

Objective: Determine source/pathway contaminant characteristics with respect to mitigation (bench studies)
  RI Activity: Establish range of contaminants/concentrations
  FS Activity: Evaluate treatment schemes
The risk of making a wrong decision is related to the quantity and quality of information available. As
shown in Figure 3-5, as the quantity and quality of data increase, the risk of making a wrong decision generally decreases. This is not a linear inverse relationship, since at some point the collection of
additional data or improvement of data quality will not significantly decrease the risk of making wrong
decisions.
Data quantity and data quality are independent variables which must be considered jointly during
assessment of the consequences of making a wrong decision. Collecting increasing quantities of data
points which are of low quality may not add significantly to the reduction of risk of making a wrong
decision. Increasing the data quality of a limited number of samples may not add significantly to the
body of knowledge to be used in making a decision.
The value of obtaining additional data or increasing data quality has traditionally been based on
professional judgment for RI/FS projects. The intent of the DQO process is to provide a systematic
approach for the evaluation of the risk associated with making a wrong decision and for determining
levels of uncertainty associated with decisions to provide a framework for the RPM.
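The diminishing return on additional data can be illustrated with a simple calculation: because the standard error of a mean shrinks roughly with the square root of the number of samples, each halving of that uncertainty requires roughly four times as many samples. The sketch below assumes a fixed, hypothetical measurement standard deviation and is offered only as an illustration of this general behavior.
    # Illustration of diminishing returns: standard error of the mean vs. sample count.
    # The standard deviation of individual measurements is assumed (hypothetical).
    import math

    sigma = 4.0  # assumed standard deviation, concentration units
    for n in (4, 8, 16, 32, 64, 128):
        se = sigma / math.sqrt(n)
        print(f"n = {n:3d}   standard error of the mean = {se:.2f}")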
3.5 REFERENCES
Federal Register. 1985. National Oil and Hazardous Substances Pollution Contingency Plan. Final Rule. Vol. 50, No. 224. November 20.
U.S. Environmental Protection Agency (EPA). 1985a. Guidance on Remedial Investigations Under CERCLA. Office of Emergency and Remedial Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response, Washington, DC; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/002. June.
. 1985b. Guidance on Feasibility Studies Under CERCLA. Office of Emergency and Remedial Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response, Washington, DC; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/003. June.
[Figure 3-5. Relationship of Risk and Data Quality/Quantity: the risk of making wrong decisions decreases as data quality/quantity increases.]
4.0 RI/FS DQO STAGE 2 - IDENTIFY DATA USES/NEEDS
Stage 2 of the DQO process defines data uses and specifies the types of data needed to meet the project
objectives. Although data needs are identified generally during Stage 1, it is during Stage 2 that
specific data uses are defined.
The major elements of Stage 2 of the DQO process, as identified in Figure 4-1, are:
Identify data uses
Identify data types
Identify data quality needs
Identify data quantity needs
Evaluate sampling/analysis options
Review PARCC parameters
Stage 2 begins after the conceptual model is developed and overall project objectives are established.
The conceptual model and the general decisions become the basis for determining data uses and data needs.
Stage 1 determines if existing data meet the project objectives. If the existing data are sufficient,
there is no need to collect additional data. If the data are insufficient, the types, quality, and
quantity of data which must be collected will be determined in Stage 2.
4.1 IDENTIFY DATA USES
Data uses must be stated very specifically to serve their purpose in development of DQOs. This task
should not be taken lightly.
As a demonstration of the importance of accurately specifying data uses, consider the following example.
Ground water samples are to be obtained at a site with known shallow ground water contamination. The
homes in the area derive water from private wells which tap a deeper bedrock aquifer. Based upon the DQO
approach, professional experience, and program guidelines provided by the RPM, the contractor decides
that ground water from the bedrock aquifer should be sampled. However, additional questions to address
during Stage 2 of the DQO process include:
How many samples are required?
Where should samples be obtained?
How many QA/QC samples are needed (field trip blanks, collocated samples, field and laboratory duplicates, spikes)?
Will data be used to determine if an alternative water supply should be provided to affected
homes?
At what contaminant level are water supplies believed to be affected?
Will decisions be based upon analysis of samples from private water supply wells or from
monitoring wells?
[Figure 4-1. DQO Stage 2 Elements: identify data uses; identify data types; identify data quality needs; identify data quantity needs; evaluate sampling/analytical options; review PARCC parameters.]
If contaminants are not detected in private water supply wells but are detected in monitoring
wells, how will data be used to assess risks to receptors?
As demonstrated, the list of questions which can be generated to evaluate a simplistic problem in one
medium can be quite extensive.
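One of the questions above, how many QA/QC samples are needed, often reduces to applying standard frequencies such as those summarized on the DQO Summary Form (Table 4-2): collocated samples, replicates, and field blanks at roughly 5 percent of field samples, and one trip blank per day of sampling. The sketch below simply applies those frequencies to a hypothetical sampling event; actual frequencies are confirmed or set on a project-specific basis.
    # Applying typical field QC frequencies (see Table 4-2) to a hypothetical event.
    import math

    field_samples = 40          # assumed number of field samples
    sampling_days = 5           # assumed duration of the sampling event

    qc = {
        "collocated samples": math.ceil(0.05 * field_samples),  # ~5%
        "field replicates":   math.ceil(0.05 * field_samples),  # ~5%
        "field blanks":       math.ceil(0.05 * field_samples),  # ~5%
        "trip blanks":        sampling_days,                    # 1 per day
    }
    for kind, count in qc.items():
        print(f"{kind}: {count}")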
4.1.1 DATA USE CATEGORIES
RI/FS data uses can be described in general purpose categories. These categories represent generic uses
but vary on a site-by-site basis. Further, specific sites may require data for purposes other than those
described here. The categories listed in Table 4-1 represent the most common RI/FS data uses. Tables
4-1 and 4-2 are forms that can be used by project managers to document the thought processes involved in
DQOs and the S&A plan. The categories do not represent different data qualities, only different uses
which may require data of a given quality. In other words, data collected for a site at a given level of
quality may be used for different purposes. The data use categories are briefly described below:
Site Characterization - Data are used to determine the nature and extent of contamination at
a site. This category is usually the one that requires the most data collection. Site
characterization data are generated through the sampling and analysis of waste sources and
environmental media.
Health and Safety - Data are typically used to establish the level of protection needed for
investigators or workers at a site, and if there should be an immediate concern for the
population living within the site vicinity.
Risk Assessment - Data are used to evaluate the threat posed by a site to public health and
the environment. Risk assessment data are generated through the sampling and analysis of
environmental and biological media, particularly where the potential for human exposure is
great.
Evaluation of Alternatives - Data are used to evaluate various remedial technologies.
Engineering data are collected in support of remedial alternative evaluation and to develop
cost estimates. This may involve performing bench-scale or pilot scale studies to determine
if a particular process or material may be effective in mitigating site contamination.
Engineering Design of Alternatives - Data collected during the RI/FS can be used for
engineering design purposes to develop a preliminary data base in reference to the
performance of various remedial technologies. Data types collected during the RI/FS which
are applicable to the RD process include waste characterization and preliminary volume
estimates (these estimates usually need to be refined further by additional data collection
activities during the RD/RA).
Monitoring During Remedial Action - During the remedial action, samples can be taken to
assess the effectiveness of the action. Based on the analysis of these samples, corrective
measures may be taken.
PRP Determination - Data may be used to help establish liability at multiple-party sites.
For known PRPs, data are used to link their wastes to those found on the site and to pollutants released to the environment; for unknown PRPs, site wastes are compared to pollutant profiles of known waste streams. Data are also used for injunctive actions and
cost recovery.
Once the data use categories are listed, the intended uses must be prioritized. Establishing an order of
priority for the intended data uses will help identify the most demanding use of each type of data, i.e.,
TABLE 4-1
DATA USES

SITE NAME: ____   LOCATION: ____   NUMBER: ____   EPA REGION: ____
PHASE (circle one): RI1  RI2  RI3  ERA  FS  RD  RA
DATE: ____   CONTRACTOR: ____   SITE MANAGER: ____

MEDIA (rows) versus DATA USE (columns); check appropriate box(es):
  Media: Source Sampling (type); Soil Sampling; Ground Water Sampling; Surface Water/Sediment Sampling; Air Sampling; Biological Sampling; Other
  Data Uses: Site Characterization (including Health & Safety); Risk Assessment; Evaluation of Alternatives; Engineering Design of Alternatives; Monitoring During Remedial Action; PRP Determination; Other

Form CDM SF DQO 1.001
TABLE 4-2
DQO SUMMARY FORM

1. SITE
   NAME: ____   LOCATION: ____   NUMBER: ____   EPA REGION: ____
   PHASE (circle one): RI 1  RI 2  RI 3  ERA  FS  RD  RA

2. MEDIA (circle one): SOIL   GW   SW/SED   AIR   BIO   OTHER

3. USE (circle all that apply): SITE CHARAC.   RISK ASSESS.   EVAL. ALTS.   ENG'G DESIGN   MONITORING REMEDIAL ACTION   PRP DETER.   OTHER

4. OBJECTIVE: ____

5. SITE INFORMATION
   AREA: ____
   DEPTH TO GROUND WATER: ____
   GROUND WATER USE: ____
   SOIL TYPES: ____
   SENSITIVE RECEPTORS: ____

6. DATA TYPES (circle appropriate data types)
   A. ANALYTICAL DATA: pH   CONDUCTIVITY   VOA   ABN   TCLP   PESTICIDES   PCB   METALS   CYANIDE   TOX   TOC   BTX   COD   HARDNESS
   B. PHYSICAL DATA: PERMEABILITY   POROSITY   GRAIN SIZE   BULK DENSITY   HYDRAULIC HEAD   PENETRATION TEST

7. SAMPLING METHOD (circle method(s) to be used): ENVIRONMENTAL   SOURCE   BIASED   GRID   GRAB   COMPOSITE   NON-INTRUSIVE   INTRUSIVE   PHASED

8. ANALYTICAL LEVELS (indicate level(s) and equipment/methods)
   LEVEL 1   FIELD SCREENING - EQUIPMENT: ____
   LEVEL 2   FIELD ANALYSIS - EQUIPMENT: ____
   LEVEL 3   NON-CLP LABORATORY - METHODS: ____
   LEVEL 4   CLP/RAS - METHODS: ____
   LEVEL 5   NON-STANDARD: ____

9. SAMPLING PROCEDURES
   BACKGROUND - 2 PER EVENT OR ____
   CRITICAL (list): ____
   PROCEDURES: ____

10. QUALITY CONTROL SAMPLES (confirm or set standard)
    A. FIELD: COLLOCATED - 5% OR ____; REPLICATE - 5% OR ____; FIELD BLANK - 5% OR ____; TRIP BLANK - 1 PER DAY OR ____
    B. LABORATORY: REAGENT BLANK - 1 PER ANALYSIS BATCH OR ____; REPLICATE - 1 PER ANALYSIS BATCH OR ____; MATRIX SPIKE - 1 PER ANALYSIS BATCH OR ____; OTHER ____

11. BUDGET REQUIREMENTS
    BUDGET: ____   STAFF: ____   SCHEDULE: ____

CONTRACTOR: ____   SITE MANAGER: ____   PRIME CONTRACTOR: ____   DATE: ____
FOR DETAILS SEE SAMPLING & ANALYSIS PLAN
Form CDM SF DQO 1.002
TABLE 4-2 (CONTINUED)
DQO SUMMARY FORM INSTRUCTIONS
1. SITE - Identify the site and phase of the work to be conducted
NAME - Site name or assignment as stated in the WA
LOCATION - City or town, county, and state where the site is located
NUMBER - Site number as stated in the WA
EPA REGION - EPA Region where the site is located
PHASE - Circle the work phase for which DQOs are being developed (number
sequentially for each phase as appropriate):
RI - Remedial Investigation
ERA - Expedited Response Action
FS - Feasibility Study
RD - Remedial Design
RA - Remedial Action
2. MEDIA - Circle the media being investigated; only one form will be completed
for each medium.
SOIL - Surface and subsurface soils
GW - Ground water
SW/SED - Surface water and sediment (a sediment sample will be taken if
possible at each surface water sampling location)
AIR - Air quality and respirable dust monitoring
BIO - Biological monitoring, flora and fauna
OTHER - Indicate other "media" being investigated, i.e., buildings,
underground conduits, etc.
3. USE - Circle the intended use(s) of the data to be developed.
SITE CHARAC. (H&S) - Site characterization, which includes a determination
of the level(s) of health and safety protection required at the site
RISK ASSESS. - Risk assessment; data to be used to perform the
endangerment assessment or public health evaluation
EVAL. ALTS. - Evaluate alternatives; data will be used to evaluate or screen
remedial/technological alternatives
ENG'G DESIGN - Data will be used to perform detailed engineering design
of the remedy
MONITORING - Data will be used to monitor during remedy implementation
or establish baseline conditions for long-term monitoring after site
remediation
PRP DETERMINATION - Data will be used to confirm/fingerprint
contaminants to specific potentially responsible parties for possible
future or pending enforcement actions
OTHER - Indicate other specific data uses
4. OBJECTIVE - Provide a concise, specific statement that answers the question
"Why am I taking these samples?"
5. SITE INFORMATION - Provide the site information necessary to
gain an overview of the site and the relative complexity and extent
of data requirements
AREA - Indicate the area of the site in acres and an indication of the
configuration (length and width)
DEPTH TO GROUND WATER - Indicate the depth to ground water from the
ground surface, to the extent known; identify seasonal fluctuation and the
depth and thickness of multiple aquifers
GROUND WATER USE - Identify both potable and non-potable ground water
use(s) by aquifer, if appropriate, and the point(s) of extraction relative to
the site
SOIL TYPES - Identify, to the extent known, the site soil strata and relative
depths below ground surface
SENSITIVE RECEPTORS - Identify population and environmental concerns,
relative to the site, which could be impacted by contaminant migration
6. DATA TYPES - Circle the appropriate analytical and physical data required
to determine the type, degree, extent, and migration characteristics of
the contaminants and the required site characteristics. The selection of
data types required must be developed by the site manager with the data
users as described in Section 3.2
7. SAMPLING METHODS - Circle the appropriate sampling method(s) to be used
in obtaining the required data in accordance with the objectives above
ENVIRONMENTAL - Refers to media sampling of air, water, soils, and the
biological environment to determine the extent of contamination
SOURCE - Refers to the sampling of the actual contamination source(s)
BIASED - Refers to sampling which focuses on a specific site area,
characteristic, or problem factor based upon site knowledge and/or modeling
GRID - Refers to unbiased sampling which provides a representative estimate
of the contamination problem over the entire site
GRAB - Refers to discrete samples which are representative of a specific
location at a specific point in time
COMPOSITE - The mixture of a number of grab samples to represent the average
properties of the parameters of concern over the extent of the area
sampled
NON-INTRUSIVE - Refers to obtaining data using methods and equipment
that do not require the physical extraction of a sample from the media
being sampled
INTRUSIVE - Refers to physically extracting samples from the media
being sampled
PHASED - Refers to performing discrete time-phased sampling events and
using the information obtained in the previous event to refine the
subsequent sampling event
8. ANALYTICAL LEVELS - The analytical levels are described in Section 4
of the guidance
LEVEL 1 FIELD SCREENING - EQUIPMENT - Identify the field monitoring
equipment to be used and the manufacturer's specified detection limits
when known
LEVEL 2 FIELD ANALYSIS - EQUIPMENT - Identify the field analysis to be
used and the historically achievable instrument detection limits
LEVEL 3 NON-CLP LABORATORY - METHODS - Identify the laboratory
method(s) to be used and the historically achievable precision
and accuracy when available
LEVEL 4 CLP/RAS - METHODS - Identify the CLP laboratory method(s)
to be used and the historically achievable precision and accuracy
LEVEL 5 NON-STANDARD - Specify requirements for non-standard
analysis, the analytical procedures to be used, and the required precision
and accuracy
9. SAMPLING PROCEDURES - The procedures to be used in obtaining the
required samples are to be defined, a description of the critical
samples is to be provided, and the requirement of obtaining a
minimum of two background samples per sampling event is to be
confirmed or the deviation from this minimum standard defined
10. QUALITY CONTROL SAMPLES - The identified minimum standards
for the field and laboratory quality control samples must be
confirmed or revised on a site-specific basis
11. BUDGET REQUIREMENTS - Based upon the analysis summarized above,
the critical resource requirements shall be defined
BUDGET - The estimated cost of the sampling and analysis shall be
presented in dollars
SCHEDULE - The total time required to perform the sampling and the
estimated time, as appropriate, to perform the analysis shall be
presented in calendar days, by phase
STAFF - The key staff disciplines required to perform the sampling shall
be identified
The form shall identify the contractor directly responsible for the work and the
prime contractor, and must be signed and dated by the site manager.
4-6
-------
the use requiring the highest level of confidence, and therefore the lowest level of uncertainty. The
data quality required will be a function of the acceptable limits of uncertainty established by the
decision maker. The limits on uncertainty will drive the selection of both the analytical and sampling
approaches.
4.1.2 RI/FS USES
During the evaluation of data uses, the potential remedial options which will be considered during the
RI/FS must be reviewed.
As mandated by the Superfund Amendments and Reauthorization Act of 1986 (SARA), treatment alternatives
should be developed ranging from an alternative which minimizes long-term management of residuals to an
alternative involving treatment that significantly reduces toxicity, mobility, or volume as a principal
element. In addition, a containment option involving little or no treatment and a no-action alternative
should also be developed.
For each of the appropriate action categories, the following information or analyses should be considered
during the DQO process:
List of candidate remedial actions
Method by which the initial alternatives will be screened, including effectiveness criteria,
implementability criteria, and cost criteria
Detailed effectiveness screening will examine whether the alternatives protect public health
and the environment; meet ARARs; cause a reduction in toxicity, mobility, or volume; and
provide acceptable reliability.
Detailed implementability screening will examine the technical feasibility, availability, and
administrative feasibility of each alternative.
Detailed cost screening will examine the capital, O&M, and replacement costs as well as the
present worth of the alternatives (a brief present-worth sketch follows this list).
Both the short and long-term effects of the screening factors must be assessed and the
alternatives must be compared to identify their relative strengths and weaknesses.
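As a minimal illustration of the present worth arithmetic used in cost screening, the Python sketch below
discounts assumed capital, O&M, and replacement costs for a hypothetical alternative. The dollar figures,
discount rate, replacement year, and analysis period are illustrative assumptions only and are not values
prescribed by this guidance.

# Hypothetical present-worth calculation for cost screening of a remedial
# alternative. All cost figures, the discount rate, and the analysis period
# are assumed for illustration only.

def present_worth(capital, annual_om, replacement, replacement_year,
                  discount_rate, years):
    """Sum the discounted capital, O&M, and replacement costs."""
    pw = capital  # capital cost incurred at time zero
    for year in range(1, years + 1):
        pw += annual_om / (1 + discount_rate) ** year
        if year == replacement_year:
            pw += replacement / (1 + discount_rate) ** year
    return pw

# Example: a containment alternative with assumed costs over a 30-year period
pw = present_worth(capital=2_500_000, annual_om=150_000, replacement=400_000,
                   replacement_year=15, discount_rate=0.05, years=30)
print(f"Present worth: ${pw:,.0f}")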
The remedial process involves a number of data collection activities, each having specific objectives.
Since the objectives require varying degrees of data quality, it is critical to identify the specific use
to which each set of data will be applied.
4.2 IDENTIFY DATA TYPES
Data use categories define the general purposes for which data will be collected during the RI. Based on
the intended uses, a concise statement regarding the data types needed can be developed. After
identifying the data types and uses, data quality needs can be defined, and a systematic evaluation of
sampling and analysis options can be performed.
Data types can be specified in broad groups initially, such as background samples or media samples, and
then these broad groups are divided into more specific components. Figure 4-2 illustrates the process of
continual refinement of data types for a hypothetical ground water contamination problem. The process
should be followed for each medium of interest or each source material. The result of completing the
entire decision matrix is the specification of the data type needed for each intended data use.
4-7
-------
FIGURE 4-2
SAMPLE TYPE SPECIFICATION LOGIC DIAGRAM

[Figure not legible in this copy. The diagram illustrates the progressive refinement of data types
for the hypothetical ground water contamination problem, from broad sample groups (e.g., ground
water, surface water) to contaminant classes (organic contaminants, inorganic contaminants) and
specific parameter lists (e.g., volatile organics, Hazardous Substance List, Appendix VIII), along
with exposure considerations such as direct contact and the food chain.]
-------
Since environmental media and source materials are interrelated at uncontrolled hazardous waste sites,
data types used to evaluate ground water contamination may also be used to evaluate soil contamination.
By identifying data types by media, overlapping data needs are identified. The types of analyses
performed on each sample must be determined while identifying data types. The analytical requirements
are dictated by the use of the data.
The data types specified in Stage 2 should not be limited to chemical analytical parameters, but should
also include physical parameters such as permeability and porosity, which are needed to evaluate
contaminant migration. The level of detail in data type definition must be sufficient to allow for
evaluation of sampling/analysis options during subsequent stages of the DQO process.
4.3 IDENTIFY DATA QUALITY NEEDS
4.3.1 DATA QUALITY FACTORS
Consideration of data quality needs should begin with the identification of data uses and data types.
Important factors in defining data quality include:
Prioritized data uses
Appropriate analytical levels
Contaminants of concern
Level of concern
Required detection limit
Critical samples
These factors should be considered to define data quality needs in a general way at the start of an
RI/FS. As work proceeds and more data become available, more precise statements can be made.
Appropriate Analytical Levels
There is little or no information on many factors which critically affect data quality, such as sample
variability, sample container cleanliness, effect of different sample collection and analytical
preparation techniques, etc. Most available measurement data quality information addresses only the
analytical technique. To provide some guidance, this section defines analytical levels and then
indicates the levels appropriate to different generic RI/FS data uses. Appendix B of this document
provides a more detailed discussion of analytical considerations.
The analytical levels are defined as follows:
Level I - field screening or analysis using portable instruments. Results are often not
compound-specific and not quantitative, but they are available in real time. It is the
least costly of the analytical options.
Level II - field analyses using more sophisticated portable analytical instruments; in some
cases, the instruments may be set up in a mobile laboratory on site. There is a wide range
in the quality of data that can be generated. Data quality depends on the use of suitable calibration
standards, reference materials, and sample preparation equipment, and on the training of the
operator. Results are available in real time or within several hours.
4-9
-------
Level III - all analyses performed in an off-site analytical laboratory. Level III analyses
may or may not use CLP procedures, but do not usually utilize the validation or documentation
procedures required of CLP Level IV analysis. The laboratory may or may not be a CLP
laboratory.
Level IV - CLP routine analytical services (RAS). All analyses are performed in an off-site
CLP analytical laboratory following CLP protocols. Level IV is characterized by rigorous
QA/QC protocols and documentation.
Level V - analysis by non-standard methods. All analyses are performed in an off-site
analytical laboratory which may or may not be a CLP laboratory. Method development or method
modification may be required for specific constituents or detection limits. CLP special
analytical services (SAS) are Level V.
Levels III, IV and V all incorporate some time lag between submission of samples to the laboratory and
receipt of results. Table 4-3 provides more information on these analytical levels; Table 4-4 identifies
appropriate analytical levels for generic RI/FS data uses.
It can be seen from Table 4-4 that, for each generic data use, several analytical levels may be
appropriate. The decision maker needs further criteria to select the most appropriate. Important
criteria are the contaminants of concern and the level of concern for each contaminant.
Engineering design (see Table 4-4) usually requires considerations beyond analytical levels for chemical
analyses. Physical property data (viscosity, soil organic carbon, etc.) are often necessary for
engineering design. While most of the chemical analysis requirements for engineering design data needs
can be accomplished by Level II, III and IV analyses, the physical property type analyses will usually
fall within the Level V and "other" categories.
Contaminants of Concern
At some sites it may be clear which contaminants are of concern because they have known adverse impacts
on human health. In such cases, the appropriate health standards can be used to set levels of concern.
Often a large number of contaminants are found at a site. In such cases it is not feasible or desirable
to specify levels of concern for each observed contaminant. Rather, a small number of indicator
chemicals are selected and levels of concern are determined for these chemicals. Indicator chemicals are
the most toxic, mobile, persistent, or frequently occurring contaminants found on site. The process of
selecting indicator contaminants is described in the Superfund Public Health Evaluation Manual (EPA
1985).
Levels of Concern and ARARs
The level of concern specifies a concentration range above which some action may need to be taken. The
level of concern is intimately linked with the action level, which defines the "level of cleanup" for
remedial activities under SARA. In general, levels of concern are site specific issues and relate to
site characterization and assessment. The applicable or relevant and appropriate requirements (ARARs),
as mandated by SARA, are related to defining remedial design criteria and legal requirements.
An exact action level is not required before initiating an RI field investigation; however, a rough
estimate is necessary to ensure that the chosen analytical methods are accurate at the level of concern.
Also, knowledge of the level of concern can influence the number of samples required and the selection of
analytical methods. For these reasons, an acceptable range of values should be specified. As work on a
site progresses and more data become available, the level of concern will be further refined and
incorporated into the ROD as an action level.
4-10
-------
TABLE 4-3
SUMMARY OF ANALYTICAL LEVELS APPROPRIATE TO DATA USES

LEVEL I
  DATA USES:         SITE CHARACTERIZATION; MONITORING DURING IMPLEMENTATION
  TYPE OF ANALYSIS:  TOTAL ORGANIC/INORGANIC VAPOR DETECTION USING PORTABLE
                     INSTRUMENTS; FIELD TEST KITS
  LIMITATIONS:       INSTRUMENTS RESPOND TO NATURALLY-OCCURRING COMPOUNDS
  DATA QUALITY:      IF INSTRUMENTS CALIBRATED AND DATA INTERPRETED CORRECTLY,
                     CAN PROVIDE INDICATION OF CONTAMINATION

LEVEL II
  DATA USES:         SITE CHARACTERIZATION; EVALUATION OF ALTERNATIVES;
                     ENGINEERING DESIGN; MONITORING DURING IMPLEMENTATION
  TYPE OF ANALYSIS:  VARIETY OF ORGANICS BY GC; INORGANICS BY AA; XRF;
                     TENTATIVE ID; ANALYTE-SPECIFIC; DETECTION LIMITS VARY
                     FROM LOW ppm TO LOW ppb
  LIMITATIONS:       TENTATIVE ID; TECHNIQUES/INSTRUMENTS LIMITED MOSTLY TO
                     VOLATILES, METALS
  DATA QUALITY:      DEPENDENT ON QA/QC STEPS EMPLOYED; DATA TYPICALLY
                     REPORTED IN CONCENTRATION RANGES

LEVEL III
  DATA USES:         RISK ASSESSMENT; PRP DETERMINATION; SITE CHARACTERIZATION;
                     EVALUATION OF ALTERNATIVES; ENGINEERING DESIGN; MONITORING
                     DURING IMPLEMENTATION
  TYPE OF ANALYSIS:  ORGANICS/INORGANICS USING EPA PROCEDURES OTHER THAN CLP;
                     CAN BE ANALYTE-SPECIFIC; RCRA CHARACTERISTIC TESTS
  LIMITATIONS:       TENTATIVE ID IN SOME CASES; CAN PROVIDE DATA OF SAME
                     QUALITY AS LEVELS IV, V
  DATA QUALITY:      SIMILAR DETECTION LIMITS TO CLP; LESS RIGOROUS QA/QC

LEVEL IV
  DATA USES:         RISK ASSESSMENT; PRP DETERMINATION; EVALUATION OF
                     ALTERNATIVES; ENGINEERING DESIGN
  TYPE OF ANALYSIS:  HSL ORGANICS/INORGANICS BY GC/MS; AA; ICP; LOW ppb
                     DETECTION LIMITS
  LIMITATIONS:       TENTATIVE IDENTIFICATION OF NON-HSL PARAMETERS; SOME TIME
                     MAY BE REQUIRED FOR VALIDATION OF PACKAGES
  DATA QUALITY:      GOAL IS DATA OF KNOWN QUALITY; RIGOROUS QA/QC

LEVEL V
  DATA USES:         RISK ASSESSMENT; PRP DETERMINATION
  TYPE OF ANALYSIS:  NON-CONVENTIONAL PARAMETERS; METHOD-SPECIFIC DETECTION
                     LIMITS; MODIFICATION OF EXISTING METHODS; APPENDIX VIII
                     PARAMETERS
  LIMITATIONS:       MAY REQUIRE METHOD DEVELOPMENT/MODIFICATION; MECHANISM TO
                     OBTAIN SERVICES REQUIRES SPECIAL LEAD TIME
  DATA QUALITY:      METHOD-SPECIFIC
-------
TABLE 4-4
APPROPRIATE ANALYTICAL LEVELS - BY DATA USE

(Checklist matrix of analytical level versus data use; the checked combinations are
summarized below.)

  SITE CHARACTERIZATION (INCLUDING HEALTH & SAFETY):     LEVELS I, II, III
  RISK ASSESSMENT:                                       LEVELS III, IV, V
  EVALUATION OF ALTERNATIVES:                            LEVELS II, III, IV
  ENGINEERING DESIGN OF REMEDIAL ACTION:                 LEVELS II, III, IV, V
                                                         (physical property data
                                                         typically Level V or "other";
                                                         see text)
  MONITORING DURING IMPLEMENTATION OF REMEDIAL ACTION:   LEVELS I, II, III
  PRP DETERMINATION:                                     LEVELS III, IV, V
  OTHER:                                                 AS APPROPRIATE

NOTE: CHECK APPROPRIATE BOX(ES)
CDM SF DQO 1.001
HEM 1006
-------
Determination of levels of concern is a site specific activity. The decision maker and data users
(toxicologists, geologists, and engineers) must meet to determine the appropriate action level range for
the site. Tables in Appendix E summarize potentially applicable or relevant and appropriate requirements
and toxicity values. The standards do not consider simultaneous exposure from multiple routes.
Standards may also be based on levels, durations, or frequencies of exposure that differ from those at a
specific site. The standards and criteria that are used, especially when conducting public health
assessments, must correspond to the media for which they are developed.
In the listing of applicable standards which can be used for selecting action levels, few standards are
available for soil contamination. Generally, some type of modeling may be required to specify the level
of concern for soil. The type of model selected will be based on the potential route of exposure. If
contaminated soil is carried in the air and inhaled by receptors, air modeling may be required. If
contaminants leach from soils into ground water and are transported to receptor wells, a ground water
model may be required. These models are useful in assessing the potential impact resulting from
migration of contaminants at a specified level of concern to a receptor at a specified cancer risk level,
for instance. The available models are specified in the Superfund Exposure Assessment Manual (EPA 1985).
Detection Limit Requirements
The level of concern selected directly affects data quality requirements. The sampling and analysis
methods used must be accurate at the level of concern. Since sampling accuracy is hard to evaluate or
control, it is extremely important that the analytical technique chosen has a detection limit well below
the level of concern. This factor must be considered in evaluating analytical options. Appendix B
provides more detailed information on detection limits. Appendix H lists CLP contractually required
detection limits.
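As a minimal sketch of this consideration, the Python fragment below screens hypothetical candidate
analytical methods against an assumed level of concern, requiring the detection limit to be well below
it (a factor of ten is assumed here purely for illustration). The method names, detection limits, and
margin are assumptions, not requirements of this guidance.

# Hypothetical screen of candidate analytical methods against a level of
# concern. Method names, detection limits, and the margin are assumed.

LEVEL_OF_CONCERN_PPB = 5.0   # assumed level of concern for one contaminant
MARGIN = 10.0                # "well below" taken here as a factor of ten

candidate_methods = {        # method: detection limit in ppb (all assumed)
    "field GC (Level II)": 10.0,
    "non-CLP laboratory GC/MS (Level III)": 0.5,
    "CLP RAS GC/MS (Level IV)": 0.2,
}

for method, detection_limit in candidate_methods.items():
    adequate = detection_limit <= LEVEL_OF_CONCERN_PPB / MARGIN
    status = "adequate" if adequate else "detection limit too high"
    print(f"{method}: DL = {detection_limit} ppb -> {status}")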
Critical Samples
Critical samples are those for which valid data must be obtained to satisfy the objectives of the
sampling and analysis task. An example of a critical data point may be an upgradient well in a ground
water contamination study or any other data point considered vital to the decision making process. In
some cases, taking critical samples in duplicate is appropriate.
4.3.2 COST ANALYSIS OF ALTERNATIVES
The program goal for developing cost estimates in feasibility studies is to estimate to within +50
percent and -30 percent of the actual cost of the selected remedial alternative. This puts requirements
on the type and amount of data which must be collected during the field investigation and requires the
decision maker to consider the range of potential remedial alternatives before planning the field
investigation.
Where a possible alternative is source removal or treatment, the cost criteria may be used to determine
the quantity of data required. If the cost of the remedial alternative is strictly proportional to the
volume of material removed or treated, sufficient data must be obtained to determine the volume of
material to within +50 percent and -30 percent. Normally, however, there is some uncertainty in the
capital costs and the efficiency of the treatment or removal procedure. Therefore, it is necessary to
determine the volume of contaminated soil as accurately as possible.
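To make the +50/-30 percent goal concrete, the short Python sketch below shows the band of acceptable
estimates around a hypothetical actual cost; the dollar value is assumed for illustration.

# Illustration of the feasibility study cost estimating goal: the estimate
# should fall within +50 percent and -30 percent of the actual cost of the
# selected remedial alternative. The actual cost used here is assumed.

actual_cost = 1_000_000           # assumed actual cost of the alternative

low = actual_cost * (1 - 0.30)    # an estimate no more than 30 percent low
high = actual_cost * (1 + 0.50)   # an estimate no more than 50 percent high

print(f"Acceptable estimate range: ${low:,.0f} to ${high:,.0f}")
# -> Acceptable estimate range: $700,000 to $1,500,000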
4.4 IDENTIFY DATA QUANTITY NEEDS
The number of samples which should be collected can be determined using a variety of approaches. The
validity of the approach utilized is dependent on the characteristics of the media under investigation
and the assumptions used to select sample locations. In situations where data are unavailable or
4-13
-------
limited, a phased sampling approach may be appropriate. Phase I data can be evaluated to determine the
appropriate number of samples to be obtained in subsequent phases of the RI.
In the absence of available data, the data users and decision makers will be required to develop a
rationale for selecting sampling locations. Questions to guide the data users in selecting appropriate
locations could include:
Do source materials still exist on the soil surface?
Is there evidence of soil disturbance or vegetative stress based upon review of aerial
photographs?
Do geologic features in the area control ground water and surface water flow patterns?
Do site conditions favor surficial soil erosion or wind erosion?
Are sensitive receptors located in the vicinity of the site?
These types of questions can be addressed in the absence of any analytical data and will assist in
identifying additional data needs. Subsequent discussions may lead to the recommendation that
geophysical surveys or soil gas and other field screening studies be conducted in areas of soil
disturbance. Collection of a limited number of samples from identified source materials or pathways,
such as streams, may also be considered. Limited air sampling may also be warranted during the early
stages of the RI to determine if organic vapors or particulates could pose a problem.
In situations where data are available, or as new data are added to the site's data base, statistical
techniques may be utilized in determining the quantity of data required. Appendix A provides examples of
the applicability and methodology of various statistical techniques.
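One common classical approach, sketched below in Python under assumed inputs, estimates the number of
samples needed to bound a mean concentration within a chosen margin of error. The variability estimate,
confidence level, and margin are illustrative assumptions; Appendix A should be consulted for the
statistical methods this guidance actually discusses.

# Minimal sketch of a classical sample size estimate for a site mean:
# n = (z * s / d)^2, where s is a standard deviation estimated from prior
# data and d is the acceptable margin of error. All inputs are assumed.

import math

def sample_size(std_dev, margin_of_error, z=1.96):
    """Samples needed to estimate a mean within +/- margin_of_error at
    roughly 95 percent confidence (z = 1.96)."""
    return math.ceil((z * std_dev / margin_of_error) ** 2)

# Example with assumed Phase I results: s = 12 ppm, desired margin = 5 ppm
print(sample_size(std_dev=12.0, margin_of_error=5.0))   # -> 23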
Following evaluation of the data, the adequacy of the data to support a decision can be determined. If a
higher degree of certainty in the decision is required (e.g., a more definitive statement regarding the
extent of contamination), then additional data should be obtained in subsequent sampling phases. In all
cases, the actual level of confidence in a decision can only be established following collection and
evaluation of data. Therefore, at the completion of each data collection activity, data evaluation is
critical.
4.5 EVALUATE SAMPLING/ANALYSIS OPTIONS
Following the identification of data uses, data types, and data quality needs, an evaluation of sampling
and analysis options can be undertaken. Numerous sampling and analysis options could be developed for
any data collection activity. The possible options are a function of the data types needed.
4.5.1 SAMPLING AND ANALYSIS APPROACH (PHASING)
Data collection activities must be designed to ensure maximum use of the data. Developing a sampling and
analysis approach which ensures that appropriate levels of data quantity and quality are obtained may be
accomplished by use of a phased RI approach and by the use of field screening techniques to direct the
data collection activities. By subdividing the data collection program into a number of phases, the data
can be obtained in a sequence which allows it to be used to direct subsequent data collection activities.
The time required for receipt of analytical data from laboratories often results in delays in an RI
program. By utilizing field techniques for assessing contaminant concentrations or media
characteristics, the RI can proceed with fewer delays.
4-14
-------
Direct reading instruments which should be considered for use during the evaluation of a
sampling/analysis approach include:
Photoionization detectors (PIDs)
Flame ionization detectors (FIDs)
Hydrogen sulfide analyzers
Hg vapor analyzers
Respirable particulate meters
Radiation meters
Oxygen/explosimeters
pH and conductivity meters
Other devices and field tests which allow for assessment of site conditions without the need for
laboratory support include:
Oil/water interface units
Slug tests
Infiltrometers
These direct reading instruments can be taken into the field to obtain data without extensive calibration
procedures. Additional levels of quantification can be obtained with transportable instruments such as
gas chromatographs (GC), x-ray fluorescence, or atomic absorption devices. For these instruments,
however, calibration using known standards must be completed prior to field use.
Conceptually, an analytical approach can be thought of as a large "inverted funnel" whereby large numbers
of samples are analyzed quickly and cost-effectively in the field, with successively smaller numbers of
samples analyzed further using successively more sophisticated procedures. The type and design of this
analytical approach is determined by how the data will be used. By strategically selecting the samples
analyzed at each level, a much higher degree of certainty can be obtained for the overall data set
without sacrificing either the quantity of samples analyzed or the quality of data collected.
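A minimal Python sketch of this "inverted funnel" idea follows: a large set of hypothetical field
readings is screened at Level I, a subset is carried to Level II, and a smaller, strategically selected
subset goes to Level IV. The readings, threshold, and selection rules are assumptions for illustration
only.

# Hypothetical "inverted funnel" analytical approach: many inexpensive field
# screens, fewer field GC analyses, and a small number of CLP confirmations.
# Readings, the screening threshold, and the selection rules are assumed.

import random

random.seed(1)
field_readings = {f"S{i:03d}": random.uniform(0, 50) for i in range(1, 101)}

SCREEN_THRESHOLD = 10.0   # assumed Level I headspace screening threshold

# Level I: screen every sample location in the field.
detects     = [s for s, r in field_readings.items() if r >= SCREEN_THRESHOLD]
non_detects = [s for s, r in field_readings.items() if r < SCREEN_THRESHOLD]

# Level II: all detects plus a limited number of non-detects.
level2 = detects + non_detects[:5]

# Level IV: a strategically selected subset (every fourth Level II sample,
# purely for illustration) plus any critical samples.
critical = ["S001"]                          # assumed critical data point
level4 = sorted(set(level2[::4] + critical))

print(len(field_readings), "Level I field screens")
print(len(level2), "Level II field GC analyses")
print(len(level4), "Level IV CLP RAS analyses")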
For example, consider a hazardous waste site where the soil is contaminated with volatile organic
compounds (VOCs). For this example, the objectives of the sampling are determination of VOC
concentrations at site boundaries and assessment of the direct contact threat. It is assumed that a
photoionization detector will detect contaminants at the levels of concern for this example.
The analytical approach for this hypothetical situation is illustrated in Figure 4-3 and summarized
below:
Samples from all locations are analyzed in real time using photoionization field headspace
techniques (Level I).
A limited number of samples for which nothing was detected and all samples for which VOCs
were detected are analyzed on-site using a portable gas chromatograph (Level II).
4-15
-------
FIGURE 4-3
INTEGRATION OF ANALYTICAL SUPPORT LEVELS

[Figure not legible in this copy. The graphic relates the analytical support levels used in the
example: data quality increases, and cost and turnaround time increase, as successively smaller
numbers of samples are carried to higher analytical levels.]

Note: Although not applicable to the example situation, Level III support is shown to indicate that
this is a viable option for confirmatory analyses.

4-16
-------
A number of samples are selected for analysis by CLP RAS (Level IV) for the Hazardous
Substance List (HSL) compounds. All samples identified as critical data points are included.
This step provides confirmation for all preceding work, including verification that indicator
parameters are representative of contaminants of concern and are identified appropriately.
The results of all split samples analyzed by different levels are interpreted for quality
control purposes.
This approach can also be utilized in a time-phased manner, i.e., by using the results of an initial
sampling round with a lower level of analysis to fine-tune the sampling approach for a subsequent
sampling round using higher level(s) of analytical support. Another approach involves complete GC/MS
analysis of the initial sampling round to identify the organic compounds present, followed by GC analysis
of specific compounds of interest in later rounds. Gas chromatography with the appropriate detector can
provide lower cost analyses, often with lower detection limits and higher precision and accuracy, than
GC/MS. It is necessary, however, to verify by GC/MS that interfering compounds are not present.
4.5.2 RESOURCE CONSIDERATIONS
The resources available for performing a remedial action must be evaluated during the scoping process.
Within Stage 2 of DQO development, the time required for obtaining data, the personnel resources and
equipment required, and the costs for data collection must be evaluated. This evaluation is most
effectively performed as sampling/analysis options are identified.
The cost for analytical support varies considerably depending on the type of analysis required. Schedule
requirements dictating the need for rapid turnaround escalate analytical costs. The cost associated with
obtaining samples must also be considered during the evaluation of sampling/analysis options. Cost
savings can be achieved by performing multiple media sampling activities simultaneously (e.g., sample
ground water and surface water during the same sampling event).
Critical path activities and technical staff resource needs should be identified early to facilitate
efficient planning for the RI/FS.
4.6 REVIEW PARCC PARAMETER INFORMATION
The PARCC (precision, accuracy, representativeness, completeness, and comparability) parameters are
indicators of data quality. Ideally, the end use of the measurement data should define the necessary
PARCC parameters. In the ideal situation, numerical precision, accuracy, and completeness goals would be
established and these goals would aid in selecting the measurement methods.
As noted earlier, RI/FS work does not fit this ideal situation. RI/FS sites are so different and
information on overall measurements (sampling plus analysis) is so limited that it is not practical to
set universal PARCC goals at this time. Rather, the historical precision and accuracy achieved by
different analytical techniques should be reviewed to aid in selecting the most appropriate technique.
To indicate achievable precision and accuracy, tables in Appendix F present historical precision and
accuracy information for analytical techniques classified by level. EPA will continue to make
information of this type available so that a data base of numerical precision and accuracy requirements
appropriate to different data uses will develop.
4.6.1 PRECISION
Precision measures the reproducibility of measurements under a given set of conditions. Specifically, it
is a quantitative measure of the variability of a group of measurements compared to their average value.
4-17
-------
Precision is usually stated in terms of standard deviation but other estimates such as the coefficient of
variation (relative standard deviation), range (maximum value minus minimum value), and relative range
are common.
The overall precision of measurement data is a mixture of sampling and analytical factors. Analytical
precision is much easier to control and quantify than sampling precision. There are more historical data
related to individual method performance, and the "universe" is limited to the samples received in the
laboratory. In contrast, sampling precision is unique to each site.
Sampling precision may be determined by collecting and analyzing collocated or field replicate samples
and then creating and analyzing laboratory replicates from one or more of the field samples. The
analytical results from the collocated or field replicate samples provide data on overall measurement
precision; analysis results from the laboratory replicates provide data on analytical precision.
Subtracting the analytical precision from the measurement precision defines the sampling precision.
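As a minimal sketch of these measures, the Python fragment below computes the standard deviation,
coefficient of variation, range, and relative range for hypothetical replicate results, and then follows
the subtraction described above to separate sampling from analytical precision. The replicate values are
assumed for illustration.

# Hypothetical precision calculations for replicate measurements. The
# concentration values are assumed for illustration only.

import statistics

def precision_summary(values):
    """Standard deviation, coefficient of variation, range, relative range."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    rng = max(values) - min(values)
    return {"mean": mean, "std dev": sd,
            "coeff of variation (%)": 100 * sd / mean,
            "range": rng, "relative range (%)": 100 * rng / mean}

field_replicates = [12.1, 14.3, 11.8, 13.5]   # collocated field samples (ppm)
lab_replicates   = [12.9, 13.2, 13.0]         # laboratory replicates (ppm)

overall    = precision_summary(field_replicates)   # overall measurement precision
analytical = precision_summary(lab_replicates)     # analytical precision

# Sampling precision estimated by subtraction, as described in the text.
sampling_sd = overall["std dev"] - analytical["std dev"]
print(overall)
print(analytical)
print(f"Sampling precision (std dev): {sampling_sd:.2f} ppm")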
4.6.2 ACCURACY
Accuracy measures the bias in a measurement system; it is difficult to measure for the entire data
collection activity. Sources of error are the sampling process, field contamination, preservation,
handling, sample matrix, and sample preparation and analysis techniques. Sampling accuracy may be assessed
by evaluating the results of field/trip blanks; analytical accuracy may be assessed through use of known
and unknown QC samples and matrix spikes.
As an example of how the sampling process can affect accuracy, consider the collection of ground water
samples for volatile organic analysis. In the actual sampling, some portion of the volatile components
may be lost. There is no way to measure this loss easily. The sample could also be subjected to
contamination from a wide range of sources in the field and laboratory. To check the system for
contamination, trip and field blanks can be used.
4.6.3 REPRESENTATIVENESS
Representativeness expresses the degree to which sample data accurately and precisely represent a
characteristic of a population, parameter variations at a sampling point, or an environmental condition.
Representativeness is a qualitative parameter which is most concerned with the proper design of the
sampling program. The representativeness criterion is best satisfied by making certain that sampling
locations are selected properly and a sufficient number of samples are collected.
Representativeness is addressed by describing sampling techniques and the rationale used to select
sampling locations. Sampling locations can be biased (based on existing data, instrument surveys,
observations, etc.) or unbiased (completely random or stratified-random approaches). Either way, the
rationale used to determine sampling locations must be explicitly explained. If a sampling grid is being
utilized, it should be shown on a map of the site. The type of sample, such as a grab or composite
sample, as well as the relevant standard operating procedure (SOP) for sample collection, should be
specified.
An example of the way representativeness is ensured in a sampling program is the use of proper ground
water sampling techniques. The SOPs for ground water sampling require that a well be purged a certain
number of well volumes prior to sampling, to be certain that the sample is representative of the
underlying aquifer at a point in time.
Representativeness can be assessed by the use of collocated samples. By definition, collocated samples
are collected so that they are equally representative of a given point in space and time. In this way,
they provide both precision and representativeness information.
4-18
-------
4.6.4 COMPLETENESS
Completeness is defined as the percentage of measurements made which are judged to be valid measurements.
The completeness goal is essentially the same for all data uses: that a sufficient amount of valid data
be generated. It is important that critical samples are identified and plans made to achieve valid data
for them.
Almost no historical data on the completeness achieved by individual methods exist. However, CLP
data have been found to be 80-85 percent complete on a nationwide basis. This can be extrapolated to
indicate that Level III, IV, and V analytical techniques will generate data that are approximately 80
percent complete. Levels I and II would be expected to have lower completeness levels. However, since
they are on-site measurement techniques providing results in real time or after minimal delay, invalid
measurements can be repeated easily. Thus, a high degree of completeness can be achieved with these
analytical levels.
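The completeness arithmetic is a one-line calculation, sketched below in Python with assumed counts.

# Completeness: percentage of planned measurements judged valid. Counts are
# assumed for illustration only.

measurements_planned = 120
measurements_valid = 103

completeness = 100 * measurements_valid / measurements_planned
print(f"Completeness: {completeness:.1f}%")    # -> Completeness: 85.8%
# Compare against the roughly 80 percent historically achieved for CLP data,
# and confirm separately that every critical sample yielded valid results.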
4.6.5 COMPARABILITY
Comparability is a qualitative parameter expressing the confidence with which one data set can be
compared with another. Sample data should be comparable with other measurement data for similar samples
and sample conditions. This goal is achieved through using standard techniques to collect and analyze
representative samples and reporting analytical results in appropriate units. Comparability is limited
by the other PARCC parameters because only when precision and accuracy are known can data sets be
compared with confidence.
4.7 UTILIZING PARCC PARAMETER INFORMATION
In Stage 2 of the DQO process, the PARCC parameters should be considered in evaluating sampling and
analysis options. To the extent possible, they should be defined as goals in the Stage 3 Data Collection
Program.
Whenever measurement data are reviewed (in Stage 1 of the DQO process), the PARCC parameters which
were achieved should be included in the review. The laboratory should provide numerical precision
and accuracy data; Level II field analyses may also generate precision and accuracy data. Precision
and accuracy data may be expressed in several ways and are best evaluated by an analytical chemist
or a statistician. Since the precision data quantify the scatter of results about a mean value, a
lower precision value means less scatter. Accuracy is most frequently reported as percent recovery,
or percent bias. A 100 percent recovery indicates a completely accurate measurement; the closer the
recovery is to 100 percent, the more accurate the measurement. Percent bias reports the difference
of the result from the true value. A completely accurate measurement would have zero percent bias;
the lower the percent bias, the more accurate the measurement.
The data user must keep the level of concern and the end use of the data in mind when reviewing
precision and accuracy information. In some cases, even data of poor precision and/or accuracy may
be useful. For example, if all the results are far above the level of concern, the precision and
accuracy are much less important. However, close to the level of concern, precision and accuracy
are quite important and should be carefully reviewed. If results have very good precision but poor
accuracy, it may be acceptable to correct the reported results using the percent recovery or percent
bias data. This judgment should be made by a data user with appropriate technical expertise.
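The Python sketch below works through the percent recovery and percent bias arithmetic described above
and shows the kind of recovery correction a qualified data user might apply. The spike, recovery, and
reported values are assumed for illustration.

# Hypothetical accuracy calculations: percent recovery, percent bias, and a
# recovery-corrected result. Spike and reported values are assumed.

spiked_concentration = 10.0    # ppm added in a matrix spike (assumed)
recovered = 8.5                # ppm reported for the spike (assumed)

percent_recovery = 100 * recovered / spiked_concentration    # 85 percent
percent_bias = percent_recovery - 100                        # -15 percent

# A data user with appropriate technical expertise might correct a reported
# field result for the measured recovery:
reported_result = 4.2                                        # ppm (assumed)
corrected_result = reported_result / (percent_recovery / 100)

print(percent_recovery, percent_bias, round(corrected_result, 2))
# -> 85.0 -15.0 4.94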
4-19
-------
4.8 REFERENCES
U.S. Environmental Protection Agency (EPA). 1987. Compendium of Field Operations Methods. Office
of Emergency and Remedial Response, Washington, D.C. June.
. 1983. Characterization of Hazardous Waste Sites - A Methods Manual. Volume 1 - Site
Investigations. NTIS PB84-126929. EPA/600/4-84/075.
. 1985. Guidance on Remedial Investigations Under CERCLA. Office of Emergency and Remedial
Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response,
Washington, D.C.; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/002. June.
. 1985. Guidance on Feasibility Studies Under CERCLA. Office of Emergency and Remedial
Response, Office of Waste Programs Enforcement, Office of Solid Waste and Emergency Response,
Washington, D.C.; Office of Research and Development, Cincinnati, Ohio. EPA/540/G-85/003. June.
. 1985. Superfund Exposure Assessment Manual. Office of Solid Waste and Emergency Response.
OSWER Directive 9285.5-1.
. 1985. Sediment Sampling Quality Assurance User's Guide. EPA 600/4-85-048.
. 1984. Superfund Public Health Evaluation Manual. Office of Solid Waste and Emergency
Response. OSWER Directive 9285.4-1.
4-20
-------
5.0 RI/FS DQO STAGE 3 DESIGN DATA COLLECTION PROGRAM
Stage 3 of the DQO process entails design of the detailed data collection program for the remedial action
project. Through the process of addressing the elements identified in Stages 1 and 2, all the components
required for completion of Stage 3 should be available. Stage 3 is outlined in Figure 5-1.
5.1 ASSEMBLE DATA COLLECTION COMPONENTS
During Stage 2, specific DQOs were developed by media or sampling activity. The intent of Stage 3 is to
compile the information and DQOs developed for specific tasks into a comprehensive data collection
program. A detailed list of all samples to be obtained should be assembled in a format which includes
phase, media, sample type, number of samples, sample location, analytical methods, and QA/QC samples
(type and number). In addition, a schedule for all sampling activities should be developed in bar chart
or critical path method format.
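One way such a sample list could be structured is sketched below in Python; the field names and the
single example entry are assumptions for illustration, not a prescribed format.

# Hypothetical structure for the detailed sample list assembled in Stage 3.
# Field names and the example entry are assumed for illustration only.

from dataclasses import dataclass

@dataclass
class PlannedSample:
    phase: str               # e.g., "RI Phase 1"
    media: str               # soil, ground water, surface water/sediment, ...
    sample_type: str         # grab, composite, ...
    number_of_samples: int
    location: str
    analytical_method: str   # analytical level and method
    qc_samples: str          # type and number of QA/QC samples

inventory = [
    PlannedSample(phase="RI Phase 1", media="ground water", sample_type="grab",
                  number_of_samples=8, location="monitoring wells MW-1 to MW-8",
                  analytical_method="Level IV CLP RAS volatiles",
                  qc_samples="1 trip blank/day; 1 field blank per 20 samples"),
]

for sample in inventory:
    print(sample)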
5.2 DEVELOP DATA COLLECTION DOCUMENTATION
The output of the DQO process is a well defined sampling and analysis (S&A) plan with summary information
provided in the work plan.
Data collection documentation requirements vary on a regional basis within the EPA. The DQO guidance
provided in this document does not require the submittal of deliverables in addition to those already
established in the regions. Rather, the DQO process provides a framework to ensure that all the
pertinent issues related to the collection of data with known quality are addressed.
5.2.1 SAMPLING AND ANALYSIS PLANS
A written quality assurance/site sampling plan must be prepared for all remedial investigation activities
which involve sampling. These plans should include the following:
Description of the objectives of the sampling efforts, including the phase of the sampling
and ultimate use of the data
Specification of sampling protocol and procedures
Specification of the types, locations, and frequency of samples to be taken
The S&A plan identifies the individuals responsible and the procedures for field activities and sample
analyses. Quality assurance project plan (QAPjP) elements should be addressed in the S&A plan. The
standard elements of a QAPjP are listed in Table 5-1. Details on preparation of QAPjPs are contained in
Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (EPA 1980).
The 16 points required in a QAPjP may be incorporated by reference if the information has been documented
elsewhere. For example, if a project description (Item 3) is available in the work plan, it is
acceptable to refer to this document rather than repeat the information. Quality assurance issues which
are program-wide in nature, such as internal quality control checks (Section 11), performance and system
audits (Section 12), corrective action (Section 15), and quality assurance reports to management (Section
16), are generally specified in the quality assurance program plan (QAPP) and can be included in the
QAPjP by reference.
Field investigation activities can be undertaken in a phased approach. Separate sampling/analysis plans
may be prepared for the separate phases of a remedial investigation. For example, geophysical
investigations may be performed to select locations for monitoring wells. In such a case, a sampling
plan should be prepared for the geophysical investigations and, following evaluation of the data, a
5-1
-------
ASSEMBLE DATA COLLECTION COMPONENTS

DEVELOP DATA COLLECTION DOCUMENTATION
  WORK PLAN
  SAMPLING & ANALYSIS PLAN (INCLUDE QAPjP ELEMENTS)

FIGURE 5-1
STAGE 3 ELEMENTS
DESIGN DATA COLLECTION PROGRAM
5-2
-------
TABLE 5-1
QUALITY ASSURANCE PROJECT PLAN ELEMENTS
1 Title Page
Introduction
2 Table of Contents
3 Project Description
4 Project Organization
5 Quality Assurance Objectives
for Data Measurement
6 Sampling Procedure
7 Sample and Document Custody
Procedures
8 Calibration Procedures and Frequency
9 Analytical Procedures
10 Data Reduction, Validation
and Reporting
11 Internal Quality Control Checks
12 Performance and System Audits
13 Preventive Maintenance
14 Data Measurement Assessment
Procedures
15 Corrective Action
16 Quality Assurance Reports to
Management
5-3
-------
separate plan should be developed for installation of the wells. Additional plans for the subsequent
phases of a remedial investigation may be prepared at any time during the course of the project as the
need for additional field investigation is identified.
5.2.2 WORK PLANS
Work plans define the scope of services, level-of-effort, costs, and schedule for performing the RI/FS;
in general, the work plan describes what will be done, while the S&A plan and QAPjP describe how each
task will be done. The scope of the sampling effort depends on the quality of existing data, an
understanding of the site problems, identification and evaluation of feasible remedial actions, and
enforcement needs.
The work plan provides the general description of the activities to be performed as part of the RI/FS.
However, it does not contain the detailed description of how a sample is obtained or an analysis
performed. This type of information is presented in the S&A plan. The level of detail to be included in
the work plan for the RI phase is outlined below:
How site mapping will be performed including survey limits, the scale of the plan to be
produced, the horizontal and vertical control, and significant site features
Number of individuals to be involved in each field sampling task and estimated duration in
days
Identification of geophysical survey areas or transects, soil boring and test pit locations
on the map provided in the draft work plan
Number of samples to be obtained in the field including blanks and duplicates and the
location from which the samples will be obtained illustrated on a map included in the draft
work plan
List of analyses to be performed
A general discussion of DQOs
Identification of pilot or bench-scale studies that will be performed
This information is required as part of the work plan in order to establish a basis for the schedule and
cost estimate. Work plans prepared for a phased RI approach should be specific for the initial phase,
and general for subsequent phases, with subsequent phases well defined when the previous phase is
completed.
5.2.3 ENFORCEMENT CONCERNS
All RI/FS activities should be conducted and documented such that sufficient data are collected to make
sound decisions concerning remedial action selection. This applies to both fund-lead and potentially
responsible party-lead projects. The data collection and documentation activities should be similar for
all types of RI/FSs. In other words, if enough data are collected using appropriate protocols, and the
data are sufficiently valid to base a remedial action decision upon, then the procedures and
documentation should be sufficient to be admissible as evidence in litigation.
The guidelines outlined below should be followed to assure that data quality objectives are met:
Appropriate plans (i.e., work plans, sampling and analysis plans, QAPjP) should be developed
to document intentions.
5-4
-------
Field notebooks should be maintained to keep accurate records of sampling activities.
Personnel should have appropriate experience or training.
Chain of custody records must be kept for samples.
Methods used for sampling and analysis should be valid from an engineering/scientific
standpoint and be consistent with standard analytical procedures.
Documentation should be sufficient to allow the persons involved in the site studies to
reconstruct the work if necessary.
EPA's or the state's responsibility from a QA/QC standpoint is to audit randomly some RI/FS
field sampling, analysis (QA/QC) and data validation to confirm that procedures utilized were
sufficient.
The above requirements pertain to civil cases only. Criminal cases will require additional documentation
and/or materials. EPA counsel should be consulted in these cases.
5.3 REFERENCES
American Chemical Society. 1983. Principles of Environmental Analysis. Analytical Chemistry
55:2210-2218.
ASTM. 1985. Quality Assurance for Environmental Measurements. ASTM Special Technical Publication 867.
J.K. Taylor and T.W. Stanley, eds.
Taylor, J.K. 1981. Quality Assurance of Chemical Measurements. Analytical Chemistry, Volume 53,
No. 14. December.
U.S. EPA. 1980. Standard Operating Procedures and Quality Assurance Manual - Draft. Region IV
Surveillance and Analysis Division, Water Surveillance Branch. Athens, Georgia.
. 1980. Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans.
QAMS. EPA-600/4-83-004. NTIS PB83-170514.
. 1981. Work Plan Development: Technical Methods for Investigating Sites Containing Hazardous
Substances. Technical Monograph No. 6.
. 1984. Soil Sampling Quality Assurance User's Guide. EPA 600/4-84-043.
. 1984. Memorandum from Stanley Blacker about QAMS Checklist for DQO Review.
. 1984. Quality Assurance Management and Special Studies Staff. Calculation of Precision, Bias,
and Method Detection Limit for Chemical and Physical Measurements. March 30.
. 1984. Soil Sampling Quality Assurance User's Guide. EMSL-LV 600/4-84-043. May.
. 1985. Guidance on Remedial Investigations Under CERCLA. Hazardous Waste Engineering Research
Laboratory, Office of Emergency and Remedial Response, and Office of Waste Programs Enforcement.
. 1985. Sediment Sampling Quality Assurance User's Guide. Environmental Research Center.
Cooperative Agreement CR 810550.01.
. 1985. Construction Quality Assurance for Hazardous Waste Land Disposal Facilities, Public
Comment Draft. EPA/530-SW-85-021.
. 1985. Protection of Public Water Supplies from Groundwater Contamination. EPA/625/4-85/016.
. 1985. RCRA Groundwater Monitoring Technical Enforcement Guidance Document. Office of Waste
Programs Enforcement, Office of Solid Waste and Emergency Response.
. 1985. Draft Superfund Public Health Evaluation Manual. OSWER Directive 9285.4-1. December.
5-6
-------
-------
APPENDIX A
STATISTICAL CONSIDERATIONS
-------
APPENDIX A
STATISTICAL CONSIDERATIONS
Statistical techniques should be used to evaluate environmental data and to assist in designing
appropriate sampling plans based on the data. Statistical techniques should be applied during PA/SI,
RI/FS, RD, and RA activities.
Statistical considerations come into play in Stages 1, 2, and 3 of the DQO process. In Stage 1, the
existing data are compiled and evaluated, and statistical techniques can be used to evaluate the
comparability of different sets of existing data and to evaluate the need to obtain additional data. In
Stage 2, data quality and quantity needs can be stated in terms of confidence limits or within some other
statistical framework. After Stage 3, statistics can be used to evaluate newly acquired data and to
assess uncertainty in various decisions.
This appendix provides discussions of various statistical approaches which may be appropriate for
remedial action programs. The discussions are based upon hypothetical scenarios which have occurred or
might occur at hazardous waste sites and link available statistical methods to potential applications. The
scenarios presented are not the only situations in which statistics can be applied, but they provide an
indication of the information that can be obtained from statistical methods.
The scenarios will be discussed in an intuitive fashion keeping the use of equations and rigorous
statistical formalism to a minimum. Hopefully this approach will allow the reader without a strong
background in mathematics to follow the discussion and grasp the important role which statistical methods
can play during site investigation and remediation. Because of the decision to present this material in
a somewhat simplified form, readers with advanced knowledge of statistics may believe that the topics are
not treated in sufficient detail. These readers and others who wish to obtain additional information on
the methods presented are referred to the list of references provided at the end of this section.
A.I CLASSICAL STATISTICS VERSUS GEOSTATISTICS
When applying statistical procedures there are essentially two possible families of procedures which can
be applied. Classical statistical techniques, based on the concept of the random variable, have been
applied with success for well over 100 years. Geostatistical techniques, based on the concept of the
random function, were developed in the 1960's, but have been applied very successfully to data from such
diverse fields as mining engineering, petroleum engineering, hydrogeology, soil science, and, recently,
hazardous waste. The property of geostatistical procedures which makes them applicable in such a wide
variety of fields is that geostatistical techniques utilize the location of the data and the size of the
site in all calculations, whereas classical techniques ignore both of these important parameters.
Because classical techniques ignore data location, the decision of which set of procedures should be
applied to a data set is straightforward. If the locations of the data and the size of the site can be
ignored, then classical techniques can be accurately applied; otherwise, geostatistical techniques should
be applied.
In the following sections, applications of both classical and geostatistical procedures will be provided.
These sections will provide a clear distinction between these methods and will indicate when each is
appropriate.
A.2 ACCURACY AND PRECISION OF ANALYTICAL PROCEDURES
The type of statistical information which most readers are likely to encounter is precision and accuracy
data. These data accompany the results from each case of samples sent to the CLP and most non-CLP
laboratories. Interpreting accuracy and precision information can be key in understanding the
significance of the reported values and assessing the confidence associated with any RI decision.
A-l
-------
Because of the importance of this information, detailed definitions of both accuracy and precision are
provided. A short example is provided to illustrate the use of these parameters.
If analytical procedures were perfect, the reported analyte concentrations would always exactly equal the
actual concentrations present in the sample. In reality, analytical procedures are not perfect, so the
reported and actual concentrations are commonly not identical. The difference between the reported
concentration and the actual concentration of a sample is the analytical error. Without knowledge of the
potential magnitude of the analytical error it is impossible to judge the significance of a reported
concentration. An example where knowledge of the analytical error is crucial is the decision to shut
down a drinking water well when the reported concentration is below the action level. Although, in this
case, the reported value is below the action level, the actual concentration might exceed the action
level due to analytical error. Without knowledge of the likely magnitude of the analytical errors the
decision maker has insufficient information to make a decision. If the likely magnitude of errors were
known, the decision maker could examine the impacts and likelihood of an incorrect decision and could
reach an informed, correct decision. In this section, a procedure for examining analytical errors and
judging the significance of reported values will be discussed. These procedures are based solely on
classical statistics.
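As a hedged illustration of how such a judgment might be quantified, the Python sketch below assumes the
analytical error is roughly normally distributed with a known standard deviation and computes the chance
that the true concentration exceeds the action level even though the reported value falls below it. The
concentrations, the standard deviation, and the normality assumption are illustrative only and are not
prescribed by this guidance.

# Hypothetical check of how likely the actual concentration exceeds the
# action level when the reported value is below it, assuming normally
# distributed analytical error with a known standard deviation.

from statistics import NormalDist

reported = 4.2         # ppb reported for the drinking water well (assumed)
action_level = 5.0     # ppb action level (assumed)
analytical_sd = 0.8    # ppb standard deviation of analytical error (assumed)

# Probability that the actual concentration exceeds the action level.
prob_exceed = 1 - NormalDist(mu=reported, sigma=analytical_sd).cdf(action_level)
print(f"Chance the true value exceeds {action_level} ppb: {prob_exceed:.0%}")
# -> roughly 16 percent under these assumptions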
A.2.1 DEFINITION OF ANALYTICAL ERROR
There are many sources of error which can be introduced when obtaining a sample. Some of these sources
are improper sampling procedures, contaminated sample containers and use of improperly decontaminated
sampling equipment. These types of errors are separate from analytical error and are not considered
here. Analytical error is taken as the error due solely to the analytical procedure. This error is
measured by laboratory spikes and duplicate samples.
A.2.2 DEFINITION OF ACCURACY AND PRECISION
Analytical procedures can introduce errors due to a wide variety of causes, some of which are described
in Appendix B. It is impossible to deterministically predict the magnitude of each error, so accuracy
and precision have been introduced to summarize the errors of an analytical procedure. An example will
provide a means of introducing accuracy and precision. Suppose that a standard sample containing a known
amount of an analyte is submitted to four different laboratories (Lab A, Lab B, Lab C, and Lab D), each
using the same analytical procedure. Each laboratory analyzes ten replicates of the sample. The
following results are obtained.
Reported Concentration of Standard (ppm)

                              Laboratory
Replicate         A         B         C         D
    1            10        13        10         4
    2            10        12        14        14
    3            10        12         6        15
    4            12        11         8         1
    5             9        14        12         1
    6             8        12         7         6
    7            13        11        11         1
    8            11        12         5        14
    9            12        13        15        11
   10            10        12        10        15
Ten replicates were chosen only to illustrate the concepts of accuracy and precision. There is no
implicit or explicit recommendation that each sample be analyzed 10 times to determine accuracy and
precision.
A-2
-------
The actual concentration of the standard is 10 ppm. The majority of people examining these results would
conclude that laboratory A provides the "best" results. This conclusion is reached because laboratory A
either reports 10 ppm or a value very close to 10 ppm for each replicate.
Presented with results for replicate analyses, the average person could qualitatively rank a set of
laboratories; however, such a comparison is time consuming and ultimately not useful, since it is not
quantitative. To make sense of replicate data it must be summarized in a meaningful way. Accuracy and
precision provide a method of summarizing replicate data which allows different analytical procedures and
different laboratories to be compared. Accuracy and precision also allow a determination of the
significance of individual reported values. The accuracy and precision of common analytical methods are
presented in Appendix C.
A.2.3 ACCURACY
Intuitively it is desirable that, on average, the reported concentration equal the actual concentration
present in a sample. That is, ideally the analytical method should not have any systematic errors.
Accuracy measures the average or systematic error of a method. In the example of the four laboratories,
accuracy can be defined as the difference between the average of the 10 reported values and the actual
value (10 ppm). Performing this calculation, the following results are obtained:
Lab Average of 10 Replicates (ppm) Average Error (ppm)
A 10.5 0.5
B 12.2 2.2
C 9.8 -0.2
D 8.2 -1.8
These results show that, on average, laboratories A and C yield reported values which are very close to
the actual or spiked value. Thus, laboratories A and C are more accurate than laboratories B and D.
Accuracy values can be presented in a variety of ways. The average error shown above is one way of
presenting this information; however, more commonly accuracy is presented as percent bias or percent
recovery. Percent bias is a standardized average error; that is, the average error divided by the actual
or spiked concentration and converted to a percentage. For Lab A in the previous example, the percent
bias is .5/10 = .05 or 5 percent since the actual concentration is 10 ppm. Percent bias is unitless so
it allows the accuracy of analytical procedures to be compared easily.
Percent recovery provides the same information as percent bias. Since accuracy is often determined from
spiked samples, laboratories commonly report accuracy in this form. Percent recovery is defined as:
% Recovery = (R / S) x 100

where   S = spiked concentration
        R = reported concentration
Given this definition it can be shown that
% bias = % recovery - 100
A-3
-------
For this example, the observed % bias and % recovery are:
Lab Percent Recovery Percent Bias
A 105 5
B 122 22
C 98 -2
D 82 -18
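As an illustration of these definitions, the following Python sketch (not part of the guidance; the
replicate values are those of the four hypothetical laboratories above) computes the average error,
percent recovery, and percent bias for each laboratory:

# Illustrative sketch: summarizing replicate results from the four hypothetical
# laboratories as average error, percent recovery, and percent bias.

TRUE_VALUE = 10.0  # actual (spiked) concentration of the standard, ppm

replicates = {
    "A": [10, 10, 10, 12, 9, 8, 13, 11, 12, 10],
    "B": [13, 12, 12, 11, 14, 12, 11, 12, 13, 12],
    "C": [10, 14, 6, 8, 12, 7, 11, 5, 15, 10],
    "D": [4, 14, 15, 1, 1, 6, 1, 14, 11, 15],
}

for lab, values in replicates.items():
    mean = sum(values) / len(values)
    avg_error = mean - TRUE_VALUE              # accuracy as average (systematic) error
    pct_recovery = 100.0 * mean / TRUE_VALUE   # accuracy as percent recovery
    pct_bias = pct_recovery - 100.0            # accuracy as percent bias
    print(f"Lab {lab}: mean={mean:.1f} ppm, avg error={avg_error:+.1f} ppm, "
          f"recovery={pct_recovery:.0f}%, bias={pct_bias:+.0f}%")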
A.2.4 PRECISION
Whereas accuracy measures the average properties of an analytical method, precision examines the spread
of the reported values about their mean. The spread of reported values refers to how different the
individual reported values are from the average reported value. Precision can thus be seen as a measure
of the magnitude of the errors.
Precision can be measured in a variety of ways, each of which has its merits. A simple measure of
precision is the variance. The sample variances, calculated using the standard formula for the sample
variance, for the 10 replicate samples sent to the 4 previously discussed labs are as follows:
Lab Variance of the replicates
A 2.3
B 0.8
C 11.1
D 38.4
These results indicate that Lab B is the most precise. This could be determined by examining the ten
individual values reported by Lab B, which are all extremely similar. The Lab D reported values are very
dissimilar. This feature is expressed by a large variance of the replicates.
Laboratories commonly determine precision from duplicate samples; thus precision is usually expressed as
relative percent difference (%RPD) or relative standard deviation (%RSD). These quantities are defined
as follows.
% RPD = 100 x [2|X1 - X2| / (X1 + X2)]

where X1 and X2 are the reported concentrations for each duplicate sample

% RSD = (100 / √2) x [2|X1 - X2| / (X1 + X2)]
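A similar sketch, again using the hypothetical replicate data, computes the sample variance for each
laboratory and the %RPD and %RSD for an assumed pair of duplicate results:

# Illustrative sketch: measures of precision. The sample variance uses the
# standard (n-1) formula; %RPD and %RSD are computed for a duplicate pair
# following the definitions above.

from statistics import variance

replicates = {
    "A": [10, 10, 10, 12, 9, 8, 13, 11, 12, 10],
    "B": [13, 12, 12, 11, 14, 12, 11, 12, 13, 12],
    "C": [10, 14, 6, 8, 12, 7, 11, 5, 15, 10],
    "D": [4, 14, 15, 1, 1, 6, 1, 14, 11, 15],
}

for lab, values in replicates.items():
    print(f"Lab {lab}: sample variance = {variance(values):.1f}")

def rpd(x1, x2):
    """Relative percent difference for a pair of duplicate results."""
    return 100.0 * 2.0 * abs(x1 - x2) / (x1 + x2)

def rsd(x1, x2):
    """Relative standard deviation for a pair of duplicate results."""
    return rpd(x1, x2) / (2 ** 0.5)

x1, x2 = 12.0, 14.0          # hypothetical duplicate pair
print(f"%RPD = {rpd(x1, x2):.1f}, %RSD = {rsd(x1, x2):.1f}")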
A.2.5 SUMMARY OF ACCURACY AND PRECISION
Based on the definitions of accuracy and precision, the performance of each of the four laboratories can
be summarized in relative terms.
A-4
-------
Lab Accuracy Precision
A High High
B Low Very High
C High Low
D Low Low
From this summary, it appears that Lab A provides the most reliable values. Notice however, that
although Lab B has low accuracy, its precision is very high. Thus, if the reported values are corrected
for the systematic error introduced by the laboratory, Lab B is superior to Lab A. In other words, if
2.2 ppm (the average error calculated previously) is subtracted from each of the values reported by
Lab B, the corrected values will have both very high accuracy and precision. This example
demonstrates that if the bias of an analytical method is known, it can easily be removed; however, it
is not possible to correct for low precision.
A.2.6 USING ACCURACY AND PRECISION INFORMATION
The accuracy and precision of four laboratories have been determined for a specific analyte. This
information can now be used in the DQO process. For the purpose of this example, assume that a drinking
water sample is sent to a single laboratory. The sample will be analyzed for four suspected
contaminants. The historical accuracy and precision of the analytical procedures are known for these
four analytes. The action levels for the four contaminants are:
Contaminant Action Level
A 12
B 10
C 15
D 15
The lab reported the following concentrations for the sample:
Contaminant Reported Concentration
A 9.0
B 9.99
C 7.0
D 8.0
All analytes except contaminant B are reported at concentrations below the action levels. The reported
concentration for contaminant B is almost exactly at the action level. Based on these results, the well
water might be considered to be safe for drinking.
Accuracy and precision information, as found in Appendix F, can be used to determine the safety of the
drinking water by determining the probability that the actual concentration of each analyte present in
the sample exceeds the appropriate action level.
A-5
-------
The first step is to correct for the bias of each analytical procedure. To correct for bias, divide the
reported concentration by the average percent recovery (expressed as a fraction), which is determined from
spiked samples analyzed with the present sample or from historical information. Note that systematic
correction of reported values for bias is not recommended; however, it is performed in this example
because it is assumed that the bias is well known. The corrected values are presented below.
                 Reported              Percent         Corrected
Contaminant      Concentration         Recovery        Value
     A                9.0                 105             8.6
     B                9.99                122             8.2
     C                7.0                  98             7.2
     D                8.0                  82             9.8
The standard deviation, S, for the analytical procedures can be calculated from the percent relative
standard deviation (%RSD). The standard deviation (S) is calculated in the following table by
multiplying the reported value by the %RSD (expressed as a fraction).

                 Reported
Contaminant      Concentration         %RSD            S
     A                9.0               14.5           1.3
     B                9.99               7.5           0.75
     C                7.0               34.0           2.4
     D                8.0               75.6           6.0
A simple technique for presenting the uncertainty in analytic results is to present the probable range of
values which might be expected from the analytical procedure. In a quality control chart, the probable
range is usually ±3 standard deviations about the expected value, in our case the corrected value.
The upper limit of the probable range is Corrected Value + 3 x Standard Deviation.

                 Reported          Corrected                  Action      Probable
Contaminant      Concentration     Value            S         Level       Range
     A                9.0             8.6           1.3         12        (4.7, 12.5)
     B                9.99            8.2           0.75        10        (6.0, 10.5)
     C                7.0             7.2           2.4         15        (0, 14.4)
     D                8.0             9.8           6.0         15        (0, 27.8)
Correcting for bias in an analytical procedure should be done on a contaminant by contaminant
basis, taking into account the nature of the media and the matrix being analyzed.
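A short Python sketch of the bias-correction and probable-range calculations above follows; the
recovery and %RSD figures are the example values and are assumed to be well established:

# Illustrative sketch: correcting reported concentrations for bias using percent
# recovery, estimating the standard deviation from %RSD, and forming the +/-3
# standard deviation "probable range" shown above.

contaminants = {
    #            reported, % recovery, %RSD, action level
    "A": dict(reported=9.00, recovery=105.0, rsd=14.5, action=12.0),
    "B": dict(reported=9.99, recovery=122.0, rsd=7.5,  action=10.0),
    "C": dict(reported=7.00, recovery=98.0,  rsd=34.0, action=15.0),
    "D": dict(reported=8.00, recovery=82.0,  rsd=75.6, action=15.0),
}

for name, c in contaminants.items():
    corrected = c["reported"] / (c["recovery"] / 100.0)   # remove systematic bias
    s = c["reported"] * c["rsd"] / 100.0                  # standard deviation of the test
    low = max(0.0, corrected - 3.0 * s)                   # concentrations cannot be negative
    high = corrected + 3.0 * s
    print(f"{name}: corrected={corrected:.1f}, S={s:.2f}, "
          f"probable range=({low:.1f}, {high:.1f}), action level={c['action']}")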
A-6
-------
Note that only contaminant C has an upper limit which does not exceed its action level. The upper limits
for contaminants A and B just exceed their action levels (12.5 vs. 12 and 10.5 vs. 10). Contaminant D's
upper limit is well above its action level, even though its reported value is only 8.0.
If the distribution of reported values is assumed to be normal, the probability that the actual sample
concentration exceeds the corresponding action level can be calculated.
Tables of the normal distribution are available in all statistics books. These tables give the
probability of exceeding a series of standardized variables. To utilize these tables, the reported
analytical values must be standardized. The standardization is
Z = (X - Xc) / S

where   Z  = standardized value
        X  = action level
        Xc = corrected reported concentration
        S  = standard deviation of the analytic test

The values X, Xc, and S are known in this example and can be used to determine the probability that the
actual concentration exceeds the action level, Pr[actual concentration > action level].
                 Reported          Action                Probability of Exceeding
Contaminant      Concentration     Level         Z       Action Level
     A                9.0            12          2.6           .005
     B                9.99           10          2.4           .008
     C                7.0            15          3.3           .001
     D                8.0            15          0.87          .19
These probability values indicate that by utilizing accuracy and precision information, the significance
of the reported values can be assessed. Even though all the reported concentrations were below the
action levels, further analysis demonstrates that contaminant D has a 19 percent chance of being greater
than the action level of 15. Contaminant B, with a reported value at the action level, 9.99 vs. 10.0,
has in actuality less than a 1 percent chance of exceeding the action level of 10.
The precision of the analytical procedure for analyte D is poor as expressed by its high percent RSD of
75.6 percent. Precision can be improved by analyzing sample replicates or splits. If the lab analyzed
three splits, the percent RSD and standard deviation would have been reduced by a factor of 1/√3, to
about 58 percent of their original values. The new S would have been 3.5 (6 x 0.58), and the new Z value
1.48. Thus, the new probability that contaminant D would exceed the action level of 15 is only 7 percent,
which could be an acceptable risk depending on
the toxicity and health effects of contaminant D.
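The exceedance probabilities above can be reproduced with a brief sketch that assumes normally
distributed analytical errors; the final lines repeat the calculation for contaminant D assuming the
mean of three splits is reported:

# Illustrative sketch: standardizing against the action level and estimating the
# probability that the true concentration exceeds it, under a normal error model.

from statistics import NormalDist

def prob_exceed(action_level, corrected, s):
    """P(true concentration > action level) assuming normal analytical errors."""
    z = (action_level - corrected) / s
    return 1.0 - NormalDist().cdf(z)

cases = {
    "A": (12.0, 8.6, 1.3),
    "B": (10.0, 8.2, 0.75),
    "C": (15.0, 7.2, 2.4),
    "D": (15.0, 9.8, 6.0),
}
for name, (action, corrected, s) in cases.items():
    print(f"{name}: P(exceed {action}) = {prob_exceed(action, corrected, s):.3f}")

# Averaging n splits reduces the standard deviation by a factor of 1/sqrt(n).
n = 3
s_split = 6.0 / n ** 0.5
print(f"D with {n} splits: P(exceed 15) = {prob_exceed(15.0, 9.8, s_split):.2f}")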
This simple example demonstrates the importance of accuracy and precision information and indicates the
possible consequences of ignoring these data. Because of the importance of accuracy and precision
information, Appendix F, which gives accuracy and precision data for many common analytical techniques,
has been compiled. Decision makers are urged to examine this appendix and to utilize the information
prior to reaching a decision.
A-7
-------
A.3 PROBABILITY OF LOCATING A CONTAMINATED ZONE
At sites or portions of sites where soil contamination is suspected but no definite sources have been
identified, an objective of the remedial investigation might be to determine if soil contamination is
present. Important decisions facing the site manager are how many samples must be taken to investigate
the potentially contaminated area and where the samples will be located.
In certain situations geophysical surveys can be utilized in determining the location of contaminated
zones. Geophysics can effectively be used to determine the locations of certain ground water plumes
(such as hydrocarbon plumes) and concentrations of buried metallic objects (drums and tanks). The
following discussion concerning the probability of locating a contaminated zone is applicable to
geophysical methods as well as to standard sampling techniques.
The decision maker must determine, in Stages 1 and 2 of the DQO process, the acceptable probability of
not finding an existing contaminated zone in the suspected area. For instance, it might be determined
that a 20 percent chance of missing a 100-ft-by-100-ft contaminated zone is acceptable but only a 5
percent chance of missing a 200-ft-by-200-ft zone is acceptable. This probability value provides the
basis for using statistics to determine how many samples are required. Statistical methods can be used
to determine the number and location of data required to lower the probability of missing an existing
contaminated zone to a value less than the acceptable predefined value. The acceptable probability of
missing a contaminated zone must be established by the decision maker working in concert with the data
users. Individuals involved in developing risk assessments may provide meaningful inputs into
determination of the appropriate probability values to be utilized.
The statistical method applied in this instance involves geometric probabilities. That is, the
probability of not identifying a contaminated zone is related to the area or volume of the contaminated
zone and the spatial location of the samples. Because this method is not clearly either a classical
statistical or a geostatistical procedure, it will be considered a hybrid statistical method.
To apply this method, the following assumptions are required:
The shape and size of the contaminated zone must be known at least approximately. This known
shape will be termed the target.
Any sample located within the contaminated zone will identify the contamination.
These assumptions are not severe and should be met in practice.
If, in addition to the above assumptions, data are located on a perfectly regular grid and the target is
circular, the probability of hitting the target for a given grid size is given by the following (Gilbert
1982):
Probability of a Hit G/A
0.8 1.13
0.9 1.01
0.95 0.94
0.99 0.86
where A is the diameter of the target and G is the linear grid spacing.
-------
If data are not regularly located or the target is not circular, a simulation procedure is used.
The procedure used is hit or miss simulation involving the following steps:
Simulate a contaminated zone or target.
Randomly locate the target within the site.
Determine if any sample locations fall within the boundaries of the target. If so, score a
hit; otherwise, score a miss.
Simulate and randomly locate several hundred targets using a computer program and record the
number of hits and misses.
The probability of locating the contaminated zone is equal to the total number of hits divided by the
total number of simulations.
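A minimal Python sketch of the hit-or-miss simulation is given below. The site dimensions, grid
spacing, target diameter, and number of simulations are assumed values chosen only for illustration:

# Illustrative sketch of the hit-or-miss simulation: a circular target of known
# diameter is placed at random within a rectangular site many times, and the
# fraction of placements containing at least one sample location estimates the
# probability of hitting the target.

import random

SITE_X, SITE_Y = 1000.0, 1000.0   # assumed site dimensions, ft
GRID = 100.0                      # assumed linear grid spacing, ft
DIAMETER = 120.0                  # assumed target diameter, ft
N_SIM = 10_000                    # number of simulated target placements

# Sample locations on a regular square grid.
samples = [(x, y)
           for x in range(int(GRID / 2), int(SITE_X), int(GRID))
           for y in range(int(GRID / 2), int(SITE_Y), int(GRID))]

hits = 0
radius = DIAMETER / 2.0
for _ in range(N_SIM):
    # Randomly locate the target center within the site.
    cx, cy = random.uniform(0, SITE_X), random.uniform(0, SITE_Y)
    # Score a hit if any sample falls inside the target; otherwise a miss.
    if any((sx - cx) ** 2 + (sy - cy) ** 2 <= radius ** 2 for sx, sy in samples):
        hits += 1

print(f"Estimated probability of hitting the target: {hits / N_SIM:.2f}")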
Figure A-1 illustrates the hit or miss approach for two simulated contaminated zones. The method is
flexible, so different sample configurations and different target sizes can be examined quickly. By
varying the number of samples for a fixed target, the number of samples required to lower
the risk of missing the contamination to an acceptable level can be determined. Thus, this method allows
determination of both the number and location of samples necessary to satisfy DQOs.
A.4 CONFIDENCE LIMITS ON ESTIMATES OF MEAN CONTAMINATION
At sites where contamination is known to exist, a parameter of interest is the mean contaminant
concentration over the contaminated area. Mean contaminant concentrations are important when evaluating
contaminants contained within a confined area such as a lagoon. In this case, the mean contaminant
concentration determines the total amount of contaminants contained in the lagoon. To assess various
remedial alternatives it is important to know the maximum quantity of contaminants present in the lagoon.
Confidence limits can be used to state the probable range of total contaminants contained in the lagoon.
Confidence limits can, theoretically, be placed on any quantity calculated from a data set. Perhaps the
most useful quantity is the sample mean. When the sample mean is calculated from a set of data, it is
unlikely that the actual or population mean will equal the sample mean. The sample mean for a fixed
number of data is a random variable whose value will fluctuate depending on the specific data collected.
Confidence intervals are a method of quantifying the likely range of fluctuation of the sample mean.
Confidence intervals are defined as follows: if the 95 percent confidence interval is set for the sample
mean after each repetition of an experiment and the experiment is performed 100 times, the population
mean is expected to fall within the confidence limits 95 times.
For example, 20 soil samples are collected at a site with known soil contamination. The sample mean is
calculated from these samples and is determined to be 14 mg/kg of an analyte of interest. Furthermore,
it is determined that the 95 percent confidence limits for this sample mean are 12 and 17 mg/kg. In this
example, there is a 95 percent chance that the actual mean soil concentration falls between 12 and 17
mg/kg.
To determine a confidence interval the distribution of the sample mean must be known. To determine the
distribution at least three quantities are required. These quantities are the estimated sample mean, the
variance of the sample mean and the shape of its distribution. Both classical and geostatistical
approaches can be used to determine these quantities. Each of these methods will be discussed
individually; however, before proceeding it must be noted that neither of these methods can be applied
without site-specific information.
A-9
-------
FIGURE A-1. HIT & MISS EXAMPLE

NOTE: This figure illustrates two possible simulations of a circular target for a fixed set of data
locations. The upper figure (a) illustrates a hit while the lower figure (b) illustrates a miss.

A-10
-------
A.4.1 THE CLASSICAL STATISTICAL APPROACH
Classical statistical approaches assume that the distribution of the sample mean follows Student's t or,
if more than 30 data are available, a normal distribution. This assumption is considered valid because
of the power of the central limit theorem which states that regardless of the distribution of the data,
the sample mean follows a normal distribution when sufficient data are obtained. The drawback of the
classical statistics approach is that the variance of the sample mean is taken as the variance of the
data divided by the number of data available, regardless of the location of the data or the size of
the site.
Thus, when using classical statistics to determine a confidence interval for a sample mean based on 10
data, it does not matter whether the data are spread uniformly over the site or clustered in one corner;
nor does it matter if the site covers 1/4 acre or 20 square miles. The confidence intervals in each of
these cases will be identical.
Since all scientists and engineers working on hazardous waste sites realize that both data location and
the size of the site are crucial factors in analyzing the significance of data, it is not logical to
apply a procedure which does not account for these important factors. However, if in some specialized
instance it is deemed that sample locations are not important, the classical statistical procedures based
on the t statistic yield simple formulas for determining confidence intervals and data requirements.
These formulas are provided in many references including EPA 1984 and EPA 1985.
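For such a specialized instance, a minimal sketch of the t-based confidence interval is shown below.
The data are assumed values for illustration, and the scipy library is used for the t quantile:

# Illustrative sketch: classical confidence interval for the sample mean based
# on the t statistic. Note that neither the sample locations nor the size of
# the site enter the calculation.

from statistics import mean, stdev
from scipy.stats import t

data = [14, 9, 21, 12, 17, 11, 15, 18, 10, 13]   # hypothetical soil results, mg/kg
n = len(data)
xbar = mean(data)
se = stdev(data) / n ** 0.5                      # standard error of the mean
t_crit = t.ppf(0.975, df=n - 1)                  # two-sided 95 percent confidence

lower, upper = xbar - t_crit * se, xbar + t_crit * se
print(f"mean = {xbar:.1f} mg/kg, 95% CI = ({lower:.1f}, {upper:.1f}) mg/kg")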
A.4.2 GEOSTATISTICAL APPROACH
Geostatistics, or more formally, the theory of regionalized variables, is similar to classical statistics
in many ways. Most importantly, it differs with respect to basic assumptions regarding independence of
the data. Classical statistics assumes that data are mutually independent, that is, that one data point
is not related to another. Geostatistics recognizes that observed concentrations are governed by
physical processes; thus, a measurement at one particular point in space yields information concerning the
expected contaminant level at a location 5 or 10 ft away from the sampled point. In statistical terms, these data
are correlated in space. Geostatistical tools measure and exploit the correlation between data to
estimate contaminant concentrations and determine the uncertainty associated with the estimate. In other
words, geostatistical methods consider the location of data and the size of the site in any calculation.
Geostatistics can be used to determine the variance of errors associated with any weighted estimate of
the sample mean. In particular, geostatistics can be used to determine the variance of errors associated
with estimating the true mean contaminant concentration by the average of the available data. The
detailed derivation of the method for determining confidence intervals is given by Journel and Huijbregts
(1978). A brief discussion of the method is provided here.
An estimate of the true mean site contamination can be determined from an average of the available data.
The estimate is not, in general, equal to the true mean so an error is made. The error of estimation is
defined as the estimated mean less the true mean. The particular error observed is one realization of
the error random variable. The variance of the error variable is unknown, but it is known that the mean
of the error distribution is zero since only unbiased estimators will be used. The variance of the error
distribution can be determined using geostatistics.
The variance estimate requires knowledge of the average correlation between the data and the average
correlation between the data and the volume defining the site. Determination of these quantities
requires a model of correlation at the site. This correlation model is provided by the experimental
variogram determined from the data. The experimental variogram is defined as follows:
A-11
-------
g(h) = [1 / (2 n(h))] x SUM over i of [z(x_i + h) - z(x_i)]^2

Where:  n(h) is the number of data pairs separated by distance h
        z(x_i) is the contamination observed at location x_i
        z(x_i + h) is the contamination observed at location x_i + h
        g(h) is the experimental variogram for distance h
By varying h, a model of the variogram versus h can be developed and applied to determine the variance of
errors. An example of a variogram is provided in Section 5.5.3.2 of the DQO example document.
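A simple sketch of the experimental variogram calculation for data along a one-dimensional transect
follows. The locations and concentrations are assumed values; practical applications group pairs into
distance (and direction) classes rather than requiring exact spacings:

# Illustrative sketch: experimental variogram g(h) for data on a transect,
# following the definition above.

from collections import defaultdict

# (location in ft, concentration in ppm) along a transect -- hypothetical data
obs = [(0, 12.0), (50, 14.0), (100, 18.0), (150, 17.0),
       (200, 25.0), (250, 24.0), (300, 30.0)]

sums = defaultdict(float)    # sum of squared differences for each separation h
counts = defaultdict(int)    # n(h): number of data pairs separated by h

for i in range(len(obs)):
    for j in range(i + 1, len(obs)):
        h = abs(obs[j][0] - obs[i][0])
        sums[h] += (obs[j][1] - obs[i][1]) ** 2
        counts[h] += 1

for h in sorted(sums):
    g = sums[h] / (2.0 * counts[h])   # experimental variogram value
    print(f"h = {h:4.0f} ft   n(h) = {counts[h]}   g(h) = {g:.1f}")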
To this point, the mean and variance of the distribution of errors have been discussed. The remaining
parameter of interest is the shape of the distribution of errors. As the number of data used to estimate
the true mean increases, the distribution of errors becomes more and more like a normal distribution.
This is not a theoretical result but an observation from practical applications. Given that the errors
follow an approximately normal distribution, the confidence limits can be determined by the following
procedure.
Define the level of confidence required.
Find the standard normal variate corresponding to this probability in a normal table.
Apply the following formula:
Confidence limits = Z ± ys

where   Z = the estimated mean
        y = the standard normal variate
        s = the square root of the error variance
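A minimal sketch of this procedure, using assumed values for the estimated mean and the error
standard deviation, is given below:

# Illustrative sketch: confidence limits from a geostatistical error variance,
# assuming approximately normal estimation errors.

from statistics import NormalDist

estimated_mean = 14.0        # estimated mean contamination, mg/kg (assumed)
error_std = 1.5              # square root of the error variance (assumed)
confidence = 0.95            # required level of confidence

# Standard normal variate corresponding to the two-sided confidence level.
y = NormalDist().inv_cdf(0.5 + confidence / 2.0)

lower, upper = estimated_mean - y * error_std, estimated_mean + y * error_std
print(f"{confidence:.0%} confidence limits: ({lower:.1f}, {upper:.1f}) mg/kg")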
-------
A.5 LOCAL ESTIMATION OF CONTAMINATION
In many instances, the contamination at a particular point within the site is of interest. Determination
of contaminant concentrations at unsampled locations is termed local estimation. For example, consider a
site with a known source of contamination. Available information indicates that contaminants are
migrating toward the western edge of the site. An objective of the RI might be to determine the western
extent of contaminant migration. Geostatistics can be used to determine the likely extent of
contamination. This information will greatly aid in choosing data locations.
A second example where local estimation is important is in determining optimal contours for a variable.
For instance, in many enforcement cases an accurate determination of the ground water gradient is
required to correctly identify potentially responsible parties. Water levels are measured in wells which
are separated by varying distances. The heads at points between wells must be estimated. To ensure that the
estimated heads and associated contour lines are as accurate as possible, the heads at unsampled points
should be estimated optimally using geostatistics.
Geostatistics can be used to address problems presented in the previous scenarios. The geostatistical
technique which will be applied is known as kriging. Kriging, which is similar to multiple regression,
determines an optimal estimate of a variable at any particular location in space. Associated with this
estimate is a measure of uncertainty known as the kriging variance. To apply kriging, a model of the
correlation between data is required. This model is obtained by modeling the experimental variogram of
the data.
An example of the use of kriging to optimally estimate the concentration of lead in soil surrounding a
smelter is shown in Figure A-2.
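The following sketch illustrates ordinary kriging at a single unsampled point. The data, the target
location, and the spherical variogram model (which would in practice be fitted to the experimental
variogram) are all assumed for illustration; this is not the specific algorithm used to produce
Figure A-2:

# Illustrative sketch: ordinary kriging at one unsampled point, using numpy to
# solve the kriging system written in terms of an assumed variogram model.

import numpy as np

def spherical(h, sill=100.0, rng=300.0, nugget=5.0):
    """Assumed spherical variogram model gamma(h)."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng,
                 nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                 sill)
    return np.where(h == 0.0, 0.0, g)

# Hypothetical sample locations (ft) and lead concentrations (ppm).
xy = np.array([[100., 100.], [400., 150.], [250., 400.], [600., 500.], [150., 600.]])
z = np.array([850., 400., 1200., 300., 950.])
target = np.array([300., 300.])   # unsampled point to be estimated

n = len(z)
# Ordinary kriging system: sum_j w_j gamma(i,j) + mu = gamma(i,0), sum_j w_j = 1
dists = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = spherical(dists)
A[:n, n] = 1.0
A[n, :n] = 1.0
b = np.append(spherical(np.linalg.norm(xy - target, axis=1)), 1.0)

sol = np.linalg.solve(A, b)
weights, mu = sol[:n], sol[n]
estimate = weights @ z
krig_var = b[:n] @ weights + mu    # ordinary kriging variance

print(f"kriged estimate at {target}: {estimate:.0f} ppm")
print(f"kriging variance: {krig_var:.0f} (ppm^2)")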
A.6 LOCAL ESTIMATION OF PROBABILITY
At soil contamination sites where a fixed cleanup criterion has been set, geostatistics can be used to
estimate the risk associated with not removing any particular quantity of soil. Geostatistics can be
used to quantify the probability of exceeding this criterion and to develop probability contour maps.
These maps may be used in conjunction with the acceptable uncertainty determined during Stages 1 and 2 of
the DQO process to define what volume of soil must be removed.
To determine the probability of exceeding a given value at an unsampled point it is necessary to estimate
the entire contaminant distribution at that point. Given this distribution, the probability that the
contaminant concentration exceeds any value of interest can be determined.
An example of a probability map is provided in Figure A-3. In this example, lead contamination has been
found in soil surrounding a lead smelter. It has been determined that all soil in excess of 1000 ppm
will be removed as part of the remedial action. The probability map gives the likelihood of exceeding
1000 ppm at each point in the site. If, through the DQO process, 30 percent had been determined as an
acceptable probability of exceeding 1000 ppm, then all soil within the 30 percent contour would be
removed. The remaining soil would have, at most, a 30 percent chance of exceeding 1000 ppm. If a
different acceptable probability were defined, the volume of soil removed would be defined by the
particular contour. This approach provides an objective method for determining the volume of soil to be
removed.
Techniques for estimating local probability distributions include indicator kriging, probability kriging,
and multivariate Gaussian kriging (Journel 1983; Sullivan 1984; Verly 1983; Isaaks 1984). These
techniques are known as non-linear estimators and are related to but are more complex than kriging.
These estimators require an accurate and detailed model of the correlation structure of the data to be
effective.
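The indicator idea underlying these estimators can be illustrated with a deliberately simplified
sketch: the data are coded as exceedance indicators, and the probability of exceeding the criterion at
an unsampled point is estimated as a weighted average of the indicators. Inverse-distance weights are
used here only for simplicity; indicator kriging would instead derive the weights from an indicator
variogram. All values are hypothetical:

# Illustrative sketch of the indicator approach: code each datum as 1 if it
# exceeds the cleanup criterion and 0 otherwise, then estimate the exceedance
# probability at a point as a weighted average of the indicators.

import math

THRESHOLD = 1000.0   # soil cleanup criterion, ppm

# (x, y, lead concentration in ppm) -- hypothetical soil data
data = [(100, 100, 850), (400, 150, 400), (250, 400, 1200),
        (600, 500, 300), (150, 600, 950), (300, 250, 1600)]

def prob_exceed(px, py, power=2.0):
    """Weighted average of exceedance indicators at point (px, py)."""
    num = den = 0.0
    for x, y, z in data:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return 1.0 if z > THRESHOLD else 0.0
        w = 1.0 / d ** power
        num += w * (1.0 if z > THRESHOLD else 0.0)
        den += w
    return num / den

print(f"P(lead > {THRESHOLD:.0f} ppm) at (300, 300): {prob_exceed(300, 300):.2f}")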
-------
FIGURE A-2. EXAMPLE OF KRIGING

NOTE: Contour map of lead concentration in soil surrounding a smelter. Contours are based on estimates
of soil lead concentration (in ppm) determined by kriging.
A-14
-------
FIGURE A-3. PROBABILITY MAP

NOTE: Probabilities of exceeding 1000 ppm soil lead concentration near a smelter. Material with
probabilities exceeding the acceptable risk defined in the DQO process will be removed as part of the
remedial action.
A-15
-------
An important feature of non-linear estimators is that any uncertainty in the data values stemming from
laboratory or sampling errors can easily be incorporated into the estimate. Since non-linear estimators
can be used to estimate the mean or variance at a point or over a region, these techniques provide a
means of including uncertainty in any regional or local estimate of the mean. The uncertainty associated
with these estimates will include the uncertainty present in the data.
A.7 REFERENCES
Addiscott, T.M., and J.R. Wagenet. 1985. A Simple Method for Combining Soil Properties that Show
Variability. Soil Science Society of America Journal 49: 1365-1369.
Box, G.E.P., W. Hunter, and J.S. Hunter. 1978. Statistics for Experimenters: An Introduction to
Design, Data Analysis, and Model Building. John Wiley and Sons, New York.
Camp Dresser & McKee Inc. (CDM). 1986. Statistics for Contaminated Zones at the North Cavalcade
Site. CDM Internal Correspondence, J. Sullivan.
EPA. 1984. A Soil Sampling Quality Assurance User's Guide. EPA 600/4-84-043.
EPA. 1985. Sediment Sampling Quality Assurance User's Guide. EPA 600/4-85-048.
Flatman, G.T. 1985. Design of Soil Sampling Program: Statistical Considerations. Draft.
Flatman, G.T. and A.A. Yfantis. 1984. Geostatistical Strategy for Soil Sampling: The Survey and the
Census. Environmental Monitoring and Assessment 4: 335-349.
Gilbert, R.O. 1982. Some Statistical Aspects of Finding Hot Spots and Buried Radioactivity. TRAN-STAT:
Statistics for Environmental Studies, Battelle Institute, Richland, Washington, No. 19.
Isaaks, E. 1984. Risk Qualified Mappings for Hazardous Waste Sites: A Case Study in Distribution
Free Geostatistics. Unpublished master's thesis, Stanford University.
Journel, A.G. 1983. Non Parametric Estimation of Spatial Distributions. Journal of Mathematical
Geology, Vol. 15, No. 3, pp. 445-468.
Journel, A.G. and Ch.J. Huijbregts. 1978. Mining Geostatistics. Academic Press, London.
Klusman, R.W. 1985. Sample Design and Analysis for Regional Geochemical Studies. Journal of
Environmental Quality 14: 369-375.
Ripley, B. 1982. Spatial Statistics. John Wiley & Sons, New York.
Russo, D. 1984. Design of an Optimal Sampling Network for Estimating the Variogram. Soil Science
Society of America Journal 48(4): 708-716.
Sullivan, J. 1984. Conditional Recovery Estimation through Probability Kriging - Theory and
Practice. In: Geostatistics for Natural Resource Characterization. Reidel, Dordrecht, Holland.
Verly, G. 1983. The Multigaussian Approach and its Application to the Estimation of Local Reserves.
Journal of Mathematical Geology, Vol. 15, No. 2, pp. 263-290.
Yost, R.S., G. Uehara and R.L. Fox. 1982. Geostatistical Analysis of Soil Chemical Properties of
Large Land Areas. I. Semi-variograms. Soil Science Society of America Journal 46(5): 1028-1032.
A-16
-------
Yost, R.S., G. Uehara and R.L. Fox. 1982. Geostatistical Analysis of Soil Chemical Properties of
Large Land Areas. II. Kriging. Soil Science Society of America Journal 46(5): 1033-1037.
Zirschky, J., Deary, G.P., Gilbert, R.O., and Middlebrooks, E.J. 1985. Spatial Estimation of Hazardous
Waste Site Data. Journal of Environmental Engineering, Vol. 111, No. 6, pp. 777-787.
A-17
-------
APPENDIX B
ANALYTICAL CONSIDERATIONS
-------
-------
APPENDIX B
ANALYTICAL CONSIDERATIONS
Analytical methods must be evaluated during the development of site specific data quality objectives.
The parameters for which the analytical method is valid, its limitations, and any special considerations
(such as sample preparation) which will affect data quality must be understood in order to select
appropriate analytical methods for specific uses.
This section provides an overview of the analytical considerations which should be taken into account
during DQO development. Analytical considerations must be evaluated concurrently with statistical and
sampling considerations in order to ensure that established DQOs can be attained.
B.I ANALYTICAL SUPPORT LEVELS
The analytical options available to support data collection activities are presented in five general
levels. These levels are distinguished by the types of technology and documentation used, and their
degree of sophistication as follows:
LEVEL V - Non-standard methods. Analyses which may require method modification and/or
development. CLP Special Analytical Services (SAS) are considered Level V.
LEVEL IV - CLP Routine Analytical Services (RAS). This level is characterized by rigorous
QA/QC protocols and documentation and provides qualitative and quantitative analytical data.
Some regions have obtained similar support via their own regional laboratories, university
laboratories, or other commercial laboratories.
LEVEL III - Laboratory analysis using methods other than the CLP RAS. This level is used
primarily in support of engineering studies using standard EPA approved procedures. Some
procedures may be equivalent to CLP RAS, without the CLP requirements for documentation.
LEVEL II - Field analysis. This level is characterized by the use of portable analytical
instruments which can be used on-site, or in mobile laboratories stationed near a site
(close-support labs). Depending upon the types of contaminants, sample matrix, and personnel
skills, qualitative and quantitative data can be obtained.
LEVEL I - Field screening. This level is characterized by the use of portable instruments
which can provide real-time data to assist in the optimization of sampling point locations
and for health and safety support. Data can be generated regarding the presence or absence
of certain contaminants (especially volatiles) at sampling locations.
Table B-l provides a summary of the analytical levels, their applicability, and limitations. Within each
level, different procedures may be used to produce data of somewhat different quality. For example,
Level II encompasses both mobile laboratory procedures and less sophisticated "tailgate" operations which
may produce data of different quality.
B.1.1 LEVEL V ANALYTICAL SUPPORT - NON-STANDARD METHODS
The objective of non-standard analytical support is to provide the RI/FS process with data that cannot be
obtained through standard avenues of analytical support. Analytical support of this type may involve the
research, development and documentation of a method, or more typically, the modification of an existing
method. EMSL-LV should be consulted for protocol availability, modification, or development. Level V
methods are available through CLP Special Analytical Services (SAS), university laboratories, commercial
laboratories, the National Enforcement Investigations Center, and the Environmental Services Division. Not all
SAS analyses are non-standard; they may just not be part of CLP protocols.
B-1
-------
TABLE B-1  SUMMARY OF ANALYTICAL LEVELS FOR RI/FS

Level V
  Type of Analysis: - Non-conventional parameters
                    - Method-specific detection limits
                    - Modification of existing methods
                    - Appendix VIII parameters
                    - TIC
  Uses:             - Confirmational
                    - Toxicology
                    - Site-specific conditions/parameters
                    - RCRA compliance
  Limitations:      - Requires method development/modification
                    - Mechanism to obtain services requires special leadtime
                    - Calibration standards may not be readily available

Level IV
  Type of Analysis: - HSL organics/inorganics by GC/MS; AA; ICP
  Uses:             - Confirmational
                    - Toxicology
                    - All other program activities
  Limitations:      - Tentative identification of non-HSL parameters
                    - Some time is required for validation of packages

Level III
  Type of Analysis: - Organics/inorganics using EPA procedures other than RAS; can be analyte-specific
                    - RCRA characteristic tests
  Uses:             - Confirmational but with less documentation
                    - Presence or absence of contaminants
                    - Engineering uses
                    - Screening
  Limitations:      - Methods may vary

Level II
  Type of Analysis: - Variety of organics by GC; inorganics by AA; XRF
                    - Tentative ID; analyte-specific
                    - Detection limits vary from low ppm to low ppb
                    - Portable/mobile instrumentation
  Uses:             - Presence or absence of contaminants
                    - Relative concentrations
                    - Engineering
                    - Screening
  Limitations:      - Tentative ID
                    - Techniques/instruments limited

Level I
  Type of Analysis: - Total organic vapor detection using portable instruments
                    - pH, conductivity, salinity, DO
  Uses:             - Assist in identifying sample locations
                    - Field screening
                    - Health and safety
  Limitations:      - Instruments respond to naturally-occurring compounds
-------
TABLE B-1  SUMMARY OF ANALYTICAL LEVELS FOR RI/FS
(continued)

Level V
  Data Quality: - Method-specific
  Cost:         - Initially high, if method development is required
  Time:         - Entries refer to all types of analysis listed. No specific time/cost requirements
                  can be specified. In general the time frame can range from a few weeks to
                  significantly longer if method development is needed.

Level IV
  Data Quality: - Rigorous QA/QC
  Cost:         - $1,000/sample for organics
                - $200/sample for metals
  Time:         - Contractually, 30-40 days
                - Shorter turnaround time possible through SAS request

Level III
  Data Quality: - Standard methods
                - Similar detection limits to CLP
                - Less rigorous QA/QC
  Cost:         - $960/sample for organics
                - $200/sample for metals
  Time:         - 14 days, but can vary based on contract requirements

Level II
  Data Quality: - Dependent on QA/QC steps employed
                - Data typically reported in concentration ranges
  Cost:         - $15-40/sample
  Time:         - Real-time to several hours

Level I
  Data Quality: - If instruments calibrated and data interpreted correctly, can provide indication
                  of contamination
  Cost:         - Negligible, if capital costs excluded
  Time:         - Real-time
-------
The analysis of samples for the RCRA modified Appendix VIII list of contaminants could currently be
considered a Level V application. The modified Appendix VIII list contains 92 organic compounds that are
not a part of the Hazardous Substances List (HSL) and therefore are not normally tested for on samples
obtained from CERCLA sites. Appendix D of this document contains tables from a preliminary feasibility
study performed to address the applicability of using or modifying existing analytical procedures for
Appendix VIII analysis.
Level V poses limitations to implementation because the amount of lead-time for start-up may be
significant, and the analyses may be "one-of-a-kind" applications of the method, resulting in a lack of
comparability of the data. The unit costs for Level V sample analysis are dependent on the analysis
requested. Generally, initial unit costs will be high, reflecting the costs of becoming familiar with
the method. If the method is used for other projects or sites, unit costs may decrease with the demand,
and the method may become standard. The amount of documentation available for Level V analytical support
will vary depending on the sophistication of the technology used. If method development is required,
this information should be requested and reviewed by the user.
Accuracy and precision information is generally not available for Level V. Thus, when Level V methods
are used, the number of duplicate and spiked samples must be increased to allow a determination of the
accuracy and precision of the method.
B.1.2 LEVEL IV ANALYTICAL SUPPORT - CONTRACT LABORATORY PROGRAM (CLP)
ROUTINE ANALYTICAL SERVICES (RAS)
The CLP RAS provides for analyses of all types of media for Hazardous Substance List (HSL) organic
compounds and priority pollutant inorganic compounds. (CLP RAS does not provide biota and air media
[adsorption tube] analyses.) These services are available through CLP RAS and regional EPA ESD
laboratories. Level IV analyses are currently used for most RI/FS activities. However, the use of Level
IV data may not be required for many RI/FS purposes. Level IV analyses are typically used for
confirmation of lower level data, risk assessment, and to obtain highly documented data.
CLP RAS generated data have the following properties:
Confirmed identification and quantitation of compounds (for HSL parameters only unless
otherwise specified) to the detection limits specified in the IFB.
Tentative identification of a contractually-specified number (30) of non-HSL parameters.
Sufficient documentation to allow qualified personnel to review and evaluate data quality.
Uniform methods of analysis activities.
Detection limits may not be sufficient for toxicological evaluations
CLP support is one of the most expensive routine analytical services available to the
Superfund program (e.g., RAS for organics is about $1,000/sample; RAS for inorganics is
about $200/sample).
RAS is contractually operating on a 30-40 day turnaround although delays can occur. Since
demands fluctuate, space may be limited at times for the Superfund program. In addition,
data validation usually takes 3-4 weeks after data is received.
The CLP RAS is very specific concerning the documentation that is supplied with every data package. The
RAS deliverables package contains information on initial and continuing calibration, GC/MS tuning,
B-4
-------
surrogate percent recovery, and matrix spike duplicates. In addition, hard copies are provided of
reconstructed ion chromatograms, GC chromatograms, and spectra for every sample and every blank,
standard, or spike run with a particular set of samples. Documentation is also provided for blank
analyses, internal chain of custody and holding times.
The bias and precision of CLP analytical procedures can be assessed by examining the performance of the
laboratory in analyzing matrix spikes. However, an indication of the performance of the laboratory is
also provided by the results of quarterly laboratory performance evaluation samples. These evaluation
samples are submitted blind so the laboratory has no indication of the actual contaminant value. In
contrast, the laboratory knows the exact concentration of a matrix spike.
Historical CLP precision and accuracy data classified by media are presented in Appendix F as Level IV.
Each table is footnoted to show the source of the precision and accuracy data and, to the extent
possible, the type of QC samples used, the numbers of data points, etc. Contract required detection
limits are presented in Appendix H.
B.1.3 LEVEL III ANALYTICAL SUPPORT - LABORATORY ANALYSIS
Level III analytical support is designed to provide laboratory analysis using standard EPA approved
procedures other than current CLP RAS. This level is used to obtain similar analysis with less
documentation.
Generally the analyses performed using Level III techniques are designed to provide confirmed
identification and quantification of organic and inorganic compounds in water, sediment, and soil
samples. These analyses are available through commercial laboratories, ESD, CLP SAS, and the CLP
screening service (in development).
Level III provides data for site characterizations, environmental monitoring, confirmation of field data
and to support engineering studies (e.g., design, modeling, and pilot/bench studies). In specific cases,
Level III analyses can also provide data for risk assessment requirements.
Level III laboratory analysis provides the following:
Data to support engineering design parameters
Data for use in evaluating the site for further action, e.g., to determine extent of
environmental contamination
Data for use in risk assessments
Rapid turnaround of data may be available
Detection limits for presence or absence of compounds comparable to Level IV
Costs range from about $200/sample for inorganics to $960/sample for organics analysis.
Turnaround time for Level III laboratory analysis for organics is expected to be about 14-21
days.
Level III protocols all have built-in QA/QC, including calibration runs, surrogate standards, etc.
External QA, which is also used for the CLP, is employed in the form of trip blanks, replicate and
duplicate samples, and blind spikes submitted with the samples.
The type of laboratory support available under Level III ranges in sophistication from GC/MS
instrumentation to the measurement of water quality parameters. The type and amount of documentation
B-5
-------
available depends on the type of analysis requested. Data users should review a sample report issued by
the laboratory for the analysis requested to determine if the degree of documentation supplied is
adequate or whether additional information must be requested. If the documentation is sufficient, Level
III could save time and cost.
Accuracy, precision and MDL information that is considered representative of this level of analytical
support was compiled from SW-846. This information is provided in Appendix F. These procedures are
applicable for all sample matrices; however, the SW 846 information presented was derived from the
analysis of water and wastewater samples and performance evaluation standards. Therefore, the criteria
specified in this table should be considered as "best case" information when non-aqueous media samples
are analyzed. Also, these data are presented irrespective of the sample pretreatment or preconcentration
techniques used. These techniques may include liquid-liquid extraction (3520), acid/base-neutral
clean-up extraction (3530), soxhlet extraction (3540), sonication extraction (3550), headspace (5020),
and purge and trap (5030). They are used in conjunction with the analytical procedures presented in SW
846.
B.1.4 LEVEL II ANALYTICAL SUPPORT - FIELD ANALYSIS
Level II analytical support is designed to provide real-time data for ongoing field activities or when
initial data will provide the basis for seeking laboratory analytical support. Level II analysis can
also be utilized effectively when a phased approach is used for field sampling. In a phased sampling
effort, the results of the first phase guide the development of subsequent phases, and thus, real-time
data are important.
Field analysis involves the use of portable or transportable instruments which are based at or near a
sampling site. Field analysis should not be confused with the process of obtaining total organic
readings using portable meters.
Field analysis can provide data from the analysis of air, soil and water samples for many Hazardous
Substance List (HSL) organic compounds, including volatiles, base neutral acid (BNA) extractable
organics, and pesticides/PCBs. Inorganic analysis can also be conducted using portable atomic absorption
(AA) or other instruments.
Level II analyses are used for onsite, real-time screening, baseline data development, extent of
contamination, and on-site remedial activities.
Level II - field analytical techniques provide the following:
Rapidly available data for a variety of activities, including hydrogeologic investigations;
cleanup operations; and health and safety.
Detection limits for volatiles are about 0.5 ppb in air, 2-3 ppb in water, and 10 ppb in
soil. Detection limits for PCBs in soil are about 1.0 ppm. Detection limits for extractable
organic compounds analyzed in mobile labs are in the vicinity of 10 ppb.
Special applications - e.g., vadose zone monitoring.
Volatile organic data can be used as early indicators or tracers of off-site contaminant
migration. Volatiles are the most mobile of organic contaminants in all media, and are
typically found at some concentration at virtually all sites.
The ability to assess data quality for field activities is dependent upon the QA/QC steps taken in the
process (e.g., documentation of blank injections, calibration standard runs, runs of qualitative
standards between samples, etc.).
B-6
-------
If capital expenditures are excluded, the costs of field analysis are in terms of personnel time in
performing analyses, preparation/maintenance of equipment, etc. Per sample costs for mobilizing and
staffing a field laboratory will decrease as the number of samples increases. Based on limited data from
Region I FIT experience, per-sample costs for volatile and inorganic analyses are approximately $15.
Per-sample costs for mobile laboratory analyses may approach $100. Depending on the type of analysis,
time requirements per analysis range from 10 minutes to 1-2 hours.
Since Level II analyses are performed in the field, the amount and type of documentation available will
vary with the type of analysis and the standard operating procedures used. Typically, a gas
chromatograph operated in the field provides the bulk of the analytical support at this level. The
documentation available utilizing this level of analytical support would consist of the output of an
integrator or strip chart recorder for all samples, standards, and blanks analyzed. Field and analysis
log books would also be a source of additional documentation.
Data generated by Level II analysis are typically confirmed by submitting some duplicate samples to CLP
and/or a local laboratory. Factors to consider in choosing the number (or subset) of samples to be
submitted for confirmational purposes include the following:
Total number of samples taken (i.e., when only a few samples are taken, 100 percent
confirmational analyses may be appropriate)
Objective of sampling
Data uses
Method of analyses used
In general, confirmational samples should include a subset (or all) of designated critical samples, a
subset of samples covering the entire range of identified concentrations, and a subset of samples near
the (preliminary) action level and near the "0" concentration or non-detectable range.
An additional factor to consider is the measured precision of the field instrument in use. When
precision is high, fewer samples need to be confirmed; if precision is low, analysis should be suspended
until the reason for the low precision is determined. A qualified chemist should be contacted for input
on instrument calibration, and the utility of the analysis method with the specific field conditions.
The data base for documenting accuracy, precision and MDL information for Level II analyses is sparse. A
number of factors have recently stimulated an interest in the development of Level II methods. This
activity is centered primarily in various EPA Environmental Service Divisions (ESD) and remedial
contractors. There are two ongoing projects expected to contribute significantly to the Level II data
quality criteria data base. These projects are an EPA Headquarters-directed compilation of all Level II
analytical methods currently used by Field Investigation Teams (FITs) and the operation of a mobile field
analytical laboratory being directed by EPA/ESD in Region IV. The Region IV project, in particular,
holds the promise of a significant contribution, since virtually all organic Hazardous Substance List
(HSL) parameters are being analyzed for. As these data become available they will be incorporated into
this document.
B.1.5 LEVEL I ANALYTICAL SUPPORT - FIELD SCREENING
The objective of Level I analysis is to generate data which are generally used in refining sampling plans
and determining the extent of contamination at the site. This type of support also provides real-time
data for health and safety purposes. Additional data which can effectively be obtained by Level I
analyses include pH, conductivity, temperature, salinity, and dissolved oxygen.
B-7
-------
Level I analyses are generally effective for total vapor readings using portable photoionization
or flame ionization meters which respond to a variety of volatile inorganic and organic
compounds. These analyses are available through ESD or remedial contractors.
Level I analysis provides data for onsite, real-time total vapor measurement, evaluation of existing
conditions, sample location optimization, extent of contamination, and health and safety evaluations.
Data generated from Level I support are generally considered qualitative in nature, although limited
quantitative data can also be generated. Data generated from this type of analysis provide the
following:
Identification of soil, water, air and waste locations which have a high likelihood of
showing contamination through subsequent analysis.
Real-time data to be used for health and safety consideration during site reconnaissance and
subsequent intrusive activities.
Quantitative data if a contaminant is known and the instrument is calibrated to that
substance.
Presence or absence of contamination.
The most available form of documentation for this support level is the field operator log book. Sample
identification, location, instrument reading, calibration and blank information is usually contained in
the field log book. A hardcopy stripchart recorder output can be used with these instruments, but this
is not common practice.
There are no data quality criteria specified for Level I, Field Screening Support, because this level is
characterized by the use of hand-held instrumentation (PID, FID) which, in general, measures total organic
vapor concentrations only and, as such, is not conducive to the generation of quantitative data. In
specialized applications, FIDs can be calibrated to a specific compound and quantitative data can be
obtained. Specific information regarding individual compound sensitivities and response factors can be
obtained in the manufacturer's manual for specific instruments.
B.2 ANALYTICAL FACTORS
Other factors which may affect development of DQOs include the following:
Analytical quality control
Instrumentation options
Media variability
Method detection limit
Matrix effects
Tentatively identified organic compounds
Data qualifiers
B-8
-------
B.2.1 ANALYTICAL QUALITY CONTROL
The classification of analytical support into broad levels takes into account internal laboratory quality
assurance/quality control (QA/QC) in a general manner only. Internal QA/QC refers to the surrogate and
matrix spikes, method blanks, and duplicate/replicate runs, among other laboratory or field operation
quality control. Within a given level of analytical support, there may be differences in the way
individual laboratories or field operations approach internal QA/QC. For CLP Invitation for Bid (IFB)
RAS analytical support, the procedures are standardized and contract-specified.
When evaluating laboratory QA/QC, it is important for the reviewer to keep the level of analytical
support in perspective. These levels produce data of different quality and documentation and should be
reviewed with this in mind. For example, it would be inappropriate to hold a screening laboratory to CLP
RAS standards, or expect a field screening operation to have as rigorous QA/QC as a laboratory.
Expectations such as these would be inconsistent with the concept of classifying analytical support by
the quality of the data needed.
B.2.2 INSTRUMENTATION OPTIONS
In some cases, the decision maker may have the option of choosing between similar analytical procedures
for the analysis of a given parameter. Although each procedure is an EPA approved method, the reason for
the equivalent procedures is that different analytical instrumentation is used for each method. Although
the results obtained are equivalent, there can be subtle differences in the types of data produced by
different instrumentation. When choosing analytical procedures, consideration should always be given to
the instrumentation used in order to select the method that will best satisfy the stated analytical
requirements. One of the many examples of equivalent procedures using different instrumentation for the
analysis of the same parameters is the gas chromatography (GC) and gas chromatography/mass spectrometry
(GC/MS) procedures used for the analysis of organic compounds. An analytical chemist should be consulted
to select the appropriate procedures for the specific problems encountered at the site.
B.2.3 MEDIA VARIABILITY
Decision makers and data users should be aware that variability is introduced by the response of a given
analytical technique or method to a given sample medium. Most of the analytical methods utilized in
support of RI/FS activities were developed, at least originally, for aqueous samples and modified for use
with other media with varying results. Also, the quality control data published for most analytical
methods (concerning accuracy and precision information) were developed using aqueous samples. The
performance criteria published may not totally apply to the use of the method with other sample media.
When considering the analysis of source materials, leachate or other complex matrices, qualified
analytical support personnel should be consulted to determine the most appropriate analytical approach.
B.2.4 METHOD DETECTION LIMIT
Regardless of the specified method detection limit, the actual detection limit reported may be sample
specific. This is especially true of samples having complex sample matrices (i.e., samples containing
numerous analytes at widely-different concentration ranges). If the concentration of a particular sample
constituent is so high that it requires dilution prior to analysis, the resulting detection limit for
that sample will be raised by the dilution factor. For example, consider a sample being analyzed by
GC/MS for volatile organics. The laboratory's normal detection limit for benzene by this method is 4.0
ug/l, but the sample may contain a high concentration of volatiles, and have to be diluted (say by at
least a 1:10 ratio). The resulting detection limit for benzene will then be at least 40.0 ug/l. In some cases, the
laboratory can analyze the same sample twice to obtain the specified detection limit but this is not
always possible, is not considered standard practice, and would have to be specified prior to sample
submittal.
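The dilution arithmetic described above can be illustrated with a short calculation. The following
sketch (written in Python purely for illustration; the function name and values are examples, not part
of any EPA method) simply multiplies the nominal detection limit by the dilution factor:

    # Sketch of the dilution arithmetic described above; names and values are illustrative only.
    def diluted_detection_limit(nominal_dl, dilution_factor):
        """Return the effective detection limit after sample dilution."""
        return nominal_dl * dilution_factor

    nominal_dl = 4.0   # ug/L, laboratory's normal detection limit for benzene
    dilution = 10      # 1:10 dilution required by high-level volatiles
    print(diluted_detection_limit(nominal_dl, dilution))   # 40.0 ug/L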
-------
It is important to recognize that quantitative results reported at the detection limit may not be
reliable. If the action level of a contaminant is 5.0 ug/l, an analytical method with a detection limit
of 5.0 ug/l may not provide suitable data to meet the criteria. For example, the action level for
trichloroethene (TCE), defined under the Safe Drinking Water Act as a proposed Maximum Contaminant Level
(MCL), is 5.0 ug/l. Analytical Method 624 for volatile organics by GC/MS has a detection limit of 5.0
ug/l. Thus, Method 624 may not be acceptable for this application. The magnitude of the action level
and the detection limit must be considered in selecting a procedure.
B.2.5 MATRIX EFFECTS
A matrix effect is a phenomenon that occurs when the sample composition interferes with the analysis of
the analyte(s) of interest. This can bias the sample result either in a positive or in a negative way,
with the negative bias being the most common.
The magnitude of a matrix effect is best assessed by the use of matrix spikes. Matrix spikes supply
percentage recovery information which addresses the amount of bias present in the measurement system.
This information can be used to adjust reported concentrations by the application of a correction factor
based on percentage recovery. It is not recommended that sample values actually be adjusted for percent
recovery unless a worst-case scenario is being developed.
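For illustration only, the percent recovery calculation and the (generally discouraged) recovery-based
correction described above can be sketched as follows; the numerical values and function names are
hypothetical:

    # Illustrative sketch of percent recovery from a matrix spike and the
    # worst-case recovery correction; names and values are not prescribed.
    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery = 100 * (spiked - unspiked) / amount spiked."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def corrected_value(reported, recovery_pct):
        """Worst-case adjustment of a reported concentration for recovery."""
        return reported * 100.0 / recovery_pct

    rec = percent_recovery(spiked_result=18.0, unspiked_result=10.0, spike_added=10.0)
    print(rec)                           # 80.0 percent recovery
    print(corrected_value(10.0, rec))    # 12.5, worst-case adjusted concentration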
B.2.6 TENTATIVELY IDENTIFIED ORGANIC COMPOUND (TIC)
Under the CLP RAS procedures, up to 30 non-HSL peaks present in the reconstructed ion chromatogram are
identified as tentatively identified compounds (TICs). Other laboratories may not address TICs or may have
different reporting criteria. If compounds of interest are tentatively identified by GC/MS and are high
in spectra matching criteria (above 90 percent match) and above action levels, samples may be re-run
against a standard in order to verify the compound's identity. Chromatographic retention time is an
important factor in assessing the reliability of a tentative identification. Approaches for providing
more reliable tentative identifications are under development.
B.2.7 DATA QUALIFIERS
When analytical data are validated, the analytical results and the associated QA/QC information are
reviewed using criteria specific to the analysis performed. This review can range from superficial to
very rigorous, depending on the level of analytical support utilized and the type of technical review
requested by the data user.
Data qualifiers are commonly used during the data validation process to classify sample data according
to their conformance with QC requirements. The most common qualifiers are listed below:
A - Acceptable
J - Estimate, qualitatively correct but quantitatively suspect
R - Reject, data not suitable for any purpose
U - Not detected at the specified detection limit (e.g., 10U)
Sample data can be qualified with a "J" or "R" for many different reasons. Poor surrogate recovery,
blank contamination, or calibration problems, among other things, can cause sample data to be qualified.
Whenever sample data are qualified, the reasons for the qualification are stated in the data validation
report. Data users are reminded that data validation is generally performed using strict analytical
criteria which do not take the sampling activity's DQOs into account. Data users should request that the
B-10
-------
technical staff interpret the validation report according to the sampling activity's objectives and data
uses. For example, data qualified with a "J" may be perfectly suitable for some data uses.
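As a purely illustrative sketch of interpreting qualifiers against intended data uses, the following
fragment applies the qualifier list above; the data-use categories and the screening rule are
assumptions made for the example, not validation requirements:

    # Hypothetical screening of validated results by qualifier and intended data use.
    QUALIFIER_MEANING = {
        "A": "acceptable",
        "J": "estimated (quantitatively suspect)",
        "R": "rejected",
        "U": "not detected at the stated detection limit",
    }

    def usable(qualifier, data_use):
        """Return True if a result with this qualifier may support the data use (illustrative rule)."""
        if qualifier == "R":
            return False                      # rejected data serve no purpose
        if qualifier == "J":
            return data_use == "screening"    # estimates may still support screening uses
        return True                           # "A" and "U" results are usable as reported

    for q in ("A", "J", "R", "U"):
        print(q, QUALIFIER_MEANING[q],
              "screening:", usable(q, "screening"),
              "risk assessment:", usable(q, "risk assessment"))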
B.3 ANALYTICAL ERROR
Analytical errors can be estimated for each compound or element of interest by method. Analytical error
should be calculated for non-standard (Level V) or field (Level I) methods when possible.
In order to determine potential analytical errors, the accuracy and precision of the method must be
known. The information required to develop meaningful calculations of analytical error includes
interlaboratory information for matrix spikes, surrogate recoveries, duplicates, and blind performance
evaluation standards for each compound analyzed by each analytical procedure, as follows:
Statistical Information - N, bias, RSD of percent recovery, concentration of spike, and
concentration of analyte
Matrix - air, aqueous, soil/sediment, leachate or source material
Concentration Range - Liquids: 0-10 ug/l; 10-100 ug/l; 100-1000 ug/l; or >1000 ug/l. Solids:
<10 ug/kg; 10-1000 ug/kg; or >1000 ug/kg
If the above listed information is available, analytical errors could be predicted for the majority of
analyses conducted in support of remedial actions.
For example, based on interlaboratory spike recoveries for benzene in ground water in the 0 to 10 ug/l
concentration range using Method 624, the confidence interval at the 95 percent confidence level can be
stated. This statement would be further qualified based on the number and types of laboratories, other
types of performance evaluation criteria, matrix strength, and other pertinent analytical information.
The detailed statistical information described above is not presently available. The accuracy and
precision information that is available is given in Appendix F.
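If the statistical information listed above were available, a confidence interval of the kind described
in the benzene example could be stated along the following lines. This is a minimal sketch that assumes
a normal approximation with a 1.96 coverage factor; the reported value, mean recovery, and RSD shown are
hypothetical:

    # Sketch of stating a 95 percent confidence interval for a reported concentration
    # from interlaboratory spike-recovery statistics; normality is assumed for illustration.
    def confidence_interval(reported, mean_recovery_pct, rsd_pct, z=1.96):
        """Return (low, high) bounds on the true concentration."""
        bias_corrected = reported * 100.0 / mean_recovery_pct
        std_dev = bias_corrected * (rsd_pct / 100.0)
        return bias_corrected - z * std_dev, bias_corrected + z * std_dev

    # Hypothetical benzene result of 8.0 ug/L, 90 percent mean recovery, 15 percent RSD
    low, high = confidence_interval(8.0, mean_recovery_pct=90.0, rsd_pct=15.0)
    print(round(low, 2), round(high, 2))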
B.3.1 LEVEL IV
Precision and bias data provided by the CLP RAS to be used in the estimation of analytical confidence
limits include:
Interlaboratory volatile organic matrix spike duplicate data for water and soil samples (N,
percent RSD, percent RSD at 85th percentile)
"Interlaboratory" surrogate recovery data of generated volatile compounds from water and soil
material (N, bias percent, percent RSD)
Interlaboratory performance evaluation standard data for volatile and semi-volatile organic
compounds in water and soil (N, bias percent, percent RSD).
In all cases, the data base has been sanitized, i.e., outliers have been removed. In the case of
"interlaboratory" surrogate recoveries, the data base should not be considered interlaboratory in the
classic sense (the same sample submitted to a number of laboratories), but it is a close approximation.
The same chemical surrogates are added to samples in individual laboratories but the laboratories are not
recovering the surrogate from the same matrix. In addition, recovery data should be provided for air,
leachate, and source material matrices.
All of this information can still be used individually or in concert to develop uncertainty statements
but with some inherent limitations.
B-11
-------
The interlaboratory matrix spike data as provided do not stratify the data with respect to
concentration. Using these data requires the assumption that matrix recovery is a linear
function of concentration.
The "interlaboratory" surrogate recovery data are generally for one concentration range and
as a result do not account for variability of accuracy as a function of concentration; assume
that all analytes act as surrogate during the analytical process: and do not account for
interlaboratory variations associated with different matrices.
Interlaboratory performance evaluation standard data can probably be considered a "best case"
for the development of uncertainty statements (actual samples would have a greater degree of
uncertainty). The uncertainty associated with these data does not account for true sample
matrix effects or a wide range of analyte concentrations; as a result, the actual
analytical uncertainty could only be worse than that estimated using this data set. It does
have the advantage of being truly interlaboratory and blind (sample concentrations not known
by participating laboratories) and should be a true measure of analytical uncertainty for the
concentration range and matrix analyzed.
The best estimate of analytical uncertainty would be a composite of the uncertainty associated with
matrix spikes and the uncertainty associated with performance evaluation standards (interlaboratory
performance only).
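The manner in which the two uncertainty components are to be composited is not specified here; one
common convention, assumed purely for illustration, is to combine independent relative standard
deviations in quadrature, as in the following sketch (the RSD values are hypothetical):

    # Root-sum-of-squares combination of two independent RSDs; the composition rule
    # is an assumed convention, not a requirement of this guidance.
    import math

    def composite_rsd(matrix_spike_rsd, pe_standard_rsd):
        """Combine two independent relative standard deviations (percent) in quadrature."""
        return math.sqrt(matrix_spike_rsd**2 + pe_standard_rsd**2)

    print(composite_rsd(20.0, 12.0))   # about 23.3 percent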
B.3.2 LEVEL III
The available information to estimate uncertainty for Level III is the accuracy and precision statements
included with the individual EPA approved procedures in SW-846. While this information is rarely
stratified as to matrix and concentration, it could serve as a starting point (best case) from which the
uncertainty associated with the actual analytical conditions could be estimated.
B.3.3 LEVEL II
The most important factor that influences the uncertainty associated with Level II analyses is the skill
of the analyst doing the work. Because the procedures are not formalized, a great deal of improvisation
usually takes place. The inherent variability of the procedures themselves would make the development of
a centralized quality assurance data base tenuous. The same reasoning would apply to making uncertainty
predictions based on a centralized data base.
B.3.4 LEVEL I
Level I analyses are qualitative, and therefore it is not possible to quantify the uncertainty in these
methods.
B.4 REFERENCES
The following references can be consulted for further information on analytical considerations. This
list presents a representative sample of documents.
Aleckson, K.A., J.W. Fowler, and Y.T. Lee. 1986. Inorganic Analytical Methods Performance and
Quality Control Considerations. In: Quality Control in Remedial Site Investigation: Hazardous and
Industrial Solid Waste Testing, Fifth Volume, ASTM STP 925.
American Public Health Association, American Water Works Association, and Water Pollution Control
Federation. 1975. Standard Methods for the Examination of Water and Wastewater. 14th Ed.
B-12
-------
American Society for Testing Materials. 1976. Annual Book of ASTM Standards, Part 31, "Water",
Standard D3223-73, p. 343
Anderson, D.C., K.W. Brown, and J. Green. 1981. Organic Leachate Effects on the Permeability of Clay
Liners. National Conference on Management of Uncontrolled Hazardous Waste Sites, pp. 223-229.
October 28-30, 1981, Washington, DC.
Bishop, J.N. 1971. Mercury in Sediments, Ontario Water Resources Comm., Toronto, Ontario, Canada.
Boston Society of Civil Engineers. 1985. Controlling Hazardous Wastes. Lecture Series.
Brandenberger, H. and H. Bader, 1967. The Determination of Nanogram Levels of Mercury in Solution by
a Flameless Atomic Absorption Technique, Atomic Absorption Newsletter (6), 101
CDM. 1986 Draft Memorandum re: XRF Field Analysis of Smuggler Mountain Soil Samples from R. Chapp
R. Olsen to J. Hillman January 13, 1986 EPA Contract No. 68-01-6939 Document No. 149-EP-CCCU-l.
Federal Register, Organochlorine Pesticides and PCBs, Method 608; 2,3.7,8-TCDD, Method 613;
Purgeables (Volatiles), Method 624; Base/Neutrals, Acids and Pesticides, Method 625; Vol. 44, No.
233, Monday, December 3, 1979, pp. 69501, 69526, 69532 and 69540.
Flotard, R.D., M.T. Homsher, J.S. Wolff and J.M. Moore. 1986. Volatile Organic Analytical Methods
Performance and Quality Control Considerations. In: Quality Control in Remedial Site
Investigation: Hazardous and Industrial Solid Waste Testing, Fifth Volume ASTM STP 925.
Garbarino, J.R. and H.E. Taylor, 1979. An Inductively-Coupled Plasma Atomic Emission Spectrometric
Method for Routine Water Quality Testing. Applied Spectroscopy 33, (3)
Garner, F.C., M.T. Homsher, and J.G. Pearson. 1986. Performance of USEPA Method of Analysis of
2,3,7,8-Tetrachlorodibenzo-p-dioxin in Soils and Sediments by Contractor Laboratories. In:
Quality Control in Remedial Site Investigation: Hazardous and Industrial Solid Waste Testing,
Fifth Volume, ASTM STP 925.
Goulden, P.D. and B.K. Aighan. 1970. An Automated Method for Determining Mercury in Water.
Technicon, Adv. in Auto. Analy. 2 317
Hatch, W.R. and W.L. Ott, 1968. Determination of Sub-Microgram Quantities of Mercury by Atomic
Absorption Spectrophotometry Analytical Chemistry 40: 2085.
Kopp, J.F., Longbottom, M.C. and Lobring, L.B. 1972. Cold Vapor Method for Determining Mercury.
AWWA, 64:20.
Martin, T.D. (EMSL/Cincinnati). Inductively Coupled Plasma - Atomic Emission Spectrometric Method of
Trace Elements Analysis of Water and Waste, Method 200.7, Modified by CLP Inorganic Data/Protocol
Review Committee.
Martin. T.D., J.F. Kopp, and R.D. Ediger, 1975. Determining Selenium in Water, Wastewater, Sediment
and Sludge by Flameless Atomic Absorption Spectroscopy. Atomic Absorption Newsletter 14: 109
Shackelford. W.M.. D.M. Cline. L. Faas. and G. Kurth. 1983. An Evaluation of Automated Spectrum
Matching for Survey Identification of Wastewater Components by Gas Chromatography - Mass
Spectrometry. Analytica Chimica Acta.
B-13
-------
Technicon Industrial Systems. 1980. Operation Manual for Technicon Auto Analyzer 11C System.
Technical Pub. #TA9-0460-00, Tarrytown, New York.
U.S. EPA. 1979. Handbook for Analytical Quality Control in Water and Wastewater Laboratories,
USEPA-600/4-79-019.
. 1973. Handbook for Monitoring Industrial Wastewater, USEPA Technology Transfer.
. 1974. Methods for Chemical Analysis of Water and Wastewater, USEPA Technology Transfer.
. 1979. Methods for Chemical Analysis of Water and Wastes. 600/4-79-20.
. 1981. EMSL, Users Guide for the Continuous Flow Analyzer Automation System. Cincinnati, Ohio.
. 1982. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods. SW-846. 2nd
Edition.
. 1982. Office of Solid Waste and Emergency Response, Modification (By Committee) of Method
3050. SW-846, 2nd Ed., Test Methods for Evaluating Solid Waste. July.
. 1984. Soil Properties, Classification, and Hydraulic Conductivity Testing. Draft-Technical
Resource Document for Public Comment. SW-925.
. 1984. Solid Waste Leaching Procedure. Draft-Technical Resource Document for Public Comment.
SW-924.
. 1984. Toxicity Characteristic Leaching Procedure (TCLP) - Draft Method 13XX. TK0703.
. 1984. OERR. User's Guide to the Contract Laboratory Program.
. 1984. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods. SW-846.
. 1984. Calculation of Precision, Bias and Method Detection Limit for
Chemical and Physical Measurements. (QAMS Chapter 5.)
. 1986. Demonstration of a Technique for Estimating Detection Limits with Specified Assurance
Probabilities. Contract No. 68-01-6939.
Winefordner, J. D., Trace Analysis: Spectroscopic Methods for Elements. Chemical Analysis, Vol. 46:
41-42.
Winge, R.K., V.J. Peterson, and V.A. Fassel, 1979. Inductively Coupled Plasma - Atomic Emission
Spectroscopy Prominent Lines. EPA-600/4-79-017.
Wolff, J.S., M.T. Homsher, R.D. Flotard and J.G. Pearson. 1986. Semi-volatile Organic Analytical
Considerations. In: Quality Control in Remedial Site Investigation: Hazardous and Industrial
Solid Waste Testing, Fifth Volume, ASTM STP 925.
B-14
-------
APPENDIX C
SAMPLING CONSIDERATIONS
-------
APPENDIX C
SAMPLING CONSIDERATIONS
The error introduced by sampling procedures must be considered during the development of DQOs. Factors
that can introduce sampling error include sampling/handling variability and the variability of
contaminants as a function of location and time. The magnitude of each of these factors is largely site
specific. The site specific nature of sampling errors distinguishes sampling from analytical errors,
which are largely independent of site conditions.
This section focuses on factors that influence sampling errors and provides general guidance on sampling
considerations to be evaluated during DQO development. It does not discuss specific sampling methods or
provide strict guidelines for sampling design. The RPM and the site manager are responsible for ensuring
that the appropriate technical experts are involved in development of the site-specific S&A plan.
C.1 SAMPLING STRATEGY
In designing a sampling plan there are many factors which must be considered. Some of these factors,
such as the physical characteristics of the site (geology, hydrogeology, physiography), are unique to
each site. However, there also are several general factors which must be considered for all sites. The
general factors include decisions addressed during the DQO process such as:
Will a phased approach be used?
Will samples be collected for site characterization?
Will samples be collected for confirmation purposes?
Will grab or composite samples be collected?
Will a grid system be used?
The importance of each of these factors varies from site to site, and therefore must be analyzed
individually.
For sites at which a significant amount of data have been generated during preliminary assessments and
site investigations, a focused approach to the RI can be developed. For sites for which little or no
data are available or data are inconclusive, a broader approach to site investigations must be
implemented. Data which are inconclusive or unvalidated may be appropriate for data uses requiring lower
data quality (e.g., as indicators of areas requiring further study or confirmation).
C.2 SAMPLING PROGRESSION
In the DQO process it may be necessary to identify a sampling approach before sufficient information has
been gathered to use the statistical methods discussed in Appendix A. In these cases, it may be
beneficial to use a phased data collection approach. In a phased approach, samples are collected in a
series of independent sampling events. The first phase may be undertaken for site characterization
purposes while subsequent phases use the information generated by earlier phases to fill in data gaps.
If a mobile lab is utilized, phases may be continuous as results are analyzed and data gaps are
identified and filled. The DQO process applies to each phase of an RI and for each sampling task.
Initial sampling undertaken during the first phase may not yield specific information since little or no
site specific data may be available. However, in subsequent phases of the RI more data will be available
for decision making.
C-l
-------
A phased approach to sampling is, in most cases, a cost-effective method since areas of concern are
identified in the early phases and are then targeted for additional sampling. When sampling is performed
in only one phase, every conceivable target must be completely sampled. If one or several of the targets
prove to be uncontaminated, a large number of unnecessary samples will have been taken.
Phased approaches must be developed on a site specific basis but generally will follow sequentially from
general to progressively more detailed and sophisticated field sampling and analysis programs. The steps
likely to be included in a phased sampling approach include the following:
Review of existing information/data
Remote sensing/geophysical techniques
Field screening
Intrusive sampling
Pilot studies
C.2.1 REVIEW OF EXISTING INFORMATION/DATA
All sources of available information should be obtained and reviewed during the initial stages of RI/FS
project planning. It is especially important to obtain and review data from any previous investigations
gathered in the National Priorities List (NPL) ranking process, FIT and/or TAT team investigations, and
other data gathering activities conducted by the state or other parties. Detailed discussions of the
various data sources which should be accessed during review of existing information are contained in the
Remedial Investigation Guidance Document (EPA 1985).
C.2.2 REMOTE SENSING/GEOPHYSICAL TECHNIQUES
Remote sensing is a term applied to methods used for the detection, recognition, or evaluation of objects
or conditions by means of distant sensing or recording devices, including, but not limited to, aerial
photography or satellite imagery. Kroeck and Shelton (1981) discuss the application of aerial
photography to investigation of waste sites. Geophysical techniques are remote sensing methods used to
characterize subsurface conditions without excavation. Information on the application of geophysical
techniques to hazardous waste sites can be found in Benson et al. (1983).
Remote sensing/geophysical investigations should be used in the initial stages of RIs to provide an
overall sense of the site environs (aerial photographs) and subsurface conditions. These techniques may
also be used at later stages to provide a means for extrapolation of data obtained from disruptive
techniques. For example, soil borings installed at a site may reveal the presence of a clay lens over a
portion of a site which could affect ground water migration. Geophysical techniques could be used to
provide information on subsurface conditions between the soil borings. In the absence of this
information, an extrapolation of the soil strata between the borings may result in an erroneous
interpretation of subsurface conditions.
C.2.3 FIELD SCREENING
Proper field screening techniques can be instrumental in reducing the time it takes to perform an RI/FS,
reducing costs, reducing the number of "intrusive" sampling locations, and, in general, leading to more
effective use of Level III and IV analyses.
Field screening is primarily used to provide indications of contamination at analytical Levels I and II.
Thus, the decisions that will be based on the results of this type of sampling are in many cases yes/no
C-2
-------
type decisions. For instance, on the basis of soil gas screening it may be determined that contamination
of a particular unconfined aquifer is indicated and further direct sampling is warranted.
C.2.4 INTRUSIVE SAMPLING
Intrusive sampling includes all methods in which a physical sample from the media of concern is obtained.
Intrusively obtained samples are used to obtain a numerical value for a physical or chemical measurement
at a particular point. Intrusive sampling provides much more exact information concerning the
concentration of contaminants or physical features than non-intrusive remote sensing or field screening
techniques.
C.2.5 PILOT STUDIES
Pilot studies are undertaken to obtain data to assess the applicability of various proposed alternatives
for site remediation in a controlled manner. Pilot studies can also be undertaken to evaluate the
effectiveness of various unit processes for treatment of a contaminant source at a site or for developing
data needed to optimize system design and operation. The results of pilot treatability studies are used
to develop design criteria; develop cost estimates; and identify any special management or operational
constraints which must be implemented in order to utilize the system under study. Analytical Levels II,
III, IV, or V may apply to pilot studies.
C.3 SOURCES OF VARIABILITY
To determine the uncertainty associated with a decision, all sources of variability must be taken into
consideration. Important sources of variability are sampling/handling variability and the variability of
contaminants as a function of location (spatial) and time (temporal). Of these three sources, the
variability of the contaminants as a function of location is expected to be the largest.
C.3.1 SAMPLING/HANDLING VARIABILITY
Sampling/handling variability is defined as any variability introduced by the sampling and/or handling
procedures, resulting in a contaminant concentration in the sample that is different than the
concentration in the original media. Causes of sampling variability include incorrect sampling
procedures and cross contamination. Since most of the causes of sampling/handling variability are
related to errors in procedures, measurement of sampling variability is difficult. The magnitude of
sampling variability can range from small to very large; however, if correct sampling and handling
procedures are followed, sampling variability should be small compared to laboratory variability.
Sampling/handling variability can be reduced by training sampling personnel and performing all sampling
activities in accordance with standard operating procedures (SOPs). SOPs are developed to ensure that
any samples collected are representative of the undisturbed media of interest. By adhering to the SOPs,
intra- and intersite variability for a given sampling method are greatly reduced or eliminated.
C.3.2 TEMPORAL VARIABILITY
Many observed contaminant concentrations are dependent on time related variables such as the time of day
or season of the year. The important variable linking concentration and time is often climatological
(e.g., temperature or rainfall). Since the linking variables (temperature, for instance) follow cyclical
patterns over a day or year, time dependent contaminant levels are also expected to follow cyclical
patterns. To obtain representative samples of time-related variables, it is important to identify the
cyclical nature of the contaminant concentrations and to sample at various phases of the cycle.
-------
C.3.3 SPATIAL VARIABILITY
Spatial variability describes the manner in which contaminants vary as a function of location. Although
this source of variability is normally not considered explicitly, it is implicitly expected. The
magnitude of the difference in contaminant concentrations in samples separated by a fixed distance is a
measure of spatial variability. The level of spatial variability is site and contaminant specific. When
spatial variability is high, a single sample is likely to be unrepresentative of the average contaminant
concentration in the media surrounding the sample. Although it is important to recognize the nature of
spatial variability at all times, it is crucial when the properties observed in a single sample will be
extrapolated to the surrounding volume. Thus, when analyzing the results from a single ground water
sample, spatial variability is not important; however, when attempting to determine the mean contaminant
concentration over a portion of a site, or attempting to extrapolate or interpolate concentrations,
spatial variability is important. Analysis of spatial variability is accomplished using geostatistics.
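As a minimal illustration of one geostatistical tool, the following sketch computes an empirical
semivariogram for a short, hypothetical transect of soil results; the data, spacing, and binning
tolerance are invented for the example:

    # Empirical semivariogram for a one-dimensional transect of hypothetical concentrations.
    def empirical_semivariogram(locations, values, lag, tol):
        """Average squared half-differences for pairs separated by roughly `lag`."""
        diffs = []
        n = len(locations)
        for i in range(n):
            for j in range(i + 1, n):
                h = abs(locations[i] - locations[j])
                if abs(h - lag) <= tol:
                    diffs.append(0.5 * (values[i] - values[j]) ** 2)
        return sum(diffs) / len(diffs) if diffs else float("nan")

    x = [0, 10, 20, 30, 40, 50]               # sample locations (ft)
    c = [5.0, 7.0, 6.0, 12.0, 15.0, 14.0]     # hypothetical concentrations (mg/kg)
    for lag in (10, 20, 30):
        print(lag, round(empirical_semivariogram(x, c, lag, tol=1.0), 2))

A semivariogram that grows with separation distance, as in this toy example, indicates that nearby
samples resemble one another more than distant samples do, which is the behavior exploited when
interpolating between sampling points.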
C.4 SAMPLE TYPES
During the DQO development process the decision maker and data users must determine which types of
samples should be obtained during the RI. The types of samples required to characterize a site may
differ from those required to perform a pilot study. An evaluation of the intended use of the data must
be undertaken in order to ensure that the type of sample obtained provides the necessary information to
address the issues of concern. In determining the types of samples which should be obtained the
following issues should be considered:
Media vs. waste samples
Grab vs. composite samples
Filtered vs. unfiltered samples
Biased vs. unbiased sampling
C.4.1 MEDIA VS. WASTE SAMPLES
Media or environmental samples refer to sampling of air, water, soils, and other environmental media to
determine the extent of contamination. Waste samples refer to the sampling of the actual wastes.
Typically this will mean drums, impoundments, tanks, or other waste disposal areas.
Sampling will typically involve both investigation of general environmental media and specific waste
accumulation areas. General questions regarding environmental media include:
Which media are contaminated? (air, water, soil, ground water, biota)
What is the average contamination?
What is the total contamination? (mass, volume)
What is the maximum contamination? (concentration)
What area of the site is contaminated?
What is the vertical and horizontal extent of contamination?
Waste samples are those collected from drums, tanks, lagoons, pits, waste piles, fresh spills, and other
areas of waste accumulation. The specific area or container being sampled differs from the media samples
C-4
-------
in two ways: (1) the questions asked of the data and (2) the general characteristics of the materials
being sampled. The most common questions are concerned with waste characterization:
What compounds are present?
Do these contaminants exceed any criteria or standards?
C.4.2 COMPOSITE VS. GRAB SAMPLES
Grab samples are discrete aliquots which are representative of a specific location at a specific point in
time. Composite samples represent the mixing of a number of grab samples and represent an average value.
In the most common case, two or more grabs are added to the same container, mixed, and then a single
aliquot is taken from the mixture. However, other forms of composite sampling exist, such as radiation
badges or body samples used for lead readings. In both of these cases, the measurements integrate over a
number of hours and do not represent a single sampling location or time.
When developing or reviewing a sampling plan, it is important to consider the uses of grab and composite
samples. Grab samples offer the most information regarding contaminant variability. Since compositing
involves combining several grab samples, estimation of overall site properties using composites is less
expensive than using grabs due to reduced analytical costs. However, compositing does not allow the
spatial variability of data to be determined, so the confidence in a composite value may be impossible to
determine. Composite samples should not be used when there is a potential risk of dangerous chemical
reaction or when a measure of spatial variability is important.
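The trade-off described above can be seen in a small numerical sketch (the grab results are
hypothetical): an idealized composite reproduces the mean of its constituent grabs, but the grab-to-grab
spread, and therefore any hot spot, is no longer recoverable from the single composite value.

    # Contrast between grab results and an idealized, perfectly mixed composite.
    import statistics

    grabs = [2.0, 3.5, 40.0, 4.5]             # hypothetical grab results, mg/kg
    composite = sum(grabs) / len(grabs)       # idealized composite of the same aliquots

    print("composite estimate of the mean:", composite)                       # 12.5
    print("grab-to-grab standard deviation:", round(statistics.stdev(grabs), 2))
    # A single composite value of 12.5 would give no hint of the 40.0 hot spot.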
C.4.3 BIASED VS. UNBIASED SAMPLING
Biased sampling refers to a sampling scheme whose resulting data place emphasis on a single
characteristic or factor of the problem. Unbiased sampling refers to sampling methods which allow
estimates to be drawn from the data which are representative of the population at large. These terms
usually can be considered synonymous with non-random and random sampling, respectively.
Biased sampling is most common during the site investigation (SI) process. The purpose of the SI is to
find out whether any contamination is present. Thus, these studies are typically conducted in ways that
maximize the chance of analyzing samples which have contamination above a particular criterion. The use
of direct reading instruments to screen samples is a good example of biased sampling. The samples which
are finally analyzed using, for example, GC/MS will represent higher contamination than might exist
overall at the site. This type of sampling is typically acceptable for the SI. In the RI/FS, this type
of sampling may be acceptable in cases where design of a treatment system is dependent on the maximum
treated load.
Unbiased sampling is performed by sampling on a regular grid. This type of sampling is unbiased because
each sample is representative of an identical volume of the medium being sampled. This type of sampling
is best for predicting overall site properties.
C.5 SAMPLING PATTERNS
When acquiring data which will be used to make general inferences concerning site characteristics, it is
important that samples provide complete coverage of the area of interest and that sample locations do not
introduce bias. Complete coverage is necessary to ensure that no areas of contamination are missed.
Bias in a data set causes the mean of the data to be systematically different from the true mean. Bias
is caused by any systematic error in data location, such as clustering of data. When data are clustered
(located close together) some small portions of the site are sampled more densely than the remainder of
the site. The particular contaminant value observed in the densely sampled area will be overrepresented
C-5
-------
in the estimate of the sample mean. If, as is often the case, samples are clustered in highly
contaminated areas, the mean site contaminant concentration will be overestimated.
Sampling patterns should be designed to minimize bias and provide complete site coverage. The best
sampling pattern for accomplishing both of these goals is a regular grid. It can be shown theoretically
(Ripley 1982) that data taken on a regular grid will yield a more precise estimate of the mean site
contamination than data located according to any other procedure. This fact, combined with the superior
coverage and non-biased property of regular sampling, makes it the preferred sampling pattern when
statistics will be applied.
The use of an unbiased approach during the initial sampling phases is recommended in order to ensure that
no area of the site is overlooked in sampling. Subsequent sampling phases should incorporate the
information resulting from the unbiased sampling which occurred during the initial phases. The data
should be used to identify areas in which additional samples should be obtained and areas where no
additional samples are required. Introduction of bias during subsequent phases may be justified in these
instances.
C.5.1 GRID SYSTEMS
Grid systems are used in developing systematic non-biased sampling plans in which samples are located at
consistent distances from one another. The most elementary grid system is a straight line between two
points on which regularly spaced sampling locations are noted. This type of one-dimensional sampling
grid may be useful for sampling along a straight drainage ditch or other man-made feature. The majority
of environmental sampling, however, requires a two-dimensional approach to sample location
identification.
Figure C-1 presents a two-dimensional square grid system for locating sampling points. The grid is
comprised of equidistant parallel lines at right angles to each other. Figure C-2 presents a
two-dimensional triangular grid system comprised of equidistant parallel lines intersected by lines drawn
at 60 degrees from vertical in both directions. Sampling generally is undertaken at the intersection of the
parallel lines which compose a grid, although other approaches such as sampling in the center of each
grid box or obtaining a composite of samples within a grid box are also acceptable. It may be
appropriate to modify the grid system to account for variations in concentration gradients as illustrated
in Figure C-3.
C.5.2 STRATIFICATION
Stratification refers to the process of locating samples within distinct populations or strata. Commonly
occurring strata are geological formations, soil horizons, and visually different areas of contamination.
Typically, the number of samples taken within each stratum is varied. For instance, initially, more samples
should be taken from a visibly contaminated soil horizon than from a soil horizon which is not visibly
contaminated. This approach needs to be used with caution and by experienced field personnel, as soil
(and other media) which is not visibly contaminated could well be contaminated. By varying the number of
samples in each stratum based on existing information or information obtained in the field, the sampling
program can concentrate on the most important aspects of the site. Stratification is thus a valuable
method for conserving resources.
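A simple way to picture varying sample numbers among strata is a weighted allocation such as the sketch
below; the strata, weights, and rounding rule are illustrative assumptions rather than prescribed
values:

    # Split a fixed sampling budget across strata in proportion to judgmental weights.
    def allocate_samples(total, weights):
        """Allocate `total` samples across strata; totals may shift slightly because of rounding."""
        weight_sum = sum(weights.values())
        return {name: max(1, round(total * w / weight_sum)) for name, w in weights.items()}

    weights = {
        "visibly contaminated soil horizon": 3.0,
        "underlying soil horizon": 1.5,
        "off-site background soils": 1.0,
    }
    print(allocate_samples(total=22, weights=weights))   # e.g., 12, 6, and 4 samples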
C.5.3 GRID SPACING
Spacings of grids are usually established to allow for sampling at each grid intersection. Alternative
approaches, in which only a subset of the grid intersections is sampled, can be used when a low intensity
investigation is performed preliminary to more intensive sampling following review of the data. For
example, a grid system may be
placed over a site with grid lines spaced at 10-ft intervals. During the preliminary investigation
samples may be obtained at every tenth intersection on every tenth grid line, thereby resulting in
C-6
-------
FIGURE C-1. SQUARE GRID SYSTEM (A=B)
C-7
-------
FIGURE C-2. TRIANGULAR GRID SYSTEM (A=B)
-------
FIGURE C-3. MODIFIED GRID SYSTEM TO ACCOUNT FOR DIRECTIONAL CORRELATION
C-9
-------
samples being obtained at 100-ft spacings. Following review of the preliminary data, intensive sampling
may be warranted in a number of discrete locations on the site. This intensive sampling may then be
performed at the previously established 10-ft grid intersections.
The distance between the grid lines will determine the number of intersections and hence the number of
potential sampling points within a specified area. As the grid line spacings increase, the number of
potential sampling points will decrease for any given sampling area. Confidence interval methodology can
be used to select optimal grid spacing.
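The relationship between grid spacing and the number of potential sampling points can be illustrated
with a short calculation for a hypothetical rectangular site; the dimensions and spacings shown are
examples only:

    # Number of square-grid intersections (potential sampling points) for a rectangular site.
    def grid_points(length_ft, width_ft, spacing_ft):
        """Coordinates of the intersections of a square grid covering the site."""
        xs = [i * spacing_ft for i in range(int(length_ft // spacing_ft) + 1)]
        ys = [j * spacing_ft for j in range(int(width_ft // spacing_ft) + 1)]
        return [(x, y) for x in xs for y in ys]

    for spacing in (10, 50, 100):
        pts = grid_points(length_ft=500, width_ft=300, spacing_ft=spacing)
        print(spacing, "ft spacing ->", len(pts), "potential sampling points")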
C.6 QUALITY CONTROL SAMPLES
Various types of samples may be obtained during a remedial investigation in order to provide quality
control information for interpretation of data including:
Background samples
Critical samples
Collocated and replicate samples
Split samples
Field and trip blanks
Matrix spikes
In all cases QC samples must be submitted to the laboratory as blind samples.
C.6.1 BACKGROUND SAMPLES
Inclusion of background samples in an RI sampling task must be taken into consideration during the DQO
process. A background sample is one taken from media characteristic of the site but outside the zone of
contamination. Monitoring data as well as available literature on natural background concentrations of
chemicals in the area should be collected, reviewed and/or verified to determine background conditions.
Background data should be defined as either natural or anthropogenic chemical contamination resulting
from a source or sources other than the site undergoing assessment.
C.6.2 CRITICAL SAMPLES
Critical data points are sample locations for which valid data must be obtained in order for the sampling
event to be considered complete. An example of a critical data point may be an upgradient well in a
ground water contamination study or any other data point considered vital to the decision making process.
Critical data points should be carefully considered in the sampling plan design. In some cases, taking
critical data point samples in duplicate is appropriate. A common problem of any sample design is the
loss of data during implementation of the design. Care must be taken to determine the set of points for
which data must be collected in order to analyze the results accurately. The set of points which must be
collected are called the "critical points." Critical points may be defined in terms of the minimum
number of data points which must be collected and analyzed. Critical data points should be identified in
every completeness statement developed during the DQO process.
C.6.3 COLLOCATED AND REPLICATE SAMPLES
Collocated samples are independent samples collected in such a manner that they are equally
representative of the parameter(s) of interest at a given point in space and time. Examples of
C-10
-------
collocated samples include: samples from two air quality analyzers sampling from a common sample
manifold, two water samples collected at essentially the same time and from the same point in a lake, or
side-by-side soil core samples.
Collocated samples, when collected, processed, and analyzed by the same organization, provide
intralaboratory precision information for the entire measurement system including sample acquisition,
homogeneity, handling, shipping, storage, preparation and analysis. Collocated samples, when collected,
processed, and analyzed by different organizations, provide interlaboratory precision information for the
entire measurement system.
Replicate samples are samples that have been divided into two or more portions at some step in the
measurement process. Each portion is then carried through the remaining steps in the measurement
process. A sample may be replicated in the field or at different points in the analytical process. For
field replicated samples, precision information would be gained on homogeneity (to a lesser extent than
for collocated samples), handling, shipping, storage, preparation, and analysis. For analytical
replicates, precision information would be gained on preparation and analysis. An example of a field
replicated sample is a soil core sample that has been collected and poured into a common container
for mixing before being split and placed in individual sample containers.
Collocated samples can be used to estimate the overall precision of a data collection activity. Sampling
error can be estimated by the inclusion of collocated and replicated versions of the same sample. If a
significant difference in precision between the two subsets is found, it may be attributed to sampling
error. As a data base on field sampling error is accumulated, the magnitude of sampling error can be
determined.
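Precision from collocated or replicate pairs is commonly expressed as a relative percent difference
(RPD); the following sketch computes RPD for a few hypothetical pairs:

    # Relative percent difference between paired collocated or replicate results.
    def relative_percent_difference(a, b):
        """RPD = 100 * |a - b| / mean(a, b)."""
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    collocated_pairs = [(10.0, 12.0), (55.0, 48.0), (3.0, 3.1)]   # hypothetical results
    for a, b in collocated_pairs:
        print(a, b, round(relative_percent_difference(a, b), 1), "percent RPD")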
The following are suggested guidelines for the inclusion of collocated and replicated samples in field
programs:
Ground and surface water - one out of every 20 investigative samples should be collocated.
Replicated samples could be substituted where appropriate. These samples should be spread
out over the sampling event, preferably at least one for each day of sampling.
Soil, sediments and solids - one out of every 20 investigative samples should be field
replicated or collocated. To estimate sampling error, collocated and field replicated
samples should be of the same investigative sample. These samples should be spread out over
the sampling event, preferably one per each day of sampling.
C.6.4 SPLIT SAMPLES
Split samples are replicate samples divided into two portions, sent to different laboratories, and
subjected to the same environmental conditions and steps in the measurement process. They serve as an
oversight function in assessing the analytical portion of the measurement system.
C.6.5 TRIP AND FIELD BLANKS
Trip blanks generally pertain to volatile organic samples only. Trip blanks are prepared prior to the
sampling event in the actual sample containers and are kept with the investigative samples throughout the
sampling event. They are then packaged for shipment with the other samples and sent for analysis. There
should be one trip blank included in each sample shipping container. At no time after their preparation
are the sample containers opened before they reach the laboratory.
Field blanks are defined as samples which are obtained by running analyte-free deionized water through
sample collection equipment (bailer, pump, auger, etc.) after decontamination, and placing it in the
appropriate sample containers for analysis. These samples will be used to determine if decontamination
-------
procedures have been sufficient. Using the above definition, soil field blanks could be called rinsate
samples. These should be included in a sampling program as appropriate.
The following guidelines for including blanks in sampling programs are suggested.
Ground and surface water - Field blanks should be submitted at the rate of one field
blank per matrix per day or one for every 20 investigative samples, whichever results in fewer
samples. Trip blanks should be included at a frequency of one per day of sampling or as
appropriate.
Soil sediments and solids - Rinsate samples should be submitted at the rate of one for every
20 investigative samples for each matrix being sampled or as appropriate.
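The counting rules suggested above for water field blanks and trip blanks can be illustrated as
follows; the numbers of investigative samples and sampling days are hypothetical:

    # Counting field blanks and trip blanks per the water guidelines above (illustrative inputs).
    import math

    def water_field_blanks(samples_per_matrix_per_day, sampling_days):
        """One per matrix per day, or one per 20 investigative samples, whichever is fewer."""
        per_day = sampling_days                        # one blank per matrix per day
        total_samples = samples_per_matrix_per_day * sampling_days
        per_twenty = math.ceil(total_samples / 20)     # one per 20 investigative samples
        return min(per_day, per_twenty)

    def trip_blanks(sampling_days):
        return sampling_days                           # one per day of sampling

    print(water_field_blanks(samples_per_matrix_per_day=8, sampling_days=5))   # 2 field blanks
    print(trip_blanks(5))                                                      # 5 trip blanks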
Guidelines for blank, duplicate, and background samples are provided in Table C-1. These guidelines
serve as a starting point from which to develop site-specific sampling plan QC sample numbers. In
certain instances, it may be appropriate to utilize known reference materials when available for QC
checking. The numbers and sources of reference materials which would provide meaningful comparison and
checks for media obtained from hazardous waste sites are limited. Analytical chemists should be
consulted regarding the appropriateness of use of reference materials as a QC check.
C.6.6 MATRIX SPIKES
Many samples exhibit matrix effects, in which other sample components interfere with the analysis of
contaminants of interest. Matrix spikes provide the best measurement of this effect. When done in the
field, immediately after collection, they also provide a measurement of sampling, handling and
preservation error. The field matrix spike does provide the best overall assessment of accuracy for the
entire measurement system, as collocated samples do for precision assessment. However, there are some
serious issues regarding the field spiking of environmental samples that must be considered. Field
matrix spikes are generally not recommended because of the high level of technical expertise required for
proper use and their sensitivity to environmental variables.
The major problems associated with field matrix spikes are due to the fact that all spike recovery data
must be interpreted very carefully. Spike recoveries are subject to many competing factors, such as
analyte stability, holding time, and the sample matrix. Because of the inherent variability associated
with spike recoveries, the additional variability introduced by spiking samples in the field can increase
the overall uncertainty associated with a data set rather than decrease it.
The two most important issues to address when considering field spiking as an option are the source of
the spiking material and the technical capability of the person doing the spiking. Spiking materials
that can be used are Standard Reference Materials (SRMs), EPA quality control ampules, or
laboratory-prepared solutions made from pure compounds. SRMs are stand-alone standards prepared by NBS
that can be placed in the appropriate sample containers and sent to the laboratory to be analyzed. The
use of certified standards such as SRMs solves the "traceability" issue concerning the integrity of the
blind standard and also does not require a skilled technician to prepare the standard. However, because
the SRM is a stand-alone sample, it provides no information on the impact of the sample matrix on the
measurement system. An aliquot of an SRM can be used to spike an environmental sample, but it would no
longer be traceable and would require a person skilled in the appropriate analytical techniques, just as
the use of quality control ampules or laboratory-prepared spikes does. The competence of the person doing
the spiking is critical. The exact amount of spiking material must be recorded for future use in
assessing recoveries. Errors in measurement of the spike or use of the wrong spiking material will cause
serious problems in interpreting the usability of the data.
C-12
-------
TABLE C-1
GUIDELINES FOR MINIMUM QA/QC SAMPLES FOR FIELD SAMPLING PROGRAMS

            FIELD COLLOCATED   FIELD          TRIP          BACKGROUND      INTER-LAB
MEDIA       OR REPLICATE       BLANK          BLANK         SAMPLE          SPLIT SAMPLE

Aqueous     one in twenty      one in         one per day   min. of two     when required
                               twenty         of sampling   per sampling    to meet
                                                            event-media     objectives

Soil,       one in twenty      one in         one per day   min. of two     when required
sediment                       twenty         of sampling   per sampling    to meet
                                                            event-media     objectives

Air         one in twenty      not            --            min. of two     when required
                               available                    per sampling    to meet
                                                            event-media     objectives

Source      one in twenty      not usually    --            --              when required
material                       required                                     to meet
                                                                            objectives
NOTE: This table is provided to serve as a guideline only; QA/QC sample requirements must be
developed on a site-specific basis. Laboratory blanks and spikes are method specific and are
not included in this table.
-------
In summary, field matrix spikes are not recommended unless the appropriate technical support is
available. Absolute attention to all details is required to obtain useful information from the
procedure. If field matrix spikes are used, the results should be compared with laboratory matrix spike
results.
C.7 REFERENCES
Benson, R.C., R.A. Glaccum, and M.R. Noel. 1984. Geophysical Techniques for Screening Buried Wastes and
Waste Migration, U.S. EPA Environmental Monitoring Systems Laboratory-Las Vegas.
Charlie, W.A., R.E. Wardwell, and O.B. Andersland. 1979. Leachate Generation from Sludge Disposal
Area. ASCE Journal of Environmental Engineering Division, Vol. 105, EE5:947-960.
Claassen. H.C. 1982. Guidelines and Techniques for Obtaining Water Samples that Accurately
Represent the Water Chemistry of an Aquifer. USGS Open File Report 82-1024. Lakewood, CO.
Clay, P.P. and T.M. Spittler. 1982. The Use of Portable Instruments in Hazardous Waste Site
Characterizations, Proceedings of the Third National Conference on Management of Uncontrolled
Hazardous Waste Sites, Washington, D.C.
Ecology and Environment, Inc. 1982. FIT Operations and Field Manual, HNu Systems PI 101
Photoionization Detector and Century Systems Model OVA-128 Organic Vapor Analyzer, prepared for
U.S. EPA, Washington, D.C.
Evans, R.B. 1986. Groundwater Monitoring Data Quality Objectives for Remedial Site Investigations.
In: Quality Control in Remedial Site Investigations: Hazardous and Industrial Solid Waste
Testing Fifth Volume ASTM STP 925.
Fuller, W.H., A. Amoozegar-Fard, E.F. Niebla, and M. Boyle. 1980. Influence of Leachate Quality on
Soil Attenuation of Metals. In: Disposal of Hazardous Wastes, pp. 108-117. EPA-600-9-80-010.
Jacot, Brian. 1983. OVA Field Screening at a Hazardous Waste Site. Proceedings of the Fourth
National Conference on Management of Uncontrolled Hazardous Waste Sites. Washington, D.C.
Kroeck, R.M., and G.A. Shelton. 1982. Overhead Remote Sensing for Assessment of Hazardous Waste
Sites. U.S. EPA Environmental Monitoring Systems-Las Vegas. 600/X-82-019
Mehran, M. et al. 1983. Delineation of Underground Hydrocarbon Leaks by Organic Vapor Detection.
Proceedings of the Fourth National Conference on Management of Uncontrolled Hazardous Waste Sites.
Washington, D.C.
Quimby, J.M. et al. 1982. Evaluation and Use of a Portable Gas Chromatograph for Monitoring
Hazardous Waste Sites. Proceedings of the Third National Conference on Management of Uncontrolled
Hazardous Waste Sites. Washington, D.C.
Raveh, A. and Y. Avnimelech. 1979. Leaching of pollutants from sanitary landfill models. Journal
WPCF 51(11):2705-2716.
Ripley, B. 1982. Spatial Statistics. John Wiley & Sons. New York.
Spittler. T.M. 1980. Use of Portable Organic Vapor Detectors for Hazardous Waste Site
Investigations. Second Oil and Hazardous Materials Spill Conference and Exhibition. Philadelphia,
Pennsylvania.
C-14
-------
Spittler, T.M. et al. 1981. Ambient Monitoring for Specific Volatile Organics Using a Sensitive
Portable PID GC. Proceedings of the Second National Conference on Management of Uncontrolled
Hazardous Waste Sites. Washington, D.C.
Spittler, T.M. 1983. Field Measurement of PCBs in Soil and Sediment Using a Portable Gas
Chromatograph. Proceedings of the Fourth National Conference on Management of Uncontrolled
Hazardous Waste Sites. Washington, D.C.
U.S. EPA. 1977. Procedures Manual for Groundwater Monitoring at Solid Waste Disposal Facilities. EPA
530/SW-611.
. 1980. Environmental Monitoring and Support Laboratory. Cincinnati. Ohio, Interim Methods for
the Sampling and Analysis of Priority Pollutants in Sediments and Fish Tissue. October.
. 1981. Soil and Sediment Sampling Methods. Technical Methods for Investigating Sites
Containing Hazardous Substances. Technical Monograph No. 17.
. Handbook for Sampling and Sample Preservation of Water and Wastewater. EPA 600/4-82-029.
. 1983. Characterization of Hazardous Waste Sites - A Methods Manual, Volume II, Available
Sampling Methods. NTIS PB84-126929.
. 1983. Preparation of Soil Sampling Protocol: Techniques and Strategies NTIS PB83-20-6979
EPA 600/4-83-020.
. 1984. Soil Sampling Quality Assurance User's Guide. EPA 600/4-84-043
. 1984. Quality Assurance Handbook for Air Pollution Measurement Systems. Volumes I and 2, EPA
- 600/9-76-005. January 19.
. 1984. Standard Operating Safety Guides. Office of Emergency and Remedial Response.
. 1985. Characterization of Hazardous Waste Sites - A Methods Manual. Volume 1 - Site
investigation. EPA/600/4-84/075.
C-15
-------
APPENDIX D
REVIEW OF QAMS CHECKLIST
-------
APPENDIX D
REVIEW OF QAMS DQO CHECKLIST
In a memorandum dated April 3, 1983, Mr. Stanley Blacker, Director of the Quality Assurance
Management Staff (QAMS), issued a checklist to be used by QAMS staff during their review of
DQOs. The purpose of this appendix is to review the QAMS checklist with respect to this
RI/FS DQO guidance.
The QAMS checklist is designed for use in reviewing specific DQOs rather than for reviewing an
approach to DQO development for a complex process such as an RI/FS. This appendix presents a review of
the checklist items, along with a reference to the section where the item is addressed and/or
a comment regarding the applicability of the item to the RI/FS DQO process.
The RI/FS process involves multiple levels of data and data uses, and culminates in a
decision regarding the degree of remedial response to be implemented for a site. Decisions
are based on analytical and other measurement data which are often integrated to interpret
various aspects of a site's characteristics. Thus, many different sets of DQOs may be
required for a given RI/FS.
D-1
-------
APPENDIX D
SUMMARY OF DQO CHECKLIST ITEMS WITH RESPECT TO
RI/FS DQO APPLICABILITY
Each checklist item is listed below, followed by a comment on its RI/FS DQO applicability.

A-1. The decision maker and associated users are clearly identified.

     Comment: The key RI/FS decision is remedy selection (i.e., ROD/EDD signature). For the majority of
     RI/FS projects, remedy selection is delegated to the Regional Administrator (RA). Program
     management responsibilities are delegated to the Waste Management Division Director and Managers,
     with project-specific management and oversight assigned to Remedial Project Managers (RPMs). In
     this role the regional EPA RPM is responsible for coordinating the DQO development process and
     overseeing remedial contractors, state officials, or private parties conducting the RI/FS.
     Associated data users include primary, secondary, and technical support and project review/audit
     personnel.

A-2. The decision maker and associated data users have been involved in the development of DQOs.

     Comment: See Section 2.0, Stage 1 - Identify & Involve Data Users.

B-1a. A statement of the decision(s) that depend(s) on the results of this data collection activity.

     Comment: The decision(s) that result from the RI/FS process involve multiple levels of data for
     multiple purposes. See Section 3.0, Stage 1 - Specify RI/FS Objectives.

B-1b. If the data collection activity is of an exploratory nature and not formally linked with a
regulatory decision, then the document should include a clear explanation of the purpose for which the
environmental data are intended.

     Comment: See Section 4.0, Stage 2 - Identify Data Uses/Needs.
D-2
-------
APPENDIX D
(continued)
SUMMARY OF DQO CHECKLIST ITEMS WITH RESPECT TO
RI/FS DQO APPLICABILITY (continued)
B-2. Statements of each specific question that will be addressed in the data collection activity and
the type of conclusion that is anticipated as an appropriate answer to each question. The conclusions
should depend only on measurement data.

     Comment: See Section 4.0, Stage 2.

B-3. A clear statement of the way in which each conclusion of the study will be represented, in terms
of the results of statistical calculations made with the data.

     Comment: See Section 4.0, Stage 2 - The conclusions of an RI/FS study are highly interdependent.
     The format for data presentation will vary, based upon data quantity. A statistical approach may
     not be feasible.

B-4. Statements of the acceptable levels of precision and accuracy associated with each of the
conclusions that depend on measurement data.

     Comment: See Section 4.0, Stage 2.

B-5. A definition of the population to which each of the conclusions apply, including definitions of
all subpopulations or strata.

     Comment: See Section 4.0, Stage 2.

B-6. Definitions of the variables that will be measured.

     Comment: See Section 4.0, Stage 2.

B-7. The acceptable levels of precision and accuracy for the measurements to be made.

     Comment: See Section 4.0, Stage 2.

B-8. A flow chart or spread sheet illustrating the relationship between the measurement data and each
conclusion that will be made with the data.

     Comment: See Section 4.0, Stage 2.
D-3
-------
APPENDIX E
POTENTIALLY APPLICABLE
OR RELEVANT AND
APPROPRIATE REQUIREMENTS
(50 FR 47948)
-------
APPENDIX E
POTENTIALLY APPLICABLE OR RELEVANT AND APPROPRIATE REQUIREMENTS
1. EPA's Office of Solid Waste administers, inter alia, the Resource Conservation and Recovery Act
of 1976, as amended (Pub. L. 94-580, 90 Stat. 2795, 42 U.S.C. 6901 et seq.). Potentially
applicable or relevant requirements pursuant to that Act are:
a. Open Dump Criteria - Pursuant to RCRA Subtitle D criteria for classification of solid waste
disposal facilities (40 CFR Part 257).
Note: Only relevant to nonhazardous wastes.
b. In most situations Superfund wastes will be handled in accordance with RCRA Subtitle C
requirements governing standards for owners and operators of hazardous waste treatment,
storage, and disposal facilities: 40 CFR Part 264, for permitted facilities, and 40 CFR
Part 265, for interim status facilities.
Ground Water Protection (40 CFR 264.90-264.109).
Ground Water Monitoring (40 CFR 265.90-265.94).
Closure and Post Closure (40 CFR 264.110-264.120, 265.110-265.112).
Containers (40 CFR 264.170-264.178. 265.170-265.177).
Tanks (40 CFR 264.190-264.200, 265.190-265.199).
Surface Impoundments (40 CFR 264.220-264.249, 265.220-265.230).
Waste Piles (40 CFR 264.250-264.269, 265.250-265.258)
Land Treatment (40 CFR 264.270-264.299, 265.270-265.282).
Landfills (40 CFR 264.300-264.339, 265.300-265.316).
Incinerators (40 CFR 264.340-264.999, 265.340-265.369).
Dioxin-containing Wastes, (50 FR 1978). Includes the final rule for the listing of dioxin
containing waste.
2. EPA's Office of Water administers several potentially applicable or relevant and appropriate
statutes and regulations issued thereunder:
a. Section 14.2 of the Public Health Service Act as amended by the Safe Drinking Water Act as
amended (Pub. L. 93-523, 88 Stat 1660, 42 U.S.C. 300f et seq.)
Maximum Contaminant Levels (for all sources of drinking water exposure). (40 CFR
141.11-141.16)
Underground Injection Control Regulations. (40 CFR Parts 144, 145, 146, and 147)
E-l
-------
b. Clean Water Act as amended (Pub. L. 92-500, 86 Stat 816, 33 U.S.C. 1251 et seq.)
Requirements established pursuant to sections 301, 302, 303 (including State water quality
standards), 306, 307 (including Federal Pretreatment requirements for discharge into a
publicly owned treatment works), and 403 of the Clean Water Act. (40 CFR Parts 131,
400-469)
c. Marine Protection, Research, and Sanctuaries Act (33 U.S.C. 1401).
Incineration at sea requirements. (40 CFR Parts 220-225, 227, 228. See also 40 CFR
125.120-125.124)
3. EPA's Office of Pesticides and Toxic Substances
Toxic Substances Control Act (15 U.S.C. 2601).
PCB Requirements Generally: 40 CFR Part 761; Manufacturing, Processing, Distribution in
Commerce, and Use of PCBs and PCB Items (40 CFR 761.20-761.30); Markings of PCBs and PCB
Items (40 CFR 761.40-761.45); Storage and Disposal (40 CFR 761.60-761.79); Records and
Reports (40 CFR 761.180-761.185). See also 40 CFR 129.105, 750.
Disposal of Waste Material Containing TCDD. (40 CFR Parts 775.180-775.197).
4. EPA's Office of External Affairs
Section 404(b)(l) Guidelines for Specification of Disposal Sites for Dredged or Fill
Material (40 CFR Part 230).
Procedures for Denial or Restriction of Disposal Sites for Dredged Material (Section 404(c)
Procedures, 40 CFR Part 231).
5. EPA's Office of Air and Radiation administers several potentially applicable or relevant and
appropriate statutes and regulations issued thereunder:
a. The Uranium Mill Tailings Radiation Control Act of 1978 (42 U.S.C. 2022).
Uranium mill tailing rules - Health and Environmental Protection Standards for Uranium and
Thorium Mill Tailings
(40 CFR Part 192).
b. Clean Air Act (42 U.S.C. 7401).
National Ambient Air Quality Standards for total suspended particulates (40 CFR 50.6-
50.7)
National Ambient Air Quality Standards for ozone (40 CFR 50.9).
Standards for Protection Against Radiation - high and low level radioactive waste rule. (10
CFR Part 20). See also 10 CFR Parts 10, 40, 60, 61, 72, 960, 961.
National Emission Standard for Hazardous Air Pollutants for Asbestos. (40 CFR
61.140-61.156). See also 40 CFR 427.110-427.116, 763. National Emission Standard for
Hazardous Air Pollutants for Radionuclides (40 CFR Part 61, 10 CFR 20.101-20.108).
E-2
-------
6. Other Federal Requirements
a. OSHA requirements for workers engaged in response activities are codified under the
Occupational Safety and Health Act of 1970 (29 U.S.C. 651). The relevant regulatory
requirements are included under:
Occupational Safety and Health Standards (General Industry Standards) (29 CFR Part 1910).
The Safety and Health Standards for Federal Service Contracts (29 CFR Part 1926).
The Shipyard and Longshore Standards (29 CFR Parts 1915, 1918).
Recordkeeping, reporting, and related regulations (29 CFR Part 1904).
b. Historic Sites, Buildings, and Antiquities Act (16 U.S.C. 461).
c. National Historic Preservation Act, 16 U.S.C. 470. Compliance with NEPA required pursuant
to 7 CFR Part 650, Protection of Archaeological Resources: Uniform Regulations --
Department of Defense (32 CFR Part 229, 229.4), Department of the Interior (43 CFR Part 7,
7.4).
d. D.O.T. Rules for the Transportation of Hazardous Materials, 49 CFR Parts 107, 171.1-171.500.
Regulation of activities in or affecting waters of the United States pursuant to 33 CFR
Parts 320-329. The following requirements are also triggered by Fund-financed actions:
Endangered Species Act of 1973, 16 U.S.C. 1531. (Generally, 50 CFR Parts 81, 225, 402).
Wild and Scenic Rivers Act, 16 U.S.C. 1271.
Fish and Wildlife Coordination Act, 16 U.S.C. 661 note.
Fish and Wildlife Improvement Act of 1978, and Fish and Wildlife Act of 1956, 16 U.S.C. 742a
note.
Fish and Wildlife Conservation Act of 1980, 16 U.S.C. 2901. (Generally, 50 CFR Part 83).
Coastal Zone Management Act of 1972, 16 U.S.C. 1451. (Generally, 15 CFR Part 930 and 15
CFR 923.45 for Air and Water Pollution Control Requirements).
OTHER FEDERAL CRITERIA, ADVISORIES, GUIDANCES,
AND STATE STANDARDS TO BE CONSIDERED
1. Federal Criteria, Advisories and Procedures
Health Effects Assessments (HEAs).
Recommended Maximum Concentration Limits (RMCLs).
Federal Water Quality Criteria (1976, 1980, 1984). Note: Federal Water Quality Criteria
are not legally enforceable. State water quality standards are legally enforceable, and are
developed using appropriate aspects of Federal Water Quality Criteria. In many cases, State
E-3
-------
water quality standards do not include specific numerical limitations on a large number of
priority pollutants. When neither State standards nor MCLs exist for a given pollutant,
Federal Water Quality Criteria are pertinent and therefore are to be considered.
Pesticide registrations.
Pesticide and food additive tolerances and action levels. Note: Germane portions of
tolerances and action levels may be pertinent and therefore are to be considered in certain
situations.
Waste load allocation procedures, EPA Office of Water.
Federal sole source aquifer requirements.
Public health basis for the decision to list pollutants as hazardous under section 112 of
the Clean Air Act.
EPA's Ground-water Protection Strategy.
New Source Performance Standards for Storage Vessels for Petroleum Liquids.
TSCA health data.
Pesticide registration data.
TSCA chemical advisories (2 or 3 issued to date).
Advisories issued by FWS and NWFS under the Fish and Wildlife Coordination Act.
Executive Orders related to Floodplains (11988) and Wetlands (11990) as implemented by EPA's
August 6, 1985, Policy on Floodplains and Wetlands Assessments for CERCLA Actions.
TSCA Compliance Program Policy.
OSHA health and safety standards that may be used to
protect public health (non-workplace).
Health Advisories, EPA Office of Water.
2. State Standards
State Requirements on Disposal and Transport of Radioactive wastes.
State Approval of Water Supply System Additions or Developments.
State Ground Water Withdrawal Approvals.
Requirements of authorized (Subtitle C of RCRA) State hazardous waste programs.
State Implementation Plans and Delegated Programs Under Clean Air Act.
All other State requirements, not delegated through EPA authority.
Approved State NPDES programs under the Clean Water Act.
E-4
-------
o Approved State UIC programs under the Safe Drinking Water Act.
Note: Many other State and local requirements could be pertinent. Forthcoming guidance
will include a more comprehensive list.
3. USEPA RCRA Guidance Documents
o Draft Alternate Concentration Limits (ACL) Guidance
A. EPA's RCRA Design Guidelines
1. Surface Impoundments, Liner Systems, Final Cover and Freeboard Control.
2. Waste Pile Design - Liner Systems.
3. Land Treatment Units.
4. Landfill Design - Liner Systems and Final Cover.
B. Permitting Guidance Manuals
1. Permit Applicant's Guidance Manual for Hazardous Waste Land Treatment, Storage, Disposal
Facilities.
2. Permit Writer's Guidance Manual for Hazardous Waste Land Treatment, Storage, and Disposal
Facilities.
3. Permit Writer's Guidance Manual for Subpart F.
4. Permit Applicants Guidance Manual for the General Facility Standards.
5. Waste Analysis Plan Guidance Manual.
6. Permit Writer's Guidance Manual for Hazardous Waste Tanks.
7. Model Permit Application for Existing Incinerators.
8. Guidance Manual for Evaluating Permit Applications for the Operation of Hazardous Waste
Incinerator Units.
9. A guide for Preparing RCRA Permit Applications for Existing Storage Facilities.
10. Guidance Manual on Closure and Post-Closure Interim Status Standards.
C. Technical Resource Documents (TRDs)
1) Evaluating Cover Systems for Solid and Hazardous Waste.
2) Hydrologic Simulation of Solid Waste Disposal Sites.
3) Landfill and Surface Impoundment Performance Evaluation.
4) Lining of Water Impoundment and Disposal Facilities.
E-5
-------
5) Management of Hazardous Waste Leachate.
6) Guide to the Disposal of Chemically Stabilized and Solidified Waste.
7) Closure of Hazardous Waste Surface Impoundments.
8) Hazardous Waste Land Treatment.
9) Soil Properties, Classification, and Hydraulic Conductivity Testing.
D. Test Methods for Evaluating Solid Waste
1) Solid Waste Leaching Procedure Manual.
2) Methods for the Prediction of Leachate Plume Migration
and Mixing.
3) Hydrologic Evaluation of Landfill Performance (HELP) Model Hydrologic Simulation on Solid
Waste Disposal Sites.
4) Procedures for Modeling Flow Through Clay Liners to
Determine Required Liner Thickness.
5) Test Methods for Evaluating Solid Wastes.
6) A Method for Determining the Compatibility of Hazardous
Wastes.
7) Guidance Manual on Hazardous Waste Compatibility.
4. USEPA Office of Water Guidance Documents
A. Pretreatment Guidance Documents
1) 304(g) Guidance Document - Revised Pretreatment Guidelines (3 Volumes)
B. Water Quality Guidance Documents
1) Ecological Evaluation of Proposed Discharge of Dredged Material into Ocean Waters (1977)
2) Technical Support Manual: Waterbody Surveys and Assessments for Conducting Use
Attainability Analyses (1983)
3) Water-Related Environmental Fate of 129 Priority Pollutants (1979)
4) Water Quality Standards Handbook (1983)
5) Technical Support Document for Water Quality-based Toxics Control.
C. NPDES Guidance Documents
1) NPDES Best Management Practices Guidance Manual (June 1981)
2) Case studies on toxicity reduction evaluation (May 1983).
E-6
-------
D. Ground Water/UIC Guidance Document
1) Designation of a USDW
2) Elements of Aquifer Identification
3) Interim guidance for public participation
4) Definition of major facilities
5) Corrective action requirements
6) Requirements applicable to wells injecting into, through or above an aquifer which has been
exempted pursuant to Section 146.104(b)(4).
7) Guidance for UIC implementation on Indian lands.
5. USEPA Manuals from the Office of Research and Development
1) SW-846 methods - laboratory analytical methods.
2) Lab protocols developed pursuant to Clean Water Act
Section 304(h).
E-7
-------
APPENDIX F
HISTORICAL PRECISION AND ACCURACY
DATA CLASSIFIED BY MEDIA
BY ANALYTICAL LEVEL
-------
APPENDIX F CONTENTS
HISTORICAL PRECISION AND ACCURACY TABLES
Introduction
Water: Level III
Water: Level IV
Soil: Level I
Soil: Level II
Soil: Level III
Soil: Level IV
Air: Level I
Air: Level II
Air: Level III
Other Media: Level III
F-l
-------
INTRODUCTION
The data in this Appendix have been compiled to assist the reader in
selecting an analytical method appropriate for each data use. The methods
are classified by media and by analytical levels defined as follows:
Level I - field screening or analysis using portable instruments.
Results are often not compound specific and not quantitative but
results are available in real-time.
Level II - field analysis using more sophisticated portable
analytical instruments; in some cases, the instruments may be set up
in a mobile or onsite laboratory. There is a wide range in the
quality of data that can be generated. Quality depends on the use
of suitable calibration standards, reference materials, and sample
preparation equipment; and the training of the operator. Results
are available in real-time or several hours.
Level III - all analyses performed in an offsite analytical
laboratory using standard, documented procedures. The laboratory
may or may not be a CLP laboratory.
Level IV - CLP routine analytical services (RAS). All analyses are
performed in an offsite CLP analytical laboratory following CLP
protocols.
Precision and accuracy data are presented in tabular fashion. Footnotes to
each table cite the sources of the data and the concentration or
concentration range at which the precision and accuracy were determined.
When no concentration is cited, no concentration information was available
in the source material.
Precision is a measure of the variability in repeated measurements of the
same sample compared to the average value. Precision is reported as %
Relative Standard Deviation (RSD). The lower the % RSD, the more precise
the data.
F-2
-------
RSD is calculated for a pair of replicates using the following formula:
%RSD = [ 2|X1 - X2| / (X1 + X2) ] x (100/√2)
where X1 is measurement #1 of a replicate pair
X2 is measurement #2 of a replicate pair
Accuracy is reported as % Bias; as % Bias approaches zero, accuracy
increases. Bias is calculated by the following formula:
% Bias = [ (X - Y)/Y ] x 100
where Y is the known concentration or true value
X is the reported concentration
Bias measures the systematic error within an analytical technique.
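The two formulas above translate directly into a short calculation. The Python sketch below is
illustrative only; the function names and the example concentrations are hypothetical and are not
taken from the tables that follow. It simply restates the %RSD and % Bias expressions defined above.

    import math

    def percent_rsd(x1, x2):
        # %RSD for a duplicate pair: [2|X1 - X2| / (X1 + X2)] x (100 / sqrt(2))
        return (2.0 * abs(x1 - x2) / (x1 + x2)) * (100.0 / math.sqrt(2.0))

    def percent_bias(x_reported, y_true):
        # % Bias: [(X - Y) / Y] x 100; values near zero indicate higher accuracy
        return (x_reported - y_true) / y_true * 100.0

    # Hypothetical duplicate results of 4.2 and 4.8 ug/l
    print(round(percent_rsd(4.2, 4.8), 1))    # about 9.4 %RSD
    # Hypothetical reported value of 9.0 ug/l for a 10.0 ug/l true concentration
    print(round(percent_bias(9.0, 10.0), 1))  # -10.0 % Bias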
F-3
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER a
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTES
BROMODICHLOROMETHANE
BROMOFORM
METHOD CONCENTRATION
(TECHNIQUE) RANGE
624
(GC/MS)
8240
(GC/MS)
624
(GC/MS)
501.1
(PURGE & TRAP GC/MS)
501.2
(EXTRACTION GC/MS)
624
(GC/MS)
501.1
(PURGE & TRAP GC/MS)
501.2
(EXTRACTION GC/MS)
11 ug/l
480 ug/l
5-100 ug/l
8 ug/l
480 ug/l
0.9 ug/l
550 ug/l
1.8 ug/l
170 ug/l
9 ug/l
400 ug/l
4.8 ug/l
550 ug/l
6 ug/l
170 ug/l
PRECISION
% RSD
16
21
21
28
18
66
34
61
23
32
30
44
41
14
15
ACCURACY
% BIAS
0
-16
12
-8.8
-6.7
0
-3.8
33
-19
-23
10
-27
7.5
-23
1.8
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER a
(continued)
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTES
CHLOROFORM
DIBROMOCHLOROMETHANE
DIOXIN
METHOD CONCENTRATION
(TECHNIQUE) RANGE
624
(GC/MS)
501.1
(PURGE & TRAP GC/MS)
501.2
(EXTRACTION GC/MS)
624
(GC/MS)
501.1
(PURGE & TRAP GC/MS)
501.2
(EXTRACTION GC/MS)
613
(GC/MS)
4.5 ug/l
300 ug/l
0.9 ug/l
550 ug/l
1.8 ug/l
170 ug/l
8.1 ug/l
360 ug/l
0.8 ug/l
550 ug/l
1.8 ug/l
170 ug/l
21 ng/l
202 ng/l
PRECISION
% RSD
31
14
64
14
68
26
13
19
35
36
37
13
25
21
ACCURACY
% BIAS
2.2
-0.6
44
-0.02
-39
-1.2
-3.1
10
-12.5
4.7
0
0.02
N.A.
N.A.
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER a
(continued)
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTES
METHYLENE CHLORIDE
TOLUENE
TRICHLOROETHENE
LEAD
METHOD
(TECHNIQUE)
624
(GC/MS)
624
(GC/MS)
8240
(GC/MS)
624
(GC/MS)
8240
(GC/MS)
200.7
(ICP)
239.1
(FLAME AA)
239.2
(FURNACE AA)
CONCENTRATION
RANGE
7.2 ug/l
480 ug/l
13.5 ug/l
600 ug/l
25 ug/l
75 ug/l
5.4 ug/l
360 ug/l
25 ug/l
75 ug/l
42 ug/l
47.7 ug/l
12 ug/l
105 ug/l
10 ug/l
234 ug/l
PRECISION
% RSD
78
52
19
31
19
48
39
24
34
5
5.9
6.7
53
19
ACCURACY
% BIAS
-17
-25
15
-14
-10
44
-2.3
5
31
4.4
17
-1.9
-22
-3.1
a. Source: Draft Compendium of Information and Performance Data on Routinely Used Measurement Methods (RUMM) - Pilot Phase,
RTI/3087/03, prepared for EPA Quality Assurance Management Staff, January 1986. This document should be
consulted for more information on individual analytes.
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER
(Continued)
LEVEL III ANALYTICAL TECHNIQUES - SW-846 METHODS
Method                                           Data       Range of        Precision      MDL
Number    Method Name                            Source     Recovery (%)    (%)            (mg/l)
ORGANICS
8010      Halogenated Volatile Organics          SW 846     75.1 - 106.1    2.0 - 25.1     0.03 - 0.52
8020      Aromatic Volatile Organics             SW 846     77.0 - 120      9.4 - 27.7     0.2 - 0.4
8030      Acrolein, Acrylonitrile, Acetonitrile  SW 846     96 - 107        5.6 - 11.6     0.5 - 0.6
8040      Phenols                                SW 846     41 - 86         7.9 - 16.5     0.58 - 2.2
8060      Esters                                 EPA 606    82 - 94         1.3 - 6.5      0.29 - 3.0
8080      Organochlorine Pesticides and PCBs     SW 846     86 - 97         1.3 - 6.5      0.29 - 3.0
8090      Nitroaromatics and Cyclic Ketones      SW 846     63 - 71         3.1 - 5.9      0.06/ND
8100      Polynuclear Aromatic Hydrocarbons                 NA b            NA             NA
8120      Chlorinated Hydrocarbons               SW 846     76 - 99         10 - 25        0.03 - 1.34
8140      Organophosphorous Pesticides           SW 846     56.5 - 120.7    5.3 - 19.9     0.1 - 5.0
8150      Chlorinated Herbicides                            NA              NA             0.1 - 200
8240      Volatile Organics                      SW 846     95 - 107        9 - 28         1.6 - 6.9
8250      GC/MS Semivolatiles (Packed Column)    SW 846     41 - 143        20 - 145       0.9 - 44
8270      GC/MS Semivolatiles (Capillary)                   NA              NA             NA
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER
(Continued)
LEVEL III ANALYTICAL TECHNIQUES - SW-846 METHODS
Method                                           Data        Range of        Precision        MDL
Number        Method Name                        Source      Recovery (%)    (%)              (mg/l)
8310          Polynuclear Aromatic               SW 846      78 - 116        7.3 - 12.9       0.03 - 2.3
              Hydrocarbons (HPLC)
INORGANICS:
              Metals (ICAP)                      EPA 200.7   NA              3 - 21.9 (RSD)   1.3 - 75 ug/l
7000 Series   Metals (FLAME)                     EPA 200     NA              NA               0.01 - 5
              Metals (FLAMELESS/GF)              EPA 200     NA              NA               0.001 - 0.2 ug/l
7470          Metals (MERCURY)                   EPA 245.2   87 - 125        0.9 - 4.0        0.0002
9010          Cyanides                           EPA 335.2   85 - 102        0.2 - 15.2       0.02 ug/l
9030          Sulfides                           EPA 376.1   NA              NA               1 ug/l
a. For water only
b. NA Not Available
NOTES: Method Detection Limit (MDL) as listed on this table is the minimum concentration of a substance
that can be measured and reported with 99% confidence that the value is above zero.
Accuracy, presented as an average percent recovery, was determined from replicate (10-25) analyses
of water and wastewater samples fortified with known concentrations of the analyte of interest at
or near the detection limit. In most cases this was less than 10 times the MDL.
Precision data are used to measure the variability of these repetitive analyses, reported as a
single standard deviation or as a percentage of the recovery measurements. For presentation
purposes, accuracy, precision, and MDL information is presented as a range of individual
values for every analyte covered by the procedure. If specific information on a particular
compound is required, the specific analytical method cited should be consulted.
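The note above describes accuracy as an average percent recovery from replicate analyses of
fortified samples and precision as the spread of those replicates. The Python sketch below is a
minimal illustration of that bookkeeping; the function names and the replicate values are
hypothetical and are not drawn from the methods cited in the table.

    import statistics

    def percent_recoveries(measured, spike_conc):
        # Percent recovery of each replicate analysis of a fortified (spiked) sample
        return [m / spike_conc * 100.0 for m in measured]

    def summarize(measured, spike_conc):
        # Accuracy as the average percent recovery; precision as the standard
        # deviation of the replicates and as %RSD of the measurements
        recoveries = percent_recoveries(measured, spike_conc)
        mean_recovery = statistics.mean(recoveries)
        std_dev = statistics.stdev(measured)
        pct_rsd = std_dev / statistics.mean(measured) * 100.0
        return mean_recovery, std_dev, pct_rsd

    # Ten hypothetical replicate results (ug/l) for a 10.0 ug/l spike
    replicates = [9.1, 9.8, 10.4, 9.5, 10.1, 8.9, 9.7, 10.2, 9.4, 9.9]
    print(summarize(replicates, 10.0))   # approximately (97.0, 0.48, 5.0)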
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER a
LEVEL IV ANALYTICAL TECHNIQUES - CLP RAS METHODS
ANALYTES
Volatilesb
Methylene chloride
1,1-Dichloroethene
1,1-Dichloroethane
Trans-1,2-Dichloroethene
Chloroform
1,2-Dichloroethane
1,1,1-Trichloroethane
Carbon Tetrachloride
1,1,2,2-Tetrachloroethane
Bromodichloromethane
1,2-Dichloropropane
Trans-1,3-Dichloropropene
Trichloroethene
Dibromochloromethane
1,1,2-Trichloroethane
Benzene
Cis-1,3-Dichloropropene
Bromoform
Tetrachloroethene
Toluene
Chlorobenzene
Ethyl Benzene
Semivolatiles
bis(2-Chloroethyl)ether
2-Chlorophenol
1,3-Dichlorobenzene
1,4-Dichlorobenzene
1,2-Dichlorobenzene
2-Methylphenol
bis(2-Chloroisopropyl)ether
TECHNIQUE
Purge & Trap GC/MS
CONCENTRATION
RANGE
N.A.C
GC/MS
N.A.
PRECISION
% RSD
56
20
13
31
12
13
19
12
11
19
18
31
17
14
11
12
22
16
13
14
14
4
24
29
24
21
29
29
25
ACCURACY
% Bias
+36.6
-26.3
-46.4
-21.7
-21.1
+2.4
-41.0
-32.1
-5.8
-13.0
-12.9
-41.2
-22.8
-3.3
-7.0
-3.3
-35.5
+6.5
-42.5
-23.3
-15.9
-31.9
-16
-21
-48
-25
-28
-30
-22
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER
LEVEL IV ANALYTICAL TECHNIQUES - CLP RAS METHODS
ANALYTES
Semivolatiles
4-Methylphenol
N-Nitroso-di-n-propylamine
Nitrobenzene
Isophorone
2-Nitrophenol
bis(2-Chloroethoxy)methane
2,4-Dichlorophenol
1,2,4-Trichlorobenzene
Naphthalene
4-Chloro-3-methylphenol
2,4,6-Trichlorophenol
2-Chloronaphthalene
Acenaphthene
2,4-Dinitrophenol
2,4-Dinitrotoluene
2,6-Dinitrotoluene
4-Chlorophenyl-phenylether
Fluorene
4,6-Dinitro-2-methylphenol
4-Bromophenyl-phenylether
Hexachlorobenzene
Pentachlorophenol
Phenanthrene
Fluoranthene
Benzo(b)fluoranthene
Benzo(a)pyrene
TECHNIQUE
GC/MS
CONCENTRATION
RANGE
N.A.C
PRECISION
% RSD
33
31
32
23
30
34
29
30
44
26
25
24
28
24
34
25
34
25
30
32
36
31
21
42
39
42
ACCURACY
% Bias
-36
+0.3
-23
-8
-21
-2.6
-20
-47
-38
-32
-17
+3.4
-12
-23
-33
-48
+12
-24
-13
-0.1
-42
-24
-28
-15
-10
-29
-------
HISTORICAL PRECISION AND ACCURACY DATA/WATER
(continued)
LEVEL IV ANALYTICAL TECHNIQUES - CLP RAS METHODS
ANALYTES
Metals e
Aluminum
Antimony
Arsenic
Barium
Beryllium
Cadmium
Calcium
Chromium
Cobalt
Copper
Iron
Lead
Magnesium
Manganese
Mercury
Nickel
Potassium
Selenium
Sodium
Thallium
Tin
Vanadium
Zinc
TECHNIQUE
ICP
ICP
Furnace AA
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
Furnace AA
ICP
ICP
Cold Vapor
ICP
ICP
Furnace AA
ICP
Furnace AA
ICP
ICP
ICP
CONCENTRATION
RANGE
1000-3000 ug/1
180-600 ug/1
50-150
800-1500
30-45
25-50
1000-30000
50-150
200-1000
125-250
200-800
30
10000-40000
30-150
5-20
160
10000-20000
50
10000-45000
80-100
160
60-200
50-800
PRECISION
% RSD
9.1
11
9.4
6.8
15
12
6.0
9.8
6.7
6.7
10.4
32
6.6
6.2
18.8
9.0
16.2
8.7
8.7
17.2
N.A.
7.6
9.1
ACCURACY
% Bias
-4.3
-9.2
-8.3
-3.9
+3.7
-3.3
-1.6
-2.6
-2.9
-1.1
+6.5
-0.7
-2.5
-1.0
-14.4
-2.5
-12.1
-5.7
-2.8
-4.2
-2.5
-0.46
+3.0
a. Source: Quality Control in Remedial Site Investigation: Hazardous and Industrial Solid Waste Testing, Fifth Volume,
ASTM STP 925, C.L. Perket, Ed., American Society for Testing Materials, Philadelphia, 1986.
b. Volatile precision and accuracy data from 26-34 laboratories' results on quarterly blind performance evaluation
samples; 29-152 data points for each compound.
c. N.A. - Not Available.
d. Semi volatile precision and accuracy data from 1985 preaward program data; 22-227 data points for each compound.
e. Metals precision and accuracy data is based on performance evaluation sample results from 18 laboratories; number
of data points is not given.
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOILS
LEVEL I FIELD SCREENING TECHNIQUES
MEASUREMENT
RESISTIVITY
TERRAIN
CONDUCTANCE
TERRAIN
CONDUCTANCE
Magnetic Field
Intensity
Subsurface
Lithology
Changes
Subsurface
Lithology
Changes
INSTRUMENT
(TECHNIQUE)
Bison 2390 T/R
(Resistivity meter)
EM 31
(conductivity)
EM 34-3
(conductivity)
EDA - Omni IV
(Magnetometer)
SIR-8
(Ground Penetrating
Radar)
EG+G 1225
(Seismograph)
INSTRUMENT
RANGE
0-1999
millivolts
0-1000
millimhos/meter
0-300
millimhos/meter
18000-110000
gammas
1-81 dielectric
constant
0-2000
milliseconds
INSTRUMENT .
PRECISION
at 1% range setting,
0-5% of full scale
2% of full scale
2% of full scale
0.02 gamma
N/Ad
N/Ad
INSTRUMENT
ACCURACY
2% of measured
value
5% at 20 millimhos/meter
5% at 20 millimhos/meter
1 gamma at 50000 gammas
at 23oC
N/A
0.01%
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOIL
(continued)
LEVEL I FIELD SCREENING TECHNIQUES
I
I
UJ
MEASUREMENT
TOTAL
VOLATILE
ORGANICS
INSTRUMENT FIELD SCREENING
(TECHNIQUE) RESULTS in ppm (X)
PHOTO VAC 11.4
(GC/Photoionization) 22.0
56.0
139
70.0
24.9
60.0
6.6
12.1
8.7
CLP
RESULTS in ppm (Y)
26.9
32.8
129.7
228.0 & 258.0
126.7
2823.0
53.3
0.056
0.032
0.024
ACCURACY6
(% Bias)
-57.6
-32.9
-56.8
-42.8
-44.8
+99.1
+12.6
+116.9
+377.1
+361.5
a. Source: Manufacturers' manuals unless otherwise cited. Mention of specific models does not constitute
an endorsement of these instruments.
b. Precision refers to reproducibility of meter or instrument reading as cited in instrument specifications.
c. Accuracy refers to instrument specifications unless otherwise cited.
d. N.A. = not available.
e. Accuracy of PhotoVac field screening results calculated by assuming that CLP results on the same samples
were completely accurate. % Bias = 100 (X-Y)/Y. Source of these data is CDM project files.
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOIL"
LEVEL II FIELD TECHNIQUES
I
J>
ANALYTES INSTRUMENT FIELD RESULTS
fTECHNIQUE) IN ppm (x)
PCBs HNu 301 6.0
(GC/ELECTRON 6.0
CAPTURE) 6.0
9.0
13.0
14.0
14.0
21.0
35.0
41.0
48.0
50.0
65.0
67.0
92.0
95.0
11
202
269
286
1215
1647
3054
CLP RESULTS
IN ppm (y)
22.0
6.1
510.0
3.9
3.0
3.1
23.5
8.1
7.7
2.1
11.0
460.0
23.1
18.7
75.0
30.0
12.3
99.0
370.0
80.5
640.0
1040.0
9,300
ACCURACY b
% BIAS
-72.7
-1.6
-98.8
+56.7
+333.3
+351.6
-40.4
+ 159.3
354.5
+1,852
+336.3
-89.1
+181.4
+258.3
22.7
+216.7
-10.6
+ 104.0
-27.3
+255.3
+90.0
+58.4
-67.2.
a. Source: CDM Project files.
b. Source: Accuracy calculated by assuming that CLP results on the same samples were completely accurate. % Bias = 100 (X-Y)/Y.
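Footnote b restates the comparison used throughout the Level I and Level II tables: each field
result X is compared against the CLP result Y for the same sample, with the CLP value treated as
the true value. The Python sketch below applies that calculation to three of the field/CLP pairs
tabulated above; the function name is illustrative and is not part of this guidance.

    def percent_bias_vs_clp(field_result, clp_result):
        # % Bias of a field screening result (X) against the CLP result (Y)
        # on the same sample, treating the CLP value as completely accurate
        return 100.0 * (field_result - clp_result) / clp_result

    # Three of the paired PCB results (ppm) from the table above
    pairs = [(6.0, 22.0), (14.0, 3.1), (50.0, 460.0)]
    for x, y in pairs:
        print(f"field={x:8.1f}  clp={y:8.1f}  bias={percent_bias_vs_clp(x, y):+8.1f}%")
    # prints -72.7, +351.6, and -89.1, matching the % Bias column above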
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOIL3
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTE METHOD CONCENTRATION PRECISION ACCURACY
(TECHNIQUE) RANGE % RSD % BIAS
DIOXINS 8280 5 ppb 6-30 N.A.
(HPLC/LRMS) 125 ppb 3-10 N.A.
JAR EXTRACTION GC/MS 1 ppb 20 0
10 ppb 10 -18
a. Source: Draft Compendium of Information and Performance Data on Routinely Used Measurement Methods (RUMM) - Pilot Phase,
RTI/3087/03, prepared for EPA Quality Assurance Management Staff, January 1986. This document should be
consulted for more information on individual analytes.
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOILS3
LEVEL IV ANALYTICAL TECHNIQUES - CLP RAS METHODS
CONCENTRATION PRECISION ACCURACY
TECHNIQUE RANGE % RSD % Bias
Volatiles Purge & Trap GC/MS N.A.C
Chloroform g Q -01
1,2-Dichloroethane                                    13.1        +11.1
Dibromochloromethane                                  35.0        -12.0
Benzene 32^ _1(J;3
Bromoform                                             16.6        -12.1
2-Hexanone                                            16.6        -45.5
SJUenL 13-8 +13.7
Chlorobenzene 21.2 +13.2
Semivolatiles GC/MS N.A.C
1,4-Dichlorobenzene 27 -51
Nitrobenzene 21 -48
Isophorone 24 -47
2-Nitrophenol 35 _^g
2,4-Dichlorophenol 31 _gg
1,2,4-Trichlorobenzene 28 -43
Pentachlorophenol                                     17          -49
Pyrene 25 -15
2-Methylnaphthalene 26 -42
bis(2-Ethylhexyl)phthalate                            33          -2
Phenol                                                38          -27
Acenaphthylene 26 -27
Diethylphthalate                                      16          -20
Dioxin
2,3,7,8-TCDD                       1-10 ug/kg         15          -11.5
-------
HISTORICAL PRECISION AND ACCURACY DATA/SOILS3
(continued)
LEVEL IV ANALYTICAL TECHNIQUES - CLP RAS METHODS
                                        CONCENTRATION        PRECISION      ACCURACY
ANALYTES        TECHNIQUE               RANGE                % RSD          % Bias
Metals f
Aluminum        ICP                     2-22600 ug/kg        14.4           -78.8
Cadmium         ICP                     5.5-20               33.3           +2.9
Calcium         ICP                     2664-29000           N.A.           -4.2
Chromium        ICP                     8.5-29600            7.8            -6.1
Copper          ICP                     33-109               11.2           -2.5
Iron            ICP                     5028-113000          10.7           -27.0
Lead            Furnace AA              11.5-714             9.2            -2.2
Magnesium       ICP                     2428-7799            7.5            -10.6
Manganese       ICP                     73.5-785             9.4            -15.1
Mercury         Cold Vapor              1.1-26.5             25.0           -9.1
Nickel          ICP                     44-67                15.0           -17.0
Tin             ICP                     N.A.                 44.1           N.A.
Zinc            ICP                     19-1720              5.8            -6.2
a. Source: Quality Control in Remedial Site Investigation: Hazardous and Industrial Solid Waste Testing, Fifth Volume,
ASTM STP 925, C.L. Perket, Ed., American Society for Testing Materials, Philadelphia, 1986.
b. Volatiles precision and accuracy data is based on 1985 preaward analysis results from laboratories awarded
contracts; 6-14 data points for each compound.
c. N.A. - Not Available.
d. Semivolatiles precision and accuracy data is based on 1985 preaward analysis results; 9-20 data points
for each compound.
e. Dioxin precision and accuracy data is based on results of four performance evaluation samples including
120 data points.
f. Metals precision and accuracy data is based on performance evaluation sample results from 18 laboratories;
number of data points is not given.
-------
HISTORICAL PRECISION AND ACCURACY DATA/AIRa
LEVEL I FIELD SCREENING TECHNIQUES
ANALYTES
Organics
Organics
Organics
Organics
INSTRUMENT
(TECHNIQUE)
Century OVA-128
(Flame Ionization)
HNu PI-101
(Photoionization)
AID - 710
(Flame Ionization)
PhotoVac
(GC-Photoion-
ization)
INSTRUMENT
RANGE
0.1 - 1000 ppm
Methane
0.1 - 2000 ppm
Benzene
0.1 - 2000 ppm
Methane
N.A.
INSTRUMENT
SENSITIVITY
0.1 ppm Methane
0.1 ppm Benzene
0.1 ppm Methane
0.001 ppm
Benzene
INSTRUMENT
PRECISION
N.A.
+ 1% of full scale
deflection
N.A.d
N.A.
a. Source: Manufacturers' manuals unless otherwise cited. Mention of specific models
does not constitute an endorsement of these instruments.
b. It is difficult to differentiate between Level I and Level II techniques and
instrumentation. Several instruments may be used at both levels.
c. Sensitivity and precision refer to instrument specifications.
d. N.A. = Not Available.
-------
HISTORICAL PRECISION AND ACCURACY DATA/AIR
LEVEL II FIELD TECHNIQUES
ANALYTES
Organics
Compound
Specific
Organics,
Compound-
Specific
Organics,
Compound-
Specific
Organics,
Compound-
Specific
Mercury
INSTRUMENT
(TECHNIQUE)
Miran 1B
(Infrared)
Century OVA-128
(GC/Flame
Ionization)
PhotoVac
(GC-Photo-
ionization)
SCENTOR
(Argon Ionization
or Electron Capture)
Gold film Mercury
Analyzer
INSTRUMENT
RANGE
Compound Dependent,
0-2000 ppm
1-1000 ppm
Methane
N.A.
N.A.
N.A.
INSTRUMENT
SENSITIVITY0
N.A.d
N.A.
0.001 ppm
Benzene
0.001 ppm
Benzene
less than
0.01 ppm
INSTRUMENT
PRECISION
N.A.d
N.A.
N.A.
N.A.
N.A.
a. Source: Manufacturers' manuals. Mention of specific models does not constitute an
endorsement of these instruments.
b. It is difficult to differentiate between Level I and Level II techniques and
instrumentation. Several instruments may be used at both levels.
c. Sensitivity and precision refer to instrument specifications.
d. N.A. = Not Available.
-------
HISTORICAL PRECISION AND ACCURACY DATA/AIR a
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTES
BENZENE
METHOD
(TECHNIQUE)
CRYOGENIC TRAP/GC
TENAX GC/MS
10
O
CONCENTRATION
RANGE
3.9 ppb
93 ppb
7.8 ug/m3
4.5 ug/m3
PRECISION
% RSD
4.0
5.1
11
21
ACCURACY
% BIAS
N.A.
N.A.
N.A.
N.A.
TOLUENE
10.8 ppb
5.11
N.A.
TRICHLOROETHENE
3.5 ppb
84 ppb
4.1
3.7
N.A.
N.A.
VINYL CHLORIDE
7.8 ppb
6.37
N.A.
LEAD
40 CFR 50, APP G
(FLAME AA)
0.6 ug/m3
8.01 ug/m3
8.6
3.9
0
-3.6
a. Source: Draft Compendium of Information and Performance Data on Routinely Used Measurement Methods (RUMM) - Pilot Phase,
RTI/3087/03, prepared for EPA Quality Assurance Management Staff, January 1986. This document should be
consulted for more information on individual analytes.
-------
HISTORICAL PRECISION AND ACCURACY DATA/OTHER MEDIA a
LEVEL III ANALYTICAL TECHNIQUES - METHODS OTHER THAN CLP RAS METHODS
ANALYTE METHOD CONCENTRATION PRECISION ACCURACY
(TECHNIQUE) MEDIUM RANGE % RSD % BIAS
LEAD 6010 OIL WASTE 1.0 mg/kg 3.1 -10
(ICP) -2.5 mg/kg 22 -20
SOLID WASTE 50 mg/kg 10 3.4
75 mg/kg 3.7 -0.8
SOLID SLUDGE 5 mg/kg 2 0
20 mg/kg 11 55
a. Source: Draft Compendium of Information and Performance Data on Routinely Used Measurement Methods (RUMM) - Pilot Phase,
RTI/3087/03, prepared for EPA Quality Assurance Management Staff, January 1986. This document should be
consulted for more information on individual analytes.
-------
APPENDIX G
RCRA APPENDIX VIII
CLP HSL COMPARISON
-------
ORGANIC COMPOUNDS ON CLP/HSL
BUT NOT INCLUDED ON MODIFIED APPENDIX VIII
Common Name CAS RN
Acetone 67.64.1
Vinyl Acetate 108.05.4
2-Hexanone 591.78.6
Ethylbenzene 100.41.4
Styrene 100.42.5
Xylenes (Total) 1330-20-7
Benzyl Alcohol 100.51.6
Isophorone 78.59.1
2-Nitrophenol 88.75.5
Benzoic Acid 65.85.0
2-Methylnaphthalene 91.57.6
2-Nitroaniline 88.74.4
3-Nitroaniline 99.09.2
Dibenzofuran 132.64.9
4,Chlorophenyl-phenylether 7005.72.3
Endrin Ketone 53494.70.5
Endosulfan Sulfate 1031.07.8
G-l
-------
ORGANIC COMPOUNDS ON MODIFIED APPENDIX VIII LIST
BUT NOT INCLUDED ON CLP/HSL
Common Name
Acetonitrile
Acetophenone
2-Acetylaminofluorene
Acrolein
Acrylonitrile
Allyl Alcohol
4-Aminobiphenyl
Aramite
Benzenethiol
p-Benzoquinone
Bromoacetone
2-sec-butyl-4,6-dinitrophenol
Chlorobenzilate
2-chloro-l,3-butadiene
3-chloropropene
3-chloropropionitrile
Diallate
Dibenzo [a,e] pyrene
Dibenzo [a,h] pyrene
Dibenzo [a,i] pyrene
1,2-dibromo-3-chloropropane
1,2-dibromoethane
Dibromomethane
1,4-dichloro-2-butene
Dichlorodifluoromethane
2,6 Dichlorophenol
1,3-Dichloropropene
0,0-Diethyl 0-2-pyrazinyl
phosphorothioate
3,3-Dimethoxybenzidine
p-Dimethylaminoazobenzene
7,12-Dimethylbenz[a]anthracene
3,3'-Dimethylbenzidine
alpha-Dimethylphenethylamine
1,4-Dioxane
Diphenylamine
1,2-Diphenylhydrazine
Di-n-propylnitrosamine
Disulfoton
Ethyl Cyanide
Ethylene Oxide
meta-dinitrobenzene
Silvex
1,2,3-Trichloropropene
Tris (2,3-dibromopropyl) phosphate
CAS RN
75.05.8
98.86.2
53.96.3
107.02.8
107.13.1
107.18.6
92.67.1
140.57.8
108.98.5
106.51.4
598.31.2
88.85.7
510.15.6
126.99.8
107.05.1
542.76.7
2303.16.4
192.65.4
189.64.0
189.55.9
96.12.8
106.93.4
74.95.3
764.41.0
75.71.8
87.65.0
542.75.6
297.97.2
119.90.4
60.11.7
57.97.6
119.93.7
122.09.8
123.91.1
122.39.4
122.66.7
621.64.7
298.04.4
107.12.0
75.21.8
100.25.4
93.72.1
96.18.4
126.72.7
Class
CLP/VOA
CLP/BNA
CLP/BNA
CLP/VOA
CLP/VOA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/VOA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/VOA
CLP/VOA
CLP/VOA
CLP/VOA
CLP/VOA
CLP/BNA
CLP/VOA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/VOA
CLP/VOA
NRA
CLP/BNA
NRA
CLP/VOA
CLP/BNA
G-2
-------
Common Name
CAS RN
Class
Phenacetin
N-Phenylthiourea
Phorate
Famphur
2-Picoline
Propanamide
2-Propyn-l-ol
Pyridine
Resorcinol
Safrole
1,2,4,5-Tetrachlorobenzene
1,1,1,2-Tetrachloroethane
2-Naphthylamine
N-Nitrosodi-n-butylamine
N-Nitrosodiethylamine
N-Nitrosomethylethylamine
N-Nitrosomorpholine
N-Nitrosopiperidine
5-Nitro-o-toluidine
Parathion
Pentachlorobenzene
Pentachloroethane
Pentachloronitrobenzene
Kepone
Malononitrile
Methacrylonitrile
Methapyrilene
3-Methylcholanthrene
4,4'-Methylene-bis(2-chloroaniline)
Methyl methacrylate
Methyl methanesulfonate
Aldicarb
Methyl parathion
1,4 Naphthoquinone
1-Naphthylamine
2,3,4,6-Tetrachlorophenol
Tetraethyldithiopyrophosphate
Trichloromethanethiol
Trichloromonofluoromethane
2,4,5-T
Ethyl Methacrylate
Isodrin
Hexachlorophene
Hexachloropropene
Iodomethane
Isobutylalcohol
Isosafrole
62.44.2
103.85.5
298.02.2
52.85.7
109.06.8
23950.58.5
107.19.7
110.86.1
108.46.3
94.59.7
95.94.3
630.20.6
91.59.8
924.16.3
55.18.5
10595.95.6
59.89.2
100.75.4
99.44.8
56.38.2
608.93.5
76.01.7
82.68.8
143.50.0
109.77.3
126.98.7
91.80.5
56.49.5
101.14.4
80.62.6
66.27.3
116.06.3
298.00.0
130.15.4
134.32.7
58.90.2
3689.24.5
75.70.7
75.69.4
93.76.5
97.63.2
465.73.6
70.30.4
1888.71.7
74.88.4
78.33.1
120.58.1
CLP/BNA
CLP/BNA
NRA
NRA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/VOA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/PCB-Pest
CLP/BNA
CLP/VOA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/BNA
CLP/BNA
NRA
CLP/BNA
CLP/VOA
NRA
CLP/BNA
CLP/PCB-Pest
CLP/BNA
CLP/VOA
CLP/VOA
CLP/VOA
NRA
G-3
-------
NOTES
aClass Abbreviations
NRA - Not readily analyzable using current CLP Procedures
CLP/VOA - Potentially analyzable using current CLP/HSL GC/MS Volatile
Organics Procedure
CLP/BNA - Potentially analyzable using current CLP/HSL Base/Neutral Acid
Extractable GC/MS Procedure
CLP/PCB-Pest - Potentially analyzable using current CLP/HSL PCB/Pesticide
GC Procedure
G-4
-------
ORGANIC COMPOUNDS ON MODIFIED
APPENDIX VIII
LIST THAT ARE NOT READILY ANALYZABLE BY CURRENT
CLP/HSL PROCEDURES
Common Name CAS RN
Allyl alcohol 107.18.6
Bromoacetone 598.31.2
0,O-Diethyl-O-2-Pyrazinyl phosphorothioate 297.97.2
1,4 Dioxane 123.91.1
Ethylene Oxide 75.21.8
Silvex 93.72.1
Phorate 298.02.2
Famphur 52.85.7
2-Propyn-l-ol 107.19.7
Parathion 56.38.2
Methyl Parathion 298.00.0
Tetraethyldithiopyrophosphate 3689.24.5
2,4,5-T 93.76.5
isosafrole 120.58.1
Class'
WS/NV
WR
OP
WS/NV
NR (VOA)
CH
OP
OP
WS/NV
OP
OP
WR
CH
D/H
NOTES
aClass Abbreviations
WS/NV - Water soluble, nonvolatile compound probably not amenable to
purge and trap or liquid/liquid extraction pretreatment.
WR - Water reactive, unanalyzable in aqueous matrix.
OP - Organophosphorous pesticide best analyzed by a modified SW-846,
Method 8140.
NR (VOA) - Not recoverable at 200 PPB using standard HSL/CLP volatile
organics procedures. May be more amenable to head space analysis.
CH - Chlorinated herbicide; must be derivatized prior to analysis. Best
analyzed using modified SW-846 Method 8150.
D/H - Decomposes at conventional GC temperatures; HPLC procedure may be
applicable.
G-5
-------
CLP VOLATILE ORGANIC CRDL
Target comoound name
Chloromethane
Bromomethane
Vinyl Chloride
Chloroethane
Methylene Chloride
Acetone
Carbon Disulfide
1,1-Dichloroethene
1,1-Dichloroethane
Trans-1,2-Dichloroethene
Chloroform
1,2-Dichloroethane
2-Butanone
1,1,1-Trichloroethane
Carbon Tetrachloride
Vinyl Acetate
Bromodichloromethane
1,1,2,2-Tetrachloroethane
1,2-Dichloropropane
Trans-1,3-Dichloropropene
Trichloroethene
Dibromochloromethane
1,1,2-Trichloroethane
Benzene
Cis-1,3-Dichloropropene
2-Chloroethyl Vinyl Ether
Bromoform
4-Methyl-2-pentanone
2-Hexanone
Tetrachloroethene
Toluene
Chlorobenzene
Ethyl Benzene
Styrene
Total Xylenes
SPCC&
cccc
SPCC
ccc
ccc
SPCC
ccc
SPCC
ccc
SPCC
ccc
SPCC
ccc
Low Soil
CRDL,
ug/kg
10
10
10
10
5
10
5
5
5
5
5
5
10
5
5
10
5
5
5
5
5
5
5
5
5
10
5
10
10
5
5
5
5
5
5
Low water
CRDL,
ug/L
10
10
10
10
5
10
5
5
5
5
5
5
10
5
5
10
5
5
5
5
5
5
5
5
5
10
5
10
10
5
5
5
5
5
5
CAS number
74-87-3
74-83-9
75-01-4
75-00-3
75-09-2
67-64-1
75-15-0
75-35-4
75-34-3
156-60-5
67-66-3
107-06-2
78-93-3
71-55-6
56-23-5
108-05-4
75-27-4
79-34-5
78-87-5
10061-02-6
79-01-6
124-48-1
79-00-5
71-43-2
10061-01-5
110-75-8
75-25-2
108-10-1
591-78-6
127-18-4
108-88-3
108-90-7
100-41-4
100-42-5
N.A.
dCRDL values obtained from the IFB WA85-J664 [7J,
bSystem Performance Check Compounds (SPCC) are used to check compound
instability and degradation in the GC/MS and to insure minimum average
response factors are met prior to the use of the calibration curve.
cCalibration Check Compounds (CCC) are used to check the validity of the
initial calibration.
Note: Medium soil and water CRDLs are 100 times the low level CRDLs.
SOURCE: Flotard, R.D. et al 1986
G-6
-------
CLP INORGANIC COMPOUND CRDL,
INSTRUMENT DETECTION LEVEL AND WAVELENGTH
Element
Al
Sb
As
Ba
Be
Cd
Ca
Cr
Co
Cu
Fe
Pb
Mg
Mn
Hg
Ni
K
Se
Ag
Na
Tl
Sn
V
Zn
CRDL
200
60
10
200
5
5
5000
10
50
25
100
5
5000
15
0.2
40
5000
5
10
5000
10
40
50
20
Method
ICP
ICP
FAA
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
cv
ICP
ICP
FAA
ICP
ICP
ICP
ICP
ICP
ICP
N
7
5
18
5
10
5
7
9
11
11
10
12
11
10
12
9
8
18
10
9
18
7
10
0
IDL
Mean
70.7
42.3
4.6
22.1
2.3
4.0
529
5.8
11.4
9.7
27.4
2.3
385
5.2
0.2
17.8
668
2.8
5.4
756
4.3
23.8
13.1
8.3
IDL
Std Dev
59.3
11.3
2.3
31.7
1.7
1.1
472
2.9
8.5
6.5
20.9
1.2
449
4.6
0.1
10.1
444
1.3
2.7
864
2.4
8.4
10.0
6.3
Wave-
length (nm)
309.3
217.6
198.7
493.4
312.0
228.8
317.9
267.7
228.6
324.5
259.9
283.3
279.6
257.6
253.7
232.0
766.5
196.0
328.1
589.0
276.8
190.0
292.5
213.9
IDL - Instrument Detection Limit (ug/L).
N - Number of laboratories using the most common wavelength.
CRDL - Contract Required Detection Limit (yg/L).
SOURCE: Aleckson, K.A. et al 1986.
G-7
-------
CLP SEMI-VOLATILE HSL COMPOUNDS AND CRDL
Compound name
Phenol
bis(2-Chloroethyl) ether
2-Chlorophenol
1,3-Dichlorobenzene
1,4-Dichlorobenzene
Benzyl alcohol
1,2-Dichlorobenzene
2-Methylphenol
bis(2-Chloroisopropyl) ether
4-Methylphenol
N-Nitroso-di-n-propylamine
Hexachloroethane
Nitrobenzene
Isophorone
2-Nitrophenol
2,4-Dimethylphenol
Benzoic acid
bis(2-Chloroethoxy)methane
2,4-Dichlorophenol
1,2,4-Trichlorobenzene
Naphthalene
4-Chloroaniline
Hexachlorobutadiene
4-Chloro-3-methylphenol
2-Methylnaphthalene
Hexachlorocyclopentadiene
2,4,6-Trichlorophenol
2,4,5-Trichlorophenol
2-Chloronaphthalene
2-Nitroaniline
Dimethylphthalate
Acenaphthylene
3-Nitroaniline
Acenaphthene
2,4-Dinitrophenol
4-Nitrophenol
Dibenzofuran
2,4-Dinitrotoluene
2,6-Dinitrotoluene
Diethylphthalate
4-Chlorophenyl-phenylether
Fluorene
4-Nitroaniline
4,6-Dinitro-2-methylphenol
SPCCa
or CCCb
ccc
ccc
SPCC
ccc
ccc
ccc
SPCC
ccc
ccc
SPCC
SPCC
Low Soil
CRDL, ug/kg
330
330
330
330
330
330
330
330
330
330
330
330
330
330
330
330
1,600
330
330
330
330
330
330
330
330
330
330
1,600
330
1,600
330
330
1,600
330
1,600
1,600
330
330
330
330
330
330
1,600
1,600
Low Water
CRDL, ug/L
10
10
10
10
10
10
10
10
10
10
10
10
10
10
10
10
50
10
10
10
10
10
10
10
10
10
10
50
10
50
10
10
50
10
50
50
10
10
10
10
10
10
50
50
CAS
Number
108-95-2
111-44-4
95-57-8
541-73-1
106-46-7
100-51-6
95-50-1
95-48-7
39638-32-9
106-44-5
621-64-7
67-72-1
98-95-3
78-59-1
88-75-5
105-67-9
65-85-0
111-91-1
120-83-2
120-82-1
91-20-3
106-47-8
87-68-3
59-50-7
91-57-6
77-47-4
88-06-2
95-95-4
91-58-7
88-74-4
131-11-3
208-96-8
99-09-2
83-32-9
51-28-5
100-02-7
132-64-9
121-14-2
606-20-2
84-66-2
7005-72-3
86-73-7
100-01-6
534-52-1
G-8
-------
CLP SEMI-VOLATILE HSL COMPOUNDS AND CRDL
(continued)
Compound name
N-Nitrosodiphenylamine
4-Bromophenyl-phenylether
Hexachlorobenzene
Pentachlorophenol
Phenanthrene
Anthracene
Di-n-butylphthalate
Fluoranthene
Pyrene
Butylbenzylphthalate
3,3'-Dichlorobenzidine
Benzo(a)anthracene
bis(2-Ethylhexyl) phthalate
Chrysene
Di-n-octylphthalate
Benzo(b)fluoranthene
Benzo(k)fluoranthene
Benzo(a)pyrene
Indeno(1,2,3-cd)pyrene
Dibenz(a,h)anthracene
Benzo(g,h,i)perylene
SPCCa
or CCCb
CCC
CCC
CCC
CCC
CCC
Low Soil
CRDL, ug/kg
330
330
330
1,600
330
330
330
330
330
330
660
330
330
330
330
330
330
330
330
330
330
Low Water
CRDL, ug/L
10
10
10
50
10
10
10
10
10
10
20
10
10
10
10
10
10
10
10
10
10
CAS
Number
86-30-6
101-55-3
118-74-1
87-86-5
85-01-8
120-12-7
84-74-2
206-44-0
129-00-0
85-68-7
91-94-1
56-55-3
117-81-7
218-01-9
117-84-0
205-99-2
207-08-9
50-32-8
193-39-5
53-70-3
191-24-2
aCCC - Calibration Check Compound
bSPCC - System Performance Check Compound
Note: Medium soil/sediment contract required detection limits are 60
times the individual low soil/sediment CRDL and medium water
contract required detection limits are 100 times the individual
low water CRDL.
SOURCE: Wolf, J.S. et al 1986.
G-9
-------
APPENDIX H
CONTRACT REQUIRED DETECTION
LIMITS FOR HSL ANALYSES
USING CLP IFB PROCEDURES
-------
CLP VOLATILE ORGANIC CRDL
Target compound name
Chloromethane
Bromomethane
Vinyl Chloride
Chloroethane
Methylene Chloride
Acetone
Carbon Disulfide
1,1-Dichloroethene
1,1-Dichloroethane
Trans-1,2-Dichloroethene
Chloroform
1,2-Dichloroethane
2-Butanone
1,1,1-Trichloroethane
Carbon Tetrachloride
Vinyl Acetate
Bromodichloromethane
1,1,2,2-Tetrachloroethane
1,2-Dichloropropane
Trans-1,3-Dichloropropene
Trichloroethene
Dibromochloromethane
1,1,2-Trichloroethane
Benzene
Cis-1,3-Dichloropropene
2-Chloroethyl Vinyl Ether
Bromoform
4-Methyl-2-pentanone
2-Hexanone
Tetrachloroethene
Toluene
Chlorobenzene
Ethyl Benzene
Styrene
Total Xylenes
SPCC&
cccc
SPCC
ccc
ccc
SPCC
ccc
SPCC
ccc
SPCC
ccc
SPCC
ccc
CRDL,
ug/kg
10
10
10
10
5
10
5
5
5
5
5
5
10
5
5
10
5
5
5
5
5
5
5
5
5
10
5
10
10
5
5
5
5
5
5
Low water
CRDL,
uq/L
10
10
10
10
5
10
5
5
5
5
5
5
10
5
5
10
5
5
5
5
5
5
5
5
5
10
5
10
10
5
5
5
5
5
5
CAS number
74-87-3
74-83-9
75-01-4
75-00-3
75-09-2
67-64-1
75-15-0
75-35-4
75-34-3
156-60-5
67-66-3
107-06-2
78-93-3
71-55-6
56-23-5
108-05-4
75-27-4
79-34-5
78-87-5
10061-02-6
79-01-6
124-48-1
79-00-5
71-43-2
10061-01-5
110-75-8
75-25-2
108-10-1
591-78-6
127-18-4
108-88-3
108-90-7
100-41-4
100-42-5
N.A.
aCRDL values obtained from the IFB WA85-0654 [7].
^System Performance Check Compounds (SPCC) are used to check compound
instability and degradation in the GC/MS and to insure minimum average
response factors are met prior to the use of the calibration curve.
cCalibration Check Compounds (CCC) are used to check the validity of the
initial calibration.
Note: Medium soil and water CRDLs are 100 times the low level CRDLs.
SOURCE: Flotard, R.D. et al 1986
H-1
-------
CLP INORGANIC COMPOUND CRDL,
INSTRUMENT DETECTION LEVEL AND WAVELENGTH
Element
Al
Sb
As
Ba
Be
Cd
Ca
Cr
Co
Cu
Fe
Pb
Mg
Mn
Hg
Ni
K
Se
Ag
Na
Tl
Sn
V
Zn
CRDL
200
60
10
200
5
5
5000
10
50
25
100
5
5000
15
0.2
40
5000
5
10
5000
10
40
50
20
Method
ICP
ICP
FAA
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
ICP
CV
ICP
ICP
FAA
ICP
ICP
ICP
ICP
ICP
ICP
N
7
5
18
5
10
5
7
9
11
11
10
12
11
10
12
9
8
18
10
9
18
7
10
0
IDL
Mean
76.7
42.3
4.6
22.1
2.3
4.0
529
5.8
11.4
9.7
27.4
2.3
385
5.2
0.2
17.8
668
2.8
5.4
756
4.3
23.8
13.1
8.3
IDL
Std Dev
59.3
11.3
2.3
31.7
1.7
1.1
472
2.9
8.5
6.5
20.9
1.2
449
4.6
0.1
10.1
444
1.3
2.7
864
2.4
8.4
10.0
6.3
Wave-
length (nm)
309.3
217.6
198.7
493.4
312.0
228.8
317.9
267.7
228.6
324.5
259.9
283.3
279.6
257.6
253.7
232.0
766.5
196.0
328.1
589.0
276.8
190.0
292.5
213.9
IDL - Instrument Detection Limit (ug/L).
N - Number of laboratories using the most common wavelength.
CRDL - Contract Required Detection Limit (ug/L).
SOURCE: Aleckson, K.A. et al 1986.
H-2
-------
CLP SEMI-VOLATILE HSL COMPOUNDS AND CRDL
Compound name
Phenol
bis(2-Chloroethyl) ether
2-Chlorophenol
1,3-Dichlorobenzene
1,4-Dichlorobenzene
Benzyl alcohol
1,2-Dichlorobenzene
2-Methylphenol
bis(2-Chloroisopropyl) ether
4-Methylphenol
N-Nitroso-di-n-propylamine
Hexachloroethane
Nitrobenzene
Isophorone
2-Nitrophenol
2,4-Dimethylphenol
Benzoic acid
bis(2-Chloroethoxy)methane
2,4-Dichlorophenol
1,2,4-Trichlorobenzene
Naphthalene
4-Chloroaniline
Hexachlorobutadiene
4-Chloro-3-methylphenol
2-Methylnaphthalene
Hexachlorocyclopentadiene
2,4,6-Trichlorophenol
2,4,5-Trichlorophenol
2-Chloronaphthalene
2-Nitroaniline
Dimethylphthalate
Acenaphthylene
3-Nitroaniline
Acenaphthene
2,4-Dinitrophenol
4-Nitrophenol
Dibenzofuran
2,4-Dinitrotoluene
2,6-Dinitrotoluene
Diethylphthalate
4-Chlorophenyl-phenylether
Fluorene
4-Nitroaniline
4,6-Dinitro-2-methylphenol
SPCCa
or CCCb
ccc
ccc
SPCC
ccc
ccc
ccc
SPCC
ccc
ccc
SPCC
SPCC
Low Soil
CRDL, ug/kg
330
330
330
330
330
330
330
330
330
330
330
330
330
330
330
330
1,600
330
330
330
330
330
330
330
330
330
330
1,600
330
1,600
330
330
1,600
330
1,600
1,600
330
330
330
330
330
330
1,600
1,600
Low Water
CRDL, ug/L
10
10
10
10
10
10
10
10
10
10
10
10
10
10
10
10
50
10
10
10
10
10
10
10
10
10
10
50
10
50
10
10
50
10
50
50
10
10
10
10
10
10
50
50
CAS
Number
108-95-2
111-44-4
95-57-8
541-73-1
106-46-7
100-51-6
95-50-1
95-48-7
39638-32-9
106-44-5
621-64-7
67-72-1
98-95-3
78-59-1
88-75-5
105-67-9
65-85-0
111-91-1
120-83-2
120-82-1
91-20-3
106-47-8
87-68-3
59-50-7
91-57-6
77-47-4
88-06-2
95-95-4
91-58-7
88-74-4
131-11-3
208-96-8
99-09-2
83-32-9
51-28-5
100-02-7
132-64-9
121-14-2
606-20-2
84-66-2
7005-72-3
86-73-7
100-01-6
534-52-1
H-3
-------
CLP SEMI-VOLATILE HSL COMPOUNDS AND CRDL
(continued)
Compound name
N-Nitrosodiphenylamine
4-Bromophenyl-phenylether
Hexachlorobenzene
Pentachlorophenol
Phenanthrene
Anthracene
Di-n-butylphthalate
Fluoranthene
Pyrene
Butylbenzylphthalate
3,3'-Dichlorobenzidine
Benzo(a)anthracene
bis(2-Ethylhexyl) phthalate
Chrysene
Di-n-octylphthalate
Benzo(b)fluoranthene
Benzo(k)fluoranthene
Benzo(a)pyrene
Indeno(1,2,3-cd)pyrene
Dibenz(a,h)anthracene
Benzo(g,h,i)perylene
aCCC - Calibration Check Compound
SPCCa
or CCCb
ccc
ccc
ccc
ccc
ccc
Low Soil
CRDL, ug/kg
330
330
330
1.600
330
330
330
330
330
330
660
330
330
330
330
330
330
330
330
330
330
Low Water
CRDL, ug/L
10
10
10
50
10
10
10
10
10
10
20
10
10
10
10
10
10
10
10
10
10
CAS
Number
86-30-6
101-55-3
118-74-1
87-86-5
85-01-8
120-12-7
84-74-2
206-44-0
129-00-0
85-68-7
91-94-1
56-55-3
117-81-7
218-01-9
117-84-0
205-99-2
207-08-9
50-32-8
193-39-5
53-70-3
191-24-2
bSPCC - System Performance Check Compound
Note: Medium soil/sediment contract required detection limits are 60
times the individual low soil/sediment CRDL and medium water
contract required detection limits are 100 times the individual
low water CRDL.
U.S. GOVERNMENT PRINTING OFFICE: 1987 748-121/67042
SOURCE: Wolf, J.S. et al 1986.
H-4
------- |