United States
Environmental Protection
Agency
Office of Research and
Development
Washington, DC 20460
EPA/540/R-10/001
March 2010
Guidance Manual for the
Preparation of Demonstration
and Quality Assurance Project
Plans for the Verification of
Field Characterization and
Monitoring Technologies
RESEARCH AND DEVELOPMENT

EPA/540/R-10/001
March 2010
www.epa.gov
Guidance Manual for the
Preparation of Demonstration
and Quality Assurance Project
Plans for the Verification of
Field Characterization and
Monitoring Technologies
Prepared for
Stephen Billets
U.S. Environmental Protection Agency
National Exposure Research Laboratory
944 East Harmon Avenue
Las Vegas, NV 89119
Prepared by
Battelle
505 King Avenue
Columbus, OH 43201
Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official
Agency policy. Mention of trade names and commercial products does not constitute endorsement or
recommendation for use.
U.S. Environmental Protection Agency
Office of Research and Development
Washington, DC 20460
Notice
This document was prepared by Battelle Memorial Institute for the U.S. Environmental
Protection Agency under Contract No. EP-C-05-057, Task Order 0043. The document has met
the EPA's requirements for peer and administrative review and has been approved for
publication. Mention of corporation names, trade names, or commercial products does not
constitute endorsement or recommendation for use.
Foreword
This work represents the technical and editorial contributions of a large number of U.S.
Environmental Protection Agency (EPA) employees and others familiar with or interested in the
demonstration and evaluation of innovative site characterization and monitoring technologies. In
the mid-1990s, the EPA National Exposure Research Laboratory, Environmental Sciences
Division - Las Vegas first convened a body of experts - the Consortium Action Team - to define
the elements of a guidance document. Subsequent discussions and meetings were held to revise
and expand the contents. EPA staff from each of the 10 Regions, the Office of Solid Waste and
Emergency Response, and the Office of Research and Development participated in this process.
This interdisciplinary, inter-programmatic team was convened to ensure that the demonstration
procedures articulated were acceptable across the Agency. This collaboration resulted in the
development of a 1996 interim guidance document for developing demonstration plans to gain
the acceptance of innovative technologies for use in characterizing and monitoring the
environment. In 2008, the interim guidance document was revised and updated to create this
document, which now represents the current approach to development of demonstration/quality
assurance project plans for independent performance testing of site characterization and
monitoring technologies. For the most part, it relies on the experiences and the evolution of
thinking gained over the last 12 years of conducting demonstrations under the Superfund
Innovative Technology Evaluation (SITE) Monitoring and Measurement Technology (MMT)
and other technology evaluation programs.
Table of Contents
Chapter 1 Introduction	1
1.1	Purpose and Content of This Guidance Manual	1
1.2	Evolution of the SITE MMT Program	2
1.3	Overview of the Technology Demonstration Process	4
Chapter 2 How to Use This Guidance Manual	5
2.1	Demonstration/Quality Assurance Project Plan Overview	5
2.2	Building a D/QAPP	5
Chapter 3 Elements of a Demonstration/Quality Assurance Project Plan	7
Concurrence Signatures	8
Notice	8
Abstract	8
Table of Contents	8
Abbreviations and Acronyms	8
Acknowledgements	9
1.0 INTRODUCTION	9
1.1	Description of Testing Program	10
1.2	Purpose and Scope of Demonstration	12
1.3	Background of the Problem	12
1.4	Sources of Contaminant(s) of Interest	12
1.5	Traditional Measurement Methods	13
2.0 DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION	13
2.1	Developer Personnel	13
2.2	EPA Project Personnel (if applicable)	14
2.3	Independent Testing Organization Personnel (if applicable)	14
2.4	Demonstration Site Representatives	15
2.5	Reference Laboratory Personnel	15
2.6	Suppliers of Performance Evaluation Samples	15
3.0 DEVELOPER TECHNOLOGY DESCRIPTION(S)	16
3.1 Technology Name	16
3.1.1	Technology Description	16
3.1.2	Operating Procedure	17
3.1.3	Advantages and Limitations	17
4.0 DESCRIPTIONS OF DEMONSTRATION SITE AND SAMPLING LOCATIONS	17
4.1	Demonstration Site Description	17
4.2	Description of Sampling Locations	18
5.0 DEMONSTRATION APPROACH	 18
5.1	Demonstration Objectives	18
5.2	Overview of Demonstration Samples	19
5.3	Pre-Demonstration Study	20
5.4	Demonstration Schedule	20
5.5	Demonstration Design	21
5.6	Assessment of Primary and Secondary Objectives	21
5.6.1	Primary Objective P1: Accuracy	22
5.6.2	Primary Objective P2: Precision	23
5.6.3	Primary Objective P3: Comparability	23
5.6.4	Primary Objective P4: Method Detection Limit	24
5.6.5	Primary Objective P5: Matrix Effects	24
5.6.6	Primary Objective P6: Technology Costs	25
5.6.7	Secondary Objective S1: Skills and Training Requirements	25
5.6.8	Secondary Objective S2: Health and Safety	25
5.6.9	Secondary Objective S3: Technology Portability	25
5.6.10	Secondary Objective S4: Sample Throughput	26
6.0 SAMPLE COLLECTION AND CHARACTERIZATION	26
6.1	Sample Collection	26
6.1.1	Procedure	26
6.1.2	Sample Shipping	26
6.2	Sample Preparation	27
6.3	Characterization of Environmental Samples	27
6.4	Sample Handling, Sample Tracking, and Sample Management	27
7.0 REFERENCE LABORATORY AND METHOD(S)	27
7.1	Reference Method Selection	27
7.2	Reference Laboratory Selection	28
7.3	Reference Laboratory Sample Preparation and Analytical Methods	28
8.0 DATA MANAGEMENT	28
8.1	Data Reduction	28
8.2	Data Review	29
8.2.1	Data Review by Developers	29
8.2.2	Data Review by Reference Laboratory	29
8.2.3	Data Review by ITO (if applicable)	29
8.3	Data Reporting	30
8.3.1	Developer Data Package	30
8.3.2	Reference Laboratory Data Package	30
8.3.3	Innovative Technology Verification Reports	30
8.4	Data Evaluation Report	31
8.5	Data Storage	31
9.0 QA/QC PROCEDURES	32
9.1	QA/QC Objectives	32
9.2	Internal QC Checks	33
9.2.1	Reference Method QC Checks	33
9.2.2	Developer Technology QC Checks	33
9.3	Audits, Corrective Action, and QA Reports	33
9.3.1	Technical Systems Audits	33
9.3.2	Corrective Action Procedures	34
9.3.3	QA Reports	34
10.0 HEALTH AND SAFETY PLAN	35
Chapter 4 References	37
Appendix	A-1
List of Figures
Figure 1-1. Overview of Technology Demonstration Process	
Figure 2-1. Elements of a Technology Demonstration	
Figure 2-2. Table of Contents from a Typical Technology Demonstration Plan
Figure 3-1. Typical Table of Contents from a Health and Safety Plan	
Abbreviations and Acronyms
CERCLA	Comprehensive Environmental Response, Compensation, and Liability Act (a.k.a. Superfund)
CSCT	Consortium for Site Characterization Technology
DER	Data Evaluation Report
DoD	U.S. Department of Defense
DOE	U.S. Department of Energy
D/QAPP	Demonstration/Quality Assurance Project Plan
EPA	U.S. Environmental Protection Agency
ESTCP	Environmental Security Technology Certification Program
ETV	Environmental Technology Verification Program
HASP	Health and Safety Plan
ITO	Independent Testing Organization
ITVR	Innovative Technology Verification Report
MDL	Method Detection Limit
MMT	Monitoring and Measurement Technology
NERL	National Exposure Research Laboratory
ORD	Office of Research and Development
OSWER	Office of Solid Waste and Emergency Response
PE	Performance Evaluation
QA	Quality Assurance
QAPP	Quality Assurance Project Plan
QC	Quality Control
RPD	Relative Percent Difference
RSD	Relative Standard Deviation
SARA	Superfund Amendments and Reauthorization Act
SITE	Superfund Innovative Technology Evaluation
SOP	Standard Operating Procedure
SW-846	Test Methods for Evaluating Solid Waste, EPA Publication SW-846
TSA	Technical Systems Audit
TTEP	Technology Testing and Evaluation Program
Acknowledgements
The 1996 interim guidance document was prepared by Stephen Billets, Eric Koglin, and Gary
Robertson of the U.S. EPA National Exposure Research Laboratory (NERL). The interim
guidance document served as the basis of this revised and final guidance on the preparation of
demonstration/quality assurance project plans. The current document was prepared from the
interim guidance document by Battelle under the guidance and leadership of Dr. Billets. EPA
NERL thanks Eric Koglin, EPA, and Roger Jenkins, private consultant, for their peer review of
this document. EPA NERL also acknowledges the many developers who participated in the Superfund Innovative Technology Evaluation Program and helped to make it a more effective and efficient process, leading to the approaches described in this document.
Chapter 1
Introduction
1.1 Purpose and Content of This Guidance Manual
The purpose of this manual is to provide guidance to testing organizations and technology
vendors for preparing a demonstration plan for the performance testing of field characterization
and monitoring technologies. A carefully developed demonstration plan assures that testing will
be performed in a manner that generates the high quality data necessary to verify the
performance of the technology. Furthermore, the demonstration plan assures that all appropriate
health, safety, regulatory, quality assurance (QA), and environmental concerns related to the
demonstration are addressed. This manual provides general guidance on the various aspects of
the performance verification process, and specifically how to develop such a plan. Where
appropriate, specific examples of how the guidance can be or has been implemented by the EPA
Superfund Innovative Technology Evaluation (SITE) Monitoring and Measurement Technology
(MMT) Program are provided for reference.
Potential users of innovative approaches must be confident that new technologies perform as
anticipated. This is particularly important when environmental data are being collected to
support important decisions (for example, protection of human health and the environment,
remedy selection, risk assessment, regulatory enforcement, or litigation). Typically, most
information about the performance of innovative technologies comes from the vendor or
developer. However, a user's confidence in and willingness to apply an innovative technology are greater following independent verification by a credible third-party organization. Ideally, the
test protocol should be recognized and accepted by EPA. The user community looks to the EPA,
because of its regulatory mission, to evaluate innovations that will improve the way the Nation
manages its environmental problems. Potential users may find new technologies appealing, but
without government acceptance of such technologies, users will often continue to rely on
accepted, conventional approaches, whether or not they are the most appropriate or cost-effective.
This guidance document is divided into four chapters. Chapter 1 (this chapter) provides an
overview of the purpose of the SITE MMT Program and predecessor and successor programs
that were the basis of this guidance. A general description of the technology demonstration
process is also provided. Chapter 2 contains a description of how to use this guidance manual,
and an introduction to Chapter 3. Chapter 3 provides an example of how to prepare a
demonstration and quality assurance project plan (D/QAPP) under the SITE MMT Program. The
guidance is in the form of an annotated outline. Each section of the D/QAPP is identified as a
subsection of the chapter. Each subsection contains a short description of the information that
should be included in the D/QAPP. The use of this standard structure will facilitate document
preparation and may reduce the amount of review time required for plan approval. This approach
will also help, if there is an EPA point of contact or a technical expert involved, to provide
timely assistance to the plan authors. References are provided in Chapter 4.
A note regarding the types of technologies applicable to this guidance is warranted here. Site
characterization and monitoring instruments can include a diverse assortment of technologies.
These can range from test kits (e.g., enzyme-linked immunosorbent assays) to field-portable instrumentation (e.g., x-ray fluorescence spectrometers). Most, if not all, of the demonstration
plan elements described in Chapter 3 will be applicable to all types of technologies. However,
there are often special conditions and concerns that are unique to a particular type of technology.
These should be addressed on a case-by-case basis following the framework presented in
Chapter 3.
1.2 Evolution of the SITE MMT Program
A historical account of the evolution of the SITE MMT Program has been documented.1 The
U.S. Congress enacted the Comprehensive Environmental Response, Compensation, and
Liability Act (CERCLA), commonly known as Superfund, in 1980.2 The creation of this law
provided broad federal authority to respond directly to releases or threatened releases of
hazardous substances that might endanger public health or the environment. The 1986
Superfund Amendments and Reauthorization Act (SARA) amendments to CERCLA provided
legislation which mandated EPA, through its Office of Solid Waste and Emergency Response
(OSWER) and Office of Research and Development (ORD), to create the SITE Program. Prior to
enactment, the draft legislative language focused only on remediation, but the pressing need to
test field analytical measurement technologies prompted EPA to recommend the expansion of
this legislation to include monitoring. This led to the formation of the MMT arm of the SITE
Program.3
Technical staff in EPA's ORD National Exposure Research Laboratory, at the Environmental
Sciences Division facility in Las Vegas, Nevada, have managed the MMT Program since its inception.
The early years of the program (1986-1992) focused solely on EPA ORD-sponsored projects, by
evaluating the performance of field monitoring technologies that were developed as part of ORD
contracts or cooperative agreements. These demonstrations were viewed as an extension of the
research which provided an opportunity for additional testing of the monitoring device. In 1995,
on the 25th anniversary of the first Earth Day, the president announced a new environmental
technology strategy, Bridge to a Sustainable Future. This government-wide strategy recognized
that industry is the primary creator of new technology and the main engine of sustained
economic growth. It assigned to government a catalytic role in promoting the development of
new technologies across a range of environmental sectors. It became clear that the objectives of
the MMT Program were shared by the U.S. Department of Defense (DoD) and the U.S.
Department of Energy (DOE). On this basis, EPA integrated the MMT Program into the
Consortium for Site Characterization Technology (CSCT). The CSCT brought federal agencies
with a common need for faster, cheaper, and better monitoring technologies together with end-users of these technologies to facilitate unbiased, third-party performance verification testing.4 In
1995, the CSCT and a newly formed EPA program called the Environmental Technology
Verification (ETV) Program5 collaborated jointly on technology verifications. The CSCT was
one of 12 pilot programs under ETV. Some of the other ETV pilot programs included air
pollution control, drinking water systems, and greenhouse gas emission technologies. During this
period, the SITE MMT Program, through the CSCT and the ETV Program, was leveraging
resources to verify monitoring and site characterization technologies. The collaboration ended in
1999 with the SITE MMT Program focusing on soil and sediment technologies that could be
applied to Superfund sites (more closely related to the original mission), while the ETV Program
focused primarily on monitoring technologies for air and water.
The SITE MMT Program was not the only pathway for developers and users of new and
emerging monitoring, measurement, and site characterization technologies trying to gain
acceptance or commercialize a technology. However, the Program attempted to fill many
technical and institutional needs. These included:
•	Providing a sound scientific basis for demonstrating and evaluating technology
performance;
•	Facilitating acceptance of innovative technologies by state, local, and federal
regulators;
•	Supporting the implementation and use of verified technologies;
•	Identifying and meeting changing user needs;
•	Increasing the number and commercial availability of innovative technologies;
•	Accelerating the movement of innovative technologies being developed by DoD, DOE, and other public and private entities into routine use;
•	Providing an incentive for developers to push the state of the technology beyond
present capabilities;
•	Leveraging resources and expertise among federal agencies, the private sector, and
academia; and
•	Identifying the technology and data gaps that impede cost-effective and efficient
environmental problem-solving and communicating them to the developer
community.
An important product of the CSCT partnership was the development of the interim guidance
manual that captured the process by which technologies were to be demonstrated and evaluated.6
The interim guidance manual was used for 12 years and was the basis of this document, which is
an update to the original guidance. The current document represents the approach to
development of demonstration/quality assurance project plans (D/QAPPs) for independent
performance testing of site characterization and monitoring technologies. For the most part, it
relies on the experiences and the evolution of thinking gained over the last 12 years of
conducting demonstrations under the SITE MMT and other technology evaluation programs.3,5,7,8
1.3 Overview of the Technology Demonstration Process
This guidance provides developers and independent testing organizations (ITO) with a proven
and clearly defined technology demonstration pathway, from planning through testing, reporting,
and finally information dissemination (Figure 1-1). The technology demonstration process is
intended to serve as a template for conducting technology demonstrations that will generate
high-quality data needed by EPA and others to verify technology performance. The verification
process is a model process that can help in moving innovative site characterization and
monitoring technologies into routine use more quickly. An ITO can be funded by a technology
developer, EPA, or some other source. Following this guidance document will allow the ITO to
conduct an unbiased performance test of a technology or group of technologies. Activities
performed by the ITO include: assisting in designing the performance tests; assisting with
identification, selection, and/or access of the field test site(s); overseeing or conducting the actual
testing of technologies; conducting quality assurance/quality control (QA/QC) oversight
activities; and submitting reports on technology performance. These activities can be performed
in whole or in part by the technology developer, but the independence of the testing organization
brings credibility to the demonstration process. It is important that the results of the
demonstration be publicly available through posting on Web sites, technical presentations, press
releases, and newsletters. If EPA is involved in the demonstration process, then the relevant
information from the demonstration will be posted on the program's Web sites. However, much
of the responsibility for information dissemination rests with the developer, which must put the
information in the hands of those who need the performance information in order to gain interest
and/or acceptance of their technology.
Prepare D/QAPP → Conduct Demonstration → Prepare Performance Report → Dissemination of Results
Figure 1-1. Overview of Technology Demonstration Process
Chapter 2
How to Use This Guidance Manual
2.1 Demonstration/Quality Assurance Project Plan Overview
As an expansion of Figure 1-1, Figure 2-1 depicts the major activities associated with each step
in a technology demonstration.1 This guidance manual focuses on key elements of a
demonstration plan. The activities associated with the planning, demonstration, and data
evaluation steps are described as part of the plan's development.
Planning: identify technology category; technology selection process; technical panel involvement; D/QAPP; pre-demonstration testing
Demonstration: sample collection/preparation; sample pre-characterization; field demonstration; reference analysis
Evaluation: data analysis/reduction; report preparation; peer review; information dissemination
Figure 2-1. Elements of a Technology Demonstration
Typically, D/QAPPs are 50-100 pages in length, plus appendices where procedures, checklists,
and other documents related to the execution of the demonstration, as appropriate, are housed.
The D/QAPP serves a number of purposes. First, it provides a "roadmap" for the demonstration.
It contains detailed guidance for those executing the demonstration on how data need to be
generated and collected to support an objective performance evaluation (PE) of the technology.
Second, it is an important reference for those who choose to review the steps used by the
developer or ITO in executing the demonstration to assess the validity of the process. Finally, it
can serve as a useful reference for other organizations in building future demonstration plans
involving related technologies.
2.2 Building a D/QAPP
Figure 2-2 is a typical Table of Contents for a D/QAPP. It is derived from the section headings in
Chapter 3. The section order and content specified in Chapter 3 of this manual should be used as
guidance. The user of Chapter 3 is advised of the font appearance conventions used to
distinguish text that is intended to provide guidance from text that can be directly included in the
plan. The portions of Chapter 3 with the font having a normal appearance are intended to be
included directly into the demonstration plan, assuming the narrative is appropriate to the
demonstration. In places where there is a name, date, single word, or phrase to be inserted, it
appears in bold. Finally, the text in italics is intended to serve as guidance to the user in how to
prepare the specific section. The developer should provide the technology-specific information,
while the D/QAPP author, who may not be the technology developer, can provide the remaining
portions identified in Chapter 3. Variations in the content of a technology-specific demonstration
plan are expected, since different technologies have different characteristics and needs. For
example, some field analytical technologies will have a directly corresponding laboratory
method for reference analysis, while others, such as down-hole geophysical measurements, may
require other methods of confirming performance. In addition, the preparer of the D/QAPP may
choose not to include all technical elements that are suggested in Chapter 3. It is expected that
content of a D/QAPP will be modified to meet the needs of a particular technology category
while still meeting the general guidelines and data quality expectations set forth in Chapter 3.
TITLE PAGE
CONCURRENCE SIGNATURES
NOTICE
ABSTRACT
TABLE OF CONTENTS
ABBREVIATIONS AND ACRONYMS
ACKNOWLEDGEMENTS
1.0	INTRODUCTION
	1.1	Description of Testing Program
	1.2	Purpose and Scope of Demonstration
	1.3	Background of the Problem
	1.4	Sources of Contaminant(s) of Interest
	1.5	Traditional Measurement Methods
2.0	DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION
	2.1	Developer Personnel
	2.2	EPA Project Personnel (if applicable)
	2.3	Independent Testing Organization Project Personnel (if applicable)
	2.4	Demonstration Site Representatives
	2.5	Reference Laboratory Personnel
	2.6	Suppliers of Performance Evaluation Samples
3.0	DEVELOPER TECHNOLOGY DESCRIPTION(S)
	3.1	Technology Name
	3.2	Operating Procedure
	3.3	Advantages and Limitations
4.0	DESCRIPTIONS OF DEMONSTRATION SITE AND SAMPLING LOCATIONS
	4.1	Demonstration Site Description
	4.2	Description of Sampling Locations
5.0	DEMONSTRATION APPROACH
	5.1	Demonstration Objectives
	5.2	Overview of Demonstration Samples
	5.3	Pre-Demonstration Study
	5.4	Demonstration Schedule
	5.5	Demonstration Design
	5.6	Assessment of Primary and Secondary Objectives
6.0	SAMPLE COLLECTION AND CHARACTERIZATION
	6.1	Sample Collection
	6.2	Homogenization of Environmental Samples
	6.3	Characterization of Environmental Samples
	6.4	Sample Handling, Sample Tracking, and Sample Management
7.0	REFERENCE LABORATORY AND METHOD(S)
	7.1	Reference Method Selection
	7.2	Reference Laboratory Selection
	7.3	Reference Laboratory Sample Preparation and Analytical Methods
8.0	DATA MANAGEMENT
	8.1	Data Reduction
	8.2	Data Review
		8.2.1	Data Review by Developers
		8.2.2	Data Review by Reference Laboratory
		8.2.3	Data Review by ITO (if applicable)
	8.3	Data Reporting
		8.3.1	Developer Data Packages
		8.3.2	Reference Laboratory Data Packages
		8.3.3	Innovative Technology Verification Reports
	8.4	Data Evaluation Report
9.0	QA/QC PROCEDURES
	9.1	QA/QC Objectives
	9.2	Internal QC Checks
		9.2.1	Reference Method QC Checks
		9.2.2	Developer Technology QC Checks
	9.3	Audits, Corrective Actions, and QA Reports
		9.3.1	Technical Systems Audits
		9.3.2	Corrective Action Procedures
		9.3.3	QA Reports
10.0	HEALTH AND SAFETY PLAN
11.0	REFERENCES
APPENDICES (procedures; checklists; lists, etc., as required by the plan)
Figure 2-2. Table of Contents from a Typical Technology Demonstration Plan
Chapter 3
Elements of a Demonstration/Quality Assurance Project Plan
Title Page
The title page should include the name of the technology category (or the technology itself if only
one technology is being demonstrated) and the authors responsible for development of the
D/QAPP.
Concurrence Signatures
This is the page where approval signatures are documented. It is important to note that the
completed D/QAPP must be approved by the appropriate people prior to implementation and
use. Who approves the D/QAPP will be dependent upon the circumstances surrounding the
demonstration, but written approval by the technology developer should always be required.
Other possible signatory approvals include EPA, ITO technical lead, QA staff, Health and Safety
staff, reference laboratory personnel, and other key site personnel, as appropriate.
Notice
Notices (e.g., disclaimers) are part of all EPA publications. This may not be required if the
D/QAPP is not being prepared/reviewed in conjunction with EPA.
Abstract
The abstract should describe the demonstration in less than one page. Include a summary description of the primary and secondary objectives that will be verified during the demonstration, the demonstration site(s), the schedule, and a list of participants.
Table of Contents
An example Table of Contents is provided as Figure 2-2. The Table of Contents should include
the headings provided in this manual, although they may be modified as appropriate for a
particular technology demonstration.
Abbreviations and Acronyms
A list of the abbreviations and acronyms used in the D/QAPP should be provided.
Acknowledgements
This section should recognize those people who are not authors of the D/QAPP but who
contributed to its development. Examples of people to include in the Acknowledgements are
technical support personnel, sample collection personnel, test site hosts, and reference
laboratory staff.
1.0 INTRODUCTION
This chapter describes the program under which the demonstration is being conducted, the scope
of the demonstration, and pertinent information on the purpose of the demonstration. The
following is an example as if the demonstration were being conducted under the SITE MMT
Program.
The U.S. Environmental Protection Agency (EPA), Office of Research and Development (ORD),
National Exposure Research Laboratory (NERL) has contracted with ITO to conduct a
demonstration of monitoring and measurement technologies for contaminants of interest in
environmental matrix to be tested. The demonstration is being conducted as part of the EPA
Superfund Innovative Technology Evaluation (SITE) Monitoring and Measurement Technology
(MMT) Program from date to date, in City, State. The purpose of this demonstration is to
obtain reliable performance and cost data on the participating technologies in order to provide (1)
potential users with a better understanding of the technologies' performance and operating costs
under well-defined field conditions and (2) the technology developers with documented results
that will help promote the acceptance and use of their technologies.
This demonstration plan describes the procedures that will be used to verify the performance of
each measurement technology. The plan also incorporates a site health and safety plan and the
quality assurance and quality control (QA/QC) elements needed to ensure that data of sufficient
quality are generated to document each technology's performance. This plan has been prepared
using, "A Guidance Manual for the Preparation of Site Characterization and Monitoring
Technology Demonstration Plans."
This demonstration plan describes the name of testing program, the scope of the demonstration,
and other pertinent information on the purpose of the demonstration, such as descriptions or
definitions of the problem being addressed by the technology(ies) (Chapter 1); the demonstration
organization and responsibilities of the participants (Chapter 2); the number of technologies that
will be demonstrated (Chapter 3); the demonstration site and the sampling locations (Chapter 4); the demonstration approach, including the objectives, experimental design, data analysis procedures, and the demonstration schedule (Chapter 5); sample collection, sample handling procedures, and other sample preparation procedures that might be unique to this demonstration, such as sample homogenization (Chapter 6); the confirmatory process, including the reference
methods and the reference laboratory that will be used during the demonstration (Chapter 7); the
data management procedures (Chapter 8); the QA/QC procedures (Chapter 9); the health and
safety plan (Chapter 10); and references (Chapter 11).
1.1 Description of Testing Program
The following is an example as if the demonstration were being conducted under the SITE MMT Program.
Performance verification of innovative environmental technologies is an integral part of the
regulatory and research mission of EPA. The SITE Program was established by the EPA Office
of Solid Waste and Emergency Response and ORD under the Superfund Amendments and
Reauthorization Act of 1986. The overall goal of the Program is to conduct performance
verification studies and to promote the acceptance by the user and regulatory community of
innovative technologies that may be used to achieve long-term protection of human health and
the environment. The Program is designed to meet three primary objectives: (1) identify and
remove obstacles to the development and commercial use of innovative technologies, (2)
demonstrate promising innovative technologies and gather reliable performance and cost
information to support site characterization and cleanup activities, and (3) develop procedures
and policies that encourage use of innovative technologies at Superfund sites as well as at other
waste sites or commercial facilities.
The demonstration of monitoring and measurement technologies for compound(s) of interest is
being conducted as part of the MMT Program, which provides developers of innovative
sampling, monitoring, and measurement technologies with an opportunity to demonstrate their
technology's performance under actual field conditions (where appropriate). These technologies
may be used to sample, detect, monitor, or measure hazardous and toxic substances in water,
soil, soil gas, and sediment. The technologies include chemical sensors for in situ measurements; groundwater, soil, and sediment samplers; field-portable analytical equipment; and other systems that support field sampling and analysis.
The MMT Program promotes acceptance of technologies that can be used to (1) accurately
assess the degree of contamination at a site, (2) provide data to evaluate potential effects on
human health and the environment, (3) apply data to assist in selecting the most appropriate
cleanup action, and (4) monitor the effectiveness of a remediation or mitigation process. The
Program places a high priority on innovative technologies that provide more cost-effective,
faster, or safer methods for producing real-time or near-real-time data than conventional,
laboratory-based technologies. These innovative technologies are demonstrated under field
conditions, and the results are compiled, evaluated, published, and disseminated by the ORD.
The MMT Program's technology verification process is designed to conduct demonstrations that
will generate high-quality data so that potential users have reliable information regarding the
technology performance and cost. Four steps are inherent in the process: (1) needs identification
and technology selection, (2) demonstration planning and implementation, (3) report preparation,
and (4) information distribution. The first step of the technology verification process begins with
identifying technology needs of the EPA and regulated community. The EPA Regional offices,
the U.S. Department of Energy, the U.S. Department of Defense, industry, and state
environmental regulatory agencies are asked to identify technology needs for sampling,
measurement, and monitoring of environmental media. Once a need is identified, a search is
conducted to identify suitable technologies that will address the need. The technology search and
identification process consists of examining industry and trade publications, attending related
conferences, exploring leads from technology developers and industry experts, and reviewing
responses to announcements of the demonstration.
The second step of the technology verification process is to plan and implement a demonstration
that will generate representative, high-quality data to assist potential users in selecting a
technology. Demonstration planning activities include a pre-demonstration sampling and
analysis investigation that assesses existing conditions at the proposed demonstration site or
sites. The objectives of the pre-demonstration investigation are to (1) provide an initial
assessment to the technology developer as to the potential performance of the technology
without going to the expense of a full-blown verification test; (2) confirm available information
on applicable physical, chemical, and biological characteristics of contaminated media at the
sites to justify selection of site areas for the demonstration; (3) provide the technology
developers with an opportunity to evaluate the areas, analyze representative samples, and
identify logistical requirements; (4) assess the overall logistical requirements for conducting the
demonstration; and (5) select and provide the reference laboratory involved with an opportunity
to identify any matrix-specific analytical problems associated with the contaminated media and
to propose appropriate solutions. Information generated through the pre-demonstration
investigation is used to develop the final demonstration design and to confirm the nature and
source of samples that will be used in the demonstration.
Demonstration planning activities also include preparation of a demonstration plan that describes
the procedures to verify the performance and cost of each technology. The demonstration plan
incorporates information generated during the pre-demonstration investigation as well as input
from technology developers, demonstration site representatives, and technical peer reviewers.
The demonstration plan also incorporates the QA/QC elements needed to produce data of
sufficient quality to document the performance and cost of each technology.
During the demonstration, each technology is evaluated independently and, when possible and
appropriate, is compared to a reference technology. The performance and cost of one technology
are not compared to those of another technology evaluated in the demonstration. Rather,
demonstration data are used to evaluate the performance, cost, advantages, limitations, and field
applicability of each technology.
As part of the third step of the technology verification process, EPA publishes a detailed
evaluation in an innovative technology verification report (ITVR) for each participating
technology. The participating technologies are not compared directly to each other; where possible, each is compared only to a reference method's results, since each technology is typically targeted for different
needs. In addition, it was not the purpose of the SITE Program to choose a winner or endorse a
particular technology, but to provide information on each technology leading to an informed
decision. To ensure its quality, the ITVR is published only after comments from the technology
developer and external peer reviewers are satisfactorily addressed.
All demonstration data used to evaluate each technology are summarized in a data evaluation
report (DER) that constitutes a complete record of the demonstration. The DER documents the
underlying quality of the demonstration and contains much more detailed information than the
ITVR, including items such as certificates of analysis, completed chain-of-custody forms, and
raw data results, which are not appropriate to include in the ITVR. The DER is not published as
an EPA document, but an unpublished copy may be obtained from the EPA project manager.
The fourth step of the verification process is to distribute demonstration information. To benefit
technology developers and potential technology users, EPA distributes fact sheets, newsletters,
brochures, bulletins and ITVRs through direct mailings, at conferences, and on the Internet.
Information on the SITE Program, including the publication of all D/QAPPs and ITVRs, is
available on the EPA ORD Web site (http://www.epa.gov/ORD/SITE). Additionally, a Visitor's
Day is held in conjunction with the demonstration so that potential users can have a first-hand
look at the technologies in operation.
1.2	Purpose and Scope of Demonstration
Describe the intent of the demonstration. Note how many technologies will participate in the
demonstration, and where and when the demonstration will take place. Any other pertinent
information to the scope of the demonstration can also be noted in this section.
1.3	Background of the Problem
Provide a brief background of the problem that the technology(ies) being tested are designed to address.
For example, if the demonstration is designed to evaluate a technology's ability to detect an
organic contaminant in soil, then provide background information on the contaminant, the
different names and categorization strategies for these compounds, and other relevant
information that will be needed by the reader to fully understand why this demonstration should
be conducted. This description should include relevancy to EPA and/or state methods and
regulations, as appropriate.
1.4	Sources of Contaminant(s) of Interest
This section should briefly describe the sources of target analytes or contaminants being
analyzed in the demonstration. The description might include how the test sites became
contaminated.
1.5 Traditional Measurement Methods
Describe the traditional measurement methods used to analyze the compounds of interest in this
demonstration. If there are multiple traditional methods being used, provide details on each
method in individual subsections (i.e., 1.5.1, 1.5.2, etc.). Provide information on specifics such
as the calibration range of the technique, general sample sizes, and final sample volume, as
applicable. Also, in the case of multiple methods, note which traditional method was chosen as
the reference method for the demonstration and the rationale for the decision.
2.0	DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION
This chapter identifies key project personnel and summarizes their responsibilities in planning
and executing the demonstration. Figure 2-X is an organization chart that shows key project
personnel and the lines of communication among them. Table 2-X presents the key
demonstration participants. During the demonstration, the participants will be asked to follow the
health and safety procedures outlined in Chapter 10. However, each organization is directly and
fully responsible for the health and safety of its own employees.
Provide an organizational chart (Figure 2-X) on the next full page that identifies all
participating parties, the key personnel, and their connections. See Appendix A, Figure A-1 for
an example organization chart from a SITE MMT demonstration. On the following page, provide
a table (Table 2-X) that provides the name of each organization involved in the demonstration,
the point of contact for that organization, and the contact information for the point of contact.
Each category (organization, point of contact, and contact information) should be a separate
column. The contact information should include an address, telephone number, fax number, and
e-mail address. Multiple points of contact may be listed in the table, but contact information
should only be provided for the lead individual. See Appendix A, Table A-1 for an example of a demonstration participants table from a SITE MMT demonstration.
2.1	Developer Personnel
The responsibilities of the developer will vary depending on the type of demonstration. The
following example assumes that an ITO will be involved.
The developers of the number technologies (or developer) are responsible for providing,
mobilizing, operating, and demobilizing their respective technologies at the demonstration site.
The developer responsibilities include the following:
•	Provide ITO with information on the technology.
•	Review and concur with the D/QAPP.
•	Notify ITO in writing of technology-specific requirements, such as the type of power
supply and the amount of work space needed, so that proper arrangements can be
made for field demonstration of the technologies.
•	Provide the personnel and all supplies needed for demonstration of the technologies
unless otherwise arranged in advance with ITO.
•	Analyze the samples specified in the D/QAPP.
•	Analyze developer-specified QC samples (for example, blanks or standards) in
accordance with the technology specifications.
•	Provide technology-specific demonstration results to ITO at the end of the
demonstration.
•	Review and comment on the technology-specific ITVRs.
•	Conduct all activities in accordance with the schedule to ensure timely completion of
the final report.
2.2	EPA Project Personnel (if applicable)
The EPA program manager, name of EPA project manager, has overall responsibility for the
project. Name of EPA project manager will review and concur with the project deliverables,
including the demonstration plan, ITVRs, and DER. The EPA QA officer at the EPA NERL,
name of EPA QA officer, is responsible for reviewing and concurring with the D/QAPP. The
roles for EPA in this demonstration include:
•	Review and approve the D/QAPP.
•	Review and approve the DER and ITVRs.
•	Be present at the demonstration.
•	Participate in Visitor's Day.
•	Coordinate activities with the ITO project manager.
2.3	Independent Testing Organization Personnel (if applicable)
The ITO project manager, name of ITO project manager, is responsible for conducting day-to-day management of ITO project personnel, maintaining direct communication with the
developers (and EPA, where appropriate), and ensuring that all ITO personnel involved in the
demonstration understand and comply with the D/QAPP. Name of ITO project manager is also
responsible for distributing the draft and final D/QAPPs to all key project personnel and for
reviewing measurement and analytical data obtained during the demonstration. ITO project
personnel will assist name of ITO project manager in preparing project deliverables and in
performing day-to-day project activities. ITO project personnel are responsible for the
following elements of the demonstration:
•	Developing and implementing all elements of this D/QAPP.
•	Scheduling and coordinating the activities of all demonstration participants.
•	Coordinating the collection of samples; performing sample homogenization; performing characterization analyses for compounds of interest; and aliquoting samples.
•	Coordinating activities with suppliers of certified samples.
•	Developing and maintaining a sample control process and distributing samples during
the demonstration.
•	Auditing the reference laboratory (name of reference laboratory) to verify that the
operations are properly performed.
•	Overseeing the operation of the developer technologies and documenting the
operation of each technology during the demonstration.
•	Summarizing, evaluating, interpreting, and documenting demonstration data for
inclusion in the ITVRs and DER.
•	Evaluating and reporting on the performance and cost of each technology.
•	Preparing draft and final versions of ITVRs (one for each technology).
•	Preparing draft and final versions of the DER, consistent with the format and content
of historical documents.
•	Coordinating meetings among demonstration participants.
•	Providing required planning, scheduling, cost control, documentation, and data
management for field activities.
•	Managing demobilization activities, including proper waste disposal.
•	Immediately communicating any deviation from the demonstration plan during field
activities to the EPA program manager and discussing appropriate resolutions of
the deviation.
•	Interfacing with the demonstration site representatives and making logistical
preparations for the demonstration.
Tasks for specific ITO staff will include:
Provide bulleted paragraphs for each of the key personnel from the ITO stating their specific
responsibilities for the demonstration.
2.4	Demonstration Site Representatives
Name the representatives for the demonstration site and their affiliation. Identify the
responsibilities of each site representative.
2.5	Reference Laboratory Personnel
Identify the reference laboratory that will be performing the reference analyses. Provide the
names of key laboratory personnel that will participate in the reference analyses. Also, briefly
describe the responsibilities of the key staff.
2.6	Suppliers of Performance Evaluation Samples
Provide any relevant information on the PE samples in this section by adding to the provided
language.
The performance evaluation (PE) samples will be supplied from various sources (see Section
6.2.X). This will include purchasing standard reference materials and preparation of spiked
samples. All activities, including purchasing standard reference materials and spiked sample
preparation, will be conducted under the direct supervision of the ITO project manager.
3.0	DEVELOPER TECHNOLOGY DESCRIPTION(S)
This chapter contains technology descriptions for each of the number technologies that are
participating in the demonstration. This information was provided by the developer(s), with only editorial changes made by the ITO to ensure consistency and meet the needs of this document. The
technology description, operating procedure, and advantages and limitations presented below are
based on information provided by the developer(s).
3.1	Technology Name
Provide technology descriptions, supplied by the developer, for each technology that participates
in the demonstration. There should be a separate section (i.e., 3.1, 3.2, etc.) for each technology. Provide a subsection for a technology description (i.e., 3.1.1 Technology Description), one for the technology operation (i.e., 3.1.2 Operating Procedure), and one to discuss the advantages
and limitations of the technology (3.1.3 Advantages and Limitations).
3.1.1 Technology Description
The technology description should include:
•	A brief introduction and discussion of the scientific principles on which the
technology is based before the Technology Description section.
•	A brief description of the physical construction/components of the technology.
Include general environmental requirements and limitations, size, weight,
transportability, ruggedness, power, and other consumables needed, etc.
•	Identify the parameters or analytes the technology is designed to measure.
•	Identify the matrices for which the technology is applicable, e.g., soil, water, sludge,
etc.
•	Cost of the technology (purchase or lease and typical operational costs).
•	Typical operator training requirements and sample handling or preparation
requirements.
•	Define the performance range of the technology and verification requirements of the
demonstration.
•	Identify any special licensing requirements associated with the operation of the
technology (for example, a technology that contains a radioactive source).
•	Provide a picture of each technology and its associated components.
3.1.2	Operating Procedure
Provide detailed steps to perform an analysis using the technology.
3.1.3	Advantages and Limitations
Describe the applications of the technology and what advantages it provides over existing
technology. Provide comparisons in such areas as: initial cost, cost per analysis, speed of
analysis, precision and accuracy of the data, usable or linear operating range, field versus
laboratory operation, solvent use, durability, potential for waste minimization, etc.
Discuss the known limitations of the technology. Include such items as detection limits in various
matrices (as appropriate), interferences, environmental limits (temperature, vibration, light,
dust, power requirements, water needs, etc.), upper concentration limits, linear range, operator training and experience requirements, etc.
4.0	DESCRIPTIONS OF DEMONSTRATION SITE AND SAMPLING LOCATIONS
This chapter describes the demonstration site and the sampling locations and why each was
selected.
The technology(ies) should be tested under different geologic, climatologic, and waste
environments. The technology(ies) can be demonstrated at more than one site, if resources are
available to support multiple demonstration sites. An alternative would be to conduct the
demonstration at only one site and bring in samples from various other sites.1 Information on the
site history and site characteristics should be available through the ITO or EPA contact unless
the developer is making its own arrangements for the demonstration sites.
4.1	Demonstration Site Description
This section describes the site selected for hosting the demonstration, along with the selection
rationale and criteria. The candidate sites were required to meet certain selection criteria,
including necessary approvals, support, and access to the demonstration site; enough space and
power to host the technology developers, ITO, and other participants; and various levels of
analyte of interest-contaminated soil and/or sediment that could be analyzed as part of the
demonstration. Historically, these demonstrations are conducted at sites known to be
contaminated with the analytes of interest. The visibility afforded the sites is a valuable way of
keeping the local community informed of new technologies.
Provide the demonstration site name(s) and location(s); where appropriate, area and location
maps should be included. Be sure to include information on the site history. Include history of
ownership and uses, especially information relating to the contamination found at the site.
Provide summarized reasons as to why the site was selected. This description should include a
geological description of the site, including soil types, etc. Provide a list of the known
contaminants at the site, including the distribution and estimated concentrations.
4.2 Description of Sampling Locations
This section provides an overview of the number sampling sites and methods of selection. Table
4-X summarizes each of the locations, what type of sample was provided, and the number of
samples from each location. Describe why and how the number of sampling locations were
selected. It should be noted that it is not an objective of the demonstration to characterize the
concentration of analytes of interest in material from a specific sampling location at a particular
contaminated site. Because the samples are homogenized, they may not be representative of
actual site conditions. It is, however, necessary to ensure comparability between technology
results and the reference laboratory results, which is why the samples are homogenized. State
how the samples will be homogenized. An example of homogenization procedures can be found in SITE D/QAPPs.9
Provide a subsection for each sampling location (e.g., 4.2.1, 4.2.2, etc.). Provide descriptions of
the sampling site as provided by the site owners/sample providers. As appropriate, note that
information was provided by the site owners/sample providers, and only editorial changes were
made. The descriptions should include the sampling locations and how specific sampling
locations within the site were selected. Considerations would include such things as source of
contamination, analytes, concentration, matrix type, sampling depth, etc.
5.0	DEMONSTRATION APPROACH
This chapter presents the objectives, design, data analysis procedures, and schedule for this
technology demonstration. Guidance for demonstration plans is also available from other
government programs, such as the DoD's Environmental Security Technology Certification
Program (ESTCP).8 In addition, published test plans for assessing the performance of
environmental monitoring technologies generated under EPA technology evaluation programs,
such as the SITE MMT Program,3 Technology Testing and Evaluation Program (TTEP),7 and
the Environmental Technology Verification (ETV) Program5 are valuable resources when
planning a demonstration.
5.1	Demonstration Objectives
The primary goal of the SITE MMT Program is to develop reliable performance and cost data on
innovative, commercial-ready technologies. A SITE demonstration must provide detailed and
reliable performance and cost data so that technology users have adequate information to make
sound judgments regarding comparability to conventional methods. The demonstration has both
primary and secondary objectives. Primary objectives are critical to the technology evaluation
and require the use of quantitative results to draw conclusions regarding a technology's
performance. Secondary objectives pertain to information that is useful but will not necessarily
require the use of quantitative results to draw conclusions regarding a technology's performance.
Each report will summarize the findings of these objectives and provide sufficient documentation
for a user to choose an alternative to conventional technology.
The primary objectives for the demonstration of the participating technologies are as follows:
P1. Determine the accuracy.
P2. Determine the precision.
P3. Determine the comparability of the technology to EPA standard methods.
P4. Determine the method detection limit (MDL).
P5. Evaluate the impact of matrix effects on technology performance.
P6. Estimate costs associated with the operation of the technology.
The primary objectives should at least include the six listed above, provided that they are
appropriate for the technology(ies) being tested. Other primary objectives can be added, such as
false positives/false negatives, depending on the technology data and output.
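The measures behind primary objectives P1 through P4 are the conventional statistics suggested by this manual's acronym list: percent recovery against a certified performance evaluation (PE) value for accuracy, relative standard deviation (RSD) for precision, relative percent difference (RPD) from the reference method for comparability, and a Student's-t-based method detection limit (MDL). The short Python sketch below is illustrative only; the replicate values are hypothetical, and the calculations actually applied in a demonstration are those defined in the technology-specific D/QAPP (see Section 5.6).

from __future__ import annotations

from statistics import mean, stdev


def percent_recovery(measured: float, certified: float) -> float:
    """Accuracy (P1): result expressed as a percentage of a certified PE value."""
    return 100.0 * measured / certified


def relative_standard_deviation(replicates: list[float]) -> float:
    """Precision (P2): RSD (%) of replicate measurements of the same sample."""
    return 100.0 * stdev(replicates) / mean(replicates)


def relative_percent_difference(tech_result: float, reference_result: float) -> float:
    """Comparability (P3): RPD (%) between a technology result and the reference result."""
    return 100.0 * abs(tech_result - reference_result) / ((tech_result + reference_result) / 2.0)


def method_detection_limit(low_level_replicates: list[float], t_value: float = 3.143) -> float:
    """MDL (P4): Student's t (99 percent, n-1 degrees of freedom) multiplied by the
    standard deviation of low-level spike replicates; 3.143 corresponds to the
    common seven-replicate design."""
    return t_value * stdev(low_level_replicates)


if __name__ == "__main__":
    # Hypothetical replicate results (mg/kg) for a PE sample certified at 50 mg/kg.
    replicates = [47.2, 49.1, 51.5, 48.8, 50.3, 46.9, 49.6]
    print(f"Recovery: {percent_recovery(mean(replicates), 50.0):.1f}%")
    print(f"RSD: {relative_standard_deviation(replicates):.1f}%")
    print(f"RPD vs. a reference result of 48.5: {relative_percent_difference(mean(replicates), 48.5):.1f}%")
    print(f"MDL: {method_detection_limit(replicates):.2f} mg/kg")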
The secondary objectives for the demonstration of the participating technologies are as follows:
S1.	Document the skills and training required to properly operate the technology.
S2.	Document health and safety aspects associated with the technology.
S3.	Document the portability of the technology.
S4.	Evaluate sample throughput.
The secondary objectives should include those listed above, provided that they are appropriate
for the technology(ies) being tested. Others could be added if necessary.
The objectives for the demonstration were developed based on input from the [analyte of interest]
SITE Demonstration Panel members (if appropriate), general user expectations of field
measurement technologies, the time available to complete the demonstration, the technology
capabilities that the participating developers intend to highlight, and the historical experimental
components of former SITE Program demonstrations (to maintain consistency).
5.2 Overview of Demonstration Samples
The goal of the demonstration is to perform a detailed evaluation of the overall performance of
the technology for use in contaminated site evaluation. The demonstration objectives will be
centered on providing performance data that support action levels for contaminated sites.
Describe the action levels prescribed for the contaminant(s) of interest. Describe the different
sample types that will be used as part of this demonstration. Provide information on what test
parameters will be determined with each set of test samples. Provide a table that gives the
performance objective, the type of sample that will be evaluated for that objective, and the range
of concentrations (of the contaminant(s) of interest) that will be tested for that objective.
Provide a detailed description of each sample type that will be used in the demonstration in a
separate subsection. For example, provide information on PE samples in one subsection (5.2.1),
environmental samples in another (5.2.2), and extracts in another (5.2.3). For each sample type,
discuss details about each sample as appropriate, such as from where the samples will be
obtained (e.g., NIST or demonstration site locations), the organization responsible for handling
and/or analyzing the samples, brief descriptions of the analysis methodology, and, in the case of
PE samples, analysis guidelines.
5.3	Pre-Demonstration Study
The best way to predict and prevent problems from occurring during the demonstration is to
perform a "dry run" exercise. This was accomplished through a pre-demonstration study. The
pre-demonstration study served as a final readiness check for the developer so that modifications
could be made to their procedure if warranted by site-specific conditions. It was also a test of the
demonstration plan to ensure a well-established process of sampling, compositing,
homogenizing, splitting, extract preparation and aliquoting, and shipping of samples to the
developers and the reference laboratory. The pre-demonstration study consisted of [number]
samples, including [list the samples used in the pre-demonstration study]. A distribution of the
sample concentrations, as determined by the characterization analyses (see Section 4.3), is
presented in Figure 5-X. The samples selected for the pre-demonstration study covered a wide
range of concentrations and included a representative of each environmental site that will be
analyzed during the demonstration.
Briefly describe the overall design of the pre-demonstration study. The reference laboratory
should analyze all pre-demonstration samples blindly, and this should be noted in this section.
Describe how the data were collected and distributed by the ITO. Note that if an ITO is not
involved, it is still appropriate for the developer to perform a pre-demonstration study to confirm
that the technology is fully ready for verification testing.
5.4	Demonstration Schedule
Describe where the developer will analyze the demonstration samples. Discuss in detail the
schedule for the demonstration study. Indicate day-by-day what will happen. Because the
demonstration study is meant to simulate the use of the technology of interest in the field,
developers should analyze the samples at the site. Indicate how many and what type of samples
the developers are required to analyze in the field. Discuss what will happen if a developer is
unable to complete all analyses in the field. Provide a figure detailing the events and schedule.
Example demonstration schedules9 are provided in Appendix A in Table A-2 and Figure A-2.
5.5	Demonstration Design
Tables 5-X through 5-X provide a generic summary of the samples to be included in the
demonstration.
Describe how the samples will be identified for analysis, what samples will be tested (including
the use of any QC samples), how samples will be randomized and labeled, and how chain-of-
custody will be used to ensure the proper delivery of the samples. Include a brief explanation of
why the concentrations to be used are distributed as they are (for example, including more
samples around key regulatory decision levels). Also detail what (if any) sample information
will be provided with the samples (e.g., samples believed to be above a certain concentration
should be marked to alert the recipient of potential safety concerns). Discuss any sample
identification requests that the developers have made.
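As an illustration of the randomization and blind labeling described above, the following Python sketch generates a confidential sample key that pairs randomized blind labels with the original sample identifiers. The sample names, label format, and random seed are hypothetical placeholders, not part of any SITE MMT procedure.

import csv
import random

def assign_blind_labels(sample_ids, seed=20100301, prefix="DEMO"):
    """Assign randomized blind labels to a list of sample identifiers.

    Returns (blind_label, original_id) pairs; the pairing is the confidential
    sample key retained by the ITO, while only the blind labels accompany the
    samples shipped to the developers and the reference laboratory.
    """
    rng = random.Random(seed)            # fixed seed so the key can be regenerated if needed
    shuffled = list(sample_ids)
    rng.shuffle(shuffled)                # randomize the order in which samples are labeled
    return [(f"{prefix}-{i:03d}", original) for i, original in enumerate(shuffled, start=1)]

if __name__ == "__main__":
    # Hypothetical sample identifiers (PE samples and environmental replicates)
    samples = ["PE-LOW-1", "PE-LOW-2", "SITE1-SOIL-1", "SITE1-SOIL-2", "SITE2-SED-1"]
    key = assign_blind_labels(samples)
    with open("sample_key.csv", "w", newline="") as key_file:
        writer = csv.writer(key_file)
        writer.writerow(["blind_label", "original_id"])
        writer.writerows(key)            # the key file stays with the ITO, not the developers
    for blind_label, original in key:
        print(blind_label, original)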
Understanding the operational aspects of a technology is important for any end-user. To
accomplish this in a demonstration study, it is recommended that independent technical
observers (for example, from the ITO) be on location at the demonstration site to watch the
developers use each technology. Checklists can be provided to each observer to guide their
observations. An example checklist is provided in Appendix A. Any
such checklist used in the demonstration should be provided in a separate appendix in the report.
Describe the use of independent technical observers as part of the demonstration study.
Discuss any waste that might be generated and how it will be handled. Describe how long
testing will be allowed to continue on each day. If an ITO is involved, reiterate that the
developers will be operating their own technologies and state what equipment they are
responsible for bringing (e.g., all supplies and equipment necessary for operating their
technology, any needed personal protective equipment, etc.).
5.6	Assessment of Primary and Secondary Objectives
The purpose of this section is to discuss how each objective will be assessed. Each objective will
be discussed in detail in a separate subsection. Before beginning the first subsection, provide
details on the analysis by the reference laboratory. List what the reference laboratory will be
analyzing for and how (i.e., methods used). Also, discuss the QA/QC procedures employed by
the reference laboratory, including how non-detects and flags will be handled and implemented.
If useful, a table listing what the reference laboratory will report versus what each technology
will report could be included.
Primary Objectives generally include the following measures: accuracy, precision,
comparability, method detection limits, matrix effects, and technology costs. Secondary
Objectives generally include skills and training required to properly operate the instrument,
health and safety aspects, portability, and sample throughput. Other objectives, such as false
positives/false negatives, can be added based on the technology category. These objectives can
be presented in any order. Information is provided below for each objective. The text should be
expanded as necessary to discuss the particulars of a given technology category. The general
concepts should not change, but the parameters to be evaluated within a category can vary
depending upon the specific circumstances. In addition, appropriate consideration should be
given to additional or alternative statistical approaches. The text provided below should only
serve as an example.
5.6.1 Primary Objective PI: Accuracy
The determination of accuracy for each technology's measurements will be based on the extent
to which they agree with the certified or spiked levels of PE samples. For each technology, PE
samples containing concentrations from across the analytical range of interest will be analyzed.
The technology measurements from the [number] PE samples will be evaluated to determine
whether there is a statistically significant difference between the technology measurements and
the certified value or spiked level. Percent recovery values relative to the certified or spiked
concentrations will also be calculated. ITO (if appropriate) will evaluate whether a statistically
significant difference exists between a given technology's results and the reference values by
performing a two-tailed, paired, Student's t-test. The null hypothesis will be that the mean
difference between the technology results and the certified or spiked value is zero. The PE
samples will also be analyzed by the laboratory reference method for confirmation of certified
and spiked values.
To evaluate accuracy, the average of replicate results from the field technology measurement
will be compared to the certified or spiked value of the PE samples to calculate percent recovery.
The equation to be used will be:
R = (C / Cr) × 100
where C is the average concentration value calculated from the technology replicate
measurements and Cr is the certified value. For the spiked samples, if the reference laboratory's
average measured value is within 10% of the spiked concentration value, the spiked
concentration value will be used as the certified value. If the average measured value by the
reference laboratory is > 10% different, the reference laboratory's average measured value will
be the certified value.
Acceptable R values are between 75% and 125%.
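For illustration only, the Python sketch below computes the percent recovery and applies the two-tailed t-test described above to the differences between the technology replicates and the certified value (a one-sample test on the differences is equivalent to the paired test when the comparison value is a constant). The replicate results, certified concentration, and 0.05 significance level are hypothetical.

import numpy as np
from scipy import stats

def percent_recovery(replicates, certified_value):
    """%R = (average of the technology replicate results / certified or spiked value) × 100."""
    return np.mean(replicates) / certified_value * 100.0

def accuracy_t_test(replicates, certified_value, alpha=0.05):
    """Two-tailed test of H0: mean(technology result - certified value) = 0."""
    differences = np.asarray(replicates, dtype=float) - certified_value
    t_stat, p_value = stats.ttest_1samp(differences, 0.0)
    return t_stat, p_value, p_value < alpha

if __name__ == "__main__":
    # Hypothetical PE sample certified at 50 ng/kg and four technology replicates
    certified = 50.0
    technology_results = [47.2, 51.8, 49.5, 46.9]
    recovery = percent_recovery(technology_results, certified)
    t_stat, p_value, significant = accuracy_t_test(technology_results, certified)
    print(f"%R = {recovery:.1f}% (acceptable range: 75% to 125%)")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}, statistically significant difference: {significant}")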
It is possible that PE samples will not be commercially available for the contaminants of interest.
If such is the case, PE materials could be prepared by a reputable source. Alternatively,
accuracy could be measured relative to reference laboratory measurements rather than to
certified concentrations if such samples are not available and/or appropriate for the technology
being tested.
5.6.2 Primary Objective P2: Precision
A technology's precision refers to its reproducibility. Higher precision leads to less uncertainty
in the results. To evaluate each technology's precision, all samples, both environmental and PE,
will be analyzed in at least triplicate, with quadruplicate preferred. Replication is necessary
because precision will be evaluated at both low and high concentration levels, and across
different matrices. The statistic used to evaluate precision is relative standard deviation (RSD).
The equation used to calculate the standard deviation (SD) of the replicate measurements will be:

SD = [ Σ (Ck − C)² / (n − 1) ]^(1/2), with the sum taken over k = 1 to n

where n is the number of replicate measurements, Ck is the kth replicate measurement, and C is
the average of the replicate measurements.
The equation used to calculate the RSD of the replicate measurements will be:

RSD = (SD / C) × 100
Low RSD values (< 20%) indicate high precision. For a given set of replicate samples, the RSD
of a given technology's results will be compared with that of the laboratory reference method's
results to determine whether the reference method is more precise than the technology or vice
versa for a particular sample set.
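A minimal sketch of the SD and RSD calculations defined above is given below; the replicate values are hypothetical, and the sample standard deviation (ddof=1) corresponds to the n − 1 denominator in the SD equation.

import numpy as np

def relative_standard_deviation(replicates):
    """RSD (%) = (sample standard deviation / average) × 100 for one set of replicates."""
    values = np.asarray(replicates, dtype=float)
    sd = values.std(ddof=1)              # n - 1 in the denominator, as in the SD equation
    return sd / values.mean() * 100.0

if __name__ == "__main__":
    # Hypothetical replicate results for one sample analyzed by the technology
    # and by the laboratory reference method
    technology_replicates = [12.1, 11.4, 13.0, 12.6]
    reference_replicates = [11.8, 12.0, 12.3, 11.9]
    rsd_technology = relative_standard_deviation(technology_replicates)
    rsd_reference = relative_standard_deviation(reference_replicates)
    print(f"Technology RSD = {rsd_technology:.1f}%; reference method RSD = {rsd_reference:.1f}%")
    if rsd_technology < 20.0:
        print("Technology RSD meets the < 20% precision objective")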
Homogeneity of the sample concentrations provided to the developers is an important factor to
consider when evaluating a technology's precision.
5.6.3 Primary Objective P3: Comparability
A third primary performance objective is comparability, i.e., the degree of agreement between
each technology and reference laboratory results. For comparability, ITO will evaluate whether a
statistically significant difference exists between the measurements provided by a given
technology and the laboratory reference method by performing a two-tailed, paired, Student's t-
test. If the data are found to be non-normally distributed, a nonparametric Wilcoxon signed-rank
test will be performed to determine if the two sets of results are statistically the same or different.
Technology results will also be compared to the corresponding reference laboratory results by
calculating a relative percent difference (RPD) for the average of each paired and replicate
measurement. The equation for RPD is as follows:
RPD =	(M*~Md)
averageyMR ,MD)
where Mr is the reference laboratory measurement and MD is the developer measurement. RPD
values between ฑ 25% will indicate good agreement between the two measurements. Because the
absolute value will not be taken, negative RPD values would indicate that the technology
measurements were less than the reference laboratory measurements. As such, the median RPD
value will be calculated (rather than the average RPD where the negative and positive values
would be neutralized) to provide a summary calculation of comparability between each
technology's results and reference laboratory measurements.
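The comparability statistics described above can be computed as in the following sketch, which applies the paired t-test (and the Wilcoxon signed-rank test as the nonparametric alternative) to paired technology and reference laboratory averages and reports the median signed RPD; the paired values are hypothetical.

import numpy as np
from scipy import stats

def signed_rpd(reference, developer):
    """Signed RPD (%) = (MR - MD) / average(MR, MD) × 100 for each sample pair."""
    mr = np.asarray(reference, dtype=float)
    md = np.asarray(developer, dtype=float)
    return (mr - md) / ((mr + md) / 2.0) * 100.0

if __name__ == "__main__":
    # Hypothetical paired averages (reference laboratory vs. technology) for five samples
    reference_results = [10.2, 25.6, 48.1, 75.3, 120.4]
    developer_results = [11.0, 24.1, 50.2, 70.8, 131.5]

    t_stat, t_p = stats.ttest_rel(reference_results, developer_results)   # two-tailed paired t-test
    w_stat, w_p = stats.wilcoxon(reference_results, developer_results)    # nonparametric alternative
    rpd_values = signed_rpd(reference_results, developer_results)

    print(f"Paired t-test: t = {t_stat:.2f}, p = {t_p:.3f}")
    print(f"Wilcoxon signed-rank test: W = {w_stat:.1f}, p = {w_p:.3f}")
    print(f"Median RPD = {np.median(rpd_values):.1f}% (good agreement within ± 25%)")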
The types of comparability assessments to be performed should be appropriate for the
technology demonstration, although the approach described should be applicable to most site
characterization and monitoring technologies. Other methodologies, such as linear correlation, can
also be used. In addition, it may be appropriate to evaluate the comparability of the technologies
to the reference method on a semi-quantitative basis (such as using performance intervals) if it is
anticipated that the technology being tested and the reference method do not generate results
that are directly comparable.10
5.6.4	Primary Objective P4: Method Detection Limit
A fourth primary performance objective is to determine the MDL for each technology. To
determine the MDLs, the developer will analyze seven aliquots of a low-level PE or
environmental sample. The concentration of the samples will be dependent on the detection
capability of each technology but will ideally be three to five times the reporting limit for each
technology. ITO will use these data to calculate an MDL for each technology.
The MDL calculation procedure11 involves use of the Student's t-value and standard deviation to
calculate the MDL for each technology in soil and sediment as shown in the following equation:
MDL = t(n−1, 1−α = 0.99) × SD

where t(n−1, 1−α = 0.99) is the Student's t-value appropriate for a 99 percent confidence level and
a standard deviation estimate with n−1 degrees of freedom, and SD is the standard deviation of the
replicate measurements.
If data are not obtained from all seven replicates, an "estimated" MDL can be calculated with the
data that are available.
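A sketch of the MDL calculation, following the procedure cited above: the standard deviation of the (ideally seven) low-level replicate results is multiplied by the Student's t-value at the 99 percent confidence level with n − 1 degrees of freedom. The replicate results are hypothetical.

import numpy as np
from scipy import stats

def method_detection_limit(low_level_replicates, confidence=0.99):
    """MDL = t(n-1, 1-alpha=0.99) × SD of the low-level replicate measurements."""
    values = np.asarray(low_level_replicates, dtype=float)
    t_value = stats.t.ppf(confidence, df=values.size - 1)   # one-sided 99% Student's t-value
    return t_value * values.std(ddof=1)

if __name__ == "__main__":
    # Hypothetical results (ng/kg) from seven aliquots of a low-level PE sample
    replicates = [2.1, 1.8, 2.4, 2.0, 1.9, 2.3, 2.2]
    print(f"MDL = {method_detection_limit(replicates):.2f} ng/kg "
          f"(n = {len(replicates)}, t = {stats.t.ppf(0.99, df=len(replicates) - 1):.3f})")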
5.6.5	Primary Objective P5: Matrix Effects
The likelihood of matrix-dependent effects on performance will be investigated by evaluating the
data sets in multiple ways. This will include evaluating samples from the [number] different
environmental sampling locations individually and as a group to determine whether performance
differs between environmental samples and PE samples; grouping the data by matrix; assessing
performance with samples containing high levels of contaminants other than the analyte(s) of
interest; and comparing in-field versus laboratory-conducted measurements (where appropriate).
Discuss any further sample analysis or comparison that may occur to determine potential matrix
effects. These analyses will vary by technology category.
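One way to organize the matrix-effect evaluation described above is to group the paired results by matrix or sampling location and summarize a comparability statistic, such as the median RPD, within each group. The grouping labels and values in the following sketch are hypothetical.

from collections import defaultdict
from statistics import median

def median_rpd_by_group(records):
    """Group (matrix, reference, developer) records and return the median signed RPD per group."""
    groups = defaultdict(list)
    for matrix, mr, md in records:
        rpd = (mr - md) / ((mr + md) / 2.0) * 100.0      # signed RPD for one sample pair
        groups[matrix].append(rpd)
    return {matrix: median(values) for matrix, values in groups.items()}

if __name__ == "__main__":
    # Hypothetical paired results tagged by matrix / sampling location
    data = [
        ("soil, site 1", 10.2, 11.0), ("soil, site 1", 25.6, 24.1),
        ("sediment, site 2", 48.1, 50.2), ("sediment, site 2", 75.3, 70.8),
        ("PE", 100.0, 96.5), ("PE", 50.0, 53.1),
    ]
    for matrix, med in median_rpd_by_group(data).items():
        print(f"{matrix}: median RPD = {med:.1f}%")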
5.6.6	Primary Objective P6: Technology Costs
Since conventional laboratory-based analytical methods for measuring analyte(s) of interest are
relatively costly, the cost of each field technology is an important evaluation factor. With input
from each technology developer, ITO will document the full cost of each technology and
compare those costs to typical and actual costs of conventional analytical methods for the analyte(s) of interest. At a
minimum, cost inputs will include equipment, consumable materials, mobilization and
demobilization, and labor.
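The minimum cost inputs listed above can be rolled up into a total and a cost per sample, as in this sketch; the cost categories follow the text, but the dollar figures and sample count are hypothetical placeholders.

def technology_cost_summary(equipment, consumables_per_sample, n_samples,
                            mobilization_demobilization, labor_hours, labor_rate):
    """Roll up the minimum cost inputs for one technology into a total and a cost per sample."""
    consumables = consumables_per_sample * n_samples
    labor = labor_hours * labor_rate
    total = equipment + consumables + mobilization_demobilization + labor
    return {
        "equipment": equipment,
        "consumable materials": consumables,
        "mobilization and demobilization": mobilization_demobilization,
        "labor": labor,
        "total": total,
        "cost per sample": total / n_samples,
    }

if __name__ == "__main__":
    # Hypothetical inputs for one field measurement technology
    summary = technology_cost_summary(equipment=15000.0, consumables_per_sample=35.0,
                                      n_samples=200, mobilization_demobilization=2500.0,
                                      labor_hours=80.0, labor_rate=65.0)
    for item, cost in summary.items():
        print(f"{item}: ${cost:,.2f}")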
5.6.7	Secondary Objective SI: Skills and Training Requirements
The operator should be trained to safely set up and operate the technology. The amount of
training required depends on the complexity of the technology. Most developers have
established standard training programs. The time required to complete the developer's training
program will be estimated.
If an observer from an ITO will be included in the demonstration, then language such as the
following should be included:
ITO observers will be assigned to each of the technologies. An example observer checklist is provided in Appendix A. These
notes and observations will help to assess the skill level required of the operator. The observers
will also determine the type of background and training required to properly operate the
technology. The evaluation of this secondary objective will also include how user-friendly the
technologies are. The developers will have the opportunity to review and comment on the
observer's notes before the observations are incorporated into the report to ensure accuracy.
5.6.8	Secondary Objective S2: Health and Safety
It is important to understand the health and safety aspects associated with each technology. This
will include health and safety issues when operating the technology as well as the amount and
type of hazardous and nonhazardous waste generated by the technology. Not included in the
evaluation are potential risks from exposure to site-specific hazardous materials or physical
safety hazards.
5.6.9	Secondary Objective S3: Technology Portability
This evaluation will document whether the technology can be readily transported to the field and
how easy it is to operate in the field. The size of the technology, including physical
dimensions and weight, will be recorded. The number of components, power requirements,
support structures, and reagent requirements will also be reported.
The durability and availability of the technology could also be included as a secondary objective
either with portability or as separate secondary objectives, if it is deemed appropriate for the
technology category.
5.6.10 Secondary Objective S4: Sample Throughput
Sample throughput is a calculation of the total number of samples that can be evaluated in a
specified time (i.e., generally a typical 8-hour work day, although a field demonstration work
day may exceed 8 hours). The primary factors that affect sample throughput include the time
required to prepare a sample for analysis, to conduct the analytical procedure for each sample,
and to process and tabulate the resulting data.
The start and end of sample throughput recording will depend on the operation of the
technology. State when sample throughput times will be collected and how often they will be
evaluated. If a technical observer is used in the demonstration, their notes could be used to
determine sample throughput.
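Once the observer's time records are available, sample throughput reduces to a simple rate calculation, as in the short sketch below; the sample count and hours are hypothetical.

def sample_throughput(samples_completed, hours_worked):
    """Return throughput as samples per hour and per standard 8-hour work day."""
    per_hour = samples_completed / hours_worked
    return per_hour, per_hour * 8.0

if __name__ == "__main__":
    # Hypothetical field day: 36 samples prepared, analyzed, and reported in 9.5 hours
    per_hour, per_day = sample_throughput(36, 9.5)
    print(f"{per_hour:.1f} samples/hour, or about {per_day:.0f} samples per 8-hour work day")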
6.0	SAMPLE COLLECTION AND CHARACTERIZATION
This chapter discusses the sample collection, sample preparation, and sample characterization
procedures used in the demonstration.
6.1	Sample Collection
This section describes the environmental sample collection activities performed at various sites
across the country.
Provide a brief introductory paragraph that provides an overview of the samples that will be
collected for the demonstration. Include information on who will collect the samples, where they
will be shipped or if analyses will be performed on-site immediately following sample collection,
pre-analytical holding time considerations (if applicable), sample preservation procedures (if
required), and estimated concentration ranges.
6.1.1	Procedure
This section describes the method that site personnel will use to collect the samples.
Describe the detailed sample collection procedures. Sufficient detail must be provided to direct
the step-by-step sample collection process. A summary can be provided in this section with
further details given in an appendix. Identify the specific collection tools, devices or containers,
and procedures; contamination prevention; and decontamination procedures.
6.1.2	Sample Shipping
Describe how the samples will be received (if they are being collected prior to testing) and how
they will be stored.
6.2	Sample Preparation
Describe the procedures that will be used to preserve or homogenize the sample. Provide details
on the equipment and containers to be used in the sample preparation process. Cite differences
between field analysis and requirements for reference laboratory analysis, if applicable. Justify
any differences between the standard method and the field sample preparation requirements. If
applicable, provide a separate subsection (e.g., 6.2.1) on the criteria employed to determine that
the samples were adequately prepared (i.e., preserved or homogenized).
6.3	Characterization of Environmental Samples
If applicable, provide a brief introductory paragraph describing the number of environmental
samples that will be characterized. Include a reference to the Sample Preparation section (6.2),
what the samples will be characterized for, what methods will be used for characterization, and
who will perform the analyses. Also, note the purpose of the characterization and criteria for its
success. Characterization may not be necessary or possible if analyses are being conducted on-
site as samples are being collected. However, if possible, it is advisable to collect soil and
sediment samples ahead of time so that homogenization and characterization of the samples can
be performed prior to use in the demonstration.
6.4	Sample Handling, Sample Tracking, and Sample Management
Describe the procedures used for sample handling, tracking, and management. This includes
detailing any chain-of-custody procedures, how samples are distributed for shipping, how the
samples are randomized, how the samples are distributed, how the samples will be stored and
archived, and how any sample by-products will be handled. If an ITO is involved, that
organization is responsible for sample distribution.
7.0	REFERENCE LABORATORY AND METHOD(S)
This chapter describes the process for the selection of the reference method and laboratory. Note
if the reference laboratory provided any method performance information presented in the
chapter.
7.1	Reference Method Selection
The reference analytical method should be chosen from standard methods approved by EPA or
another recognized body, such as ASTM International or AOAC International. The method
selected should generate data similar in quality and type to the data expected from the
technology being demonstrated. A justification for selecting a specific method must be provided.
Typically, SW-846 methods were used as reference methods for SITE MMT demonstrations since
these methods were closest in approach to innovative technologies that were being tested.
The selection process may identify a nonstandard method as providing the best data match.
Since many field technologies offer qualitative data (e.g., immunoassay techniques), rigorous
quantitative laboratory methods may make direct comparisons unreasonable. Some
modification of existing methods may be required to ensure that an appropriate method is used
for comparison. Alternatively, different approaches to data analyses may be implemented which
do not focus on direct comparison of the tested technologies and the conventional method. For
example, in the SITE MMT Dioxin Demonstration, in addition to a direct quantitative
comparison of the data, the assessment also involved whether the technology data and the
reference data fell into the same data intervals, which were based on decision action levels.10
7.2	Reference Laboratory Selection
Describe how the laboratory was chosen. This decision should be based on the experience of
prospective laboratories with QA procedures, reporting requirements, and data quality
parameters consistent with the goals of the program.
The laboratory must demonstrate past proficiency with the method selected and could be asked
to participate in a review of the experimental design. Laboratory management should be briefed
on the nature and purpose of the demonstration and may suggest enhancements to the proposed
procedure.
7.3	Reference Laboratory Sample Preparation and Analytical Methods
The purpose of this section is to describe the reference methods that will be used in the
demonstration sample analyses. This section briefly describes the procedures for instrument setup
and calibration for the selected methods. In addition, sample management procedures are also
discussed.
Discuss the reference method(s) that the laboratory will use for this demonstration. Describe them
in detail, including any modifications from the standard method that the reference laboratory
might have made. The information can be presented either by analyte of interest or by method used. Each
discussion (whether analyte or method) should be a separate subsection (e.g., 7.3.1, 7.3.2, etc.).
8.0	DATA MANAGEMENT
To ensure that the demonstration data are scientifically valid and defensible, appropriate
procedures will be used to perform data management. This chapter describes (1) data reduction,
(2) data review, (3) data reporting, and (4) data storage procedures for the demonstration.
8.1	Data Reduction
Each reference analytical method used in the demonstration and each developer technology's
instruction manual contains detailed instructions and equations for generating results. If an ITO
is involved, the developer will be responsible for reducing its own data and providing final
results to ITO in an agreed upon form. The reference laboratory will generate concentration data
for the analyte(s) of interest using reference method(s). The reference laboratory will generate
the data, and ITO will review those results using standard data validation procedures.
Comparisons between the developer and reference laboratory data will be dependent on how the
developer is reporting its data and if the results are intended to be directly comparable.
8.2 Data Review
A review of technology and laboratory analytical data will be conducted by each developer and
the reference laboratory, respectively. If appropriate, ITO will also conduct a review of all field
and laboratory data. The review processes that will be used for developer and laboratory
analytical data are described below.
8.2.1	Data Review by Developers
Each developer will review all results generated by its technology. The developer will review all
demonstration sample data as well as QC results for their technology. The developer will report
results to ITO. Provide information on any details that the developer will follow in presenting
data to the ITO, as appropriate. Also, describe any procedures that the ITO might use to
transcribe the developer's reviewed data, if appropriate.
8.2.2	Data Review by Reference Laboratory
Include information on the data review process for the reference laboratory. If this is a complex
process, this information can be provided in an appendix.
8.2.3	Data Review by ITO (if applicable)
In addition to the review process that will be used by the developers and reference laboratory, the
ITO project manager or designee will review all laboratory and developer results, based on
demonstration objectives. The ITO project manager or designee (such as the QA manager) will
also conduct a complete data validation for 100 percent of the data as an independent check of
the reference laboratory results. If this validation reveals no oversights or problems, ITO will
consider all data to be acceptable. If oversights or problems are identified, the reference
laboratory project manager will be consulted. If appropriate, the reference laboratory data will be
compared to the data generated by ITO during the sample characterization analyses. This will be
a key comparison that will confirm the overall quality of the data set. A checklist for performing
the data validations can be included in an appendix and referenced here; alternatively, brief
details on the ITO's data validation procedures can be provided here.
During its data review, ITO will identify project outlier data using statistical testing and will
report these data to the EPA program manager. Project outlier data are defined as sample data
outside specified acceptance limits that are established during the demonstration planning
process. For example, for data known or assumed to be normally distributed, the specified
acceptance limits could be the 95 percent confidence limits defined by the Student's two-tailed t-
test. Consistent procedures will be used to identify outliers for both reference laboratory and
developer data. No data will be rejected simply because they are statistical outliers, but data may
be reported with and without the statistical outliers as appropriate. ITO will conduct a thorough
check to identify the reasons for the outliers and will provide an explanation of why some data
appear to be outliers.
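For data known or assumed to be normally distributed, the outlier screen described above can be implemented by flagging results that fall outside the two-tailed 95 percent acceptance limits around the mean, as in this sketch; the values are hypothetical, and flagged results are reported for review rather than rejected.

import numpy as np
from scipy import stats

def flag_outliers(values, confidence=0.95):
    """Flag values outside mean ± t(n-1) × SD, the two-tailed acceptance limits."""
    data = np.asarray(values, dtype=float)
    mean, sd = data.mean(), data.std(ddof=1)
    t_critical = stats.t.ppf(1.0 - (1.0 - confidence) / 2.0, df=data.size - 1)
    lower, upper = mean - t_critical * sd, mean + t_critical * sd
    flags = [(value, not (lower <= value <= upper)) for value in data]
    return flags, (lower, upper)

if __name__ == "__main__":
    # Hypothetical replicate results containing one suspect value
    results = [48.2, 51.1, 49.7, 50.3, 63.9, 50.8]
    flags, (lower, upper) = flag_outliers(results)
    print(f"Acceptance limits: {lower:.1f} to {upper:.1f}")
    for value, is_outlier in flags:
        status = "potential outlier (report, do not reject automatically)" if is_outlier else "acceptable"
        print(f"{value:6.1f}  {status}")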
8.3 Data Reporting
Each developer and the reference laboratory will prepare and submit data packages reporting
their results. Both the reference laboratory and the developer should be required to produce both
hard copy and electronic data reports to avoid transcription errors. Described below are the data
reporting requirements for (1) developer data packages, (2) reference laboratory data packages,
(3) ITVRs, and (4) the DER.
8.3.1	Developer Data Package
The developers will compile their results on standard forms provided by ITO. An example form
should be provided as an appendix. The forms will contain sample identification numbers and
spaces for a developer to enter their results as appropriate (i.e., each form will be unique to each
developer). These forms can double as chain-of-custody forms to document sample transfer. The
developers will only be required to report their sample results for evaluation. Developer-supplied
QC sample results will be requested for the DER. Raw data, copies of logbook pages, standards
preparation logs, etc., that are included in a typical laboratory data package will not be required
from the developers. If the developers are completing the analysis of the demonstration samples
on-site, each developer will be expected to submit their complete results for the demonstration
samples before leaving the demonstration site to ensure the integrity of the developers' data.
8.3.2	Reference Laboratory Data Package
The reference laboratory will provide the data package to ITO in standard analytical data forms
and in electronic format. A specified procedure can be provided in an appendix.
8.3.3	Innovative Technology Verification Reports
In accordance with the demonstration plan, ITO will evaluate the performance and cost data
collected for each technology demonstrated and prepare an ITVR for the technology. Each
ITVR will be a focused report of about 100 pages and will include the following:
•	An introduction
•	A description of the technology
•	Site descriptions and the demonstration design
•	Deviations from the demonstration plan
•	A description of the reference method and its performance
•	A description of the technology's performance
•	A sample cost analysis
•	A summary of demonstration results.
The reports will be written in such a way that a reader with a basic science background can
understand their contents and make an informed decision regarding the performance of the
technologies. The ITVRs will undergo a rigorous review process that will include reviews by
external peer reviewers, project collaborators, and stakeholders. If the ITVR will be written by
ITO for EPA, the format will follow EPA guidance for reports (e.g., "Visual and Product
Standards Graphics Manual," EPA 600/R-07/054, July 2007) and project-specific guidance
from the EPA program manager.
8.4	Data Evaluation Report
The DER contains all of the detailed demonstration records that are not provided in the ITVR.
ITO will prepare a DER containing tabular summaries of investigative and QA/QC data from the
demonstration as well as results of technical systems and performance audits. The DER will
include raw data files, including reference laboratory data, chains-of-custody, certificates of
analysis, completed log sheets, etc. These data are important for documenting the quality of the
demonstration but do not need to be included in the summary of performance described in the
ITVR. The DER will be made available after completion of all demonstration
activities (including final ITVRs).
8.5	Data Storage
The reference laboratory analysts responsible for performing measurements will enter raw data
into logbooks or on data sheets. In accordance with standard document control procedures, the
laboratory will maintain on file the original logbooks or data sheets, which will be signed and
dated by the laboratory analysts responsible for them. Similar procedures will be used for all data
entered directly into the laboratory information management system. Separate instrument logs
will also be maintained by the laboratory to allow reconstruction of the run sequences for
individual instruments. The reference laboratory will maintain all raw data, including raw
instrument output on tape or diskette, on file for 5 years after the submission of the data packages
to ITO. The data will be disposed of upon receipt of instructions to do so or after 5 years,
whichever is sooner. A central project file for the demonstration will be established at ITO. This
file will be a repository for all relevant field and laboratory project documentation. If ITO is
under contract to EPA, the project files will be maintained in accordance with the terms of the
contract.
9.0 QA/QC PROCEDURES
This chapter describes the QA/QC procedures that will be implemented in this demonstration to
ensure that the data generated are of high quality.
9.1 QA/QC Objectives
The overall QA objective for the demonstration is to produce well-documented data of known
quality. Where appropriate, data quality will be measured in terms of precision, accuracy,
representativeness, completeness, and comparability. Table 9-X contains the objectives for the
data quality indicators, which apply to both the developer and reference laboratory data. If
analytical data from the reference laboratory fail to meet the QA objectives described in this
section (except for comparability, which does not apply), the source of the errors will be
investigated and corrective actions will be taken if necessary and possible. (Corrective actions
associated with the reference method are discussed in detail in Section 9.2.) If analytical data
from the field technologies do not meet the QA objectives, the discrepancies will be described in
the ITVRs, as well as the usefulness and limitations of the data generated.
Table 9-X. Data Quality Indicator Objectives for Reference Laboratory and Developer Data

Data Quality Indicator | Calculation | Objective
Precision | Relative standard deviation (RSD) of replicate samples | Average of all RSDs < 20 percent
Accuracy | Percent recovery of certified or spiked sample values | 75 percent to 125 percent
Representativeness | Valid samples from each matrix type | At least one valid sample result generated from each sampling location
Comparability of reference method* | Average absolute median RPD | Within ± 25 percent
Completeness | Percent of total samples analyzed and valid results provided | 98 percent

*Applies only to developer data
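The Table 9-X objectives lend themselves to a simple automated check once the summary statistics are available. The sketch below evaluates hypothetical summary values against the tabulated acceptance criteria; it is illustrative only and does not replace the corrective action process described in Section 9.3.

def check_data_quality(average_rsd, percent_recovery, valid_results_per_location,
                       abs_median_rpd, percent_complete):
    """Compare summary statistics with the Table 9-X data quality indicator objectives."""
    return {
        "precision (average RSD < 20 percent)": average_rsd < 20.0,
        "accuracy (75 to 125 percent recovery)": 75.0 <= percent_recovery <= 125.0,
        "representativeness (>= 1 valid result per location)":
            all(count >= 1 for count in valid_results_per_location),
        "comparability (median RPD within ± 25 percent)": abs_median_rpd <= 25.0,
        "completeness (>= 98 percent valid results)": percent_complete >= 98.0,
    }

if __name__ == "__main__":
    # Hypothetical demonstration summary statistics
    results = check_data_quality(average_rsd=14.2, percent_recovery=103.5,
                                 valid_results_per_location=[4, 4, 3, 4],
                                 abs_median_rpd=18.7, percent_complete=99.1)
    for indicator, objective_met in results.items():
        print(f"{indicator}: {'objective met' if objective_met else 'objective NOT met'}")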
9.2 Internal QC Checks
9.2.1	Reference Method QC Checks
Tables 9-X through 9-X summarize the QC checks that will be performed by the reference
laboratory, as described in [reference method or reference laboratory standard operating
procedure (SOP) names].
Provide a table or tables of the information described above. Further details on the QC
procedures for each method (calibration, blanks, spikes, duplicates, etc.) can also be provided
in subsections (i.e., 9.2.1.2, 9.2.1.3, etc.).
9.2.2	Developer Technology QC Checks
Quality control checks to be performed by the developers will be at each developer's discretion,
although it is highly recommended that quality controls such as blanks, spikes, and duplicates be
systematically analyzed throughout the demonstration. Developer QC data will be reported to
ITO for inclusion in the DER.
9.3 Audits, Corrective Action, and QA Reports
The assessment stage involves procedures to verify that demonstration efforts are in compliance
with the quality system and that upon conclusion of the data gathering stage of the
demonstration, the collected data meet the performance and acceptance criteria (e.g., data quality
objectives) specified in the planning stage. The QA manager or designee conducts audits at
planned, scheduled intervals; implements provisions for timely responses and implementation of
corrective actions if needed; and completes the evaluation process with written reports to
technical and management staff. The ITO project manager will ensure that this individual has
sufficient authority, access to project staff, access to documents and records, and organizational
freedom to conduct the assessment.
QA audits are independent assessments of measurement systems and associated data and are
more rigorous than routine assessments. QA audits may be internal or external and most
commonly incorporate technical system reviews and analysis of blind or double-blind
performance audit samples. System audits, performance audits, and associated corrective action
procedures are described below.
9.3.1 Technical Systems Audits
Technical systems audits (TSA) include thorough evaluations of field and laboratory sampling
and measurement systems. The QA manager or designee will conduct a TSA during the time
when the reference laboratory is analyzing the demonstration samples. Provide information on
how the TSAs will be performed and what activities and documents will be reviewed as part of
the TSA. These activities and documents should be provided either as a list or in a table. If
possible, a separate TSA will be performed by ITO at the demonstration site to ensure that the
demonstration plan is being implemented properly.
If the demonstration is being conducted with EPA involvement, the EPA quality manager has the
authority to conduct an independent TSA at any time during the demonstration.
9.3.2	Corrective Action Procedures
Corrective action procedures are an important component of ensuring a quality demonstration.
Each demonstration plan must incorporate a corrective action plan. This plan must include the
predetermined acceptance limits, the corrective action to be initiated whenever such acceptance
criteria are not met, and the names of the individuals responsible for implementation. These
procedures may vary depending on the type and severity of the finding.
Describe how ITO will respond to noted deficiencies in any of the audits that will be performed.
Briefly note the procedures that will be followed, including the chain of command for
notification of a problem.
9.3.3	QA Reports
The outcome of each assessment will be fully documented. The ITO project manager will
archive all audit documentation collected during the project and include it in the DER. The QA
manager or designee will report the findings of each audit to the ITO or reference laboratory
project manager, as appropriate, who will then address the audit findings and provide an
appropriate response. QA reports require a written response by the person performing the
inspected activity and acknowledgment of the audit by the ITO project manager.
Authority to report all TSA results is designated to the ITO QA manager or designee. These
reports should:
•	Identify and document problems that affect quality and the achievement of objectives
required by the demonstration and quality assurance project plan and any associated SOPs.
•	Identify and cite noteworthy practices that may be shared with others to improve the
quality of their operations and products.
•	Propose recommendations (if requested) for resolving problems that affect quality.
•	Independently confirm implementation and effectiveness of solutions.
•	Provide documented assurance (if requested) that, when problems are identified, further
work performed is monitored carefully until the problems are suitably resolved.
Responses to adverse findings are addressed immediately during a debriefing after the
assessment is completed and preferably at the site of the assessment. Responses to each adverse
finding will be documented in a letter or memo to the ITO project manager. The letter or memo will
indicate for each adverse finding the corrective action(s) taken or planned.
The ITO QA manager or designee will review the responses to each adverse finding and will
follow up with the ITO, developer, or reference laboratory representative on any findings that
were not adequately addressed. Once all corrective actions associated with the QA report have
been verified, the QA manager or designee will approve the QA report. The QA report and
responses to adverse findings will be sent to the ITO project manager for review and approval.
The QA report and responses will be maintained in the QA project files and will be included in
the DER.
10.0 HEALTH AND SAFETY PLAN
This chapter contains the site health and safety plan for demonstration activities. This plan will
be reviewed and signed by all demonstration participants before work begins. The Health and
Safety Plan (HASP) is a very important part of the demonstration plan. It should be an
adaptation or appendix to the existing site HASP with any additions that are specific to the
demonstration. A copy of the site HASP should be available from the site manager or through
the ITO. Figure 3-1 contains a representative list of topics that should be addressed in the
HASP. The HASP may have different components, particularly if the demonstration is only being
conducted in a laboratory setting.
Health and Safety Plan - XYZ, Inc. Site
Introduction
Purpose and Policy
Health and Safety Plan Enforcement for the XYZ, Inc. Site
Project Manager and Field Site Supervisor
Health and Safety Director
Site Health and Safety Officer
Requirements for Visitors
Site Background
Demonstration-Specific Hazard Evaluation
Exposure Pathways
Inhalation
Dermal Contact
Ingestion
Health Effects
Physical Hazards
Fire
Heat/Cold Stress
Mechanical
Unstable/Uneven Terrain
Insect and Other Animal Stings, Bites, and Encounters
Plant/Vegetation Hazards
Noise
Electrical
Inclement Weather
Training Requirements
Personal Protection
Levels of Protection
Protective Equipment and Clothing
Limitations of Protective Clothing
Duration of Work Tasks
Respirator Selection, Use, and Maintenance
Medical Surveillance
Health Monitoring Requirements
Documentation and Recordkeeping Requirements
Medical Support and Followup Requirements
Environmental Surveillance
Initial Air Monitoring
Periodic Air Monitoring
Monitoring Parameters
Use and Maintenance of Survey Equipment
Site Control
Site Control Zones
Safe Work Practices
Health and Safety Plan Enforcement
Complaints
Decontamination
Personnel Decontamination
Equipment Decontamination
Emergency Contingency Planning
Injury in the Exclusion or Contamination Reduction Zones
Injury in the Support Zone
Fire or Explosion
Protective Equipment Failure
Emergency Information Telephone Numbers
Directions to Hospital (or On-Site Clinic)
Figure 3-1. Typical Table of Contents from a Health and Safety Plan
Chapter 4
References
1. Billets, S., and Dindal, A. "History and Accomplishments of the U.S. Environmental
Protection Agency's Superfund Innovative Technology Evaluation (SITE) Monitoring
and Measurement Technology (MMT) Program." Journal of Testing and Evaluation,
35(5), 486-495, September 2007.

2. U.S. Code, 2004, Title 42, Chapter 103. Comprehensive Environmental Response,
Compensation, and Liability Act.
http://www.access.gpo.gov/uscode/title42/chapter103.html

3. U.S. EPA. Superfund Innovative Technology Evaluation Program, Publications.
http://www.epa.gov/ORD/SITE/reports.html

4. Billets, S., and Koglin, E. "Overview of the EPA's Verification Program for Site
Characterization Technologies." EEMI Environmental News, Summer 1997.

5. U.S. EPA. Environmental Technology Verification Program, Test and Quality Assurance
Plans. http://www.epa.gov/nrmrl/std/etv/tqap.html

6. U.S. EPA, 1996. A Guidance Manual for the Preparation of Site Characterization and
Monitoring Technology Demonstration Plans, interim final report, version 5.0, October.

7. U.S. EPA. Technology Testing and Evaluation Program.
http://www.epa.gov/nhsrc/ttep.html

8. U.S. Department of Defense. Environmental Security Technology Certification Program,
Demonstration Plan Guidance. http://www.estcp.org/pi resources/index.cfm. Accessed
November 3, 2008.

9. U.S. EPA, 2004. Demonstration and Quality Assurance Project Plan: Technologies for
the Monitoring and Measurement of Dioxin and Dioxin-like Compounds in Soil and
Sediment. EPA/600/R-04/036.

10. U.S. EPA, 2005. Innovative Technology Verification Report: Technologies for Monitoring
and Measurement of Dioxin and Dioxin-like Compounds in Soil and Sediment, Xenobiotic
Detection Systems, Inc., CALUX® by XDS. Section 7.1.3, pp. 42-44.
http://www.epa.gov/ORD/SITE/reports/540r05001/540r05001r072005.pdf

11. Electronic Code of Federal Regulations. Definition and Procedure for the Determination
of the Method Detection Limit - Revision 1.11. 40 CFR Part 136, Appendix B.
APPENDIX
Examples from Previous SITE MMT D/QAPPs
[Organizational chart graphic not reproduced. The chart shows boxes for the EPA project manager, the Battelle project manager, Battelle QA staff, the Battelle sample custodian, the Battelle Visitors Day coordinator, the Battelle infrastructure and logistics coordinator, Battelle observers, PE sample suppliers, the reference laboratory (AXYS) project manager, demonstration site representatives, and the technology developers.]

Figure A-1. Example organizational chart to be used in Chapter 2 of the D/QAPP.
Table A-1. Example Demonstration Participant's Table to be used in Chapter 2 of the
D/QAPP.

Organization | Point of Contact | Contact Information
U.S. Environmental Protection Agency | Stephen Billets; George Brilis | National Exposure Research Laboratory, 944 East Harmon Avenue, Las Vegas, Nevada 89119; Telephone: (702) 798-2232; Fax: (702) 798-2261; E-mail: billets.stephen@epa.gov
Battelle | Amy Dindal | 505 King Avenue, Columbus, Ohio 43201-2693; Telephone: (561) 422-0113; Fax: (561) 258-0777; E-mail: DindalA@battelle.org
Michigan Department of Environmental Quality | Sue Kaelber-Matlock; Michael Jury | Remediation and Redevelopment Division, 503 N. Euclid Avenue, Bay City, Michigan 48706; Telephone: (989) 686-8025, x8303; Fax: (989) 684-9799; E-mail: matlocks@michigan.gov
U.S. Fish and Wildlife Service | Becky Goche | Green Point Environmental Learning Center, 3010 Maple Street, Saginaw, Michigan 48602; Telephone: (989) 759-1669; E-mail: becky_goche@fws.gov
Abraxis LLC | Fernando Rubio | 54 Steamwhistle Drive, Warminster, Pennsylvania 18974; Telephone: (215) 357-3911; E-mail: frubio@abraxiskits.com
CAPE Technologies L.L.C. | Bob Harrison | 3 Adams Street, South Portland, Maine 04106-1604; Telephone: (207) 741-2995; E-mail: cape-tech@ceemaine.org
Hybrizyme Corporation | Randy Allen | Suite G-70, 2801 Blue Ridge Road, Raleigh, North Carolina 27607; Telephone: (919) 783-9595; E-mail: rallen@hybrizyme.com
Xenobiotic Detection Systems, Inc. | John Gordon | 1601 E. Geer Street, Suite S, Durham, North Carolina 27704; Telephone: (919) 688-4804; E-mail: johngordon@dioxins.com
Wako Pure Chemical Industries, Ltd. | Masako Hayakawa | 1600 Bellwood Road, Richmond, Virginia 23237-1326; Telephone: (877) 714-1920; E-mail: hayakawa.masako@wako-chem.co.jp
Paracelsian, Inc. | Noriyoshi Inoue | 72 Hampton Road, Scarsdale, New York 10583; Telephone: (914) 472-5152; E-mail: inomak@earthlink.net
AXYS Analytical Services | Laurie Phillips | 2045 Mills Road, Sidney, British Columbia, Canada V8L 3S8; Telephone: (250) 655-5800; E-mail: lphillips@axys.com
Table A-2. Example demonstration schedule for inclusion in Chapter 5 of the D/QAPP.
Event | Original Schedule for Completion | Revised Schedule for Completion | Actual Completion Date
Prepare and distribute developer survey | July 18, 2003 | n/a | July 18, 2003
First Conference Call | July 28, 2003 | n/a | July 29, 2003
Distribute summary notes from the conference call | August 5, 2003 | n/a | July 31, 2003
Develop preliminary strategy for sample homogenization | August 12, 2003 | n/a | August 12, 2003
Prepare one-page demonstration flyer for Dioxin 2003 Conference | August 22, 2003 | n/a | August 22, 2003
Obtain dioxin-contaminated soil from one site and test homogenization procedure | September 30, 2003 | n/a | October 7, 2003
Draft homogenization procedure | October 3, 2003 | n/a | October 1, 2003
Second Conference Call | October 8, 2003 | n/a | October 8, 2003
Identify, obtain, and homogenize samples from additional sites | November 28, 2003 | n/a | November 13, 2003
Third Conference Call | December 4, 2003 | n/a | December 4, 2003
First draft demonstration plan to EPA, developers, peer reviewers, and 1 or 2 technical advisors | December 12, 2003 | n/a | December 12, 2003
Final receipt of environmental samples | December 19, 2003 | n/a | December 24, 2003
PE samples sent to dioxin laboratories and audits scheduled | January 9, 2004 | n/a | January 12, 2004
Comments due to Battelle on first draft demonstration plan | January 15, 2004 | n/a | January 15, 2004
Fourth Conference Call | February 5, 2004 | n/a | February 5, 2004
Reference laboratory selected | February 3, 2004 | n/a | February 20, 2004
Pre-demonstration samples distributed | February 10, 2004 | n/a | Phase 1: February 12, 2004; Phase 2: March 16, 2004
Developer and reference laboratory pre-demonstration results due to Battelle | March 31, 2004 | n/a | April 16, 2004
Distribute second draft demonstration plan to EPA, developers, and entire Dioxin SITE Demonstration Panel (includes peer reviewers, technical advisors, and observers) for final review | March 31, 2004 | n/a | April 2, 2004
Pre-demonstration results distributed to developers | April 9, 2004 | n/a | April 16, 2004
Fifth Conference Call | April 8, 2004 | n/a | April 8, 2004
Comments due to Battelle on third draft demonstration plan | April 12, 2004 | n/a | April 12, 2004
Demonstration plan finalized | April 16, 2004 | n/a | April 20, 2004
Field demonstration (Saginaw, Michigan) | April 26 through May 5, 2004; Visitors Day on April 28 | n/a | April 26 - May 5, 2004; Visitors Day on April 28
Audit of reference laboratory | May 24, 2004 | n/a | May 26, 2004
First draft report template to EPA | August 2, 2004 | n/a | August 3, 2004
Five draft report templates to EPA and developers | September 6, 2004 | n/a | September 10, 2004
Final pre-demonstration results to developers and selected technical panel members | new milestone | October 4, 2004 | October 4, 2004
Data tables to developers after receipt of developer review comments on report template | new milestone | October 15, 2004 | December 6, 2004
Reference laboratory data set completed | November 30, 2004 | December 17, 2004 | December 20, 2004
Reports to developers for review | October 1, 2004 | n/a | n/a (combined this review step with peer review)
First full draft reports to EPA project mgt., EPA QA, EPA technical editor, developers, and peer reviewers | January 7, 2005 | January 21, 2005 | January 28, 2005
EPA administrative report and comment reconciliation review; draft final copy (with comments incorporated) to developers and peer reviewers | February 4, 2005 | February 25, 2005 | March 8, 2005
EPA report publication | new milestone | March 30, 2005 |
Sixth Conference Call | January 27, 2005 | May 10, 2005 |
Data Evaluation Record (DER) to EPA in hard copy and electronic formats | new milestone | April 30, 2005 | April 27, 2005
[Figure graphic not reproduced. The original figure is a calendar-style chart covering April and May of the field demonstration, showing the scheduled on-site days for each participating developer and the number of samples to be analyzed on-site.]

Figure A-2. Example demonstration schedule to be included in Chapter 5 of the D/QAPP.
EXAMPLE
Procedural Observations and Questionnaire
ABRAXIS LLC Coplanar PCB ELISA Kit
Procedure witnessed by:
Date witnessed:
Time/Date procedure started:
Time/Date procedure ended:
Name of Kit Used:
Individuals witnessed:
Lot Number of Kit:
Expiration Date of Kit:
Answer the following questions:
Could this kit be performed in the field without a mobile lab/trailer?
Would it take long to set up in the field before first samples could be processed?
How long?
How many samples could be prepared and analyzed in one day in the field once setup is
complete?
By an experienced kit user?
By the novice kit user?
Would sample throughput be faster in the lab than in the field?
If so by how much?
Are the instructions supplied with the kit the same as the operating procedure listed in the demo
plan?
If not, why?
If not, use the kit instructions for evaluation.
Was testing carried out at kit-recommended temperature of 20 °C to 25 °C?
How was temperature measured?
Was measuring device calibrated?
Are the following equipment and reagents supplied with the kit? (Note if item not used at all;
also note grade and supplier of solvents)
thermometer
soil collector bottle (containing dispersion device)
digital balance
30-mL high-density polyethylene (HDPE) bottle
steel mixing ball
anhydrous sodium sulfate
acetone
hexane
shaker/rotator
filter
centrifuge
extraction tube
concentrated sulfuric acid
nitrogen evaporator
methanol
water
1:10 in 50% methanol/water
anti-coplanar PCB antibody solution
controls
standards
Parafilm
strip holder
pipettor
enzyme conjugate solution
waste container
1X wash solution
paper towels
color solution
stop solution
microplate reader
graph paper
commercial ELISA program
Were any supplies or equipment used that were not listed in the instructions?
If so, please list.
What are recommended hold times and storage conditions for:
Samples?
Extracts?
Reagents?
Standards?
Would you know based on the instructions provided (if not how did you decide):
How much sample to extract?
How many samples to extract in a "batch"?
How much sodium sulfate to mix with sample?
Which solvent and how much to extract with?
How long to extract?
How many controls and standards to prepare with "batch"?
How long to agitate during oxidation cleanup (acid wash) before letting phases separate and
removing top layer?
Maximum number of oxidation (acid wash) steps that can be completed before results are
affected?
After acid wash, is sample evaporated to complete dryness during nitrogen evaporation step?
Is additional cleanup ever necessary?
How do you know and what additional cleanup options are there?
Are all samples diluted? If not, how do you know which ones to dilute?
How long do you mix the wells by moving in a circular motion? (If measured, what did you
measure with?)
How long to incubate? (If measured, what did you measure with?)
What temperature to incubate? (If measured, what did you measure with? Is it calibrated?) How
critical is this temperature?
How long do you mix the wells with the enzyme conjugate solution?
How long to incubate? (If measured, what did you measure with?)
What temperature to incubate? (If measured, what did you measure with? Is it calibrated?) How
critical is this temperature?
How dry do the wells have to be after the 1X wash step?
Do you have to mix in the color solution? How long does color solution incubate? Is its
incubation temperature critical? If so, what temperature is recommended?
How critical is it that the plate be read within 15 minutes of adding the stop solution?
How to use/measure with the microplate reader? Is it calibrated, if so, how?
How much sample solution needs to be used with the microplate reader?
How do you calculate PCDD/PCDF amounts from the data generated? Is it clear how to account
for dilutions? For the cross-reactivity factor?
Must all procedures be completed in the same day?
If not, when can procedure be stopped and how must samples be stored?
Is that in the instructions?
Were any procedural steps performed differently than you interpreted from the instructions?
Were any of the instructions confusing?
If so please comment:
What QC samples are required with this approach and at what frequency?
What are recommended QC acceptance criteria?
Did QC samples meet acceptance criteria?
If not, is it clear what corrective action to take?
What QC samples would vendor recommend, but not require and at what frequency?
Do you recommend that some of the data be verified by conventional methods?
What method?
What frequency?
How accurate do weights and volumes used with this technique have to be?
Were all balances, pipettes, and thermometers calibrated?
Following the procedure you just observed, including QC requirements, how many samples do
you, the observer, think you could process in a day?
In a week?
Does the vendor provide training in kit use? Is this an extra charge?
Video?
Classes?
Phone support?
What education/experience would vendor recommend kit users have?
What do you think would be required education/experience for successful operation of this
technology?
Additional Comments: