Office of Research and Development
National Exposure Research Laboratory
Characterization Research Division
P.O. Box 93478
Las Vegas, NV 89193-3478

Interim Final
October 1996
A Guidance Manual for the Preparation
of Site Characterization and Monitoring
Technology Demonstration Plans
A Guidance Manual for the Preparation
of Characterization and Monitoring
Technology Demonstration Plans
Prepared by
Stephen Billets, Gary Robertson
and
Eric Koglin
Consortium for Site Characterization Technology
Characterization Research Division-Las Vegas
National Exposure Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
October 31, 1996
Interim Final Report, Version 5.0
NOTICE
The information in this document has been developed by the U.S. Environmental Protection
Agency (EPA). It has not been subjected to the Agency's external review process. The policies
and procedures set forth here are intended as guidance to Agency and other government and non-
government personnel. They do not constitute rule making by the Agency, and may not be relied
upon to create a substantive or procedural right enforceable by any other person. The
Government may take action that is at variance with the policies and procedures in this
document. Mention of trade names or commercial products does not constitute endorsement or
recommendation for use.
FOREWORD
This work represents the technical and editorial contributions from a large number of U.S.
Environmental Protection Agency (U.S. EPA) employees and others familiar with or interested
in the demonstration and evaluation of innovative site characterization and monitoring
technologies. The Characterization Research Division - Las Vegas (CRD-LV) first convened a
body of experts, the Consortium Action Team, to define the elements of this guidance
document. Subsequent discussions and meetings were held to revise and expand the contents to
create this version. EPA staff from each of the ten Regions, the Office of Solid Waste and
Emergency Response, and the Office of Research and Development participated in this process.
This interdisciplinary, inter-programmatic team was convened to ensure that the demonstration
procedures articulated are acceptable across the Agency. This was an important first step for
gaining the acceptance of innovative technologies for use in characterizing and monitoring the
environment.
TABLE OF CONTENTS
NOTICE ii
FOREWORD iii
LIST OF ACRONYMS ix
CHAPTER 1
INTRODUCTION 1-1
1.1 Purpose and Content of this Guidance Manual 1-1
1.2 The Consortium for Site Characterization Technology 1-2
1.2.1 Background 1-2
1.2.2 What is the Consortium for Site Characterization Technology? 1-3
1.2.3 Using the Consortium Process 1-5
1.2.4 Roles and Responsibilities 1-6
CHAPTER 2
HOW TO USE THIS GUIDANCE MANUAL 2-1
2.1 Elements of a Demonstration Plan 2-1
2.2 Using Chapter 3 to Build a Demonstration Plan 2-1
CHAPTER 3
ELEMENTS OF A TECHNOLOGY DEMONSTRATION PLAN 3-1
TITLE PAGE 3-1
FOREWORD 3-1
TABLE OF CONTENTS 3-1
EXECUTIVE SUMMARY 3-1
ABBREVIATIONS AND ACRONYMS 3-2
1.0 INTRODUCTION 3-2
1.1 Demonstration Objectives 3-2
1.2 What is the Consortium for Site Characterization Technology? 3-2
1.3 Technology Verification Process 3-3
1.4 Purpose of this Demonstration Plan 3-3
2.0 DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION ........ 3-3
2.1 Demonstration Organization and Participants 3-4
2.2 Organization 3-4
2.3 Responsibilities 3-4
3.0 TECHNOLOGY DESCRIPTION 3-5
3.1 Technology Performance 3-6
3.2 History of the Technology 3-6
3.3 Technology Applications 3-6
3.4 Advantages of the Technology 3-6
3.5 Limits of the Technology 3-7
3.6 Demonstration Performance Goals 3-7
4.0 DEMONSTRATION SITE DESCRIPTIONS 3-8
4.1 Site Name and Location 3-8
4.2 Site History 3-8
4.3 Site Characteristics 3-8
5.0 CONFIRMATORY PROCESS 3-8
5.1 Method Selection 3-9
5.2 Reference Laboratory Selection 3-9
5.3 Contingency Laboratory Selection 3-9
5.4 Special QC Requirements 3-9
5.5 Laboratory Audit 3-10
5.6 Statistical Analysis of Results 3-10
5.6.1 Methods of Data Reduction and Adjustment 3-10
5.6.2 Methods of Statistical Analysis 3-10
6.0 DEMONSTRATION DESIGN 3-11
6.1 Objectives 3-11
6.2 Experimental Design 3-11
6.2.1 Qualitative Factors 3-11
6.2.2 Quantitative Factors 3-12
6.3 Sampling Plan 3-12
6.3.1 Sampling Operations 3-12
6.3.2 Predemonstration Sampling and Analysis 3-13
6.4 Field Data 3-14
6.4.1 Field Audit 3-14
6.5 Demonstration Schedule 3-14
7.0 FIELD OPERATIONS 3-14
7.1 Communication and Documentation 3-15
7.2 Sample Collection Procedures 3-15
7.2.1 Sampling Procedures 3-16
7.2.2 Sampling Locations 3-16
7.2.3 Sample Preparation 3-16
7.2.4 Sample Distribution 3-16
7.2.5 Decontamination and Disposal 3-16
7.3 Personnel Requirements 3-17
7.4 Technology Logistical Needs 3-17
7.4.1 Special Needs 3-17
8.0 QUALITY ASSURANCE PROJECT PLAN 3-17
8.1 Purpose and Scope 3-17
8.2 Quality Assurance Responsibilities 3-17
8.3 Data Quality Indicators 3-18
8.3.1 Representativeness 3-18
8.3.2 Comparability 3-19
8.3.3 Completeness 3-19
8.3.4 Accuracy 3-20
8.3.5 Precision 3-20
8.4 Calibration Procedures and Quality Control Checks 3-21
8.4.1 Initial Calibration Procedures 3-21
8.4.2 Continuing Calibration Procedures 3-21
8.4.3 Method Blanks 3-22
8.4.4 Spike Samples 3-22
8.4.5 Laboratory Control Samples 3-22
8.4.6 Performance Evaluation Materials 3-22
8.4.7 Duplicate Samples 3-22
8.5 Data Reduction, Validation, and Reporting 3-23
8.5.1 Data Reduction 3-23
8.5.2 Data Validation 3-23
8.5.3 Data Reporting 3-23
8.6 Calculation of Data Quality Indicators 3-24
8.7 Performance and System Audits 3-24
8.7.1 Performance Audit 3-24
8.7.2 On-Site System Audits 3-24
8.8 Quality Assurance Reports 3-24
8.8.1 Status Reports 3-24
8.8.2 Audit Reports 3-25
8.9 Corrective Action 3-25
9.0 DATA MANAGEMENT AND ASSESSMENT 3-25
10.0 HEALTH AND SAFETY PLAN 3-26
10.1 Health and Safety Plan Enforcement 3-26
LIST OF FIGURES
Figure 1-1. Technology Demonstration Process 1-4
Figure 2-1. Elements of a Technology Demonstration 2-1
Figure 2-2. Table of Contents from a Typical Technology Demonstration Plan 2-2
Figure 3-1. Typical Table of Contents from a Health and Safety Plan 3-27
APPENDICES
APPENDIX A
Description of The Environmental Monitoring Management Council A-1
APPENDIX B
Environmental Monitoring Management Council Method Format B-1
APPENDIX C
Office of Solid Waste Method Requirements C-1
APPENDIX D
Representative Demonstration Schedule D-1
APPENDIX E
Guidance for Addressing Quality Assurance/Quality Control Requirements in a
Demonstration Plan E-1
LIST OF ACRONYMS
ADQ Audit of Data Quality
AOAC Association of Official Analytical Chemists
ASTM American Society for Testing and Materials
CAS Chemical Abstract Service
CERCLA Comprehensive Environmental Response, Compensation, and Liability Act (a.k.a.
Superfund)
CRD-LV Characterization Research Division-Las Vegas
CSCT Consortium for Site Characterization Technology
DoD U.S. Department of Defense
DOE U.S. Department of Energy
EMMC Environmental Monitoring Management Council
EPA U.S. Environmental Protection Agency
ETI Environmental Technology Initiative
ETV Environmental Technology Verification Program
FPXRF Field portable X-ray fluorescence spectrometer
GC/MS Gas chromatography/mass spectrometer
HASP Health and Safety Plan
ITVR Innovative technology verification report
LCS Laboratory Control Sample
NERL National Exposure Research Laboratory
ORD Office of Research and Development
OSW Office of Solid Waste
OSWER Office of Solid Waste and Emergency Response
PEA Performance Evaluation Audit
PE Performance Evaluation
QA Quality assurance
QAPP Quality Assurance Project Plan
QC Quality Control
RA Regional Administrator
RCI Rapid Commercialization Initiative
RCRA Resource Conservation and Recovery Act
RPD Relative Percent Deviation
RSD Relative Standard Deviation
SOP Standard Operating Procedure
SPC Science Policy Council
SW-846 Test Methods for Evaluating Solid Waste, EPA Publication SW-846, Third
Edition
TER Technology Evaluation Report
TSA Technical System Audit
CHAPTER 1
INTRODUCTION
1.1 Purpose and Content of this Guidance Manual
The purpose of this manual is to provide guidance to technology developers in preparing a
demonstration plan for the performance testing of characterization, monitoring, measurement,
and sampling technologies. The purpose of the demonstration plan is to assure that
demonstrations will be performed in a manner that generates the data needed to verify the
performance of the technology. Another purpose of the plan is to assure that all appropriate
health, safety, regulatory, quality assurance, and environmental concerns related to the
demonstration are addressed.
This guidance document is divided into three chapters. Chapter 1 (this chapter) provides an
overview of the purpose of the Consortium for Site Characterization Technology (CSCT) and a
description of the technology demonstration process. Chapter 2 contains a description of how to
use this guidance manual, and an introduction to Chapter 3.
Chapter 3 provides detailed guidance on how to prepare a technology demonstration plan.
The guidance is in the form of an annotated outline. Each section of the plan is identified as a
subsection of the chapter. Each subsection contains a short description of the information that
should be included in the demonstration plan. The use of this standard structure will facilitate
document preparation and reduce the amount of review time required for plan approval. This
approach will also help the EPA point of contact or technical expert provide timely assistance to
the demonstration plan authors.
There are five appendices to this document. Appendix A provides a description of the
Environmental Monitoring Management Council (EMMC) and its proposed role in helping the
Consortium achieve Agency-wide approval. Appendix B describes the standard EMMC format
which is used to document EPA's environmental monitoring methods. This format is used to
standardize the method used during the demonstration and for subsequent Agency approval, if
appropriate. Appendix C provides an overview of the expectations that the EPA's Office of Solid
Waste has regarding the acceptance of a method into S W-846, Test Methods for Evaluating Solid
Waste. A single demonstration may not produce sufficient information for approval. This
appendix describes additional information which may be needed to support method approval.
Appendix D provides a representative schedule of key events in the demonstration process.
Appendix E is a condensation of EPA document Preparation Aids for the Development of
Category II Quality Assurance Project Plans. This condensed version contains those portions
that are applicable to the technology demonstration process.
Site characterization and monitoring instruments can include a diverse assortment of
technologies. These can range from immunoassay test kits to field portable instrumentation (for
example, x-ray fluorescence spectrometers). Most, if not all, of the demonstration plan elements
described in Chapter 3 will be applicable to all types of technologies. However, there are often
special conditions and concerns that are unique to a particular type of technology. These can be
addressed on a case by case basis following the framework presented in Chapter 3.
1.2 The Consortium for Site Characterization Technology (CSCT)
1.2.1 Background
The quick acceptance and application of innovative monitoring and site characterization
technologies are goals shared by many. Potential users of innovative approaches must be
persuaded that new technologies perform as well as or better than conventional methods. This is
particularly important when considering that data are being collected to support important
decisions (for example, protection of human health and the environment, remedy selection,
enforcement, or litigation). Currently, most information about the performance of innovative
technologies comes from the vendor or developer. However, a user's confidence and willingness
to apply an innovative technology is more likely following EPA recognition and acceptance.
The user community looks to the EPA, because of its regulatory mission, to evaluate
innovations that will improve the way the Nation manages its environmental problems. Potential
users may find new technologies appealing, but without government acceptance of such
technologies, users will continue to rely on already accepted approaches, whether or not they are
the most appropriate or cost-effective.
In 1995, on the 25th anniversary of the first Earth Day, the President announced a new
environmental technology strategy, Bridge to a Sustainable Future. This government-wide
strategy recognizes that industry is the primary creator of new technology and the main engine of
sustained economic growth. It assigns to government a catalytic role in promoting the
development of new technologies across a range of environmental sectors. The Consortium plays
a major part in the fulfillment of three of the 15 initiatives found in the Strategy:
The Environmental Technology Initiative (ETI) seeks to accelerate environmental
protection, strengthen America's industrial base, and increase exports of U.S.
technologies and expertise. The Consortium's primary source of funding comes from
ETI.
The Environmental Technology Verification Program (ETV) is a new program to
verify the performance and cost of new technologies through partnership with public
and private third-party testing organizations. EPA-sponsored verification of
performance, as new technologies reach the commercialization stage, is expected to
speed the adoption of innovative methods both at home and abroad. The Consortium
is the first of ten pilots to come on line under the ETV program.
The Rapid Commercialization Initiative (RCI) is a partnership between EPA, the
Departments of Commerce, Defense, and Energy, and several multi-State
organizations to combine facilitated siting, performance verification, and expedited
permitting to speed environmental technology into use. The Consortium participates
in the verification of characterization and monitoring technologies in this program.
1.2.2 What is the Consortium for Site Characterization Technology?
As noted above, the Consortium for Site Characterization Technology (CSCT) is one of the
ETV pilots being supported with ETI resources. The Consortium is a partnership between, at
present, the EPA, and the Departments of Energy and Defense (DOE and DoD). DOE and DoD
have established programs and facilities (testing venues) for testing, demonstrating, and
evaluating the performance of monitoring, measurement, and site characterization technologies.
Our primary partner, DOE's Oak Ridge National Laboratories, Oak Ridge, Tennessee, is serving
as a technology verification organization. A technology verification organization conducts such
activities as: assisting EPA in formulating generic guidance; assisting developers in designing
demonstration plans; overseeing actual testing of technologies; conducting quality assurance/quality
control (QA/QC) activities; and submitting reports to EPA on technology performance and cost. As a
partnership, the Consortium offers valuable technical expertise to support the demonstration and
verification of the performance of new and emerging technologies and offers access to a wide
array of testing venues.
A goal of the Consortium is to facilitate the acceptance and use of cost-effective
technologies applicable to a wide range of environmental problems. The Consortium will meet
this goal by working with technology developers and other agencies in planning and conducting
demonstrations, evaluating data generated in demonstrations, and managing and disseminating
information. The Consortium is not intended to become another technology testing organization
that must touch every technology, but rather it is designed to support existing demonstration
efforts and developer-driven demonstrations. The Consortium does not offer any financial
support to those desiring to conduct a technology demonstration. The developer is expected to
secure the appropriate resources to support their part of the technology verification process.
These responsibilities are described in Section 1.2.4.
The Consortium provides developers with a clearly defined technology verification pathway
to follow, from demonstration planning to data evaluation and verification (Figure 1-1). The
technology demonstration/verification process established by the Consortium is intended to serve
as a template for conducting technology demonstrations that will generate high-quality data
needed by the Agency to verify technology performance. The Consortium verification process is
a model process that can help in moving innovative site characterization and monitoring
technologies into routine use more quickly.
The Consortium provides cost and performance data to technology users and developers. A
major product of the Consortium is the preparation of reports that contain the data generated for
each technology demonstration and an assessment of the technology's performance. This
important product should increase the users' confidence in employing innovative technologies
for site characterization and monitoring. A key part of the technology verification report is a
verification statement. The verification statement is a summary of the findings of the
demonstration which describes how well the technology achieved the performance goals the
demonstration was designed to test. It also provides a discussion of the advantages and
limitations of the technology and defines representative conditions under which the technology
can be applied.
[Figure 1-1 depicts the environmental technology demonstration and verification process as a flow of four products, each with the responsible parties:

Demonstration Plan: prepared by the developer with assistance from a verification organization or independent third party (with EPA review and approval)
Demonstration: conducted by the developer with assistance from a verification organization or independent third party (with EPA oversight)
Verification Report: prepared by the verification organization (with EPA oversight and approval)
Verification Statement: prepared by EPA and sent to the developer, EPA Regions, program officers, other Federal agencies, states, and the public]
Figure 1-1. Technology Demonstration Process
The Consortium is not a panacea for developers and users of new and emerging
monitoring, measurement, and site characterization technologies trying to gain acceptance or
commercialize a technology. However, the Consortium attempts to fill many technical and
institutional needs. These include:
Providing a sound scientific basis for demonstrating and evaluating technology
performance;
Facilitating acceptance of innovative technologies by state, local, and Federal
regulators;
Supporting the implementation and use of verified technologies;
Identifying and meeting changing user needs;
Increasing the number and commercial availability of innovative technologies;
Providing a mechanism to "pull" technologies being developed by DoD, DOE, and
other public and private entities into routine use at a faster rate;
Providing an incentive for developers to push the state of the technology beyond
present capabilities;
Leveraging resources and expertise among Federal agencies, the private sector, and
academia.
Although the Agency is interested in any and all innovative site characterization and
monitoring technologies, the Consortium resources, and those of the verification organization,
are limited. Therefore, a major role of the Consortium is to identify the technology and data gaps
that impede cost-effective and efficient environmental problem-solving and to communicate
them to the developer community. This assessment will allow us to prioritize those technology
demonstrations that address the most pressing needs and gaps. The information that supports the
assessment will be gathered from within EPA, other Federal agencies, states, tribes, and the user
industries.
1.2.3 Using the Consortium Process
The Consortium conducts field demonstrations of technologies that are determined to be of
appropriate maturity and of significant benefit in environmental problem-solving. The
Consortium asks the developer to provide a profile of the technology which will be used to
determine if the technology will be tested and evaluated or demonstrated and verified. That
determination will be based on two factors: the level of maturity and the attributes of the
technology. The Consortium has defined three levels of maturity:
Level 1 - Laboratory prototype which will be field tested for the first time or is in the
earliest stage of development.
Level 2 - Pre-production prototype of a technology that has been field tested but is not
yet commercially available.
Level 3 - Commercial production unit.
The attributes used to assess maturity include:
Commercial history (number of units sold, years on the market);
Technology performance (accuracy, sensitivity, selectivity, stability, etc.);
Technology applications (under what conditions has the technology been used, the
type(s) of site it was used at);
Costs of technology acquisition, installation, and operation;
Field experience (list of field uses, i.e., number of sites, locations, customers, etc.).
If the technology is acceptable, an EPA contact (a Regional representative or technical
expert) is assigned to facilitate the demonstration process. The developer is also assigned a
verification organization, which assists in developing the demonstration plan and staging the
demonstration, and which prepares the technology verification report.
The demonstration/verification process applies only to those technologies that fit into Level
3. We feel that it is important, if not imperative, for the CSCT to nurture emerging technologies
as well as those in or near commercial production. Level 1 and Level 2 technologies are clearly
not ready for a rigorous demonstration and verification, as the developer is usually still in the
process of exploring the potential of their innovation. However, developers of Level 1 and Level
2 technologies are faced with a need to try their technologies in the field and to have a credible
body, particularly the EPA, participate with and assist them in testing and evaluation. Therefore,
the CSCT also administers a scaled-down version of the demonstration/verification process - a
testing and evaluation process - that helps the Level 1 or Level 2 developer with access to sites;
provides limited review of their test plan; provides limited oversight of their field activities; and
that serves as a distribution point for the information about the technology. Under a Level 1
demonstration, the developer is responsible for a self-evaluation of their performance. The CSCT
does not provide an independent evaluation. In a Level 2 demonstration, a more formal
procedure is followed and the EPA or its verification organization will review the performance
of the technology and issue an evaluation report.
1.2.4 Roles and Responsibilities
Consortium: The Agency and its verification organizations are responsible for the following
activities:
Defining the readiness of a technology for demonstration;
Assisting the developer with the preparation of the demonstration plan (in
accordance with the guidelines presented in this document);
Assisting in selecting the demonstration site;
Approving the demonstration plan prior to the demonstration;
Participating in the demonstration to monitor the implementation of the
demonstration plan;
Evaluating and verifying the data and information generated in the demonstration;
Preparing the technology verification report and issuing a verification statement.
Developer: The developer is responsible for:
Contributing to the development of the demonstration plan;
Articulating the performance goals of the technology;
Mobilizing the technology to the demonstration site;
Operating the technology during the demonstration;
Collecting data in accordance with the demonstration plan;
Contributing to the cost of referee analysis (optional);
Providing the data and information generated during the demonstration to the
Agency or the verification organization in the prescribed format.
CHAPTER 2
HOW TO USE THIS GUIDANCE MANUAL
2.1 Elements of a Demonstration Plan
The purpose of the demonstration process is to generate the data and information necessary
to fairly and thoroughly evaluate the performance of a technology. Figure 2-1 depicts the steps in
a technology demonstration and some of the activities associated with each. This guidance
manual focuses primarily on the activities associated with the planning and demonstration steps.
[Figure 2-1 depicts three phases and their associated activities:

Planning: Quality Assurance; Experimental Design; Sampling & Analysis Plan; Articulate Performance Goals; Method or SOP; Selection of Demonstration Site(s); Data Management Plan; Coordination of Regional Participation; Predemonstration Testing
Demonstration: Sample Collection; Sample Analysis (field); Referee Analysis (laboratory); Field Data Management; QA/QC Materials; Photodocumentation
Data Review & Evaluation: Data/Information Reduction; Data Interpretation; Report Findings]
Figure 2-1. Elements of a Technology Demonstration
The demonstration plan serves a number of purposes. First, it provides a "roadmap" for the
demonstration. It contains detailed guidance for those executing the demonstration on how data
need to be generated and collected to support an objective performance evaluation of the
technology. Second, it is an important reference for those who choose to review the steps used by
the developer in executing the demonstration to assess the validity of the process. Finally, it can
serve as a useful reference for other technology developers in building future demonstration
plans involving related technologies.
2.2 Using Chapter 3 to Build a Demonstration Plan
The section order and content specified in Chapter 3 of this manual should be followed.
Figure 2-2 is a typical Table of Contents for a demonstration plan. It is derived from the section
headings in Chapter 3. The preparer of the demonstration plan should be working with the
assigned verification organization and/or an EPA contact. This is required so that all
demonstration activities under the Consortium are coordinated and the appropriate oversight
provided.
The user of Chapter 3 must be aware of the font appearance conventions used to distinguish
text that is intended to provide guidance from text that can be directly included in the plan. The
portions of the manual with the font having a normal appearance (the words in this sentence are
normal) are intended to be included directly into the demonstration plan, assuming the narrative
is appropriate to the demonstration. In places where there is a name, date, single word, or
phrase to be inserted, it will appear in bold. Finally, the text in italics is intended to serve as
guidance to the user in how to prepare the specific section. In an effort to make the
demonstration/verification process as expedient as possible, an electronic copy of Chapter 3 can
be provided. The electronic copy contains all the headings, sub-headings, etc., identified in
Figure 2-2, and the necessary language that is common to all demonstrations.
The developer will need to fill in the technology-specific information and the EPA or the
verification organization will provide those portions identified in Chapter 3. Variations in the
content of a technology-specific demonstration plan are expected, since different technologies
have different characteristics and needs. For example, some field analytical technologies will
have a directly corresponding laboratory method for referee analysis, while others, such as
down-hole geophysical measurements, may require other methods of confirming performance.
Typically, demonstration plans are 50 - 100 pages in length, plus appendices, as appropriate.
Figure 2-2. Table of Contents from a Typical Technology Demonstration Plan
TITLE PAGE
FOREWORD
TABLE OF CONTENTS
EXECUTIVE SUMMARY
ABBREVIATIONS AND ACRONYMS
1.0 INTRODUCTION
1.1 Demonstration Objectives
1.2 What is the Consortium for Site Characterization Technology?
1.3 Technology Verification Process
1.4 Purpose of this Demonstration Plan
2.0 DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION
2.1 Demonstration Organization and Participants
2.2 Organization
2.3 Responsibilities
3.0 TECHNOLOGY DESCRIPTION
3.1 Technology Performance
3.2 History of the Technology
3.3 Technology Applications
3.4 Advantages of the Technology
3.5 Limits of the Technology
3.6 Demonstration Performance Goals
4.0 DEMONSTRATION SITE DESCRIPTIONS
4.1 Site Name and Location
4.2 Site History
4.3 Site Characteristics
5.0 CONFIRMATORY PROCESS
5.1 Method Selection
5.2 Reference Laboratory Selection
5.3 Contingency Laboratory Selection
5.4 Special QC Requirements
5.5 Laboratory Audit
5.6 Statistical Analysis of Results
5.6.1 Methods of Data Reduction and Adjustment
5.6.2 Methods of Statistical Analysis
6.0 DEMONSTRATION DESIGN
6.1 Objectives
6.2 Experimental Design
6.2.1 Qualitative Factors
6.2.2 Quantitative Factors
6.3 Sampling Plan
6.3.1 Sampling Operations
6.3.2 Predemonstration Sampling and Analysis
6.4 Field Data
6.4.1 Field Audit
6.5 Demonstration Schedule
7.0 FIELD OPERATIONS
7.1 Communication and Documentation
7.2 Sample Collection Procedures
7.2.1 Sampling Procedures
7.2.2 Sampling Locations
7.2.3 Sample Preparation
7.2.4 Sample Distribution
7.2.5 Decontamination and Disposal
7.3 Personnel Requirements
7.4 Technology Logistical Needs
7.4.1 Special Needs
8.0 QUALITY ASSURANCE PROJECT PLAN
8.1 Purpose and Scope
8.2 Quality Assurance Responsibilities
8.3 Data Quality Indicators
8.3.1 Representativeness
8.3.2 Comparability
8.3.3 Completeness
8.3.4 Accuracy
8.3.5 Precision
8.4 Calibration Procedures and Quality Control Checks
8.4.1 Initial Calibration Procedures
8.4.2 Continuing Calibration Procedures
8.4.3 Method Blanks
8.4.4 Spike Samples
8.4.5 Laboratory Control Samples
8.4.6 Performance Evaluation Materials
8.4.7 Duplicate Samples
8.5 Data Reduction, Validation, and Reporting
8.5.1 Data Reduction
8.5.2 Data Validation
8.5.3 Data Reporting
8.6 Calculation of Data Quality Indicators
8.7 Performance and System Audits
8.7.1 Performance Audit
8.7.2 On-Site System Audits
8.8 Quality Assurance Reports
8.8.1 Status Reports
8.8.2 Audit Reports
8.9 Corrective Action
9.0 DATA MANAGEMENT AND ASSESSMENT
10.0 HEALTH AND SAFETY PLAN
10.1 Health and Safety Plan Enforcement
CHAPTER 3
ELEMENTS OF A TECHNOLOGY DEMONSTRATION PLAN
After a technology has been accepted into the CSCT Demonstration Program, a letter
agreement will be signed by the developer and the CSCT. The purpose of the agreement is to
specify a framework of responsibilities for conducting the demonstration and evaluating the
technology. Several activities need to be conducted by the EPA, the developer, the verification
organization, and other demonstration participants before the demonstration begins. Response
to the following issues will be required during the preparation of the demonstration plan:
Identification of demonstration sites that provide the appropriate analytes in the
desired environmental sample media or matrices (contaminants must be present
in concentrations amenable to the technology being evaluated)
Definition of the roles of appropriate demonstration participants
Arranging appropriate analytical support for reference laboratory testing
Supplying standard operating procedures (SOPs), analytical methodologies (in
the Environmental Monitoring Management Council format - Appendix B), and
other relevant protocols
It is important to note that the entire demonstration plan, incorporating a Quality Assurance
Project Plan (QAPP), must be approved by the developer, verification organization, and EPA
before the demonstration can proceed.
Note: The section numbers shown in this chapter are used to indicate the order of these
headings in the actual demonstration plan.
TITLE PAGE
FOREWORD
TABLE OF CONTENTS
Provide a Table of Contents for the Demonstration Plan. It should include the headings
provided in this manual although they may be modified as appropriate for a particular
technology demonstration. An example Table of Contents is provided as Figure 2-2.
EXECUTIVE SUMMARY
Describe the contents of the demonstration plan (not to exceed two pages). Include a
summary description of the technology and the performance goals which will be verified during
the demonstration, the demonstration sites, a schedule, and a list of participants.
ABBREVIATIONS AND ACRONYMS
Provide a list of the abbreviations and acronyms used in the demonstration plan.
1.0 INTRODUCTION
This chapter discusses the purpose of the demonstration and the demonstration plan,
describes the elements of the demonstration plan and provides an overview of the Consortium
for Site Characterization Technology (CSCT) and the technology verification process.
1.1 Demonstration Objectives
Discuss the reasons why this demonstration will be performed. Reasons may include:
Generate field data appropriate for verifying the performance of the technology
Evaluate new advances in specific technology (e.g., field portable X-ray
fluorescence spectrometer (FPXRF))
Specifically, this plan defines the following elements of the demonstration:
Roles and responsibilities of demonstration participants;
Procedures governing demonstration activities such as sample collection,
preparation, analysis, data collection, and interpretation;
Experimental design of the demonstration;
Quality assurance (QA) and quality control (QC) procedures for conducting the
demonstration and for assessing the quality of the data generated from the
demonstration (See Appendix E); and,
Health and safety requirements for performing work at hazardous waste sites.
1.2 What is the Consortium for Site Characterization Technology?
The Consortium for Site Characterization Technology (CSCT) is a partnership between the
EPA, and the Departments of Energy and Defense. DoD and DOE have established programs
and facilities (testing venues) for testing, demonstrating, and evaluating the performance of
monitoring, measurement and site characterization technologies, among other technologies. As a
partnership, the Consortium will offer valuable technical expertise to support the demonstration
and verification of the performance of new and emerging technologies and will offer access to a
wide array of testing venues.
A goal of the Consortium is to facilitate the acceptance and use of cost-effective
technologies applicable to a wide range of environmental problems. The Consortium will meet
this goal by working with technology developers and other agencies in planning and conducting
demonstrations, evaluating data generated in demonstrations and managing and disseminating
information. The Consortium is not intended to become another technology testing organization
that must touch every technology, but rather it is designed to support existing demonstration
efforts or developer-driven demonstrations. The Consortium does not offer any financial support
to those desiring to conduct a technology demonstration. The developer is expected to secure the
appropriate resources to support their part of the technology verification process.
1.3 Technology Verification Process
The technology verification process established by the Consortium is intended to serve as a
template for conducting technology demonstrations that will generate high quality data that the
Agency can use to verify technology performance. The Consortium verification process is a
model process that can help in moving innovative site characterization and monitoring
technologies into routine use more quickly. After the completion of the selection process, the
verification of a technology's performance involves five steps:
1. Development of a demonstration/test plan.
2. Execution of the demonstration.
3. Data reduction, analysis, and cost verification.
4. Report preparation.
5. Information transfer.
Although the Agency is interested in any and all innovative site characterization and
monitoring technologies, the Consortium resources, and those of the verification organization,
are limited. Therefore, a major role of the Consortium is to identify the technology and data gaps
that impede cost-effective and efficient environmental problem-solving and to communicate
them to the developer community. This assessment identifies those technologies that meet the
most pressing needs. The information that supports the assessment will be gathered from within
EPA, other Federal agencies, states, tribes, and the user industries to ensure that the most
pressing needs and gaps are addressed first.
1.4 Purpose of this Demonstration Plan
The purpose of the demonstration plan is to describe the procedures that will be used to
verify the performance goals of a technology. This document incorporates the QA/QC elements
needed to provide data of sufficient quality to reach a defensible position regarding
the technology's performance. This is not a method validation study, nor does it represent every
environmental situation which may be acceptable for this technology. But it will provide data of
sufficient quality to make a judgement about the application of the technology under conditions
similar to those encountered in the field demonstration.
2.0 DEMONSTRATION RESPONSIBILITIES AND COMMUNICATION
This section identifies the organizations involved in this demonstration and describes the
primary responsibilities of each organization. It also describes the methods and frequency of
communication that will be used in coordinating the demonstration activities.
2.1 Demonstration Organization and Participants
This demonstration is being conducted by the developer with support from Verification
Organization under the direction of the U.S. Environmental Protection Agency's (EPA) Office
of Research and Development, National Exposure Research Laboratory, Characterization
Research Division - Las Vegas, Nevada (CRD-LV). The CRD-LV's role is to administer the
CSCT Demonstration Program. Verification Organization's role is to provide technical and
administrative leadership and support in conducting the demonstration.
Participants in this demonstration are listed in Table -. The specific responsibilities of each
demonstration participant are discussed in Section 2.3.
Provide a table which includes the name, affiliation, and mailing address of each
demonstration participant, a point of contact, their role, and telephone, fax and email address.
2.2 Organization
The organizational structure for the demonstration showing lines of communication is to be
provided. See Figure 1, Appendix E for an example organization chart.
2.3 Responsibilities
The responsibilities of the developer will vary depending on the type of demonstration. When
a multiple-developer demonstration is held, many of the duties given here will be performed by
the Verification Organization, as they would be common to all of the developers. The Verification
Organization will inform the developer of the specific sections of the demonstration plan they
must provide. For a single developer demonstration, the developer is responsible for all of the
listed duties.
The Developer, in consultation with Verification Organization and the EPA technical lead,
is responsible for the following elements of this demonstration:
Designing and preparing all elements of the demonstration plan;
Developing a quality assurance project plan (QAPP) (Section 8 of the
demonstration plan); The QAPP is an integral part of the demonstration plan,
not a separate document.
Preparing a health and safety plan (HASP) (Section 10 of the demonstration plan) for the
demonstration activities; The HASP could either be included in the demonstration
plan as Section 10 or provided as a separate document and referenced in Section
10.
Acquiring the necessary reference analysis data;
Providing detailed procedures for using the technology; This could either be a standard
operating procedure or an analytical method provided in the EMMC method
format, which is included in Appendix B of this document;
Providing a complete, field-ready technology for the demonstration;
Operating and monitoring the technology during the demonstration;
Documenting the experimental methodology and operation of the technology
during the demonstration;
Providing data reduction and interpretation support, as required; and
Providing logistical and other support, as required.
Verification Organization and the EPA have coordination and oversight responsibilities for:
Providing needed logistical support, establishing a communication network, and
scheduling and coordinating the activities of all demonstration participants;
Ensuring that appropriate sites are selected consistent with the objectives of the
demonstration (developer may furnish a demonstration site(s));
Performing on-site sampling activities including collecting and homogenizing
samples, dividing them into replicates, and bottling, labeling, and shipping them
where appropriate;
Managing, evaluating, interpreting, and reporting on data generated by the
demonstration; and,
Evaluating and reporting on the performance of the technologies.
The site owners and the EPA will provide the following support:
Site access;
Characterization information for the site;
Health and safety information for the site; and,
Other logistical information and support needed to coordinate access to the site
for the field portion of the demonstration.
3.0 TECHNOLOGY DESCRIPTION
This section provides a description of the innovative technology including the scientific
principles, components and application history of the technology. Technology performance goals
are provided, both those specific to the demonstration and those relevant to general applications.
The performance range through which the technology is applicable and advantages and
limitations of the technology are also included.
3.1 Technology Performance
Describe the technology to be demonstrated. This should include:
A brief introduction and discussion of the scientific principles on which the
technology is based
A brief description of the physical construction/components of the technology.
Include general environmental requirements and limitations, weight,
transportability, ruggedness, power and other consumables needed, etc.
Identify the parameters or analytes the technology is designed to measure
Identify the matrices for which the technology is applicable, e.g., soil, water,
sludge, etc.
Cost of the technology (purchase or lease and typical operational costs)
Typical operator training requirements and sample handling or preparation
requirements.
Define the performance range of the technology and verification requirements of
the demonstration
Identify any special licensing requirements associated with the operation of the
technology
3.2 History of the Technology
This section should outline prior applications and experience with the innovative technology.
3.3 Technology Applications
Identify and discuss the environmental measurement problem that the technology is designed
to address, how the technology will solve the problem, and the potential users of the technology.
3.4 Advantages of the Technology
Describe the applications of the technology and what advantages it provides over existing
technology. Provide comparisons in such areas as: initial cost, cost per analysis, speed of
analysis, precision and accuracy of the data, usable or linear operating range, field versus
laboratory operation, solvent use, durability, potential for waste minimization, etc.
3.5 Limits of the Technology
Discuss the known limitations of the technology. Include such items as detection limits in
various matrices (as appropriate), interferences, environmental limits (temperature, vibration,
light, dust, power requirements, water needs, etc.), upper concentration limits, linear range,
operator training and experience requirements, etc.
3.6 Demonstration Performance Goals
This section discusses factors that will be considered in the design and implementation of the
demonstration. These factors include comparability, precision, portability, ease of operation,
ruggedness and instrument reliability, health and safety issues, sample throughput, and sample
matrix effects.
This section is established to summarize the advantages of the technology and to state
formally the results required to demonstrate these advantages. The presentation should address
several issues including: (1) the criterion for a successful demonstration in terms of the quality
parameters listed in Section 8.3 should be described in detail; (2) the range of applications over
which the technology can be applied should be specified; and (3) a summary of the technological
advantages and disadvantages of the innovative technology should be specified.
The Developer must provide performance features of the technology which can be verified
in the demonstration. These elements must be specific and be verifiable by a statistical analysis
of the data.
The performance goals should be within the capabilities of the technology but challenging to
it. Goals that are too easily met may not be of interest to the potential user, while those that are
overstated may not be achievable. This section forms the basis of the entire demonstration and
must be carefully written. The experimental design must include a sufficient number and kinds of
samples to permit the statistical verification of performance.
The performance goal must be specific and be verifiable by a statistical analysis of the data.
An example of a satisfactory goal would be:
"The technology will detect 95 percent of all analytes detected by the reference
laboratory at concentrations greater than 50 parts per billion. The technology will
produce results that are within 30 percent of the reference laboratory on 90 percent of
the samples. The relative standard deviation on replicate samples will be less than 25
percent."
Statements such as "the technology will provide better precision than the referee method"
or "the technology has low detection limits" are not acceptable.
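
For illustration only, the short sketch below (written in Python purely as a convenient notation; the data, thresholds, and variable names are hypothetical and are not prescribed by the Consortium) shows how goals of the kind quoted above could be checked against paired field and reference laboratory results.

# Hypothetical sketch: checking example performance goals against paired
# field-technology and reference-laboratory results (all values are invented).

field = [52.0, 48.0, 110.0, 75.0, 0.0, 63.0]       # field technology results (ppb)
reference = [55.0, 50.0, 100.0, 80.0, 45.0, 60.0]  # reference laboratory results (ppb)
replicates = [58.0, 61.0, 55.0, 57.0]              # field replicates of one sample (ppb)

DETECTION_CUTOFF = 50.0  # ppb, from the example goal
RELATIVE_WINDOW = 0.30   # within 30 percent of the reference result
RSD_LIMIT = 25.0         # percent

# Goal 1: detect at least 95 percent of the analytes the reference laboratory
# detects at concentrations greater than 50 ppb.
pairs_above_cutoff = [(f, r) for f, r in zip(field, reference) if r > DETECTION_CUTOFF]
detection_rate = 100.0 * sum(1 for f, _ in pairs_above_cutoff if f > 0.0) / len(pairs_above_cutoff)

# Goal 2: agree within 30 percent of the reference value on at least 90 percent of the samples.
within = sum(1 for f, r in zip(field, reference) if abs(f - r) <= RELATIVE_WINDOW * r)
agreement_rate = 100.0 * within / len(field)

# Goal 3: relative standard deviation of replicate analyses below 25 percent.
mean_rep = sum(replicates) / len(replicates)
std_rep = (sum((x - mean_rep) ** 2 for x in replicates) / (len(replicates) - 1)) ** 0.5
rsd = 100.0 * std_rep / mean_rep

print(f"Detection rate above {DETECTION_CUTOFF} ppb: {detection_rate:.1f}%")
print(f"Samples within 30% of reference: {agreement_rate:.1f}%")
print(f"Replicate RSD: {rsd:.1f}% (limit {RSD_LIMIT}%)")

Each computed rate can then be compared directly against the stated goal to decide whether that element of performance was verified.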
Goals will differ from demonstration to demonstration, depending on the nature and
maturity of the technology. The developer's performance goals should specify levels of accuracy,
precision, and completeness (see Section 8.3 for detailed definitions) that the demonstration will
test and verify.
Accuracy will, in general, be the most important of the performance goals. If the technology
is not accurate, then the degree to which the results are precise or complete loses importance. If
results from the innovative technology are accurate, then their precision and completeness
become important. Precision and completeness provide a basis for comparing the number of
samples required to characterize the site using one technology to that of another (and hence for
comparing the costs of information for the site characterization). Clearly, these quality
parameters are of interest to potential users of the technology.
Here the developer should specify goals regarding accuracy, precision and completeness.
Section 10.1 of Appendix E provides a discussion of some of the possible measures of accuracy
and precision and Section 8.3.3 provides the definition of completeness.
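
For reference, completeness (see Section 8.3.3) is conventionally computed as the percentage of planned measurements that yield valid, usable results; a minimal hypothetical example follows.

# Generic completeness calculation (the counts shown are hypothetical).
planned_results = 100   # measurements called for in the demonstration design
valid_results = 92      # measurements that produced valid, usable data
completeness_percent = 100.0 * valid_results / planned_results  # 92.0 percent
print(completeness_percent)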
4.0 DEMONSTRATION SITE DESCRIPTIONS
This section discusses the history and characteristics of the demonstration site(s).
4.1 Site Name and Location
Provide the site names and locations; where appropriate, area and location maps should be
included. In most cases, the technology will be demonstrated at more than one site. The
technology should be tested under different geologic, climatologic, and waste environments.
Information on the site history and site characteristics should be available through the
Verification Organization or EPA contact unless the developer is making their own
arrangements for the demonstration sites.
4.2 Site History
Insert the site history. Include history of ownership and uses, especially those relating to the
contamination found at the site.
4.3 Site Characteristics
Provide a description of the site. This should include a description of the buildings, etc. at
the site and the adjoining property. This description should also include a geological description
of the site, including soil types, etc. Provide a list of the known contaminants at the site, including
the distribution and concentrations. Use tables and maps as appropriate.
5.0 CONFIRMATORY PROCESS
The verification process is based on the presence of a statistically validated data set against
which the performance goals of the technology may be compared. The choice of an appropriate
reference method and referee laboratory are critical to the success of the demonstration. The
developer, verification organization, and EPA must work closely to select the best combination
of method and laboratory. In certain cases, modification of existing methods, special QC
requirements, or laboratory training may be required.
5.1 Method Selection
The reference analytical method should be chosen from standard methods approved by EPA
or another recognized body, such as American Society for Testing and Materials (ASTM) or
Association of Official Analytical Chemists (AOAC). The method selected should generate data
similar in quality and type to those expected from the technology being demonstrated. A
justification for selecting a specific method must be provided.
The selection process may identify a non-standard method as providing the best data match.
Since many of the field technologies offer qualitative data, rigorous quantitative laboratory
methods may make direct comparisons unreasonable. Although it is not the purpose of this
program to develop or validate new methods, some modification of existing methods may be
required to ensure that an appropriate method is used for comparison.
5.2 Reference Laboratory Selection
To assess the performance of the insert technology name technology, the data obtained
using the technology will be compared to data obtained using conventional analytical methods.
Developer will propose a reference laboratory in consultation with Verification Organization
and the EPA.
This decision will be based on the experience of prospective laboratories with QA
procedures, reporting requirements, and data quality parameters consistent with the goals of the
Program. Describe how the laboratory was chosen and demonstrate that there is no conflict of
interest with the developer.
The Laboratory must demonstrate past proficiency with the method selected and may be
asked to participate in a review of the experimental design. Laboratory management should be
briefed on the nature and purpose of the demonstration and may suggest enhancements to the
proposed procedure. Approval of the method and procedure will reside with the EPA.
5.3 Contingency Laboratory Selection
A contingency laboratory would be used to support the data from the referee laboratory if
preliminary results differ significantly from those obtained by the technology in the field. This
section should outline the circumstances when a contingency laboratory would be used and the
type of QA/QC which would be employed. The degree to which a contingency laboratory will be
used should also be discussed.
5.4 Special QC Requirements
In order to increase the likelihood that high quality data will be obtained, an enhanced QC
strategy may be required. The use of standard reference materials, double blind standards and
special performance evaluation materials should be considered. The nature and use of these
materials will depend on the technology being tested, the nature of the analytical method, and the
performance of the laboratory. This section should describe the use and benefit of such
materials. Cost and availability of the appropriate materials should also be considered.
5.5 Laboratory Audit
The Verification Organization or the EPA will conduct an on-site audit of the reference
laboratory both before and during sample analysis. The preliminary audit will identify
laboratory history with samples of the type to be analyzed and the reporting requirements. The
use of performance evaluation samples may be considered if insufficient history is evident.
Before such analyses are required, the selection process should be reviewed and reaffirmed. An
audit of the laboratory will be scheduled during the time period in which the field samples are
being analyzed. The audit will address the QC procedures and document any changes to the
analysis process.
This section of the demonstration plan will highlight key elements of the audit process and
any special procedures which must be addressed.
5.6 Statistical Analysis of Results
The discussion in this section relates to the analysis of the results. This section should
provide methods of data reduction and an example showing one possible specific method of
analysis. General methods of statistical analysis differ from application to application too
extensively to be provided here.
5.6.1 Methods of Data Reduction and Adjustment
Before any formal statistical analysis of the data, it is important to determine which data
adjustments and which transformations are appropriate. Likely adjustments include the
identification and removal of outlying values and the transformation of results so that they more
closely satisfy statistical requirements. Statistical analyses of concentration data are often more
accurately performed based on concentration percentages rather than the raw data results.
Proposed methods of data reduction and adjustment should be specified here.
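
As an illustrative sketch only (the screening rule, the transformation, and the data are hypothetical and not prescribed by the Consortium), the fragment below shows the kind of data reduction and adjustment steps a plan might specify: flagging outliers with an interquartile-range rule and log-transforming the remaining results before statistical analysis.

# Hypothetical sketch of data reduction and adjustment: outlier screening
# followed by a logarithmic transformation (all values are invented).
import math
import statistics

results_ppb = [12.0, 15.0, 14.0, 13.5, 240.0, 16.0, 11.0, 14.5]

# Screen outliers with a 1.5 x IQR rule (one of many possible conventions).
q1, q2, q3 = statistics.quantiles(results_ppb, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
screened = [x for x in results_ppb if low <= x <= high]
outliers = [x for x in results_ppb if x < low or x > high]

# Log-transform the screened results so that they more closely satisfy the
# distributional assumptions required by many statistical tests.
log_results = [math.log10(x) for x in screened]

print("Flagged outliers:", outliers)
print("Mean of log10-transformed results:", round(statistics.mean(log_results), 3))

Whatever adjustments are chosen, the plan should state the rule, the justification for it, and how flagged values will be documented.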
5.6.2 Methods of Statistical Analysis
The purpose of this section is to describe the statistical methods that might be used to
evaluate the developer's goals. Appropriate statistical procedures depend on several issues
described next.
The nature of the developer's performance goals -- for example, if a goal of the
developer is as simple as a required percentage error, then the analysis reduces to a
simple computation. More sophisticated goals, such as demonstrating better results with a
(specified) high probability, could involve statistical tests.
The type of data (qualitative vs. quantitative, for example) will dictate different
statistical methodology.
Assumptions associated with the data -- distributional assumptions, for example, can be
requirements for specific statistical tests. Tests or methods of estimation may have to be
proposed where their eventual use is contingent on verification of assumptions through
data analysis.
The treatment of data from the reference laboratory.
Because of these dependencies, it is unreasonable to try to list all possible statistical
procedures that may be appropriate for a specific demonstration. This section should list
methods proposed for statistical analysis of demonstration results.
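The following Python sketch illustrates one such comparison for quantitative data: a paired test of field results against reference laboratory results for the same samples, performed on log-transformed values. The sample values are invented, and the choice of a paired t-test and a log transformation are assumptions for illustration only; the appropriate test depends on the developer's goals and on the data assumptions discussed above.

    import numpy as np
    from scipy import stats

    # Paired results for the same samples: field technology vs. reference laboratory.
    # These values are invented for illustration only (mg/kg).
    field     = np.array([4.2, 7.9, 15.5, 3.1, 22.0, 9.4])
    reference = np.array([4.6, 8.4, 14.1, 3.5, 24.3, 9.0])

    # Compare on the log scale so the test addresses relative (percentage) differences.
    log_diff = np.log(field) - np.log(reference)

    # Paired t-test of the hypothesis that the mean log difference is zero.
    t_stat, p_value = stats.ttest_1samp(log_diff, popmean=0.0)

    # Simple comparability summary: mean bias of field results relative to the laboratory.
    relative_bias = 100.0 * np.mean((field - reference) / reference)

    print(f"mean relative bias: {relative_bias:+.1f}%")
    print(f"paired t on log results: t = {t_stat:.2f}, p = {p_value:.3f}")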
6.0 DEMONSTRATION DESIGN
This section discusses the objectives of the demonstration, factors that must be considered to
meet the performance objectives, and the information that the Verification Organization will use to
evaluate the results of the demonstration.
6.1 Objectives
The primary objectives of this demonstration are to evaluate insert technology name in the
following areas: (1) how well it performs relative to conventional analytical methods, (2) the
impacts of sample matrix variations on performance, and (3) the logistical and economic
resources necessary to operate the technology. Secondary objectives for this demonstration are
to evaluate insert technology name for its reliability, ruggedness, cost, and range of usefulness,
data quality, and ease of operation. The performance will be compared to the performance of
conventional analytical methods used in performing similar site characterization activities. The
verification process will evaluate the performance of the technology against the developer's
performance goals as stated in Section 3.6.
6.2 Experimental Design
This section discusses factors that will be considered in the design and implementation of the
demonstration. These factors include comparability, precision, portability, ease of operation,
ruggedness and instrument reliability, health and safety issues, sample throughput, and sample
matrix effects.
Demonstration procedures will simulate routine field conditions as much as possible by
testing the performance characteristics of the technology which were defined in Section 5.2.
6.2.1 Qualitative Factors
Some factors, while important, are difficult or impossible to quantify. These are considered
qualitative factors.
Typical factors to be discussed are listed below. Discuss those that are appropriate to your
technology; others may be added:
•  Reliability or susceptibility to environmental conditions
•  Effect of operator experience on results
•  Portability of the technology
•  Special logistical requirements
Add additional factors as appropriate for the technology.
6.2.2 Quantitative Factors
Many factors in this demonstration can be quantified by various means. Some can be
measured while others, such as the presence of interferents, cannot be controlled. The
demonstration should be designed to distinguish between these factors and to report on the
results of each.
Typical factors to be discussed are listed below. Discuss those that are appropriate to your
technology; others may be added.
•  Weight and size of the technology
•  Power and environmental requirements
•  Cost of operation, expendables, and waste disposal
Add additional factors as appropriate for the technology.
These quantitative factors will be used to assess technology performance by comparison to
reference laboratory data. Where practical, 100 percent of the samples will be submitted for
referee analysis. Common reference materials will be supplied to both the field technology and
laboratory so that direct comparisons can be made.
6.3 Sampling Plan
The sampling plan for this demonstration specifies the procedures that will be used to
ensure the consistency and integrity of the samples. In addition, this plan will outline the sample
collection procedures necessary to meet the demonstration purpose and objectives. Careful
adherence to these procedures will ensure that sample data analyzed by the reference laboratory
using conventional analytical methods will be directly comparable to the data obtained from the
field technology.
A table should be used to summarize the sampling and analysis strategy, giving the numbers
and types of samples needed. An example table is provided in Appendix E, Table E-1.
6.3.1 Sampling Operations
Provide an overview of the sample collection activities. Discuss specific aspects of the field
sampling effort. These aspects include:
•  The matrix to be sampled. The requirement for different sampling techniques,
   equipment, etc., depends on whether the target matrix is a single-phase system,
   such as soil, sediment, air, or water, or a multimedia sample involving multiple
   phases.
•  A discussion of the sampling design and the physical locations of the samples to
   be collected.
•  Selection of the sampling equipment. Discuss the considerations necessary in choosing a
   proper sampling method. Some of these aspects include: sampler compatibility in terms of the
   physical and chemical characteristics of the tool; matrix effects resulting from the use of the
   tool as related to the need for disturbed versus undisturbed samples; whether the volume
   capacity is sufficient to obtain the needed sample amount in a single grab or whether multiple
   grabs will be needed; the physical requirements necessary to operate the tool, such as the need
   for an external power source and the weight and size of the tool as they relate to its transport
   and use; the ease of operation of the equipment (whether there is a need to train personnel on
   using the tool); whether the tool requires multiple operators or just a single person; the
   decontamination and reuse potential in terms of the ease, success, and time required to
   decontaminate; the cost of the equipment; and the usability of the equipment for various
   sample types, such as soil or sediment and ground or surface water, etc.
Provide or include an SOP for the sampling activities. The SOP should include such items as
equipment and supply needs, transport to the field, and reserves/backups required should equipment
failures occur.
Guidance for performing some of these tasks may be found in Test Methods for Evaluating
Solid Waste, EPA publication number SW-846, Third Edition. The verification organization
and/or an EPA representative can make this reference available.
6.3.2 Predemonstration Sampling and Analysis
A predemonstration sampling and analysis event is required to establish that the samples
from the proposed site are appropriate for the technology and the reference method. The
technology developer and the referee laboratory should each analyze replicate portions of one or
more samples from the site. An objective of this activity should be to better anticipate the
numbers of samples and replicates that will be required to meet statistical analysis requirements.
This predemonstration sampling will allow the technology developers to refine their
technologies and revise their operating instructions, if necessary. This sampling will also allow
an evaluation of matrix effects or interferences that may affect the demonstration. Information
generated through this predemonstration sampling and analysis event may be used to revise the
demonstration sampling and analysis plan, if necessary. A failure to meet the performance goals
at this point could indicate a lack of maturity of the technology and the demonstration would be
canceled.
The Verification Organization will provide the predemonstration samples for this portion of the
demonstration. This sampling requirement has the following objectives:
•  To provide or verify data on the extent of contamination at each site, and locate
   sampling areas for the demonstration
•  To allow the developers to analyze samples from the demonstration sites in
   advance, and, if necessary, refine and calibrate their technologies and revise their
   operating instructions
•  To allow an evaluation of any unanticipated matrix effects or interferences that
   may occur during the demonstration
Results of the predemonstration sample analyses by the developers must be made available
to the Verification Organization within two weeks after the receipt of the samples. This will allow
time for analysis of the data and notification if problems exist.
The verification organization will discuss the details of the predemonstration sampling and
analysis activities. Discuss the sample collection, handling, and analysis procedures which were
used. Samples are to be analyzed by both the developer and a reference laboratory. The use of
performance evaluation samples should also be included in the analysis process.
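Because one objective of the predemonstration event is to anticipate the numbers of samples and replicates needed, a rough sample-size calculation based on the predemonstration variability can be useful. The sketch below is an assumed, illustrative approach using a normal-approximation formula; the inputs are invented, and the actual calculation should follow the statistical methods specified in Section 5.6.

    import math
    from scipy import stats

    def paired_samples_needed(sd_of_diff, detectable_diff, alpha=0.05, power=0.80):
        """Approximate number of paired samples needed to detect a mean difference of
        `detectable_diff` between field and reference results, given the standard
        deviation of paired differences estimated from predemonstration data.
        Uses n = ((z_(1-alpha/2) + z_power) * sd / diff)**2 (illustrative assumption).
        """
        z_alpha = stats.norm.ppf(1.0 - alpha / 2.0)
        z_power = stats.norm.ppf(power)
        return math.ceil(((z_alpha + z_power) * sd_of_diff / detectable_diff) ** 2)

    # Example: predemonstration replicates suggest the paired differences have a standard
    # deviation of 2.0 mg/kg, and a 1.5 mg/kg bias should be detectable.
    print(paired_samples_needed(sd_of_diff=2.0, detectable_diff=1.5))   # -> 14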
6.4 Field Data
The technology will be operated by the Developer, who will provide the results to the EPA
or Verification Organization. The Developer will be responsible for reducing the raw data into
a presentation format consistent with the evaluation requirements. The Developer will submit all
QC data and a description of how this data may be used to validate the field data.
6.4.1 Field Audit
The EPA or Verification Organization will conduct an audit of all field activities. This
activity will document any deviations from the demonstration plan, use of QC materials,
operational details, and other factors associated with an evaluation of the field technology. This
audit report will be included as part of the Quality Assurance Project Plan (Section 8.0).
6.5 Demonstration Schedule
Predemonstration sampling for this demonstration is planned for insert date. Samples
collected at this time will be used by the developers to test and possibly recalibrate their
technologies using the site-specific samples. The predemonstration data will be used to select
demonstration sampling areas and verify the magnitude and distribution of contaminants at each
site.
Demonstration activities, including sampling and analysis, are scheduled to occur on insert
date. Details of the sampling activities are provided in the next section.
A typical demonstration schedule is provided in Appendix D.
7.0 FIELD OPERATIONS
This section will describe the logistical requirements associated with sample collection and
technology operation. This phase of the demonstration requires close communication between the
Developer, Verification Organization and the EPA. Preliminary testing and personnel training
may be required before the actual field study. Successful field operations require detailed planning
and extensive communication. The implementation of the demonstration must be consistent with
the requirements of the study and routine operation of the technology.
7.1 Communication and Documentation
The Verification Organization will communicate regularly with the demonstration
participants to coordinate all field activities associated with this demonstration and to resolve any
logistical, technical, or QA issues that may arise as the demonstration progresses. The successful
implementation of the demonstration will require detailed coordination and constant
communication between all demonstration participants.
All Developer/Verification Organization field activities will be thoroughly documented.
Field documentation will include field logbooks, photographs, field data sheets, and chain-of-
custody forms.
The Verification Organization field team leader will be responsible for maintaining all field
documentation. Field notes will be kept in a bound logbook. Each page will be sequentially
numbered and labeled with the project name and number. Completed pages will be signed and
dated by the individual responsible for the entries. Errors will have one line drawn through them
and this line will be initialed and dated.
All photographs will be logged in the field logbook. These entries will include the time, date,
direction, subject of the photograph, and the identity of the photographer. Specific notes about
each sample collected will be written on sample field sheets and in the field logbook. Any
deviations from the approved final demonstration plan will be thoroughly documented in the field
logbook and provided to the verification organization.
The Developer will obtain all equipment needed for field work associated with this demonstration.
Provide a list of all equipment to be used in the demonstration. A table format is suggested.
7.2 Sample Collection Procedures
The Verification Organization will collect and prepare samples using the procedures
described below. All field activities will conform with requirements of the health and safety plan
(HASP) and with all requirements in this demonstration plan.
If unanticipated or unusual situations are encountered that may alter the sampling design,
sampling location, or data quality, the situation will be discussed with the EPA technical lead. Any
deviations from the approved final demonstration plan will be thoroughly documented.
The sampling design should be described in this section. Items to include are: the number of sites and
the criteria used for their selection; the number of samples and the rationale for that number; and how the
samples will be distributed by location, matrix, and analyte concentration. For certain types of technologies, e.g., geophysical,
where direct sample collection is not applicable, a sample would be defined as a reading or set of
readings at a particular location.
7.2.1 Sampling Procedures
Describe the detailed sample collection procedures. Sufficient detail must be provided to direct
the step by step sample collection process. Identify the tools and collection procedures,
contamination prevention, and decontamination procedures.
7.2.2 Sampling Locations
Identify the sampling locations and how specific sampling locations within the site will be
selected. Considerations would include such things as analytes, concentration, soil type, sampling
depth, etc.
7.2.3 Sample Preparation
Describe the procedures used to preserve or homogenize the sample. Cite differences between
field analysis and requirements for reference laboratory analysis. Justify any differences between
the standard method and the field sample preparation requirements.
7.2.4 Sample Distribution
The Verification Organization will be responsible for sampling and sample distribution. The
procedures which ensure homogeneity and integrity must be described.
7.2.4.1 Laboratory Samples
Describe the sample storage, packaging and shipping procedures. Include information on
allowable holding times, chain of custody, etc. for the referee samples.
7.2.4.2 Field Samples
Describe the requirements for sample size, format, and short term storage appropriate to the
technology.
7.2.4.3 Archive Samples
Whenever possible, test samples should be preserved and stored for future use. Describe the
procedures and a suitable location for long term storage.
7.2.5 Decontamination and Disposal
Describe the decontamination and/or disposal of all contaminated items. This includes
sampling equipment, protective clothing, and other items. Specify a procedure for separating
hazardous waste from non-hazardous waste.
7.3 Personnel Requirements
List any special training or requirements needed to support the operation of the technology.
7.4 Technology Logistical Needs
Provide a list of all normal support equipment needed to operate the technology in the field.
7.4.1 Special Needs
List any special equipment needed to support the requirements of this demonstration.
8.0 QUALITY ASSURANCE PROJECT PLAN (QAPP)
The QAPP for this demonstration specifies procedures that will be used to ensure data quality
and integrity. Careful adherence to these procedures will ensure that data generated from the
demonstration will meet the desired performance objectives and will provide sound analytical
results.
Detailed guidance is provided in Appendix E, Quality Assurance/Quality Control (QA/QC)
Requirements. It is strongly recommended that Appendix E be thoroughly reviewed prior to the
preparation of this section. This appendix is intended to supplement the material provided in this
section. The guidance contained in this section supersedes all other instructions.
8.1 Purpose and Scope
The primary purpose of this section is to outline steps that will be taken by operators of the
insert technology name technology and by the reference laboratory to ensure that data resulting
from this demonstration is of known quality and that a sufficient number of critical measurements
are taken. The EPA considers the demonstration to be classified as a Category II project. This
section of the demonstration plan addresses the key elements that are required for Category II
projects prepared according to guidelines in the EPA guidance documents "Preparation Aids for the
Development of Category II Quality Assurance Project Plans" (Simes 1991; the relevant parts of
this document are in Appendix E), "Preparing Perfect Project Plans" (1989), and "Interim
Guidelines and Specifications for Preparing Quality Assurance Project Plans" (Stanley and Verner
1983).
8.2 Quality Assurance Responsibilities
The Developer project manager is responsible for coordinating the preparation of the QAPP for
this demonstration and for its approval by the EPA project manager and the Verification
Organization. The Developer project manager will ensure that the QAPP is implemented during
all demonstration activities. The Developer QA manager for the demonstration will review and
approve the QAPP and will provide QA oversight of all demonstration activities. The QA audit
function will be the responsibility of the EPA.
The entire demonstration plan including the QAPP must be approved by the Developer,
Verification Organization, and EPA before the demonstration can proceed.
Samples will be collected and analyzed on site by the insert technology name technology and
off site by the reference laboratory using EPA-approved methods (see Appendix E, Section 6.1).
Primary responsibility for ensuring that sampling activities comply with the requirements of the
sample collection procedures (Section 7.2) will rest with the EPA technical lead or Verification
Organization field team leader. QA/QC activities for the insert technology name technology will
include those activities recommended by Developer and those required by the EPA or Verification
Organization to assure the demonstration will provide data of the necessary quality.
QA/QC activities for the reference laboratory analysis of samples will be the responsibility of
the reference laboratory supervisor. If problems arise or any data appear unusual, they will be
thoroughly documented and corrective actions will be implemented as specified in this section. The
QA/QC measurements made by the reference laboratory are dictated by the analytical methods
being used.
8.3 Data Quality Indicators
The data obtained during the demonstration must be of sufficient quality for conclusions to be
drawn on the insert technology name technology. For all measurement and monitoring activities
conducted for EPA, the Agency requires that data quality parameters be established based on the
proposed end uses of the data. Data quality parameters include five indicators of data quality:
representativeness, completeness, comparability, accuracy, and precision.
Data generated by the insert technology name technology will be compared to the data
generated from Fill in details on the reference method(s). High quality, well documented
reference laboratory results are essential for meeting the purpose and objectives of this
demonstration. Therefore, the following indicators of data quality will be closely evaluated to
determine the performance of the technology when measured against data generated by the
reference laboratory.
For examples of discussing data quality parameters, including suggested tabular formats, see
Appendix E, Section 4. The referee data must be of sufficient quality to allow comparison to the
technology. This may require tighter QC limits than are normally used by the reference method.
8.3.1 Representativeness
Representative samples, in general, are samples that contain a reasonable cross-section of the
"population" over which they are to be used to make inferences. The population for demonstrations
analyzed as part of this project includes a variety of media (groundwater, soil, soil-gas) and
contaminants that the innovative technologies are developed to accommodate. In order to test the
innovative technology across this range, care must be taken in site selection, experimental design
and sample planning and in sample collection, preparation, storage and shipping.
In preparing the demonstration plan, one should consider the following:
1) The design for the set of demonstrations should consider the media and contaminants at
each site, and the site selection process should be performed to try to include the entire
range of these materials for which the technology is applicable. When this is not possible,
the demonstration plan should specify why the selected demonstration sites provide a fair
indication of the technologies' capabilities.
2) The experimental design and sampling plan should guarantee that the site is being fully
utilized (for its purpose as determined above in 1). For example, a site containing more
than one soil type should utilize a design and sample plan that will very likely include
these materials.
3) The procedures for collection, preparation, storage and transport of the samples at the
site and the reference laboratory must be specified to assure that when the samples are
analyzed at the laboratory, they are representative of the samples analyzed in the field.
8.3.2 Comparability
Comparability is a quality parameter determined for the most part in the planning stages of the
demonstration, often on the basis of prior knowledge of the innovative technologies' performance
capabilities. First, the innovative technology must be comparable in some way to a reference or
baseline method for the demonstration to be worthwhile. Given this, questions arise as to how the
technologies should be compared. Two issues that must be addressed in this section are: 1) should
the comparison be made quantitatively, semi-quantitatively or qualitatively? and 2) how will
differences be addressed?
The first of these questions refers to whether the sample results will be qualified as detect/non-
detect by the two technologies (innovative and reference) or whether numerical estimates of
contamination levels will be compared. Intermediate cases (semi-quantitative) are possible; for
example, samples could be classified as clean, low, or highly contaminated. The second question
addresses the issue of which method is correct when discrepancies occur between results of the two
methods. For example, will results of the referee laboratory be treated as the true value, or will
imprecision in analytical results be considered in the comparison?
8.3.3 Completeness
Completeness refers to the amount of data collected from a measurement process expressed as
a percentage of the data that would be obtained using an ideal process under ideal conditions.
This subsection should include a listing of the factors that might contribute to incomplete data
along with an estimate of the percentage complete for the demonstration. These factors and
estimates should be provided for both the innovative and the reference technologies. For the
innovative technology, these estimates should be included in Section 3.6 with the developer's
performance goals, and the factors listed in Section 3.5 with the limitations of the innovative
technology. For the reference technology, estimates should be included with the developer's
performance goals, while the factors may be included in Section 3.4, advantages of the innovative
technology.
8.3.4 Accuracy
Accuracy is a measure of how close, on average, values of the innovative technology are to the
true values. Inaccuracies or biases are the result of systematic differences between these values.
When comparing the innovative technology to a reference technology difficulties can arise. In
some cases biases can be attributed to the innovative technology. These biases are often the result
of poor calibration. Other possible sources of bias include systematic errors in standards
preparation, biases introduced in the sample extraction, storage and shipping processes and biases
resulting from setup-related differences at the reference laboratory. Only the first of these
sources is likely to be incurred by users of the innovative technologies. Most of the remaining
sources represent inaccuracy that might be avoided through use of the innovative technology.
Consequently, every effort should be made by the verification organization, the developers, and the
reference laboratory to identify specific sources of inaccuracy. The design of blanks, duplicates and
performance assessment samples should provide substantiating evidence to support this partitioning
of sources of inaccuracy when results become available.
Differences that appear as inaccuracies in a single demonstration may reflect imprecision in
calibration procedures that might lead to different biases on another day and under different
environmental conditions (and hence become a precision issue on a larger scale). Since there will
be too few demonstrations to analyze variability associated with the calibration procedures, care
should be taken by the developers in making claims concerning this portion of the process.
Accuracy in environmental applications is often best specified as a percentage of the true
contaminant concentration. Several possible measures of accuracy are listed in Section 10.1 of
Appendix E.
Accuracy is clearly among the most important aspects of the innovative technology to be
analyzed in the demonstration. Special care should be given to define the appropriate performance
goal for this factor.
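As an illustrative sketch only, accuracy expressed as a percentage of the true concentration can be computed as follows. The example values and the 50 mg/kg certified value are invented, and the two measures shown are assumptions; the actual accuracy measures should be those selected from Section 10.1 of Appendix E.

    import numpy as np

    def percent_recovery(measured, true_value):
        """Accuracy expressed as a percentage of the true contaminant concentration,
        e.g., for a performance evaluation sample of known composition."""
        return 100.0 * measured / true_value

    def mean_bias_percent(measured, true_value):
        """Average bias relative to the true value, in percent."""
        return 100.0 * (np.mean(np.asarray(measured, dtype=float)) - true_value) / true_value

    # Example: five analyses of a PE sample with an assumed certified value of 50 mg/kg.
    pe_results = [46.0, 52.5, 48.3, 49.1, 51.0]
    print([round(percent_recovery(x, 50.0), 1) for x in pe_results])
    print(f"mean bias: {mean_bias_percent(pe_results, 50.0):+.1f}%")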
8.3.5 Precision
Precision, in general, refers to the degree of mutual agreement among measurements of the
same materials and contaminants. Environmental applications often involve situations where
"measurements of the same materials" can take on a number of interpretations. In environmental
applications, precision is often best specified as a percentage of contaminant concentration.
Several possible measures of precision are listed in Section 10.1 of Appendix E. The following lists
several possible interpretations of precision for environmental applications.
1) The precision involved in repeated measurements of the same sample without adjusting
the test equipment.
2) The precision involved in repeated measurements of the same sample after reset,
repositioning, or re-calibration of the test equipment or when using different equipment of
the same technology.
3) The precision involved in measurements of materials taken from adjacent locations.
4) The precision characteristics of a specific technology in determining contamination at a
specific site or at an arbitrary site.
In general, users of the technology will want to be assured that imprecision in 1) and 2) is
small. The interpretation of precision described in 3) is likely to be too site specific to be of general
interest. The imprecision discussed in 4) is perhaps of most interest as it includes imprecision
resulting from possible differences in the design activities and effects of environmental conditions
such as temperature that would vary from one site characterization to another as well as site and
technology specific sources. If available, this information would provide the potential user with an
estimate of how close a site characterization using this technology would come to providing the
true site contaminant levels. Unfortunately, it is unlikely that the demonstrations will be extensive
enough to provide much information on how this estimate would be provided.
Different components of precision may be important for different technologies or for different
applications of the same technology. The developer's claims of precision should be specified in the
developer's performance goals, Section 3.6, and should distinguish clearly between different
interpretations of precision. Methods for evaluating these claims should be specified in the section
on statistical methods for evaluating demonstration results.
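A minimal sketch of two common precision measures, both expressed as a percentage of concentration, is shown below. The values are invented, and the choice of measures (relative standard deviation and relative percent difference) is an assumption; the measures actually used should be those selected from Section 10.1 of Appendix E and tied to the interpretations of precision listed above.

    import numpy as np

    def relative_std_dev(replicates):
        """Precision of repeated measurements of the same sample, expressed as the
        relative standard deviation (RSD) in percent."""
        replicates = np.asarray(replicates, dtype=float)
        return 100.0 * np.std(replicates, ddof=1) / np.mean(replicates)

    def relative_percent_difference(x1, x2):
        """Precision of a duplicate pair, expressed as the relative percent difference (RPD)."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    # Example: four repeat readings of one sample, and one field duplicate pair (mg/kg).
    print(f"RSD: {relative_std_dev([10.2, 9.8, 10.5, 10.1]):.1f}%")
    print(f"RPD: {relative_percent_difference(10.2, 11.0):.1f}%")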
8.4 Calibration Procedures and Quality Control Checks
This section describes the calibration procedures and method-specific QC requirements that
apply to both the technology and the referee analyses. It also contains a discussion of the corrective
action to be taken if the QC parameters fall outside of the evaluation criteria.
The quality control checks provide a means of measuring the quality of data produced. The
developer may not need to use all of the quality control checks identified in this section. The selection of the
appropriate quality control checks depends on the technology, the experimental design and the
performance goals. The selection of quality control checks will be based on discussions among the
developer, the EPA technical lead, and the verification organization.
8.4.1 Initial Calibration Procedures
Describe the initial calibration procedure for the technology. Initial calibration consists of
those procedures used to establish the qualitative and quantitative performance of the technology
for the analytes of interest prior to making measurements. Include the frequency and acceptance
criteria for initial calibration and the actions taken if criteria are not met. The initial calibration
procedures for the reference method may be incorporated by reference.
8.4.2 Continuing Calibration Procedures
Describe the continuing calibration procedures. Include the frequency and acceptance criteria
and action if criteria are not met. Continuing calibrations are those procedures used to assure the
technology maintains the calibration that was established in the initial calibration. The continuing
calibration for the reference method may be included by reference.
8.4.3 Method Blanks
Method blanks are used to evaluate technology-induced contamination, which may cause false
positive results. Describe the use of method blanks, the materials used, the frequency, the criteria
for acceptable method blanks and the actions if criteria are not met. Include this information for
both the technology and the reference method.
8.4.4 Spike Samples
The use of spiked samples will depend on the technology. If spiked samples are to be used
specify the procedure, frequency, acceptance criteria, and actions if criteria are not met.
8.4.5 Laboratory Control Samples
Laboratory control samples are samples of known composition that are analyzed periodically
to assure that the analytical system is in control. These are analyzed just like a regular sample.
These may or may not be appropriate for the technology. They may be required for the reference
method. If laboratory control samples are used, their composition should be described in this
section.
8.4.6 Performance Evaluation Materials
Performance evaluation materials are samples whose composition is unknown to the analyst
that are used to evaluate system performance. PE samples will be submitted to the reference
laboratory and to the technology for analysis. The control limits for the PE samples will be used to
evaluate the technology and reference laboratory's method performance.
PE samples come with statistics about each sample which have been derived from the analysis
of the sample by a number of laboratories using EPA-approved methods. These statistics include a
true value of the PE sample, a mean of the laboratory results obtained from the analysis of the PE
sample, and an acceptance range for sample values. The reference laboratory is expected to
provide results from the analysis of the PE samples that meet the performance objectives of the
demonstration.
Describe any specific procedures appropriate to the analysis of the PE materials. It must be
clear how these samples are going to be used in the demonstration. One use of PE materials is in
the conduct of a performance audit.
8.4.7 Duplicate Samples
Duplicate samples must be analyzed by the technology to determine the precision of the
analysis. These same samples should be submitted in duplicate for analysis by the reference method
so the precision of the methods can be compared. Precision often represents one of the key
performance claims for the technology; therefore, careful selection of the type and number of
duplicate samples is critical to the success of the demonstration. Provide the procedure for
determining samples to be analyzed in duplicate, the frequency and approximate number.
8.5 Data Reduction, Validation, and Reporting
To maintain good data quality, specific procedures will be followed during data reduction,
validation, and reporting. These procedures are detailed below.
Also see Appendix E, Section 7 for additional guidance.
8.5.1 Data Reduction
Data reduction refers to the process of converting the raw results from the technology into a
concentration or other data format which will be used in the comparison. The procedures to be
used will be technology dependent. The purpose of this step is to provide data which will be used to
verify the performance claims. These data will be obtained from logbooks, instrument outputs, and
computer outputs as appropriate. Describe what will be provided and the procedures used to
assure that the data are correct. The actual comparisons will be performed by the Verification
Organization or EPA.
8.5.2 Data Validation
The operator will verify the completeness of the appropriate data forms and the completeness
and correctness of data acquisition and reduction. The reference laboratory or field team supervisor
will review calculations and inspect laboratory logbooks and data sheets to verify accuracy,
completeness, and adherence to the specific analytical method protocols. Calibration and QC data
will be examined by the individual operators and the laboratory supervisor. Laboratory project
managers and QA managers will verify that all instrument systems are in control and that QA
objectives for accuracy, completeness, and method detection limits have been met.
Analytical outlier data are defined as those QC data lying outside a specific QC objective
window for precision and accuracy for a given analytical method. Should QC data be outside of
control limits, the reference laboratory or field team supervisor will investigate the cause of the
problem. If the cause is an analytical problem, the sample will be reanalyzed. If the
problem can be attributed to the sample matrix, the result will be flagged with a data qualifier. This
data qualifier will be included and explained in the final analytical report.
8.5.3 Data Reporting
This section contains a list of the data to be reported by both the technology and the reference
method. At a minimum, the data tabulation will list the results for each sample and include
detection limits, reporting units, sample numbers, results, and data qualifiers. All QC information
such as calibrations, blanks and reference samples are to be included. All raw analytical data
should also be reported. All data should be reported in hardcopy and electronically in a common
spreadsheet or database format.
8.6 Calculation of Data Quality Indicators
Provide the equations for any data quality indicator calculations that may be used. These
include: precision, relative percent deviation, standard deviation, accuracy, and completeness. For
additional information refer to Appendix E, Section 10.
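For reference, one commonly used set of indicator equations is reproduced below in LaTeX form. These are standard definitions offered only as an illustration; the equations actually used should be taken from, and must be consistent with, Appendix E, Section 10.

    \begin{align*}
    \text{Relative percent difference:}\quad
      \mathrm{RPD} &= \frac{|C_1 - C_2|}{(C_1 + C_2)/2} \times 100 \\
    \text{Standard deviation:}\quad
      s &= \sqrt{\frac{\sum_{i=1}^{n}\left(C_i - \bar{C}\right)^2}{n - 1}} \\
    \text{Accuracy (percent recovery):}\quad
      \%R &= \frac{C_{\text{measured}}}{C_{\text{true}}} \times 100 \\
    \text{Completeness:}\quad
      \%C &= \frac{\text{number of valid results}}{\text{number of planned results}} \times 100
    \end{align*}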
8.7 Performance and System Audits
The following audits will be performed during this demonstration. These audits will determine
if this demonstration plan is being implemented as intended.
Guidance on conducting audits is available in Appendix E, Section 9.
8.7.1 Performance Audit
Performance evaluation (PE) samples will be submitted to the reference laboratory and to the
insert technology name technology for analysis. The control limits for the PE samples will be used
to evaluate the insert technology name technology and the reference laboratory's method
performance.
The insert technology name technology will analyze the PE samples periodically during the
demonstration.
Specify control limits and the corrective measures which will be used if the results exceed the
specified acceptance criteria.
8.7.2 On-Site System Audits
On-site system audits for sampling activities, field operations, and laboratories will be
conducted as requested by the EPA project manager. These audits will be performed by the EPA
Project Manager or the Verification Organization.
8.8 Quality Assurance Reports
QA reports provide the necessary information to monitor data quality effectively. It is
anticipated that the following types of QA reports will be prepared as part of this demonstration.
See Appendix E, Section 11 for more information.
8.8.1 Status Reports
The developer and verification organization project managers will prepare periodic reports for
the EPA project manager. These reports should discuss project progress, problems and associated
corrective actions, and future scheduled activities associated with the demonstration. When
problems occur, the developer and verification organization project managers will discuss them
with the EPA project manager or EPA technical lead, estimate the type and degree of impact, and
describe the corrective actions taken to mitigate the impact and to prevent a recurrence of the
problems. Provide the frequency, format, and content of these reports.
8.8.2 Audit Reports
Any QA audits or inspections that take place in the field or at the reference laboratory while the
demonstration is being conducted will be formally reported by the auditors to the EPA project
manager who will forward them to the developer, Verification Organization QC Manager, and the
Verification Organization project manager for appropriate actions.
8.9 Corrective Action
Each demonstration plan must incorporate a corrective action plan. This plan must include the
predetermined acceptance limits, the corrective action to be initiated whenever such acceptance
criteria are not met, and the names of the individuals responsible for implementation.
Routine corrective action may result from common monitoring activities, such as:
•  Performance evaluation audits
•  Technical systems audits
•  Calibration procedures
9.0 DATA MANAGEMENT AND ASSESSMENT
The developer, Verification Organization, and EPA each have distinct responsibilities for
managing and analyzing demonstration data. The verification organization is responsible for
managing all the data and information generated during the demonstration. The developer is
responsible for furnishing those records generated by the technology operator. EPA and the
verification organization are responsible for analysis and verification of the data.
There are a variety of pieces of data and information that will be generated during a
demonstration. Each piece of data or information identified for collection in the demonstration
plan will need to be provided to EPA or the Verification Organization. This section should describe
what types of data and information needs to be collected and managed. It should also describe how
the data will be reported to Verification Organization for evaluation.
Innovative Technology Data: The developer is responsible for obtaining, reducing,
interpreting, validating, and reporting the data associated with the technology's performance.
These data should be reported in hard copy and electronic format (e.g., spreadsheet). A data
dictionary should be included as a text file on the same diskette containing the data.
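A short, purely hypothetical example of such a data dictionary is shown below; the file name, field names, and layout are invented for illustration and should be replaced with whatever columns the developer actually reports.

    # DATADICT.TXT : describes the columns of the results spreadsheet (hypothetical example)
    SAMPLE_ID      text      Unique sample identifier assigned by the Verification Organization
    ANALYTE        text      Target analyte name
    RESULT         numeric   Reported concentration
    UNITS          text      Reporting units (e.g., mg/kg)
    DETECT_LIMIT   numeric   Method detection limit for this analyte and matrix
    QUALIFIER      text      Data qualifier flag, if any (e.g., J, U)
    ANALYSIS_DATE  date      Date and time of analysis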
Reference Laboratory Analyses: The raw data and the validated data must be provided to the
Verification Organization. These data should be provided in hard copy and in electronic format. As
with the data generated by the innovative technology, the electronic copy of the laboratory data
should be provided in a spreadsheet, and a data dictionary should be provided. In addition to the
sample results, all QA/QC summary forms for the referee analyses must be provided.
Other items that must be provided include:
•  field notebooks;
•  photographs, slides, and videotapes (copies);
•  results from the use of other field analytical methods;
•  lithologic logs;
•  drilling logs; and
•  cone penetrometer traces.
10.0 HEALTH AND SAFETY PLAN
The Health and Safety Plan (HASP) is a very important part of the demonstration plan. It
should be an adaptation or appendix to the existing site HASP with any additions that are specific
to the demonstration. A copy of the site HASP should be available from the site manager or through
the verification organization or EPA contact. Figure 3-1 contains a representative list of topics that
should be addressed in the HASP. The verification organization can provide assistance in
preparing this section of the demonstration plan.
10.1 Health and Safety Plan Enforcement
The Verification Organization project manager, field site supervisor, and site health and
safety officer will be responsible for implementing and enforcing the health and safety plan. The
Verification Organization project manager will ultimately be responsible for ensuring that all
demonstration participants abide by the requirements of this HASP. The Verification
Organization field site supervisor will oversee and direct field activities and is responsible for
ensuring compliance with this HASP.
Figure 3-1. Typical Table of Contents from a Health and Safety Plan.
Health and Safety Plan - XYZ, Inc. Site
Introduction
Purpose and Policy
Health and Safety Plan Enforcement for the XYZ, Inc. Site
Project Manager and Field Site Supervisor
Health and Safety Director
Site Health and Safety Officer
Visitors
Site Background
Demonstration-Specific Hazard Evaluation
Exposure Pathways
Inhalation
Dermal Contact
Ingestion
Health Effects
Physical Hazards
Fire
Heat Stress
Mechanical
Unstable/Uneven Terrain
Insect and Other Animal Stings and Bites
Noise
Electrical
Inclement Weather
Training Requirements
Personal Protection
Levels of Protection
Protective Equipment and Clothing
Limitations of Protective Clothing
Duration of Work Tasks
Respirator Selection, Use, and Maintenance
Medical Surveillance
Health Monitoring Requirements
Documentation and Record keeping Requirements
Medical Support and Follow-up Requirements
Environmental Surveillance
Initial Air Monitoring
Periodic Air Monitoring
Monitoring Parameters
Use and Maintenance of Survey Equipment
Heat Stress Monitoring
Site Control
Site Control Zones
Safe Work Practices
Health and Safety Plan Enforcement
Complaints
Decontamination
Personnel Decontamination
Equipment Decontamination
Emergency Contingency Planning
Injury in the Exclusion or Contamination Reduction Zones
Injury in the Support Zone
Fire or Explosion
Protective Equipment Failure
Emergency Information Telephone Numbers
Hospital Route Directions
APPENDIX A
DESCRIPTION OF THE ENVIRONMENTAL MONITORING
MANAGEMENT COUNCIL (EMMC)
A primary goal of this demonstration program is to gain Agency-wide recognition and
acceptance of the results. The Environmental Monitoring Management Council is a key, Agency-
wide organization that represents the Program and Regional Offices regarding the standardization
of various policies and practices. This brief overview highlights the principal mission and purpose
of this body.
DESCRIPTION OF THE ENVIRONMENTAL MONITORING
MANAGEMENT COUNCIL
The Environmental Monitoring Management Council (EMMC) was established in March
of 1990 to:
a) Coordinate Agency-wide policies concerning environmental monitoring issues, especially
in the areas of analytical methods integration, laboratory accreditation, and quality
assurance
b) Address Congressional concern over our ability to make national environmental
assessments
c) Respond to the needs of the Administrator/Deputy Administrator to make decisions based
on credible scientific data.
The EMMC Policy Council is made up of Assistant Administrator (AA)- and Regional
Administrator (RA)-level personnel and is chaired by the Office of Research and Development
(ORD) AA and the U.S. EPA Region III RA. The Steering Committee is comprised of Office
Directors and Division Directors. Scientific and engineering program and regional staff provide
direction to the panels and work groups.
The Deputy Administrator reviewed EMMC activities on March 4, 1994, and made the
following decisions:
a) Designated EMMC to be the focal point for internal/external policy on monitoring
information activities
b) Endorsed EMMC Methods Format and Framework for Methods Development
c) Encouraged EMMC to continue developing a national program for laboratory
accreditation
d) Directed EMMC to brief the Science Policy Council (SPC) on its activities including:
   •  the laboratory accreditation activities and associated near-term resource needs
   •  options and recommendations for an EPA-wide approach for applying
      performance-based methods to monitoring activities.
e) Requested Assistant Administrators and Regional Administrators to continue support of
EMMC activities.
EMMC activities are important to establish better science/credibility for data comparability and
information-sharing. EMMC is working to simplify lab procedures and to effect cost reductions by
eliminating duplication of methods development efforts, avoiding duplication of field, laboratory
and QC efforts, and conducting fewer lab evaluation studies.
EMMC issues that will be brought to the Science Policy Council in the future include:
a) The use of EMMC as a model for interaction with Science Committees in the Agency;
b) Procedures to implement a performance-based approach to method development;
c) Elimination of process barriers that prevent use of innovative technology.
This last point is of primary interest to advancing the goals of the Consortium.
The EMMC serves as a forum to determine which methods need integration, provides for
consensus of all offices and serves as a cross-program vehicle for assuring documentation of
methodology and data and for comparability of data.
Integration of four monitoring methods has been completed. These are: Graphite Furnace
Atomic Absorption, Inductively Coupled Plasma Atomic Emission Spectrometry, Hot Acid
Extraction for Elemental Analyses (as part of the above), and Determination of Purgeable Organic
Compounds by Capillary Column Gas Chromatography.
The EMMC recommends a variety of subjects for Agency-wide acceptance to the Science
Policy Council. The EMMC is currently working on assessment/design activities to ensure
environmental measurement/quality assurance are built into the regulatory process. The EMMC
acts to review specific products and to provide formal EPA responses in the areas that are most
important to achieving nationwide recognition. These activities enhance the value of the
demonstration program by offering a standard process which will receive Agency-wide acceptance
and approval.
APPENDIX B
ENVIRONMENTAL MONITORING MANAGEMENT COUNCIL (EMMC)
METHOD FORMAT
This appendix describes the standard format that will be used to document EPA's
environmental monitoring methods. The EMMC Method Format was developed to provide
a uniform system for the integrated review of methods by the Work Groups of the
Environmental Monitoring Management Council's Ad Hoc Panel on Methods Integration.
The outline form of the Methods Format was approved by the EMMC Methods Integration
Panel and Chairs of the respective Work Groups on January 24, 1992. While the format
originally was intended for use in documenting methods for use across the various EPA
Program Offices, some of the specifics in this description relate to the special problems of
documenting a performance-based method.
Technologies that require data collection using measurement procedures must include a
detailed method. The Consortium will require the use of the EMMC standard method
format for all methods used in a technology demonstration. After the demonstration is
completed, a revised EMMC method will be prepared as part of the Technology Evaluation
Report.
Environmental Monitoring Management
Council (EMMC) Methods Format
1.0 Scope and Application
Use a tabular format whenever possible for:
•  Analyte list(s)
•  Chemical Abstracts Service (CAS) numbers
•  Matrices
•  Method Sensitivity (expressed as mass and as concentration with a specific
   sample size)
Include a list of analytes (by common name) and their CAS registry numbers, the
matrices to which the method applies, a generic description of method sensitivity (expressed
both as the mass of analyte that can be quantified and as the concentration for a specific
sample volume or size), and the data quality objectives which the method is designed to
meet. Much of this material may be presented in a tabular format.
2.0 Summary of Method
•  Sample volume requirements
•  Extraction
•  Digestion
•  Concentration, and other preparation steps employed
•  Analytical instrumentation and detector system(s), and
•  Techniques used for quantitative determinations
Summarize the method in a few paragraphs. The purpose of the summary is to provide a
succinct overview of the technique to aid the reviewer or data user in evaluating the method
and the data. List sample volume, extraction, digestion, concentration, other preparation
steps employed, the analytical instrumentation and detector system(s), and the techniques
used for quantitative determinations.
3.0 Definitions
Include the definitions of all method-specific terms here. For extensive lists of
definitions, this section may simply refer to a glossary attached at the end of the method
document.
4.0 Interferences
This section should discuss any known interferences, especially those that are specific to
the performance-based method. If known interferences in the reference method are not
interferences in the performance-based method, this should be clearly stated.
5.0 Safety
•  Above and beyond good laboratory practices
•  Disclaimer statement (look at ASTM disclaimer)
•  Special precautions
•  Specific toxicity of target analytes or reagents
•  Not appropriate for general safety statements
This section should discuss only those safety issues specific to the method and beyond
the scope of routine laboratory practices. Target analytes or reagents that pose specific
toxicity or safety issues should be addressed in this section.
6.0 Equipment and Supplies
Use generic language wherever possible. However, for specific equipment such as GC
(gas chromatograph) columns, do not assume equivalency of equipment that was not
specifically evaluated, and clearly state what equipment and supplies were tested.
7.0 Reagents and Standards
Provide sufficient details on the concentration and preparation of reagents and standards
to allow the work to be duplicated, but avoid lengthy discussions of common procedures.
8.0 Sample Collection, Preservation and Storage
•  Provide information on sample collection, preservation, shipment, and
   storage conditions.
•  Holding times, if evaluated.
If effects of holding time were specifically evaluated, provide a reference to the relevant data;
otherwise, do not establish specific holding times.
9.0 Quality Control
Describe specific quality control steps, including such procedures as method blanks,
laboratory control samples, QC check samples, instrument checks, etc., defining all terms in
Section 3.0. Include frequencies for each such QC operation.
10.0 Calibration and Standardization
Discuss initial calibration procedures here. Indicate the frequency of such calibrations, refer
to performance specifications, and indicate corrective actions that must be taken when
performance specifications are not met. This section may also include procedures for
calibration verification or continuing calibration, or these steps may be included in Section
11.0.
11.0 Procedure
Provide a general description of the sample processing and instrumental analysis steps.
Discuss those steps that are essential to the process, and avoid unnecessarily restrictive
instructions.
12.0 Data Analysis and Calculations
Describe qualitative and quantitative aspects of the method. List identification criteria
used. Provide equations used to derive final sample results from typical instrument data.
Provide discussion of estimating detection limits, if appropriate.
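As a purely illustrative example (the actual equations will depend on the technology), a final concentration derived from a linear instrument calibration might be documented as follows, where A is the instrument response, b and m are the calibration intercept and slope, V_f is the final extract volume, D is the dilution factor, and W is the sample mass; all symbols here are assumptions for illustration.

    C \;(\mathrm{mg/kg}) = \frac{A - b}{m} \times \frac{V_f \times D}{W}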
13.0 Method Performance
A precision/bias statement should be incorporated in the section, including:
•  detection limits
•  source/limitations of data
Provide detailed description of method performance, including data on precision, bias,
detection limits (including the method by which they were determined and matrices to
which they apply), statistical procedures used to develop performance specifications, etc.
Where performance is tested relative to the reference method, provide a side-by-side
comparison of performance versus reference method specifications.
14.0 Pollution Prevention
Describe aspects of this method that minimize or prevent pollution that may be
attributable to the reference method.
15.0 Waste Management
Cite how waste and samples are minimized and properly disposed.
16.0 References
•  Source documents
•  Publications
17.0 Tables, Diagrams, Flowcharts and Validation Data
Additional information may be presented at the end of the method. Lengthy tables may
be included here and referred to elsewhere in the text by number. Diagrams should only
include new or unusual equipment or aspects of the method.
APPENDIX C
OFFICE OF SOLID WASTE METHOD REQUIREMENTS
EPA acceptance of a method associated with an innovative technology can be based, in
part, on the results of a demonstration conducted under the auspices of the Consortium. This
appendix provides an overview of the expectations that the EPA's Office of Solid Waste
(OSW) has regarding the acceptance of a method into Publication SW-846, Test Methods
for Evaluating Solid Waste. The following is a copy of a standard letter that is provided to
those individuals expressing an interest in submitting analytical methods for inclusion in
SW-846.
The Consortium will provide guidance to the developer in preparing the appropriate
data package for the OSW Methods Review Panel. Participation in the demonstration
program does not guarantee acceptance by any EPA Program Office of the analytical
method. The Consortium serves to verify technology performance as described in the
demonstration plan.
OFFICE OF SOLID WASTE METHOD REQUIREMENTS
Dear Colleague:
The Methods Section of the Office of Solid Waste is responsible for the promulgation of
rugged and reliable analytical techniques in support of the Resource Conservation and
Recovery Act (RCRA) Program. The methods published in Test Methods for Evaluating
Solid Waste, SW-846, are used to measure the concentration of specific pollutants or to
establish whether a waste stream demonstrates a hazardous characteristic (e.g., ignitability,
corrosivity, reactivity or toxicity).
SW-846 currently provides reliable and sensitive laboratory methods for the analysis of
Appendix VIII analytes. However, some of these methods may be too costly or require too
much analysis time for some applications. The Methods Section also recognizes the savings
that could be achieved by sending only contaminated samples to analytical laboratories for
quantitative analysis. Therefore, the Methods Section has recognized the need for more
rapid, less expensive field screening procedures.
A number of sources have developed reliable, reproducible and cost-effective field or
screening procedures with potential application for the RCRA Program. This letter provides
developers with a description of the type of performance data that is required for an
effective initial evaluation of screening or field procedures. If a developer's data supports
adoption of a new method, the Methods Section will work through the SW-846 Work
Group process to promulgate it. This letter does not supersede or replace the more rigorous
requirements described in Test Method Equivalency Petitions, EPA/530-SW-87-008,
OSWER Policy Directive No. 9433.00-2 (2/87). That document provides the requirements
for a method equivalency petition which may be used to promulgate a method outside of the
Work Group process.
While screening procedures need not be fully quantitative, they should measure the
presence or absence of target analytes at or below regulatory action levels. Therefore, initial
demonstration of method performance involves measuring the percentage of false negatives
and false positives generated using the procedure for a single sample matrix. Data should be
submitted for split samples analyzed using the developer's technique and an approved SW-846
quantitative method. A candidate procedure should ideally produce no false negatives.
A false negative is defined as a negative response for a sample that contains up to two
times the stated detection level of the target analyte(s). A candidate procedure should
produce no more than 10% false positives. A false positive is defined as a positive
response for a sample that contains analytes at one half the detection level. Between 20 and
50 samples spiked at twice the detection level should be tested to establish the percentage of
false negatives. It is recommended that a sufficient volume of each spiked sample be
prepared to complete each test with one lot of material. Sufficient randomly selected
aliquots of each spiked matrix should be analyzed by appropriate SW-846 methods to
demonstrate sample homogeneity and to characterize the sample in terms of target analytes
and potential interferences.
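To make the screening criteria above concrete, the following is a minimal sketch (in Python, with hypothetical variable and function names that are not part of the OSW guidance) of how the false negative and false positive percentages could be tallied from spiked-sample results once the split-sample analyses have confirmed the spike levels.

    def screening_error_rates(results, detection_level):
        """Tally false negative and false positive rates for a screening procedure.

        results: list of (spike_conc, responded) pairs, where spike_conc is the
        known analyte concentration in the spiked sample and responded is the
        yes/no response of the field screening procedure.
        detection_level: the detection level stated by the developer.
        """
        # Samples spiked above the detection level (up to twice that level)
        # should respond; a non-response on such a sample is a false negative.
        fn_pool = [r for c, r in results if detection_level < c <= 2 * detection_level]
        false_negatives = sum(1 for r in fn_pool if not r)

        # Samples containing the analyte at one half the detection level (or less)
        # should not respond; a response on such a sample is a false positive.
        fp_pool = [r for c, r in results if c <= 0.5 * detection_level]
        false_positives = sum(1 for r in fp_pool if r)

        fn_rate = 100.0 * false_negatives / len(fn_pool) if fn_pool else 0.0
        fp_rate = 100.0 * false_positives / len(fp_pool) if fp_pool else 0.0
        return fn_rate, fp_rate

    # Hypothetical example: 30 samples spiked at twice the stated detection level
    # (one missed) and 20 samples spiked at one half of it (two responses).
    data = ([(2.0, True)] * 29 + [(2.0, False)] +
            [(0.5, False)] * 18 + [(0.5, True)] * 2)
    fn, fp = screening_error_rates(data, detection_level=1.0)
    print(f"False negatives: {fn:.1f}% (ideally 0%)")
    print(f"False positives: {fp:.1f}% (no more than 10%)")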
A separate study should also be conducted to establish the effect of non-target
interferences. A screening procedure should produce no more than 10% false positives for a
set of 20 samples that contains a 100 fold excess of interferences. Positive interferences
should be selected that are chemically related to the target analytes and are environmentally
relevant. Negative interferences (i.e., masking agents) should also be investigated whenever
they are suspected.
Developers should also analyze three different types of samples to provide matrix-
specific performance data. These samples should either be characterized reference materials
or spiked matrices containing known amounts of target analytes. In either case, bulk
samples should be carefully homogenized to reduce sub-sampling errors. The sample
matrices should be selected to represent what is regulated under RCRA (e.g., soil, oil waste
or waste waters), not to provide the best performance data. Blanks should be analyzed with
each set of samples.
Matrix-specific performance data, including detection limits and dynamic range, are
gathered by analyzing ten replicate aliquots of three different sample matrices spiked at two
concentrations. If spiked samples are used, suggested spiking levels are the matrix-specific
detection limit and 50 times the detection limit. Positive or negative results should be
reported for the low concentration samples. Results for high concentration samples should
be reported as either semi-quantitative results or as positive/negative with the dilution factor
used for the samples. Specific spiking concentrations are provided for selected target
analytes in the attachments to this letter. The low values are normal reporting limits for
routine analyses, and the high values are 50 times the low concentrations. The Methods
Section recognizes that it may not be appropriate to spike all of the target analytes listed
within a chemical class.
If the developer has field data, the Methods Section would welcome the opportunity to
compare the results obtained using the screening procedure with sample concentrations
determined in a laboratory using SW-846 methods.
To summarize, the Methods Section does not require an unreasonable body of data for
the initial evaluation of new techniques. Data will need to be submitted on the percentage of
false negatives, percentage of false positives, sensitivity to method interferences, and
matrix-specific performance data in order to complete the table below. In addition to these
data, the developer should also provide a description of the procedure and copy of any
instructions provided with the test kits.
As part of the peer review comments on a previous version of this document, Oliver
Fordham, Manager of the Inorganic Methods Program, Office of Solid Waste, asked that
the following material be included as part of this appendix. This information is printed as it
was provided by Mr. Fordham.
When reviewing a new monitoring technology, the first question is, "How well does it
work with respect to measurement methodology?" How well a technology works is related
to such properties as:
What is the lowest level of the property that the technology can measure in the
types of materials to be tested? This property is commonly referred to as Method
Sensitivity.
In many monitoring situations, while the technology yields a measurement, the
measured value is not a valid or accurate indication of the true value of the property
of the sample. In most environmental monitoring situations, where accuracy is a
problem, the methodology tends to "miss" some of the constituents for which it is
being analyzed. In some cases, the measured value is less than what is actually in
the sample. The difference between the measured value and the true value (i.e., the
actual amount of the substance in the sample being determined) is termed the bias.
While it is generally more common for measurement methods to give a low bias, in
some cases, measurement methods indicate higher levels of a substance than are
actually present and thus show a high bias.
How reproducible is the method? This information is needed to determine what the
minimum difference between samples or between the sample property and some
action or regulatory level needs to be in order for the method to distinguish between
the two within an acceptable level of confidence. This property is commonly
referred to as Method Precision.
For what types of materials or matrices is the technology suitable? What are the
characteristics of the sample that are needed in order to use this method? Such
characteristics include: physical state (e.g., liquid, gas, solid), concentration range
of the analyte of interest (e.g., percent level, ppm level, trace levels), sample
temperature, matrix composition (e.g., organic, aqueous, inorganic, high salinity).
Accurately knowing what matrix the method is suitable for is critical to a user
trying to select an appropriate technique.
Even when a method is generally suitable for use on a specific type of material, one
often finds that the presence of certain other materials in the sample will interfere
with obtaining reliable results. This property, termed interferences, needs to be
determined.
Therefore, the minimum information that EPA requires when evaluating a new
technology is:
What substances or properties can be measured using this technology?
How accurate is the technique?
How precise?
How sensitive?
What matrices is it suitable for use on?
What will interfere with obtaining accurate results?
What is its operating range?
A number of sources have developed screening procedures for use in determining
whether or not a sample possesses a property or analyte concentration above some specified
level. For such tests, the properties of accuracy and precision need to be looked at
differently. While such screening procedures need not be fully quantitative, they should
measure the presence or absence of the target analytes at or above the regulatory or other
action level with a high degree of confidence. Therefore, initial demonstration of method
performance involves measuring the percentage of false negatives (i.e., no test response at
an analyte concentration in the sample above the stated method sensitivity) and false
positives (i.e., the test responds to the sample when the sample contains less than the stated
sensitivity). A candidate method should produce no more than 10% false positives. The
Agency will consider a result to be a false positive if a positive test response is produced
with samples containing the analytes of interest at one half of the stated test detection level.
We will now examine each of these attributes in terms of the specific data requirements
involved.
Applicable Analytes or Properties
The documentation should describe the specific analytes or properties that the
technology is applicable to. For each analyte listed, data shall be supplied that documents
the performance of the technology on appropriate samples containing that analyte. If the
technology is well enough understood, then claim of applicability can be made on the basis
that a substance of interest has the same properties as a substance that has already been
demonstrated to perform satisfactorily with the technology. However, for such a claim to be
accepted, the developer must be able to describe the boundary conditions for the method'
being examined. The boundary conditions are those points in the spectrum of
sample/analyte properties where the technology ceases to perform well enough for the
intended monitoring purposes.
Bias
Bias is given as the difference between the measured value and the true value for the
property or analyte. Accuracy for methods that measure the concentration of a substance is
generally given as the percent recovery range using laboratory spikes, or preferably,
reference samples having analyte concentrations determined by using methods of known
accuracy.
Precision
Precision refers to the variability in measurements that are made on samples that have the
same level of the property or analyte of interest. Precision for all methods except pH is
presented as relative percent difference (RPD) of replicate analyses. Precision of pH
measurements is listed in pH units. In general, data from at least five different materials,
representing the variety of samples to be analyzed using this method, should be tested. For
each material, both single laboratory precision based on triplicate analysis and multi-
laboratory analysis using five or more laboratories shall be reported.
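As a simple illustration of the bias and precision measures defined above, the short Python sketch below (hypothetical helper names, not part of this guidance) computes bias and percent recovery for a spiked or reference sample and the relative percent difference of a pair of replicate analyses.

    def bias(measured, true_value):
        """Bias is the difference between the measured value and the true value."""
        return measured - true_value

    def percent_recovery(measured, true_value):
        """Percent recovery of a spike or reference sample; 100% indicates no bias."""
        return 100.0 * measured / true_value

    def relative_percent_difference(x1, x2):
        """RPD of a pair of replicate analyses, expressed relative to their mean."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    # Hypothetical values: a reference sample with a true value of 50 ug/L is
    # measured at 42 ug/L, and duplicate analyses give 42 and 47 ug/L.
    print(bias(42.0, 50.0))                          # -8.0 ug/L (low bias)
    print(percent_recovery(42.0, 50.0))              # 84.0 percent
    print(relative_percent_difference(42.0, 47.0))   # about 11.2 percent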
Sensitivity
Sensitivity or method detection limit (MDL) refers to the minimum concentration or
other measure of the property that can confidently be determined by the method. The MDL
is that concentration of the analyte that must be present in the sample for there to be less
than a 5% probability that the analyte is not present in the sample.
In addition to the MDL, the submitter shall describe the minimum concentration that
can reliably be measured and to which a value can be ascribed. This level is generally referred to as the
Minimum Quantitation Limit (MQL).
There are no requirements established by EPA for technology MDL and MQL. The
required sensitivity is a function of the application and the corresponding regulatory or other
action levels. It is the responsibility of the developer to identify the monitoring application
for which method evaluation is being requested and to ensure that the submitted method
possesses the requisite sensitivity.
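The MDL and MQL discussed in this section can be estimated in several ways. One commonly used single-laboratory approach multiplies the standard deviation of replicate analyses of a low-level spiked sample by a one-sided Student's t value; the Python sketch below assumes that approach, together with a rule-of-thumb MQL of roughly three times the MDL. Both are illustrative assumptions, not requirements stated in this appendix.

    import statistics

    # One-sided Student's t values at the 99 percent confidence level, keyed by
    # degrees of freedom (number of replicates minus one).
    T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

    def method_detection_limit(replicates):
        """Estimate an MDL as t * s from replicate analyses of a low-level spike."""
        return T_99[len(replicates) - 1] * statistics.stdev(replicates)

    def minimum_quantitation_limit(mdl):
        """Rule-of-thumb MQL of about three times the MDL (an assumption here)."""
        return 3.0 * mdl

    # Hypothetical example: seven replicate analyses of a spiked sample, in ug/L.
    reps = [4.8, 5.3, 4.6, 5.1, 4.9, 5.4, 4.7]
    mdl = method_detection_limit(reps)
    print(f"MDL about {mdl:.2f} ug/L; MQL about {minimum_quantitation_limit(mdl):.2f} ug/L")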
Matrices
The submitter needs to determine for what types of samples the methodology is suitable and
demonstrate the claims using appropriate test materials. Appropriate test materials are listed
below in order of preference.
Performance Evaluation (PE) samples are samples whose matrix properties are
nearly identical to those for which the method is claimed to be suitable and that contain
measurable levels of the analytes of interest. The composition should be unknown to the
analyst for the sample to be suitable for evaluating system performance. PE samples should come
with statistics about each sample which have been derived from the analysis of the sample
by a number of laboratories using well accepted reference methods. These statistics should
include a true value of the property and a mean of laboratory results obtained from the
analysis of the PE Sample.
Where PE materials having a matrix close to that for which the method is being
evaluated are not available, laboratory spiked samples of the appropriate matrix
composition should be used. A spike sample is a sample of the matrix to which a known
amount of the analyte to be determined has been added. The successful use of spiked
samples for method evaluation, however, depends on several factors. First, a procedure must
be available for introducing the analyte of interest into the matrix in such a manner that the
spiked sample will behave analytically like a "real world" sample. For example, if a
method is to be used to determine the concentration of lead in an incinerator ash that has
been subjected to temperatures high enough to glassify the ash, then a spike sample
prepared by adding a solution of a soluble lead salt to incinerator ash, stirring to coat the ash
particles with the lead-containing solution, and then drying at low temperature to remove water
would not yield an acceptable method evaluation material. In the aforementioned glassified
ash, the lead is incorporated into the glass matrix. Such lead is very difficult to solubilize
and then measure. In a spike sample prepared as described above, the lead is all on the
surface and readily available for extraction. Thus, a method that is not very aggressive
in its ability to "digest" the ash matrix would be a poor tool for measuring lead
concentration in real-world samples but would show excellent results on the artificial sample.
Where PE materials having a matrix identical to that for which the method is to be used
are not available, and preparing laboratory spike samples is not feasible, then PE
samples using a matrix that would be expected to pose a sufficient challenge to the method
should be employed. An example of this situation is a procedure that is to be used for
monitoring the concentration of pesticides in milk from nursing mothers. If reference
samples of contaminated mother's milk are not available, and introducing the pesticide into
the fat component of the milk might cause other problems, reference samples of
contaminated cow's milk could be used.
When submitting an application for EPA evaluation of an innovative monitoring
technology, it is the submitter's responsibility to select the materials that are used in
evaluating the method and to explain why the particular materials are appropriate.
Interferences
For any new technology, a study should be conducted to establish the effect of non-
target analytes on the measurement process. The effect of substances that might yield
positive interferences (i.e., measured values are higher than actual values) or negative
interferences (i.e., measured values are lower than the actual sample concentration), also
known as masking agents, should be investigated in order to determine the methodology's
limitations.
When evaluating the effect of potential interferents, samples with both low
concentrations (1 to 10 times the MDL) and high concentrations (50 to 100 times the MDL)
of the target analytes should be employed.
Operating Range
While sensitivity is a primary concern of the Agency when evaluating a new technology,
some methods can only be used on samples with a narrow concentration range. Also, some
methods require controlled environmental conditions for satisfactory operation. Any such
limitations should be described in the application.
APPENDIX D
REPRESENTATIVE DEMONSTRATION SCHEDULE
The following identifies key events in a typical demonstration process. The analysis of
results from the predemonstration study is the critical step in deciding to go forward with
the demonstration. This is a dress rehearsal of events and is designed to remedy flaws in the
experimental design, sampling process, and reference laboratory performance. The
developer also has an opportunity to resolve any matrix effects related to the site.
REPRESENTATIVE DEMONSTRATION SCHEDULE
CHARACTERIZATION AND MONITORING PROGRAM
Task                                                                     Completed

TASK 1)   Publication of a CBD announcement and solicitation            Week 1
          of other likely sources for developer participation

TASK 2)   Developers' Conference - Time and place will be               Week 4
          noted in announcement for Task 1. Developer describes
          technology and learns about the demonstration process.

TASK 3)   Review and Select Technologies - Separate into                Weeks 4-6
          relevant categories to define the scope of
          the demonstration. Six to nine technologies with
          some common features or requirements. Notify
          developers of selection.

TASK 4)   Submission of proposals or letters of intent from the         Weeks 6-8
          developers. A requirement of the Guidance Manual
          which aids in the preparation of the demonstration plan.

TASK 5)*  Preparation and Review of Demonstration Plan -                Weeks 7-14
          Based on the Guidance Manual for the Preparation of
          Characterization Technology Demonstration Plan.

TASK 6)   Selection of appropriate field sites - Concurrent with        Weeks 10-14
          Task 5.

TASK 7)   Selection and audit of reference laboratory - Concurrent      Weeks 10-14
          with Task 5. Predemonstration study in Task 9 confirms
          selection.

TASK 8)*  Review and approve draft demonstration plan by all            Weeks 14-16
          participants.

TASK 9)   Predemonstration Sampling - Critical element in final         Weeks 15-16
          selection of technologies and sites. Final test of
          reference laboratory performance.

TASK 10)* Data review from Task 9. Receipt of laboratory data is        Weeks 16-22
          the major time factor.

TASK 11)  Revise and approve final demonstration plan -                 Weeks 21-23
          Critical step in process. Need concurrences from all
          participants. Based on data from Tasks 9 and 10.

TASK 12)  Conduct field demonstration - Six to nine days at             Weeks 24-27
          each site. Visitors Day - Travel and logistics.

TASK 13)* Complete audit reports/receive reference laboratory           Weeks 27-33
          data/report preliminary findings.

TASK 14)* Prepare Innovative Technology Evaluation Reports -            Weeks 29-36
          Internal EPA review.

TASK 15)  Developer Review. Include draft verification statement.       Weeks 36-38

TASK 16)* External Peer Review.                                         Weeks 38-42

TASK 17)* Complete report - issue verification statements               Weeks 43-46

* Indicates tasks where less time may be required
NOTE: Given the specific nature of the technologies and other factors associated with a
demonstration, it may be possible to shorten the project completion period by 4 to 6 weeks.
APPENDIX E
GUIDANCE FOR ADDRESSING QUALITY ASSURANCE/QUALITY CONTROL
(QA/QC) REQUIREMENTS IN A DEMONSTRATION PLAN
This material was abstracted from the report Preparation Aids for the Development of
Category II Quality Assurance Project Plans, EPA/600/8-91/004. The report was prepared by
Guy F. Simes, National Risk Management Research Laboratory (formerly the Risk Reduction
Engineering Laboratory), Cincinnati, Ohio, Office of Research and Development, U.S.
Environmental Protection Agency. The Agency has determined that the level of quality
assurance required in Category II is the most appropriate for planning and executing a
technology demonstration that leads to the verification of cost and performance.
This report contains guidance that should be useful to a developer or whoever is responsible
for preparing demonstration plans. The technology demonstration plan is expected to include the
elements described in this Appendix. The body of the guidance manual (Chapter 3) often refers
to the appropriate sections of this appendix for additional detail or clarification. Since quality
control measures are viewed as an integral part of a demonstration, the Consortium requires a
single document which combines the technology description and quality control requirements.
This document, as it was originally prepared, is intended to provide the reader with
exhaustive guidance for preparing a stand-alone quality assurance project plan. As mentioned
previously, the Consortium prefers to have all the necessary elements of QA/QC integrated into a
single document - a demonstration plan. Therefore, the tone of the original document has been
modified to provide the reader with useful information for preparing the QA/QC portions of the
demonstration plan.
From the foreword to the subject guidance document...
Perhaps the greatest benefit to the users of this document comes from its emphasis on up-front
planning: do the right thing, right, the first time. While this sounds straightforward,
managers know that determining the right course in a complex and uncertain situation is
anything but simple. Determining customer requirements up-front, and then having the processes
and procedures in place to accomplish them, averts costly mistakes. Resources are conserved in
two ways: by avoiding rework to correct efforts that do not initially meet management or
customer specifications; and by performing to the specifications required and not beyond them.
In these ways, this "Preparation Aids" document can help management achieve its mission with
more effective utilization of diminishing resources.
E. Timothy Oppelt, Director
National Risk Management Research Laboratory
Cincinnati, Ohio
Table of Contents
1.0 INTRODUCTION E-5
1.1 Purpose of the Quality Assurance Project Plan (QAPP) E-5
1.2 Contents of a QA Project Plan E-5
2.0 PROJECT DESCRIPTION E-7
2.1 General Overview E-7
2.2 The Technology, Site, Facility, or System E-7
2.3 Statement of Project Objectives E-7
2.4 Experimental Design E-8
2.5 Schedule E-8
3.0 PROJECT ORGANIZATION AND RESPONSIBILITIES E-10
4.0 QUALITY ASSURANCE OBJECTIVES E-12
4.1 Determining QA Objectives E-13
4.2 Quantitative QA Objectives: Precision, Accuracy, Method Detection Limit, and
Completeness E-13
4.3 Qualitative QA Objectives: Comparability and Representativeness E-14
4.4 What If QA Objectives Are Not Met? E-15
4.5 Other QA Objectives E-15
5.0 SITE SELECTION AND SAMPLING PROCEDURES E-18
5.1 Sampling Site Selection E-18
5.2 Sampling Site Description E-19
5.3 Sampling Procedures E-19
5.4 Sample Custody E-20
6.0 ANALYTICAL PROCEDURES AND CALIBRATION E-25
6.1 EPA-Approved or Other Validated Standard Methods E-25
6.2 Nonstandard or Modified Methods E-27
6.3 Calibration Procedures and Frequency E-27
7.0 DATA REDUCTION, VALIDATION, AND REPORTING E-28
7.2 Data Validation E-28
7.3 Data Reporting E-31
8.0 INTERNAL QUALITY CONTROL CHECKS E-32
8.1 Types of QC Checks E-32
9.0 PERFORMANCE AND SYSTEMS AUDITS E-34
10.0 CALCULATION OF DATA QUALITY INDICATORS E-36
10.1 Common Data Quality Indicators E-36
10.1.1 Precision E-36
10.1.2 Accuracy E-37
10.1.3 Completeness E-37
10.1.4 Method Detection Limit (MDL) E-38
11.0 QUALITY CONTROL REPORTS TO MANAGEMENT E-38
12.0 REFERENCES E-38
LIST OF TABLES
Table E-1. Summary of Planned Analyses (Including QC) for Chemical Treatment of Water
E-9
Table E-2. Expanded Sample Summary for Non-QC and QC Samples E-10
Table E-3. QA Objectives for Precision, Accuracy, and Method Detection Limits (MDL) . E-16
Table E-4. QA Objectives for Precision, Accuracy, and Detection Limits - Alternate Form E-17
Table E-5. Required Detection Limits for Volatile Chlorinated Organic Compounds E-18
Table E-6. Summary of Number of Samples Required for Hypothetical Incineration Test . E-22
Table E-7. Summary of Laboratory Analyses and Sample Quantity Requirement E-23
Table E-8. Required Containers, Preservation Techniques, and Holding Times E-24
Table E-9. Typical Summary Table of Standard Methods and Procedures E-26
Table E-10. Summary Table of Calibration Requirements for Process Measurements E-31
Table E-11. Scheduled QC and Calibration E-35
LIST OF FIGURES
Figure 1. Typical project organization flowchart E-11
Figure 2. Example of a data reduction, validation and reporting scheme E-30
1.0 INTRODUCTION
1.1 Purpose of the Quality Assurance Project Plan (QAPP)
The purpose of a QA Project Plan is to relate project objectives to specific measurements
required to achieve those objectives. For purposes of a demonstration, those objectives are
usually defined in terms of verifying the performance claims of the technology. The QA Project
Plan must provide sufficient detail to demonstrate the following:
Intended measurements are appropriate for achieving project objectives;
Quality control procedures are sufficient for obtaining data of known and adequate
quality; and
Such data will be defensible if challenged technically.
Technology demonstrations require the coordinated efforts of numerous individuals,
including developers, managers, engineers, scientists, and statisticians. The QA Project Plan
must integrate the requirements of everyone involved, in a form that permits an easy and rapid
review. It must also provide unambiguous instructions to the sampling team, the analytical
laboratory, and any other parties responsible for data generation. Finally, the QA Project Plan
must provide sufficient detail to allow a thorough, detailed review by an independent party not
involved in the demonstration.
Because the end use of the data determines the degree of quality assurance that is required,
the Consortium has determined that demonstrations will be conducted as Category II projects.
Category II projects produce results that complement other inputs. These projects are of
sufficient scope and substance that their results could be combined with those from other
projects of similar scope to produce information for making rules, regulations, or policies. In
addition, projects that do not fit this pattern, but have high visibility, are also included in this
category. The technology demonstrations are usually assigned to this category.
1.2 Contents of a QA Project Plan
A QA Project Plan must cover the following topics:
Technology being tested and the objectives of the demonstration (i.e., the
hypothesis to be tested; for purposes of a demonstration, this is technology
performance);
Type and number of measurements that will be taken to achieve those objectives;
Quality of data that will be required and how that quality will be obtained; and
How the data will be recorded, calculated, reviewed, and reported.
Be sure that the final demonstration plan also includes:
Any technology-specific information necessary for the sampling team, analytical
laboratory, data reduction team, and other project participants. However, assume
these parties are familiar with standard methods.
Any deviations from standard methods or procedures, or clarification of such
changes whenever there are ambiguities.
The demonstration plan is an integrated document which contains the key elements of a
QAPP and should not be repetitious. For example:
Do not discuss the same subject matter more than once in the demonstration plan.
Although the required elements of the QA Project Plan may appear to overlap in
some cases, repetition leads to confusion and to documents too long for effective
communication. If you are unsure where a subject should be treated, discuss it once
and cross-reference that discussion in the other sections.
Do not repeat material from Section 6.0 (Sampling Plan) of the demonstration plan;
reference the section and page number in the demonstration plan as needed.
Do not repeat material from standard methods that are available from the EPA or
any other official source. Provide detailed citations and assume that the analyst and
reviewer have these methods on-hand.
1.3 Category II Format Notes
As a Category II level activity, demonstration plans must incorporate the following:
Title page.
Technology Demonstration Plan Approval Form (Signature Page). It is important
that the project principals understand and agree on the experimental approach. For
this reason, the Technology Demonstration Plan Approval Form must be signed by
all key project personnel. These signatures, which must be obtained before the
demonstration is started, indicate that the key personnel have read the appropriate
sections of the demonstration plan and are committed to full implementation.
Table of Contents
Document control format. Section number, revision, date, and page should be
recorded in an upper corner of each page. This format requirement is illustrated
below:
Section No. 1.0
Revision: 0
Date: December 1, 1994
Page: 3 of 5
2.0 PROJECT DESCRIPTION
The information in this section should be useful for preparing the project description. It
describes the technology or environmental system that is to be tested, the project objectives, a
summary of the experimental design, and the proposed project schedule.
2.1 General Overview
This section provides a brief synopsis of the overall demonstration. The purpose(s) of the
study should be described, along with the decisions that are to be made and the hypothesis to be
tested. Other anticipated uses of the data should also be noted. The type of technology or
environmental system that is to be tested should also be described.
The discussion should also include results of any pertinent or relevant preliminary
investigations or data generated in the pre-demonstration study.
Example: Preliminary investigations of the test site indicated an average pentachlorophenol
concentration of 150 mg/kg, with a range of 10 to 500 mg/kg.
2.2 The Technology, Site, Facility, or System
Provide a description of the technology, facility, or environmental system that will be tested.
Include flow diagrams, maps, charts, etc., as needed. Approximate sampling mass or volumetric
flow rates should be indicated to permit proper evaluation of the sampling or process monitoring
procedures. Additional diagrams are often included to unambiguously describe sampling points.
The discussion should contain enough material to permit a technical reviewer who is unfamiliar
with the specific project to assess the technology and sampling strategy.
2.3 Statement of Project Objectives
The project objectives should be summarized and stated clearly. Avoid scattering statements
of project objectives throughout the sampling and analytical sections of the demonstration plan.
Demonstration objectives should be stated in numerical terms whenever possible:
Poor Statement: The Tri-Corder will be characterized with respect to its ability to
measure semivolatile organic compounds.
Better Statement: The major objective is to demonstrate a Tri-Corder for an accuracy of
95 percent for the compounds listed in Table...
Best Statement: The objective is to demonstrate a Tri-Corder for an accuracy of 90
percent or higher at a confidence level of 95 percent for the compounds listed in Table...
It is common to rank project objectives according to importance (e.g., into primary and
secondary objectives). Although this ranking is not essential, it does help focus effort on the
primary goals of the demonstration.
2.4 Experimental Design
List all measurements that will be made during the demonstration, then classify them as
critical or noncritical measurements. Critical measurements are those that are necessary to
achieve project objectives; they may include either on-site physical or chemical measurements.
Noncritical measurements are those used for process control or background information.
Summarize in tabular form all measurements that are planned for each sample. Ideally, this
table should indicate the total number of samples for each sample point, including quality control
and check samples, as illustrated in Table E-1 for a hypothetical chemical treatment project.
Relate the individual samples to the sampling points shown in a process diagram, if applicable.
For projects involving a large number of samples or analyses, it may not be possible to include
all QC and check samples in a single table. For such cases, two or more tables may be necessary,
the first to summarize the primary (non-QC) samples, and the others to show which QC samples
are associated with which analyses, as is shown in Table E-2. Information on the total number of
all sample types is needed both to perform a comprehensive review and to prepare estimates.
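Before formatting a table such as Table E-1 or E-2, it can help to tally the planned non-QC and QC analyses per sampling point in a small script; the following Python sketch uses hypothetical sampling points and counts, not those of any particular demonstration.

    # Planned analyses per sampling point, including QC samples (hypothetical counts).
    planned = {
        "Influent (C1)": {"Non-QC": 60, "Field blank": 1, "Matrix spike": 3, "Spare": 10},
        "Effluent (E2)": {"Non-QC": 60, "Matrix spike": 10, "Spare": 10},
        "Sludge (S3)":   {"Non-QC": 10, "Field duplicate": 5, "Matrix spike": 2},
    }

    # Total analyses per sampling point and a grand total for the project.
    for point, counts in planned.items():
        print(f"{point:15s} total analyses: {sum(counts.values())}")
    print("Grand total:", sum(sum(c.values()) for c in planned.values()))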
2.5 Schedule
Indicate start-up and ending dates, including those for preliminary field studies and
laboratory activities.
In developing a demonstration timeline, be sure to:
Allow adequate time for document review and revision.
Be realistic regarding a possible disruption in field activities due to weather
conditions.
Coordinate these activities with the appropriate members of the technology
demonstration.
Table E-1. Summary of Planned Analyses (Including QC) for Chemical Treatment of Water

                                        Number of Tests Performed(a)
                               Preliminary   Influent Water   Effluent Water   Sludge
                               Samples       Sample Point C1  Sample Point E2  Sample Point S3   Total

Semivolatile Organic
Compounds (SVOC)(b)
  Non-QC (Primary)                  3              60               60               10           133
  Field Sampling Blank(c)           1               1                0                0             2
  Field Duplicates                  0               0                0                5             5
  Laboratory Duplicates             0               0                0                0             0
  Matrix Spikes (MSs)               1               3               10                2            16
  Matrix Spike Duplicates (MSDs)    1               3               10                2            16
  Spare Samples(d)                  2              10               10                0            22
  Independent Check Standard        0               1                0                0             1
                                                                              Grand Total          195

Metals
  Non-QC (Primary)                  3              60               60               10           133
  Field Sampling Blank(c)           1               1                0                0             2
  Field Duplicates                  0               0                0                5             5
  Laboratory Duplicates             1               3               10                2            16
  Matrix Spikes (MSs)               1               3               10                2            16
  Matrix Spike Duplicates (MSDs)    0               0                0                0             0
  Independent Check Standard        0               1                0                0             1
  Spare Samples(d)                  2              10               10                0            22
                                                                              Grand Total          195

(a) Samples will be divided evenly among the ten anticipated runs. More QC samples are planned for the effluent
    than for the influent, since the former samples are expected to exhibit more variability. A matrix spike/matrix
    spike duplicate or a laboratory duplicate/matrix spike of the effluent stream will be determined for each treatment
    condition, because the effluent matrix may vary significantly with each treatment.
(b) Refers to the twelve compounds listed in Table
(c) Trip blanks will be collected but not analyzed unless field blanks indicate a contamination problem.
(d) Not analyzed unless required.

This table is provided to illustrate a typical format.
Table E-2. Expanded Sample Summary for Non-QC and QC Samples

                                               Raw     Treated Soil   Reagent   Long-Term
Measurement    Type                            Soil    (28 Days)      Mix       Treated Soils   Total

TCLP-Metals    Non-QC Samples                    9          9            1           15           34
               Leachate Duplicate                1          1            0            3            5
               Leachate Blank                    1          1            0            3            5
               Reference Material                1          1            0            3            5
               Matrix Spike                      0          1            0            0            1

SVOCs-Total    Non-QC Samples                    9          9            1            0           19
               Sample Duplicate                  3          0            0            0            3
               Sample Equipment Blank            1          1            0            0            2
               Method Blank                      1          1            0            0            2
               Independent Check Std             1          1            0            0            2
               Matrix Spike                      1          1            0            0            2
               Matrix Spike Duplicate            1          1            0            0            2

This table is provided to illustrate a typical format.
3.0 PROJECT ORGANIZATION AND RESPONSIBILITIES
The demonstration plan must show that the project organization is adequate to accomplish
demonstration goals, and that all responsibilities have been assigned. This will establish proper
communication links between key personnel. Information flow regarding schedules, meetings
and other commitments regarding the technology demonstration must be defined.
Providing a table or chart illustrating project organization and lines of authority needed to
conduct the technology demonstration is recommended. Identify by name all key personnel for
each organizational entity and give their geographic locations and phone numbers. The
organizational chart should also include all third party participants and their points of contact.
The organizational chart should identify all QA Managers and should illustrate the relationship
to other project personnel. The QA Managers should be organizationally independent of the
project management so that the risk of conflict of interest is minimized. Figure 1 illustrates a
typical project organizational chart.
There should also be a description of the responsibilities of all project participants,
including QA Managers. Be sure to indicate responsibility for each type of analysis, physical
measurement, or technology operation. This summary should designate responsibility for
planning, coordination, sample collection, sample custody, analysis, review, and report
preparation. This document must identify the person responsible for making final
recommendations to the EPA technical lead and the verification entity. A conflict resolution
process may be required to reach a final recommendation.
The frequency and mechanisms of communications among the developer, the QA Manager,
the EPA Project Manager, as well as other participants must be discussed. Provide a regular
schedule for progress reports, site visits, and teleconferences, and describe any special
occurrences that would trigger additional communication.
In developing this information, be sure to:
Demonstrate that the QA Manager is independent of project organization. Since
each participant may have a QA responsibility, a QA Representative must be
identified for each organization.
Provide names, locations, affiliations, and telephone numbers for key personnel
(those who have authority to make final recommendations to the EPA technical
lead).
Show independence of the confirmatory laboratory from the developer and
verification entity.
Describe who has responsibility for sample collection and chain-of-custody control
procedures. Individuals who have access to the demonstration site must be clearly
identified.
Recognize the technology demonstration timeline when establishing a schedule for
meetings and project deliverables.
[Figure 1, not reproduced here, is an organization chart showing the EPA Program Manager, the EPA Technical Lead, the Verification Organization, the Developer, the Quality Assurance Manager, the Site Manager (EPA/Owner), the Laboratory Manager, Sampling Coordination, and the Developer Demonstration Manager.]

Figure 1. Typical project organization flowchart
4.0 QUALITY ASSURANCE OBJECTIVES
Quality assurance objectives are specifications that measurements must meet in order to
achieve project objectives. For example, in ordering a pump for a chemical plant, factors such as
capacity, pressure, and materials of construction must be specified. Similarly, precision,
accuracy, detection limits, and completeness must be specified for physical or chemical
measurements. Additional analytical requirements may be described qualitatively in terms of
representativeness and comparability. Quality assurance objectives are needed for all critical
measurements and for each type of sample matrix (soil, water, biota, etc.). Be sure to include
quality assurance objectives for physical as well as chemical measurements.
Once the demonstration performance requirements and QA objectives are established, the
measurement systems must be designed to meet them. The project manager, analytical chemists,
and other principals must agree on the feasibility and appropriate value of these performance
objectives.
4.1 Determining QA Objectives
QA objectives must be defined in terms of demonstration requirements, and not in terms of
the capabilities of the intended test methods. Of course, the QA objectives must be achievable by
available methods, and for this reason it is important that the laboratory review the QA
objectives. When QA objectives exceed the capabilities of available methods, either the methods
must be modified or the test plan must compensate for these deficiencies. Modifications may
often be as simple as collecting a larger sample. However, if nonstandard or significantly
modified test methods are required, Section 7 (QA Project Plan) of the demonstration plan must
include laboratory validation data to prove that the method is capable of achieving the desired
performance.
The following are examples of how QA objectives can be determined:
Example 1: Biological treatment of pentachlorophenol (PCP).
Previous experience with this treatment process has indicated that the output concentration
typically varies by a factor of two under stable operating conditions with uniform feed. To
avoid contributing additional error, the precision of the analytical methods should be
significantly less than this value, or approximately ±30 percent or less.
Example 2: Determination of destruction and removal efficiency (DRE) for incineration of a
Dinoseb formulation containing 20 percent w/w Dinoseb.
The purpose of this test is to demonstrate a DRE of 99.99 percent or better. Dinoseb in stack
gases will be collected on a Modified Method 5 sampling train. Extracts of all components
of this train will be combined and reduced to 0.5-mL volume.
To allow a reasonable margin of error, the detection limit required for the Dinoseb
determination will be 40 µg/MM5 train. Previous experience with the determination of
Dinoseb on XAD resin has shown that this detection limit can be achieved routinely with the
method attached as Appendix B. Previous experience has also shown that Dinoseb can be
recovered routinely from an MM5 train with a recovery of at least 50 percent and a precision
within 30 percent relative percent difference (RPD), and deviations beyond these ranges indicate
analytical problems. Results of a methods validation study performed in our laboratory are
attached as Appendix C.
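As a worked illustration of the DRE objective in Example 2 (using assumed, purely illustrative masses rather than data from any demonstration), destruction and removal efficiency is the fraction of the fed constituent mass that is not emitted, so the detection limit sets a ceiling on the emission that must be assumed when the analyte is not detected.

    def destruction_removal_efficiency(mass_in, mass_out):
        """DRE (%) = (Win - Wout) / Win * 100, where Win is the mass of the
        constituent fed during the run and Wout is the mass emitted."""
        return (mass_in - mass_out) / mass_in * 100.0

    # Illustrative values only: 10 kg of Dinoseb fed during a run, 0.5 g emitted.
    print(destruction_removal_efficiency(10_000.0, 0.5))   # grams in, grams out -> 99.995

    # With a nondetect, the detection limit (scaled to the full stack gas flow) is
    # the emission that must be assumed, so a lower detection limit allows a
    # higher DRE to be demonstrated.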
4.2 Quantitative QA Objectives: Precision, Accuracy, Method Detection Limit, and
Completeness
QA objectives for precision, accuracy, method detection limit, and completeness should be
presented in a QA objectives table similar to those shown in Tables E-3, E-4, and E-5. Be sure to
include QA objectives for all matrix types and to indicate the units in which these QA objectives
are given. Summary tables are very helpful to the confirmatory laboratory which must meet these
objectives. Because precision and accuracy can be measured in various ways, explain the method
to be used. If precision, for instance, is to be determined by duplicates, explain whether sample
splitting will occur in the laboratory, during sampling, or at some other stage. Then summarize
all such information in either text or tabular format. Be sure to include the number and type of
samples in the sampling plan.
The following statements are examples of descriptions for precision, accuracy, method
detection limits and completeness:
Precision objectives for all the listed methods except pH are presented as relative percent
difference (RPD) of field duplicates. Precision objectives for pH are listed in pH units and
expressed as limits for field duplicates.
Accuracy objectives for organic compounds and metals are given as percent recovery range
of laboratory matrix spikes. Accuracy objectives for temperature measurements are absolute
deviations in ฐC.
Detection limits are defined as the method detection limit (MDL) multiplied by the dilution
factor required to analyze the sample. MDLs will be determined by replicate extraction and
analysis of seven identical spiked samples of XAD resin.
Completeness is defined as the number of measurements judged valid compared to the
number of measurements needed to achieve a specified level of confidence in decision
making.
List QA objectives according to compound type, as was done for the semivolatile detection
limits in Table E-3. In other cases, where detection limits are derived from applicable
regulations, list detection limits for individual compounds, as in Table E-5.
Finally, it is important to explain how the QA objectives are to be interpreted in a statistical
sense. QA objectives are often interpreted in a sense that all data must fall within these goals; for
such projects any data that fail to satisfy the QA objectives are rejected and corrective action is
undertaken. However, other interpretations are possible. For example, the project requirements
may be satisfied if the average recovery is within the objectives; that is, excursions beyond the
objectives might be permitted, but the average recovery would have to satisfy the goals.
Whatever the case, it is important to describe in this section how tabulated QA objectives will be
interpreted.
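The two interpretations described above can be stated precisely; the Python sketch below (hypothetical function names) checks a set of matrix spike recoveries either against the strict rule that every result must fall within the objective or against the alternate rule that only the average must.

    def meets_objective_all(recoveries, low, high):
        """Strict interpretation: every recovery must fall within the objective."""
        return all(low <= r <= high for r in recoveries)

    def meets_objective_average(recoveries, low, high):
        """Alternate interpretation: only the average recovery must fall within
        the objective; individual excursions are permitted."""
        return low <= sum(recoveries) / len(recoveries) <= high

    # Hypothetical recoveries (%) checked against a 50-150 percent accuracy objective.
    recoveries = [88.0, 102.0, 47.0, 131.0, 96.0]
    print(meets_objective_all(recoveries, 50.0, 150.0))       # False (one excursion)
    print(meets_objective_average(recoveries, 50.0, 150.0))   # True (mean about 92.8)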
4.3 Qualitative QA Objectives: Comparability and Representativeness
Comparability is the degree to which one data set can be compared to another. For instance,
to evaluate an environmental cleanup process, analyses of the feed and discharge streams must
be comparable. Similarly, to perform a nationwide environmental survey, methods used at
different locations must be comparable. Comparability is achieved by the use of consistent
methods and by traceability of standards to a reliable source. The choice of a comparative
method is a critical element in the demonstration. A discussion of the selection process will
comprise a separate section in the demonstration plan. The laboratory conducting the
confirmatory analysis must be shown to be independent of the technology developer. The
laboratory must also demonstrate knowledge and experience with the selected confirmatory
method.
Representativeness is the degree to which a sample or group of samples is indicative of the
population being studied. An environmental sample is representative for a parameter of interest
when the average value obtained from a group of such samples tends towards the true value of
that parameter in the actual environment, as the number of representative samples is increased.
Representativeness is normally achieved by collecting and analyzing a sufficiently large
number of unbiased samples. A critical element in this process is often the cost per analysis. A
choice must be made between conducting many simple analyses, which are often less
informative, versus fewer complex, more expensive, but data-rich analyses. The economics of
conducting a demonstration must always be considered in deciding how to design the
confirmatory studies. It should also be understood that the confirmatory method and laboratory
will have certain precision and accuracy limitations. A critical review of laboratory procedures
will be required to ensure that a fair evaluation of the demonstration technology can be made.
4.4 What If QA Objectives Are Not Met?
There should be some discussion concerning the impact of not meeting one or more QA
objectives. Will the demonstration be a complete loss? Will some, but not all, of the
demonstration goals still be realized? Will the statistical confidence level be reduced? Are there
legal or regulatory ramifications? Answers to such questions will help provide a critical
perspective on the QA program being applied to this demonstration and the overall value of the
demonstration.
4.5 Other QA Objectives
Some demonstrations may require additional QA objectives; requirements for such QA
measurements should be stated. This may also include a discussion of additional confirmatory
analysis or the use of a referee laboratory.
Table E-3. QA Objectives for Precision, Accuracy, and Method Detection Limits (MDL)

                                                      Reporting              Precision   Accuracy
Critical Measurement     Matrix       Method          Units       MDL        (RPD)(a)    (% Recovery)(b)   Completeness(c)

Semivolatile organic     Water        8270            µg/L        20         30          50-150            200
compounds(d) (SVOC)      Soil         8270            µg/kg       600        40          50-150            200

Volatile chlorinated     Water        601             µg/L        (e)        50          50-150            300
organic compounds

Metals
  As                     Soil/water   7060 (GFAA)     µg/L        5(f)       35          80-120            200
  Ba                     Soil/water   7080 (FAA)      µg/L        1000(f)    35          80-120            200
  Cd                     Soil/water   7131 (GFAA)     µg/L        0.5(f)     35          80-120            200
  Cr                     Soil/water   7190 (FAA)      µg/L        600(f)     35          80-120            200
  Pb                     Soil/water   7421 (GFAA)     µg/L        1(f)       35          80-120            200

Flow rate                Water/air    Rotameter       L/min       -          -           ±5(g)             100
Temperature              Water/air    Thermocouple    °C          -          -           2(h)              100

(a) Given as Relative Percent Difference of laboratory duplicates, unless otherwise indicated.
(b) As percent recovery of matrix spike, unless otherwise indicated.
(c) Based on the number of measurements needed to achieve % level of confidence in decision making.
    (Extra samples will be collected to permit reanalysis, as required.)
(d) All of the Principal Organic Hazardous Compounds (POHCs) of interest to this project will be required to
    meet these objectives. Other compounds determined by this method need not satisfy these objectives.
(e) MDLs are given in Table . (See example Table 3-3)
(f) MDL for water samples. MDL for soils is 100 times greater. Reporting units for soils will be mg/kg.
(g) Maximum permitted deviation against volumetric standard.
(h) Absolute deviation against ASTM thermometer.

This table is provided to illustrate a typical format.
Table E-4. QA Objectives for Precision, Accuracy, and Detection Limits - Alternate Form

Data Quality Parameter       Method of Determination       Frequency                      Required Objective(a)

Semivolatile Organic Compounds

Precision
 - Water                     Field duplicate               1/test condition               RPD < 50
 - TCLP leachates            Duplicate leaching of         1/test condition               RPD < 60
                             laboratory-split sample

Accuracy
 - Water & TCLP leachates    Laboratory matrix spike       1 water/test condition,        Recovery = 50-150%
                                                           1 leachate/test condition
 - Water & TCLP leachates    Surrogate                     All samples                    (b)
 - Water                     NIST standard reference       1/project                      Recovery = 50-150%
                             materials
 - MM5 train                 Spike of XAD resin            3/project                      Recovery = 50-150%

Detection limits
 - Water                     7 analyses of spiked          1 before project               MDL ≤ 10 µg/L, neutrals
                             clean water                                                  MDL ≤ 20 µg/L, phenols
                                                                                          MDL ≤ 100 µg/L, bases
 - MM5 train                 3 analyses of spiked          1 before project               MDL ≤ 20 µg
                             XAD resin

(a) RPD = relative percent difference. MDL = method detection limit.
(b) As specified in Method 8270.
Note (1): Objectives must be met for all critical measurements. (See Section 1.2 of this document.)
Note (2): MDLs must be calculated according to Equation (8), Section 9.0 of this document, which takes
          into account the number of low-level samples analyzed.

This table is meant to illustrate a typical format.
Table E-5. Required Detection Limits for Volatile Chlorinated Organic Compounds

                              Regulatory             Required
Compound                      Threshold (µg/L)       MDL (µg/L)

1,1,1-Trichloroethane              5                    0.5
1,1-Dichloroethane                 5                    0.5
1,1-Dichloroethene                 5                    0.5
Vinyl chloride                     2                    0.2
1,2-Dichloroethane                 1                    0.1
Perchloroethylene                  5                    0.5

Note: Method detection limits for these compounds are critical and will be determined experimentally by the
laboratory before sample collection is started. MDLs must be well below the Regulatory Threshold in
order to demonstrate that treated waters meet discharge limits.

This table is provided to illustrate a typical format.
5.0 SITE SELECTION AND SAMPLING PROCEDURES
Provide a discussion which describes a plan for site selection and sampling which is
responsive to the demonstration objectives. A detailed discussion of the sampling plan
development process is presented in Volume 2 of SW-846, Test Methods for Evaluating
Solid Waste (3rd Ed.). Since SW-846 will likely undergo revisions, be sure to refer to the
latest revision or edition.
The narrative should explain the overall sampling strategy, the specific sampling
procedures that will be employed, sample custody, record keeping, and shipping
requirements.
5.1 Sampling Site Selection
For most technology demonstration projects, the sampling sites will have been
identified and preliminary data will be available. Explain how specific sampling locations
will be selected for the technology demonstration. The following information should be
provided:
A qualitative statement of the sample population to be represented by the
samples;
A description of the statistical method or scientific rationale to be used in
selecting sample locations;
Descriptions of the type of sampling strategy (e.g., simple, stratified,
systematic random sampling);
A description of the sample types (air, water, soil, biota);
A qualitative statement regarding potential sources of sample contamination;
and,
A discussion of the extent to which site selection will affect the validity of
the resulting data and demonstration objectives.
5.2 Sampling Site Description
Provide relevant charts, maps, sampling grids, or tables specifying the exact sampling
locations. Note any site modifications or additions that will be needed prior to sampling
and describe all locations and access points for critical process measurements such as
flow rates, pressures, or temperatures. Discuss any site-specific factors that may affect
sampling procedures. For each analyte and each sampling point, list the frequency of
sample collection and the total number of samples collected. This summary can best be
prepared in tabular form, as shown in Table E-6. Note that the numbers given in this table
may differ from the number of analyses, since some samples may be analyzed in
replicate, while others may be analyzed only on a contingency basis. Explain the
statistical basis for the sampling scheme, as necessary.
5.3 Sampling Procedures
Describe the specific procedures that will be used for collecting and preserving
samples.
Discuss each sampling procedure that will be employed. For EPA-approved
procedures, a reference is sufficient; other sampling procedures should be
summarized in the text, and additional details should be provided in an
Appendix.
Prepare a list of analytes, sample volumes to be collected, and the amount of
sample that is required for each analysis (see Table E-7). Note whether the
required amount is intended for matrix spike/matrix spike duplicate
determinations or for a single determination. Be sure to review this table to
ensure that the sample volume or mass is sufficient for all intended
confirmatory analyses.
Describe any compositing or sample splitting procedures that will be
employed in the field or laboratory.
Describe any sampling equipment that will be used, and how this equipment
will be calibrated.
Explain how sample containers will be cleaned to prevent sample
contamination, and how new sample containers will be checked for
contaminants.
Describe the containers used for sample collection, transport, and storage for
each sample type. Include sample preservation methods, noting specific
reagents, equipment, supplies, etc., required for sample preservation, and the
specific time requirements for shipping samples to the laboratory. Note
refrigeration conditions and holding times that will be employed. (See, for
example, Table E-8.)
Describe the procedures used to record sample history, sampling conditions,
and any other pertinent information; include examples of forms that will be
employed. Describe the numbering sequence to ensure that each sample will
be assigned a unique number.
Include an example of the sample label to be used.
5.4 Sample Custody
Occasionally samples are spilled, contaminated, accidentally evaporated to dryness,
or otherwise compromised before or during sampling and analysis. Sample custody
allows detection of such problems should they occur and minimizes such occurrences by
assigning responsibility for all stages of sample handling. Sample custody is maintained
when the samples are in a secure area or are in the view of, or under the control of, a
particular individual. Records of everyone handling samples are maintained so that a
sample history can be reconstructed later, should the need arise.
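A custody history of this kind is essentially an append-only record of transfers; the following Python sketch (hypothetical field names, not a required format) shows how such records might be kept so that the handling history of any sample can be reconstructed.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class CustodyTransfer:
        """One link in a sample's chain of custody."""
        sample_id: str
        released_by: str
        received_by: str
        location: str
        timestamp: datetime
        condition: str = "intact"   # note spills, broken seals, and similar problems

    @dataclass
    class CustodyLog:
        """Append-only custody history for all samples in a demonstration."""
        transfers: List[CustodyTransfer] = field(default_factory=list)

        def record(self, transfer: CustodyTransfer) -> None:
            self.transfers.append(transfer)

        def history(self, sample_id: str) -> List[CustodyTransfer]:
            """Reconstruct the custody history of a single sample."""
            return [t for t in self.transfers if t.sample_id == sample_id]

    # Hypothetical example: one sample handed from the field custodian to a courier.
    log = CustodyLog()
    log.record(CustodyTransfer("SVOC-C1-001", "field custodian", "courier",
                               "site gate", datetime(1996, 10, 31, 9, 30)))
    print(len(log.history("SVOC-C1-001")))   # 1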
In developing appropriate procedures, be sure to:
Give the names of all sample custodians in the field and in each laboratory.
Give examples of forms that will be used to maintain sample custody in the
field, during shipping, and in the laboratory.
Seal shipping containers with chain-of-custody seals. Give an example of the
seal that will be used.
Describe procedures that will be used to maintain chain of custody during
transfer from the field to the laboratory and within the laboratory.
Provide for archiving of all shipping documents and other paperwork
received at the laboratory with the samples.
Chapter 9 of SW-846 (3rd Ed.) provides additional discussion of shipping and
custody procedures. Since SW-846 will likely undergo revisions, be sure to refer to the
latest version.
Table E-6. Summary of Number of Samples Required for Hypothetical Incineration Test

Description/Use                     Slurry    Scrubber   Scrubber    Ash    Stack    Total
                                    Feed      Sump       Blowdown    (S3)   (S4)
                                    (F1)      (S1)       (S2)
SVOCs(a)
  Non-QC Samples + MS + MSD(b)        10        10         10          5      0       35
  Sample Duplicates                    3         0          0          0      0        3
  Sample Blanks                        1         1          0          0      0        2
  Spare Samples(c)                     5         5          5          5      0       20
Modified Method 5
  Non-QC Samples                       0         0          0          0      4        4
  Sample Blanks (see text)             0         0          0          0      1        1
VOCs(a)
  Non-QC Samples                      20        20         20         10      0       70
  Sample Blanks                        1         1          1          1      0        4
  Trip Blanks(d)                       1         1          1          1      0        4
  Spare Samples(c)                    20        20         20         10      0       70

(a) As designated in Section ___.
(b) Each sample will provide enough material for an original sample plus a matrix spike and a matrix spike duplicate.
(c) Not analyzed unless first analysis fails.
(d) Not analyzed unless sample blank is contaminated.
This table is meant to illustrate a typical format.
Table E-7. Summary of Laboratory Analyses and Sample Quantity Requirements

Stream              Sampling Method          Analysis Parameter     Container Size    Sample Quantity
                                                                                      Required for Analysis(1)
Test Soil           Shelby tubes             Semivolatiles          250 mL            50 g
                                             Volatiles              250 mL            50 g
                                             Metals scan            250 mL            100 g
                                             Dioxins/furans         250 mL            50 g
                                             EP toxicity            250 mL            250 g
                                             TCLP                   250 mL            500 g
                                             Higher heating value   250 mL            250 g
                                             Chlorine               250 mL            250 g
                                             Moisture               250 mL            100 g
Treated soil        ASTM-C172/composite(2)   Semivolatiles          250 mL            50 g
                                             Volatiles              250 mL            50 g
                                             Metals scan            250 mL            100 g
                                             Dioxins/furans         250 mL            50 g
                                             TCLP                   250 mL            500 g
Scrubber makeup,    Grab                     Semivolatiles          1 L               1 L
scrubber liquor                              Volatiles              40 mL VOA vial    40 mL VOA vial
                                             Metals scan            1 L               1 L
                                             Dioxins/furans         1 L               1 L
Scrubber solids     Thief                    Semivolatiles          250 mL            50 g
                                             Volatiles              250 mL            50 g
                                             Metals scan            250 mL            100 g
                                             Dioxins/furans         250 mL            50 g
Stack               Method 5                 Particulate            --                900 L
                    Midget M-5(2)            HCl                    --                900 L
                    Metals train(2)          Metals                 --                900 L
                    Modified Method 5        Semivolatiles          --                3000 L
                                             Dioxins/furans         --                3000 L
                    NIOSH-100a(2)            Volatiles              --                20 L

Note (1) Indicate whether the listed quantity suffices for replicates, spikes, and other QC purposes, or
         if it is sufficient for only a single analysis. In the latter case, indicate how extra material will
         be provided for QC purposes.
Note (2) Copies of these methods would have to be appended because they are not readily available.
This table is meant to illustrate a typical format.
Table E-8. Required Containers, Preservation Techniques, and Holding Times

Measurement                Type(a)                   Preservation(b)                        Maximum Holding Times(c)
Extractable organics       G, Teflon-lined septum    Cool to 4°C, protect from light        7 days until extraction,
                                                                                            40 days after extraction
Pesticides, PCBs           G, Teflon-lined septum    Cool to 4°C, pH 5-9                    7 days until extraction,
                                                                                            40 days after extraction
Metals (except mercury     P, G                      HNO3 to pH < 2                         6 months
and chromium VI)
Mercury                    P, G                      HNO3 to pH < 2                         28 days
Chromium VI                P, G                      Cool, 4°C                              24 hours
pH                         P, G                      None required                          Analyze water immediately
                                                                                            (on site); none specified
                                                                                            for soil
Residue                    P, G                      Cool, 4°C                              7 days
Organic carbon, total      P, G                      Cool, 4°C; HCl or H2SO4 to pH < 2      28 days
Sulfide                    P, G                      Cool to 4°C; add zinc acetate plus     7 days
                                                     NaOH to pH > 9

a Polyethylene (P) or glass (G).
b Samples will be preserved immediately upon sample collection.
c Samples will be analyzed as soon as possible after collection. The times listed are the maximum
  times that samples will be held before analysis and still be considered valid. All data obtained
  beyond the maximum holding times will be flagged.
This table is meant to illustrate a typical format.
6.0 ANALYTICAL PROCEDURES AND CALIBRATION
The information in this section can be used in preparing Sections 6 and 7 of the
demonstration plan. It describes all of the field and laboratory procedures used for both chemical
and physical measurements. Sample preparation methods and cleanup procedures (such as
extraction, digestion, column cleaning) should also be included.
All methods must be appropriate for their intended use and described in sufficient detail that,
when coupled with appropriate QC procedures, the analytical chemists or other measurement
specialists can carry out their procedures.
Most confirmatory analysis will rely on EPA-approved methods that have been validated for
environmental samples. Standardized procedures from other organizations, such as the ASTM
and the American Public Health Association, are also commonly employed when approved or
validated EPA methods are unavailable or inappropriate. When used, nonstandard methods must
be described in detail. It is essential to consult the analytical laboratory and other measurement
specialists for guidance because they know the intricacies and limitations of the methods. Close
coordination with the EPA technical lead will also be required since they must approve selection
of the confirmatory method.
Demonstrations often involve many different kinds of measurements. A separate table must
be provided for each type of test. Specify the parameter to be measured, the sample type, the
method number (when available), the title, the method type, and a reference (See Table E-9).
Note the revision number of the method or the edition number of a publication, since nominally
identical methods may be modified by these updates. Provide additional project-specific
information as necessary in the demonstration plan.
6.1 EPA-Approved or Other Validated Standard Methods
EPA-approved or similar validated methods can be incorporated by reference. Once a
method is cited, do not repeat information that is already found in the method. Some additional
information is almost always required, however, to assure that demonstration-specific
requirements will be met. The following considerations apply:
Some EPA-promulgated methods contain only general procedure descriptions and
lack specific QC requirements or applicable validation data. For example, EPA
Method 18 provides general requirements for GC analysis of stack gases, but
contains no QC requirements. For such methods, validation data pertinent to the
specific project must be appended. A preliminary method validation can be
specified as a subtask of the demonstration; in that case specific procedures and
acceptance criteria for method validation must also be included as part of the
demonstration plan.
Other EPA-approved standard methods, such as those found in Standard Methods
for the Examination of Water and Wastewater, give operating procedures but omit
most QC and calibration requirements. This information must be provided in this
section of the demonstration plan. Be sure to specify the frequency, acceptance
criteria, and corrective action plans for all QC procedures and calibrations.
It is not necessary to include copies of methods or sections of methods from
SW-846, the Code of Federal Regulations, or Methods for Chemical Analysis of
Water and Waste because these sources are readily available. However, do append
ASTM and NIOSH (National Institute for Occupational Safety and Health)
procedures because they may not be so readily available to other project principals
or reviewers.
EPA-approved or similarly validated methods that are significantly modified are
considered to be invalid and are treated as described in the next section.
Certain EPA methods such as those found in SW-846 specify most operating
details, including quality control and calibration requirements. Such procedures,
however, frequently allow the user to specify certain other options to satisfy project
objectives. For example, for multianalyte methods such as GC/MS, the user will
typically specify the target compound list that is required for the project. Matrix
spike compounds are chosen from the compounds of interest to the particular
project, and are not necessarily those recommended in the method. List the
project-specific target compounds to be used as calibration check compounds and
matrix spike compounds. Specify the acceptance criteria for matrix spike
compounds. In certain cases, isotopically labeled forms of the project analytes may
be included as surrogates.
Table E-9. Typical Summary Table of Standard Methods and Procedures

Parameter                    Sample Type(a)       Method Number             Method Title                        Method Type          Source
Polychlorinated Biphenyls    L/LTL                EPA Method 3520           Continuous Liquid-Liquid            Extraction           SW-846
(Leaching Tests)                                                            Extraction
                             U/RM/T               EPA Method 3540/3620      Soxhlet Extraction/Florisil         Extraction/          SW-846
                                                                            Column Cleanup                      Cleanup
                             3520/3540 extracts   EPA Method 8080           Organochlorine Pesticides           GC/ECD               SW-846
                                                                            and PCBs
                             3520/3540 extracts   EPA Method 680 (Backup)   Determination of Pesticides and     GC/MS                EPA
                                                                            PCBs in Water and Soil/Sediment
                                                                            by GC/MS

a  S = Treated waste, while still a slurry
   U = Untreated waste
   RM = Reagent mix
   T = Treated water
   LT = Treated water, long-term monitoring
   L = Leachate
   LTL = Leachate, long-term monitoring
This table is meant to illustrate a typical format.
The following is an example of how one might specify various options allowed by Method
8080, a validated method from SW-846 which contains QC and calibration requirements:
Example: Method 8080.
This method will be employed to determine the 19 pesticides listed in Table _-_ of this QA
Project Plan. Although PCBs can also be determined by this method, the GC will not be
calibrated for these products unless they are observed. The matrix spike compounds will be
the six critical pesticides listed previously in this QA Project Plan. The surrogate will be
tetrachlorometaxylene (not dibutylchlorendate). Detection will be by electron capture
detection (ECD). Quantitation will be by external calibration. Acceptance criteria for
surrogate recovery will not be determined by control charts, but must be within a 50-150
percent range. Acceptance criteria for matrix spike/matrix spike duplicates are those stated
in Table _-_ of this QA Project Plan. Extraction and cleanup procedures are described
below...
6.2 Nonstandard or Modified Methods
Any nonstandard procedure must be described in detail in the format of an EMMC method
and appended to the demonstration plan. Validation data applicable to the expected samples must
also be included. Validation data must demonstrate that the analytes of interest can be
determined without interferences in the expected matrices, and that precision, accuracy, and
detection limits will be adequate for the intended use of the data. The method is validated only
for the samples expected from the demonstration, not for general environmental use. Once the
SOP is written and the validation data are accepted, the method is considered validated only for
that specific project and the specific sample matrix associated with that demonstration; this
project-specific validation does not constitute EPA approval for other projects or matrix types.
All nonstandard methods must be validated prior to approval of the demonstration plan.
6.3 Calibration Procedures and Frequency
Discuss the calibration procedures for each analytical or measurement system used to obtain
data for critical measurements. Each description should include the specific calibration
procedure to be used and the frequency of calibration verification.
In the case of standard EPA-approved methods that include calibration procedures,
a reference to those methods suffices. Simply list the required frequency and
acceptance criteria in a summary table, such as that shown in Table E-10. For
nonstandard methods, reference can be made to the SOP. Describe all other
calibration procedures in detail.
A list of calibration standards, including source, traceability, and verification of
purity, must be included.
For process measurements (e.g., flow, mass, etc.) and for chemical or physical
analyses, specify the planned frequency of initial and continuing calibration checks
and the acceptance criteria for all calibration measurements. For routine, scheduled
calibrations, include the calibration frequency and acceptance criteria along with
the summary of QC requirements.
For physical measurements such as temperature and pressure, calibration statements can be
quite brief.
Example: All thermocouples intended for use in the range 50°C to 350°C will be calibrated
versus an NIST-traceable thermometer at a minimum of two temperatures, by placing both in an
oven simultaneously and noting any difference between readings. The thermocouple readout
must be within 2°C of the corrected mercury thermometer or the thermocouple device will be
replaced or corrected.
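For illustration only, the following sketch (Python) applies the two-point comparison and the 2°C criterion from the example above to a pair of hypothetical readings; the readings are made up, and the acceptance limit would come from the demonstration plan.

    # Illustrative check of thermocouple readings against an NIST-traceable
    # reference thermometer; the 2 degree C limit is taken from the example
    # above, and the readings are hypothetical.
    ACCEPTANCE_LIMIT_C = 2.0

    def thermocouple_passes(readings):
        """readings: list of (reference_degC, thermocouple_degC) pairs.
        Returns (passes, worst_deviation_degC)."""
        worst = max(abs(tc - ref) for ref, tc in readings)
        return worst <= ACCEPTANCE_LIMIT_C, worst

    # Two-point comparison near the ends of the 50-350 degree C working range.
    passes, worst = thermocouple_passes([(50.0, 51.2), (350.0, 348.5)])
    print(f"worst deviation = {worst:.1f} C -> {'accept' if passes else 'replace or correct'}")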
7.0 DATA REDUCTION, VALIDATION, AND REPORTING
Describe how data will be reduced, validated and reported. Deliverables that will be
required from the analytical laboratory must also be specified. This material is provided for
information purposes only, since the EPA will be responsible for this element of the
demonstration process.
Provide an overall schematic of data flow, such as shown in Figure 2. This flow chart
indicates the entire process of data handling, collection, transfer, storage, recovery, and review
for both field and laboratory operations.
7.1 Data Reduction
Name the individuals responsible for data reduction. For a technology
demonstration, this will be the EPA or a verification entity.
Summarize the data reduction procedures that are specific to this demonstration.
Data reduction procedures that are part of standard methods or SOPs should not be
repeated here other than to note any deviations.
Summarize the planned statistical approach including formulas, units, and
definition of terms. Do not simply reference a "standard text."
Explain how results from blanks will be treated in calculations.
7.2 Data Validation
Name the individuals responsible for data validation at each stage of data reduction.
This will be a responsibility of the EPA or a verification entity.
Describe the procedures that will be used for determining outliers. Describe the
guidelines that will be employed for flagging or validating data.
[Figure 2 shows an example data flow: sample receipt, sample preparation, and sample analysis
feed data acquisition and reduction; raw data are reviewed by the laboratory analysts, the
analytical/QC data are reviewed by the laboratory supervisor, and a final data review is performed
by the project and QA managers. At each review step, data that are not approved are returned for
corrective action or reanalysis; approved data proceed to report preparation, final report review
by the project manager, and release of the report.]

Figure 2. Example of a data reduction, validation, and reporting scheme.
Table E-10. Summary Table of Calibration Requirements for Measurements

Process Parameter        Measurement       Device            Calibration Procedure            Frequency            Acceptance
                         Classification                                                                            Criteria(a)
Flow rate of dopant      Critical          Mass flow meter   Compare to calibrated dry        Before/after field   5%
feed                                                         test meter                       test, weekly
Secondary volume         Critical          Dry test meter    NIST-traceable spirometer        Before/after field   2%
standard                                                                                      test
Secondary time           Critical          Stopwatch         NIST time base                   Before field test    0.05 sec/min
standard
Temperature              Noncritical       Thermometer       Comparison to certified          Before field test    5°C
(50-300°C)                                                   thermometer
Temperature              Critical          Thermocouple      Comparison to NIST-calibrated    Before field test    5°C
(300-800°C)                                                  thermocouple
Flow rate in stack       Critical          Pitot tube and    Measure pitot orifice with       Before field test    1%
                                           manometer         NIST-traceable micrometer;
                                                             compare manometer markings to
                                                             NIST-calibrated meter stick

a Maximum allowable deviation from standard.
This table is meant to illustrate a typical format.
7.3 Data Reporting
Name the individuals responsible for the various types of reports. This will be a
responsibility of the EPA or a verification entity.
Indicate the units for each measurement and each matrix, and whether data will be
reported on a wet, dry, or some other reduced basis. If this requirement has been
covered in other sections, do not repeat it here. Indicate data storage requirements
that will be expected of the laboratory once the project is complete. Will the
laboratory need to maintain a complete set of raw data for six months? One year?
Five years? Also indicate how long the actual samples will be stored, in case
reanalysis is required.
List the deliverables that will be expected from the laboratory and from the field
operations. Will the data package include all raw data sufficient to recalculate any
result, if need be? What type of QC data will be reported? Reporting requirements
may vary, depending on the intended use of the data.
Summarize the data that will be included in the final report. What QC data will be
included? Will analytical and other measurement data be partially reduced before
they are reported, or will all individual measurements be reported?
Example: The final report will contain the following analytical data:
All analytical results from all primary samples. Data judged to be outliers will be included,
along with a justification for excluding this information from further interpretation.
All individual results from standard reference materials, independent check samples,
replicates, and matrix spikes, including initial and final concentrations.
Data from the continuous emission monitors will be reported as 15-minute averages and,
when appropriate, will be summarized graphically.
For Category II projects, this section should contain a statement that the final report will
include a QA section that documents QA/QC activities and results. These results must be readily
correlated to the primary data and must clearly indicate the limitations of the data and the range
of validity of the conclusions. The final report should also include a summary of the original QA
objectives and a statement regarding whether or not these objectives were met. If QA objectives
were not met, include an explanation of the impact on the project's conclusions.
8.0 INTERNAL QUALITY CONTROL CHECKS
This section describes all internal quality control (QC) checks that will be used throughout
the project, including field and laboratory activities of all organizations involved. The QC
procedures that are specified should follow from the QA objectives previously stated. This
section describes how the QA specifications will be met.
8.1 Types of QC Checks
Examples of QC checks that should be considered include the following:
Samples
- Collocated, split, replicate
Spikes
- Matrix spikes and matrix spike duplicates
- Spiked blanks
- Surrogates and internal standards
Blanks
- Sampling, field, trip, method, reagent, instrument
- Zero and span gases
Others
- Standard reference materials (complex natural materials, pure solutions)
- Mass tuning for mass analysis
- Confirmation on second column for gas chromatographic analyses
- Control charts
- Independent check standard
- Determinations of detection limits
- Calibration standards
- Proficiency testing of analysts
- Any additional checks required by the special needs of your project
Include QC checks for process measurements as well.
Most of this information can be summarized in a table, as shown in Table E-11. This table
should designate the types of QC procedures, required frequencies, associated acceptance
criteria, and corrective action that will occur if the acceptance criteria are not met. When QC
procedures are referenced to a standard method that describes an exact procedure, additional
discussion is not normally needed. However, standard methods lacking exact QC procedures or
nonstandard methods require a detailed explanation, either in this section of the QA Project Plan
or in an appended Standard Operating Procedure. The tabular format shown in Table E-11 is also
convenient for summarizing routine and ongoing calibration requirements. If routine calibration
is summarized in this section, a reference to that effect should be included in the section on
analytical procedures.
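Where the QC and calibration summary is assembled or checked electronically, a simple data structure can mirror the columns of Table E-11. The sketch below (Python) is illustrative only; the procedures, frequencies, criteria, and corrective actions shown are placeholders, not requirements.

    # Illustrative representation of a QC summary table (cf. Table E-11).
    # All entries are placeholders; actual values are project-specific.
    from dataclasses import dataclass

    @dataclass
    class QCCheck:
        procedure: str
        frequency: str
        acceptance_criteria: str
        corrective_action: str

    qc_summary = [
        QCCheck("Method blank", "Before any samples; each batch",
                "No significant interference to target analytes",
                "Find and remove interference"),
        QCCheck("Matrix spike / matrix spike duplicate", "Each batch of 20 or fewer",
                "Recoveries and RPD within project limits",
                "Run independent QC reference standard; evaluate data"),
    ]

    for check in qc_summary:
        print(f"{check.procedure} | {check.frequency} | "
              f"{check.acceptance_criteria} | {check.corrective_action}")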
The accompanying text must assure that there are no ambiguities. Particularly troublesome
terms are "duplicate" or "replicate." The text should explain precisely how and when replicates
are taken. Do replicate samples, for instance, refer to samples collected simultaneously or
sequentially in the field; to samples collected at the same sample point but at different times; to
samples that are split upon receipt in the laboratory? The term "QC check sample" must also be
carefully defined. Indicate at which point matrix spiking occurs.
Exact procedures for preparing the numerous kinds of blanks must be described fully in the
text. Never assume that a term such as "field blank" will mean the same to the sampling team or
to a reviewer that it does to you.
Specify the compounds or elements that will be employed as matrix spikes and surrogates.
Some standard methods recommend surrogate and matrix spike compounds, which may be
incorporated by reference when appropriate. Typically, some of the matrix spike compounds
must be selected on a project-specific basis.
In some cases, it may also be necessary to provide additional discussion of potential
problems that might be expected with certain QC procedures, along with the proposed solutions.
For example, spiking samples in the field is frequently less reliable and more difficult than
spiking in the laboratory, due to contamination and less controlled conditions. If field spiking is
required, then a discussion of procedures that minimize such problems is also required.
Because standard methods often include extensive QC requirements, it is natural to ask why
such QC procedures must be summarized in this section. Why not simply state that QC will be
performed as required in the method? In some cases, EPA standard methods are sufficiently
complete for this approach, but frequently they are not. Many EPA-approved methods do not
include specific QC procedures. The more complete EPA methods allow options, such as the
choice of matrix spike compounds, or the use of either control charts or fixed acceptance limits.
The analytical and measurement component of a demonstration will require specific QC
guidelines to document technology performance.
Be sure to:
Identify the stage at which replication and spiking occur. Avoid using terms such as
"sample replicate" without explaining how such replication will be performed.
Explain exactly how blanks will be prepared.
9.0 PERFORMANCE AND SYSTEMS AUDITS
The demonstration plan should describe the QA audits planned for monitoring any system
used for obtaining critical measurements. A schedule of all audits should be included, along with
identification of responsible personnel. It should indicate what audit reports will be generated
and who will receive them. If no audits are planned, include an explanation.
A Technical Systems Audit (TSA) is a qualitative evaluation of all components of the total
measurement system, including technical personnel and QA management. This type of audit
includes a careful evaluation of both field and laboratory QC procedures. TSAs are normally
performed before, or shortly after, measurement systems are operational; they should also be
performed on a regularly scheduled basis throughout the project.
After measurement systems are operational and begin generating data, Performance
Evaluation Audits (PEA) are conducted periodically to determine the bias of the total
measurement system(s). As part of a PEA, the laboratory analyzes a performance evaluation
sample. Long-term projects should provide for regularly scheduled PEAs.
Audits of Data Quality (ADQ) are retrospective evaluations of data. Typically, a
representative portion of the results in an analytical report is reviewed in detail, starting with raw
data and chromatograms, and proceeding through the calculation of final results. ADQs are often
used to resolve specific questions regarding the quality of a data set.
Table E-11. Scheduled QC and Calibration

SVOCs (EPA 8270)

Section Number   Procedure                Frequency of QC            Acceptance Criteria                        Corrective Action
in Method                                 Procedure
7.3.1            DFTPP tune               12 hours                   Table 3/8270                               Re-run before sample analysis
7.3.1            Inertness for DDT        12 hours                   DDE, DDD < 20% of DDT                      Repair before sample analysis
7.3.3, 7.3.4     5-pt ICAL                Initially and as needed    RSD of CCC compounds < 30%;                Re-calibrate
                                                                     RRTs within 0.06;
                                                                     RRF for all SPCCs ≥ 0.05
7.3.4, 7.4       Continuing calibration   12 hours                   RF for all SPCCs ≥ 0.05;                   Correct before analysis;
                                                                     RF for all CCCs within 30% of ICAL;        otherwise repeat ICAL
                                                                     RT of IS within 30 sec of last CCC;
                                                                     area of IS within factor of two of
                                                                     last CCC
8.1, 8.6         Matrix spike             Each batch ≤ 20            See Table 6                                Run independent QC reference
                                                                                                                standard. If reference standard
                                                                                                                is okay, accept data; otherwise
                                                                                                                reject data.
8.6              Replicate spike          Each batch ≤ 20            None for duplicates                        None
8.9              Surrogate recovery       Each sample                From control charts;                       Repeat analysis
                                                                     default = Table 8
8.2, 8.6         Method blank             Before any samples;        No significant interference to             Find and remove interference
                                          each blank                 target analytes
8.5              Proficiency Test         Each new analyst           See Table 6                                Re-do before samples are analyzed

This table is provided to illustrate a typical format.
10.0 CALCULATION OF DATA QUALITY INDICATORS
The following discussion describes how data quality indicators should be calculated and
reported. As a minimum, equations must be provided for precision, accuracy, completeness, and
method detection limits. In addition, equations must be given for other project-specific
calculations, such as mass balance, confidence ranges, etc.
Listed below are general guidelines for calculating the more common data quality indicators.
10.1 Common Data Quality Indicators
10.1.1 Precision
If calculated from duplicate measurements, relative percent difference is the normal measure
of precision:
RPD = [(C1 - C2) x 100%] / [(C1 + C2)/2]     (1)

where: RPD = relative percent difference
       C1 = larger of the two observed values
       C2 = smaller of the two observed values.
If calculated from three or more replicates, use relative standard deviation rather than RPD:
RSD = (S/ȳ) x 100%     (2)

where: RSD = relative standard deviation
       S = standard deviation
       ȳ = mean of replicate analyses.
Standard deviation is defined as follows:
S = √[ Σ(yi - ȳ)² / (n - 1) ]     (3)

where: S = standard deviation
       yi = measured value of the ith replicate
       ȳ = mean of replicate measurements
       n = number of replicates.
For measurements, such as pH, where the absolute variation is more appropriate, precision is
usually reported as the absolute range, D, of duplicate measurements:
D = |m1 - m2|     (4)

where: D = absolute range
       m1 = first measurement
       m2 = second measurement.

The standard deviation, S, given above, can also be used.
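The precision formulas above translate directly into code. The following sketch (Python, with illustrative concentrations) computes RPD for a duplicate pair and RSD for a set of replicates, using the sample standard deviation (n-1 denominator) of Equation 3.

    # Precision indicators from Section 10.1.1: RPD (Eq. 1) for duplicates and
    # RSD (Eq. 2) for three or more replicates; statistics.stdev uses the
    # n-1 denominator of Eq. 3.
    from statistics import mean, stdev

    def rpd(c1: float, c2: float) -> float:
        """Relative percent difference of duplicate measurements (Eq. 1)."""
        larger, smaller = max(c1, c2), min(c1, c2)
        return (larger - smaller) * 100.0 / ((larger + smaller) / 2.0)

    def rsd(replicates) -> float:
        """Relative standard deviation of three or more replicates (Eq. 2)."""
        return stdev(replicates) / mean(replicates) * 100.0

    # Illustrative values only.
    print(f"RPD = {rpd(10.2, 9.6):.1f}%")           # duplicate pair
    print(f"RSD = {rsd([10.2, 9.6, 10.5]):.1f}%")   # triplicate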
10.1.2 Accuracy
For measurements where matrix spikes are used, calculate the percent recovery as follows:
%R = 100% x (S - U) / Csa     (5)

where: %R = percent recovery
       S = measured concentration in spiked aliquot
       U = measured concentration in unspiked aliquot
       Csa = actual concentration of spike added.
When a standard reference material (SRM) is used:
%R = 100% x Cm / Csrm     (6)

where: %R = percent recovery
       Cm = measured concentration of SRM
       Csrm = actual concentration of SRM.
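A corresponding sketch for the recovery calculations of Equations 5 and 6, again with illustrative concentrations:

    # Percent recovery for a matrix spike (Eq. 5) and for a standard reference
    # material (Eq. 6).  All concentrations below are illustrative.
    def recovery_matrix_spike(spiked: float, unspiked: float, spike_added: float) -> float:
        """%R = 100% x (S - U) / Csa."""
        return 100.0 * (spiked - unspiked) / spike_added

    def recovery_srm(measured: float, certified: float) -> float:
        """%R = 100% x Cm / Csrm."""
        return 100.0 * measured / certified

    print(f"Matrix spike recovery: {recovery_matrix_spike(14.8, 5.1, 10.0):.0f}%")  # 97%
    print(f"SRM recovery: {recovery_srm(48.2, 50.0):.0f}%")                         # 96%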
10.1.3 Completeness
Completeness is defined as follows for all measurements:
%C = 100% x (V/n)     (7)

where: %C = percent completeness
       V = number of measurements judged valid
       n = total number of measurements necessary to achieve a specified level of
           confidence in decision making.

Note: This more rigorous definition of completeness is an improvement of the conventional
definition, in which "n" is replaced by "T," the total number of measurements.
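Completeness (Equation 7) is a one-line computation; the sketch below uses illustrative counts.

    # Percent completeness (Eq. 7): valid measurements relative to the number
    # needed to support the decision, not simply the number attempted.
    def completeness(valid: int, needed: int) -> float:
        return 100.0 * valid / needed

    print(f"%C = {completeness(valid=18, needed=20):.0f}%")   # illustrative counts -> 90%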
10.1.4 Method Detection Limit (MDL)
MDL is defined as follows for all measurements:
MDL = t(n-1, 0.99) x S     (8)

where: MDL = method detection limit
       S = standard deviation of the replicate analyses
       t(n-1, 0.99) = Student's t-value for a one-sided 99% confidence level and
                      a standard deviation estimate with n-1 degrees of freedom.
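The MDL of Equation 8 requires the one-sided 99% Student's t-value for n-1 degrees of freedom; the sketch below obtains it from scipy.stats (assumed to be available), with illustrative replicate results.

    # Method detection limit (Eq. 8): MDL = t(n-1, 0.99) x S, where S is the
    # standard deviation of n replicate analyses of a low-level spiked sample.
    # Replicate values below are illustrative only.
    from statistics import stdev
    from scipy.stats import t

    replicates = [0.48, 0.53, 0.41, 0.50, 0.46, 0.52, 0.44]   # e.g., ug/L
    n = len(replicates)
    s = stdev(replicates)                    # n-1 degrees of freedom
    t_99 = t.ppf(0.99, df=n - 1)             # one-sided 99% confidence level
    print(f"n = {n}, s = {s:.3f}, t = {t_99:.3f}, MDL = {t_99 * s:.3f} ug/L")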
11.0 QUALITY CONTROL REPORTS TO MANAGEMENT
Part of the discussion in Section 7 of the demonstration plan should identify the individuals
responsible for preparing QA reports, and should describe the type and frequency of reports
(weekly oral presentations and discussions, monthly written reports, etc.) that will be used to keep
project management informed. As a minimum, such reports include:
Changes in the demonstration plan;
Summary of QA/QC programs and accomplishments;
Results of technical systems and performance evaluation audits;
Significant QA/QC problems, recommended solutions, and results of corrective
actions;
Data quality assessment in terms of precision, accuracy, representativeness,
completeness, comparability, and method detection limits;
Discussion of whether the QA objectives were met and the resulting impact on
decision making; and
Limitations on use of the data obtained from the technology.
12.0 REFERENCES
References, if any, can be included in the body of the text, as footnotes, or collected in a
separate section of the demonstration plan. References must uniquely identify the cited material. In
particular, when citing various compendia of standard methods published by the EPA, ASTM,
American Public Health Association, etc., be sure to include the edition number, since such
methods can change substantially from one edition to the next.
End of Document