United States                    Office of Research and       EPA/600/R-00/045
Environmental Protection         Development                  March 2000
Agency                           Washington, D.C. 20460



   EPA   Environmental Technology


         Verification Report





         Explosives Detection Technology





         Research International, Inc.


         FAST 2000™











                Oak Ridge National Laboratory

-------
                THE ENVIRONMENTAL TECHNOLOGY VERIFICATION
                                        PROGRAM
            EPA                                             Oak Ridge National Laboratory
                         Joint Verification  Statement
     TECHNOLOGY TYPE:   EXPLOSIVES DETECTION

     APPLICATION:       MEASUREMENT OF EXPLOSIVES IN CONTAMINATED WATER

     TECHNOLOGY NAME:   FAST 2000™

     COMPANY:           Research International, Inc.

     ADDRESS:           18706 142nd Avenue NE
                        Woodinville, WA 98072-8523

     PHONE:             (425) 486-7831
     FAX:               (425) 485-9137

     WEB SITE:          www.resrchintl.com
The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification
Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through
performance verification and dissemination of information. The goal of the ETV Program is to further
environmental protection by substantially accelerating the acceptance and use of improved and cost-effective
technologies. ETV seeks to achieve this goal by providing high quality, peer reviewed data on technology
performance to those involved in the design, distribution, financing, permitting, purchase, and use of
environmental technologies.

ETV works in partnership with recognized standards and testing organizations and stakeholder groups
consisting of regulators, buyers, and vendor organizations, with the full participation of individual technology
developers. The program evaluates the performance of innovative technologies by developing test plans that
are responsive to the needs of stakeholders, conducting field or laboratory tests (as appropriate), collecting
and analyzing data, and preparing peer-reviewed reports. All evaluations are conducted in accordance with
rigorous quality assurance protocols to ensure that data of known and adequate quality are generated and that
the results are defensible.

The Department of Defense (DoD) has a similar verification program known as the Environmental Security
Technology Certification Program (ESTCP). The purpose of ESTCP is to demonstrate and validate the most
promising innovative technologies that target DoD's most urgent environmental needs and are projected to
EPA-VS-SCM-45
                     The accompanying notice is an integral part of this verification statement.
                                                                                   March 2000

-------
pay back the investment within 5 years through cost savings and improved efficiencies. ESTCP
demonstrations are typically conducted under operational field conditions at DoD facilities. The
demonstrations are intended to generate supporting cost and performance data for acceptance or validation of
the technology. The goal is to transition mature environmental science and technology projects through the
demonstration/ validation phase, enabling promising technologies to receive regulatory and end user
acceptance in order to be fielded and commercialized more rapidly.

The Oak Ridge National Laboratory (ORNL) is one of the verification organizations operating under the Site
Characterization and Monitoring Technologies (SCMT) program. SCMT, which is administered by EPA's
National Exposure Research Laboratory, is one of 12 technology areas under ETV. In this demonstration,
ORNL evaluated the performance of explosives detection technologies. This verification statement provides a
summary of the test results for Research International's (RI's) FAST 2000™. This verification was
conducted jointly with the Department of Defense's (DoD's) Environmental  Security Technology
Certification Program (ESTCP).

DEMONSTRATION DESCRIPTION
This demonstration was designed to evaluate technologies that detect and measure explosives in soil and
water. RI elected to analyze only water samples with the FAST 2000. The demonstration was conducted at
ORNL in Oak Ridge, Tennessee, from August 23 through September 1, 1999. Spiked samples of known
concentration were used to assess the  accuracy of the technology. Explosives-contaminated water samples
from Tennessee, Oregon, and Louisiana with concentrations ranging from 0 to 25,000 µg/L were analyzed.
The primary constituents in the samples were 2,4,6-trinitrotoluene (TNT); isomeric dinitrotoluene (DNT),
including both 2,4-dinitrotoluene and 2,6-dinitrotoluene; hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX); and
octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX). The results of the water analyses conducted under
field conditions by the FAST 2000 were compared with results from reference laboratory analyses of
homogenous replicate samples determined using EPA SW-846 Method 8330. Details of the demonstration,
including a data summary and discussion of results, may be found in the report entitled Environmental
Technology Verification Report: Explosives Detection Technology—Research International, Inc., FAST
2000™, EPA/600/R-00/045.

TECHNOLOGY DESCRIPTION
The FAST 2000 is based on a displacement assay that uses antibodies and fluorescence as a means of
detection. The unit (weighing 2.8 lb, with dimensions of 6 x 15.5 x 16 cm) can be easily carried into the field
and plugged directly into a portable PC for on-site data acquisition and analysis.  The key  elements of the
sensor are (1) antibodies specific for the analyte; (2) signal molecules that are similar to the analyte but are
labeled with a fluorophore (a cyanine-based fluorescent dye, Cy5) to enable  fluorescence detection; and (3) a
fluorescence detector. For analysis, the analyte-specific antibodies are immobilized onto a solid support and
then saturated with the fluorescently labeled signal molecule, creating an antibody/signal molecule complex.
Monoclonal antibodies (the Naval Research Laboratory's 11B3 TNT and Strategic Diagnostics RDX) are
immobilized onto porous membrane supports and saturated with the fluorescent  tag. The  membrane is  inserted
into a disposable  coupon and placed in the FAST  2000, and the buffer flow is started by a computer
command. Once the fluorescence background signal due to unbound Cy5 has stabilized (generally
15-20 minutes), the biosensor is ready for sample injection. If the sample contains the target analyte, a
proportional amount of the labeled signal molecule is displaced from the antibody and detected by the
fluorimeter downstream. The coupon and membrane can be used for repeated assays. The life of the
membrane is dependent upon the number and concentration of positive assays that are run. The reporting limit
for both TNT and RDX was 20 µg/L.

-------
VERIFICATION OF PERFORMANCE
The following performance characteristics of the FAST 2000 were observed:

Precision: For the water samples, the mean relative standard deviations (RSDs) for RDX and TNT were
52% and 76%, respectively.

Accuracy: The mean recoveries for RDX and TNT were 192% and 316%, respectively.
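The precision and accuracy figures above follow the conventional definitions: RSD is the standard deviation of replicate results expressed as a percentage of their mean, and recovery is the measured mean expressed as a percentage of the known spiked concentration. The following is a minimal sketch of that arithmetic; the replicate values are hypothetical and are not data from this demonstration.

```python
# Illustrative only: conventional RSD and percent-recovery calculations.
from statistics import mean, stdev

def relative_std_dev(replicates):
    """RSD (%) = 100 * sample standard deviation / mean of replicates."""
    return 100.0 * stdev(replicates) / mean(replicates)

def percent_recovery(measured_mean, spiked_conc):
    """Recovery (%) = 100 * measured mean / known spiked concentration."""
    return 100.0 * measured_mean / spiked_conc

# Hypothetical replicate results (µg/L) for a 100 µg/L spiked sample
replicates = [210.0, 160.0, 95.0, 240.0]
rsd = relative_std_dev(replicates)                     # ~36% RSD
recovery = percent_recovery(mean(replicates), 100.0)   # ~176% recovery
```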

False positive/false negative results: Of the 20 blank water samples, RI reported RDX in 4 samples (24%
false positives) and TNT in 16 samples (80% false positives). Three of the RDX results were reported as
"ME," which indicated that the sample had "matrix effects" and the result could not be reported by the FAST
2000. False positive and false negative results were also determined by comparing the FAST 2000 result to
the reference laboratory result on environmental and spiked samples (e.g., whether the FAST 2000 reports a
result as a nondetect that the reference laboratory  reported as a detect, and vice versa). For RDX, 2% of the
results were false positives relative to the reference laboratory result, while  16% of the TNT results were
reported as false positives. RI reported a small fraction of the samples (3% for each analyte) as nondetects
(i.e., false negatives)  when the laboratory reported a detect.
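The detect/nondetect comparison described above can be sketched as a simple classification of each field/laboratory result pair. The function below is illustrative; its name and the use of the 20 µg/L reporting limit as the detect threshold for both results are assumptions, not the demonstration's documented procedure.

```python
REPORTING_LIMIT = 20.0  # µg/L reporting limit for TNT and RDX (assumed threshold)

def classify(field_result, lab_result, limit=REPORTING_LIMIT):
    """Compare a field result with the reference laboratory result.

    False positive: field detects where the laboratory reported a nondetect.
    False negative: field reports a nondetect where the laboratory detected.
    """
    field_detect = field_result >= limit
    lab_detect = lab_result >= limit
    if field_detect and not lab_detect:
        return "false positive"
    if lab_detect and not field_detect:
        return "false negative"
    return "agreement"

classify(45.0, 5.0)    # "false positive"
classify(10.0, 300.0)  # "false negative"
```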

Completeness: Approximately 80% of the water analyses were complete. Approximately 18% of the RDX
results and 21% of the TNT results  were reported as "matrix effects," where a result could not be obtained.

Comparability: A one-to-one sample comparison of the FAST 2000 results and the reference laboratory
results was performed for all samples  (spiked and  environmental) that were reported above the reporting
limits.  The correlation coefficient (r) for the comparison of the entire water data set for TNT was 0.23, and
the slope (m) of the linear regression line was 1.81. When comparability was assessed for specific
concentration ranges, the r value did not change dramatically for TNT, ranging from 0.14 to 0.21 depending
on the concentration ranges selected. RDX correlation with the reference laboratory for water was higher (r
= 0.63, m = 1.60). Examination of the data indicated that the RDX results were usually higher than those of
the reference laboratory. However, for specific environmental sample matrices (such as the samples from the
Louisiana Army Ammunition Plant), the FAST 2000 results were generally lower than those  of the reference
laboratory. This indicated the possibility  of a matrix-dependent effect.
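The r and m statistics quoted above are the Pearson correlation coefficient and the slope of an ordinary least-squares line fit of field results against reference laboratory results. A self-contained sketch of that calculation follows; the paired values are hypothetical, not demonstration data.

```python
def linear_fit(x, y):
    """Return (slope, r) for an ordinary least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

# Hypothetical reference laboratory (x) vs. field (y) results, µg/L
lab   = [25.0, 100.0, 250.0, 500.0, 1000.0]
field = [60.0, 150.0, 480.0, 700.0, 1900.0]
m, r = linear_fit(lab, field)  # a slope above 1 indicates a high bias
```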

Sample Throughput: Operating under outdoor conditions, the RI team, usually consisting of three
operators, accomplished a sample throughput rate  of approximately three samples per hour for the water
analyses. Separate instruments were used for the TNT and RDX analyses. Typically, two operators analyzed
samples while one operator performed data analysis, but the technology can be run by a single trained
operator.

-------
Overall Evaluation: The verification team found that the FAST 2000 was relatively simple for the trained
analyst to operate in the field, requiring less than an hour for initial setup. The overall performance of the
FAST 2000 for the analysis of water samples was characterized as imprecise and biased high for TNT, and
imprecise and biased high (but matrix-dependent) for RDX. As with any technology selection, the user must
determine if this technology is appropriate for the application and the project data quality objectives. For more
information on this and other verified technologies, visit the ETV web site at http://www.epa.gov/etv.
Gary J. Foley, Ph.D.
Director
National Exposure Research Laboratory
Office of Research and Development
David E. Reichle, Ph.D.
Associate Laboratory Director
Life Sciences and Environmental Technologies
Oak Ridge National Laboratory
Jeffrey Marqusee, Ph.D.
Director
Environmental Security Technology Certification Program
U.S. Department of Defense
   NOTICE: EPA and ESTCP verifications are based on evaluations of technology performance under specific,
   predetermined criteria and appropriate quality assurance procedures. EPA, ESTCP, and ORNL make no expressed or
   implied warranties as to the performance of the technology and do not certify that a technology will always operate as
   verified. The end user is solely responsible for complying with any and all applicable federal, state, and local
   requirements. Mention of commercial product names does not imply endorsement or recommendation.

-------
                                            EPA/600/R-00/045
                                               March 2000
Environmental Technology
Verification Report

Explosives Detection Technology

Research International, Inc.
FAST 2000™
                          By
                       Amy B. Dindal
                    Charles K. Bayne, Ph.D.
                    Roger A. Jenkins, Ph.D.
                   Oak Ridge National Laboratory
                  Oak Ridge, Tennessee 37831-6120
                       Eric N. Koglin
                 U.S. Environmental Protection Agency
                   Environmental Sciences Division
                 National Exposure Research Laboratory
                   Las Vegas, Nevada 89193-3478
             This demonstration was conducted in cooperation with the
                   U.S. Department of Defense
             Environmental Security Technology Certification Program

-------

-------
                                           Notice
The U.S. Environmental Protection Agency (EPA), through its Office of Research and Development (ORD),
and the U.S. Department of Defense's Environmental Security Technology Certification Program (ESTCP)
Program, funded and managed, through Interagency Agreement No. DW89937854 with Oak Ridge National
Laboratory, the verification effort described herein. This report has been peer and administratively reviewed
and has been approved for publication as an EPA document. Mention of trade names or commercial products
does not constitute endorsement or recommendation for use of a specific product.

-------
                                       Table of Contents
     List of Figures  	    v
     List of Tables   	   vii
     Acknowledgments  	   ix
     Abbreviations and Acronyms  	   xi

1    INTRODUCTION  	   1

2    TECHNOLOGY DESCRIPTION	   3
     General Technology Description  	    3
     Preparation of Standards	    3
     Sample Preparation and Analysis	    4
     Cross-Reactivity	    4

3    DEMONSTRATION DESIGN	   5
     Objective	    5
     Demonstration Testing Location and Conditions  	    5
     Soil Sample Descriptions   	    5
          Sources of Samples	    5
              Iowa Army Ammunition Plant 	    5
              Louisiana Army Ammunition Plant	    5
              Milan Army Ammunition Plant	    5
              Volunteer Army Ammunition Plant   	    5
              Fort Ord Military Base  	    6
          Performance Evaluation Samples 	    6
          Soil Sample Preparation	    6
     Water Sample Descriptions  	    7
          Sources of Samples	    7
          Performance Evaluation Samples 	    7
          Water Sample Preparation	    7
     Sample Randomization 	    7
     Summary of Experimental Design 	    7
     Description of Performance Factors  	    7
          Precision   	    8
          Accuracy  	    8
          False Positive/Negative Results  	    8
          Completeness 	    9
          Comparability 	    9
          Sample Throughput 	    9
          Ease of Use  	    9
          Cost	    9
          Miscellaneous Factors 	    9

4    REFERENCE LABORATORY ANALYSES 	   10
     Reference Laboratory Selection   	   10
     Reference Laboratory Method  	   10
     Reference Laboratory Performance  	   10
                                                  iii

-------
5    TECHNOLOGY EVALUATION	   13
     Objective and Approach  	   13
     Precision  	   13
     Accuracy   	   13
     False Positive/False Negative Results  	   13
     Completeness  	   14
     Comparability  	   14
     Sample Throughput 	   17
     Ease of Use  	   17
     Cost Assessment  	   17
          FAST 2000 Costs  	   18
          Reference Laboratory Costs  	   19
          Cost Assessment Summary 	   19
     Miscellaneous Factors  	   19
     Summary  of Performance  	   20

6    TECHNOLOGY UPDATE AND REPRESENTATIVE APPLICATIONS  	   22
     Technology Update 	   22
     Representative Applications  	   23
     Refereed Papers  	   23

7    REFERENCES   	   24

     APPENDIX:
     RI's FAST 2000 Sample  Results Compared with Reference Laboratory Results 	   25
                                                 iv

-------
                                    List of Figures
1    The FAST 2000  	    3
2    Comparability of reference laboratory water results with FAST 2000 results for all TNT
     concentrations   	   15
3    Comparability of reference laboratory water results with FAST 2000 results for vendor RDX
     concentrations less than 500 µg/L   	   16
4    Range of percent difference values for RDX and TNT  	   17
5    The FAST 6000  	   22

-------
                                                 vi

-------
                                     List of Tables
 1     FAST 2000 Cross-reactivity  	   4
 2     Summary of Soil and Water Samples	   8
 3     Summary of the Reference Laboratory Performance for Soil Samples  	   11
 4     Summary of the Reference Laboratory Performance for Water Samples   	   11
 5     Summary of the Reference Laboratory Performance on Blank Samples  	   12
 6     Summary of the FAST 2000 Precision for Water Samples  	   13
 7     Summary of the FAST 2000 Accuracy for Water Samples 	   13
 8     Summary of FAST 2000 False Positives on Blank Water Samples  	   14
 9     Summary of the FAST 2000 Detect/Nondetect Performance Relative to the Reference Laboratory
       Results  	   14
10     FAST 2000 Correlation with Reference Data for Various Vendor Water Concentration Ranges  .   14
11     Evaluation of FAST 2000 Comparison with Reference Laboratory Results by Matrix  	   16
12     Estimated Analytical Costs for Explosives-Contaminated Samples	   20
13     Performance Summary for the FAST 2000 Water Analyses 	   21
                                              vii

-------

-------
                                  Acknowledgments
The authors wish to acknowledge the support of all those who helped plan and conduct the demonstration,
analyze the data, and prepare this report. In particular, we recognize Dr. Thomas Jenkins (U.S. Army, Cold
Regions Research and Engineering Laboratory) and Dr. Michael Maskarinec (Oak Ridge National
Laboratory), who served as the technical experts for this project. We thank the people who helped us to
obtain the samples from the various sites, including Dr. Jenkins, Danny Harrelson (Waterways Experiment
Station), Kira Lynch (U.S. Army Corps of Engineers, Seattle District), Larry Stewart (Milan Army
Ammunition Plant), and Dick Twitchell and Bob Elmore (Volunteer Army Ammunition Plant). For external
peer review, we thank Dr. C. L. Grant (Professor Emeritus, University of New Hampshire). For internal peer
review, we thank Stacy Barshick of Oak Ridge National Laboratory and Harry Craig of EPA Region 10. The
authors also acknowledge the participation of Lisa Shriver-Lake, Paul Charles, Paul Gauger, David Holt, and
Charles Patterson of the Naval Research Laboratory and Ann Wilson of Research International, who
performed the analyses during the demonstration.

For more information on the Explosives Detection Technology Demonstration contact
Eric N. Koglin
Project Technical Leader
Environmental Protection Agency
Environmental Sciences Division
National Exposure Research Laboratory
P.O.  Box 93478
Las Vegas, Nevada 89193-3478
(702) 798-2432
koglin.eric@epa.gov
Amy B. Dindal
Technical Lead
Oak Ridge National Laboratory
Chemical and Analytical Sciences Division
P.O. Box 2008
Oak Ridge, TN 37831-6120
(865) 574-4863
dindalab@ornl.gov
For more information on Research International's FAST 2000 contact

Elric Saaski
Research International
18706 142nd Avenue NE
Woodinville, WA 98072-8523
(425) 486-7831
resrchintl@aol.com
www.resrchintl.com
                                               IX

-------

-------
                          Abbreviations and Acronyms
2-Am-DNT        2-amino-4,6-dinitrotoluene, CAS # 35572-78-2
4-Am-DNT        4-amino-2,6-dinitrotoluene, CAS # 1946-51-0
CFI               Continuous Flow Immunosensor
CRREL           U.S. Army Cold Regions Research and Engineering Laboratory
2,4-DNT          2,4-dinitrotoluene,  CAS # 121-14-2
2,6-DNT          2,6-dinitrotoluene,  CAS # 606-20-2
DNT             isomeric dinitrotoluene (includes both 2,4-DNT and 2,6-DNT)
DoD              U.S. Department of Defense
EPA              U.S. Environmental Protection Agency
ERA             Environmental Resource Associates
ESTCP            Environmental Security Technology Certification Program
ETV              Environmental Technology Verification Program
fn                false negative result
fp                false positive result
GC               gas chromatography
HMX             octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine, CAS # 2691-41-0
HPLC            high-performance liquid chromatography
IMS              ion mobility spectrometry
LAAAP           Louisiana Army Ammunition Plant
ME               matrix effects
MLAAP          Milan Army Ammunition Plant
NERL            National Exposure Research Laboratory (EPA)
NRL              U.S. Naval Research Laboratory
ORNL            Oak Ridge National Laboratory
PE               performance evaluation sample
QA               quality assurance
QC               quality control
RDX             hexahydro-1,3,5-trinitro-1,3,5-triazine, CAS # 121-82-4
RI                Research International, Inc.
RSD              relative standard deviation
SCMT            Site Characterization and Monitoring Technologies Pilot of ETV
SD               standard deviation
TNB              1,3,5-trinitrobenzene, CAS # 99-35-4
TNT              2,4,6-trinitrotoluene, CAS #  118-96-7
                                              xi

-------

-------
                               Section 1 — Introduction
The U.S. Environmental Protection Agency (EPA)
created the Environmental Technology Verification
Program (ETV) to facilitate the deployment of
innovative or improved environmental technologies
through performance verification and dissemination
of information. The goal of the ETV Program is to
further environmental protection by substantially
accelerating the acceptance and use of improved
and cost-effective technologies. ETV seeks to
achieve this goal by providing high-quality, peer-
reviewed data on technology performance to those
involved in the design, distribution, financing,
permitting, purchase, and use of environmental
technologies.

ETV works in partnership with recognized standards
and testing organizations and stakeholder groups
consisting of regulators, buyers, and vendor
organizations, with the full participation of individual
technology developers. The program evaluates the
performance of innovative technologies by
developing verification test plans that are responsive
to the needs  of stakeholders, conducting field or
laboratory tests (as appropriate), collecting and
analyzing data, and preparing peer-reviewed reports.
All evaluations are conducted in accordance with
rigorous quality assurance (QA) protocols to ensure
that data of known and adequate quality are
generated and that the results are defensible.

ETV is a voluntary program that seeks to provide
objective performance information to all of the
participants in the environmental marketplace and to
assist them in making informed technology decisions.
ETV does not rank technologies or compare their
performance, label or list technologies as acceptable
or unacceptable, seek to determine "best available
technology," or approve or disapprove technologies.
The program does not evaluate technologies at the
bench or pilot scale and does not conduct or support
research. Rather, it conducts and reports on testing
designed to describe the  performance of
technologies under a range of environmental
conditions and matrices.
The program now operates 12 pilots covering a
broad range of environmental areas. ETV began
with a 5-year pilot phase (1995-2000) to test a wide
range of partner and procedural alternatives in
various pilot areas, as well as the true market
demand for and response to such a program. In
these pilots, EPA utilizes the expertise of partner
"verification organizations" to design efficient
processes for conducting performance tests of
innovative technologies. These expert partners are
both public and private organizations, including
federal laboratories, states, industry consortia, and
private sector entities. Verification  organizations
oversee and report verification activities based on
testing and  QA protocols developed with input from
all major stakeholder/customer groups associated
with the technology area. The verification described
in this report was administered by the Site
Characterization and Monitoring Technologies
(SCMT) Pilot, with Oak Ridge National Laboratory
(ORNL) serving as the verification organization. (To
learn more  about ETV, visit ETV's Web site at
http://www.epa.gov/etv.) The SCMT pilot is
administered by EPA's National Exposure Research
Laboratory (NERL), Environmental Sciences
Division, in Las Vegas, Nevada.

The Department of Defense (DoD) has a similar
verification program known as the  Environmental
Security Technology Certification Program
(ESTCP). The purpose of ESTCP  is to demonstrate
and validate the most promising innovative
technologies that target DoD's most urgent
environmental needs and are projected to pay back
the investment within 5 years through cost savings
and improved efficiencies. ESTCP responds to
(1) concern over the slow pace and cost of
remediation of environmentally contaminated sites on
military installations, (2) congressional direction to
conduct demonstrations specifically focused on new
technologies, (3) Executive Order 12856, which
requires federal agencies to place high priority on
obtaining funding and resources needed for the
development of innovative pollution prevention
programs and technologies for installations and in
acquisitions, and (4) the need to improve defense

-------
readiness by reducing the drain on the Department's
operation and maintenance dollars caused by real
world commitments such as environmental
restoration and waste management. ESTCP
demonstrations are typically conducted under
operational field conditions at DoD facilities.  The
demonstrations are intended to generate supporting
cost and performance data for acceptance or
validation of the technology. The goal is to transition
mature environmental science and technology
projects through the  demonstration/ validation phase,
enabling promising technologies to receive regulatory
and end user acceptance in order to be fielded and
commercialized more rapidly. (To learn more about
ESTCP, visit ESTCP's web site at
http://www.estcp.org.)

EPA's ETV program and DoD's ESTCP program
established a memorandum of agreement in 1999 to
work cooperatively with ESTCP on the verification
of technologies that are used to  improve
environmental cleanup and protection at both DoD
and non-DoD sites. The verification of field
analytical technologies for explosives detection
described in this report was conducted jointly by
ETV's SCMT  pilot and ESTCP. The verification
was conducted at ORNL in Oak Ridge, Tennessee,
from August 23 through September 1, 1999. The
performances of two field analytical techniques for
explosives were determined under field conditions.
Each technology was independently evaluated by
comparing field analysis results with those obtained
using an approved reference method, EPA SW-846
Method 8330. The verification was designed to
evaluate the field technology's ability to detect and
measure explosives in soil and water. The primary
constituents in the samples were 2,4,6-trinitrotoluene
(TNT); isomeric dinitrotoluene (DNT), including both
2,4-dinitrotoluene (2,4-DNT) and 2,6-dinitrotoluene
(2,6-DNT); hexahydro-1,3,5-trinitro-1,3,5-triazine
(RDX); and octahydro-1,3,5,7-tetranitro-1,3,5,7-
tetrazocine (HMX). Naturally contaminated
environmental soil samples, ranging in concentration
from 0 to approximately 90,000 mg/kg, were
collected from DoD sites in California, Louisiana,
Iowa, and Tennessee, and were used to assess
several performance characteristics. Explosives-
contaminated water samples  from Tennessee,
Oregon, and Louisiana with concentrations ranging
from 0 to 25,000 µg/L were also evaluated. This
report discusses the performance of the Research
International, Inc., FAST 2000™ instrument for the
analysis of water samples only. Research
International elected not to analyze the soil samples.

-------
                      Section 2 — Technology Description
In this section, the vendor (with minimal editorial changes by ORNL) provides a description of the
technology and the analytical procedure used during the verification testing activities.
General Technology Description
The Continuous Flow Immunosensor (CFI) is based
on a displacement assay that utilizes antibodies and
fluorescence as a means of detection.  The
technology was originally developed by the U.S.
Naval Research Laboratory (NRL). The field-
portable version of the CFI, the FAST 2000, has
been engineered and manufactured by Research
International, Inc. (RI). The FAST 2000 unit (shown
in Figure 1) can be easily carried into the field
(weight: 2.8 lb; dimensions: 6 x 15.5 x 16 cm) and
plugged directly into a portable PC for on-site data
acquisition and analysis.
Figure 1. The FAST 2000.

The key elements of the sensor are (1) antibodies
specific for the analyte; (2) signal molecules that are
similar to the analyte but are labeled with a
fluorophore (a cyanine-based fluorescent dye, Cy5)
to enable fluorescence detection; and (3) a
fluorescence detector. For analysis, the analyte-
specific antibodies are immobilized onto a solid
support and then saturated with the fluorescently
labeled signal molecule, creating an antibody/signal
molecule complex. Monoclonal antibodies (the Naval
Research Laboratory's 11B3 TNT and Strategic
Diagnostics RDX) are immobilized onto porous
membrane supports and
saturated with the fluorescent tag using the detailed
protocols outlined in draft U.S. EPA Method 4655.
The membrane is inserted into a disposable coupon
and placed in the FAST 2000, and the buffer flow is
started by a computer command. Once the
fluorescence background signal due to unbound Cy5
has stabilized (generally 15-20 min), the biosensor is
ready for sample injection. If the sample contains the
target analyte, a proportional amount of the labeled
signal molecule is displaced from the antibody and
detected by the fluorimeter downstream. The
coupon and membrane can be used for repeated
assays. The life of the membrane is dependent upon
the number and concentration of positive assays that
are run.

At the time of the demonstration, the cost of
purchasing the FAST 2000 was $23,650. Instrument
purchase included the FAST 2000 instrument
designed for use with an immunoassay-based
sensor; a data acquisition card and a cable linking
the instrument to the laptop computer; a fluid storage
unit; one assay coupon kit; the software required to
run the instrument and analyze data; and an
instruction manual. This price did not include the cost
of the laptop. The FAST 2000 could also be leased
for $1970 per month.

Preparation of Standards
The TNT and RDX calibration standards were
prepared by drying down 20 µL of stock explosive
standard (1,000,000 µg/L, stored in acetonitrile) with
a nitrogen air stream. Using a micropipettor, 2.0 mL
of system flow buffer (10 mM sodium
monophosphate, 2.5% ethanol, and 0.01% Tween,
pH 7.4) was added to the tube to dissolve the
explosive residue, forming a 10,000-µg/L explosive
standard. Serial dilutions of the 10,000-µg/L standard
were made in flow buffer to obtain 25-, 50-, 100-,
250-, 500-, and 1,000-µg/L standards.
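The standard-preparation arithmetic above is the usual mass-balance dilution relation C1·V1 = C2·V2: the dried-down analyte mass is conserved when the residue is redissolved in the new volume. A minimal sketch (the function name is illustrative):

```python
def diluted_conc(c1_ug_per_l, v1_ml, v2_ml):
    """C2 = C1 * V1 / V2: analyte mass is conserved across the dilution."""
    return c1_ug_per_l * v1_ml / v2_ml

# 20 µL of 1,000,000 µg/L stock redissolved in 2.0 mL of flow buffer
working = diluted_conc(1_000_000, 0.020, 2.0)   # 10,000 µg/L
# e.g., a 1:10 serial dilution of the working standard
standard = diluted_conc(working, 0.2, 2.0)      # 1,000 µg/L
```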

-------
Sample Preparation and Analysis
Sample preparation was minimal. Briefly, 40 µL of
0.5 M sodium phosphate/0.5% Tween 20 and 50 µL
ethanol were added to 1.91 mL of water sample.
Samples and standards (150 µL) were injected into
the FAST 2000 using a l-|jL Hamilton gas-tight
syringe. Sample analyses using the FAST 2000
immunosensor were initiated with an injection of the
500-|jg/L explosive standard. After duplicate
injections of a water sample were analyzed, a
second standard was analyzed. The concentration of
the second standard was based on the response
determined from the sample. The analyst confirmed
the computer-calculated peak area, which was
integrated between the peak start and end points
designated by the analyst. The peak area of the
closest standard was then compared with the peak
area from each sample injection to obtain a
concentration for that injection. The calculated
concentrations from the duplicate sample injections
were averaged to determine the final result for the
sample. The reporting limit for both TNT and RDX
was 20 µg/L.
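
The sample-preparation additions reproduce the flow-buffer composition, which can be verified with a short sketch (our own check, not part of the demonstration plan):

```python
# Adding 40 µL of 0.5 M sodium phosphate / 0.5% Tween 20 and 50 µL of
# ethanol to 1.91 mL of sample gives a 2.0-mL final volume whose
# composition matches the flow buffer (10 mM phosphate, 0.01% Tween,
# 2.5% ethanol).
total_ml = 1.91 + 0.040 + 0.050                 # 2.0 mL final volume

phosphate_mm = 0.5 * 1000 * 0.040 / total_ml    # 0.5 M stock -> ~10 mM final
tween_pct = 0.5 * 0.040 / total_ml              # ~0.01% (v/v)
ethanol_pct = 100 * 0.050 / total_ml            # ~2.5% (v/v)
```
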
Cross-Reactivity
Table 1 contains a list of compounds that may
interfere with the analyses because they are known
to cross-react with the TNT or RDX assay.
Approximate levels of cross-reactivity, in terms of
relative response to the antibody, are provided in the
table.
     Table 1. FAST 2000 Cross-reactivity

                              Anti-RDX      Anti-TNT
                              antibody      antibody
                              cross-        cross-
                              reactivity    reactivity
     Compound                 (%)           (%)
     RDX                      100           1
     TNT                      1.8           100
     HMX                      4.8           5
     2-Nitrotoluene           1.9           9
     3-Nitrotoluene           2.6           ND
     4-Nitrotoluene           3.0           ND
     Nitrobenzene             1.9           16
     1,3-Dinitrobenzene       2.8           ND
     1,3,5-Trinitrobenzene    3.8           600
     Tetryl                   0.95          38
     2,4-Dinitrotoluene       3.1           20
     2,6-Dinitrotoluene       1.1           4
     Trinitroglycerin         1.4           ND
     2-Amino-4,6-
       dinitrotoluene         1.3           21
     4-Amino-2,6-
       dinitrotoluene         1.8           1
     1,2-Dinitroglycerin      1.8           ND
     1,3-Dinitroglycerin      1.3           ND
     Dinitroethylene glycol   1.9           ND
     ND = not determined.

-------
                       Section 3 — Demonstration Design
Objective
The purpose of this section is to describe the
demonstration design. It is a summary of the
technology demonstration plan (ORNL 1999).

Demonstration Testing Location and
Conditions
The verification of field analytical technologies for
explosives was conducted at the ORNL Freels Bend
Cabin site, in Oak Ridge, Tennessee. The site is
somewhat primitive, with no running water, but the
vendors were provided with some shelter (porch
overhang) and electrical power. The temperature
and relative humidity were monitored during field
testing. Over the ten days of testing, the temperature
averaged 77°F and ranged from 60 to 88°F; the
relative humidity averaged 67% and ranged from 35
to 96%.

The samples used in this study were brought to the
demonstration testing location  for evaluation by the
vendors. Explosives-contaminated soils from Army
ammunition plants in Iowa, Louisiana, and
Tennessee and a former Army base in California
(Fort Ord) were used in this verification. In addition,
explosives-contaminated water samples were
analyzed from DoD sites in Oregon, Louisiana, and
Tennessee. Because samples were obtained from
multiple DoD sites, the samples represented a
reasonable cross section of the population of
explosives-contaminated matrices, such that the
versatility of the field technology could be evaluated.
The vendors had the choice of analyzing either soil
or water samples, or both matrices.  More specific
details about the samples are presented below.

Soil Sample  Descriptions
The primary constituents in the soil  samples were
TNT, DNT, RDX, and HMX. The samples also
contained trace amounts of 2-amino-4,6-
dinitrotoluene (2-Am-DNT) and 4-amino-2,6-
dinitrotoluene (4-Am-DNT), which  are degradation
products of TNT. The total concentration of
explosives ranged from 0 to approximately
90,000 mg/kg. The following sections describe the
sites from which the samples were collected.
Sources of Samples
Iowa Army Ammunition Plant
Currently an active site, the Iowa Army Ammunition
Plant was constructed to load, assemble, and pack
various conventional ammunition and fusing systems.
Current production includes 120-mm tank rounds,
warheads for missiles, and mine systems. During the
early years of use, the installation used surface
impoundments, landfills, and sumps for disposal of
industrial wastes containing explosives. The major
contaminants in these samples were TNT, RDX, and
HMX.

Louisiana Army Ammunition Plant
The Louisiana Army Ammunition Plant (LAAAP),
near Shreveport, Louisiana, is a government-owned
facility that began production in 1942. The facility is
currently an Army Reserve plant. Production items
at LAAAP have included metal parts for artillery
shells; the plant also loads, assembles, and packs
artillery shells, mines, rockets, mortar rounds, and
demolition blocks. As a result of these activities and
the resulting soil and groundwater contamination,
EPA placed LAAAP on the National Priorities List
of contaminated sites (Superfund) in 1989. The
major constituents in the samples from this site were
TNT, RDX, and HMX, with trace levels of 1,3,5-
trinitrobenzene (TNB), DNT, 2-Am-DNT, and 4-
Am-DNT.

Milan Army Ammunition Plant
Currently active, the Milan Army Ammunition Plant
(MLAAP) in Milan, Tennessee, was established in
late 1940 as part of the pre-World War II buildup.
The facility still has ten ammunition loading,
assembly, and packaging lines. Munitions-related
wastes have resulted in soil contamination. The
primary  contaminants in these soils were RDX and
TNT.

Volunteer Army Ammunition Plant
The Volunteer Army Ammunition Plant, in
Chattanooga, Tennessee, was built in 1941 to
manufacture TNT and DNT. All production ceased
in 1977. Past production practices resulted in
significant soil and groundwater contamination. In
the samples from this  site, concentrations of TNT

-------
and DNT ranged from 10 to 90,000 mg/kg, with
significantly smaller concentrations of Am-DNT
isomers.

Fort Ord Military Base
Fort Ord, located near Marina, California, was
opened in 1917 as a training and staging facility for
infantry troops and was closed as a military
installation in 1993. Since then, several nonmilitary
uses have been established on the site: California
State University at Monterey Bay has opened its
doors on former Fort Ord property, the University of
California at Santa Cruz has established a new
research center there, the Monterey Institute of
International Studies will take over the officer's club
and several  other buildings, and the post's airfield
was turned  over to the city of Marina. The Army
still occupies several buildings.

An Army study conducted in 1994  revealed that the
impact areas at the inland firing ranges of Fort Ord
were contaminated with residues of high explosives
(Jenkins, Walsh, and Thorne 1998). Fort Ord is on
the National Priorities List of contaminated sites
(Superfund), requiring the installation to be
characterized and remediated to a condition that
does not pose unacceptable risks to public health or
the environment. The contaminant present at the
highest concentration (as much as 300 mg/kg) was
HMX; much lower concentrations of RDX, TNT,
2-Am-DNT, and 4-Am-DNT are present.

Performance Evaluation  Samples
Spiked soil  samples were obtained from
Environmental Resource Associates (ERA, Arvada,
Colo.). The soil was prepared using ERA's
semivolatile blank soil matrix. This  matrix was a
40% clay topsoil that had been dried,  sieved, and
homogenized. Particle size was 60 mesh and
smaller. The samples, also referred to as
performance evaluation (PE) samples, contained
known levels of TNT and RDX. The  concentrations
that were evaluated contained 10, 50,  100, 250, and
500 mg/kg of each analyte. Prior to the
demonstration, ORNL analyzed the spiked samples
to confirm the concentrations. The  method used was
a modified Method 8330, similar to the reference
laboratory method described in Section 4. For the
demonstration, four replicates were prepared at each
concentration level.

Blank soil samples were evaluated to determine the
technology's ability to identify samples with no
contamination (i.e., to ascertain the false positive
error rate). The soil was collected in Monroe
County, Tennessee, and was certified by ORNL to
be free of contamination prior to verification testing.
A reasonable number of blanks (N = 20) was chosen
to balance the uncertainty for estimating the false
positive error rate and the required number of blank
samples to be measured.

Soil Sample Preparation
A few weeks prior to the demonstration, all of the
soil samples were shipped in plastic Ziplock bags at
ambient temperature to ORNL. The samples were
stored frozen (<0°C) prior to preparation. To ensure
that the developers and the reference laboratory
analyzed comparable samples, the soils were
homogenized prior to sample splitting. The process
was as follows. The sample was kneaded in the
Ziplock bag to break up large clumps. Approximately
1500 g of soil was poured into a Pyrex pan, and
debris was removed. The sample  was then air-dried
overnight. The sample was sieved using a 10-mesh
(2-mm particle size) screen and placed in a 1-L
widemouthed jar. After thorough mixing with a metal
spatula, the sample was quartered. After mixing
each quarter, approximately 250 g from each quarter
was placed back in the 1-L widemouthed jar, for a
total sample amount of approximately 1000 g.
Analysis by the ORNL method confirmed sample
homogeneity (variability of 20% relative standard
deviation or less for replicate measurements).  The
sample was then split into subsamples for analysis
during the demonstration.  Each 4-oz sample jar
contained approximately 20 g of soil. Four replicate
splits of each soil sample were prepared for each
participant. The design included a one-to-one pairing
of the replicates, such that the vendor and reference
lab samples could be directly matched. Three
replicate sets of samples were also prepared for
archival storage. To ensure that degradation did not
occur, the soil samples were frozen (<0°C) until
analysis (Maskarinec et al. 1991).

-------
Water Sample Descriptions
Sources of Samples
Explosives-contaminated water samples from
Tennessee, Oregon, and Louisiana were analyzed.
The contamination in the water samples ranged in
concentration from 0 to about 25,000 µg/L. Water
samples were collected from LAAAP, MLAAP,
and Volunteer, described in the previous section (see
"Sources of Samples"). Water samples were also
obtained from Umatilla Chemical Depot, described
below.

Umatilla Chemical Depot is located in northeastern
Oregon. The mission of the facility recently changed
to storage of chemical warfare ammunition. Once
the chemicals are destroyed, the installation is
scheduled to close. Several environmental sites have
been identified for cleanup prior to base closure.
One site has explosives-contaminated groundwater;
the cleanup identified for this site is to pump and
treat the water with granulated activated carbon.
The major contaminants in these samples were
TNT, RDX, HMX, and TNB. According to a
remedial investigation conducted at the site, these
samples were not contaminated with any chemical
warfare agents.

Performance Evaluation Samples
Water samples of known concentration were
prepared by the U.S. Army Cold Regions Research
and Engineering Laboratory (CRREL) in Hanover,
New Hampshire. These samples were used to
determine the technology's  accuracy. The
concentrations of TNT and RDX in the spiked
distilled water samples were 25, 100, 200, 500, and
1000 µg/L for each analyte; four replicates were
prepared at each concentration. Prior to the
demonstration, ORNL analyzed the spiked samples
to confirm the concentrations.

Distilled water obtained from ORNL was used for
the blanks. As with the soil samples, 20 blank
samples were analyzed.

Water Sample Preparation
The water samples were collected in 2.5-gal carboys
approximately 7 to 10 days prior to the start of the
demonstration and shipped on ice to ORNL. To
ensure that degradation did not occur, the samples
were stored under refrigeration (~4°C) until analysis
(Maskarinec et al. 1999).
performed in a small laboratory cold room, which
was maintained at 4°C. To prepare the water
sample, a spout was attached to the 2.5-gal carboy,
and the water sample was split by filling multiple
250-mL amber glass bottles. As with the soil
samples, four replicate splits of each water sample
were prepared for each participant, and three sets of
samples were also prepared for archival storage.

Sample Randomization
The samples were randomized in two stages. First,
the order in which the filled jars were distributed
was randomized so that the same developer did not
always receive  the first jar filled for a given sample
set. Second, the order of analysis was randomized so
that each participant analyzed the same set of
samples, but in a different order. Each jar was
labeled with a sample number. Replicate samples
were assigned unique (but not sequential) sample
numbers.  Spiked  materials and blanks were labeled
in the same manner, such that these quality control
samples were indistinguishable from other  samples.
All samples were analyzed blindly by both the
developer and the reference laboratory.

Summary of Experimental Design
The distribution of samples from the various sites is
described in Table 2. A total of 108 soil samples
were analyzed, with approximately 60% of the
samples being naturally contaminated environmental
soils, and the remaining 40% being spikes and
blanks. A total  of 176 water samples were  analyzed,
with approximately 75% of the samples being
naturally contaminated environmental water, and the
remaining 25% being spikes and blanks. Four
replicates were analyzed for each sample type.  For
example, four replicate splits of each of three Fort
Ord soils were analyzed, for a total of 12 individual
Fort Ord samples.

Description  of Performance Factors
In Section 5, technology performance is evaluated in
terms of precision, accuracy, completeness, and
comparability, which are indicators of data  quality
(EPA 1998). False positive and negative results,
sample throughput,  and ease of use are also

-------
evaluated. Each of these performance
characteristics is defined in this section.
 Table 2. Summary of Soil and Water Samples

 Sample source      No. of soil    No. of water
 or type            samples        samples
 Fort Ord           12             0
 Iowa               4              0
 LAAAP              16             80
 MLAAP              20             20
 Umatilla           0              24
 Volunteer          12             8
 Spiked             24             24
 Blank              20             20
 Total              108            176

Precision
Precision is the reproducibility of measurements
under a given set of conditions. Standard deviation
(SD) and relative standard deviation (RSD) for
replicate results are used to assess precision, using
the following equation:

 RSD = (SD / average concentration) x  100% .
                                          (Eq. 1)

The overall RSD is characterized by three summary
values:

•  mean — i.e., average;
•  median — i.e., 50th percentile value, at which
   50% of all individual RSD values are below and
   50% are above; and
•  range — i.e., the highest and lowest RSD values
   that were reported.

The average RSD may not be the best
representation of precision, but it is reported for
convenient reference. RSDs greater than  100%
should be viewed as indicators of large variability
and possibly non-normal distributions.
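
Eq. 1 and the three summary values can be sketched as follows (illustrative only; the replicate data below are invented):

```python
from statistics import mean, median, stdev

def rsd(replicates):
    """Relative standard deviation (Eq. 1) for one replicate set, in percent."""
    return stdev(replicates) / mean(replicates) * 100

# One RSD value per set of four replicate measurements (invented data)
replicate_sets = [[10, 12, 8, 10], [100, 90, 110, 100]]
rsds = [rsd(s) for s in replicate_sets]

summary = {
    "mean": mean(rsds),
    "median": median(rsds),
    "range": (min(rsds), max(rsds)),
}
```
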
Accuracy
Accuracy represents the closeness of the tech-
nology's measured concentrations to known (in this
case, spiked/PE) values. Accuracy is assessed in
terms of percent recovery, calculated by the
following equation:

   % recovery = (measured concentration /
    known concentration)  x 100% .
                                          (Eq. 2)

As with precision, the overall percent recovery is
characterized by three summary values: mean,
median, and range.
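
Eq. 2 reduces to a one-line calculation; a minimal sketch with invented spiked-sample values:

```python
from statistics import mean, median

def percent_recovery(measured, known):
    """Percent recovery (Eq. 2)."""
    return measured / known * 100

# (measured, known) concentrations in µg/L for spiked samples (invented)
pairs = [(192.0, 100.0), (88.0, 100.0), (510.0, 500.0)]
recoveries = [percent_recovery(m, k) for m, k in pairs]

summary = {
    "mean": mean(recoveries),
    "median": median(recoveries),
    "range": (min(recoveries), max(recoveries)),
}
```
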

False Positive/Negative Results
A false positive (fp) result is one in which the
technology detects explosives in the sample when
there actually are none (Berger, McCarty, and Smith
1996). A false negative (fn) result is one in which
the technology indicates that no explosives are
present in the sample, when there actually are
(Berger, McCarty, and Smith 1996). The evaluation
of fp and fn results is influenced by the actual
concentration in the sample and includes an
assessment of the reporting limits of the technology.
False positive results are assessed in two ways.
First, the results are assessed relative to the blanks
(i.e., the technology reports a detected  value when
the sample is a blank). Second, the results are
assessed on environmental and spiked samples
where the analyte was not detected by the reference
laboratory (i.e., the reference laboratory reports a
nondetect and the field technology reports a
detection). False negative results, also assessed for
environmental and spiked samples, indicate the
frequency with which the technology reported a
nondetect (i.e., < reporting limits) while the reference
laboratory reported a detection. Note that the reference
laboratory results were confirmed by the ORNL
laboratory so that fp/fn assessment would not be
influenced by faulty laboratory data. The reporting
limit is considered in the evaluation. For example, if
the reference laboratory reported a result as
0.9 mg/kg,  and the technology's paired result was
reported as below reporting limits (<1 mg/kg), the
technology's result was considered correct and not a
false negative result.
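
The classification rule above, including the reporting-limit exception, can be sketched as follows (the function and its names are ours, not part of the demonstration plan):

```python
def classify(ref_conc, tech_conc, reporting_limit):
    """Classify one paired result as 'fp', 'fn', or 'agree'.

    ref_conc / tech_conc: reported concentration, or None for a nondetect.
    """
    if ref_conc is None and tech_conc is not None:
        return "fp"    # lab nondetect, technology detect
    if ref_conc is not None and tech_conc is None:
        # A lab detect below the technology's reporting limit is
        # considered correct, not a false negative (see text).
        return "agree" if ref_conc < reporting_limit else "fn"
    return "agree"

# Example from the text: lab reports 0.9 mg/kg, technology reports
# below its 1-mg/kg reporting limit -> not a false negative.
print(classify(0.9, None, 1.0))   # agree
```
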

-------
Completeness
Completeness is defined as the percentage of mea-
surements that are judged to be usable (i.e., the
result is not rejected). The acceptable completeness
is 95% or greater.

Comparability
Comparability refers to how well the field technology
and reference laboratory data agree. The difference
between accuracy and comparability is that whereas
accuracy is judged relative to a known value,
comparability is judged relative to the results of a
standard or reference procedure, which may or may
not report the results accurately. A one-to-one
sample comparison of the technology results and the
reference laboratory results is performed in
Section  5.

A correlation coefficient quantifies the linear
relationship between two  measurements (Draper
and Smith 1981). The correlation coefficient is
denoted by the letter r; its value ranges from -1 to
+1, where 0 indicates the  absence of any linear
relationship. The value r = -1 indicates a perfect
negative linear relation (one measurement decreases
as the second measurement increases); the value r =
+1 indicates a perfect positive linear relation (one
measurement increases as the second measurement
increases). The slope of the linear regression line,
denoted by the letter m, is related to r. Whereas r
represents the linear association between the vendor
and reference laboratory concentrations, m
quantifies the amount of change in the vendor's
measurements relative to  the reference laboratory's
measurements. A value of+1  for the slope indicates
perfect agreement. Values greater than 1  indicate
that the  vendor results are generally higher than the
reference laboratory, while values less than 1
indicate that the vendor results are usually lower
than the reference laboratory. In addition, a direct
comparison between the field technology and
reference laboratory data is performed by evaluating
the percent difference (%D) between the measured
concentrations, defined as

%D = ([field technology] - [ref lab]) / [ref lab]
      x 100% .                           (Eq. 3)
The range of %D values is summarized and reported
in Section 5.
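
The three comparability statistics (r, m, and %D) can be computed with a short least-squares sketch (invented paired results; names are ours):

```python
def comparability(field, ref):
    """Correlation coefficient r, regression slope m (field vs. reference),
    and percent difference (Eq. 3) for paired results."""
    n = len(field)
    mf, mr = sum(field) / n, sum(ref) / n
    sxy = sum((f - mf) * (r - mr) for f, r in zip(field, ref))
    sxx = sum((r - mr) ** 2 for r in ref)
    syy = sum((f - mf) ** 2 for f in field)
    r_coef = sxy / (sxx * syy) ** 0.5     # linear association, -1 to +1
    slope = sxy / sxx                      # change in field per unit reference
    pct_d = [(f - r) / r * 100 for f, r in zip(field, ref)]
    return r_coef, slope, pct_d

# Invented example: field results are exactly twice the reference values,
# so r = 1 (perfect linear relation) and m = 2 (field biased high).
r, m, d = comparability([2.0, 4.0, 6.0], [1.0, 2.0, 3.0])
```
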

Sample  Throughput
Sample throughput is a measure of the number of
samples that can be processed and reported by a
technology in a given period of time. This is reported
in Section 5 as number of samples per hour times the
number of analysts.

Ease of Use
A significant factor in purchasing an instrument or a
test kit is how easy the technology is to use. Several
factors are evaluated and reported on in Section 5:

•   What is the required operator skill level (e.g.,
    technician, B.S., M.S., or Ph.D.)?
•   How many operators were  used during the
    demonstration? Could the technology be run by a
    single person?
•   How much training would be required in order to
    run this technology?
•   How much subjective decision-making is
    required?

Cost
An important factor in the consideration of whether
to purchase a technology is cost. Costs involved with
operating the technology and the standard reference
analyses are estimated in Section 5. To account for
the variability in cost data and assumptions, the
economic analysis is presented  as a list of cost
elements and a range of costs for sample analysis.
Several factors affect the cost of analysis. Where
possible, these factors are addressed so that decision
makers can independently complete a site-specific
economic analysis to suit their needs.

Miscellaneous Factors
Any other information that might be useful to a
person who is considering purchasing the technology
is documented in Section 5. Examples of information
that might be useful to a prospective purchaser are
the amount of hazardous waste  generated during the
analyses, the ruggedness of the  technology, the
amount of electrical or battery power necessary to
operate the technology, and aspects of the
technology or method that make it user-friendly or
user-unfriendly.

-------
               Section 4 — Reference Laboratory Analyses
Reference Laboratory Selection
The verification process is based on the presence of
a statistically validated data set against which the
performance goals of the technology may be
compared. The choice of an appropriate reference
method and reference laboratory are critical to the
success of the demonstration. To assess the
performance of the explosives field analytical
technologies, the data obtained from demonstration
participants were compared to data obtained  using
conventional analytical methods. Selection of the
reference laboratory was based on the experience of
prospective laboratories with QA procedures,
reporting requirements, and data quality parameters
consistent with the goals of the program. Specialized
Assays, Inc. (currently part of Test America, Inc.),
of Nashville, Tennessee, was selected to perform
the analyses based on ORNL's experience with
laboratories capable of performing explosives
analyses using EPA SW-846 Method  8330. ORNL
reviewed Specialized Assays' record of laboratory
validation performed by the U.S. Army Corps of
Engineers (Omaha, Nebraska). EPA and ORNL
decided that, based on the credibility of the Army
Corps program and ORNL's prior experience with
the laboratory, Specialized Assays would be  selected
to perform the reference analyses.

ORNL conducted an audit of Specialized Assays'
laboratory operations on May 4, 1999.  This
evaluation focused specifically on the  procedures
that would be used for the analysis of the
demonstration samples. Results from this audit
indicated that Specialized Assays was  proficient in
several areas, including quality management,
document/record control, sample control, and
information management. Specialized Assays was
found to be compliant with implementation of
Method 8330 analytical procedures. The company
provided a copy  of its QA plan, which details all of
the QA and quality control (QC) procedures for all
laboratory operations (Specialized Assays 1999).
The audit team noted that Specialized Assays had
excellent procedures in place for data  backup,
retrievability, and long-term storage. ORNL
conducted a second audit at Specialized Assays
while the analyses were being performed. Since the
initial qualification visit, management of this
laboratory had changed because Specialized Assays
became part of Test America. The visit included
tours of the laboratory, interviews with key
personnel, and review of data packages.  Overall, no
major deviations from procedures were observed
and laboratory practices appeared to meet the QA
requirements of the technology demonstration plan
(ORNL 1999).

Reference Laboratory Method
The reference laboratory's analytical method,
presented in the technology demonstration plan,
followed the guidelines established in EPA SW-846
Method 8330  (EPA 1994). According to Specialized
Assays' procedures, soil samples were prepared by
extracting 2-g samples of soil in acetonitrile  by
sonication for approximately 16 h. An  aliquot of the
extract was then combined with a calcium chloride
solution to precipitate out suspended particulates.
After the solution was filtered, the filtrate was ready
for analysis. For the water samples, 400 mL of
sample were combined with sodium chloride and
acetonitrile in a separatory funnel. After mixing and
allowing the solutions to separate, the bottom
aqueous layer was discarded and the organic layer
was collected. The acetonitrile volume was reduced
to 2 mL, and the sample was diluted with 2 mL of
distilled water for a final volume of 4 mL. The
sample was then ready for analysis. The analytes
were identified and quantified using a high-
performance liquid chromatograph (HPLC)  with a
254-nm UV detector. The primary analytical column
was a C-18 reversed-phase column with
confirmation by a secondary cyano column.  The
practical quantitation limits were 0.5 µg/L for water
and 0.5 mg/kg for soils.

Reference Laboratory Performance
ORNL validated all of the reference laboratory data
according to the procedure described in the
demonstration plan (ORNL 1999). During the
validation, the following aspects of the data were
reviewed: completeness of the data package,
adherence to holding time requirements, correctness

-------
of the data, correlation between replicate sample
results, evaluation of QC sample results, and
evaluation of spiked sample results. Each of these
categories is described in detail in the demonstration
plan. The reference laboratory reported valid results
for all samples, so completeness was 100%.
Preanalytical holding time requirements for water (7
days to extract; 40 days to analyze) and soil (14 days
to extract; 40 days to analyze) were met. A few
errors were found in a small portion of the data
(~4%). Those data were corrected for transcription
and calculation errors that were identified during the
validation. One data point, a replicate Iowa soil
sample, was identified as suspect. The result for this
sample was 0.8 mg/kg; the results from the other
three replicates averaged 27,400 mg/kg. Inclusion or
exclusion of this data point in the evaluation of
comparability with the field technology (reported in
Section 5) did not significantly change the r value, so
it was included in the analysis. The reference
laboratory results for QC samples were
       Table 3.  Summary of the Reference Laboratory Performance for Soil Samples

                       Accuracy               Precision(a)
                       (% recovery)           (% RSD)
                       RDX       TNT       DNT(b)     HMX       RDX       TNT
       Statistic       N = 20    N = 20    NR = 3(c)  NR = 13   NR = 13   NR = 18
       Mean            102       100       56         29        25        29
       Median          99        96        32         30        21        25
       Range           84-141    76-174    14-123     12-63     4-63      2-72

       (a) Calculated from those samples where all four replicates were reported as a detect.
       (b) DNT represents the total concentration of 2,4-DNT and 2,6-DNT.
       (c) NR represents the number of replicate sets; N represents the number of individual samples.
         Table 4.  Summary of the Reference Laboratory Performance for Water Samples

                       Accuracy               Precision(a)
                       (% recovery)           (% RSD)
                       RDX       TNT       DNT(b)     HMX       RDX       TNT
         Statistic     N = 20    N = 20    NR = 7(c)  NR = 20   NR = 29   NR = 28
         Mean          91        91        30         20        22        24
         Median        87        91        30         17        17        20
         Range         65-160    66-136    8-80       6-49      5-66      5-86

         (a) Calculated from those samples where all four replicates were reported as a detect.
         (b) DNT represents the total concentration of 2,4-DNT and 2,6-DNT.
         (c) NR represents the number of replicate sets; N represents the number of individual samples.

-------
flagged when the results were outside the QC
acceptance limits. The reference laboratory results
were evaluated by a statistical analysis of the data.
Due to the limited results reported for the other
Method 8330 analytes, only the results for the
major constituents in the samples (total DNT,
TNT, RDX, and HMX) are evaluated in this
report.

The accuracy and precision of the reference
laboratory results for soil and water are
summarized in Tables 3 and 4, respectively.
Accuracy was assessed using the spiked samples,
while precision was assessed using the results
from both spiked and environmental samples. The
reference laboratory results were unbiased
(accurate) for both soil and water, as mean
percentage recovery values were near 100%. The
reference laboratory results were precise; all but
one of the mean RSDs were less than 30%. The
one mean RSD that was greater than 30% (soil,
DNT, 56%) was for a limited data set of three.

Table 5  presents the laboratory results for blank
samples. A false positive result is identified as any
detected result on a known blank. The
concentrations of the false positive water results
were low (<2 µg/L). For the soil samples, one
false positive detection appeared to be a
preparation error because the  concentration was
near 70,000 mg/kg. Overall, it was concluded that
the reference laboratory  results were unbiased,
precise,  and acceptable for comparison with the
field analytical technology.
       Table 5.  Summary of the Reference Laboratory Performance on Blank Samples

                                    Soil                    Water
       Statistic                DNT  HMX  RDX  TNT      DNT  HMX  RDX  TNT
       Number of data points    20   20   20   20       20   20   20   20
       Number of detects        0    0    0    2        1    0    2    4
       % of fp results          0    0    0    10       5    0    10   20

-------
                      Section 5 — Technology Evaluation
Objective and Approach
The purpose of this section is to present a
statistical evaluation of the FAST 2000 data and
determine the instrument's ability to measure
explosives-contaminated water samples. RI elected
not to analyze the soil samples described in Section
3. The technology's precision and accuracy are
presented for RDX and TNT. Performance was
evaluated on separate FAST 2000 systems: that is,
one FAST 2000 instrument was employed to
determine RDX concentrations, while a second
was used to determine TNT concentrations.
Differences in performance levels between the
two analytes could be due either to differences in
analyte properties or to differences between the
two instruments. In addition, an evaluation of
comparability through a one-to-one comparison
with the reference laboratory data is presented.
Other aspects of the technology (such as cost,
sample throughput, hazardous waste generation,
and logistical operation) are also evaluated in this
section. The Appendix contains the raw data
provided by the vendor that were used to assess
the performance of the FAST 2000.

Precision
Precision is the reproducibility of measurements
under a given set of conditions. Precision was
determined by examining the results of blind
analyses for four replicate samples. Data were
evaluated only for those samples where all four
replicates were reported as a detection. For
example, for RDX, NR = 22 represents a total of 88
sample analyses (22  sets of four replicates). A
summary of the overall precision of the FAST 2000
for the water sample results is presented in Table
6. The mean RSDs were 52% and 76% for RDX
and TNT, respectively, indicating that the water
analyses were imprecise.

Accuracy
Accuracy represents the closeness of the FAST
2000's measured concentrations to the known
content of spiked samples. A summary of the
FAST 2000's overall accuracy is presented in
Table 7. For the water samples, the mean recoveries
for RDX and TNT were 192% and 316%,
respectively. This indicated that the FAST 2000's
performance for the spiked samples was biased high
because the mean recoveries (and the medians) were
greater than 100%.

 Table 6.   Summary of the FAST 2000 Precision
            for Water Samples

 Statistic (RSD, %)^a     RDX            TNT
                          NR = 22^b      NR = 12
 Mean                     52             76
 Median                   46             80
 Range                    8-142          36-143

 ^a Calculated from only those samples where all four replicates
    were reported as a detect.
 ^b NR represents the number of replicate sets.

 Table 7.   Summary of the FAST 2000 Accuracy
            for Water Samples

 Statistic (recovery, %)  RDX            TNT
                          N = 20         N = 19
 Mean                     192            316
 Median                   168            197
 Range                    81-580         82-1,110
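Recovery as defined here is a one-line calculation; the spiked and measured values below are hypothetical, chosen only to illustrate a high bias.

```python
def percent_recovery(measured, spiked):
    """Measured concentration as a percentage of the spiked (known) value.

    Values above 100% indicate a high bias; below 100%, a low bias.
    """
    return 100.0 * measured / spiked

# Hypothetical spiked-water sample: 250 ug/L TNT spiked, 480 ug/L measured.
print(percent_recovery(480, 250))  # 192.0, i.e., biased high
```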

False Positive/False Negative Results
Table 8 shows the FAST 2000's performance for the
20 blank samples. RI reported the presence of RDX
in four samples (24% fp results) and TNT in 16
samples (80% fp results). Note that the RDX data are
evaluated for only 17 of the 20 blank water
samples. For RDX, RI reported three of the blanks
as "ME" (matrix effects), indicating that the FAST
2000 could not generate a result because of matrix
interferences.


 Table 8.   Summary of FAST 2000 False
            Positives on Blank Water Samples

 Statistic                     RDX     TNT
 Number of data points         17      20
 Number of detects             4       16
 % of fp results               24%     80%
 Number reported as "ME"       3       0
Table 9 summarizes the FAST 2000's fp and fn
results for all spiked and environmental samples by
comparing the FAST 2000 result with the
reference laboratory result. (See Section 3 for a
more detailed discussion of this evaluation.)  For
the water samples, 2% of the RDX results and
16% of the TNT results were reported as false
positives relative to the reference laboratory results
(i.e., the laboratory reported the analyte as a
nondetect and RI reported it as a detect). A  small
fraction of the samples (3% for each analyte) were
reported as nondetects by RI (i.e., false negatives)
for samples where the laboratory reported a
detect.
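The detect/nondetect comparison behind these counts can be sketched as a classification against each method's reporting limit (20 µg/L for the FAST 2000 and 0.5 µg/L for the reference laboratory, per the Appendix data); the paired sample values below are hypothetical.

```python
FIELD_LIMIT = 20.0   # FAST 2000 reporting limit, ug/L
LAB_LIMIT = 0.5      # reference laboratory reporting limit, ug/L

def classify(field, lab):
    """Label one paired result: 'fp', 'fn', or 'agree'.

    fp: field detect where the laboratory reports a nondetect;
    fn: field nondetect where the laboratory reports a detect.
    """
    field_detect = field >= FIELD_LIMIT
    lab_detect = lab >= LAB_LIMIT
    if field_detect and not lab_detect:
        return "fp"
    if lab_detect and not field_detect:
        return "fn"
    return "agree"

# Hypothetical (field, lab) result pairs in ug/L:
pairs = [(46.0, 0.2), (15.0, 101.0), (894.0, 1760.0)]
print([classify(f, lab) for f, lab in pairs])  # ['fp', 'fn', 'agree']
```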

Completeness
Completeness is defined as the percentage of the
176 results  that are judged to be useable (i.e., the
result was not rejected). Results for which RI
reported "ME" (31 for RDX and 37 for TNT) are
considered  incomplete. Therefore, completeness
was 82% for RDX and 79% for TNT.
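Completeness reduces to the fraction of results not rejected; using the counts above (176 results, with 31 RDX and 37 TNT results reported as "ME"):

```python
def completeness(total, rejected):
    """Percentage of results judged useable (i.e., not rejected)."""
    return 100.0 * (total - rejected) / total

print(round(completeness(176, 31)))  # 82 (RDX)
print(round(completeness(176, 37)))  # 79 (TNT)
```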

Comparability
Comparability refers to how well the field
technology and reference laboratory data agreed.
A one-to-one sample comparison of the FAST
2000 results and the reference laboratory results
was performed for all spiked and environmental
samples that were reported above the reporting
limits.

 Table 9.   Summary of the FAST 2000
            Detect/Nondetect Performance
            Relative to the Reference Laboratory
            Results

 Statistic                     RDX     TNT
 Number of data points^a       128     119
 Number of fp results          2       19
 % of fp results               2%      16%
 Number of fn results          4       4
 % of fn results               3%      3%
 Number reported as "ME"       28      37

 ^a Excludes those values reported as "ME."

In Table 10, the comparability of the water results
is presented in terms of correlation
coefficients (r) and slopes (m). The r value for the
comparison of the entire data set of TNT results was
0.23 (m = 1.81). As shown in Table 10, if
comparability is assessed for specific concentration
ranges, the r value does not change dramatically for
TNT.  Depending on the concentration ranges
selected, the r value ranged from 0.14 to 0.21.
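The r and m statistics reported here are the Pearson correlation coefficient and the least-squares slope of the field results regressed on the reference results. A self-contained sketch (the five paired values are hypothetical):

```python
def correlation_and_slope(ref, field):
    """Pearson r and least-squares slope m of field results on reference results."""
    n = len(ref)
    mx = sum(ref) / n
    my = sum(field) / n
    sxx = sum((x - mx) ** 2 for x in ref)
    syy = sum((y - my) ** 2 for y in field)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ref, field))
    return sxy / (sxx * syy) ** 0.5, sxy / sxx

# Hypothetical paired results (ug/L): reference laboratory vs. field instrument.
ref = [200, 159, 180, 158, 210]
field = [84, 58, 181, 153, 203]
r, m = correlation_and_slope(ref, field)
```

An r near 1 with m near 1 would indicate close agreement; a slope well above 1, such as the 1.81 reported for TNT, reflects the high bias discussed in the accuracy section.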


 Table 10.  FAST 2000 Correlation With
            Reference Data for Various Vendor
            Water Concentration Ranges

 Concentration           RDX             TNT
 range                   r      m        r      m
 All values^a            0.63   1.60     0.23   1.81
 <500 µg/L^b             0.39   0.08     0.14   0.01
 >500 µg/L               0.60   1.58     0.16   1.49
 >1,000 µg/L             0.50   1.35     0.21   2.77

 ^a Excludes those values reported as "< reporting limits."
 ^b Based on RI's reported values.
A plot of the FAST 2000 results versus the reference
laboratory results for all TNT concentrations is
presented in Figure 2. The solid line on the graph
represents an ideal one-to-one correspondence
between the two measurements, while the dashed line
is the linear regression line.

 Figure 2.   Comparability of reference laboratory water results with FAST 2000 results for all TNT concentrations
             (FAST 2000 TNT water results versus reference laboratory TNT water results, µg/L; data series: LAAAP,
             MLAAP, spike, Umatilla, Volunteer). The slope of the linear regression line is 1.81 and the intercept is
             2,135 µg/L. For clarity, one outlying MLAAP data point that is included in the regression analysis was
             excluded from the graph.

Overall, the FAST
2000's TNT results were generally higher than
those of the reference laboratory, as indicated by
the fact that the majority of the data points are
above the solid line. For RDX, the correlation of
the FAST 2000 results with the reference
laboratory results was higher than for TNT, with a
calculated r value of 0.63 and m  of 1.60. Figure 3,
a plot of the RDX comparability  data for
concentrations less than 500 µg/L, shows an
interesting trend that further elaborates on the
accuracy data previously presented. While the
accuracy results were biased high for RDX spiked
into distilled water, Figure 3 indicates that several
of the FAST 2000 data were lower than the
reference laboratory data. Further investigation of
these data showed that the majority of the RDX
results on the LAAAP samples were lower than
the reference laboratory's matching results. The
FAST 2000 results were generally higher for the
spiked and MLAAP samples, and evenly
                             distributed higher and lower than the reference
                             laboratory results for the Umatilla samples. This
                             evaluation, summarized in Table 11,  suggests a matrix-
                             dependent effect. It should be noted, however, that
                             the largest number of samples were  analyzed from
                             LAAAP; it is not known whether a similar trend
                             would be observed with the samples from the other
                             sites had more samples been analyzed. The evaluation
                             of the TNT sample data by matrix concurred with the
                             conclusion presented in the
                             accuracy section, that the TNT results were generally
                             biased high.

                             Another metric of comparability is the percent
                             difference (%D) between the reference laboratory
                             and the FAST  2000  results. The ranges of %D values
                             for TNT and RDX are presented in Figure 4.
                             Acceptable %D values would be between -25% and
                             25%, or near the middle of the x-axis of the plot. For
                             TNT, the %D values were mostly greater than 75%.
                             For RDX, the %D values were distributed among the


 Figure 3.   Comparability of reference laboratory water results with FAST 2000 results for vendor RDX concentrations
             less than 500 µg/L (FAST 2000 RDX water results versus reference laboratory RDX water results, µg/L;
             data series: LAAAP, MLAAP, spike, Umatilla; r = 0.39). The slope of the linear regression line is 0.08
             and the intercept is 162 µg/L.
 Table 11.  Comparison of FAST 2000 and Reference Laboratory Results by Matrix

                         RDX                                      TNT
 Sample or                             Comparison to                              Comparison to
 source type   N^a    r^b     m^c      reference lab^d    N^a    r^b     m^c      reference lab^d
 Spiked        20     0.46    1.59     High               19     0.59    1.39     High
 LAAAP         51     0.61    1.61     Low                42     0.36    1.41     High
 MLAAP         13     0.56    0.89     High               15     0.11    1.84     High
 Umatilla      13     0.80    0.63     Low and high       12     -0.32   -1.30    High
 Volunteer     0      n/a     n/a      n/a                4      0.99    1.63     High

 ^a Number of samples, excluding those reported as "ME" or as a nondetect.
 ^b Correlation coefficient; FAST 2000 results versus reference laboratory results.
 ^c Slope of linear regression line; FAST 2000 results versus reference laboratory results.
 ^d Represents the majority of the measurements compared to the reference laboratory results.
                                   Range of percent difference values
 Figure 4.  Range of percent difference values for RDX and TNT.
various ranges, with the greatest number of samples
having %D values greater than 75%. This supports
the conclusion that the FAST 2000 RDX results
were generally higher than those of the reference
laboratory.
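The %D metric is computed per paired sample; the pair below is taken from the Appendix data (Louisiana sample 5, replicate 1, TNT) for illustration.

```python
def percent_difference(field, reference):
    """Percent difference of the field result relative to the reference result."""
    return 100.0 * (field - reference) / reference

d = percent_difference(1280, 170)  # reference lab 170 ug/L, field 1280 ug/L
print(round(d))                    # 653
print(-25 <= d <= 25)              # False: far outside the +/-25% acceptance window
```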

Sample Throughput
Sample throughput is representative of the estimated
amount of time required to prepare and analyze the
sample. Operating under the outdoor conditions, the
RI/ORNL team, usually consisting of three
operators, accomplished a sample throughput rate of
approximately three samples per hour for the water
analyses. Separate instruments were used for the
TNT and RDX analyses. Typically, two operators
analyzed samples while one operator performed data
analysis.

Ease of Use

Three operators were typically used for the
demonstration because of the number of
demonstration samples and working conditions, but
the technology can be operated by a single person.
Approximately one day of training would be
necessary to operate the FAST 2000. RI offers
training at its facility or at the user's facility. No
particular level of educational training is required for
the operator, but technician-level skills in chemical
techniques would be advantageous.

Cost Assessment
The purpose of this economic analysis is to estimate
the range of costs for analyzing explosives-
contaminated water samples using the FAST 2000
and a conventional analytical reference laboratory
method. The analysis was based on the results and
experience gained from this demonstration, costs
provided by RI, and representative costs provided by
the reference analytical laboratories that offered to
analyze these samples. To account for the variability
in cost data and assumptions, the economic analysis
is presented as a list of cost elements and a range of
costs for sample analysis by the FAST 2000
instrument and by the reference laboratory.
Several factors affected the cost of analysis. Where
possible, these factors were addressed so that
decision makers can complete a site-specific
economic analysis to suit their needs. The following
categories are considered in the estimate:

•   sample shipment costs,
•   labor costs,
•   equipment costs, and
•   waste disposal costs.

Each of these cost factors is defined and discussed
and serves as the basis for the estimated cost ranges
presented in Table 12. This analysis assumed that
the individuals performing the analyses were fully
trained to operate the technology. Costs for sample
acquisition and pre-analytical sample preparation,
which are tasks common to both methods, were not
included here.
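The cost elements above lend themselves to a simple site-specific calculation. The sketch below strings together the report's estimated ranges (Table 12) under illustrative scenario assumptions — one analyst, midrange rates, a one-month lease — none of which are fixed by the report.

```python
def fast2000_cost(n_samples, days_on_site, local=True):
    """Illustrative deployment cost ($) for FAST 2000 analyses.

    Element values follow Table 12; the scenario choices (hours, rates,
    lease vs. purchase) are assumptions for this sketch only.
    """
    mobilization = 6 * 50                # ~6 h preparation/travel at $50/h
    travel = 15 if local else 1000       # commuting vs. airfare and rental car
    per_diem = 0 if local else 150 * days_on_site
    labor = 50 * 8 * days_on_site        # one analyst at $50/h, 8-h days
    lease = 1970                         # one month's instrument lease
    supplies = 43 * n_samples            # reagents/supplies, per sample
    waste = 165                          # one drum of aqueous waste
    return mobilization + travel + per_diem + labor + lease + supplies + waste

def reference_lab_cost(n_samples, per_sample=170, shipment=250):
    """Illustrative reference-laboratory cost: fully loaded per-sample rate
    plus one shipment (midrange packing labor plus overnight charges)."""
    return shipment + per_sample * n_samples

print(fast2000_cost(100, 5), reference_lab_cost(100))
```

For a hypothetical local deployment of 100 samples over five days, this sketch puts the field analysis well below the laboratory cost; as the report notes, turnaround time and data quality objectives must also enter the decision.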

FAST 2000 Costs
The costs associated with using the FAST 2000
included labor, equipment, and waste disposal costs.
No sample shipment charges were associated with
the cost of operating the FAST 2000 instrument
because the samples were analyzed on-site.

Labor
Labor costs included mobilization/demobilization,
travel, per diem expenses and on-site labor.

•  Mobilization/demobilization. This cost element
    included the time for one person to prepare for
    and travel to each site. This estimate ranged
    from 5 to 8 h, at a rate of $50/h.
•   Travel. This element was the cost for the
    analyst(s) to travel to the site. If the analyst is
    located near the site, the cost of commuting to
    the  site (estimated to be  50 miles at $0.30/mile)
    would be minimal ($15). The estimated cost of
    an analyst traveling to the site for this
    demonstration ($1000) included the cost of
    airline travel and rental  car fees.
•  Per diem expenses. This cost element included
    food, lodging, and incidental expenses. The
    estimate ranged from zero (for a local site) to
    $150/day for each analyst.
•  Rate. The cost of the on-site labor was
    estimated at a rate of $30-75/h, depending on
    the required expertise level of the analyst. This
    cost element included the labor involved with the
    entire analytical process, comprising sample
    preparation, sample management, analysis, and
    reporting.

Equipment
Equipment costs included mobilization/
demobilization, rental fees or purchase of equipment,
and the reagents and other consumable supplies
necessary to complete the analysis.

•   Mobilization/demobilization. This included the
    cost of shipping the equipment to the test site. If
    the site is local, the cost would be zero. For this
    demonstration, the cost of shipping equipment
    and supplies was estimated at $460.
•   Instrument purchase. At the time of the
    demonstration, the cost of purchasing the FAST
    2000 was $23,650. The instrument purchase
    included a FAST 2000 instrument designed for
    use with an immunoassay-based sensor;  a data
    acquisition card and a cable linking the
    instrument to the laptop computer; a fluid
    storage unit; one assay coupon kit; the software
    required to run the instrument and analyze data;
    and instruction manual. This price does not
    include the cost of the laptop computer. The
    instrument can be leased for $1970 per month.
•   Reagents/supplies. These items are consumable
    and are purchased on a per sample basis. At the
    time of the demonstration, the cost of the
    reagents and supplies needed to prepare  and
    analyze water samples using the FAST 2000
    was $43 per sample. This cost included the
    sample preparation supplies, assay supplies, and
    consumable reagents. An ampule of standard
    was also available for approximately $22.

Waste Disposal
Waste disposal costs are based on the 1999
regulations for disposal of explosives-contaminated
waste. The analyses performed using the FAST
2000 instrument generated approximately 18  L of
aqueous waste. ORNL's cost to dispose of the
explosives-contaminated aqueous waste at a
commercial facility was estimated at $165 per 55-gal
drum (the size that was used to contain this amount
of aqueous waste). There are most likely additional
costs for labor associated with the waste disposal,
but those costs are not estimated here.

Reference Laboratory Costs
Sample Shipment
Sample shipment costs to the reference laboratory
included the overnight shipping charges, as well as
labor charges associated with the various
organizations involved in the shipping process.

•  Labor. This cost element included all of the
   tasks associated with the shipment of the
   samples to the reference laboratory. Tasks
   included packing the shipping coolers, completing
   the chain-of-custody documentation, and
   completing the shipping forms. The estimate to
   complete this task ranged from 2 to 4 h at $50
   per hour.
•  Overnight shipping. The overnight express
   shipping service cost was estimated to be $50
   for one 50-lb cooler of samples.

Labor, Equipment, and Waste Disposal
The labor bids from commercial analytical reference
laboratories that offered to perform the reference
analysis for this demonstration ranged from $150 to
$188 per sample. The bid was dependent on many
factors, including the perceived difficulty of the
sample matrix, the current workload  of the
laboratory, and the competitiveness of the market.
This rate was a fully loaded analytical cost that
included equipment, labor, waste disposal, and report
preparation.

Cost Assessment Summary
An overall cost estimate for use of the FAST 2000
instrument versus use of the reference laboratory
was not made because of the extent of variation in
the different cost factors, as outlined in Table 12.
The overall costs for the application  of each
technology will be based on the number of samples
requiring analysis, the sample type, and the site
location and characteristics. Decision-making
factors, such as turnaround time for results, must
also be weighed against the cost estimate to
determine the value of the field technology's
providing immediate answers versus the reference
laboratory's provision of reporting data within 30
days of receipt of samples.

Miscellaneous Factors
The following are general observations regarding the
field operation and performance of the FAST 2000
instrument:

The system, which weighed approximately 3 lb,
   was easily transportable.
•  The RI/NRL team completely disassembled
   their work station at the close of each day. It
   took the team less than an hour each morning to
   prepare for sample analyses.
•  The FAST 2000 was interfaced to the notebook
   computer through a PCMCIA card, through
   which both data and power connections were
   made. RI claimed that the instrument could run
   exclusively off of the computer's battery power.
   During the demonstration, the team found that
   the instrument worked best when the battery
   was removed and the computer was plugged
   into an electrical outlet.
•  Sample preparation was minimal.
•  Each sample was analyzed in duplicate, with the
   average concentration reported as the result. At
   least two calibration standards were analyzed
   with each sample.
•  To analyze the 176 water samples, the RI/NRL
   team used 33 TNT-labeled membranes and 18
   RDX-labeled membranes, averaging 5 samples
   per membrane for TNT and 10 samples per
   membrane for RDX.
•  Data processing was performed using an NRL-
   written software program rather than with the
   data acquisition software package supplied with
   the instrument. The results were dependent on
   the user's designation of the start and the end of
   the analyte peak.
 Table 12.  Estimated Analytical Costs for Explosives-Contaminated Samples

 Analysis method: FAST 2000
 Analyst/manufacturer: Research International
 Sample throughput: 3 samples/h (for water)

 Cost category                        Cost ($)
 Sample shipment                      0
 Labor
   Mobilization/demobilization        250-400
   Travel                             15-1,000 per analyst
   Per diem expenses                  0-150/day per analyst
   Rate                               30-75/h per analyst
 Equipment
   Mobilization/demobilization        0-460
   Instrument purchase price          23,650
   Instrument lease price             1,970 per month
   Reagents/supplies                  43 per sample
 Waste disposal                       165

 Analysis method: EPA SW-846 Method 8330
 Analyst/manufacturer: Reference laboratory
 Typical turnaround: 21 working days

 Cost category                        Cost ($)
 Sample shipment
   Labor                              100-200
   Overnight shipping                 50-150
 Labor
   Mobilization/demobilization        Included^a
   Travel                             Included
   Per diem expenses                  Included
   Rate                               150-188 per sample
 Equipment                            Included
 Waste disposal                       Included

 ^a "Included" indicates that the cost is included in the labor rate.
Summary of Performance
A summary of performance is presented in Table
13. Precision, defined as the mean RSD, was 52%
and 76% for RDX and TNT water sample results,
respectively. Accuracy, defined as the mean percent
recovery relative to the spiked concentration, was
192% and 316% for RDX and TNT, respectively.
Approximately 80% of the water analyses were
complete;  20% were reported as "matrix effects,"
where a result could not be determined. Comparison
with Method 8330 results for homogeneous replicate
splits indicated that the TNT results were generally
higher than the reference laboratory results, while
the RDX results were usually higher, but depended
on the matrix analyzed. Of the 20 blank water
samples, RI reported RDX in 4 samples (24% fp)
and TNT in 16 samples (80% fp). False positive and
false negative results were also determined by
comparing the FAST 2000 result to the reference
laboratory  result on environmental and spiked
samples. For RDX, 2% of the results were fp
relative to  the reference laboratory result, while 16%
of the TNT results were reported as false positives.
RI reported a small fraction of the samples (3% for
each analyte) as nondetects (i.e., false negatives)
when the laboratory reported a detect.

The demonstration found that the FAST 2000 was
relatively simple for the trained analyst to operate in
the field, requiring less than an hour for initial setup.
The sample throughput of the FAST 2000 was
approximately three samples per hour. Three
operators analyzed samples during the
demonstration, but the technology can be run by a
single trained operator. The overall performance of
the FAST 2000 for the analysis  of water samples
was characterized as imprecise and biased high for
TNT, and imprecise and biased high (but matrix-
dependent) for RDX.
Table 13.  Performance Summary for the FAST 2000 Water Analyses

 Feature/parameter                            Performance summary
 Precision (mean RSD)                         RDX: 52%; TNT: 76%
 Accuracy (mean recovery)                     RDX: 192%; TNT: 316%
 False positive results on blank samples      RDX: 24%; TNT: 80%
 False positive results relative to
   reference laboratory results               RDX: 2%; TNT: 16%
 False negative results relative to
   reference laboratory results               RDX: 3%; TNT: 3%
 Comparison with reference laboratory         r (all results): RDX 0.63; TNT 0.23
   results                                    m (all results): RDX 1.60; TNT 1.81
                                              Median %D: RDX 10%; TNT 125%
                                              Range of %D values: RDX -94% to 8,167%;
                                              TNT -96% to 157,000%
 Completeness                                 RDX: 82%; TNT: 79%
 Weight                                       2.8 lb
 Sample throughput                            3 samples/h (three operators)
 Power requirements                           Connect to portable PC (use battery or electrical power)
 Training requirements                        One day instrument-specific training
 Cost                                         Instrument purchase: $23,650
                                              Instrument monthly lease: $1,970
                                              Supplies per sample: $43
 Hazardous waste generation                   18 L aqueous waste for 176 samples
 Overall evaluation                           RDX: biased high (but matrix-dependent); imprecise
                                              TNT: biased high; imprecise
                     Section 6 — Technology Update and
                           Representative Applications
In this section, the vendor (with minimal editorial changes by ORNL) provides information regarding
new developments with its technology since the verification activities. In addition, the vendor provides
a list of representative applications in which its technology has been used.
Technology Update
As an outcome of the EPA trials, a decision has
been made to improve the software algorithms used
to quantify the assay data. After the trials it was
discovered that approximately 10% of the errors in
the FAST data were due to user error in analyzing
the data. This was purportedly due to operator
fatigue, after running assays for 10 hours a day over
the period of several days. To avoid this problem in
the future, the FAST software will be modified so
that the assay quantification process is automated,
eliminating the possibility of user error in
postprocessing of the data.

In addition to the software improvements mentioned
above, the Flow Assay Sensing and Testing system
(FAST 2000) has now been replaced with a second-
generation instrument, the FAST 6000 (shown in
Figure 5). Research International has developed the
FAST 6000 under contract to the U.S. Naval
Research Laboratory (NRL). Unlike the FAST
2000, which requires connection to a laptop
computer to run, the FAST 6000 is capable of stand-
alone operation, running on a rechargeable lithium-
ion battery pack. The instrument can be purchased
in either a single-channel or a six-channel
configuration. The form factor for a single-channel
      Figures. The FAST 6000.
coupon is the same as that of a six-channel coupon,
allowing single-channel FAST 6000 systems to be
upgraded to six-channel systems at a later time if the
user so desires. The six-channel instrument will
significantly improve sample throughput and reduce
the time needed for analysis of multiple analytes.

The FAST 6000 has a built-in 386 computer and
4x16 character LCD display. A computer-
controlled pump and valves completely automate the
assay process for the user. Assay results are
displayed on the LCD display, and the assay data is
saved in an internal 2-MB FLASH disk. Assay data
files taken into the field can be downloaded to a
desktop computer at a later time via an RS-232 link.
An advanced Windows-based software program has
been developed to allow the user to transfer assay
data and recipe files between an IBM-compatible
personal computer and the FAST 6000. Files can be
saved to the computer hard disk for later viewing
and analysis. After a recipe is optimized, it can be
transferred to the FAST 6000 for use in the field.
This is useful for new assay development, in which
timing and flow rates are being optimized by
modifying easy-to-use assay recipes.

The software provides a real-time display of the data
from the FAST 6000 in both a table format and a
graphical format.  The Windows-based software
program also allows the user to run the FAST 6000
from a remote computer. With the addition of an
optional RF data link, the FAST 6000 system can be
run from a remote computer up to 20 miles away
from the FAST 6000. This technology also makes it
possible to create an array of systems that are
operated and monitored by a single central
computer.

The FAST 6000 has been improved considerably
through the addition of a positive displacement pump
and redesigned electronics with higher signal-to-
noise ratios. The ability to simultaneously run
multiple assays for different analytes on a single
system also represents a significant improvement.
Research International continues to develop and
improve the FAST technology. Work is under way
on a six-channel version of the FAST that will be
used to detect explosives underwater. This  system is
being designed to automatically sample salt water at
depths up to 300 feet and run assays for TNT and
RDX.

Representative Applications
1995: Prototypes of the laboratory version of the
continuous flow immunosensor participated in field
demonstrations with funding from the Strategic
Environmental Research and Development Program
(SERDP). It was tested at Umatilla Army Depot in
Hermiston, Oregon, and SUBASE Bangor in
Bangor, Washington, in collaboration with U.S. EPA
Region 10. Results for the continuous flow
immunosensor can be found in several refereed
papers (see below). EPA coordinator Harry Craig
and associates have written a report of the field
trials and a proceedings paper describing both
sensors.

1997: At the National Environmental Technology
Test Site (NETTS) at the Volunteer Army
Ammunition Plant in Chattanooga, Tennessee, an
on-site demonstration of FAST 2000 was conducted
September 23-27. Groundwater was  tested for TNT
in samples from 10 monitoring wells during a 4-day
trial. The demonstration was conducted as  part of a
SERDP research program. Also, Harry Craig, EPA
Region 10, purchased two FAST 2000 units for
monitoring at SUBASE Bangor and Umatilla Army
Depot.

1997-1998: Three field trials for groundwater
analysis and one for soil were performed using the
FAST 2000 to perform on-site analysis for validation
studies. The first groundwater test for this project
was conducted June 23-27, 1997, at SUBASE
Bangor, Bangor, Washington. The second
site was Umatilla Army Depot in Hermiston,
Oregon, where the second field trial took place
August 4-8, 1997. The third field trial  site was the
Naval Surface Weapons Center in Crane, Indiana,
where groundwater tests were performed
September 8-12, 1997. The soil field trial was held
April 27- May 1, 1998 at Manchester, Washington,
on samples from Umatilla Army Depot. Harry Craig,
EPA Region 10, coordinated the sites and sample
collection and provided non-developer operators for
these trials.

Refereed Papers
Bart, J. C., L. L. Jidd, K. E. Hoffman, A. M.
Wilkins, P. T. Charles, and A. W. Kusterbeck. 1997.
"Application of a Portable Immunosensor to Detect
Explosives TNT and RDX in Groundwater
Samples." Environmental Science and
Technology 31(5): 1505-11.

Bart, J. C., L. L. Jidd, and A. W. Kusterbeck.  1997.
"Environmental Immunosensors for the Explosive
RDX using a Fluorescent-Dye-Labeled Antigen and
the Continuous Flow Immunosensor." Sensors and
Actuators B 38-39, 411-18.

Craig, H., G. Furguson, A. Markos, A. Kusterbeck,
L. Shriver-Lake, T. Jenkins, and P. Thorne.  1996.
"Field Demonstration of On-Site Analytical Methods
for TNT and RDX in Groundwater." Pp. 204-19 in
Proceedings of the Great Plains-Rocky
Mountain Hazardous Substance Research Center
(HSRC)/ Waste Education and Research
Consortium (WERC) Joint Conference on the
Environment, May 21-23, Albuquerque, N.M.

Kusterbeck, A. W., P. R. Gauger, and P. T.
Charles. 1997. "Portable Flow Immunosensor for
Detecting Drugs and Explosives." SPIE
2937:191-96.
                               Section 7 — References
Berger, W., H. McCarty, and R.-K. Smith. 1996. Environmental Laboratory Data Evaluation. Genium
Publishing Corp., Schenectady, N.Y.

Draper, N. R., and H. Smith. 1981. Applied Regression Analysis. 2nd ed. John Wiley & Sons, New York.

EPA (U.S. Environmental Protection Agency). 1994. "Method 8330: Nitroaromatics and Nitramines by High
Performance Liquid Chromatography (HPLC)." In Test Methods for Evaluating Solid Waste: Physical/
Chemical Methods, Update II. SW-846. U.S. Environmental Protection Agency, Washington, D.C.,
September.

EPA (U.S. Environmental Protection Agency). 1998. EPA Guidance for Quality Assurance Project Plans.
EPA QA/G-5, EPA 600/R-98/018. U.S. Environmental Protection Agency, Washington, D.C., February.

Jenkins, T. F., M. E. Walsh, and P. G. Thorne. 1998. "Site Characterization for Explosives Contamination at a
Military Firing Range Impact Area." Special Report 98-9. U.S. Army Cold Regions Research and
Engineering Laboratory, Hanover, N.H. Available at http://www.crrel.usace.army.mil/

Maskarinec, M. P., C. K. Bayne, L. H. Johnson, S. K. Holladay, R. A. Jenkins, and B. A. Tomkins. 1991.
Stability of Explosives in Environmental Water and Soil Samples. ORNL/TM-11770. Oak Ridge National
Laboratory, Oak Ridge, Tenn., January.

ORNL (Oak Ridge National Laboratory). 1999. "Technology Demonstration Plan: Evaluation of Explosives
Field Analytical Techniques." Oak Ridge National Laboratory, Oak Ridge, Tenn., August.

Specialized Assays, Inc. 1999. "Comprehensive Quality Assurance Plan." SAL-QC-Rec 5.0.  January 6.
                     Appendix

RI's FAST 2000 Sample Results Compared with Reference
                 Laboratory Results
Sample site   Sample   Sample      RDX (µg/L)           TNT (µg/L)           RI analysis
or type       no.      replicate   RI        Ref Lab    RI        Ref Lab    order
Blank         6        1           <20.0     1.5        46        1.9        2068
Blank         6        2           <20.0     <0.5       65        1.2        2069
Blank         6        3           <20.0     <0.5       <20.0     <0.5       2005
Blank         6        4           <20.0     <0.5       226       <0.5       2112
Blank         7        1           <20.0     <0.5       97        <0.5       2142
Blank         7        2           ME        <0.5       124       0.9        2065
Blank         7        3           22        <0.5       160       <0.5       2038
Blank         7        4           <20.0     <0.5       <20.0     <0.5       2012
Blank         8        1           81        <0.5       116       <0.5       2045
Blank         8        2           131       <0.5       172       <0.5       2044
Blank         8        3           ME        <0.5       196       <0.5       2027
Blank         8        4           60        <0.5       194       1.3        2050
Blank         9        1           <20.0     <0.5       <20.0     <0.5       2025
Blank         9        2           <20.0     <0.5       120       <0.5       2078
Blank         9        3           <20.0     <0.5       112       <0.5       2165
Blank         9        4           <20.0     <0.5       112       <0.5       2146
Blank         10       1           <20.0     <0.5       48        <0.5       2176
Blank         10       2           ME        <0.5       81        <0.5       2034
Blank         10       3           <20.0     <0.5       <20.0     <0.5       2042
Blank         10       4           <20.0     <0.5       46        <0.5       2004
Louisiana     5        1           84        200        1280      170        2123
Louisiana     5        2           58        159        2862      136        2104
Louisiana     5        3           181       180        757       151        2049
Louisiana     5        4           153       158        311       138        2033
Louisiana     6        1           203       210        213       162        2031
Louisiana     6        2           162       238        207       176        2163
Louisiana     6        3           176       160        ME        113        2094
Louisiana     6        4           162       256        273       178        2159
Louisiana     7        1           ME        <0.5       ME        <0.5       2130
Louisiana     7        2           ME        <0.5       59        <0.5       2026
Sample site  Sample  Sample     RDX^a (µg/L)      TNT^a (µg/L)      RI analysis
or type      no.     replicate  RI      Ref. lab  RI      Ref. lab  order^b
Louisiana    7       3          ME      0.5       ME      0.5       2139
Louisiana    7       4          20      <0.5      25      <0.5      2039
Louisiana    8       1          ME      0.5       <20.0   0.5       2003
Louisiana    8       2          <20.0   1.1       217     0.5       2103
Louisiana    8       3          ME      <0.5      ME      <0.5      2010
Louisiana    8       4          ME      0.5       ME      0.5       2160
Louisiana    9       1          894     1760      1986    300       2153
Louisiana    9       2          434     1390      700     240       2052
Louisiana    9       3          1300    1410      2331    320       2157
Louisiana    9       4          323     1640      ME      330       2134
Louisiana    10      1          1099    560       ME      65        2119
Louisiana    10      2          67      470       ME      40        2150
Louisiana    10      3          364     520       <20.0   30        2116
Louisiana    10      4          906     256       138     28        2035
Louisiana    11      1          <20.0   18.6      106     0.5       2121
Louisiana    11      2          <20.0   17.2      ME      <0.5      2066
Louisiana    11      3          <20.0   13        ME      <0.5      2147
Louisiana    11      4          28      13.9      250     0.5       2030
Louisiana    12      1          ME      89        ME      101       2096
Louisiana    12      2          <20.0   34        ME      59        2155
Louisiana    12      3          ME      52        184     134       2089
Louisiana    12      4          <20.0   104       ME      131       2076
Louisiana    13      1          <20.0   2.5       287     1.1       2154
Louisiana    13      2          <20.0   1.8       20      <0.5      2014
Louisiana    13      3          <20.0   2         ME      <0.5      2156
Louisiana    13      4          <20.0   2.3       179     0.5       2141
Louisiana    14      1          <20.0   11.8      1396    <0.5      2082
Louisiana    14      2          <20.0   11.4      ME      <0.5      2086
Louisiana    14      3          <20.0   14        367     0.5       2120
Louisiana    14      4          26      14        ME      0.5       2067
Louisiana    15      1          ME      <0.5      ME      <0.5      2013
Louisiana    15      2          ME      <0.5      ME      <0.5      2070
Louisiana    15      3          ME      <0.5      ME      <0.5      2124
Sample site  Sample  Sample     RDX^a (µg/L)      TNT^a (µg/L)      RI analysis
or type      no.     replicate  RI      Ref. lab  RI      Ref. lab  order^b
Louisiana    15      4          ME      0.5       700     0.5       2028
Louisiana    16      1          <20.0   <0.5      <20.0   <0.5      2118
Louisiana    16      2          <20.0   0.5       1005    0.5       2115
Louisiana    16      3          <20.0   <0.5      715     <0.5      2168
Louisiana    16      4          29      0.8       ME      0.7       2058
Louisiana    17      1          514     2160      1899    1360      2102
Louisiana    17      2          328     2720      925     1620      2006
Louisiana    17      3          170     2600      832     1580      2032
Louisiana    17      4          312     1760      ME      1240      2093
Louisiana    18      1          3367    19600     ME      6900      2151
Louisiana    18      2          31270   16700     358     5800      2040
Louisiana    18      3          55700   22800     664     8400      2062
Louisiana    18      4          16097   18400     38230   6000      2114
Louisiana    19      1          73070   6100      5582    3100      2085
Louisiana    19      2          5330    3100      19114   1600      2173
Louisiana    19      3          7260    3500      11674   2400      2143
Louisiana    19      4          7684    4900      1322    2500      2110
Louisiana    20      1          430     570       3310    1720      2001
Louisiana    20      2          398     350       12960   950       2113
Louisiana    20      3          589     380       5666    990       2164
Louisiana    20      4          526     315       4864    985       2172
Louisiana    21      1          863     940       6904    440       2161
Louisiana    21      2          756     1180      8467    490       2057
Louisiana    21      3          561     1410      4860    540       2088
Louisiana    21      4          701     1130      2283    400       2075
Louisiana    22      1          3942    3780      1501    1280      2002
Louisiana    22      2          166     2960      ME      1080      2135
Louisiana    22      3          702     2780      2220    1210      2117
Louisiana    22      4          6891    2680      1739    1000      2071
Louisiana    23      1          2924    2340      59      1520      2041
Louisiana    23      2          564     1430      917     850       2009
Louisiana    23      3          991     1710      221     1040      2011
Louisiana    23      4          887     1930      ME      1260      2072
Sample site  Sample  Sample     RDX^a (µg/L)      TNT^a (µg/L)      RI analysis
or type      no.     replicate  RI      Ref. lab  RI      Ref. lab  order^b
Louisiana    24      1          583     1770      9440    1260      2097
Louisiana    24      2          672     3000      5260    2500      2162
Louisiana    24      3          730     2260      1637    1860      2109
Louisiana    24      4          434     1980      102     1810      2083
Milan        6       1          744     9         110     80        2149
Milan        6       2          756     235       292     100       2175
Milan        6       3          582     250       375     105       2023
Milan        6       4          657     170       258     60        2101
Milan        7       1          1622    670       3984    3600      2127
Milan        7       2          492     660       6936    3800      2063
Milan        7       3          826     580       3807    2960      2140
Milan        7       4          549     650       ME      2650      2128
Milan        8       1          ME      <50.0     3536    320       2132
Milan        8       2          55      <50.0     96893   1610      2091
Milan        8       3          24      120       20080   540       2043
Milan        8       4          ME      <50.0     3770    2800      2131
Milan        9       1          ME      36        2130    <10.0     2136
Milan        9       2          ME      <10.0     28546   <10.0     2137
Milan        9       3          ME      19        20450   13        2166
Milan        9       4          <20.0   <10.0     7370    <10.0     2051
Milan        10      1          166     93        72      154       2111
Milan        10      2          220     91        169     149       2079
Milan        10      3          210     84        <20.0   150       2107
Milan        10      4          503     96        1193    167       2133
Spike/PE     7       1          106     83        116     19.8      2020
Spike/PE     7       2          275     88        257     22        2074
Spike/PE     7       3          197     88        ME      20.5      2092
Spike/PE     7       4          271     65.5      168     17.4      2036
Spike/PE     8       1          22      17        316     72        2148
Spike/PE     8       2          91      19        225     77        2126
Spike/PE     8       3          56      22        145     90.5      2169
Spike/PE     8       4          44      19        140     66        2015
Sample site  Sample  Sample     RDX^a (µg/L)      TNT^a (µg/L)      RI analysis
or type      no.     replicate  RI      Ref. lab  RI      Ref. lab  order^b
Spike/PE     9       1          <20.0   0.5       553     185       2108
Spike/PE     9       2          <20.0   42        394     244       2019
Spike/PE     9       3          <20.0   <0.5      300     185       2105
Spike/PE     9       4          <20.0   0.5       2224    212       2122
Spike/PE     10      1          332     188       157     <0.5      2047
Spike/PE     10      2          282     320       <20.0   1.1       2167
Spike/PE     10      3          331     146       121     0.5       2048
Spike/PE     10      4          338     210       195     <0.5      2099
Spike/PE     11      1          5802    650       432     350       2087
Spike/PE     11      2          810     1480      1139    680       2138
Spike/PE     11      3          2045    840       800     550       2029
Spike/PE     11      4          1640    810       883     420       2018
Spike/PE     12      1          896     460       1000    930       2080
Spike/PE     12      2          410     480       818     1020      2059
Spike/PE     12      3          476     430       3451    930       2007
Spike/PE     12      4          521     470       1068    910       2016
Umatilla     1       1          77      234       103     42        2021
Umatilla     1       2          57      200       712     34        2144
Umatilla     1       3          130     228       ME      32        2054
Umatilla     1       4          157     142       ME      20        2060
Umatilla     2       1          ME      0.5       ME      0.5       2170
Umatilla     2       2          <20.0   <0.5      323     0.6       2100
Umatilla     2       3          30      2.6       36      1.3       2037
Umatilla     2       4          ME      1.5       ME      1.3       2064
Umatilla     3       1          20      27        108     146       2008
Umatilla     3       2          31      23        276     117       2073
Umatilla     3       3          <20.0   20        <20.0   109       2098
Umatilla     3       4          <20.0   27        145     127       2084
Umatilla     4       1          ME      15        ME      57        2125
Umatilla     4       2          <20.0   4.8       220     27        2056
Umatilla     4       3          58      12        51      83        2061
Umatilla     4       4          ME      15        244     96        2171
Umatilla     5       1          289     348       405     0.5       2077
Umatilla     5       2          343     296       ME      0.5       2158
Umatilla     5       3          159     316       <20.0   <0.5      2090
Umatilla     5       4          150     248       325     <0.5      2152
Sample site  Sample  Sample     RDX^a (µg/L)      TNT^a (µg/L)      RI analysis
or type      no.     replicate  RI      Ref. lab  RI      Ref. lab  order^b
Umatilla     6       1          ME      5.1       ME      28        2129
Umatilla     6       2          ME      3.5       226     22.5      2106
Umatilla     6       3          25      3.3       605     12.3      2046
Umatilla     6       4          ME      5.9       <20.0   20.8      2145
Volunteer    4       1          <20.0   <0.5      ME      54        2055
Volunteer    4       2          ME      0.5       ME      44.5      2017
Volunteer    4       3          ME      <0.5      224     63        2081
Volunteer    4       4          <20.0   1.8       231     105       2022
Volunteer    5       1          <20.0   <5.0      ME      840       2053
Volunteer    5       2          ME      <5.0      2276    1290      2174
Volunteer    5       3          <20.0   <5.0      1812    1130      2024
Volunteer    5       4          ME      <50.0     ME      890       2095
" The data are presented exactly as reported. Note that the data are not consistently reported with the same number of significant
figures.
4 These are the sample numbers from which the analysis order can be discerned. For example, 2001 was analyzed first, then 2002,
etc.
c "ME" indicates that the sample contained matrix effects and the result could not be reported by the FAST 2000.