United States
         Environmental Protection
         Agency
           Office of
           Research and Development
           Washington, DC 20460
EPA/600/R-03/147
May 2004
Innovative Technology
 Verification Report
   Field Measurement Technology for
     Mercury in Soil and Sediment

    Ohio Lumex's RA-915+/RP-91C
           Mercury Analyzer

-------
                                  EPA/600/R-03/147
                                  May 2004
     Innovative  Technology
        Verification Report

Ohio Lumex's RA-915+/RP-91C
          Mercury Analyzer
                  Prepared by

        Science Applications International Corporation
                 Idaho Falls, ID
              Contract No. 68-C-00-179
                Dr. Stephen Billets
          Characterization and Monitoring Branch
            Environmental Sciences Division
            Las Vegas, Nevada 89193-3478
          National Exposure Research Laboratory
           Office of Research and Development
           U.S. Environmental Protection Agency

-------
                                       Notice
The U.S. Environmental Protection Agency through its Office of Research and Development funded
and managed the research described here under contract  to Science Applications  International
Corporation. It has been subjected to the Agency's peer and administrative review and has been
approved for publication as an EPA document.  Mention of trade names or commercial products does
not constitute endorsement or recommendation for use.

-------
                UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                    Office of Research and Development
                                        Washington, DC 20460

               MEASUREMENT AND MONITORING TECHNOLOGY PROGRAM
                                 VERIFICATION STATEMENT


TECHNOLOGY TYPE: Field Measurement Device

APPLICATION: Measurement for Mercury

TECHNOLOGY NAME: Ohio Lumex Co.'s RA-915+/RP-91C Mercury Analyzer

COMPANY:    Ohio Lumex Co.
ADDRESS:     9263 Ravenna Rd., Unit A-3
              Twinsburg, OH 44087

WEB SITE: http://www.ohiolumex.com

TELEPHONE: (888)  876-2611

VERIFICATION PROGRAM  DESCRIPTION

The U.S. Environmental Protection Agency (EPA) created the Superfund Innovative Technology Evaluation (SITE) and
Measurement and Monitoring Technology (MMT) Programs to facilitate deployment of innovative technologies through
performance  verification and information dissemination.  The goal  of these programs is to further environmental
protection by  substantially accelerating the acceptance and use of improved and cost-effective technologies. These
programs assist and inform those involved in  design, distribution,  permitting, and  purchase of environmental
technologies. This document summarizes results of a  demonstration of the RA-915+/RP-91C Mercury Analyzer
developed by Ohio Lumex Co.

PROGRAM OPERATION

Under the SITE and  MMT Programs, with the full participation of the technology developers, the EPA evaluates and
documents the performance of innovative technologies  by developing demonstration plans, conducting field tests,
collecting and analyzing demonstration data, and  preparing reports.  The technologies are evaluated under rigorous
quality assurance (QA) protocols to produce well-documented data of known quality. The EPA National Exposure
Research Laboratory, which demonstrates field sampling, monitoring, and measurement technologies, selected Science
Applications International Corporation as the verification organization to assist in field testing five field measurement
devices for mercury in soil and sediment. This demonstration was funded by the SITE Program.

DEMONSTRATION  DESCRIPTION

In May 2003, the  EPA conducted a field demonstration  of the RA-915+/RP-91C and four other field measurement
devices for mercury  in soil and  sediment.  This verification statement focuses on the  RA-915+/RP-91C; a similar
statement has been  prepared for each of the  other four devices.  The performance of the RA-915+/RP-91C was
compared to that of an off-site laboratory using the reference method, "Test Methods for Evaluating Solid Waste" (SW-846)
Method 7471B (modified). To verify a wide range of performance attributes, the demonstration had both primary
and secondary objectives.  The primary objectives were:

    (1) Determining the instrument sensitivity with respect to the Method Detection Limit (MDL) and Practical
        Quantitation Limit (PQL);

-------
    (2) Determining the analytical accuracy associated with the field measurement technologies;
    (3) Evaluating the precision of the field measurement technologies;
    (4) Measuring the amount of time required for mobilization and setup, initial calibration, daily calibration, sample
        analysis, and demobilization; and
    (5) Estimating the costs associated with mercury measurements for the following four categories: capital, labor,
        supplies, and investigation-derived waste (IDW).

Secondary objectives for the demonstration included:

   (1) Documenting the ease of use, as well as skills and training required to properly operate the device;
   (2) Documenting potential health and safety concerns associated with operating the device;
   (3) Documenting the portability of the device;
   (4) Evaluating the device durability based on its materials of construction and engineering design; and
   (5) Documenting the availability of the device and associated spare parts.

The RA-915+/RP-91C analyzed 56 field soil samples, 26 field sediment samples, 42 spiked field samples, and 73
performance evaluation (PE) standard reference material (SRM) samples in the demonstration. The field samples were
collected in four areas contaminated with mercury, the spiked samples were from these same locations, and the PE
samples were obtained  from a commercial provider.

Collectively, the environmental and PE samples provided the different matrix types and the different concentrations of
mercury needed to perform a comprehensive evaluation of the RA-915+/RP-91C.  A complete description of the
demonstration  and a summary of the results are available in the Innovative Technology Verification Report: "Field
Measurement Technology for Mercury in Soil and Sediment—Ohio Lumex Co.'s RA-915+/RP-91C Mercury Analyzer"
(EPA/600/R-03/147).

TECHNOLOGY DESCRIPTION

The RA-915+ Mercury Analyzer is a portable AA spectrometer with a 10-meter (m) multipath optical cell and Zeeman
background correction. Mercury is detected without preliminary accumulation on a gold trap. Samples are heated to
750-800°C, decomposing organic materials and vaporizing mercury into a carrier gas of ambient air. The airflow carries
the vaporized mercury to the analytical cell. The RA-915+ includes a built-in test cell for field performance verification.
The operation of the RA-915+ is based on the principle of differential Zeeman AA spectrometry combined with
high-frequency modulation of polarized light. This combination minimizes interferences and provides high sensitivity. A
mercury lamp is placed in a permanent magnetic field, in which the 254-nm resonance line is split into three polarized
components, two of which (σ- and σ+) are circularly polarized in opposite directions. These two σ components pass
through a polarization modulator, while the third component (π) is removed. One σ component passes through the
absorption cell; the other passes outside the absorption cell and through the test cell. In the absence of mercury vapor,
the intensities of the two σ components are equal. When mercury vapor is present in the absorption cell, mercury atoms
cause a proportional, concentration-related difference in the intensity of the σ components; this difference in intensity
is what the instrument measures. The unit can be used with the optional RP-91 attachment for an ultra-low mercury
detection limit in water samples using the "cold vapor" technique. For direct mercury determination in complex matrices
without sample pretreatment, including liquids, soils, and sediments, the instrument is operated with the optional
RP-91C accessory, as was done during the demonstration.

During the demonstration, no extraction or sample digestion was required.  Individual samples were mixed manually
using a quartz injection spoon. This same spoon was used to transfer the sample directly to the RP-91C sample
injection port after the sample was weighed on a digital balance. The sample  weight was manually recorded.  The
sample was analyzed, and  the device displayed the mercury concentration in parts per million, which is equivalent to
a soil concentration in milligrams per kilogram.
                                                   IV

-------
ACTION LIMITS

Action limits and concentrations of interest vary and are project specific. There are, however, action limits that can
serve as reference points. The EPA Region IX Preliminary Remediation Goals for mercury are 23 mg/kg in residential
soil and 310 mg/kg in industrial soil.

VERIFICATION OF  PERFORMANCE

To  ensure data  usability,  data  quality indicators for  accuracy,  precision,  representativeness,  completeness,
comparability, and sensitivity were assessed for the reference method based on project-specific QA objectives. Key
demonstration findings are summarized below for the primary objectives.

Sensitivity: The two primary sensitivity evaluations performed for this demonstration were the MDL and PQL. Both
vary depending on whether the matrix is a soil, waste, or aqueous solution. Only soils and sediments were tested
during this demonstration; therefore, the MDL calculations and PQL determinations for this evaluation are limited to
those matrices. By definition, values measured below the PQL should not be considered accurate or precise, and
values below the MDL are not distinguishable from background noise.

Method Detection Limit - The evaluation of an MDL requires seven replicate measurements of a low-concentration
standard or sample, following the procedures established in Title 40 of the Code of Federal Regulations (CFR) Part 136.
The MDL is estimated to be between 0.0053 and 0.042 mg/kg. The equivalent MDL for the referee laboratory is
0.0026 mg/kg.
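The 40 CFR Part 136 (Appendix B) procedure referenced above computes the MDL as the standard deviation of the seven replicate results multiplied by the one-tailed Student's t value at the 99% confidence level. A minimal sketch of that calculation; the replicate values below are hypothetical, not demonstration data:

```python
import statistics

# One-tailed Student's t at the 99% confidence level for 7 replicates
# (6 degrees of freedom), per 40 CFR Part 136, Appendix B.
T_99_7REPS = 3.143

def mdl(replicates):
    """Method detection limit from replicate low-level measurements (mg/kg)."""
    return T_99_7REPS * statistics.stdev(replicates)

# Hypothetical replicate results for a low-concentration standard (mg/kg)
reps = [0.051, 0.049, 0.055, 0.047, 0.053, 0.050, 0.052]
print(round(mdl(reps), 4))  # 0.0083
```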

Practical Quantitation Limit - The low-standard calculations using MDL values suggest that a PQL for the Ohio Lumex
field instrument may be as low as 0.027 mg/kg (5 times the lowest calculated MDL). The average Ohio Lumex result
for a tested sample with a referee laboratory value of 0.06 mg/kg was 0.072 mg/kg, a %D of 20%. This was the lowest
sample concentration tested during the demonstration that is close to, but not below, the calculated PQL noted above.
The referee laboratory PQL confirmed during the demonstration is 0.005 mg/kg, with a %D <10%.
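The PQL and percent-difference (%D) figures above follow directly from the MDL: the PQL is taken here as 5 times the lowest calculated MDL, and %D compares a field result to the referee laboratory value. A minimal sketch using the values quoted above:

```python
def pql(mdl_value):
    # PQL convention used in this evaluation: 5 times the MDL
    return 5 * mdl_value

def percent_difference(field_result, referee_result):
    # %D of a field result relative to the referee laboratory value
    return abs(field_result - referee_result) / referee_result * 100

print(pql(0.0053))                             # ~0.0265, reported as 0.027 mg/kg
print(round(percent_difference(0.072, 0.06)))  # 20
```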

Accuracy: The results from the RA-915+/RP-91C were compared to the 95% prediction intervals for the SRM materials
and to the referee laboratory results (Method 7471B). The Ohio Lumex data were within the SRM 95% prediction
intervals 93% of the time, which suggests equivalence to certified standards. The comparison between the Ohio Lumex
field data and the referee laboratory results suggests that the two data sets are not the same; a unified hypothesis
test (which accounts for laboratory bias) confirms this result. Ohio Lumex data were found to be both above and below
the referee laboratory concentrations; therefore, no bias is implied or suggested. The number of Ohio Lumex average
values less than 30% different from the referee laboratory results or SRM reference values was significant: 19 of 33
sample lots. Ohio Lumex results therefore provide accurate estimates for field determination. Because the Ohio Lumex
data compare favorably to the SRM values, the differences between Ohio Lumex and the referee laboratory are likely
the result of factors beyond the scope of this study.

Precision: The precision of the Ohio Lumex field instrument was better than the referee laboratory precision. The
overall average RSD was 22.3% for the referee laboratory, compared with an Ohio Lumex average RSD of 16.1%. This
is primarily because of the better precision obtained for the SRM analyses by Ohio Lumex. Both the laboratory and the
Ohio Lumex precision goals of 25% overall RSD were achieved.
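The relative standard deviation (RSD) figures above are the sample standard deviation expressed as a percentage of the mean for each set of replicate analyses. A minimal sketch; the replicate values below are hypothetical:

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation (%) of a set of replicate results."""
    return statistics.stdev(replicates) / statistics.fmean(replicates) * 100

# Hypothetical replicate mercury results for one sample lot (mg/kg)
replicates = [4.1, 4.6, 3.9, 4.4]
print(round(rsd_percent(replicates), 1))  # 7.3
```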

Measurement Time: From the time of sample receipt, Ohio Lumex required approximately 21 hours, 15 minutes, to
prepare a draft data package containing mercury results for 197 samples. One technician performed half of the
equipment setup and demobilization, most of the sample preparation, and all of the analyses. Individual analyses took
about 1 minute each, but because the vendor chose to analyze replicates of virtually every sample, the total time
averaged 8.1 minutes per sample (based on 1.25 analysts) when all field activities and data package preparation were
included in the calculation.
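The 8.1-minutes-per-sample figure is total effort spread across all samples: 21 hours 15 minutes of elapsed time, scaled by the 1.25 analysts involved, divided by 197 samples. A minimal sketch of that arithmetic, under the assumption that this is how the average was formed:

```python
elapsed_minutes = 21 * 60 + 15   # 21 hours, 15 minutes of elapsed effort
analysts = 1.25                  # one full-time technician plus part-time help
samples = 197

minutes_per_sample = elapsed_minutes * analysts / samples
print(round(minutes_per_sample, 1))  # 8.1
```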

Measurement Costs: The cost per analysis, based on 197 samples and including rental of the RA-915+/RP-91C, is
$23.44 per sample. Excluding the rental fee, the cost per analysis for the 197 samples is $15.82 per sample. Based on
a 3-day field demonstration, the total cost for equipment rental and necessary supplies is estimated at $4,617. The cost
by category is: capital costs, 32.5%; supplies, 10.8%; support equipment, 6.0%; labor, 19.5%; and IDW, 31.2%.
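The per-sample cost above is a simple quotient of the total demonstration cost over the 197 samples analyzed, and the quoted category shares should account for the whole cost. A minimal sketch checking both:

```python
total_cost = 4617.00   # equipment rental plus supplies, 3-day demonstration
samples = 197

print(round(total_cost / samples, 2))  # 23.44 (cost per sample, with rental)

# Category shares (%) quoted in the text
shares = {"capital": 32.5, "supplies": 10.8, "support equipment": 6.0,
          "labor": 19.5, "IDW": 31.2}
print(round(sum(shares.values()), 1))  # 100.0
```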

-------
Key demonstration findings are summarized below for the secondary objectives.

Ease of Use: Based on observations made during the demonstration, the RA-915+/RP-91C is reasonably easy to
operate; however, the lack of automation somewhat impairs ease of use. Operation requires one field technician with
a basic knowledge of chemistry, acquired on the job or at a university, and training on the instrument.

Potential Health  and Safety  Concerns:  No significant health  and  safety concerns  were  noted during  the
demonstration.  The only potential health and safety concerns identified were the generation of mercury vapors and  the
potential for burns with careless handling of hot quartz sample boats.  The vendor provides a mercury filter as standard
equipment; exercising caution and good laboratory practices can mitigate the potential for burns.

Portability: The RA-915+ air analyzer was easily portable, although the device, even when carried in the canvas sling,
was not considered lightweight. The addition of the RP-91C and the associated pump unit precludes this from being a
truly field-portable instrument. The device and attachments can be transported in carrying cases by two people but
must then be set up in a stationary location. It was easy to set up, but the combined instrument is better characterized
as mobile rather than field portable.

Durability: The RA-915+/RP-91C was well designed and constructed for durability. The outside of the RA-915+ is
constructed of sturdy aluminum, and the exterior of the RP-91C furnace is stainless steel.

Availability of the Device: The RA-915+/RP-91C is readily available for rental, lease, or purchase. Spare parts and
consumable supplies can be added to the original instrument order, or can be received within 24 to 48 hours of order
placement. Standards are readily available from laboratory supply firms or can be  acquired through Ohio Lumex.

PERFORMANCE SUMMARY

In summary, during the demonstration, the RA-915+/RP-91C exhibited the following desirable characteristics of a field
mercury measurement device: (1) good accuracy compared to SRMs, (2) good precision, (3) good sensitivity, (4) high
sample throughput, (5) low measurement costs, and (6) ease of use. During the demonstration, the RA-915+/RP-91C
was found to have the following limitations: (1) lack of automation and (2) limited portability due to instrument size and
weight. The demonstration findings collectively indicated that the RA-915+/RP-91C is a reliable field measurement
device for mercury in soil and sediment.
  NOTICE: EPA verifications are based on an evaluation of technology performance under specific, predetermined criteria and appropriate
  quality assurance procedures. The EPA makes no expressed or implied warranties as to the performance of the technology and does not
  certify that a technology will always operate as verified. The end user is solely responsible for complying with any and all applicable
  federal, state, and local requirements.
                                                    VI

-------
                                              Foreword
The U.S. Environmental Protection Agency (EPA) is charged by Congress with protecting the nation's natural resources.
Under the mandate of national environmental laws, the Agency strives to formulate and implement actions leading to a
compatible balance between human activities and the ability of natural systems to support and nurture life. To meet this
mandate, the EPA's Office of Research and Development provides data and scientific support that can be used to solve
environmental problems, build the scientific knowledge base needed to manage ecological resources wisely, understand
how pollutants affect public health, and prevent or reduce environmental risks.

The National Exposure Research Laboratory is the Agency's center for investigation of technical and management
approaches for identifying and quantifying risks to human health and the environment. Goals of the Laboratory's research
program are to (1) develop and evaluate methods and technologies for characterizing and monitoring air, soil, and water;
(2) support regulatory  and policy  decisions; and (3)  provide  the scientific  support  needed to ensure effective
implementation of environmental regulations and strategies.

The  EPA's Superfund Innovative  Technology Evaluation  (SITE)  Program  evaluates  technologies designed  for
characterization and remediation of contaminated Superfund and Resource Conservation and Recovery Act (RCRA) sites.
The SITE Program  was created to provide reliable cost and performance data in order to speed acceptance and use of
innovative remediation, characterization, and monitoring technologies by the regulatory and user community.

Effective monitoring and measurement technologies are needed to assess the degree of contamination at a site, provide
data that can be used to determine the risk to public health or the environment, and monitor the success or failure of a
remediation process. One component of the EPA SITE Program,  the Monitoring and Measurement Technology (MMT)
Program, demonstrates and evaluates innovative technologies to  meet these needs.

Candidate technologies can  originate within the federal government or the  private sector.  Through the SITE Program,
developers are given the opportunity to  conduct  a  rigorous demonstration of their  technologies under actual field
conditions. By completing the demonstration and distributing the results, the Agency establishes a baseline for acceptance
and use of these  technologies. The MMT  Program is managed  by  the  Office of Research and  Development's
Environmental Sciences Division in Las Vegas, NV.
                                                          Gary Foley, Ph. D.
                                                          Director
                                                          National Exposure Research Laboratory
                                                          Office of Research and Development
                                                    VII

-------
                                               Abstract


Ohio Lumex's RA-915+/RP-91C mercury analyzer was demonstrated under the U.S. Environmental Protection Agency
Superfund Innovative Technology Evaluation Program in May 2003 at the Oak Ridge National Laboratory (ORNL) in Oak
Ridge, TN. The purpose of the demonstration was to collect reliable performance and cost data for the RA-915+/RP-91C
and four other field measurement devices for mercury in soil and sediment. The key objectives of the demonstration were:
1) determine the sensitivity of each instrument with respect to a vendor-generated method detection limit (MDL) and
practical quantitation limit (PQL); 2) determine the analytical accuracy associated with vendor field measurements using
field samples and standard reference materials (SRMs); 3) evaluate the precision of vendor field measurements;
4) measure the time required to perform mercury measurements; and 5) estimate the costs associated with mercury
measurements for capital, labor, supplies, and investigation-derived wastes.

The demonstration also involved analysis of SRMs, field samples collected from four sites, and spiked field samples for
mercury. The performance results for a given field measurement device were compared to those of an off-site laboratory
using the reference method, "Test Methods for Evaluating Solid Waste" (SW-846) Method 7471B.

The sensitivity, accuracy, and precision measurements were successfully completed. Results of these measurement
evaluations suggest that the Ohio Lumex field instrument can perform as well as the laboratory analytical method.
Accuracy comparisons to standard reference materials showed statistical equivalence, but field sample analysis
suggested possible matrix interferences. Field instrument precision was better than laboratory precision, as determined
by relative standard deviation calculations. During the demonstration, Ohio Lumex required 21.25 hours (1,275 minutes)
for analysis of 197 samples. The cost per analysis, based on measurement of 197 samples and incurring a minimum
1-month rental fee for the RA-915+/RP-91C, was determined to be $23.44 per sample. Excluding the instrument rental
cost, the cost for analyzing the 197 samples was determined to be $15.82 per sample. Based on the 3-day field
demonstration, the total cost for equipment rental and necessary supplies was estimated at $4,617.

The RA-915+/RP-91C exhibited good ease of use and durability, and no major health and safety concerns were
identified. However, the device's portability is somewhat limited by its size. The device is readily available for purchase
or lease. The demonstration findings collectively indicated that the RA-915+/RP-91C is a reliable, field-mobile
measurement device for mercury in soil and sediment.
                                                    VIII

-------
                                               Contents
Notice	  ii
Verification Statement  	iii
Foreword  	  vii
Abstract  	viii
Contents	ix
Tables	  xii
Figures 	xiii
Abbreviations, Acronyms, and Symbols	xiv
Acknowledgments  	xvi

Chapter                                                                                               Page

1      Introduction 	  1
       1.1     Description of the SITE Program 	  1
       1.2     Scope of the Demonstration	2
               1.2.1   Phase I 	 2
               1.2.2   Phase II	 2
       1.3     Mercury Chemistry and Analysis 	 3
               1.3.1   Mercury Chemistry  	 3
               1.3.2   Mercury Analysis	4

2      Technology Description	6
       2.1     Description of Atomic Absorption Spectroscopy  	 6
       2.2     Description of the RA-915+/RP-91C  	 6
       2.3     Developer Contact Information  	 8

3      Field Sample Collection Locations and  Demonstration Site 	 9
       3.1     Carson River  	  10
               3.1.1   Site Description	  10
               3.1.2   Sample Collection	  10
       3.2     Y-12 National Security Complex 	  11
               3.2.1   Site Description	  11
               3.2.2   Sample Collection	  11
       3.3     Confidential Manufacturing  Site	  11
               3.3.1   Site Description	  11
                                                     IX

-------
                                      Contents (Continued)
Chapter
              3.3.2    Sample Collection	  12
       3.4    Puget Sound	  12
              3.4.1    Site Description	  12
              3.4.2    Sample Collection	  12
       3.5    Demonstration Site	  13
       3.6    SAIC GeoMechanics Laboratory 	  14

4      Demonstration Approach	  15
       4.1    Demonstration Objectives  	  15
       4.2    Demonstration Design 	  16
              4.2.1    Approach for Addressing Primary Objectives	  16
              4.2.2    Approach for Addressing Secondary Objectives	20
       4.3    Sample Preparation and Management  	21
              4.3.1    Sample Preparation	21
              4.3.2    Sample Management  	  24
       4.4    Reference Method Confirmatory Process  	25
              4.4.1    Reference Method  Selection	  25
              4.4.2    Referee Laboratory Selection 	25
              4.4.3    Summary of Analytical Methods  	27
       4.5    Deviations from the Demonstration Plan  	28

5      Assessment of Laboratory Quality Control Measurements	  29
       5.1    Laboratory QA Summary	29
       5.2    Data Quality Indicators for Mercury Analysis  	29
       5.3    Conclusions and  Data Quality Limitations  	  30
       5.4    Audit Findings	  32

6      Performance of the RA-915+/RP-91C	  33
       6.1    Primary Objectives	  33
              6.1.1    Sensitivity  	  33
              6.1.2    Accuracy	  35
              6.1.3    Precision 	43
              6.1.4    Time Required for Mercury Measurement 	46
              6.1.5    Cost	48
       6.2    Secondary Objectives  	48
              6.2.1    Ease of Use	48
              6.2.2    Health and Safety Concerns	  51
              6.2.3    Portability of the Device	  52
              6.2.4    Instrument Durability	  53
              6.2.5    Availability of Vendor Instruments and Supplies	  53

7      Economic Analysis   	  54
       7.1    Issues and  Assumptions  	  54
              7.1.1    Capital Equipment Cost	  54
              7.1.2    Cost of Supplies  	  54

-------
                                     Contents  (Continued)
Chapter                                                                                          Page

              7.1.3   Support Equipment Cost	 55
              7.1.4   Labor Cost	 55
              7.1.5   Investigation-Derived Waste Disposal Cost 	 55
              7.1.6   Costs Not Included  	 56
       7.2     RA-915+/RP-91C Costs	 56
              7.2.1   Capital Equipment Cost	 57
              7.2.2   Cost of Supplies  	 57
              7.2.3   Support Equipment Cost	 57
              7.2.4   Labor Cost	 58
              7.2.5   Investigation-Derived Waste Disposal Cost 	 58
              7.2.6   Summary of RA-915+/RP-91C Costs  	 58
       7.3     Typical Reference Method Costs	 59

8      Summary of Demonstration Results	61
       8.1     Primary Objectives	 61
       8.2     Secondary Objectives 	62

9      Bibliography	 65

Appendix A -   Ohio Lumex Comments  	66
Appendix B -   Statistical Analysis  	 67
                                                  XI

-------
                                               Tables
Table                                                                                             Page

1-1     Physical and Chemical Properties of Mercury	4
1-2     Methods for Mercury Analysis in Solids or Aqueous Soil Extracts  	  5
3-1     Summary of Site Characteristics	  10
4-1     Demonstration Objectives	  15
4-2     Summary of Secondary Objective Observations Recorded During the Demonstration  	  20
4-3     Field Samples Collected from the Four Sites  	  22
4-4     Analytical Methods for Non-Critical Parameters 	28
5-1     MS/MSD Summary	  30
5-2     LCS Summary	  30
5-3     Precision Summary	  31
5-4     Low Check Standards	  31
6-1     Distribution of Samples Prepared for Ohio Lumex and the Referee Laboratory	  33
6-2     Ohio Lumex SRM Comparison  	  37
6-3     ALSI SRM Comparison	  37
6-4     Accuracy Evaluation by Hypothesis Testing  	  38
6-5     Number of Sample Lots Within  Each %D Range 	40
6-6     Concentration of Non-Target Analytes 	40
6-7     Evaluation of Precision 	44
6-8     Time Measurements for Ohio Lumex  	47
7-1     Capital Cost Summary for the RA-915+/RP-91C 	  57
7-2     Labor Costs	  58
7-3     IDW Costs	  58
7-4     Summary of Rental Costs for the RA-915+/RP-91C	  59
7-5     RA-915+/RP-91C Costs by Category  	  59
8-1     Distribution of Samples Prepared for Ohio Lumex and the Referee Laboratory	62
8-2     Summary of RA-915+/RP-91C Results for the Primary Objectives  	63
8-3     Summary of RA-915+/RP-91C Results for the Secondary Objectives  	  64
B-1     Unified Hypothesis Test Summary Information	69
                                                  XII

-------
                                               Figures
2-1     RA-915+ instrument schematic	 7
2-2     RA-915+/RP-91C shown set up in a van	 7
3-1     Tent and field conditions during the demonstration at Oak Ridge, TN	 13
3-2     Demonstration site and Building 5507	 13
4-1     Test sample preparation at the SAIC GeoMechanics Laboratory	23
6-1     Data plot for low concentration sample results 	41
6-2     Data plot for high concentration sample results 	42
6-3     RA-915+/RP-91C peak screen	 51
7-1     Capital equipment costs	 57
                                                   XIII

-------
                         Abbreviations, Acronyms, and Symbols
%             Percent
%D           Percent difference
°C            Degrees Celsius
µg/kg         Microgram per kilogram
AAS          Atomic absorption spectroscopy
ALSI          Analytical Laboratory Services, Inc.
bgs           Below ground surface
cm            Centimeter
CFR          Code of Federal Regulations
CI            Confidence interval
COC          Chain of custody
DI            Deionized (water)
DOE          Department of Energy
EPA          United States Environmental Protection Agency
g             Gram
H&S          Health and Safety
Hg            Mercury
HgCl2         Mercury(II) chloride
IDL           Instrument detection limit
IDW          Investigation-derived waste
ITVR          Innovative Technology Verification Report
kg            Kilogram
L             Liter
LCS          Laboratory control sample
LEFPC        Lower East Fork Poplar Creek
m             Meter
MDL          Method detection limit
mg            Milligram
mg/kg         Milligram per kilogram
mL            Milliliter
mm           Millimeter
MS/MSD       Matrix spike/matrix spike  duplicate
MMT          Monitoring and Measurement Technology
NERL         National  Exposure Research Laboratory
NiMH          Nickel metal hydride
ng            Nanogram
nm            Nanometer
ORD          Office of Research and Development
ORNL         Oak Ridge National Laboratory
                                                 XIV

-------
                Abbreviations, Acronyms, and Symbols (Continued)
ORR          Oak Ridge Reservation
OSWER       Office of Solid Waste and Emergency Response
PPE           Personal protective equipment
ppm           Parts per million
PQL           Practical quantitation limit
QA           Quality assurance
QAPP         Quality Assurance Project Plan
QC           Quality control
RPD           Relative percent difference
RSD           Relative standard deviation
SAIC          Science Applications International Corporation
SITE          Superfund Innovative Technology Evaluation
SOP           Standard operating procedure
SRM           Standard reference material
SW-846       Test Methods for Evaluating Solid Waste; Physical/Chemical Methods
TOC           Total Organic Carbon
TOM           Task Order Manager
UL            Underwriters Laboratories
UEFPC        Upper East Fork of Poplar Creek
Y-12          Y-12 National Security Complex, Oak Ridge, TN
                                                xv

-------
                                       Acknowledgments
The U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation Program wishes to
acknowledge the support of the following individuals in performing the demonstration and preparing this document:
Elizabeth Phillips of the U.S. Department of Energy Oak Ridge National Laboratory (ORNL); Stephen Childs, Thomas
Early, Roger Jenkins, and Monty Ross of UT-Battelle ORNL; Dale Rector of the Tennessee Department of Environment
and Conservation (TDEC) Department of Energy Oversight; Sergey Pogarev and Joseph Siperstein of Ohio Lumex;
Leroy Lewis of the Idaho National Engineering and Environmental Laboratory (retired); Ishwar Murarka, member of the
EPA Science Advisory Board; Danny Reible of Louisiana State University; Mike Bolen, Joseph Evans, Julia Gartseff,
Sara Hartwell, Cathleen Hubbard, Kevin Jago, Andrew Matuson, Allen Motley, John Nicklas, Maurice Owens, Nancy
Patti, Fernando Padilla, Mark Pruitt, James Rawe, Herb Skovronek, and Joseph Tillman of Science Applications
International Corporation (SAIC); Scott Jacobs and Ann Vega of the EPA National Risk Management Research
Laboratory's Land Remediation and Pollution Control Division; and Brian Schumacher of the EPA National Exposure
Research Laboratory.

This document was QA reviewed by  George Brilis of the  EPA National Exposure Research Laboratory.

-------
                                              Chapter  1
                                            Introduction
The U.S. Environmental Protection Agency (EPA) under
the Office of Research and Development (ORD), National
Exposure  Research  Laboratory (NERL),  conducted a
demonstration to evaluate the performance of innovative
field  measurement devices  for their ability to measure
mercury concentrations in soils and  sediments.   This
Innovative Technology Verification Report (ITVR) presents
demonstration performance results and associated costs
of Ohio Lumex's Mercury Analyzer (RA-915+) with its soil
attachment (RP-91C). The vendor-prepared comments
regarding the demonstration are presented in Appendix A.

The demonstration was conducted as part of the EPA
Superfund Innovative Technology Evaluation (SITE)
Monitoring and Measurement Technology (MMT) Program.
Mercury-contaminated soils and sediments, collected from
four sites within the continental U.S., comprised the
majority of samples analyzed during the evaluation. Some
soil and sediment samples were spiked with mercury (II)
chloride (HgCl2) to provide concentrations not occurring in
the field samples. Certified standard reference material
(SRM) samples were also used to provide samples with
certified mercury concentrations and to increase the matrix
variety.

The demonstration was conducted at the Department of
Energy (DOE) Oak Ridge National Laboratory (ORNL) in
Oak Ridge, TN during the week of May 5, 2003. The
purpose of the demonstration was to obtain reliable
performance and cost data for field measurement devices
in order to 1) provide potential users with a better
understanding of the devices' performance and operating
costs under well-defined field conditions and 2) provide the
instrument vendors with documented results that can assist
them in promoting acceptance and use of their devices.
The results obtained using the five field mercury
measurement devices were compared to the mercury
results obtained for identical sample sets (samples, spiked
samples, and SRMs) analyzed at a referee laboratory. The
referee laboratory, which was selected prior to the
demonstration, used a well-established EPA reference
method.


1.1    Description of the SITE Program
Performance  verification  of  innovative  environmental
technologies is an integral part of  the regulatory and
research  mission  of the EPA.  The SITE Program was
established by EPA's Office of Solid Waste and Emergency
Response (OSWER) and ORD under the Superfund
Amendments and Reauthorization Act of 1986.

The  overall goal  of  the  SITE Program  is  to  conduct
performance verification  studies and  to  promote  the
acceptance of innovative technologies that may be used to
achieve long-term protection of human health  and  the
environment. The program is designed to meet three main
objectives: 1) identify and remove obstacles to the
development   and  commercial   use  of  innovative
technologies;   2)  demonstrate  promising  innovative
technologies and  gather reliable performance and cost
information to support site characterization and  cleanup
activities; and  3)  develop procedures and policies that
encourage the use of innovative technologies at Superfund
sites, as well  as  at other  waste  sites or commercial
facilities.

The SITE Program includes the following elements:

    The MMT  Program evaluates innovative technologies
    that sample, detect, monitor, or measure hazardous
    and toxic  substances  in soil,  water, and sediment
    samples. These technologies are expected to provide
    better, faster, or  more cost-effective methods for

-------
    producing real-time data during site characterization
    and   remediation  studies   than  conventional
    technologies.

    The  Remediation  Technology  Program  conducts
    demonstrations of innovative treatment technologies to
    provide reliable performance,  cost, and applicability
    data for site cleanups.

    The  Technology  Transfer  Program  provides  and
    disseminates  technical information in the form  of
    updates,  brochures,   and  other  publications  that
    promote  the   SITE   Program  and  participating
    technologies.  The Technology Transfer Program also
    offers technical assistance, training, and workshops in
    the support of the technologies. A significant number
    of these activities are performed by EPA's Technology
    Innovation Office.

The  Field Analysis  of Mercury  in Soils and  Sediments
demonstration was performed under the MMT Program.
The  MMT Program provides developers of innovative
hazardous waste  sampling, detection,  monitoring,  and
measurement devices  with an  opportunity to demonstrate
the performance  of their  devices under  actual  field
conditions. The main objectives  of the MMT Program are
as follows:

    Test  and verify the performance of innovative  field
    sampling and  analytical technologies that  enhance
    sampling,  monitoring,   and  site  characterization
    capabilities.

    Identify   performance   attributes   of  innovative
    technologies that address field sampling, monitoring,
    and characterization problems in a cost-effective and
    efficient manner.

    Prepare  protocols, guidelines, methods, and  other
    technical  publications  that  enhance acceptance  of
    these technologies for routine use.

The MMT Program is administered by the Environmental
Sciences Division of the NERL  in Las  Vegas, NV.  The
NERL is the  EPA center for investigation of technical and
management approaches for identifying and  quantifying
risks to human health and the environment.  The NERL
mission components include 1) developing and evaluating
methods  and technologies for sampling, monitoring, and
characterizing water, air, soil, and sediment; 2) supporting
regulatory and policy decisions; and 3) providing technical
support  to  ensure  the   effective  implementation  of
environmental regulations and strategies.
1.2    Scope of the Demonstration

The  demonstration project consisted of two separate
phases:  Phase  I  involved  obtaining   information  on
prospective  vendors  having  viable  mercury detection
instrumentation. Phase II consisted of field and planning
activities  leading up to and including the demonstration
activities. The following subsections provide detail on both
of these project phases.

1.2.1  Phase I
Phase   I   was   initiated  by  making   contact  with
knowledgeable sources on the subject of "mercury in soil"
detection devices.  Contacts  included individuals within
EPA,  Science  Applications  International  Corporation
(SAIC), and industry where measurement of mercury in soil
was known to be conducted.  Industry contacts included
laboratories and private developers of mercury detection
instrumentation. In addition, the EPA Task Order Manager
(TOM)  provided contacts for "industry players" who had
participated in previous MMT demonstrations.  SAIC also
investigated university and other research-type contacts for
knowledgeable sources within the subject area.

These contacts led to additional knowledgeable sources on
the subject, which in turn led to various Internet searches.
The  Internet searches were  very successful in  finding
additional  companies  involved  with  mercury detection
devices.

In all, these research activities generated an initial list
of approximately 30 companies potentially involved in the
measurement of mercury in soils. The list included both
international and  U.S.  companies.    Each  of  these
companies was contacted by phone or email  to acquire
further information. The contacts resulted in 10 companies
that appeared to have  viable technologies.

Due to instrument  design (i.e., the instrument's ability to
measure   mercury  in  soils  and sediments), business
strategies, and stage of technology development, only 5 of
those 10 vendors participated in the field demonstration
portion of phase II.

1.2.2  Phase II
Phase  II of the demonstration project involved strategic
planning, field-related activities for the demonstration, data
analysis, data interpretation, and preparation of the ITVRs.
Phase  II included pre-demonstration and demonstration
activities, as described in the following subsections.

-------
1.2.2.1 Pre-Demonstration Activities

The pre-demonstration activities were completed in the fall
of 2002. There were six objectives for the pre-demonstration:

    Establish  concentration ranges for testing  vendors'
    analytical equipment during the demonstration.

    Collect soil and sediment field samples to be used in
    the demonstration.

    Evaluate sample homogenization procedures.

    Determine mercury  concentrations in homogenized
    soils and sediments.

    Select a reference method and qualify potential referee
    laboratories for the demonstration.

    Provide soil and sediment samples to the vendors for
    self-evaluation of their instruments, as a precursor to
    the demonstration.

As an integral part of meeting these objectives, a pre-
demonstration  sampling  event was  conducted   in
September 2002  to collect field samples of soils and
sediments containing different levels of mercury.  The field
samples were obtained from the following locations:

    Carson River Mercury site - near Dayton, NV

    Y-12 National Security Complex - Oak Ridge, TN

    A confidential manufacturing facility - eastern U.S.

    Puget  Sound - Bellingham  Bay, WA

Immediately after collecting field sample material from the
sites noted above, the  general mercury concentrations in
the  soils   and  sediments  were confirmed  by  quick
turnaround   laboratory  analysis  of  field-collected
subsamples using method SW-7471B. The field sample
materials were then shipped to a soil preparation laboratory
for homogenization. Additional pre-demonstration activities
are detailed in Chapter 4.

1.2.2.2 Demonstration  Activities

Specific objectives for  this  SITE  demonstration were
developed  and  defined  in a  Field  Demonstration and
Quality Assurance  Project Plan  (QAPP) (EPA Report #
EPA/600/R-03/053). The Field Demonstration  QAPP is
available   through   the   EPA   ORD  web   site
(http://www.epa.gov/ORD/SITE) or from the EPA Project
Manager.  The demonstration objectives were subdivided
into two categories:  primary and secondary.  Primary
objectives are goals of the demonstration study that need
to   be  achieved  for  technology  verification.    The
measurements  used  to  achieve  primary objectives are
referred to as critical.  These measurements  typically
produce quantitative  results  that can  be verified using
inferential and descriptive statistics.

Secondary  objectives   are   additional  goals   of  the
demonstration  study developed  for  acquiring   other
information of interest about  the technology that is not
directly related  to verifying the primary objectives.  The
measurements required for achieving secondary objectives
are considered to be noncritical. Therefore, the analysis of
secondary objectives  is typically more qualitative in nature
and often  uses  observations  and sometimes descriptive
statistics.

The field portion of the demonstration involved evaluating
the capabilities  of five mercury-analyzing instruments  to
measure  mercury concentrations  in soil and sediment.
During the demonstration, each instrument vendor received
three types of samples: 1) homogenized field samples
referred to as "field samples," 2) certified SRMs, and 3)
spiked field samples (spikes).

Spikes were prepared by adding known quantities of HgCl2
to field samples. Together, the field samples, SRMs, and
spikes are referred to as  "demonstration samples"  for the
purpose of this ITVR. All  demonstration  samples were
independently analyzed  by a  carefully selected referee
laboratory. The  experimental design for the demonstration
is detailed in Chapter 4.
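As an aside, the target concentration of a spike follows from simple stoichiometry. The sketch below is purely illustrative (the HgCl2 mass, soil mass, and function name are hypothetical, not values from the demonstration): it converts a known mass of HgCl2 added to a soil aliquot into the added mercury concentration in mg/kg.

```python
# Illustrative spike arithmetic (hypothetical values, not demonstration data):
# convert a known mass of HgCl2 added to a soil aliquot into the added
# mercury concentration in mg/kg.
M_HG = 200.59      # molar mass of Hg, g/mol
M_HGCL2 = 271.50   # molar mass of HgCl2, g/mol

def spike_concentration_mg_kg(mg_hgcl2: float, kg_soil: float) -> float:
    """Hg concentration (mg/kg) contributed by a known mass of HgCl2."""
    mg_hg = mg_hgcl2 * (M_HG / M_HGCL2)  # mercury fraction of the salt
    return mg_hg / kg_soil

# For example, 13.5 mg of HgCl2 mixed into a 0.100-kg soil aliquot:
print(round(spike_concentration_mg_kg(13.5, 0.100), 1))  # ~99.7 mg/kg Hg
```

Note that this gives only the mercury added; the total concentration in a spiked field sample also includes the native mercury already present.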

1.3    Mercury Chemistry and Analysis

1.3.1 Mercury Chemistry
Elemental mercury is the only metal that occurs as a liquid
at ambient temperatures. Mercury occurs naturally,
primarily as mercury sulfide (HgS) in the ore cinnabar.
Mercury easily forms  amalgams with many other metals,
including gold. As a result, mercury has historically been
used to recover gold from ores.

Mercury is ionically stable; however, it is very volatile for a
metal. Table  1-1  lists selected  physical and chemical
properties of elemental mercury.

-------
 Table 1-1. Physical and Chemical Properties of Mercury

 Properties                 Data
 Appearance                 Silver-white, mobile liquid
 Hardness                   Liquid
 Abundance                  0.5% in Earth's crust
 Density @ 25 °C            13.53 g/mL
 Vapor Pressure @ 25 °C     0.002 mm
 Volatilizes @              356 °C
 Solidifies @               -39 °C

 Source: Merck Index, 1983
Historically, mercury releases to the environment included
a number  of  industrial processes  such as  chloralkali
manufacturing, copper and zinc smelting operations, paint
application,  waste oil  combustion,  geothermal  energy
plants, municipal  waste incineration, ink manufacturing,
chemical manufacturing,  paper mills,  leather tanning,
pharmaceutical production, and textile manufacturing. In
addition, industrial  and  domestic  mercury-containing
products, such as thermometers, electrical switches, and
batteries, are disposed of as solid wastes in landfills (EPA,
July 1995). Mercury is also naturally present at many
abandoned mining sites and is, of course, found as a
natural ore.
At mercury-contaminated sites, mercury exists in mercuric
form (Hg²⁺), mercurous form (Hg₂²⁺), elemental form (Hg⁰),
and alkylated form (e.g., methyl or ethyl mercury). Hg₂²⁺
and Hg²⁺ are the more stable forms under oxidizing
conditions. Under mildly reducing conditions, both
organically bound mercury and inorganic mercury may be
degraded to elemental mercury, which can then be
converted readily to methyl or ethyl mercury by biotic and
abiotic processes. Methyl and ethyl mercury are the most
toxic forms of mercury; the alkylated mercury compounds
are volatile and soluble in water.

Mercury (II) forms relatively strong complexes with Cl⁻ and
CO₃²⁻. Mercury (II) also forms complexes with inorganic
ligands such as fluoride (F⁻), bromide (Br⁻), iodide (I⁻),
sulfate (SO₄²⁻), sulfide (S²⁻), and phosphate (PO₄³⁻) and
forms strong complexes with organic ligands, such as
sulfhydryl groups, amino acids, and humic and fulvic acids.
The insoluble HgS is formed under mildly reducing
conditions.

1.3.2 Mercury Analysis
There are several laboratory-based, EPA-promulgated
methods for the analysis of mercury in solid and liquid
hazardous waste matrices.  In addition, there are several
performance-based methods  for the determination of
various mercury  species.  Table 1-2 summarizes the
commonly used  methods for measuring mercury in both
solid and liquid matrices, as identified  through a review of
the EPA Test Method Index  and SW-846. A discussion of
the choice of reference method is presented in  Chapter 4.

-------
Table 1-2. Methods for Mercury Analysis in Solids or Aqueous Soil Extracts

 Method      Analytical      Type(s) of Mercury    Approximate            Comments
             Technology      Analyzed              Concentration Range
 SW-7471B    CVAAS           inorganic mercury,    10 - 2,000 ppb         Manual cold vapor technique widely
                             organo-mercury                               used for total mercury determinations
 SW-7472     ASV             inorganic mercury,    0.1 - 10,000 ppb       Newer, less widely accepted method
                             organo-mercury
 SW-7473     TD,             inorganic mercury,    0.2 - 400 ppb          Allows for total decomposition analysis
             amalgamation,   organo-mercury
             and AAS
 SW-7474     AFS             inorganic mercury,    1 ppb - ppm            Allows for total decomposition analysis;
                             organo-mercury                               less widely used/referenced
 EPA 1631    CVAFS           inorganic mercury,    0.5 - 100 ppt          Requires "trace" analysis procedures;
                             organo-mercury                               written for aqueous matrices; Appendix
                                                                          A of method written for sediment/soil
                                                                          samples
 EPA 245.7   CVAFS           inorganic mercury,    0.5 - 200 ppt          Requires "trace" analysis procedures;
                             organo-mercury                               written for aqueous matrices; will
                                                                          require dilutions of high-concentration
                                                                          mercury samples
 EPA 6200    FPXRF           inorganic mercury     >30 mg/kg              Considered a screening protocol

AAS = Atomic Absorption Spectrometry
AFS = Atomic Fluorescence Spectrometry
ASV = Anodic Stripping Voltammetry
CVAAS = Cold Vapor Atomic Absorption Spectrometry
CVAFS = Cold Vapor Atomic Fluorescence Spectrometry
FPXRF = Field Portable X-ray Fluorescence
EPA = U.S. Environmental Protection Agency
mg/kg = milligram per kilogram
ppb = parts per billion
ppm = parts per million
ppt = parts per trillion
SW = solid waste
TD = thermal decomposition

-------
                                              Chapter 2
                                    Technology Description
This chapter provides a detailed description of the thermal
decomposition method of atomic absorption spectroscopy
(AAS), which  is the type of technology on which Ohio
Lumex's instrument is based, and a detailed description of
the RA-915+  Mercury Analyzer  with the RP-91C soil
attachment.

2.1    Description   of  Atomic   Absorption
       Spectroscopy
The principle of analysis used by the RA-915+ and RP-91C
is thermal decomposition followed by AAS, with a 10-meter
(m) multi-path optical cell and Zeeman background
correction. AAS uses the absorption of light to measure
the concentration of gas-phase atoms. Because samples
are liquids or solids, the analyte atoms or ions must be
vaporized  in  a flame or graphite furnace.  The atoms
absorb ultraviolet or visible light and  make transitions  to
higher electronic energy levels. The analyte concentration
is   determined  from   the  amount  of  absorption.
Concentration measurements are  determined from a
working curve  after  calibrating  the  instrument  with
standards of known concentration.
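The working-curve step described above is, in essence, a linear regression of absorbance against standard concentration that is then inverted for unknown samples. A minimal sketch, using made-up calibration data rather than real instrument readings:

```python
# Working-curve sketch with made-up calibration data (not demonstration
# results): fit absorbance vs. known standard concentrations by ordinary
# least squares, then invert the fit to quantify an unknown sample.
def fit_line(x, y):
    """Least-squares slope and intercept for a linear working curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 5.0, 10.0]             # standard concentrations, mg/kg
absorb = [0.001, 0.052, 0.103, 0.255, 0.510]  # measured absorbances
m, b = fit_line(conc, absorb)

unknown_abs = 0.204                           # absorbance of an unknown sample
print(round((unknown_abs - b) / m, 2))        # concentration, roughly 4 mg/kg
```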

In general analytical applications of AAS, thermal
decomposition is followed by atomic absorption; however,
the mechanism used to recover the analyte for
measurement may vary. Examples include cold vapor traps,
amalgamation desorption, and direct detection.

A sample of known mass is placed in the drying and
decomposition furnace and heated to between 600 and
800 °C. The liquid or solid sample is dried and
organic materials are decomposed. The amount  of light
absorbed by an analyte (the product of decomposition), in
this case mercury vapor,  is compared to a standard  to
quantify the mass of that analyte present in a sample of
known size.  The absorption of light is proportional to the
mass  of the analyte present.  The wavelength of the light
source is specific to the analyte of interest. For mercury,
the wavelength is 254 nm.

2.2     Description of the RA-915+/RP-91C
The  RA-915+  Mercury Analyzer is a portable atomic
absorption (AA) spectrometer with a 10-m multipath optical
cell and Zeeman background correction.   Among  its
features  is  the  direct detection  of  mercury without
preliminary accumulation on a gold trap. The RA-915+
includes a built-in test cell for field performance verification.
The unit can be used with the optional RP-91 attachment
for an ultra-low mercury detection limit in water samples
using the "cold vapor" technique. For direct mercury
determination in complex matrices without sample
pretreatment, including liquids, soils, and sediments, the
instrument is operated with the RP-91C accessory.

The operation of the RA-915+ is based on the principle of
differential Zeeman AA spectrometry combined with high-
frequency modulation of polarized light. This combination
eliminates interferences and provides the highest
sensitivity. A mercury lamp is placed in a permanent
magnetic field in which the 254-nm resonance line is split
into three polarized components, two of which are circularly
polarized in opposite directions. These two components
(σ⁻ and σ⁺) pass through a polarization modulator, while
the third component (π) is removed (see Figure 2-1). One σ
component passes through the absorption cell; the other σ
component passes outside of the absorption cell. In the
absence of mercury vapors, the intensities of the two σ
components are equal. When mercury vapor is present in
the absorption cell, mercury atoms cause a proportional,

-------
concentration-related difference in the intensity of the σ
components. This difference in intensity is what the
instrument measures.
(Schematic: Zeeman mercury triplet from the lamp, polarization modulator,
multipath absorption cell, and photodetector.)
Figure 2-1. RA-915+ instrument schematic.
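Conceptually, the differential measurement reduces to comparing the intensity of the σ component that traverses the absorption cell against the reference σ component that bypasses it; assuming ideal Beer-Lambert behavior, the log-ratio of the two gives a background-corrected absorbance. The sketch below illustrates this idea only and is not the vendor's actual signal-processing algorithm:

```python
# Conceptual differential Zeeman detection (illustrative, not the vendor's
# algorithm): one σ component is attenuated by mercury in the absorption
# cell, the other serves as a reference; the log-ratio is the absorbance.
import math

def differential_absorbance(i_reference: float, i_cell: float) -> float:
    """Background-corrected absorbance from the two σ-component intensities."""
    return math.log10(i_reference / i_cell)

# No mercury in the cell: both σ components arrive with equal intensity.
print(differential_absorbance(1.00, 1.00))            # 0.0
# Mercury present: the cell-path component is attenuated.
print(round(differential_absorbance(1.00, 0.50), 3))  # 0.301
```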
The  RP-91C attachment is intended to decompose  a
sample  and to  reduce the  mercury using the pyrolysis
technique. The RP-91C attachment is a furnace heated to
800 °C where mercury is converted from a bound state to
the atomic state by thermal decomposition, and reduced in
a two-section furnace. In the first section of the furnace,
the "light" mercury compounds are preheated and burned.
In the second section, a catalytic afterburner decomposes
"heavy"  compounds.  After the atomizer, the gas flow
enters the analytical cell of the attachment. Ambient air is
used as a carrier gas; no cylinders of compressed gases
are required. Zeeman correction eliminates interferences;
thus, no gold amalgamation is required. The instrument is
controlled and the data are acquired by software based on
a Microsoft Windows® platform.

Applications and Specifications -  The RA-915+ is  a
portable  spectrometer  designed   for  interference-free
analysis/monitoring  of mercury content in ambient  air,
water, soil,  natural and  stack gases  from  chlor-alkali
manufacturing, spill response, hazardous waste, foodstuff,
and biological materials. The Ohio Lumex system is fully
operational in the field and can be set up in a van,
helicopter, or marine vessel, or hand-carried for
continuous measurements. The RP-91 and RP-91C
attachments are used to convert the instrument into a liquid
or solid sample analyzer, respectively. The instrument is
suitable for field operation using a built-in battery.

According to the RA-915+ Analyzer manual, the base unit
has dimensions of 47 cm by 22 cm by 11 cm and weighs
7.57 kg. The palm unit measures 13.5 cm by 8 cm by 2 cm
and weighs 0.32 kg. The power supply can be a built-in, 6-
volt rechargeable battery, a power pack adapter, an
external electric battery, or an optional rechargeable
battery pack. The RP-91C system includes a pumping unit
that has dimensions of 34 cm by 24 cm by 12 cm and a
power supply unit measuring 14.5 cm by 15 cm by 8.5 cm
(see Figure 2-2). Site requirements cited in the manual
include a temperature range of 5 to 40 °C, relative humidity
of up to 98%, atmospheric pressures  of 84  to  106.7
kilopascals, along with requirements for sinusoidal vibration
and magnetic field tension. Sensitivity of the instrument is
reportedly not  affected by  up  to a  95%  background
absorption  caused by  interfering  components   (dust,
moisture, organic and inorganic gases).
Figure 2-2. RA-915+/RP-91C shown set up in a van.
Operation - The instrument calibration is performed using
liquid or solid, primary National Institute of Standards
and Technology (NIST)-traceable standards. The normal
dynamic analytical range is 1-100 µg/kg by direct
determination without dilution. No sample mineralization is
needed, and the only waste generated is minimal residual
sample, excess sample, and any personal protective
equipment that may be used. Sample throughput is up to
30 samples per hour without an autosampler.

-------
2.3    Developer Contact Information

Additional information about the RA-915+ and RP-91C can
be obtained from the following source:

Joseph Siperstein
Ohio Lumex Co.
9263 Ravenna Rd., Unit A-3
Twinsburg, OH 44087
Toll free: (888) 876-2611
Telephone: (330) 405-0837
Fax: (330) 405-0847
Email: mail@ohiolumex.com
Internet: www.ohiolumex.com
-------
                                              Chapter 3
             Field Sample Collection Locations and Demonstration Site
As previously described in Chapter 1, the demonstration in
part tested the ability of all five vendor  instruments to
measure  mercury  concentrations  in  demonstration
samples. The  demonstration samples consisted of field-
collected samples, spiked field samples, and SRMs. The
field-collected  samples   comprised  the  majority  of
demonstration  samples.  This chapter describes the four
sites from which the field  samples were collected, the
demonstration  site, and the  sample homogenization
laboratory. Spiked samples were prepared from these field
samples.

Screening of potential mercury-contaminated field sample
sites was conducted during Phase  I of the project.  Four
sites were selected for acquiring mercury-contaminated
samples that were diverse in appearance, consistency, and
mercury concentration. A key criterion was the source of
the contamination.  These sites  included:

    Carson River Mercury site -  near Dayton, NV

    The Y-12  National Security Complex (Y-12) - Oak
    Ridge, TN

    A confidential manufacturing facility - eastern U.S.

    Puget Sound - Bellingham Bay, WA

Site  Diversity  - Collectively,  the  four  sites provided
sampling areas  with both  soil and  sediment,  having
variable physical consistencies and variable  ranges of
mercury contamination.   Two of the sites (Carson River
and Oak Ridge) provided  both soil and sediment samples.
A third site (a manufacturing facility) provided just  soil
samples and a fourth site  (Puget Sound) provided only
sediment samples.

Access and Cooperation - Site  representatives were
instrumental in providing  site access, and in some cases,
guidance on  the  best  areas to  collect  samples from
relatively high and low mercury concentrations. In addition,
representatives from the host demonstration site (ORNL)
provided a facility for conducting the demonstration.

At three of the sites, the soil and/or sediment sample was
collected,  homogenized  by  hand  in  the  field,  and
subsampled  for  quick  turnaround  analysis.    These
subsamples  were sent  to  analytical  laboratories  to
determine the general range of mercury concentrations at
each of the sites.  (The  Puget Sound site  did not require
confirmation  of  mercury contamination due  to  recently
acquired mercury  analytical data  from another,  ongoing
research project.)   The  field-collected soil and sediment
samples from all four sites were then shipped to SAIC's
GeoMechanics Laboratory for a more thorough sample
homogenization (see Section 4.3.1) and subsampled for
redistribution to vendors  during  the pre-demonstration
vendor self-evaluations.

All five of the  technology  vendors  performed a  self-
evaluation   on   selected   samples  collected   and
homogenized during this pre-demonstration phase of the
project. For the self-evaluation, the laboratory results and
SRM values were supplied to the vendor, allowing the
vendor to determine how well it performed  the analysis on
the  field samples.  The  results  were used to gain  a
preliminary understanding of the field samples  collected
and to  prepare for the demonstration.

Table  3-1 summarizes key  characteristics of  samples
collected at each of the  four sites.  Also included are the
sample matrix, sample  descriptions, and sample depth
intervals. The analytical results presented in Table 3-1 are
based  on referee laboratory mercury  results  for  the
demonstration samples.

-------
Table 3-1. Summary of Site Characteristics

 Site Name            Sampling Area            Sample     Depth               Description                    Hg Concentration
                                               Matrix                                                        Range
 Carson River         Carson River             Sediment   water/sediment      Sandy silt, with some          10 ppb - 50 ppm
 Mercury site                                             interface           organic debris present
                                                                              (plant stems and leaves)
                      Six Mile Canyon          Soil       3 - 8 cm bgs        Silt with sand to sandy silt   10 ppb - 1,000 ppm
 Y-12 National        Old Hg Recovery Bldg.    Soil       0 - 1 m bgs         Silty-clay to sandy-gravel     0.1 - 100 ppm
 Security Complex     Poplar Creek             Sediment   0 - 0.5 m bgs       Silt to coarse sandy gravel    0.1 - 100 ppm
 Confidential         Former plant building    Soil       3.6 - 9 m bgs       Silt to sandy silt             5 - 1,000 ppm
 manufacturing site
 Puget Sound -        Sediment layer           Sediment   1.5 - 1.8 m thick   Clayey-sandy silt with         10 - 400 ppm
 Bellingham Bay                                                               various woody debris
                      Underlying Native        Sediment   0.3 m thick         Medium-fine silty sands        0.16 - 10 ppm
                      Material

 bgs = below ground surface.

3.1    Carson River

3.1.1  Site Description
The Carson  River Mercury site begins near Carson City,
NV, and extends downstream to the Lahontan Valley and
the Carson Desert.  During the Comstock mining era of the
late  1800s,   mercury was imported  to  the area  for
processing  gold and silver ore.  Ore  mined from  the
Comstock Lode was transported to mill sites, where it was
crushed  and mixed  with  mercury to  amalgamate  the
precious metals. The Nevada mills were located in Virginia
City, Silver City, Gold Hill, Dayton, Six Mile Canyon, Gold
Canyon, and adjacent to the Carson River between New
Empire and Dayton.  During the mining era, an estimated
7,500 tons of mercury were discharged into the Carson
River   drainage,  primarily  in   the   form   of
mercury-contaminated tailings  (EPA Region 9, 1994).

Mercury contamination is present at Carson River as either
elemental mercury and/or inorganic mercury sulfides with
less than 1%, if any, methylmercury. Mercury
contamination exists in soils present at the former gold and
silver mining mill sites; waterways adjacent to the mill sites;
and sediment, fish, and wildlife over more than a 50-mile
length of the Carson River. Mercury is also present in the
sediments and adjacent flood plain of the Carson River,
and in the sediments of Lahontan Reservoir, Carson Lake,
Stillwater Wildlife Refuge, and Indian Lakes. In addition,
tailings with elevated mercury levels are still present at, and
around, the historic mill sites, particularly in Six Mile
Canyon (EPA, 2002a).
3.1.2 Sample Collection

The  Carson River Mercury site provided both soil and
sediment  samples  across the range of contaminant
concentrations desired for the demonstration.   Sixteen
near-surface soil samples were collected between 3-8 cm
below ground surface (bgs). Two sediment samples were
collected  at the  water-to-sediment interface.   All 18
samples were collected on September 23-24, 2002 with a
hand shovel. Samples were collected in Six Mile Canyon
and along  the Carson River.

The sampling sites were  selected  based upon  historical
data from  the site.  Specific sampling locations in the Six
Mile Canyon were selected based  upon local  terrain and
visible soil conditions (e.g., color and particle  size).  The
specific sites were selected to obtain soil samples with as
much variety in mercury concentration as possible. These
sites included hills,  run-off pathways,  and dry  river bed
areas. Sampling locations along the Carson  River were
selected based upon historical mine locations, local terrain,
and river flow.

When collecting the soil samples, approximately 3 cm of
surface soil was scraped to the side. The sample was
then collected with a shovel, screened through a
6.3-millimeter (mm) (0.25-inch) sieve to remove larger
material, and collected in 4-liter (L) sealable bags identified
with a permanent marker. The sediment samples were
also collected with a shovel, screened through a 6.3-mm
sieve to remove larger material, and collected in 4-L
sealable bags identified with a permanent marker. Each of
the 4-L sealable bags was placed into a second 4-L
sealable bag, and the sample label was placed onto the
outside bag.  The sediment samples were then placed into
10-L buckets, lidded, and identified with a sample label.

3.2    Y-12 National Security Complex

3.2.1  Site Description

The Y-12 site is located at the DOE ORNL in Oak Ridge,
TN.   The Y-12 site is  an active manufacturing  and
developmental   engineering   facility   that  occupies
approximately 800 acres  on the northeast corner of the
DOE Oak Ridge Reservation (ORR) adjacent to the city of
Oak Ridge, TN. Built in 1943 by the U.S. Army Corps of
Engineers as part of the World War II Manhattan Project,
the original mission of the installation was development of
electromagnetic separation of uranium isotopes and
weapon components manufacturing, as part of the national
effort to produce the atomic bomb. Between 1950 and
1963, large quantities of elemental mercury were used at
Y-12 during  lithium isotope separation pilot studies and
subsequent   production    processes   in   support of
thermonuclear weapons programs.

Soils at the Y-12 facility are contaminated with mercury in
many areas.  One of the areas of known high  levels of
mercury-contaminated soils is  in the vicinity of a former
mercury use  facility (the "Old Mercury Recovery Building"
- Building 8110). At this  location, mercury-contaminated
material and soil were processed in a  Nicols-Herschoff
roasting furnace to recover mercury. Releases of mercury
from this  process, and from a  building sump used to
secure the  mercury-contaminated  materials  and  the
recovered mercury, have contaminated  the surrounding
soils (Rothchild, et al., 1984). Mercury contamination also
occurred in the sediments of the East Fork of Poplar Creek
(DOE,  1998).   The Upper East Fork of Poplar Creek
(UEFPC) drains the entire Y-12 complex.   Releases of
mercury via building drains connected to the storm sewer
system, building basement  dewatering sump discharges,
and  spills to soils, all contributed to contamination of
UEFPC.   Recent investigations showed that bank  soils
containing mercury along the UEFPC were eroding and
contributing to mercury loading. Stabilization of the bank
soils along this reach of the creek was recently completed.

3.2.2  Sample Collection
Two matrices were sampled at Y-12 in Oak Ridge, TN:
creek sediment and soil. A total of 10 sediment samples
was collected; one sediment sample was collected from
the Lower East Fork of Poplar Creek (LEFPC) and nine
sediment samples were collected from the UEFPC. A total
of six soil samples was collected from the Building 8110
area. The sampling procedures that were used are
summarized below.

Creek Sediments - Creek sediments were  collected on
September 24-25, 2002  from the East Fork  of Poplar
Creek.  Sediment samples were  collected from various
locations in a downstream to upstream sequence (i.e., the
downstream LEFPC sample was collected first and the
most upstream point of the UEFPC was sampled last).

The sediment samples from Poplar Creek were collected
using a  commercially available clam-shell sonar dredge
attached to a rope. The dredge was slowly lowered to the
creek bottom surface, where it was pushed by foot into the
sediment.  Several drops of the sampler (usually seven or
more) were made to collect enough material for screening.
On some occasions, a shovel was used to remove
overlying "hardpan" gravel to expose finer sediments at
depth. One creek sample consisted of creek bank
sediments, which were collected using a stainless steel
trowel.

The collected sediment was then  poured onto  a 6.3-mm
sieve to remove oversize sample material. Sieved samples
were then  placed in  12-L sealable plastic buckets.  The
sediment samples in these buckets were homogenized
with a plastic ladle and subsamples were collected in 20-
milliliter  (mL) vials for quick turnaround analyses.

Soil - Soil samples were collected from pre-selected
boring locations on September 25, 2002. All samples were
collected in the immediate vicinity of the Building 8110
foundation using a commercially available bucket auger.
Oversize material was hand picked from the excavated soil
because the soil was too wet to be passed through a sieve.
The   soil   was  transferred   to   an  aluminum   pan,
homogenized by hand, and subsampled to a 20-mL vial.
The   remaining  soil  was  transferred  to  4-L  plastic
containers.

3.3     Confidential Manufacturing Site

3.3.1  Site Description
A confidential manufacturing site, located in the eastern
U.S., was selected for  participation in this demonstration.
The site contains elemental mercury, mercury amalgams,
and mercury oxide in shallow sediments (less than 0.3 m
deep) and deeper soils (3.65 to 9 m bgs). This site
provided soil with mercury concentrations ranging from 5 to 1,000 mg/kg.

The  site is the  location of three  former processes that
resulted in mercury contamination.  The first  process
involved amalgamation of zinc with mercury.  The second
process involved the manufacturing of zinc oxide.  The
third process involved the reclamation of silver and gold
from  mercury-bearing  materials  in  a  retort furnace.
Operations led  to  the dispersal  of elemental mercury,
mercury compounds  such as chlorides and  oxides, and
zinc-mercury amalgams.   Mercury values have  been
measured ranging  from  0.05 to over 5,000  mg/kg, with
average values of approximately 100 mg/kg.

3.3.2 Sample Collection
Eleven  subsurface  soil samples  were collected  on
September 24, 2002.  All samples were collected with a
Geoprobe® unit using plastic sleeves. All samples were
collected at the location of a former facility plant.  Drilling
locations  were  determined based on  historical   data
provided by the site operator. The intention was to gather
soil samples across a range of concentrations. Because
the surface soils were from relatively clean fill, the sampling
device was pushed to a depth of 3.65 m using a blank rod.
Samples were  then  collected  at  pre-selected  depths
ranging from 3.65 to 9 m bgs. Individual cores were 1-m
long. The plastic sleeve for each 1-m core was marked
with a permanent marker to indicate the depth interval and
the bottom of the core. The filled plastic tubes were
transferred to a  staging table where appropriate  depth
intervals were selected for mixing. Selected tubes were cut
into 0.6-m intervals,  which  were emptied into a plastic
container for premixing soils. When feasible, soils were
initially screened to remove  materials larger than  6.3-mm
in diameter.  In many cases,  soils were too wet and clayey
to allow screening; in these cases, the soil was broken into
pieces by hand and, by using a wooden spatula, oversize
materials were manually  removed.  These soils  (screened
or hand  sorted) were then mixed until the soil  appeared
visually uniform  in color and texture. The mixed  soil was
then placed into a 4-L sample container for each chosen
sample  interval.   A  subsample of the  mixed soil was
transferred into a 20-mL vial, and  it was sent for  quick
turnaround mercury analysis. This process was repeated
for each subsequent sample interval.

3.4    Puget Sound

3.4.1 Site Description
The Puget Sound site consists of contaminated offshore
sediments.  The particular area  of the site used for
collecting  demonstration samples  is identified  as  the
Georgia Pacific, Inc. Log Pond. The Log  Pond is located
within the Whatcom Waterway in Bellingham Bay, WA, a
well-established  heavy industrial land use area  with a
maritime shoreline designation.   Log Pond  sediments
measure approximately  1.5 to 1.8-m thick, and contain
various   contaminants  including   mercury,   phenols,
polyaromatic hydrocarbons, polychlorinated biphenyls, and
wood debris.  Mercury was used as a preservative in  the
logging industry. The area was capped in late 2000 and
early 2001  with an average  of 7 feet of  clean capping
material, as part of a Model Toxics Control Act interim
cleanup  action.   The  total  thickness  ranges  from
approximately 0.15 m along the site perimeter to 3 m within
the interior of the project area. The restoration  project
produced 2.7 acres of shallow sub-tidal and 2.9 acres of
low intertidal habitat, all of which had previously exceeded
the Sediment Management  Standards cleanup criteria
(Anchor Environmental, 2001).

Mercury concentrations have been measured ranging from
0.16 to 400 mg/kg (dry wt). The majority (98%) of the
mercury detected in near-shore ground waters and
sediments of the Log Pond is believed to consist of
complexed divalent (Hg2+) forms such as mercuric sulfide
(Bothner, et al., 1980; Anchor Environmental, 2000).
3.4.2 Sample Collection
Science Applications International Corporation (SAIC) is
currently performing a SITE remedial technology evaluation
in the Puget Sound (SAIC, 2002). As part of ongoing work
at that site, SAIC collected additional sediment for use
during this MMT project. Sediment samples collected on
August 20-21, 2002 from the Log Pond in Puget Sound
were obtained beneath approximately 3-6 m of water, using
a vibra-coring system capable of capturing cores to 0.3 m
below the  proposed dredging  prism.  The  vibra-corer
consisted  of a core barrel  attached  to a  power head.
Aluminum core tubes,  equipped with a stainless  steel
"eggshell" core catcher to retain  material, were inserted
into the core barrel.  The vibra-core was  lowered  into
position on the bottom  and advanced to the appropriate
sampling depth.    Once sampling  was completed, the
vibra-core was retrieved and the core  liner removed from
the core barrel. The core sample was examined at each
end to verify that sufficient sediment was retained for the
particular sample.  The  condition  and quantity of material
within  the  core   was   then   inspected  to  determine
acceptability.

The  following  criteria were used to  verify  whether an
acceptable core sample was collected:

    - Target penetration depth (i.e., into native material) was
      achieved.

    - Sediment recovery of at least 65% of the penetration
      depth was achieved.

    - Sample appeared undisturbed and intact without any
      evidence of obstruction/blocking within the core tube or
      catcher.

The percent sediment recovery was determined by dividing
the length of material recovered by the depth of core
penetration below the mud line. If the sample was deemed
acceptable, overlying water was siphoned from the top of
the core tube, and each end of the tube was capped and
sealed with duct tape. Following core collection, representative
samples  were  collected  from  each   core  section
representing a different vertical horizon.  Sediment was
collected from the  center of the core that had not been
smeared by, or in contact with, the core tube. The volumes
removed were placed in a decontaminated stainless steel
bowl or pan and mixed until homogenous in texture and
color (approximately 2  minutes).
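The acceptance criteria and the percent-recovery calculation above can be sketched as a simple check. This is an illustrative sketch only; the function and parameter names are hypothetical and not part of the demonstration procedure.

```python
def core_acceptable(penetration_m: float, recovered_m: float,
                    reached_native: bool, undisturbed: bool) -> bool:
    """Apply the three core-acceptance criteria listed above.

    Percent recovery = length of material recovered divided by the
    depth of core penetration below the mud line, times 100.
    """
    percent_recovery = recovered_m / penetration_m * 100.0
    return reached_native and undisturbed and percent_recovery >= 65.0

# A core penetrating 1.0 m with 0.7 m recovered, reaching native
# material and appearing undisturbed, passes all three criteria.
print(core_acceptable(1.0, 0.7, True, True))  # True
```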

After all sediment for  a vertical horizon composite was
collected and  homogenized, representative aliquots were
placed in the appropriate pre-cleaned sample containers.
Samples of both the sediment and the underlying native
material were collected in a similar manner. Distinct layers
of sediment and native material were easily recognizable
within each core.

3.5     Demonstration Site
The   demonstration   was  conducted   in   a  natural
environment, outdoors, in Oak Ridge, TN.  The area was
a grass covered hill with some parking areas, all of which
were surrounded by trees. Building 5507, in the center of
the demonstration area, provided lunch and break facilities
for personnel and storage space for project samples.

Most of the demonstration was performed during rainfall
events ranging from steady to torrential.  Severe puddling
of rain occurred to the extent that boards needed to be
placed under chairs to prevent them from sinking into the
ground. Even when it was not raining, the relative humidity
was high, ranging from 70.6 to 98.3 percent. Between two
and four of the tent sides were used  to keep rainfall from
damaging  the  instruments.   The  temperature in the
afternoons ranged from 65-70 degrees Fahrenheit, and the
wind speed was less than 10 mph. The site is located at
latitude 36°N, longitude 84°W, and an elevation of 275 m.
(Figure 3-1 is a photograph of the site during the
demonstration, and Figure 3-2 is a photograph of the location.)
Figure 3-1. Tent and field conditions during the
demonstration at Oak Ridge, TN.
Figure 3-2. Demonstration site and Building 5507.
3.6    SAIC GeoMechanics Laboratory
Sample homogenization was completed  at  the  SAIC
GeoMechanics Laboratory in Las Vegas, NV.  This facility
is an  industrial-type building  with separate facilities for
personnel offices and material handling. The primary
function of the laboratory is rock mechanics studies.
The laboratory has rock mechanics equipment, including
sieves, rock crushers, and sample splitters. The personnel
associated with this laboratory are experienced in the areas
of sample preparation  and sample homogenization.  In
addition to  the sample homogenization equipment, the
laboratory contains several benches, tables, and open
space. Mercury air monitoring equipment was used during
the sample  preparation activities for personnel safety.
                                                   Chapter 4
                                       Demonstration Approach
This chapter describes the demonstration approach that
was used for evaluating the field mercury measurement
technologies  at  ORNL  in  May 2003.   It  presents  the
objectives, design, sample preparation and management
procedures,  and  the  reference  method  confirmatory
process used for the demonstration.


4.1     Demonstration Objectives

The primary goal of the SITE MMT Program is to develop
reliable performance and cost data on innovative,
field-ready measurement technologies. A SITE
demonstration must provide detailed and reliable
performance and cost data so that potential technology
users have the information needed to make sound
judgments regarding an innovative technology's
applicability to a specific site and to compare the
technology to conventional technologies.

Table 4-1 summarizes the project objectives for this
demonstration. In accordance with QAPP Requirements
for Applied Research Projects (EPA, 1998), the technical
project objectives for the demonstration were categorized
as primary and secondary.
Table 4-1. Demonstration Objectives

Primary Objectives

  Primary Objective #1:  Determine sensitivity of each instrument with respect to
                         vendor-generated MDL and PQL.
  Primary Objective #2:  Determine potential analytical accuracy associated with
                         vendor field measurements.
  Primary Objective #3:  Evaluate the precision of vendor field measurements.

  Method of evaluation (Objectives #1-#3): Independent laboratory confirmation of
  SRMs, field samples, and spiked field samples.

  Primary Objective #4:  Measure time required to perform five functions related to
                         mercury measurements: 1) mobilization and setup, 2) initial
                         calibration, 3) daily calibration, 4) sample analysis, and
                         5) demobilization.
  Primary Objective #5:  Estimate costs associated with mercury measurements for the
                         following four categories: 1) capital, 2) labor,
                         3) supplies, and 4) investigation-derived wastes.

  Method of evaluation (Objectives #4-#5): Documentation during demonstration;
  vendor-provided information.

Secondary Objectives

  Secondary Objective #1:  Document ease of use, skills, and training required to
                           operate the device properly.
  Secondary Objective #2:  Document potential H&S concerns associated with operating
                           the device.
  Secondary Objective #3:  Document portability of the device.
  Secondary Objective #4:  Evaluate durability of the device based on materials of
                           construction and engineering design.
  Secondary Objective #5:  Document the availability of the device and its spare
                           parts.

  Method of evaluation (Objectives #1-#4): Documentation of observations during
  demonstration; vendor-provided information.

  Method of evaluation (Objective #5): Post-demonstration investigation.
Critical data support primary objectives and noncritical data
support  secondary objectives. With the exception of the
cost information, primary objectives required the use of
quantitative  results  to  draw  conclusions  regarding
technology performance. Secondary objectives pertained
to  information  that was useful and did not necessarily
require the use of quantitative results to draw conclusions
regarding technology performance.

4.2     Demonstration Design

4.2.1   Approach  for Addressing  Primary
Objectives
The  purpose of this demonstration was to evaluate the
performance of the vendor's  instrumentation  against a
standard laboratory procedure.   In  addition, an  overall
average relative standard deviation (RSD) was calculated
for all measurements made by the vendor and the referee
laboratory.  RSD comparisons used descriptive statistics,
not inferential statistics, between the vendor and laboratory
results.  Other statistical comparisons (both inferential and
descriptive) for sensitivity,  precision, and accuracy were
used, depending upon actual demonstration results.

The  approach  for addressing  each  of  the  primary
objectives is discussed  in the following  subsections. A
detailed explanation of the precise statistical determination
used for evaluating primary objectives No. 1 through No. 3
is presented in Chapter 6.

4.2.1.1   Primary Objective #1: Sensitivity

Sensitivity is the  ability of a method or instrument to
discriminate   between   small  differences  in  analyte
concentration (EPA, 2002b). It can be discussed in terms
of an instrument detection limit (IDL), a method detection
limit (MDL), and a practical quantitation limit (PQL).
The MDL is not a measure of sensitivity in the same respect as
an IDL or PQL; it is a measure of precision at a
predetermined, usually low, concentration. The IDL
pertains to the ability of the instrument to determine with
confidence the difference between a sample that contains
the analyte of interest at a low concentration and a sample
that does  not contain that  analyte.  The IDL is generally
considered to be the minimum true concentration of an
analyte producing a non-zero signal that can be
distinguished, with an adequate degree of certainty, from
the signals generated when the analyte is not present.
The IDL is not rigidly defined in terms of matrix, method,
laboratory,  or  analyst variability,  and it  is not usually
associated with a statistical level of confidence. IDLs are,
thus, usually lower than MDLs and  rarely serve a purpose
in terms of project objectives (EPA, 2002b).  The PQL
defines a specific concentration with an associated level of
accuracy.   The MDL defines a lower limit at which a
method   measurement   can  be  distinguished  from
background  noise.   The PQL  is a more  meaningful
estimate of sensitivity. The MDL and PQL were chosen as
the two distinct parameters for evaluating sensitivity.  The
approach  for  addressing  each  of  these  indicator
parameters  is  discussed separately in   the following
paragraphs.

MDL

MDL is the estimated measure of sensitivity as defined in
40 Code of Federal  Regulations (CFR) Part  136.  The
purpose of the MDL measurement is  to estimate  the
concentration at which an individual field instrument is able
to  detect a  minimum concentration  that  is  statistically
different from instrument background or noise. Guidance
for the definition of the MDL is provided in EPA G-5i (EPA,
2002b).

The determination of an MDL usually requires seven
different measurements of a low-concentration standard or
sample. Following procedures established  in 40 CFR Part
136 for water matrices, the demonstration  MDL definition
is as follows:

                MDL = t(n-1, 0.99) × S

where:  t(n-1, 0.99) = 99th percentile of the t-distribution with
                       n - 1 degrees of freedom
        n            = number of measurements
        S            = standard deviation of replicate measurements
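As an illustrative sketch, the MDL calculation for seven replicates can be carried out as follows. The replicate values are invented for the example; t(6, 0.99) = 3.143 is the tabulated Student's t value used by 40 CFR Part 136 for seven measurements.

```python
from statistics import stdev

# Seven hypothetical replicate measurements (mg/kg) of a
# low-concentration sample; values are invented for illustration.
replicates = [0.102, 0.095, 0.110, 0.098, 0.104, 0.091, 0.107]

# 99th percentile of the t-distribution with n - 1 = 6 degrees of
# freedom (the 40 CFR Part 136 table value for seven replicates).
T_6_99 = 3.143

# MDL = t(n-1, 0.99) × S, with S the sample standard deviation.
mdl = T_6_99 * stdev(replicates)
print(f"MDL = {mdl:.3f} mg/kg")
```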
PQL
The PQL is another important measure of sensitivity. The
PQL is defined in  EPA G-5i as  the  lowest  level an
instrument is  capable of producing a  result  that has
significance in terms of precision and bias.  (Bias  is the
difference  between  the measured  value and  the true
value.)  It is generally considered the lowest standard on
the instrument calibration curve.  It is often 5-10  times
higher  than the MDL, depending  upon the  analyte, the
instrument being  used, and  the  method for  analysis;
however, it should not be rigidly defined in this manner.
During the demonstration, the PQL was to be defined by
the vendor's reported calibration or based upon lower
concentration samples or SRMs. The evaluation of
vendor-reported results for the PQL included a
determination of the percent difference (%D) between the
calculated value and the true value. The true value is
considered the value reported by the referee laboratory for
field samples or spiked field samples, or, in the case of
SRMs, the certified value provided by the supplier. The
equation used for the %D calculation is:

                %D = [(C_calculated - C_true) / C_true] × 100

where:  C_true       = true concentration as determined by the referee
                       laboratory or SRM reference value
        C_calculated = calculated test sample concentration
The PQL and %D were reported for the vendor. The %D
for the referee laboratory, at the same concentration, was
also reported for purposes of comparison. No statistical
comparison was made between these two values; only a
descriptive comparison was made for purposes of this
evaluation. (The %D requirement for the referee laboratory
was defined as 10% or less. The reference method PQL
was approximately 10 µg/kg.)
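A minimal sketch of the %D calculation, using invented values rather than demonstration data:

```python
def percent_difference(c_calculated: float, c_true: float) -> float:
    """Percent difference (%D) between a measured result and the
    reference (true) value, per the equation above."""
    return (c_calculated - c_true) / c_true * 100.0

# Hypothetical example: a vendor result of 9.2 mg/kg against a
# referee laboratory value of 10.0 mg/kg.
print(round(percent_difference(9.2, 10.0), 1))
```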

4.2.1.2 Primary Objective #2: Accuracy

Accuracy was calculated by comparing the measured value
to a known or true value. For purposes of this
demonstration, three separate standards were used to
evaluate accuracy. These included: 1) SRMs, 2) field
samples collected from four separate mercury-
contaminated sites, and 3) spiked field samples. Four sites
were used for evaluation of the Ohio Lumex field
instrument. Samples representing field samples and
spiked field samples were prepared at the SAIC
GeoMechanics Laboratory. To prevent cross
contamination, SRMs were prepared in a separate location.
Each of these standards is discussed separately in the
following paragraphs.

SRMs

The primary standards used to determine accuracy for this
demonstration were SRMs.  SRMs provided very tight
statistical comparisons, although  they did not provide all
matrices of interest or all ranges of concentrations. The
SRMs were  obtained from reputable suppliers, and had
reported  concentrations at  associated 95% confidence
intervals  (CIs) and 95% prediction intervals.  Prediction
intervals were used for comparison because they represent
a statistically infinite number of analyses,  and  therefore,
would include all possible correct results 95% of the time.
All SRMs were analyzed  by the  referee  laboratory and
selected SRMs were analyzed by the vendor, based upon
instrument capabilities and concentrations of SRMs that
could be obtained. Selected SRMs covered  an appropriate
range for each  vendor.    Replicate  SRMs were  also
analyzed by the vendor and the laboratory.

The purpose for SRM analysis by the referee laboratory
was to provide a check on laboratory accuracy. During the
pre-demonstration, the referee laboratory was chosen,  in
part, based upon the analysis of SRMs.  This was done  to
ensure a competent laboratory  would be  used for the
demonstration. Because of the need to provide confidence
in laboratory analysis during the demonstration, the referee
laboratory analyzed  SRMs as  an ongoing  check for
laboratory bias.

Evaluation of vendor and laboratory analysis of SRMs was
performed  as follows.   Accuracy was   reported for
individual   sample   concentrations  of   replicate
measurements made at the  same concentration.

Two-tailed 95%  CIs  were  computed  according  to the
following equation:
                CI = x̄ ± t(n-1, 0.975) × S / √n

where:  t(n-1, 0.975) = 97.5th percentile of the t-distribution with
                        n - 1 degrees of freedom
        n             = number of measurements
        S             = standard deviation of replicate measurements
        x̄             = mean of replicate measurements
The number of vendor-reported SRM results and referee
laboratory-reported SRM results that were within the
associated 95% prediction interval was evaluated.
Prediction intervals were computed in a similar fashion to
the CI, except that the Student's "t" value used "n" equal to
infinity and, because prediction intervals represented "n"
approaching infinity, the square root of "n" was dropped
from the equation.
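The confidence- and prediction-interval computations described above can be sketched as follows. The replicate values are hypothetical; t(6, 0.975) = 2.447 is the tabulated value for seven measurements, and 1.96 is the value for "n" equal to infinity.

```python
from math import sqrt
from statistics import mean, stdev

def ci_and_pi(results, t_ci=2.447, t_inf=1.96):
    """Two-tailed 95% confidence and prediction intervals.

    t_ci is the 97.5th percentile of the t-distribution with n - 1
    degrees of freedom; t_inf is the value for infinite degrees of
    freedom.  Per the text, the prediction interval drops the
    sqrt(n) divisor.
    """
    n = len(results)
    x_bar, s = mean(results), stdev(results)
    half_ci = t_ci * s / sqrt(n)
    half_pi = t_inf * s
    return ((x_bar - half_ci, x_bar + half_ci),
            (x_bar - half_pi, x_bar + half_pi))

# Hypothetical SRM replicate results (mg/kg); the prediction
# interval brackets the narrower confidence interval.
ci, pi = ci_and_pi([9.8, 10.4, 10.1, 9.6, 10.3, 9.9, 10.2])
```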

A final measure of accuracy determined from SRMs is a
frequency distribution that shows the percentage of vendor-
reported measurements falling within a specified window
of the reference value: for example, within a 30% window
of the reported concentration, within a 50% window, or
outside a 50% window. This distribution could be reported
as average concentrations of replicate results from the
vendor for a particular concentration and matrix, compared
to the same collected sample from the laboratory. These
are descriptive statistics and are used to better describe
comparisons, but they are not intended as inferential tests.
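A sketch of the descriptive window classification described above; the helper name and the example (measured, reference) pairs are hypothetical.

```python
def window_distribution(pairs):
    """Classify measured/reference pairs by percent-difference window:
    within 30%, between 30% and 50%, or outside 50%."""
    counts = {"<=30%": 0, "30-50%": 0, ">50%": 0}
    for measured, reference in pairs:
        d = abs(measured - reference) / reference * 100.0
        if d <= 30:
            counts["<=30%"] += 1
        elif d <= 50:
            counts["30-50%"] += 1
        else:
            counts[">50%"] += 1
    return counts

# Three invented vendor results, each against a 10.0 mg/kg reference.
print(window_distribution([(9.0, 10.0), (14.0, 10.0), (16.0, 10.0)]))
```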

Field Samples

The second accuracy standard used for this demonstration
was actual field samples collected from four separate
mercury-contaminated sites. This accuracy determination
consisted of a comparison of vendor-reported results for
field samples to the referee laboratory results for the same
field samples. The field samples were used to ensure that
"real-world" samples were tested for each vendor. The
field samples consisted of variable mercury concentrations
within varying soil and sediment matrices. The referee
laboratory results are considered the standard for
comparison to each vendor.

Vendor sample results for a given field sample were
compared to replicates analyzed by the laboratory for the
same field sample. (A hypothesis test with alpha = 0.01
was performed. The null hypothesis was that the sample
results were similar; therefore, if the null hypothesis is
rejected, the sample sets are considered different.)
Comparisons for a specific matrix or concentration were
made in order to provide additional information on that
specific matrix or concentration. Comparisons of the
vendor values to laboratory values were similar to the
comparisons noted previously for SRMs, except that a
more definitive, inferential statistical evaluation was used.
Alpha = 0.01 was used to help mitigate inter-laboratory
variability. Additionally, an aggregate analysis was used
to mitigate statistical anomalies (see Section 6.1.2).
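The report does not name the specific hypothesis test applied; as one plausible illustration, a two-sample statistic such as Welch's t could be computed for vendor and laboratory replicates. The data values below are invented.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (one common choice of test;
    the report does not specify which test was actually used)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Hypothetical vendor vs. laboratory replicates (mg/kg).  At alpha
# = 0.01 (two-tailed), |t| must exceed the critical value for the
# approximate degrees of freedom before the sets are called different.
t_stat = welch_t([10.1, 9.9, 10.4, 10.0], [9.7, 10.2, 9.8, 10.1])
```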

Spiked Field Samples

The  third accuracy standard for this demonstration  was
spiked field samples.  These  spiked field samples were
analyzed by the vendors and by the referee laboratory in
replicate, in order to provide additional  measurement
comparisons to a known value. Spikes were prepared to
cover additional concentrations not available from SRMs or
the samples collected in the field. They were grouped with
the field sample comparison noted above.
4.2.1.3 Primary Objective #3: Precision

Precision  can  be defined  as  the degree  of mutual
agreement of  independent  measurements  generated
through repeated application of a process under specified
conditions. Precision is usually thought of as repeatability
of a specific measurement, and it is often reported as RSD.
The  RSD  is computed  from a  specified number  of
replicates.  The  more replications of a measurement, the
more  confidence  is  associated  with  a  reported  RSD.
Replication of a measurement may range from as few as 3
separate measurements to 30 or more measurements of
the same sample, depending upon the degree of
confidence desired in the specified result.  The precision
of an analytical instrument may vary depending upon the
matrix being measured, the concentration  of the analyte,
and whether the measurement is made for an SRM or a
field sample.

The experimental design for this demonstration included a
mechanism  to  evaluate  the  precision of the  vendors'
technologies.   Field samples from the  four mercury-
contaminated field sites were evaluated by each vendor's
analytical   instrument.     During  the  demonstration,
concentrations were predetermined only as low, medium,
or high.  Ranges of test samples (field samples, SRMs,
and  spikes)  were selected  to cover the appropriate
analytical ranges of the vendors' instrumentation.  It was
known prior to the demonstration that not all vendors were
capable of measuring similar concentrations: some
instruments were better suited to low concentrations, while
others were geared toward higher-concentration samples or
offered other attributes, such as cost or ease of use, that
distinguished their technology.  For this reason, not all
vendors analyzed the same samples.

During  the demonstration, the vendor's instrumentation
was  tested with samples from  the four different  sites,
having  different matrices  when possible (i.e., depending
upon  available  concentrations)  and  having  different
concentrations (high, medium, and low) using a variety of
samples.    Sample  concentrations  for  an  individual
instrument were chosen based upon vendor attributes in
terms of expected low, medium,  and high concentrations
that the particular instrument was capable of measuring.

The referee laboratory measured replicates of all samples.
The results were used for precision comparisons to the
individual  vendor.  The  RSD for the  vendor  and  the
laboratory were calculated individually, using the following
equation:
               %RSD = (S / x̄) × 100

where: S  = standard deviation of replicate results
       x̄ = mean value of replicate results

Using descriptive statistics, differences between vendor
RSD and referee laboratory RSD were determined. This
included RSD comparisons based  upon  concentration,
SRMs, field samples, and different sites.  In addition, an
overall average RSD was calculated for all measurements
made by the vendor and the laboratory. RSD comparisons
were  based   upon  descriptive  statistical  evaluations
between the vendor and the laboratory, and results were
compared  accordingly.
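The %RSD calculation applied to both the vendor and referee laboratory replicates can be sketched directly from the equation above:

```python
import statistics

def percent_rsd(replicates):
    """%RSD = (S / x-bar) * 100, where S is the sample standard
    deviation (n - 1 denominator) and x-bar is the mean of the
    replicate results."""
    mean = statistics.fmean(replicates)
    s = statistics.stdev(replicates)
    return 100.0 * s / mean
```

For example, replicates of 8, 10, and 12 mg/kg have a mean of 10, a standard deviation of 2, and therefore a %RSD of 20.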

4.2.1.4 Primary  Objective #4: Time per Analysis

The amount of time required for performing the analysis
was measured and reported for five categories:

    Mobilization and setup
    Initial calibration
    Daily calibration
    Sample analyses
    Demobilization

Mobilization and setup included the time needed to unpack
and prepare the instrument for  operation. Initial calibration
included the time to perform  the vendor recommended
on-site calibrations.  Daily calibration  included the time to
perform  the  vendor-recommended   calibrations   on
subsequent field days. (Note that this  could have been the
same as the  initial calibration, a  reduced calibration, or
none.) Sample  analyses included the time to  prepare,
measure, and calculate the results for the demonstration
and the necessary quality control (QC) samples performed
by the vendor.

The time per analysis was determined by dividing the total
amount of time required to perform  the analyses by  the
number of  samples analyzed  (197).  In  the  numerator,
sample analysis time included preparation, measurement,
and calculation of results for demonstration samples and
necessary QC samples performed by the vendor. In  the
denominator,  the total number of analyses included only
demonstration samples analyzed  by  the vendor, not  QC
analyses nor reanalyses of samples.
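The per-sample arithmetic described above, together with the 15-minute rounding convention noted later in this section, is simple to express; the numbers below are illustrative only, not demonstration results:

```python
def time_per_analysis(total_analysis_minutes, n_demo_samples):
    """Average time per analysis. Numerator: preparation, measurement,
    and calculation time, including vendor QC work. Denominator:
    demonstration samples only (197 here), excluding QC analyses
    and reanalyses."""
    return total_analysis_minutes / n_demo_samples

def round_to_quarter_hour(minutes):
    """Recorded times were rounded to the nearest 15-minute interval."""
    return 15 * round(minutes / 15)
```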

Downtime  that was required  or that occurred  between
sample analyses as a part of operation and handling was
considered a part of the sample analysis time. Downtime
occurring  due to instrument  breakage  or  unexpected
maintenance was not counted  in the assessment, but  it is
noted in this final report  as an additional time.   Any
downtime  caused by  instrument saturation or memory
effect was addressed, based  upon  its  frequency and
impact on  the analysis.

Unique time measurements are also addressed in this
report.  For example, if soil samples were analyzed directly
but sediment samples required additional drying time before
analysis, a statement was made noting the hours required to
analyze the soil samples and the drying time required for the
sediment samples.

Recorded  times were rounded  to the nearest  15-minute
interval.  The number of vendor personnel used was noted
and factored into the time calculations. No comparison on
time  per analysis is made between the vendor and the
referee laboratory.

4.2.1.5  Primary  Objective #5: Cost

The  following four cost  categories were  considered  to
estimate costs associated with mercury measurements:

    Capital costs
    Labor  costs
    Supply costs
    Investigation-derived waste (IDW) disposal costs

Although both vendor and laboratory costs are presented,
the calculated costs were not compared with the referee
laboratory. A summary  of how each cost category was
estimated  for the measurement device is provided below.

   The capital cost was estimated based  on  published
    price lists for purchasing,  renting, or leasing each field
    measurement device.  If the device was purchased,
   the capital cost estimate did not  include salvage  value
   for the device after work was completed.

   The labor cost was based on the number of people
    required to analyze samples during the demonstration.
   The labor rate was  based on a standard hourly rate for
   a technician or other appropriate operator. During the
   demonstration, the skill level required was  confirmed
    based on vendor input regarding the operation of the
   device to produce mercury concentration results and
   observations made in the field.  The labor costs were
    based on: 1) the actual number of hours required  to
   complete all  analyses, quality  assurance (QA), and
     reporting; and 2) the assumption that a technician who
   worked for a portion of a day was paid for an entire
    8-hour day.

   The supply costs were based on  any supplies required
   to analyze the field and SRM  samples  during the
    demonstration.   Supplies  consisted  of  items  not
    included in the capital category, such  as  extraction
    solvent, glassware,  pipettes, spatulas,  agitators, and
    similar materials.  The type and quantity of all supplies
     brought to the field and used during the demonstration
    were noted and documented.

    Any  maintenance  and  repair  costs during  the
    demonstration were documented or provided by the
    vendor.  Equipment costs were estimated based  on
    this information and  standard cost analysis guidelines
    used in the SITE Program.

    The  IDW disposal  costs included decontamination
    fluids and equipment, mercury-contaminated soil and
    sediment samples, and  used   sample   residues.
    Contaminated personal  protective equipment (PPE)
    normally used  in the laboratory was  placed into a
    separate container.  The disposal  costs for the IDW
    were included in the overall analytical costs for each
    vendor.

After all of the cost categories were estimated, the cost per
analysis was calculated. This cost value was based on the
number of analyses performed. As the number of samples
analyzed increased, the initial capital costs and certain
other costs were distributed across a greater number of
samples; therefore, the per-unit cost decreased. For this
reason, two costs were reported:  1) the initial capital costs
and 2) the operating costs per analysis. No comparison to
the referee laboratory's  method cost was made; however,
a generic cost comparison was made.  Additionally, when
determining  laboratory costs, the  associated cost for
laboratory audits and data validation should be considered.
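The two reported figures (initial capital cost, and operating cost per analysis) can be sketched as follows; the dollar amounts in the usage example are illustrative only, not costs from the demonstration:

```python
def cost_summary(capital, labor, supplies, idw_disposal, n_analyses):
    """Return the two figures reported in the demonstration: the
    initial capital cost, and the operating cost per analysis.
    Capital is reported separately because it amortizes over however
    many samples a user ultimately runs."""
    operating_per_analysis = (labor + supplies + idw_disposal) / n_analyses
    return capital, operating_per_analysis

# Illustrative usage: $10,000 instrument, $1,970 total operating
# costs, 197 analyses -> $10.00 operating cost per analysis.
cap, per_analysis = cost_summary(10000, 1000, 500, 470, 197)
```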

4.2.2  Approach for Addressing Secondary
       Objectives
Secondary   objectives  were   evaluated   based  on
observations made during the demonstration. Because of
the number  of vendors  involved, technology observers
were required to make simultaneous observations of two
vendors each during the demonstration. Four procedures
were  implemented  to  ensure  that  these  subjective
observations made by the observers were as consistent as
possible.

First, forms were developed for each  of the five secondary
objectives.   These forms assisted  in standardizing the
observations. Second, the observers met each day before
the evaluations began, at significant break  periods,  and
after  each  day  of  work  to  discuss  and  compare
observations regarding each  device. Third, an additional
observer was assigned to independently evaluate only the
secondary objectives in order to ensure that a consistent
approach was applied in  evaluating  these  objectives.
Finally, the SAIC  TOM circulated among the  evaluation
staff during the demonstration to ensure that a consistent
approach was being followed  by all personnel. Table 4-2
summarizes   the  aspects  observed   during   the
demonstration for  each  secondary  objective.     The
individual approaches to  each of these objectives  are
detailed further in the following subsections.
Table 4-2. Summary of Secondary Objective Observations Recorded During the Demonstration

General Information:
- Vendor Name
- Observer Name
- Instrument Type
- Instrument Name
- Model No.
- Serial No.

Secondary Objective #1, Ease of Use:
- No. of Operators
- Operator Names/Titles
- Operator Training
- Training References
- Instrument Setup Time
- Instrument Calibration Time
- Sample Preparation Time
- Sample Measurement Time

Secondary Objective #2, H&S Concerns:
- Instrument Certifications
- Electrical Hazards
- Chemicals Used
- Radiological Sources
- Hg Exposure Pathways
- Hg Vapor Monitoring
- PPE Requirements
- Mechanical Hazards
- Waste Handling Issues

Secondary Objective #3, Instrument Portability:
- Instrument Weight
- Instrument Dimensions
- Power Sources
- Packaging
- Shipping & Handling

Secondary Objective #4, Instrument Durability:
- Materials of Construction
- Quality of Construction
- Max. Operating Temp.
- Max. Operating Humidity
- Downtime
- Maintenance Activities
- Repairs Conducted

H&S = Health and Safety
PPE = Personal Protective Equipment
4.2.2.1 Secondary Objective #1:  Ease of Use

The skills and training required for proper device operation
were noted; these included any degrees or specialized
training required  by the operators.  This information was
gathered by interviews (i.e., questioning) of the operators.
The number of operators required was also noted.  This
objective was also evaluated by subjective  observations
regarding the ease of equipment use and major peripherals
required to measure mercury concentrations in soils and
sediments.  The operating manual was evaluated to
determine whether it was easily usable and understandable.

4.2.2.2 Secondary  Objective #2:  Health  and Safety
       Concerns

Health and safety (H&S) concerns associated with device
operation  were noted during the demonstration. Criteria
included hazardous  materials used, the  frequency and
likelihood of potential exposures, and any direct exposures
observed  during  the  demonstration.   In  addition,  any
potential for exposure to mercury during sample digestion
and  analysis was evaluated,  based  upon  equipment
design. Other H&S concerns, such as basic electrical and
mechanical  hazards,  were  also  noted.    Equipment
certifications, such as Underwriters Laboratory (UL), were
documented.

4.2.2.3 Secondary  Objective  #3:   Portability of  the
       Device

The portability of the device was evaluated  by observing
transport,   measuring  setup  and  tear  down  time,
determining the size and weight of the unit and peripherals,
and assessing the ease with which the instrument was
repackaged for movement to another location. The use of
battery power or the need for an AC outlet was also noted.

4.2.2.4 Secondary Objective #4: Instrument Durability

The durability of each  device and major peripherals was
assessed   by noting  the  quality  of materials  and
construction.  All  device failures, routine  maintenance,
repairs,  and  downtime  were documented  during  the
demonstration.   No  specific  tests  were  performed  to
evaluate durability; rather, subjective observations were
made using a field form as guidance.

4.2.2.5 Secondary Objective #5: Availability of Vendor
       Instruments and Supplies

The  availability  of  each  device  was  evaluated  by
determining whether additional units and spare parts are
readily available  from the vendor or retail  stores.  The
vendor's office (or a web page) and/or a retail store was
contacted  to  identify and  determine  the  availability of
supplies of the tested measurement  device  and spare
parts. This portion of the evaluation was performed after
the field demonstration,  in  conjunction with the cost
estimate.

4.3    Sample Preparation and Management

4.3.1  Sample Preparation

4.3.1.1 Field Samples

Field samples were collected during the pre-demonstration
portion of the project, with the ultimate goal of producing a
set of consistent test soils and sediments to be distributed
among all participating vendors and the referee laboratory
for analysis during  the demonstration.   Samples were
collected from the following four sites:

    Carson River Mercury site (near Dayton, NV)
    Y-12 National Security Complex (Oak Ridge, TN)
    Manufacturing facility (eastern U.S.)
    Puget Sound (Bellingham, WA)

The field samples collected during the  pre-demonstration
sampling events comprised a variety of matrices,  ranging
from material  having a  high clay  content,  to  material
composed  mostly of gravelly, coarse sand.   The field
samples also differed with respect  to moisture content;
several were collected as wet sediments.  Table 4-3 shows
the number of distinct field samples that were collected
from each of the four field sites.

Prior to the start  of the demonstration, the field samples
selected  for  analysis during the  demonstration  were
processed at the SAIC GeoMechanics Laboratory in Las
Vegas,  NV.    The  specific sample  homogenization
procedure used by this laboratory largely depended on the
moisture content and physical consistency of the sample.
Two  specific  sample homogenization procedures were
developed  and tested by  SAIC  at the GeoMechanics
Laboratory  during the pre-demonstration portion of the
project.   The methods  included  a non-slurry  sample
procedure and a slurry sample procedure.

A standard operating procedure  (SOP) was  developed
detailing both methods.  The procedure was found to be
satisfactory, based upon  the results of replicate samples
during the  pre-demonstration. This SOP is included  as
Appendix A of the Field Demonstration Quality Assurance
Project Plan (SAIC, August 2003,  EPA/600/R-03/053).
Figure 4-1 summarizes the homogenization steps of the
SOP, beginning with sample mixing. This procedure was
used   for   preparing   both  pre-demonstration  and
demonstration samples. Prior to the mixing process (i.e.,
Step 1 in Figure 4-1), all field samples being  processed
were visually inspected to ensure that oversized materials
were removed, and that there were no clumps  that would
hinder homogenization. Non-slurry samples were air-dried
in accordance with the SOP, so that they could  be passed
multiple times through a riffle  splitter.  Due to the high
moisture content of many of the samples, they were not
easily air-dried and  could not be passed through a riffle
splitter while wet.   Samples  with  very  high  moisture
contents,  termed  "slurries,"  were  not  air-dried, and
bypassed the  riffle  splitting step.   The homogenization
steps for each type  of matrix are briefly summarized, as
follows.
Table 4-3.  Field Samples Collected from the Four Sites

 Field Site           No. of Samples /      Areas for Collecting          Volume Required
                      Matrices Collected    Sample Material

 Carson River         12 Soil               Tailings Piles (Six Mile      4 L each for soil
                      6 Sediment            Canyon); River Bank           12 L each for sediment
                                            Sediments

 Y-12                 10 Sediment           Poplar Creek Sediments;       12 L each for sediment
                      6 Soil                Old Mercury Recovery          4 L each for soil
                                            Bldg. Soils

 Manufacturing Site   12 Soil               Subsurface Soils              4 L each

 Puget Sound          4 Sediment            High-Level Mercury (below     12 L each
                                            cap); Low-Level Mercury
                                            (native material)
Preparing Slurry Matrices

For slurries (i.e., wet sediments), the mixing steps were
sufficiently thorough that the sample containers could be
filled directly  from the  mixing  vessel.  There were  two
separate mixing steps for the slurry-type samples. Each
slurry was initially mixed mechanically within the sample
container (i.e., bucket) in which the sample was shipped to
the SAIC GeoMechanics Laboratory. A subsample of this
premixed sample was  transferred to a second  mixing
vessel.  A mechanical  drill equipped  with a paint mixing
attachment was used to mix the subsample. As shown in
Figure 4-1, slurry samples  bypassed the sample  riffle
splitting step.  To ensure all sample bottles  contained the
same material, the entire set  of containers to be filled was
submerged into the slurry as a group. The filled vials were
allowed to  settle  for a minimum  of  two days, and  the
standing water was removed  using a Pasteur pipette. The
removal of the standing water from the slurry samples was
the only change to the homogenization procedure between
the pre-demonstration and the demonstration.

Preparing "Non-Slurry" Matrices

Soils  and sediments having no excess moisture were
initially  mixed (Step 1) and then  homogenized in  the
sample  riffle  splitter (Step 2).  Prior to these steps, the
material was  air-dried  and  subsampled to  reduce  the
volume of material to a  size that was easier to handle.
As shown in Figure 4-1 (Step 1), the non-slurry subsample
was  manually stirred with a spoon or similar equipment
until  the material  was  visually  uniform.   Immediately
following manual mixing, the subsample was mixed and
split six times for more complete homogenization (Step 2).
After the sixth and final split, the sample material was
leveled to form a flattened, elongated rectangle and cut into
transverse sections to fill the containers (Steps 3 and 4).
After homogenization, 20-mL sample vials were filled and
prepared for shipment (Step 5).

For the demonstration, the vendor analyzed 197 samples,
which included replicates of up to 7 samples per sample
lot.    The   majority of the  samples distributed had
concentrations within the range of the vendor's technology.
Some samples had expected concentrations at or below
the estimated level of detection  for each of the vendor
instruments.  These samples were designed to evaluate
the  reported MDL  and PQL and also to assess  the
prevalence of false positives. Field samples distributed to
the vendor included sediments and soils collected from all
four  sites  and  prepared  by both  the  slurry and  dry
homogenization  procedures.   The field  samples were
segregated into broad sample sets: low, medium, and high
mercury concentrations. This gave the vendor the same
general understanding of the sample to be analyzed  as
they  would  typically have for  field  application  of their
instrument.
[Figure 4-1 is a flow diagram of the homogenization SOP: test material is
mixed until visually uniform (non-slurries manually; slurries mechanically,
first in the entire sample volume and then in a subsample transferred to a
mixing vessel). Slurries are transferred directly to 20-mL vials by
submerging the vials in the slurry. Non-slurries pass through a riffle
splitter, with combined splits reintroduced six times; the sixth split is
leveled into an elongated rectangular pile on a Teflon surface, divided into
sample aliquots by transverse cuts, and the cut sections are transferred to
20-mL vials. Samples are shipped at 4 °C to the referee laboratory and Oak
Ridge; container numbers will vary.]

Figure 4-1. Test sample preparation at the SAIC GeoMechanics Laboratory.
In  addition,  selected  field  samples  were spiked  with
mercury (II) chloride to generate samples with additional
concentrations  and  test  the  ability  of  the  vendor's
instrumentation to measure the  additional  species  of
mercury.  Specific  information  regarding the  vendor's
sample distribution is included in Chapter 6.

4.3.1.2 Standard Reference Materials

Certified SRMs were analyzed by both the vendors and the
referee laboratory.  These samples were homogenized
matrices which had  a known concentration  of mercury.
Concentrations were certified values, as provided by the
supplier,  based on independent confirmation via multiple
analyses of multiple  lots  and/or  multiple analyses by
different  laboratories (i.e.,  round robin testing).  These
analytical results were then used to determine "true"
values, as well as statistically derived intervals (95%
prediction intervals) that provided a range within which the
true values were expected to fall.
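A 95% prediction interval of the kind described above can be computed from n round-robin results as mean ± t·s·√(1 + 1/n). A minimal sketch, assuming that form; the t critical value must be looked up from a table for df = n − 1, so it is passed in as a parameter:

```python
import math
import statistics

def prediction_interval_95(values, t_crit):
    """95% prediction interval for a single future measurement of an
    SRM, from n round-robin results: mean +/- t_crit * s * sqrt(1 + 1/n).
    t_crit is the two-sided 95% t critical value for df = n - 1."""
    n = len(values)
    m = statistics.fmean(values)
    s = statistics.stdev(values)
    half_width = t_crit * s * math.sqrt(1 + 1 / n)
    return m - half_width, m + half_width
```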

The  SRMs selected  were designed to encompass the
same contaminant  ranges indicated  previously:  low-,
medium-,  and  high-level mercury concentrations.    In
addition,  SRMs of varying matrices were included in the
demonstration to challenge the vendor technology as well
as the referee laboratory. The referee laboratory analyzed
all SRMs.  SRM samples were intermingled with site  field
samples and labeled in the same manner as field samples.

4.3.1.3 Spiked Field Samples

Spiked  field  samples were prepared   by  the  SAIC
GeoMechanics Laboratory using  mercury (II) chloride.
Spikes were prepared  using  field samples from  the
selected  sites.   Additional  information was  gained by
preparing  spikes  at  concentrations  not   previously
obtainable. The SAIC GeoMechanics Laboratory's ability
to  prepare spikes was tested prior to the  demonstration
and evaluated in order to determine expected variability
and accuracy of the spiked sample.  The spiking procedure
was evaluated by preparing several different spikes using
two different spiking procedures (dry and wet).  Based
upon replicate analyses results, it was determined that the
wet, or slurry, procedure was the only effective method  of
obtaining a homogeneous spiked sample.

4.3.2  Sample Management

4.3.2.1 Sample Volumes, Containers, and Preservation

A subset from the pre-demonstration field samples  was
selected  for  use in  the  demonstration  based on the
sample's mercury concentration range and sample  type
(i.e., sediment versus soil).   The SAIC GeoMechanics
Laboratory  prepared  individual batches of field  sample
material to fill sample containers for each vendor. Once all
containers from a field sample were filled, each container
was labeled and cooled to 4  °C.   Because mercury
analyses were  to be performed both by the vendors in the
field and by the referee laboratory, adequate sample size
was  taken  into  account.    Minimum  sample  size
requirements for the vendors varied from 0.1 g or less to
8-10 g.  Only  the referee laboratory  analyzed separate
sample aliquots for parameters other than mercury. These
additional parameters included arsenic, barium, cadmium,
chromium,  lead, selenium,  silver, copper, zinc,  oil and
grease, and total organic carbon (TOC). Since the mercury
method  (SW-846  7471B) being  used by  the  referee
laboratory requires 1 g for analysis, the sample size sent to
all participants was a 20-mL  vial (approximately 10 g),
which ensured a sufficient volume and mass for analysis
by all vendors.

4.3.2.2  Sample Labeling

The sample labeling used for the 20-mL vials consisted of
an internal code developed by SAIC. This "blind" code was
used  throughout  the  entire  demonstration.   The  only
individuals  who  knew the  key  to  the coding   of  the
homogenized samples to the specific  field samples were
the SAIC  TOM,  the  SAIC  GeoMechanics  Laboratory
Manager, and the SAIC QA Manager.

4.3.2.3  Sample   Record  Keeping,  Archiving,  and
        Custody

Samples were  shipped to  the  laboratory  and   the
demonstration  site the week prior to the demonstration. A
third set of vials was archived at the SAIC GeoMechanics
Laboratory as reserve samples.

The sample shipment to Oak Ridge  was  retained at all
times  in  the custody of SAIC at their Oak Ridge office until
arrival of the demonstration  field  crew.  Samples were
shipped under chain-of-custody (COC) and with  custody
seals on both the coolers  and the inner plastic bags. Once
the demonstration crew arrived, the coolers were retrieved
from the SAIC office.  The custody seals on the plastic
bags inside the cooler were broken by the vendor upon
transfer.

Upon  arrival at the ORNL site,  the  vendor set  up  the
instrumentation at the direction and oversight of SAIC. At
the start of sample testing, the vendor was provided with a
sample  set representing field  samples  collected  from a
particular field site, intermingled with SRM and  spiked
samples.    Due  to  variability   of  vendor  instrument
measurement ranges for mercury detection, not all vendors
received  samples  from  the same field material.   All
samples were stored in an ice cooler prior to demonstration
startup and were stored in an on-site sample refrigerator
during the demonstration. Each sample set was identified
and distributed as a set, with  respect to the site from which
it was  collected.  This was done because, in any field
application, the location  and general type of the samples
would be known.

The vendor was responsible for analyzing all samples
provided,  performing any  dilutions   or  reanalyses  as
needed, calibrating the instrument if applicable, performing
any necessary maintenance, and reporting all results. Any
samples  that were not analyzed during the  day were
returned to  the vendor for analysis at the beginning of the
next day.   Once analysis of the samples from  the first
location were completed by  the vendor, SAIC provided a
set of samples from the second location. Samples were
provided  at the time that  they were requested by the
vendor.  Once  again, the transfer of samples was
documented using a chain-of-custody (COC) form.

This process was repeated for samples from each location.
SAIC maintained custody of all remaining sample sets until
they were transferred to the vendor.   SAIC maintained
custody of samples that already had  been analyzed and
followed the waste handling procedures in Section 4.2.2 of
the Field Demonstration QAPP to dispose of these wastes.
4.4    Reference Method Confirmatory Process

The  referee laboratory analyzed  all  samples that were
analyzed by the  vendor technologies in the field.  The
following subsections provide information on the selection
of the  reference  method,  selection of  the  referee
laboratory, and details regarding the  performance of the
reference  method  in  accordance with  EPA protocols.
Other parameters  that were analyzed  by  the referee
laboratory are also discussed briefly.

4.4.1   Reference Method Selection
The selection of SW-846 Method 7471B as the reference
method  was  based on  several factors,  predicated  on
information obtained from the technology vendors, as well
as the  expected contaminant types  and soil/sediment
mercury concentrations  expected  in  the test matrices.
There are several laboratory-based, promulgated methods
for the analysis of  total mercury.   In addition, there  are
several performance-based methods for the determination
of various  mercury  species.   Based  on  the  vendor
technologies, it was determined that a reference method
for total mercury would be needed (Table 1-2 summarizes
the methods evaluated, as identified through a review of
the EPA Test Method Index and SW-846).

In selecting  which of the  potential methods  would  be
suitable as a reference method, consideration was given to
the following questions:

   Was the method widely used and accepted? Was the
   method an EPA-recommended, or similar regulatory
   method?  The selected reference  method should  be
   sufficiently used  so that it could be  cited  as  an
   acceptable  method  for  monitoring  and/or  permit
   compliance among regulatory authorities.

   Did the selected reference method  provide QA/QC
   criteria  that  demonstrate acceptable performance
   characteristics over time?

   Was the method  suitable for the species of mercury
   that were expected to be encountered? The reference
   method  must be  capable  of  determining, as total
   mercury, all forms of the contaminant known or likely
   to be present in the matrices.

   Would the method achieve the necessary detection
   limits  to  evaluate  the  sensitivity of  each  vendor
   technology adequately?

   Was the method suitable for the concentration range
   that was expected in the test matrices?

Based on the above considerations, it was determined that
SW-846  Method  7471B (analysis of mercury in  solid
samples by cold-vapor AAS) would be  the best reference
method. SW-846 Method 7474 (an atomic fluorescence
spectrometry method using Method 3052 for microwave
digestion of the solid) had also been considered a likely
technical candidate; however, because this  method was
not as  widely used  or  referenced, Method  7471B was
considered the better choice.

4.4.2  Referee Laboratory Selection
During the planning of the pre-demonstration phase of this
project, nine laboratories were sent a  statement of work
(SOW) for the analysis of mercury to be performed as part
of the pre-demonstration.  Seven of the nine laboratories
responded to the SOW with appropriate bids. Three of the
seven laboratories were selected as candidate laboratories
based  upon  technical  merit, experience,  and pricing.
These laboratories received and analyzed blind samples
and SRMs during pre-demonstration activities. The referee
laboratory to be used for the demonstration was selected
from these three candidate laboratories.  Final selection of
the referee laboratory was based upon: 1) the laboratory's
interest in  continuing  in  the  demonstration,  2)  the
laboratory-reported SRM results, 3) the laboratory MDL for
the reference  method selected, 4) the  precision of the
laboratory calibration curve, 5) the  laboratory's ability to
support the demonstration (scheduling conflicts, backup
instrumentation, etc.), and 6) cost.

One  of the three  candidate laboratories was eliminated
from selection  based on a technical consideration. It was
determined that this laboratory would not be able to meet
demonstration  quantitation limit requirements.  (Its lower
calibration standard was approximately 50 ug/kg, and the
vendor  comparison requirements  were  well below  this
value.)  Two  candidates thus remained,  including  the
eventual demonstration laboratory,  Analytical Laboratory
Services, Inc. (ALSI):

        Analytical Laboratory Services, Inc.
        Ray Martrano, Laboratory Manager
        34 Dogwood Lane
        Middletown, PA 17057
        (717)944-5541

In  order to make  a final decision on  selecting a referee
laboratory, a preliminary audit was performed by the SAIC
QA Manager at the remaining  two candidate laboratories.
Results of the  SRM samples were compared for the  two
laboratories. Each laboratory analyzed each sample (there
were two SRMs) in triplicate. Both laboratories were within
the 95% prediction interval for each SRM. In addition, the
average result from the two SRMs was compared to the
95% CI for the SRM.

Calibration curves from  each laboratory were reviewed
carefully. This included calibration curves generated from
previously performed analyses and those generated for
other laboratory clients. There were two QC requirements
regarding calibration curves: the correlation coefficient had
to be 0.995 or greater, and the lowest point on the
calibration curve had to be within 10% of the predicted
value.  Both laboratories were able to achieve these  two
requirements for  all curves reviewed and for a  lower
standard of 10  ug/kg, which  was the  lower standard
required for the demonstration,  based upon information
received from each of the vendors.  In addition, an analysis
of  seven  standards was reviewed  for  MDLs.   Both
laboratories were able to achieve an MDL that was below
1 ug/kg.
It should be noted that vendor sensitivity claims affected
how low this lower quantitation standard needed to be. These
claims were somewhat vague, and the actual quantitation
limit each vendor could achieve was uncertain prior to the
demonstration (i.e., some vendors claimed a sensitivity as
low as 1 ug/kg, but it was uncertain at the time whether this
limit was actually a PQL or a detection limit). Therefore, it
was determined that, if necessary, the laboratory should
be able to achieve an even lower PQL than 10 ug/kg.

For both laboratories, SOPs based upon SW-846 Method
7471B were reviewed. Each SOP followed this reference
method.   In  addition,  interferences were  discussed
because   there   was   some   concern  that   organic
interferences  may have been present in the  samples
previously analyzed by the laboratories.  Because these
same  matrices  were expected  to  be  part  of  the
demonstration, there was some concern  associated with
how these interferences would be eliminated.  This is
discussed at the end of this  subsection.

Sample throughput was somewhat important because the
selected  laboratory was to receive  all demonstration
samples at the same time (i.e., the samples were to be
analyzed  at  the  same time in order to eliminate any
question of variability associated with loss of contaminant
due to holding time). This meant that the laboratory would
receive approximately 400 samples for analysis over  the
period  of  a  few  days.  It  was also  desirable  for  the
laboratory to produce a data  report within  a 21-day
turnaround time for purposes of the demonstration. Both
laboratories   indicated   that   this   was   achievable.
Instrumentation was  reviewed  and examined  at  both
laboratories.  Each  laboratory  used  a Leeman mercury
analyzer for analysis.  One of the two laboratories had
backup  instrumentation  in   case  of problems.   Each
laboratory indicated that its Leeman mercury analyzer was
relatively new and had not been a problem in the past.

Previous SITE program experience was another factor
considered as part of these pre-audits. This is because the
SITE program generally requires a very high level of QC,
such that most laboratories are not familiar with the QC
required unless they have previously participated in the
program. A second aspect of the SITE program is that it
generally requires analysis of relatively "dirty" samples, and
many laboratories are not used to analyzing such "dirty"
samples. Both laboratories have been longtime
participants in this program.

Other QC-related  issues  examined  during  the audits
included:  1) analyses of other SRM  samples not previously
examined, 2) laboratory control charts, and  3) precision
and accuracy results.  Each of these issues was closely
examined.  Also, because of  the desire to increase the
representativeness of the samples for the demonstration,
each laboratory was asked if sample aliquot sizes could be
increased  to 1 g (the method  requirement  noted 0.2 g).
Based  upon previous results,  both laboratories  routinely
increased  sample  size to 0.5 g, and each laboratory
indicated that increasing the sample size would  not be a
problem. Besides these QC issues, other less tangible QA
elements  were  examined.    This  included   analyst
experience,   management  involvement   in   the
demonstration, and internal laboratory QA management.
These elements were also factored into the final decision.

Selection  Summary

There were very few factors that separated the quality of
these two laboratories.  Both were exemplary in performing
mercury analyses.   There were,  however, some minor
differences based upon this evaluation that were  noted by
the auditor. These  were as follows:

   ALSI had  backup instrumentation available.   Even
   though neither laboratory reported any problems with
   its primary instrument (the Leeman mercury analyzer),
   ALSI did have a backup instrument in case there were
   problems with the primary  instrument, or in the event
   that the laboratory needed to perform other mercury
   analyses during the demonstration time.

   As  noted, the  low standard requirement  for the
   calibration curve  was one of the QC  requirements
   specified for this demonstration in order to ensure that
   a lower quantitation could be achieved.  This low
   standard was 10 ug/kg for both laboratories.  ALSI,
   however, was able to show experience in being able to
   calibrate  much lower than  this, using  a  second
   calibration curve.  In the event that the vendor was
   able to analyze at concentrations as low as 1 ug/kg
   with precise and accurate  determinations, ALSI was
   able to perform analyses at lower concentrations as
   part of the demonstration.  ALSI used a second, lower
   calibration curve for any analyses required below 0.05
   mg/kg.  Very  few vendors  were  able to  analyze
   samples at concentrations at this low a level.

   Management practices and analyst experience  were
   similar at both laboratories. ALSI had participated in a
   few more  SITE  demonstrations  than  the  other
   laboratory, but this difference  was  not significant
   because both  laboratories had proven themselves
   capable of handling the additional QC requirements for
   the SITE program.  In addition, both  laboratories had
    internal QA management procedures to  provide the
    confidence needed to achieve SITE requirements.

    Interferences for the samples previously analyzed were
    discussed and data were reviewed.  ALSI performed
    two separate analyses for each sample. This included
    analyses   with  and  without  stannous   chloride.
    (Stannous  chloride is the  reagent  used to  release
    mercury into the vapor phase for analysis. Sometimes
    organics can cause interferences in the vapor phase.
    Therefore, an analysis with no stannous chloride would
    provide information on organic interferences.) The
    other laboratory did not routinely perform this analysis.
    Some  samples were thought to  contain  organic
    interferences, based  on previous sample  results. The
    pre-demonstration results reviewed indicated that no
    organic interferences were present.  Therefore, while
    this was  thought to  be  a  possible discriminator
    between the two laboratories  in terms of analytical
    method performance, it became moot for the samples
    included in this demonstration.

The factors above were considered in the final evaluation.
Because there were only minor differences in the technical
factors, cost of analysis  was used as the discriminating
factor.   (If there had  been  significant differences  in
laboratory quality, cost would not  have been  a factor.)
ALSI  was  significantly  lower  in  cost  than  the  other
laboratory.  Therefore, ALSI was chosen as the referee
laboratory for the demonstration.

4.4.3   Summary of Analytical Methods

4.4.3.1  Summary of Reference Method

The critical measurement for this study was the analysis of
mercury in soil and sediment samples.  Samples analyzed
by the  laboratory included field  samples, spiked field
samples,  and  SRM  samples.    Detailed   laboratory
procedures for subsampling, extraction, and analysis were
provided in  the SOPs included as Appendix B of the Field
Demonstration  QAPP.   These are briefly summarized
below.

Samples were analyzed for mercury using Method 7471B,
a cold-vapor atomic absorption method,  based on the
absorption of light at the 253.7-nm wavelength by mercury
vapor.  The mercury is reduced to the elemental state and
stripped/volatilized from solution in a closed system. The
mercury vapor passes through a cell positioned  in  the light
path of the AA spectrophotometer.  Absorbance (peak
height)  is   measured  as  a function   of  mercury
concentration.   Potassium  permanganate is  added  to
eliminate possible interference  from sulfide.  As per the
method, concentrations as high as 20 mg/kg of sulfide, as
sodium sulfide, do not interfere with the recovery of added
inorganic mercury in reagent water. Copper has also been
reported  to  interfere; however, the  method  states that
copper concentrations as high as 10  mg/kg had no effect
on recovery of mercury from spiked samples.  Samples
high in chlorides require additional permanganate (as much
as 25 mL) because, during the oxidation step, chlorides are
converted to free chlorine, which also absorbs radiation at
254 nm. Free chlorine is removed by using an excess (25
mL) of hydroxylamine sulfate reagent. Certain volatile
organic materials that absorb at this wavelength may also
cause  interference.   A  preliminary analysis   without
reagents  can determine  if this  type  of  interference is
present.

Prior to analysis, the contents of the sample container are
stirred, and the sample mixed prior to removing an aliquot
for the mercury  analysis. An aliquot of soil/sediment (1 g)
is placed in the  bottom of a biochemical oxygen demand
bottle, with reagent water and aqua  regia added.  The
mixture is heated  in a water bath at 95 °C for 2 minutes.
The solution is cooled and reagent water and potassium
permanganate solution are added to the sample bottle.
The bottle contents are thoroughly mixed, and  the bottle is
placed in the water bath  for 30 minutes  at 95 °C. After
cooling, sodium  chloride-hydroxylamine sulfate is added to
reduce the  excess permanganate.  Stannous chloride is
then added and the  bottle attached  to the analyzer; the
sample is aerated  and  the  absorbance  recorded.    An
analysis without stannous chloride is also included as an
interference  check  when  organic   contamination  is
suspected.  In  the event of positive results  of the  non-
stannous chloride analysis, the  laboratory was to report
those results to SAIC so  that a determination of organic
interferences could be made.

4.4.3.2 Summary   of   Methods   for  Non-Critical
       Measurements.

A  selected  set of  non-critical  parameters  was   also
measured during the demonstration.  These  parameters
were measured to provide a better insight into the chemical
constituency of the field samples, including the presence of
potential interferents. The results of the tests for potential
interferents were  reviewed  to determine if a  trend was
apparent in the event that inaccuracy or low precision was
observed.   Table  4-4  presents the  analytical  method
reference   and  method   type   for  these  non-critical
parameters.
Table 4-4.  Analytical Methods for Non-Critical Parameters

 Parameter                   Method Reference   Method Type
 Arsenic, barium, cadmium,   SW-846 3050/6010   Acid digestion, ICP
 chromium, lead, selenium,
 silver, copper, and zinc
 Oil and Grease              EPA 1664           n-Hexane extraction,
                                                gravimetric analysis
 TOC                         SW-846 9060        Carbonaceous analyzer
 Total Solids                EPA 2540G          Gravimetric
4.5     Deviations from the Demonstration Plan
There was one deviation to the demonstration plan. The
samples were distributed to Ohio Lumex by site (Carson
River, Oak Ridge, etc.) as planned; however, due to the
potential for memory effects,  Ohio Lumex analyzed the
high concentration samples from all sites prior to analyzing
the low concentration samples for any of the sites.

Additionally, Ohio Lumex was able to complete all analyses
during the demonstration; however, they were unable to
locate the results for one data point, and therefore,
provided data for 196 samples prior to leaving the
demonstration site.
                                             Chapter 5
              Assessment of Laboratory Quality Control Measurements
5.1    Laboratory QA Summary
QA may be defined as a system of activities, the purpose
of which is to provide assurance that defined standards of
quality are met with a stated level of confidence.  A QA
program is a  means of integrating the quality planning,
quality assessment, QC, and quality improvement efforts
to meet user requirements.  The  objective  of the QA
program is to reduce measurement errors to agreed-upon
limits, and to  produce  results of acceptable and known
quality. The QAPP specified the necessary guidelines to
ensure that  the  measurement  system for  laboratory
analysis was in control, and provided detailed information
on the analytical approach to  ensure  that data of high
quality could be obtained  to achieve project  objectives.
The laboratory analyses were critical to project success, as
the laboratory results  were  used  as  a  standard for
comparison to the field method results. The field methods
are of unknown quality, and  therefore, for comparison
purposes the  laboratory analysis  needed to be a known
quantity. The following sections provide information on the
use of data quality indicators, and a detailed summary of
the QC analyses associated with project objectives.

5.2    Data  Quality Indicators  for  Mercury
       Analysis
To assess the quality of the data generated by  the referee
laboratory, two important data quality indicators of primary
concern are  precision and accuracy.  Precision can  be
defined as the degree of mutual agreement of independent
measurements generated through repeated application of
the process under specified conditions.  Accuracy is the
degree of agreement of a measured value with the true or
expected  value.   Both accuracy  and  precision  were
measured  by the  analysis of matrix spike/matrix spike
duplicates (MS/MSDs).  The precision of the spiked
duplicates is evaluated by expressing, as a percentage, the
difference between results of the  sample and sample
duplicate results. The relative percent difference (RPD) is
calculated as:

     RPD = (Maximum Value - Minimum Value) / [(Maximum Value + Minimum Value)/2] x 100
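As an illustrative sketch (not part of the reference method), the RPD calculation can be expressed as:

```python
def relative_percent_difference(a, b):
    """RPD between a sample result and its duplicate, as a percentage.

    The absolute difference between the two results is divided by
    their mean, then multiplied by 100.
    """
    return abs(a - b) / ((a + b) / 2) * 100

# Example: duplicate mercury results of 0.50 and 0.55 mg/kg
print(round(relative_percent_difference(0.50, 0.55), 1))  # 9.5
```

A pair of identical results gives an RPD of 0%, and larger disagreement between duplicates drives the RPD up toward the 20% project limit.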

To determine and evaluate accuracy, known quantities of
the target analytes were spiked into selected field samples.
All spikes were post-digestion spikes because of the high
sample   concentrations  encountered   during   the
demonstration.     Pre-digestion  spikes,   on  high-
concentration samples would either have been diluted or
would have required additional studies to determine the
effect of spiking more analyte and subsequent recovery
values.  To determine matrix spike recovery,  and hence
measure accuracy, the following equation was applied:
     %R = (Css - Cus) / Csa x 100

where,

        Css    =      Analyte concentration in spiked sample
        Cus    =      Analyte concentration in unspiked sample
        Csa    =      Analyte concentration added to sample
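The spike recovery calculation above can be sketched as follows (function and variable names are illustrative, not from the method):

```python
def percent_recovery(c_spiked, c_unspiked, c_added):
    """Matrix spike recovery: %R = (Css - Cus) / Csa x 100."""
    return (c_spiked - c_unspiked) / c_added * 100

# Example: 1.45 mg/kg measured in the spiked sample, 0.50 mg/kg
# in the unspiked sample, 1.00 mg/kg of mercury spiked in
print(round(percent_recovery(1.45, 0.50, 1.00), 1))  # 95.0
```

A recovery near 100% indicates the matrix is not suppressing or enhancing the measurement; the project limits were 80% to 120%.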

Laboratory control  samples (LCSs)  were used as  an
additional measure of accuracy in the event of significant
matrix interference. To determine the percent recovery of
LCS analyses, the equation below was used:
          %R = (Measured Concentration / Theoretical Concentration) x 100
While several precautions were taken to generate data of
known quality through control of the measurement system,
the data must also be representative of true conditions and
comparable   to   separate   sample   aliquots.
Representativeness refers to the degree to which
analytical results accurately and precisely reflect actual
conditions present at the  locations chosen for sample
collection. Representativeness was evaluated as part of
the pre-demonstration and combined with the  precision
measurement  in  relation to sample aliquots.  Sample
aliquoting by the SAIC GeoMechanics Laboratory tested
the ability of the  procedure to produce homogeneous,
representative,  and comparable  samples. All  samples
were  carefully  homogenized   in  order  to  ensure
comparability between  the  laboratory and the vendor.
Therefore, the RSD measurement objective of 25% or less
for replicate sample lot analysis was intended to assess not
only precision but also representativeness and comparability.

Sensitivity was  another  critical factor  assessed for the
laboratory method of analysis.  This was measured  as a
practical quantitation limit and  was determined by the low
standard on the calibration curve. Two separate calibration
curves were run by the laboratory when necessary.  The
higher calibration curve was used for the  majority of the
samples and had a lower calibration limit of 25 ug/kg. The
lower calibration curve was used when samples were
below this lower calibration standard. The lower calibration
curve had a lower limit standard of 5 ug/kg. The lower limit
standard of the calibration curve was run with each sample
batch as a check standard and was  required to  be within
10% of  the  true value (QAPP QC  requirement).   This
additional check on analytical sensitivity was performed to
ensure  that  this  lower  limit   standard  was   truly
representative  of  the  instrument  and method  practical
quantitation limit.
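The 10% acceptance check on the low calibration standard can be sketched as follows (the concentrations shown are illustrative, not demonstration data):

```python
def low_check_ok(measured, true_value, tolerance=0.10):
    """Return True if a low check standard result falls within the
    QAPP tolerance (10%) of its true value."""
    return abs(measured - true_value) / true_value <= tolerance

# A 5 ug/kg low standard measured at 4.8 ug/kg passes the check;
# measured at 4.3 ug/kg it would fail and require reanalysis.
print(low_check_ok(4.8, 5.0))  # True
print(low_check_ok(4.3, 5.0))  # False
```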

5.3     Conclusions    and   Data     Quality
        Limitations
Critical sample  data and associated QC analyses were
reviewed to determine whether the data collected were of
adequate  quality  to provide  proper evaluation  of the
project's technical objectives.  The results of  this  review
are summarized below.
Accuracy objectives for mercury analysis by Method 7471B
were assessed by the evaluation of 23 spiked duplicate
pairs, analyzed in accordance with standard procedures in
the same manner as the samples. Recovery values for the
critical compounds were well within objectives specified in
the QAPP, except for two spiked samples summarized in
Table 5-1. The results of these samples, however, were
only slightly outside specified limits, and given the total
number of samples (46, or 23 pairs), this is an insignificant
number of results that did not fall within specifications. The
MS/MSD results, therefore, support the overall
accuracy objectives.
Table 5-1.  MS/MSD Summary

 Parameter                                 Value
 QC Limits                                 80% - 120%
 Recovery Range                            85.2% - 126%
 Number of Duplicate Pairs                 23
 Average Percent Recovery                  108%
 No. of Spikes Outside QC Specifications   2
An additional measure of accuracy was the LCS analyses.
These were analyzed with every sample batch (1 in 20
samples), and results are presented in Table 5-2. All
results were within specifications, thereby supporting the
conclusion that QC assessment met project accuracy
objectives.
Table 5-2.  LCS Summary

 Parameter                                 Value
 QC Limits                                 90% - 110%
 Recovery Range                            90% - 100%
 Number of LCSs                            24
 Average Percent Recovery                  95.5%
 No. of LCSs Outside QC Specifications     0
Precision  was  assessed  through  the analysis  of 23
duplicate spike pairs for mercury.  Precision specifications
were established prior to the demonstration as a RPD less
than 20%. All but two sample pairs were within
specifications, as noted in Table 5-3. The results of these
samples, however, were only slightly outside specified
limits, and given the total number of samples (23 pairs),
this is an insignificant number of results that did not fall
within specifications. Therefore, laboratory analyses met
precision specifications.
Table 5-3.  Precision Summary

 Parameter                                 Value
 QC Limits                                 RPD < 20%
 MS/MSD RPD Range                          0.0% to 25%
 Number of Duplicate Pairs                 23
 Average MS/MSD RPD                        5.7%
 No. of Pairs Outside QC Specifications    2



Sensitivity results were within specified project objectives.
The  sensitivity objective was evaluated as the PQL, as
assessed by the low standard on the calibration curve. For
the majority of samples, a calibration curve of 25-500 ug/kg
was  used.  This is because the majority of samples fell
within  this calibration range  (samples often  required
dilution).  There were, however, some samples below this
range and a second curve was used. The calibration range
for this lower curve was 5-50 ug/kg. In order to ensure that
the lower concentration on the calibration curve was a true
PQL, the laboratory ran  a low check standard (lowest
concentration on the calibration curve) with  every batch of
samples.  This standard was required to be within 10% of
the specified value. The results of this low check standard
are summarized  in Table 5-4.
Table 5-4.  Low Check Standards

 Parameter                                 Value
 QC Limits                                 Recovery 90% - 110%
 Recovery Range                            88.6% - 111%
 Number of Check Standards Analyzed        23
 Average Recovery                          96%
There were a few occasions where this standard did not
meet  specifications.  The  results  of these  samples,
however, were  only slightly outside specified limits, and
given  the  number of total  samples  (23), this  is an
insignificant number of results  that did not fall within
specifications.  In addition, the laboratory reanalyzed the
standard when specifications  were not achieved, and the
second determination always fell within the required limits.
Therefore, laboratory objectives for sensitivity were
achieved according to QAPP specifications.

As noted previously, comparability and representativeness
were assessed through the analysis of replicate samples.
Results of these replicates are presented in the discussion
on primary  project objectives for precision. These results
show that data were within project and QA objectives.

Completeness objectives were achieved for the project. All
samples were analyzed and data were provided for 100%
of the samples received  by the  laboratory.  No sample
bottles were lost or broken.

Other measures of data quality included method blanks,
calibration checks, evaluation  of linearity of the calibration
curve, holding time specifications,  and an independent
standard verification  included with  each sample  batch.
These results were reviewed for every sample batch run by
ALSI, and were within specifications. In addition, 10% of
the reported results were checked against the raw data.
Raw data  were reviewed to  ensure that sample results
were within the  calibration  range  of the instrument, as
defined by the calibration curve. A 6-point calibration curve
was generated at the start of each sample batch of 20. A
few data points were found  to  be incorrectly reported.
Recalculations were  performed for these data, and any
additional data points that were suspected  outliers were
checked to ensure correct results were reported. Very few
calculation or dilution errors were found. All errors were
corrected so that the appropriate data were reported.

Another measure of compliance was the non-stannous
chloride run performed by the laboratory for every sample
analyzed. This was done to check for organic interference.
No samples were found to have organic interference by
this method. Therefore, these results met expected QC
specifications, and the data were not qualified in any
fashion.

Total solids data were also reviewed to ensure that
calculations were performed appropriately and dry weights
were reported when required. All of these QC checks met
QAPP  specifications.   In  summary,  all data  quality
indicators and QC specifications were reviewed and found
to be well within project specifications. Therefore, the data
are considered suitable for purposes of this evaluation.

5.4   Audit Findings
The SAIC SITE QA Manager conducted audits of both field
activities and of the subcontracted laboratory as part of the
QA measures for this  project.   The  results  of these
technical system reviews are discussed  below.
The field audit resulted in no findings or non-conformances.
The audit performed at the subcontract laboratory was
conducted during the time of project sample analysis. One
non-conformance was identified and corrective action was
initiated. It was discovered that the laboratory PQL was not
meeting specifications due to a reporting error. The analyst
was generating the calibration curves as specified above;
however, the lower limit on the calibration curve was not
being reported. This was immediately rectified, and no
other findings or non-conformances were identified.
                                              Chapter 6
                            Performance of the RA-915+/RP-91C
Ohio Lumex analyzed 197 samples from May 5-8, 2003, in
Oak Ridge, TN. Results for these samples were reported
by Ohio Lumex, and a statistical evaluation was performed.
Additionally, the observations made during the
demonstration were reviewed, and the remaining primary
and secondary objectives were completed. The results of
the primary and secondary objectives, identified in Chapter
1, are discussed in Sections 6.1 and 6.2, respectively.
The distribution of the samples prepared for Ohio Lumex
and the referee laboratory is presented in Table 6-1. From
the four sites, Ohio Lumex received samples at 36 different
concentrations for a total of 197 samples. These 197
samples consisted of 22 concentrations in replicates of 7,
1 concentration in a replicate of 4, and 13 concentrations in
replicates of 3.
Table 6-1. Distribution of Samples Prepared for Ohio Lumex and the Referee Laboratory

                                                       Sample Type
 Site              Concentration Range      Soil   Sediment   Spiked Soil   SRM
 Carson River      Low (1-500 ppb)             3         10             7     7
 (Subtotal = 62)   Mid (0.5-50 ppm)            0          0             7    28
                   High (50->1,000 ppm)        0          0             0     0
 Puget Sound       Low (1 ppb - 10 ppm)       30          0            14    13
 (Subtotal = 67)   High (10-500 ppm)           0          3             7     0
 Oak Ridge         Low (0.1-10 ppm)           10          7             7    14
 (Subtotal = 51)   High (10-800 ppm)           3          6             0     4
 Manufacturing     General (5-1,000 ppm)      10          0             0     7
 (Subtotal = 17)
 Subtotal                                     56         26            42    73
 (Total = 197)
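As an arithmetic cross-check of Table 6-1 (an illustrative sketch, not part of the demonstration procedures), the per-lot counts sum to the stated total of 197 samples:

```python
# Sample counts from Table 6-1, keyed by (site, concentration range);
# each tuple is (soil, sediment, spiked soil, SRM).
counts = {
    ("Carson River", "Low"):      (3, 10, 7, 7),
    ("Carson River", "Mid"):      (0, 0, 7, 28),
    ("Carson River", "High"):     (0, 0, 0, 0),
    ("Puget Sound", "Low"):       (30, 0, 14, 13),
    ("Puget Sound", "High"):      (0, 3, 7, 0),
    ("Oak Ridge", "Low"):         (10, 7, 7, 14),
    ("Oak Ridge", "High"):        (3, 6, 0, 4),
    ("Manufacturing", "General"): (10, 0, 0, 7),
}

total = sum(sum(row) for row in counts.values())
print(total)  # 197
```

The per-type column sums (56 soil, 26 sediment, 42 spiked soil, 73 SRM) and the per-site subtotals (62, 67, 51, 17) are consistent with the same data.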

6.1    Primary Objectives

6.1.1  Sensitivity

Sensitivity objectives are explained in Chapter 4. The two
primary sensitivity evaluations performed for this
demonstration were the MDL and PQL. Determinations of
these two measurements are explained in the paragraphs
below, along with a comparison to the referee laboratory.
These determinations set the standard for the evaluation of
accuracy and precision for the Ohio Lumex field
instrument. Any sample analyzed by Ohio Lumex and
subsequently reported as below their level of detection was
not used as part of any additional evaluations. This was
done because of the expectation that values below the
lower limit of instrument sensitivity would not reflect the true
instrument accuracy and precision.

The sensitivity measurements of MDL and PQL are both
dependent upon the matrix and method. Hence, the MDL
and PQL will vary depending upon whether the matrix is a
soil, waste, or water. Only soils and sediments were tested
during this demonstration, and therefore, MDL calculations
for this evaluation reflect soil and sediment matrices. PQL
determinations are not independent calculations, but are
dependent upon results provided by the vendor for the
samples tested.
Comparison of the MDL and PQL to laboratory sensitivity
required that a standard evaluation be performed for all
instruments tested during this demonstration. PQL, as
previously noted, is defined in EPA G-5i as the lowest level
of method and instrument performance with a specified
accuracy and precision. This is often defined by the lowest
point on the calibration curve. Our approach was to let the
vendor provide the lower limit of quantitation as determined
by its particular standard operating procedure, and then
test this limit by comparing results of samples analyzed at
this low concentration to the referee laboratory results, or
by comparing the results to a standard reference material,
if available. Comparison of these data is, therefore,
presented for the lowest concentration sample results, as
provided by the vendor. If the vendor provided "non-detect"
results, then no formal evaluation of that sample was
presented. In addition, such samples were not used in the
evaluation of precision and accuracy.

Method Detection Limit - The standard procedure for
determining  MDLs  is  to  analyze a  low standard or
reference  material seven times,  calculate the  standard
deviation and multiply the standard deviation by the "t"
value  for  seven measurements  at the 99th  percentile
(alpha  = 0.01).  (This value is 3.143 as determined from a
standard statistics table.) This procedure for determination
of an  MDL is  defined  in  40 CFR  Part 136, and while
determinations for MDLs  may be defined differently for
other instruments, this method was previously noted in the
demonstration  QAPP  and  is intended  to  provide  a
comparison to other similar MDL evaluations. The purpose
is to provide a lower level of detection with a statistical
confidence at which the instrument will detect the presence
of a substance above its noise level. There is no
associated accuracy or precision provided or implied.

Several blind standards and field samples were provided to
Ohio Lumex at their estimated lower limit of sensitivity.
The Ohio Lumex lower limit of sensitivity was previously
estimated at 0.005 mg/kg. Because there are several
different SRMs and field samples at concentrations close
to the MDL, evaluation of the MDL was performed using
more than a single concentration. Samples were chosen
for the calculation based upon: 1) how close the
concentration was to the estimated MDL, 2) the number of
analyses performed for the same sample (e.g., more than
4), and 3) whether non-detects were reported by Ohio
Lumex for a sample used to calculate the MDL. In that
case, the next highest concentration sample was selected,
on the premise that a non-detect result reported for one of
several samples indicates the selected sample is on the
"edge" of the instrument's detection capability.
Seven  replicates  were analyzed by Ohio  Lumex  for a
sample that had a reported average concentration by the
referee laboratory of 0.06 mg/kg. (Sample lot 02 from the
Puget Sound site.) The average concentration reported by
Ohio Lumex for  this sample was  0.072 mg/kg and the
standard  deviation was 0.0135 mg/kg.  An SRM with a
reference  value  of  0.017  mg/kg  (sample lot  35) was
analyzed  seven times by Ohio Lumex with a  reported
average concentration of 0.0067 mg/kg and a standard
deviation  of 0.0017 mg/kg.  Calculations of the respective
MDLs based upon each of these standards are 0.042 and
0.0053 mg/kg.
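The arithmetic behind these MDL figures can be sketched in a few lines; the Student's t value of 3.143 (seven replicates, alpha = 0.01) and the standard deviations are the values quoted above.

```python
# MDL per 40 CFR Part 136: the standard deviation of seven replicate
# analyses multiplied by the one-sided Student's t value at the 99th
# percentile for 6 degrees of freedom (t = 3.143, per the report).
T_99_N7 = 3.143

def mdl(std_dev: float) -> float:
    """Method detection limit from a seven-replicate standard deviation."""
    return T_99_N7 * std_dev

# Reported Ohio Lumex standard deviations (mg/kg):
print(round(mdl(0.0135), 3))   # sample lot 02 -> 0.042 mg/kg
print(round(mdl(0.0017), 4))   # sample lot 35 -> 0.0053 mg/kg
```

The same calculation applied to sample lot 37 (s = 0.0098 mg/kg) gives the 0.031 mg/kg cited in the following paragraph.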

As a further check of the MDL, sample lot 37 (SRM) had a
reference value of 0.158 mg/kg. Seven samples analyzed
by Ohio Lumex for this sample lot had a reported average
concentration of 0.196 mg/kg and a standard deviation of
0.0098 mg/kg. This results in a calculated MDL of  0.031
mg/kg, which falls between the values noted above.

Based upon these results, it appears that the MDL for this
instrument is somewhere between 0.0053 and 0.042
mg/kg. The lowest standard analyzed by Ohio Lumex was
the SRM noted above (sample lot 35), with a reference
value of 0.017 mg/kg (which is close to the average MDL)
and a reported Ohio Lumex average concentration of
0.0067 mg/kg. While the average result for this sample
has a percent difference (%D) of -63.5%, the sample was
easily detected by the Ohio Lumex field instrument and is,
therefore, by definition within the range of the MDL.
Consequently, the estimated sensitivity provided by Ohio
Lumex of 0.005 mg/kg is a reasonable estimate of the
MDL for aqueous samples, assuming that some samples
will likely have matrix interferences and may exhibit a
slightly higher MDL. The calculated MDL for soils and
sediments is somewhere between 0.0053 and 0.042
mg/kg. The equivalent MDL for the referee laboratory is
0.0026 mg/kg. The calculated result is intended only as a
statistical estimate, not a true test of instrument
sensitivity.

Practical Quantitation Limit - This value is usually derived
from a low standard on the instrument calibration curve,
and it is estimated as the lowest standard at which the
instrument will accurately and precisely determine a given
concentration within specified QC limits. The PQL is often
around 5-10 times the MDL. This PQL estimation,
however, is method- and matrix-dependent. In order to
determine the PQL, several low standards were provided
to Ohio Lumex, and the resulting %Ds were calculated.
                                                    34

-------
The lower limit of sensitivity previously provided by the
vendor (0.005 mg/kg) appears to be close to their MDL, but
this would likely result in a higher instrument and method
PQL. The PQL should have a precision and accuracy that
matches the instrument's capabilities within a certain
operating range of analysis. The relationship between
sensitivity and precision is such that the lower the
concentration, the higher the variation in reported sample
results. Five times the estimated MDL (the estimated PQL)
gives a value of 0.027 to 0.21 mg/kg. Therefore, values in
this range were chosen for estimating the PQL and the
associated %D between the Ohio Lumex reported average
and either the reference value, for an SRM, or the average
value reported by the referee laboratory. The 95% CIs are
also compared for additional descriptive information.

The Ohio Lumex average result for the 0.017 mg/kg SRM
noted above (sample lot 35) was 0.0067 mg/kg. The
standard deviation was 0.0017 mg/kg, and the 95% CI is
0.0051 to 0.0083 mg/kg. The %D for this sample is -63.5%;
this concentration is therefore clearly below the instrument
PQL.

The Ohio Lumex average result for the 0.158 mg/kg SRM
(sample lot 37) was 0.196 mg/kg. The standard deviation
was 0.0098 mg/kg, and the 95% CI is 0.187-0.205 mg/kg.
The %D for this sample is 24.1%. This is a reasonable %D
for most analytical instrumentation and therefore within the
instrument's PQL.

The average result reported by the referee  laboratory for
sample lot 02 was 0.06 mg/kg. The result reported by Ohio
Lumex for  this same  sample  was 0.072  mg/kg.  The
standard  deviation was 0.0135 mg/kg. The %D  for this
sample is 20%.
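The %D and 95% CI figures quoted above can be reproduced from the summary statistics. A sketch follows; the two-sided t value of 2.447 for n = 7 is an assumption taken from a standard t table, since the report does not state it.

```python
import math

def ci95(mean: float, std_dev: float, n: int, t: float = 2.447) -> tuple:
    """Two-sided 95% confidence interval from summary statistics.
    The default t is the standard two-sided 95% value for 6 df (n = 7)."""
    half = t * std_dev / math.sqrt(n)
    return (mean - half, mean + half)

def percent_diff(measured: float, reference: float) -> float:
    """Percent difference relative to the reference value."""
    return 100.0 * (measured - reference) / reference

# Sample lot 37 (reference 0.158 mg/kg): mean 0.196, s = 0.0098, n = 7
lo, hi = ci95(0.196, 0.0098, 7)
print(round(lo, 3), round(hi, 3))                  # -> 0.187 0.205
print(round(percent_diff(0.196, 0.158), 1))        # -> 24.1
```

The same functions reproduce the lot 35 interval (0.0051-0.0083 mg/kg) and the 20% %D for sample lot 02.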

Sensitivity Summary

The low standard calculations using MDL values suggest
that the PQL for the Ohio Lumex field instrument may be
as low as 0.027 mg/kg. The referee laboratory PQL
confirmed during the demonstration is 0.005 mg/kg, with a
%D of <10%. The average Ohio Lumex result for the
sample with an average referee laboratory value of 0.06
mg/kg was 0.072 mg/kg, a %D of 20%. This was the
lowest sample concentration tested during the
demonstration that is close to the calculated PQL noted
above.

The range for the calculated MDL is 0.0053 to 0.042
mg/kg, based on the results of seven replicate analyses of
low standards. The equivalent MDL for the referee
laboratory is 0.0026 mg/kg. The MDL determination,
however, is only a statistical calculation that has been
used in the past by EPA and is currently not considered a
"true" MDL by SW-846 methodology. SW-846 suggests
that performance-based methods be used and that PQLs
be determined using low standard calculations.

6.1.2 Accuracy

Accuracy is the instrument measurement compared to a
standard, or "true," value. For this demonstration, three
separate approaches were used to determine accuracy.
The primary standards are SRMs, which are traceable to
national systems. These were obtained from reputable
suppliers with a reported concentration and an associated
95% CI and 95% prediction interval. The CI from the
reference material is used as a measure of comparison
with the CI calculated from replicate analyses of the same
sample by the laboratory or vendor. Results are
considered comparable if the CIs of the SRM overlap with
the CIs computed from the replicate analyses by the
vendor. While this is not a definitive measure of
comparison, it provides some assurance that the two
values are equivalent.

Prediction intervals are intended as a measure of
comparison for a single laboratory or vendor result with the
SRM. When computing a prediction interval, the equation
assumes an infinite number of analyses, and it is used to
compare individual sample results. A 95% prediction
interval would, therefore, contain the correct result from a
single analysis 95% of the time for an infinite number of
samples, if the result is comparable to that of the SRM.
The corollary is that 5% of the time a result will be outside
the prediction interval if determined for an infinite number
of samples. If several samples are analyzed, the
percentage of results within the prediction interval will be
slightly above or below 95%. The more samples analyzed,
the more likely the percentage of correct results will be
close to 95%, if the method being tested is comparable to
the SRM.
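In practice, the prediction-interval check reduces to counting how many replicate results fall inside the interval. A minimal sketch, using the totals reported later in Tables 6-2 and 6-3:

```python
def coverage_percent(n_within: int, n_total: int) -> int:
    """Percentage of replicate results falling within the SRM's 95%
    prediction interval, rounded to the nearest whole percent."""
    return round(100 * n_within / n_total)

print(coverage_percent(53, 57))  # Ohio Lumex (Table 6-2) -> 93
print(coverage_percent(54, 62))  # ALSI (Table 6-3)       -> 87
```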

All SRMs  were analyzed in replicates  of three, four, or
seven by both the vendor and the referee laboratory. In
some instances,  analyses performed by the vendor were
determined  to  be  invalid  measurements   and  were,
therefore,  not included with  the reported results.   There
were nine  different SRMs analyzed by both the vendor and
the laboratory, for a total of  57 data points by  the vendor
and 62 data points by the laboratory. One specially
prepared SRM (sample lot 55) was excluded: analyses
performed by both the vendor and the laboratory
suggested that the SRM value was in question, the
specially prepared SRM had somewhat less documentation
regarding its reference value, and both the referee
laboratory and vendor results, while statistically equivalent
to each other, were statistically different from the SRM
value.
                                                     35

-------

The second accuracy determination compared vendor
results for field samples and SRMs to the referee
laboratory results for these same samples. Field samples
were used to ensure that "real-world" samples were tested
by the vendor. The referee laboratory result is considered
the standard for comparison to the vendor result. This
comparison takes the form of a hypothesis test with alpha =
0.01. (Detailed equations, along with additional information
about this statistical comparison, are included in Appendix B.)

It should be noted that there is evidence of a laboratory
bias. This bias was determined by comparing average
laboratory values to SRM reference values, and it is
discussed below. The laboratory bias is low in comparison
to the reference value. A bias correction was not made
when comparing individual samples (replicate analyses)
between the laboratory and vendor; however, setting alpha
= 0.01 helps mitigate this possible bias by widening the
range of acceptable results between the two data sets.

An aggregate analysis, or unified hypothesis test, was also
performed for all 33 sample  lots. (A detailed discussion of
this statistical comparison is included in Appendix B.) This
analysis provides additional  statistical evidence in relation
to  the accuracy evaluation.  A bias term is included in this
calculation in order to account for any data bias.

The third measure of  accuracy is obtained by the analysis
of spiked field  samples.  These were analyzed by the
vendor and the laboratory in replicate in  order to provide
additional measurement comparisons and are treated the
same as the other field samples. Spikes were prepared to
cover additional concentrations not available from SRMs or
field  samples.  There  is no  comparison  to  the spiked
concentration, only a  comparison between the vendor and
the laboratory reported value.

The purpose for SRM  analyses by the referee laboratory is
to  provide a check on  laboratory accuracy.   During the
pre-demonstration, the referee laboratory was chosen, in
part, based upon the  analysis of SRMs.  This was done to
ensure that a competent laboratory would be  used for the
demonstration.     The   pre-demonstration  laboratory
qualification  showed that  the  laboratory  was within
prediction intervals for all SRMs analyzed. Because of the
need to provide confidence in laboratory analysis during the
demonstration, the referee laboratory also analyzed SRMs
as an ongoing check of laboratory bias. As noted in Table
6-3,  not all laboratory results were within the prediction
interval.  This is discussed in  more  detail  below.   All
laboratory QC  checks, however,  were found to be within
compliance (see Chapter 5).

Evaluation of vendor and laboratory analyses of SRMs was
performed in the following manner. Accuracy was
determined by comparing the 95% CI of the sample
analyzed by the vendor and laboratory to the 95% CI for
the SRM. (95% CIs around the true value are provided by
the SRM supplier.) This information is provided in Tables
6-2 and 6-3, with notations where the CIs overlap,
suggesting comparable results. In addition, the number of
SRM results for the vendor's analytical instrumentation and
the referee laboratory that fall within the associated 95%
prediction interval is reported. This is a more definitive
evaluation of laboratory and vendor accuracy. The
percentage of total results within the prediction interval for
the vendor and laboratory is reported in Tables 6-2 and
6-3, respectively.

The single most important number from these tables is the
percentage of samples within the 95% prediction interval.
As noted for the Ohio Lumex data, this percentage is 93%,
with n = 57. This suggests that the Ohio Lumex data are
within expected accuracy, accounting for statistical
variation. For five of the nine determinations, Ohio Lumex
average results are above the reference value, suggesting
that there is no bias associated with the Ohio Lumex data.
For six of the nine sample groups, the 95% CIs calculated
from the Ohio Lumex data overlap with the values provided
by the supplier of the SRM. This number also suggests a
reasonable comparison to the SRM value, accounting for
statistical variation.

The percentage of samples within the 95% prediction
interval for the laboratory data is 87%. For 7 of the 9
determinations, ALSI average results are below the
reference value. This suggests that the ALSI data are
potentially biased low. Because of this bias, the
percentage of samples within the prediction interval is
slightly below the anticipated percentage, given that the
number of samples analyzed (62) is relatively high.
Nonetheless, the referee laboratory data should be
considered accurate and not significantly different from the
SRM values. Because there is no bias correction term in
the individual hypothesis tests (Table 6-4), alpha is set at
0.01 to help mitigate the laboratory bias. This in effect
widens the scope of vendor data that would fall within an
acceptable range of the referee laboratory. Six of the nine
                                                     36

-------
sample groups overlap with the 95% CIs calculated from
the ALSI data, compared to values provided by the supplier
of the SRM. This number also suggests a reasonable
comparison to the SRM value, accounting for statistical
variation.

Table 6-2.  Ohio Lumex SRM Comparison

Sample   SRM Value /            Ohio Lumex Avg. /         CI Overlap   No. of Samples   95% Prediction    No. w/in Prediction
Lot No.  95% CI                 95% CI                    (yes/no)     Analyzed         Interval          Interval
37       0.158 / 0.132-0.184    0.196 / 0.187-0.205       no           7                0-0.357           7
44       4.7 / 4.3-5.1          4.88 / 4.72-5.03          yes          6                3.0-6.4           6
35       0.017 / 0.010-0.024    0.0067 / 0.0051-0.0083    no           7                0-0.0358 (b)      7
36       0.082 / 0.073-0.091    0.071 / 0.062-0.080       yes          3                0.035-0.13 (b)    3
38       0.62 / 0.61-0.63 (a)   0.627 / 0.607-0.647       yes          7                0.545-0.695       7
39       1.09 / 1.06-1.12 (a)   1.07 / 1.01-1.13          yes          6                0.94-1.24         5
41       2.42 / 2.16-2.46       2.01 / 1.68-2.37          yes          7                1.3-3.3           7
43       3.80 / 3.50-4.11       3.64 / 3.33-3.95          yes          7                2.41-5.20         7
45       6.45 / 6.06-6.84       8.14 / 8.02-8.26          no           7                4.83-8.06         4
Total samples analyzed: 57; samples within prediction interval: 53 (93%)

(a)  CI is estimated based upon n=30. A 95% prediction interval was provided by the SRM supplier, but no CI was given.
(b)  Prediction interval is estimated based upon n=30. A 95% CI was provided by the SRM supplier, but no prediction interval was given.
Table 6-3.  ALSI SRM Comparison

Sample   SRM Value /            ALSI Avg. /               CI Overlap   No. of Samples   95% Prediction    No. w/in Prediction
Lot No.  95% CI                 95% CI                    (yes/no)     Analyzed         Interval          Interval
37       0.158 / 0.132-0.184    0.139 / 0.093-0.185       yes          7                0-0.357           7
44       4.7 / 4.3-5.1          2.33 / 1.05-3.61          no           7                3.0-6.4           2
35       0.017 / 0.010-0.024    0.0087 / 0.0078-0.0096    no           7                0-0.0358 (b)      7
36       0.082 / 0.073-0.091    0.073 / 0.068-0.078       yes          7                0.035-0.13 (b)    7
38       0.62 / 0.61-0.63 (a)   0.628 / 0.606-0.650       yes          7                0.545-0.695       7
39       1.09 / 1.06-1.12 (a)   1.24 / 0.634-1.85         yes          7                0.94-1.24         6
41       2.42 / 2.16-2.46       1.79 / 1.29-2.29          yes          7                1.3-3.3           6
43       3.80 / 3.50-4.11       2.76 / 2.51-3.01          no           7                2.41-5.20         7
45       6.45 / 6.06-6.84       5.44 / 4.10-6.78          yes          6                4.83-8.06         5
Total samples analyzed: 62; samples within prediction interval: 54 (87%)

(a)  CI is estimated based upon n=30. A 95% prediction interval was provided by the SRM supplier, but no CI was given.
(b)  Prediction interval is estimated based upon n=30. A 95% CI was provided by the SRM supplier, but no prediction interval was given.
Hypothesis Testing

Sample results from field and spiked field samples
analyzed by the vendor, compared to similar analyses by
the referee laboratory, are used as another accuracy
check. Spiked samples were used to cover concentrations
not found in the field samples, and they are treated the
same as the field samples for purposes of comparison.
Because of the limited data available for determining the
accuracy of the spiked value, these were not considered
the same as reference standards. Therefore, these
samples were evaluated in the same fashion as field
samples, but they were not compared to the individual
spiked concentrations.
Using a hypothesis test with alpha = 0.01, vendor results
for all samples were compared to laboratory results to
determine whether the sample populations are the same
or significantly different. This was performed for each
sample lot separately. Because this test does not separate
precision from bias, if Ohio Lumex's or ALSI's computed
standard deviation was large due to a highly variable result
(an indication of poor precision), the two CIs could overlap.
Therefore, a finding of no significant difference between
the two results could be due to high sample variability.
Accordingly, the associated RSDs are also reported in
Table 6-4, along with the results of the hypothesis testing
for each sample lot.
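The report's exact hypothesis-test equations are given in Appendix B; as an illustration only, a generic two-sample (Welch-style) t statistic computed from summary statistics looks like this. The inputs in the example are hypothetical, not demonstration data.

```python
import math

def welch_t(m1: float, s1: float, n1: int,
            m2: float, s2: float, n2: int) -> float:
    """Generic two-sample (Welch) t statistic from summary statistics
    (means, standard deviations, replicate counts). Illustrative only;
    the demonstration's actual test equations are in Appendix B."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Hypothetical check: identical summary statistics give t = 0,
# i.e., no evidence of a difference at any alpha.
print(welch_t(0.50, 0.05, 7, 0.50, 0.05, 7))  # -> 0.0
```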
                                                    37

-------
Table 6-4.  Accuracy Evaluation by Hypothesis Testing

Sample Lot No./ Site       Avg. Conc.   RSD or   No. of         Significantly Different   RPD (Ohio Lumex
                           (mg/kg)      CV       Measurements   at Alpha = 0.01           to ALSI)
03/ Oak Ridge                                                   yes                       19.8%
  Ohio Lumex               0.317        4.8%     3
  ALSI                     0.260        3.8%     3
09/ Oak Ridge                                                   no                        6.4%
  Ohio Lumex               0.497        23.2%    7
  ALSI                     0.466        34.2%    7
14/ Oak Ridge                                                   yes                       49.3%
  Ohio Lumex               7.86         32.0%    7
  ALSI                     4.75         27.5%    7
21/ Oak Ridge                                                   no                        42.8%
  Ohio Lumex               17.3         23.3%    3
  ALSI                     11.2         23.8%    3
24/ Oak Ridge                                                   no                        -11.5%
  Ohio Lumex               197          28.0%    3
  ALSI                     221          44.8%    7
26/ Oak Ridge                                                   yes                       23.7%
  Ohio Lumex               97.7         2.6%     3
  ALSI                     77.0         13.2%    7
37/ Oak Ridge                                                   no                        34.0%
  Ohio Lumex               0.196        5.0%     7
  ALSI                     0.139        36.4%    7
44/ Oak Ridge                                                   no                        70.7%
  Ohio Lumex               4.88         3.0%     6
  ALSI                     2.33         59.4%    7
60/ Oak Ridge                                                   no                        -10.2%
  Ohio Lumex               149          23.8%    7
  ALSI                     165          30.9%    7
02/ Puget Sound                                                 no                        3.3%
  Ohio Lumex               0.062        43.9%    7
  ALSI                     0.06         23.6%    4
05/ Puget Sound                                                 no                        23.9%
  Ohio Lumex               0.267        9.4%     3
  ALSI                     0.21         33.3%    3
08/ Puget Sound                                                 yes                       36.4%
  Ohio Lumex               0.52         14.2%    7
  ALSI                     0.36         13.4%    7
10/ Puget Sound                                                 no                        105%
  Ohio Lumex               1.76         120%     3
  ALSI                     0.55         20.5%    3
11/ Puget Sound                                                 yes                       47.2%
  Ohio Lumex               1.31         14.2%    7
  ALSI                     0.81         32.7%    7
12/ Puget Sound                                                 no                        25.8%
  Ohio Lumex               1.4          7.2%     3
  ALSI                     1.08         2.8%     3
25/ Puget Sound                                                 yes                       85.3%
  Ohio Lumex               41.3         12.4%    3
  ALSI                     16.6         12.3%    3
34/ Puget Sound                                                 no                        165%
  Ohio Lumex               117          24.7%    3
  ALSI                     11.3         23.4%    7
36/ Puget Sound                                                 no                        1.4%
  Ohio Lumex               0.071        4.9%     3
  ALSI                     0.07         6.7%     7
57/ Puget Sound                                                 yes                       34.1%
  Ohio Lumex               1.03         11.2%    7
  ALSI                     0.73         16.2%    7
                                                      38

-------
Table 6-4.  Accuracy Evaluation by Hypothesis Testing (continued)

Sample Lot No./ Site       Avg. Conc.   RSD or   No. of         Significantly Different   RPD (Ohio Lumex
                           (mg/kg)      CV       Measurements   at Alpha = 0.01           to ALSI)
61/ Puget Sound                                                 no                        -26.0%
  Ohio Lumex               154          47.0%    7
  ALSI                     200          10.9%    7
62/ Puget Sound                                                 yes                       47.5%
  Ohio Lumex               23.7         13.0%    7
  ALSI                     14.6         28.3%    7
01/ Carson River                                                no                        21.5%
  Ohio Lumex               0.29         30.5%    7
  ALSI                     0.24         37.8%    7
04/ Carson River                                                no                        18.9%
  Ohio Lumex               0.13         18.9%    3
  ALSI                     0.11         9.1%     7
06/ Carson River                                                no                        10.3%
  Ohio Lumex               0.29         7.3%     3
  ALSI                     0.26         15.7%    7
38/ Carson River                                                no                        -0.2%
  Ohio Lumex               0.63         3.5%     7
  ALSI                     0.63         3.8%     7
39/ Carson River                                                no                        -14.7%
  Ohio Lumex               1.07         6.5%     7
  ALSI                     1.24         52.9%    7
41/ Carson River                                                no                        11.6%
  Ohio Lumex               2.01         17.5%    7
  ALSI                     1.79         30.5%    7
43/ Carson River                                                yes                       27.5%
  Ohio Lumex               3.64         9.1%     7
  ALSI                     2.76         9.6%     7
56/ Carson River                                                no                        5.2%
  Ohio Lumex               0.22         8.0%     7
  ALSI                     0.23         12.6%    7
59/ Carson River                                                no                        11.1%
  Ohio Lumex               1.91         10.2%    7
  ALSI                     1.71         7.9%     7
13/ Manufacturing Site                                          no                        53.3%
  Ohio Lumex               10.2         51.9%    7
  ALSI                     5.91         15.4%    7
17/ Manufacturing Site                                          no                        39.7%
  Ohio Lumex               15.7         24.2%    3
  ALSI                     10.5         14.6%    7
45/ Manufacturing Site                                          no                        39.8%
  Ohio Lumex               8.14         1.6%     7
  ALSI                     5.44         23.4%    6

CV = coefficient of variation
Of the 33 sample lots, 9 results are significantly different
based upon the hypothesis test noted above. Most of the
relative percent differences are positive, indicating that the
Ohio Lumex result is generally higher than the laboratory
result. This is consistent with the previously noted low bias
associated with the laboratory data. Some Ohio Lumex
results, however, are less than the laboratory result;
therefore, no overall Ohio Lumex high or low bias is
apparent. It appears that the Ohio Lumex data are subject
to more random variability.
In determining the number of results significantly above or
below the value reported by the referee laboratory, 19 of 33
Ohio Lumex average results were found to have RPDs less
than 30% for sample concentrations above the estimated
PQL. Only two of the 33 Ohio Lumex average results have
RPDs greater than 100% for this same group of samples
(see Table 6-5). Interferences may be a problem, but
because of the random variability associated with the data,
no interferences are specifically apparent from the data
collected. Table 6-6 shows the results of additional data
collected for these same samples.
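The RPD values in Table 6-4 are consistent with the usual definition (the difference between the two averages divided by their mean); for example, sample lots 03 and 02 reproduce the tabulated 19.8% and 3.3%. A minimal sketch:

```python
def rpd(field: float, lab: float) -> float:
    """Relative percent difference between the field (Ohio Lumex)
    and referee laboratory (ALSI) averages, relative to their mean."""
    return 100.0 * (field - lab) / ((field + lab) / 2.0)

print(round(rpd(0.317, 0.260), 1))  # sample lot 03 -> 19.8
print(round(rpd(0.062, 0.060), 1))  # sample lot 02 -> 3.3
```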
                                                     39

-------
Table 6-5.  Number of Sample Lots Within Each %D Range

                 <30%    >30%, <50%    >50%, <100%    >100%    Total
Positive %D       14          9              3            2       28
Negative %D        5          0              0            0        5
Total             19          9              3            2       33

Only those sample lots with the average result greater than the PQL are tabulated.
Table 6-6.  Concentration (in mg/kg) of Non-Target Analytes

Lot #  Site                    TOC     O&G    Ag     As     Ba     Cd     Cr    Cu     Pb     Se     Sn     Zn     Hg
1      Carson River            870     190    <0.5   9      210    <0.5   19    13     3      <2     <5     60     0.19
2      Puget Sound             3500    290    <0.5   3      23     <0.5   16    10     1      <2     <5     24     0.04
3      Oak Ridge               2300    530    1.8    4      150    <0.5   46    20     15     <2     <5     55     0.31
4      Carson River            2400    200    <0.5   8      240    <0.5   17    32     12     <2     <5     66     0.10
5      Puget Sound             3500    210    <0.5   3      28     <0.5   18    11     3      <2     <5     28     0.16
6      Carson River            7200    200    <0.5   4      32     <0.5   16    9      1      <2     <5     24     0.23
8      Puget Sound             8100    200    <0.5   3      27     1.0    17    23     99     2      <5     37     0.37
9      Oak Ridge               3300    150    1.9    5      160    0.5    70    49     24     <2     <5     100    0.66
10     Puget Sound             4200    130    <0.5   3      24     <0.5   18    8      1      <2     <5     24     0.62
11     Puget Sound             3800    130    <0.5   4      20     <0.5   18    8      1      <2     <5     24     0.63
12     Puget Sound             3500    290    <0.5   3      23     0.8    16    7      2      <2     <5     23     1.1
13     Manufacturing Site      3200    100    <0.5   2      110    <0.5   42    51     7      <2     <5     61     5.5
14     Oak Ridge               7800    180    0.32   2      41     0.4    16    9      11     <2     <4     74     78
17     Manufacturing Site      2400    90     <0.5   <2     180    <0.5   48    20     15     <2     <5     120    10
21     Manufacturing Site      7800    320    1.9    4      150    2.8    22    40     23     <2     <4     340    14
24     Oak Ridge               6600    250    <0.5   5      89     <0.5   6.3   7      10     <2     <5     31     220
25     Puget Sound             46000   1200   <0.5   2      46     0.7    35    33     31     <2     6      98     35
26     Oak Ridge               88000   340    9.1    10     140    1.9    47    73     82     <2     5      250    100
34     SRM CRM-204 (web)       NR      NR     <0.5   0.82   0.04   14     4.5   NR     11     NR     NR     NR     0.002
35     SRM Canmet SO-3         NR      NR     NR     NR     300    NR     26    17     14     NR     NR     52     0.02
36     SRM Canmet SO-2         NR      NR     NR     NR     970    NR     16    7      21     NR     NR     120    0.08
37     SRM CRM-016             NR      NR     0.7    7.8    79     0.47   14    16     14     1      NR     70     0.16
38     SRM NWRI TH-2           NR      NR     5.8    8.7    570    5.2    120   120    190    0.83   NR     900    0.62
39     SRM NWRI WQB-1          NR      NR     1      23     600    2      89    80     84     1      3.9    275    1.09
41     SRM CRM 026             NR      NR     0.57   5.4    210    12     27    19     26     1.9    NR     140    2.4
43     SRM CRM 027             NR      NR     6      12     170    12     27    9.9    52     14     NR     51     3.8
44     SRM CRM 021             NR      NR     6.5    25     590    1.2    11    4800   6500   NR     300    550    4.7
45     SRM CRM 033             NR      NR     0.78   130    220    89     100   96     61     89     390    230    6.4
46     SRM CRM 032             NR      NR     81     370    120    130    15    590    4600   170    1300   2600   21
55     SRM RTC spec.           NR      NR     NR     NR     NR     NR     NR    NR     NR     NR     NR     NR     0.01
56     Spiked Lot 1            870     190    <0.5   9      210    <0.5   19    13     3      <2     <5     60     0.19
57     Spiked PS- X1.X4        3500    290    <0.5   3      23     <0.5   16    10     1      <2     <5     24     0.61
59     Spiked CR-SO-14         870     190    <0.5   9      210    <0.5   19    13     3      <2     <5     60     1.6
60     Spiked Lot 7            5100    150    1.1    5      120    <0.5   50    28     15     <2     <5     61     72
61     Spiked Lot 10           4200    130    <0.5   3      24     <0.5   18    8      1      <2     <5     24     220
62     Spiked Lot 5            3500    210    <0.5   3      28     <0.5   18    11     3      <2     <5     28     23

CRM = Canadian Reference Material
RTC = Resource Technology Corporation
NR = Not Reported by Standard Supplier
                                                   40

-------
Discussion of Interferences

The RSDs for Ohio Lumex are small, suggesting that
precision is good and that random variation alone does not
explain the differences noted above. (This is discussed in
more detail in Section 6.1.3.) As noted previously,
interference appears to be the cause of the inaccurate
analyses, but the interferent causing the problem is not
readily apparent. Specifically, there is no apparent
significant difference between reported values and the
sites from which the samples were collected. There are
possible exceptions, however, noted for the Puget Sound
samples, although these are only descriptive observations.
For example, discounting SRMs, only 6 of the 11 Puget
Sound results reported by Ohio Lumex are considered the
same as those from the referee laboratory. Therefore,
there may be a significant interference in the Puget Sound
samples that is not present in the other samples. Upon
examination of additional data collected for these samples
(see Table 6-6), no apparent differences were noted. For
example, a high organic content may cause interference,
but not all the Puget Sound samples necessarily have a
higher organic content than the other samples tested. In
addition, the Method 7471B mercury analysis requires that
a non-stannous chloride analysis be conducted with each
sample analyzed, in order to test for organic interferences.
Upon examination of the referee laboratory data for the
sample sets mentioned above, no apparent interference
was noted in the non-stannous chloride analyses.

Some of the Puget Sound samples also had a higher
percentage of moisture, which may help explain these
differences; but this does not explain all of the differences
or all of the similarities, and there are not enough samples
to suggest that this difference is statistically significant.
Other interferences caused by additional elements were
also not found to be significant. Of course, there could be
interferences that were not tested, and therefore, while an
interference (or, more likely, a combination of
interferences) particular to a sample lot may be
responsible, the exact cause remains unknown. The
reasons for these similarities and differences, and for the
difference between the Ohio Lumex and referee laboratory
results, remain speculative. In addition to the statistical
summary presented above, data plots (Figures 6-1 and 6-
2) are included in order to present a visual interpretation of
the accuracy.
   Figure 6-1. Data plot for low concentration sample results.
                                                     41

-------
       Figure 6-2. Data plot for high concentration sample results.
Two separate plots have been included for the Ohio Lumex
data. These two plots are divided based upon sample
concentration in order to provide a more detailed
presentation. Concentrations of samples analyzed by Ohio
Lumex ranged from approximately 0.01 to over 200 mg/kg.
The previous statistical summary eliminated some of these
data based upon whether concentrations were interpreted
to be in the analytical range of the Ohio Lumex field
instrument; this graphical presentation includes all data
points. It shows Ohio Lumex data compared to ALSI data
plotted against concentration. Sample groups are shown
by connecting lines. Breaks between groups indicate a
different set of samples at a different concentration.
Sample groups were arranged from lowest to highest
concentration.

As can be seen from this presentation, samples analyzed
by Ohio Lumex appear to match well with the ALSI results,
with some notable exceptions. This is only a visual
interpretation and does not provide statistical significance.
It does, however, support the previous statistical results for
accuracy, as presented above.

Unified Hypothesis Test

SAIC  performed a  unified  hypothesis test analysis  to
assess the comparability of analytical results provided by
Ohio Lumex and those provided by ALSI. (See Appendix
B for a detailed description of this test.)  Ohio Lumex and
ALSI both supplied multiple assays on  replicates derived
from a total of 33  different sample lots, whether field
materials or reference materials.  The  Ohio Lumex and
ALSI data  from these assays  formed  the  basis  of this
assessment.

Results from this analysis suggest that the two data sets
are not the same.  The null hypothesis tested was that, on
average,  Ohio  Lumex and ALSI produce the same results
within a given sample lot.  The null hypothesis is rejected
in  part because Ohio Lumex results tended to exceed
those from ALSI for the same sample  lot.  Even when a
bias term is used  to  correct this discrepancy, the null
                                                    42

-------
hypothesis is still rejected.  Additional information about
this statistical evaluation is included in Appendix B.

Accuracy Summary

In summary, Ohio Lumex data were within the SRM 95%
prediction intervals 93% of the time, which is statistically
equivalent to the SRM values. ALSI data also compared
favorably to the SRM values and were within the 95%
prediction interval 87% of the time, although they were
found to be biased low.

The comparison between the Ohio Lumex field data and
the ALSI results suggests that the two data sets are not the
same, a result confirmed by the unified hypothesis test.
Ohio Lumex data were found to be both above and below
the referee laboratory concentrations. Ohio Lumex
average values were less than 30% different from the
referee laboratory results or SRM reference values for 19
of the 33 sample lots. Ohio Lumex results therefore
provide accurate estimates for field determination, though
they may be affected by interferences not identified by this
demonstration. Because the Ohio Lumex data compare
favorably to the SRM values, the differences between
Ohio Lumex and the referee laboratory are likely the result
of matrix interferences.

6.1.3  Precision

Precision is usually thought of as repeatability of a specific
measurement, and it is often reported as RSD. The RSD
is computed from a specified number of replicates. The
more replications of a measurement, the higher the
confidence associated with a reported RSD. Replication of
a measurement may range from as few as 3 separate
measurements to 30 or more measurements of the same
sample, depending upon the degree of confidence desired
in the specified result. Most samples were analyzed seven
times by both Ohio Lumex and the referee laboratory. In
some cases, samples may have been analyzed as few as
three times. This was often the situation when it was
believed that the chosen sample, or SRM, was likely to be
below the vendor quantitation limit. The precision goal for
the referee laboratory, based upon pre-demonstration
results, is an RSD of 25% or less. A descriptive evaluation
of differences between Ohio Lumex RSDs and the referee
laboratory RSDs was performed. In Table 6-7, the RSD
for each separate sample lot is shown for Ohio Lumex
compared to the referee laboratory. The average RSD was
then computed for all measurements made by Ohio
Lumex, and this value was compared to the average RSD
for the laboratory.
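As a sketch, the RSD calculation described above is simply the sample standard deviation of the replicates divided by their mean, expressed as a percentage. The replicate values below are hypothetical, not demonstration data:

```python
import statistics

def rsd(replicates):
    """Relative standard deviation (%): sample standard deviation / mean x 100."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Seven hypothetical replicate mercury results (mg/kg) for one sample lot
replicates = [4.2, 4.7, 5.1, 4.9, 4.4, 5.0, 4.6]
print(round(rsd(replicates), 1))  # → 6.9, within the 25% precision goal
```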
In addition, the precision of an analytical instrument may
vary depending upon the  matrix being measured, the
concentration  of   the  analyte,   and   whether  the
measurement is made for an SRM or a field sample. To
evaluate precision for clearly different matrices, an overall
average RSD for the SRMs is calculated and compared to
the  average RSD for the field samples. This comparison
is also included in  Table 6-7 and  shown  for both Ohio
Lumex and the referee laboratory.

The purpose of this evaluation  is to determine  the field
instrument's capability  to   precisely  measure  analyte
concentrations under real-life  conditions.  Instrument
repeatability was measured  using samples from each of
four different sites.  Within each site,  there may be two
separate matrices, soil and sediment.  Not all sites have
both soil and sediment matrices, nor are there necessarily
high, medium, and low concentrations for each sample
site. Therefore, spiked samples were included  to cover
additional ranges.

Table 6-7  shows results from Oak Ridge,  Puget Sound,
Carson River, and the manufacturing site.  It was thought
that because these  four different field sites represented
different matrices, measures of precision may vary from
site to site. The average RSD for  each site is shown in
Table 6-7  and compared between Ohio  Lumex  and the
referee laboratory.   SRM RSDs  are not included  in this
comparison because SRMs, while grouped with  different
sites for purposes of ensuring that the  samples remained
blind during the demonstration, were not actually  samples
from that site, and were, therefore, compared separately.

The RSDs of various concentrations  are  compared by
noting the  RSD of the individual sample lots.  The ranges
of test samples (field, SRMs, and  spikes) were selected to
cover the appropriate analytical ranges of  Ohio  Lumex's
instrumentation.  Average  referee  laboratory values for
sample concentrations are included in the table, along with
SRM values, when appropriate. These are discussed in
detail in Section 6.1.2, which describes the accuracy
evaluation, and are included here for purposes of precision
comparison. Sample concentrations were separated into
approximate ranges: low, medium, and high, as noted in
Table 6-7 and Table 6-1. Samples reported by Ohio
Lumex as below their approximate PQL were not included
in Table 6-7. There appears to be no correlation between
concentration (low, medium, or high) and RSD; therefore,
no other formal evaluations of this comparison were
performed.
Table 6-7. Evaluation of Precision

Sample Lot No.        Avg. Conc. or     RSD      Number of   w/in 25%
and Lab               SRM Value                  Samples     RSD Goal?

OAK RIDGE
Lot no. 03            0.26 (low)
  Ohio Lumex                             4.8%        3       yes
  ALSI                                   3.8%        3       yes
Lot no. 09            0.47 (low)
  Ohio Lumex                            23.2%        7       yes
  ALSI                                  34.2%        7       no
Lot no. 14            4.75 (medium)
  Ohio Lumex                            32.0%        7       no
  ALSI                                  27.5%        7       no
Lot no. 21            11.2 (medium)
  Ohio Lumex                            23.3%        3       yes
  ALSI                                  23.8%        3       yes
Lot no. 24            221 (high)
  Ohio Lumex                            28.0%        3       no
  ALSI                                  44.8%        7       no
Lot no. 26            77.0 (high)
  Ohio Lumex                             2.6%        3       yes
  ALSI                                  13.2%        7       yes
Lot no. 37            0.14 (low)
  Ohio Lumex                             5.0%        7       yes
  ALSI                                  36.4%        7       no
Lot no. 44            2.33 (medium)
  Ohio Lumex                             3.0%        7       yes
  ALSI                                  59.4%        7       no
Lot no. 60            165 (high)
  Ohio Lumex                            23.8%        7       yes
  ALSI                                  30.9%        7       no
Oak Ridge Avg. RSD
  Ohio Lumex                            19.7%                yes
  ALSI                                  25.5%                no
PUGET SOUND
Lot no. 02            0.06 (low)
  Ohio Lumex                            43.9%        7       no
  ALSI                                  23.6%        7       yes
Lot no. 05            0.21 (low)
  Ohio Lumex                             9.4%        3       yes
  ALSI                                  33.3%        3       no
Lot no. 08            0.36 (low)
  Ohio Lumex                            14.2%        7       yes
  ALSI                                  13.4%        7       yes
Lot no. 10            0.55 (low)
  Ohio Lumex                             120%        3       no
  ALSI                                  20.5%        3       yes
Lot no. 11            0.81 (low)
  Ohio Lumex                            14.2%        7       yes
  ALSI                                  32.7%        7       no
Lot no. 12            1.08 (medium)
  Ohio Lumex                             7.1%        3       yes
  ALSI                                   2.8%        3       yes
Lot no. 25            16.6 (high)
  Ohio Lumex                            12.4%        3       yes
  ALSI                                  12.3%        3       yes
Lot no. 34            11.3 (medium)
  Ohio Lumex                            24.7%        3       yes
  ALSI                                  22.4%        7       yes
Table 6-7. Continued

Sample Lot No.        Avg. Conc. or     RSD      Number of   w/in 25%
and Lab               SRM Value                  Samples     RSD Goal?

Lot no. 36            0.073 (low)
  Ohio Lumex                             4.9%        3       yes
  ALSI                                   6.7%        7       yes
Lot no. 57            0.73 (low)
  Ohio Lumex                            11.2%        7       yes
  ALSI                                  16.2%        7       yes
Lot no. 61            154 (high)
  Ohio Lumex                            47.0%        7       no
  ALSI                                  10.9%        7       yes
Lot no. 62            14.6 (high)
  Ohio Lumex                            13.0%        7       yes
  ALSI                                  28.3%        7       no
Puget Sound Avg. RSD
  Ohio Lumex                            28.8%                no
  ALSI                                  19.7%                yes

CARSON RIVER
Lot no. 01            0.24 (low)
  Ohio Lumex                            30.5%        7       no
  ALSI                                  37.7%        7       no
Lot no. 04            0.11 (low)
  Ohio Lumex                            18.9%        3       yes
  ALSI                                   9.1%        7       yes
Lot no. 06            0.26 (low)
  Ohio Lumex                             7.3%        3       yes
  ALSI                                  15.7%        7       yes
Lot no. 38            0.63 (low)
  Ohio Lumex                             3.5%        7       yes
  ALSI                                   3.8%        7       yes
Lot no. 39            1.24 (medium)
  Ohio Lumex                             6.5%        7       yes
  ALSI                                  52.9%        7       no
Lot no. 41            1.79 (medium)
  Ohio Lumex                            17.5%        7       yes
  ALSI                                  30.5%        7       no
Lot no. 43            2.76 (medium)
  Ohio Lumex                             9.1%        7       yes
  ALSI                                   9.6%        7       yes
Lot no. 56            0.23 (low)
  Ohio Lumex                             8.0%        7       yes
  ALSI                                  12.6%        7       yes
Lot no. 59            1.71 (medium)
  Ohio Lumex                            10.2%        7       yes
  ALSI                                   7.9%        7       yes
Carson River Avg. RSD
  Ohio Lumex                            15.0%                yes
  ALSI                                  16.6%                yes
MANUFACTURING SITE
Lot no. 13            5.91 (medium)
  Ohio Lumex                            51.9%        7       no
  ALSI                                  15.4%        7       yes
Lot no. 17            10.5 (high)
  Ohio Lumex                            24.2%        3       yes
  ALSI                                  14.6%        7       yes
Lot no. 45            5.44 (medium)
  Ohio Lumex                             1.6%        7       yes
  ALSI                                  23.4%        6       yes
Table 6-7. Continued

Sample Lot No.        Avg. Conc. or     RSD      Number of   w/in 25%
and Lab               SRM Value                  Samples     RSD Goal?

Manufacturing Site Avg. RSD
  Ohio Lumex                            38.0%                no
  ALSI                                  15.0%                yes

SUMMARY STATISTICS
Overall Avg. RSD
  Ohio Lumex                            16.1%                yes
  ALSI                                  22.3%                yes
Field Samples Avg. RSD
  Ohio Lumex                            24.3%                yes
  ALSI                                  20.3%                yes
SRMs Avg. RSD
  Ohio Lumex                             8.0%                yes
  ALSI                                  24.3%                yes
The referee laboratory analyzed replicates of all samples
analyzed by Ohio Lumex. These replicates were used for
precision comparison to Ohio Lumex. RSDs for the vendor
and the laboratory were calculated individually and are
shown in Table 6-7.

As noted from Table 6-7, Ohio Lumex precision is similar
to that of the referee laboratory. The single most important
measure of precision  provided  in Table  6-7,  overall
average  RSD,  is  22.3% for  the  referee  laboratory
compared to the Ohio Lumex average RSD of 16.1%. The
laboratory and Ohio Lumex RSDs are both within the 25%
RSD objective for precision expected from both analytical
and sampling variance.

In  addition, field sample precision compared to SRM
precision shows that  there  may  be some  difference
between these two sample lots; field sample RSD is 20.3%
for ALSI and 24.3% for Ohio Lumex; SRM RSD is 24.3%
for ALSI and 8.0% for Ohio Lumex. This is similar to the
results for the accuracy comparison. Ohio Lumex appears
to have better precision for the SRM analyses than for the
field  sample analyses.   For  purposes of this analysis,
spiked samples are considered the same as field samples
because these were similar field matrices and the resulting
variance was expected to be equal to that of field samples.
The  replicate  sample  RSDs  also  confirm the  pre-
demonstration   results,   showing  that  sample
homogenization procedures  met  their originally stated
objectives.
There appears to be no significant site variation between
Oak Ridge, Puget Sound, and the manufacturing site
samples. (See Table 6-7 showing average RSDs for each
of these sample lots. These average RSDs are computed
using only the results of the field samples and not the
SRMs.) The Carson River site had a lower average RSD
for both the vendor and the laboratory, but this difference
may not be significant because this same result was not
evident in the data comparisons performed for other data
sets.

Precision Summary

The precision of the Ohio Lumex field instrument is better
than the referee laboratory precision. The overall average
RSD is 22.3% for the referee laboratory, compared to the
Ohio Lumex average RSD of 16.1%. This is primarily
because of the better precision obtained for the SRM
analyses by Ohio Lumex. Both the laboratory and Ohio
Lumex achieved the precision goal of 25% overall RSD.
6.1.4  Time Required for Mercury Measurement
During the demonstration, the time required for mercury
measurement activities was measured. Specific activities
that were timed included: instrument setup, sample
analysis, and instrument disassembly. One field technician
performed all operations during the demonstration, with the
exception of instrument setup and tear down, plus a small
amount of sample preparation activities. A second
operator assisted with these items.
Setup and disassembly times were measured once.
Analytical time was measured each day, beginning when
the first blank was started,  and continuing until the last
blank was completed at the end of the day.  Any downtime
was noted and  then subtracted from the  total daily
operational time.  The total of the operational time from all
four days  was divided by the total number of analyses
performed.  For this calculation, analyses  of blanks and
calibration standards, and reanalyses of samples were not
included in the total number of samples.

Setup time for the RA-915+/RP-91C consisted of removing
the instrument from the shipping container, placement on
a level working surface, establishment of all electrical and
gas tubing connections, and instrument warm-up. The
time required to remove the RA-915+/RP-91C from the
shipping container could not be measured precisely
because the device was removed from the shipping
container before the evaluation team could time these
activities; however, the vendor did replicate the majority of
this process at the request of the evaluator, so that a time
estimate could be made. Based on these observations, it
is estimated that one person could remove the device from
the shipping container in 15 minutes. Setup time for other
peripheral devices,  such  as the computer/monitor  and
analytical  balance,  was  accomplished   during  the
instrument  warm-up  time.   Leveling of  the balance,
depending on field  conditions, took between  5 and 10
minutes.  Setup of the computer/monitor took less than 5
minutes.

After all devices were set in place, remaining electrical and
gas flow connections had to be made. The
RA-915+/RP-91C was connected to a power source and to
the computer/monitor. The balance was also connected to
the power source, but not to the computer/monitor. Gas
connections had to be made from the auxiliary pump,
through a flip-up flow gauge, and then to the instrument.
A mercury trap came pre-assembled and already inserted
in the vent line, which was attached to the instrument.
Overall, the electrical and gas flow connections required 10
minutes.

After initial setup of the RA-915+/RP-91C was complete,
the instrument required approximately 45 to 60 minutes to
warm to 800 °C. It is worth noting that setup of the balance
and computer/monitor was performed during this time
period.
Overall, the time required to remove the instrument from its
shipping container, set up the device, allow the instrument
to reach operating temperature, and set up peripheral
devices during instrument warm-up is estimated at
approximately 60 to 75 minutes.

Individual sample analysis times were not measured for the
duration of the demonstration. Analysis time was
estimated by recording start and stop times each day, and
accounting for any instrument downtime due to operator
breaks or device failure and maintenance activities.
Therefore, the total time for analyses included blanks,
calibration standards, and any sample reanalyses;
however, the total number of analyses performed includes
only demonstration samples (samples, spikes, and SRMs),
not vendor blanks, calibration standards, or reanalyses.
Table 6-8 presents the time measurements recorded for
each of the four days of operation of the RA-915+/RP-91C.
It should be noted that the second technician was required
approximately 25% of the time in order to achieve the
sample throughput observed during the demonstration, and
that the times in Table 6-8 are elapsed times, not labor
times.


Table 6-8. Time Measurements for Ohio Lumex

                      Day 1   Day 2   Day 3   Day 4   4-Day Total
  Run Time (minutes)    195     540     540              1,275
Analysis Time Summary

In total, Ohio Lumex analyzed 197 samples during the
demonstration. The turnaround time on individual sample
analyses was 1 minute; however, the vendor chose to
analyze replicates of virtually every sample. Using the total
analytical time reported in Table 6-8 and factoring in the
second analyst (1,275 minutes x 1.25 analysts), 8.1 minutes
per analysis is a better approximation of real world
operating conditions (assuming that replicate analyses are
performed). The vendor claims that 25 samples can be
processed in an hour over an 8-hour day, an average of 2.4
minutes per sample, if replicates are not performed. Field
observations support this claim.
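The per-analysis estimate above can be reproduced with simple arithmetic; the 1.25 factor represents one full-time operator plus the second technician, who assisted roughly 25% of the time:

```python
total_run_minutes = 1275   # 4-day total run time from Table 6-8
analyst_factor = 1.25      # one operator plus a ~25%-time second technician
samples_analyzed = 197     # demonstration samples only (blanks, standards,
                           # and reanalyses excluded from the count)

minutes_per_analysis = total_run_minutes * analyst_factor / samples_analyzed
print(round(minutes_per_analysis, 1))  # → 8.1
```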

The number of blanks, standards, and reanalyses of
samples outside of the calibration range will vary from site
to site, depending on project goals (e.g., are "greater than"
results acceptable or must all samples be quantified) and
site conditions (e.g., high concentration samples or very
heterogeneous samples). If project goals require all
samples to be quantified, the number of reanalyses and
blanks required could be higher and, therefore, the time per
analysis could be greater. If, on the other hand, sample
results can be reported as "greater than" values (as was
generally done during the demonstration), then 6.5 minutes
per analysis is a reasonable average time.

Instrument disassembly was measured from the time that
the last sample or blank analysis ended until the instrument
was disassembled and placed in the original shipping
container. Disassembly involved turning off power,
disconnecting the power source and interface cables to the
computer/monitor, and removal of the auxiliary pump unit.
Packaging involved placing these components in wheeled
shipping cases. It is estimated that this complete process
would take one person approximately 30 minutes to
complete.

6.1.5 Cost

Background information, assumptions  used in the cost
analysis, demonstration results, and a cost  estimate are
provided in Chapter 7.


6.2    Secondary Objectives

This section discusses the performance results for the
RA-915+, along with the RP-91C attachment for soils, in
terms of the secondary objectives described in Section 4.1.
These secondary objectives were addressed based on
observations of the RA-915+ and RP-91C combination and
information provided by Ohio Lumex.

6.2.1 Ease of Use

Documents the ease of use, as  well as the skills and
training required to properly operate  the device.
  Based  on   observations  made  during   the
  demonstration,   the  RA-915+/RP-91C   is
  reasonably easy to operate; lack of automation
  somewhat impairs the ease of use.  Operation
  requires  one  field  technician  with  a basic
  knowledge of chemistry acquired on the job or in
  a university and training on the instrument.
Six major elements were addressed in evaluating the ease
of use:

    Usefulness of Standard Operating Practices (SOPs)
    Operator training and experience required
    Ease of equipment setup
    Ease of calibration
    Ease of sample preparation
    Ease of measurement

Each of these is described, in sequence, in the following
paragraphs. Five of the six elements were given a
subjective rating (excellent, good, fair, or poor) based on
observations made by the instrument evaluator. Operator
training and experience is merely discussed.

The vendor provided two SOPs, one entitled "RA-915+
Mercury Analyzer" and the other entitled "RP-91C
Attachment." These procedures were evaluated during the
demonstration.

The RA-915+ procedure provides the following information:

    Comprehensive safety guidelines

    Equipment list with corresponding images

    Equipment application, including applicable  media,
    detection  limits,  sample parameters,  and  detection
    technique

    Technical specifications and operating conditions

    Design  and operation of the analyzer, including a
    schematic

    Description of the  appearance and functions of the
    equipment from all angles

    Pre-operational  procedures  such  as  setup  and
    selection of operational mode

    Operational procedures for the  display  unit and for
    connection to a personal computer

    Detailed equipment test and maintenance procedures

    Troubleshooting guide

The SOP was well-organized, covered major information
requirements, and was easy to understand.  The safety
precautions were  thorough and well-documented.  The
parts and equipment list covered all required parts for use
of the  RA-915+.   The table clearly presented various
applications  and   related  data, including the  need for
ancillary equipment for water and soil analyses. Equipment
specifications matched those  documented  during  the
demonstration. The schematics and discussion of system
design and operational principles were well written. They
provided a thorough description of the operational principle
for  the  technology,  easily  understood  by  someone
unfamiliar with  the  technology.    The  description of
appearance and functional controls was also useful for
novices with the equipment. Similarly, the detailed
pre-operational procedures were generally clear and
comprehensive, allowing an operator with training on the
basics of the equipment to set up and operate the RA-915+.
A step-by-step evaluation of the procedure could not be
performed without impacting the evaluation of analytical
throughput (see Section 6.1.4). Finally, the troubleshooting
table was easy to follow; however, there was no opportunity
to evaluate the table, for accuracy or completeness, during
the demonstration.

The SOP for the RP-91C was written in a manner similar
to the SOP for the RA-915+. The RP-91C SOP was
equally clear and thorough. Adequate detail was provided
to assist an inexperienced operator in equipment setup,
calibration, operation, troubleshooting, and maintenance.
The only maintenance activity that was performed during
the demonstration was replacement of the optical lens.
The procedure provides adequate information for a
technician to perform this maintenance activity.

There were two crucial operational elements encountered
during the demonstration that were not adequately
addressed in the RP-91C SOP. The first was the selection
of sample size such that the results remain within the
calibration range. The SOP instructs the user to use a
sample mass such that the mass of mercury is less than
1 µg; however, selection of sample size requires an
estimate of the expected mercury concentration. This
problem is not unique to the RA-915+/RP-91C; any AA
instrument requires an estimate of sample concentration to
obtain sample results within a specified calibration range.
Second, no information was provided on how to handle
samples that were outside of the calibration range.
Procedures implemented during the demonstration
included analyzing a blank sample after a sample was
above the calibration range (to purge the system of
mercury) and reducing sample size on subsequent
reanalyses (if quantitative results are reported). These
procedures were not described in the SOP; however, the
software provided the following prompt: "OUT OF RANGE".
It is not known whether the analyst is trained to analyze a
clean-out blank following this prompt. The specific content
of the training course is not known.

Ohio Lumex provides a 1-day training course for an
additional cost of $600 to anyone who purchases, rents, or
leases the RA-915+/RP-91C. The vendor asserts that this
is a 6-hour, comprehensive course covering software and
hardware installation, and operational training on use of the
instrument for soil analysis. The training course was not
evaluated during the demonstration. It may supplement
the SOPs. Overall, the SOPs were good, but could use
additional detail related to sample size selection and results
outside of the calibration curve.

Ohio Lumex chose to operate the RA-915+/RP-91C with
one chemist during the demonstration. The chemist held
a Ph.D. in chemistry. Ohio Lumex claims that a laboratory
or field technician with a high school diploma and basic
computer knowledge can operate the equipment after a
1-day training course on the instrument. Field observations
support this claim. Most operations required either use of
a keyboard or mouse with a Microsoft Windows-based
system. The prompts were clear and easy to understand.

The operator performed equipment setup with ease. The
RP-91C connected rapidly and easily to the RA-915+. The
unit plugged into a power supply and an interface with the
PC. The external air pump and flow meter (used with the
RP-91C) were encased in a metal box with a hinged lid.
The lid was opened, the flow meter (rotameter) was hinged
upward into a vertical position, and the pump was
connected to the rotameter with plastic tubing that comes
with the unit. The self-standing balance was easily set up
and leveled.

It was difficult to determine exactly how much time was
required for setup because a second vendor representative
helped with setup to expedite the process. Typically, the
two vendor representatives set up the equipment in 5
minutes (the instrument was already unpacked from its
shipping case). Field observations indicate that one
person could set up all required equipment, starting with
shipping containers, in approximately 30 minutes, perhaps
less in some cases. It should be noted that once
instrument setup is complete, furnace warm-up requires
45-60 minutes to reach the operating temperature of 800
°C. There was no display indicating actual furnace
temperature; the operator merely observed the inner lining
of the furnace. When it achieved a red glow, the furnace
was deemed hot enough for operation. Overall, the ease
of setup was good, with the only drawback being the
extended warm-up time for the instrument.

Calibration was performed by the operator alone. A blank
was analyzed and a 2-point calibration performed in less
than five minutes. The RP-91C SOP (p. 13) recommended
three to four calibration points. Calibration consisted of
weighing the standard(s) and analyzing them according to
the steps in the SOP. A calibration curve was plotted and,
if acceptable, the calibration coefficients were accepted.
Overall, the ease of calibration was good.
The operator was able to perform sample preparation and
analysis on a continuous basis. Sample preparation took
less than one minute per sample, on average, although
some minor assistance was provided by a second vendor
representative. In general, sample preparation was
unwieldy, increasing the potential for lost sample or
weighing errors.

Sample preparation consisted of preparing small pieces of
aluminum foil (approximately 5-8 cm squares) for weighing
soil samples.  Several times during the demonstration, a
second vendor representative assisted with this task.  The
samples were initially mixed in the original container, using
a clean  quartz weigh boat. Approximately one half of the
sample  was transferred to the aluminum foil, which  was
then  placed on the digital balance.   The  balance  was
zeroed with the sample and foil, which were then removed
from  the balance.  A  small amount of sample was then
transferred to a clean, quartz weigh boat. Sample transfer
was completed by dipping the quartz cup of the weigh boat
into the  soil and scooping a small quantity into the bowl.
Each weigh boat was equipped with  an insulated  plastic
handle  to  allow  safe  handling  of  the  weigh  boat
immediately after heating.

The  balance  was placed  at ground  level,  inside  a
cardboard box (a standard file storage  box with dimensions
of 30 cm wide by 40 cm  long by 30  cm deep) to shield the
balance from wind effects.  The balance had a hinged top
cover with a 5-cm,  transparent  portal for convenient
viewing  of the sample; however, each time  a sample  was
inserted or removed,  the lid had to be opened and then
closed.  The operator sat in a chair almost continuously
during the demonstration so as to be able to reach the
balance and the sample injection port in alternating steps.
The location of the balance on the ground  was  required
because of the top-opening mechanism on the balance.
Inserting and  removing  samples through  the hinged top
and the box opening required great care.   Each  time the
operator took a short break, it was clear that he was stiff
from  working in a sitting position on a continuous basis.

An aluminum foil square with soil sample was placed on
the balance (the balance is not part of the system, but can
be provided), the balance was zeroed (tare weight), the
aluminum foil with sample was removed from the balance,
and a small amount of the sample was placed in the weigh
boat. The aluminum foil and residual sample were placed
on the balance again (gross weight). The difference
between the tare weight (zero) and the gross weight (a
negative number) was the net weight used for the analysis.
This weight was manually calculated and recorded in the
instrument data entry panel using the keyboard. This
operation was relatively easy to understand and could be
performed by a trained technician. However, there were
opportunities for spilling residual sample after weighing or
improperly calculating or entering net weight data. Sample
weights can alternatively be determined by recording a tare
weight for the weigh boat, adding sample, and recording a
gross weight (the difference being the net weight). In this
way, use of aluminum foil can be eliminated. The same
issues remain with manual calculations and data recording.
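The weighing-by-difference bookkeeping described above can be sketched as follows; the balance readings are hypothetical. Because the balance is zeroed with the foil plus the full sample, removing a portion to the weigh boat leaves a negative reading whose magnitude is the mass analyzed:

```python
tare_reading = 0.000     # balance zeroed (tared) with foil + full sample, in g
gross_reading = -0.152   # reading after a portion is moved to the weigh boat, g

# Net weight of the portion actually analyzed; during the demonstration this
# value was calculated by hand and typed into the instrument software
net_weight = tare_reading - gross_reading
print(round(net_weight, 3))  # → 0.152
```

Automating this subtraction via a balance-to-software interface would remove the manual-calculation and data-entry errors noted above.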

Sample analysis took less  than  1  minute per sample.
Because  of the lack of automation in the process, the
operator was constantly busy weighing samples, recording
and entering weights, inserting and removing weigh boats
from the RP-91C, or recording analytical results. It should
be  noted  that the  operator  always  analyzed  duplicate
samples and, oftentimes, analyzed triplicates to ensure
good analytical precision.

As samples were analyzed, vendor-proprietary software
screens allowed the user to track the sample absorption
curve on the screen and know when the analysis was
completed (see Figure 6-3). The software is compatible
with Windows 95, 98, or 2000, and can export data to
Microsoft Excel. Sample analysis consisted of inserting the
pre-weighed sample boat in the small opening in the
furnace, watching the absorption curve to show the
analysis was completed, and removing the sample weigh
boat. Sample analysis was easy to understand and could
be performed by a trained technician.

During the  demonstration,  samples with concentrations
outside  of  the  equipment   calibration  range   were
encountered.  These samples would result in a peak that
was  above the top end of the  calibration  range. The
operator was required to analyze a blank to demonstrate
that excess mercury had been purged from the system; an
"OUT OF RANGE" screen prompt advised the operator
that the sample was not in the calibration range.

The digital balance was the major peripheral item. The
vendor will supply a balance, or the user can supply his/her
own balance. Though the balance is not part of the
required vendor equipment, a balance is a necessary
peripheral. Therefore, the balance was evaluated during
the demonstration. The reader should note that other
brands and models of balances may be used and these
may not perform in the same manner as the balance used
during the demonstration. The balance itself was easy to
use, but the lack of an automatic interface with the
monitor/software made the overall system more difficult to
operate and increased the potential for error.
Figure 6-3. RA-915+/RP-91C peak screen.
6.2.2    Health and Safety Concerns

Documents potential health and safety  concerns
associated with operating the device.
  No significant health and safety concerns were
  noted during the demonstration. The only
  potential health and safety concerns identified
  were the generation of mercury vapors and the
  potential for burns with careless handling of hot
  quartz sample boats. The vendor provides a
  mercury filter as standard equipment; exercising
  caution and good laboratory practices can
  mitigate the potential for burns.
Health and safety concerns, including chemical hazards,
radiation  sources,  electrical  shock,  explosion,  and
mechanical hazards were evaluated.
No chemicals were used in the preparation or processing
of samples, except for analytical standards.  During this
demonstration, the analytical standards were soil SRMs for
mercury.  These were handled with gloves, and the
operator wore safety glasses at all times.  Such standard
laboratory precautions mitigate the potential for dermal
exposure.  Similar procedures were also used for soil
samples that contained mercury.  Because the RP-91C
attachment is designed to thermally convert mercury
compounds to mercury vapors as part of the analytical
process, and no fume hood was present to exhaust
mercury vapors after analysis, inhalation of mercury was a
concern.  The vendor installs a proprietary mercury trap in
the exhaust line from the RP-91C attachment.
Measurements were taken with a Jerome 431-X gold film
mercury vapor analyzer manufactured by Arizona
Instruments Corporation.  The instrument has a range of
0.000 to 0.999 mg/m3.  In all cases, readings were 0.000
mg/m3 in the breathing zone of the operator.

In looking at electrical  shock potential,  two factors were
evaluated: 1) obvious  areas where electrical wires  are
                                                   51

-------
exposed and 2) safety certifications.  No obviously exposed
wires were noted during the demonstration.  All
connections between equipment were made using
standard electrical power cords, modem interface lines,
and 8-pin cords.  Power cords were grounded, and a surge
protector (provided by EPA) was used.  The RA-915+
line voltage (110 volts AC) was stepped down to 12 volts
DC at 2.5 amps using a power transformer.  The
RA-915+ was UL, CSA, and CE certified, among other
certifications marked on the transformer.  The balance
used during the demonstration was a KND 1-microgram
digital balance, model FX-320.  It operated on a 12-volt DC
(at 0.5 amps) power source that was stepped down from
110 volts and 7.5 amps.  This device had no visible
certifications.  A standard laptop computer was used
(Hewlett Packard Pavilion, model HP F145A).  This
computer had UL, CE, and numerous other certifications.

No obvious explosion  hazards were  noted. The use of
ambient air as a  carrier gas  eliminates the possibility of
explosion associated with the use of  oxygen as a carrier
gas in the presence of ignition sources.

No serious mechanical hazards were noted during the
demonstration.  All equipment edges were smooth,
minimizing any chance of cuts or scrapes.  The hinged lid
on the RP-91C pump/rotameter housing presents a
potential pinch hazard, as would any hinged device;
however, the lid is very lightweight, remained open
throughout the demonstration, and is designed to stay
securely in place when open.

6.2.3   Portability of the Device

Documents the portability of the device.
  The RA-915+ air analyzer was easily portable,
  although the device, even when carried in the
  canvas sling, was not considered lightweight.
  The addition of the RP-91C and associated
  pump unit precludes it from being a truly field-
  portable instrument.  The device and
  attachments can be transported in carrying
  cases by two people, but must then be set up in
  a stationary location.  Setup was easy, but the
  combined instrument is better characterized as
  mobile rather than field portable.
The RA-915+ measured 46 cm (L) by 11 cm (W) by 21 cm
(H).  The weight was reported as 7.5 kg.  The RP-91C
attachment measures 32 cm by 24 cm by 12 cm and
weighs 5.5 kg.  Also included as standard features with
the RP-91C were a monitor, keyboard, and mouse, and the
pump/rotameter case.  All were lightweight and easily
portable, with the pump and rotameter enclosed in a metal
carrying case with a handle.  Remote locations also require
the use of a generator or 12-volt battery.

The RA-915+/RP-91C was not easily portable from the
standpoint of being a handheld instrument.  Movement and
setup of the equipment generally took two people about 10
minutes, with the equipment already unpacked.  It is
estimated that one person would require approximately 30
minutes to unpack the instrument from the carrying case
and complete setup.  The RA-915+ air analyzer was easily
portable, although the device, even when carried in the
canvas sling, was not lightweight.  The addition of the
RP-91C, pump unit, and battery precludes this from being
a truly field-portable instrument.  The device and
attachments can be transported by carrying two containers
with handles, plus the RP-91C attachment, the
monitor/mouse, power cords/transformers, and data cables
(plus an analytical balance).  Even when placed in wheeled
shipping containers, the device is only portable in the
sense that it can be managed in a manner similar to
wheeled luggage.  Transport in paved areas is easy;
transport up a rocky incline would be difficult.  The device
is, however, easily transportable in any size vehicle, and
can be moved to any location where a vehicle can go.
Therefore, it would be practical for many field applications.
It should not be characterized as a handheld instrument.
During the demonstration, the complete soil analytical unit,
including the monitor and air pump, easily fit on a table
measuring 30 inches wide by 72 inches long, with
adequate space for sample staging and preparation.

The balance required a flat, stable surface.  Because the
balance was top loaded, the vendor chose  to place the
balance on the ground near the chair in which the operator
sat. The balance was placed inside of a cardboard box to
eliminate the effects of wind on the enclosed balance.  This
setup required the operator to repeatedly bend over to tare
the sample (on aluminum foil) and to weigh it again after
the analytical sample was removed in the sample boat.

The RA-915+/RP-91C was operated using a 12-volt battery
during the Visitors' Day.  The unit appeared to operate well,
although no samples were being processed for evaluation
and no evaluation was made of how long the battery
lasted.  The vendor reports a battery life of approximately
3.5 hours.  Alternatively, a standard electrical source of
110 volts can be used.  Power can be supplied by any
standard 2,000-watt generator.

-------
For the demonstration, the vendor was supplied with a
folding table, two chairs, and a tent to provide shelter from
inclement  weather.  In addition,  one  1-gallon  container
each was  provided for waste soil and decontamination
water utilized to clean weigh boats. A 2-gallon zip-lock bag
was furnished for disposal of used gloves, wipes, and other
wastes which were contaminated during the demonstration.
Finally, a large trash bag was supplied for disposal of non-
contaminated wastes.

6.2.4    Instrument Durability

Evaluates the  durability of the device based on  its
materials of construction and engineering design.
  The RA-915+/RP-91C was well  designed  and
  constructed for durability.
The  outside of the RA-915+ is constructed  of sturdy
aluminum (2 mm thickness) that was painted to prevent
corrosion. The exterior of the RP-91C furnace is stainless
steel; the interior is quartz.  The  furnace is covered by a
painted metal guard to prevent burns.  The auxiliary air
pump and rotameter were housed in a sturdy, painted
aluminum box (2 mm thickness).  The lid of this container
was secured with hinges, and was opened when the
rotameter was set up for operation.  No environmental (e.g.,
corrosion) or mechanical (e.g., shear stress or impact)
tests  were performed;  however, the outer shell of the
instrument was well-designed and constructed,  indicating
that  the  device would likely  be   durable  under field
conditions.

No evaluation  could be made  regarding the  long-term
durability  of the  furnace,  analytical cell,  or circuitry.
External  visual  inspection  did  not  indicate  that  any
problems were likely, although many parts were obscured
from view. The vendor offers a standard 1-year warranty,
and  will   provide  a  1-year  extended  warranty  and
maintenance plan at the owner's cost.  This warranty costs
$2,400 and covers all parts and labor except consumable
items (lamp, rechargeable battery for the RA-915+, and
filters).  The only mechanical part with the potential to fail
over time is the air pump.  Long-term operation could result
in the need for repair or replacement of the air pump.  The
heating element of the furnace is the other part with some
potential for long-term failure, although it worked properly
during the demonstration.  Plastic tubing for the rotameter
may also be subject to long-term failure due to the effects
of sun and temperature, or to mechanical failure.  Overall,
however, the design and construction of the instrument
support the vendor's claim that this instrument is durable.
The vendor asserts that the life expectancy of the furnace
and air pump is 3-5 years with heavy use.

Finally, most of the demonstration was performed during
rainfall  events ranging from  steady to  torrential.   The
instrument  was located under a tent with side flaps to
protect it from rainfall. Even when it was not raining, the
relative humidity was  high. The high humidity and rainfall
had no apparent impact on the reliability of the instrument
operation.

6.2.5    Availability  of Vendor  Instruments
         and Supplies

Documents the availability of the device and spare
parts.
  The  RA-915+/RP-91C  is readily  available  for
  rental, lease, or purchase.  Spare  parts and
  consumable supplies  can  be  added  to  the
  original  instrument order or can be received
  within   24-48   hours   of   order  placement.
  Standards are readily available from  laboratory
  supply firms or can be  acquired  through Ohio
  Lumex.
EPA representatives contacted Ohio Lumex regarding the
availability  of  the  RA-915+/RP-91C   and  supplies.
According to  Ohio Lumex, such  systems are available
within a  few weeks of order placement, but can  be
expedited.  The  RA-915+/RP-91C  also is  available  for
rental or leasing and lead time is subject to availability.

The instrument comes standard with four quartz sample
injectors; no other parts or consumable supplies are
provided standard with the equipment. Spare parts, such
as the furnace, furnace lenses, the air pump, or additional
sample injectors, can be ordered individually. These and
any  other  parts  are  available   within  24-48  hours.
Standards can be provided  by Ohio  Lumex or can  be
purchased from a laboratory supply firm.

-------
                                              Chapter 7
                                       Economic Analysis
The purpose of the economic analysis was to estimate the
total cost of mercury measurement at a hypothetical site.
The cost per analysis was estimated; however, because
the cost per analysis would decrease  as the number  of
samples analyzed increased, the total capital cost was also
estimated and reported.  Because unit analytical costs are
dependent upon the total number of analyses, no attempt
was made to compare the cost of field analyses with the
RA-915+/RP-91C to the  costs associated with the referee
laboratory.   "Typical" unit cost results gathered from
analytical laboratories were reported to provide a context
in which to review the RA-915+/RP-91C costs.  No attempt
was made to make a direct comparison between these
costs  because  of differences in sample throughput,
overhead factors, total equipment utilization factors, and
other  issues that  make  a  head-to-head  comparison
impractical.

This  chapter describes the  issues and  assumptions
involved in the  economic  analysis, presents  the costs
associated with field  use  of the RA-915+/RP-91C, and
presents  a  cost  summary  for  a "typical"  laboratory
performing sample analyses using the reference method.


7.1    Issues and Assumptions
Several factors  can affect mercury measurement costs.
Wherever possible  in  this  chapter,  these factors are
identified  in  such  a  way that  decision-makers  can
independently  complete  a  project-specific  economic
analysis.  Ohio  Lumex offers three options for potential
RA-915+/RP-91C users: 1) purchase of the instrument,  2)
weekly rental, and 3) equipment leasing with an option  to
purchase. Because  site  and  user requirements  vary
significantly,  all  three of these options are  discussed  to
provide each user with the information to make a case-by-
case decision.

A  more detailed cost analysis was performed on the
equipment rental option because this case represents the
most frequently encountered field scenario. The results of
that cost analysis are provided in Section 7.2.

7.1.1  Capital Equipment Cost

The RA-915+/RP-91C comes complete with the analytical
instrument (RA-915+), furnace attachment and auxiliary air
pump/flow meter (RP-91C), a set of four quartz injection
spoons with ceramic handles, and software, regardless of
whether the instrument is purchased, rented, or leased.  An
optional digital balance is available for purchase, rental, or
lease from Ohio Lumex, but is not included in the base cost
of any of these three options because the user may provide
his/her own balance.  Because there is no output signal link
between the balance and the system, any balance can be
used.  A laptop computer with display screen can be
purchased, rented, or leased from Ohio Lumex or can be
provided by the user.  A user-supplied printer can also be
attached to the system using a standard printer cable; no
purchase, lease, or rental option is available for the printer.

The cost quoted by Ohio Lumex does not include
packaging or freight costs to ship the instrument to the
user location.  No deposit is required for rental and lease
agreements.  A user manual is provided at no cost.  A
6-hour training session is available for an additional fee.

7.1.2  Cost of Supplies

The cost of supplies was estimated based on the supplies
required  to  analyze  demonstration  samples   and
discussions  with  Ohio  Lumex.    Requirements   vary
depending on whether solid or liquid samples are  being

-------
analyzed. For purposes of this cost estimate, only supplies
required to analyze solid samples are factored into the cost
estimate because only solid samples were analyzed during
the demonstration.  Supplies required for liquid samples
are not noted because a different analytical attachment is
used.  Supplies consisted  of consumable  items (e.g.,
calibration standards, mercury trap) and non-consumables
that could not be returned because they were contaminated
or the remainder of a set (e.g., quartz injection spoons).
The purchase prices and supply sources were obtained
from Ohio  Lumex, and  confirmed  by contacting those
sources.   Because the  user cannot return  unused or
remaining portions of supplies, no  salvage  value was
included  in  the cost  of  supplies.   PPE  supplies  were
assumed to be part  of the overall site investigation or
remediation costs; therefore, no PPE costs were included
as supplies.

7.1.3  Support Equipment Cost

During the demonstration, the RA-915+/RP-91C, air pump,
laptop computer, and  balance were  operated using AC
power.   The costs  associated with  providing the power
supply  and  electrical  energy were  not included in the
economic analysis; the demonstration site provided AC
power at no cost.  During Visitors' Day, all of the items
mentioned were operated using a 12-volt DC battery.

Because  of the  large number of samples expected to be
analyzed  during the demonstration, EPA provided support
equipment, including tables and chairs for the two field
technicians' comfort.  In addition, EPA provided a tent to
ensure  that  there were no  delays in the project due to
inclement weather.  These costs may not be incurred in all
cases.  However, such equipment is frequently needed in
field situations, so these costs were included in the overall
cost analysis.

7.1.4  Labor Cost

The labor cost was estimated based on the time required
for RA-915+/RP-91C setup, sample preparation, sample
analysis,  summary data  preparation, and  instrument
packaging at the end of the  day. Setup time covered the
time required to take the  instrument out of its packaging,
set up all  components, and ready the device for operation.
However, the RA-915+/RP-91C was already removed from
the original shipping container.  Therefore, this time was
estimated rather than measured.  Sample preparation
involved mixing samples with the injection spoon.  Sample
preparation was generally completed while previous
samples were being analyzed.  Sample analysis comprised
the time required to analyze all samples and submit a data
summary.  The data summary was strictly a tabulation of
results in whatever form the vendor chose to provide.  In
this case, the vendor transcribed results from the electronic
database to the field COC forms (no printer was available
in the field).  The time required to perform all tasks was
rounded to the nearest hour.  However, for the economic
analysis, it was assumed that a field technician who had
worked for a fraction of a day would be paid for an entire
8-hour day.  Based on this assumption, a daily rate for a
field technician was used in the analysis.

During the demonstration, EPA representatives evaluated
the skill level required for the field technician to analyze
and report results  for mercury samples.  Based on these
field observations, a field technician with basic chemistry
skills acquired on  the job or in a university setting, and a
1-day training course specific to the RA-915+/RP-91C, was
considered qualified to operate the instrument.  For the
economic analysis, an hourly rate of $15 was used for a
field technician.  A multiplication factor of 2.5 was applied
to labor costs to account for overhead costs.  Based on this
hourly rate and multiplication factor, and an 8-hour day, a
daily rate of $300 was used for the economic analysis.
Monthly labor rates are based on the assumption of an
average of 21 work days per month.  This assumes 365
days per year and nonwork days totaling 113 days per
year (104 weekend days and 9 holidays; vacation days are
discounted assuming vacations will be scheduled around
short-term work or staff will be rotated during long
projects).  Therefore, 252 total annual work days are
assumed.
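The rate assumptions above can be checked with a short calculation (a sketch in Python; the $15 hourly rate, 2.5 overhead multiplier, and day counts are those stated in the text):

```python
# Daily labor rate: base hourly rate times overhead multiplier times an 8-hour day.
hourly_rate = 15.0
overhead_multiplier = 2.5
hours_per_day = 8

daily_rate = hourly_rate * overhead_multiplier * hours_per_day  # $300 per day

# Annual and monthly work days: 365 days less 104 weekend days and 9 holidays.
annual_work_days = 365 - (104 + 9)           # 252 days
work_days_per_month = annual_work_days / 12  # 21 days

print(daily_rate, annual_work_days, work_days_per_month)  # 300.0 252 21.0
```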

7.1.5  Investigation-Derived Waste Disposal
        Cost
Ohio Lumex was  instructed to segregate  its waste  into
three  categories  during  the  demonstration:  1) general
trash;  2)  lightly  contaminated  PPE  and  wipes;  and
3) contaminated soil (both analyzed and  unanalyzed)  and
other highly contaminated wastes.  General trash was not
included as IDW and is not discussed in this document. A
separate container was provided for each waste category.

Lightly  contaminated wastes  consisted primarily of used
surgical gloves,  wipes, and aluminum  foil.  The surgical
gloves  were  discarded for one of three  reasons: 1) they
posed a  risk of cross contamination (noticeably soiled),
2)  they posed a potential health and safety risk (holes or
tears), or 3) the operator needed to perform other tasks or
take a break.  The rate of waste generation was in excess
of what would be expected in a typical application of this
instrument.   In addition, the EPA evaluators occasionally

-------
contributed used gloves to this waste accumulation point.
Wipes were used primarily to clean injection spoons (after
cooling)  between  samples.   In  cases  where  cross
contamination is not a major concern (e.g., field screening
or all samples are in the same concentration range), lesser
amounts of waste would likely be generated.  Aluminum foil
contained the soil while it was being weighed.  In the case
of soils, the foil contained virtually no residual soil, and was
discarded  in  this container.   Foil used  to  weigh wet
sediments was considered highly contaminated, and was
discarded with the soil.

Contaminated soils consisted primarily of soil placed in the
injection spoon and then removed because the weight was
above the target weight.  Soil that was analyzed was also
placed in this waste container as a precaution, even though
it is expected that such soils would be free of mercury after
being heated  to high temperatures  in  the analytical
instrument. In some cases, these sample residuals may
not need to be handled as hazardous waste.

The contaminated soil, excess sample material, and lightly
contaminated  gloves  and  wipes  were   considered
hazardous wastes for purposes of this cost analysis.

7.1.6  Costs Not Included
Items for which costs were not included in the economic
analysis are discussed in the following subsections, along
with the rationale for exclusion of each.

Oversight of Sample Analysis Activities.  A  typical user
of the RA-915+/RP-91C would not be required to  pay for
customer  oversight  of   sample   analysis.    EPA
representatives  observed and documented all activities
associated with sample analysis during the demonstration.
Costs for this oversight were not included in  the economic
analysis because they were project specific.  For the same
reason,  costs for EPA oversight of the referee laboratory
were also not included in the analysis.

Travel and Per Diem  for Field Technician.   Field
technicians  may  be  available  locally.    Because  the
availability of field technicians is primarily a function of the
location of  the project site, travel and  per diem costs for
field  technicians  were  not included in the  economic
analysis.

Sample Collection and Management. Costs for sample
collection and management activities, including  sample
homogenization  and  labeling,  are site  specific  and,
therefore,  not   included  in  the  economic analysis.
Furthermore, these activities were not dependent upon the
selected  reference  method  or  field  analytical  tool.
Likewise, sample shipping, COC activities, preservation of
samples,  and  distribution  of samples were  specific
requirements  of this project that applied to all vendor
technologies and may vary from site to site. None of these
costs was included in the economic analysis.

Items Costing Less than $10.  The costs of inexpensive
items, such as paper towels, were not included in the
economic analysis.

Documentation Supplies. The costs for digital cameras
used to document  field activities were not included in
project costs.   These  were  considered  project-specific
costs that would not be needed in all cases. In addition,
these items can be  used for multiple projects.  Similarly,
the cost of supplies (logbooks, copies,  etc.) used  to
document field activities was not  included in the analysis
because such supplies  are project specific.

Health and Safety Equipment.  Costs  for rental of the
mercury vapor analyzer and  the  purchase  of PPE were
considered site specific and,  therefore, not included as
costs in  the  economic analysis.  Safety  glasses and
disposable gloves were  required for sample handlers and
would likely be required in most cases.  However, these
costs are not specific to  any one vendor or technology. As
a result, these costs were not included  in the  economic
analysis.

Mobilization and  Demobilization. Costs for mobilization
and demobilization were considered site specific, and not
factored  into  the  economic analysis.  Mobilization and
demobilization costs actually  impact laboratory analysis
more than field analysis. When a  field economic analysis
is  performed,  it may be  possible to perform  a single
mobilization  and  demobilization.   During  cleanup  or
remediation   activities,   several   mobilizations,
demobilizations, and associated  downtime costs may be
necessary when an off-site laboratory is used because of
the wait for analytical results.

7.2    RA-915+/RP-91C  Costs
This section presents information on the individual costs of
capital equipment, supplies, support equipment, labor, and
IDW disposal for  the  RA-915+/RP-91C.    Table  7-1
summarizes the RA-915+/RP-91C costs.

-------
Table 7-1. Capital Cost Summary for the RA-915+/RP-91C

                                              Unit        Total Cost for Selected Project Duration
 Item                              Quantity   Cost ($)   1-Month   3-Month   6-Month   12-Month   24-Month
 Purchase RA-915+/RP-91C               1      $29,000    $29,000   $29,000   $29,000   $29,000    $29,000
 Monthly Rental of RA-915+/RP-91C      1      $3,500     $3,500    $10,500   $21,000   $42,000    $84,000
 Monthly Lease of RA-915+/RP-91C       1      $3,500     $3,500    $10,500   $21,000   $42,000    $84,000
 Purchase Balance (Optional) a         1      $600       $600      $600      $600      $600       $600
 Purchase Printer (Optional) a         1      $150       $150      $150      $150      $150       $150

 a A balance is required, but may be provided by the user. A printer is optional; it may also be provided by the user.
7.2.1  Capital Equipment Cost
During the  demonstration, the RA-915+/RP-91C was
operated for approximately two and one-half days and was
used to analyze 197 samples.

Figure 7-1 summarizes the RA-915+/RP-91C capital costs
for  the three  procurement options: rental, lease, and
purchase.  These costs reflect the basic RA-915+/RP-91C
system, with the optional computer. No other options (e.g.,
balance or  printer) and no supply or shipping costs are
included.  As would be expected, this chart clearly shows
that either rental or leasing is the most cost-effective option
for short-term projects (less than 8 months).  When project
duration (or use on multiple projects) exceeds eight
months, the purchase option is the most cost-effective.
These scenarios cover only capital  cost,  not the cost of
supplies, support equipment, labor, and IDW disposal.
Figure 7-1. Capital equipment costs (purchase, rental, and lease, by month).
The RA-915+/RP-91C, including the auxiliary air pump and
flow  meter, and related electrical connections,  sells for
$29,000. Also included are four quartz injection spoons,
plastic  tubing  for air connections,  and an  instruction
manual. The portable computer/monitor is not included in
the cost, but the software is included.  A balance is also
required and can be purchased from Ohio Lumex for $600,
or rented or leased for $150 per week. However, the user
can supply any existing balance. The costs presented in
Figure 7-1  do not include the cost of the balance.
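The break-even point between renting and purchasing follows directly from these figures (a sketch; the $29,000 purchase price and $3,500 monthly rate are from the text):

```python
purchase_price = 29_000   # one-time purchase of the RA-915+/RP-91C, $
monthly_rental = 3_500    # rental (or lease) rate, $/month

# First month in which cumulative rental cost reaches the purchase price.
break_even_month = next(m for m in range(1, 25)
                        if m * monthly_rental >= purchase_price)
print(break_even_month)  # 9 -- rental is cheaper through month 8
```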

7.2.2   Cost of Supplies

Supplies used during the demonstration included  solid
SRMs and a mercury trap.  NIST soil SRMs sell for $250
each; typically both  a high  and a low standard will be
required for many applications, for a total cost of $500.  If
sediments are analyzed, a NIST sediment SRM may be
obtained for $150.   No costs for a  sediment SRM  are
included in this analysis.  These standards have a life
expectancy of one to three years (one year is assumed for
this cost analysis).  A mercury trap was  also  required
during the demonstration and would likely be needed for
most field  applications. The  proprietary trap costs $250
and  comes pre-assembled.    The trap  is  good for
approximately  1,000 samples.   Based on  the sample
throughput achieved  during the  demonstration, the trap
should  last three weeks if running one  shift per day and
one week if running three shifts per day.
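Trap life under the stated throughput can be estimated as follows (a sketch assuming the demonstration rate of 197 samples in 2.5 days, from Section 7.2.1, applies to each shift):

```python
trap_capacity = 1_000              # samples per mercury trap
samples_per_shift_day = 197 / 2.5  # demonstration throughput, ~79 samples/day

one_shift_days = trap_capacity / samples_per_shift_day          # ~12.7 working days
three_shift_days = trap_capacity / (3 * samples_per_shift_day)  # ~4.2 days

print(round(one_shift_days, 1), round(three_shift_days, 1))  # 12.7 4.2
```

At roughly 12.7 working days (about 2.5 five-day weeks), the result is consistent with the approximately three-week single-shift estimate above.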

7.2.3   Support Equipment Cost
Ohio Lumex  was provided with a  10x10 foot tent for
protection   from   inclement   weather   during   the
demonstration.  It was also provided with one table and two
chairs for use during sample preparation and analytical
activities.  The rental cost for the tent (including detachable
sides, ropes, poles, and pegs) was $270 per week.  The
rental cost for the table and two chairs for one week totaled

-------
$6.  Total support equipment costs were $276 per week for
rental.

For longer projects, purchase of support equipment should
be considered.  Two folding chairs would cost
approximately $40.  A 10x10 foot tent would cost between
$260 and $1,000, depending on the construction materials
and the need for sidewalls and other accessories (e.g.,
sand stakes, counterweights, storage bag, etc.).  A cost of
$800 was used for this cost analysis.  A folding table would
cost between $80 and $250, depending on the supplier.
For purposes of this cost analysis, $160 was used.  Total
purchase costs for support equipment are estimated at
$1,000.
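The $1,000 purchase estimate is simply the sum of the values above (a sketch):

```python
chairs = 2 * 20   # two folding chairs, ~$40 total
tent = 800        # 10x10 foot tent, value used in this analysis
table = 160       # folding table, value used in this analysis

total_support_equipment = chairs + tent + table
print(total_support_equipment)  # 1000
```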

The RA-915+/RP-91C requires an electrical source: either
110/220 volts 50/60 Hz AC at 1.2 amps or 12 volts DC at
18 amps.  No  cost was  calculated  for the DC electrical
source  used during the  demonstration because  any
instrument will  require a power source. The Ohio Lumex
instrument reportedly can be operated on a rechargeable
12-volt battery for 3.5 hours (Ohio Lumex, 2003).  The
battery can be purchased for less than $100.  Alternatively,
a standard 2,000-watt generator can be used to power the
instrument.  The estimated cost for a locally supplied
generator is $500; Ohio Lumex will also rent a generator
for $200 per week, or one can be rented from a local tool
rental firm.

7.2.4   Labor Cost

One field technician was  required for 3 days  during the
demonstration to complete sample analyses and prepare
a data  summary. Based on a  labor rate of $300 per day,
total labor cost for application of the RA-915+/RP-91C was
$900 for the 2.5-day period (assumes the technician was
paid for a complete day on the third day).  Labor costs
assume qualified technicians are available locally, and that
no hotel or  per diem costs are applicable.   Table 7-2
summarizes labor costs  for various operational periods.
The costs presented  do not include supervision and quality
assurance because  these would be associated with the
use of  any analytical instrument and are a portion of the
overhead multiplier built into the labor rate.
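The demonstration labor cost and the monthly figures in Table 7-2 follow directly from the daily rate (a sketch; the $300 daily rate and 21 work days per month are from Section 7.1.4):

```python
daily_rate = 300         # field technician, $/day
work_days_per_month = 21

demo_labor = daily_rate * 3  # 2.5 days of work, paid as 3 full days

monthly_labor = {m: daily_rate * work_days_per_month * m
                 for m in (1, 3, 6, 12, 24)}
print(demo_labor, monthly_labor[1], monthly_labor[24])  # 900 6300 151200
```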

7.2.5   Investigation-Derived Waste Disposal
        Cost
Ohio   Lumex  generated  waste   personal  protective
equipment,  contaminated wipes and aluminum foil, and
excess soil waste. The PPE  waste was charged to the
overall  project due to project constraints. The minimum
waste volume is a 5-gallon container.  Mobilization  and
container drop-off fees were $1,040; disposal of a 5-gallon
soil waste container was $400.  (This cost was based on a
listed waste stream with hazardous waste number U151.)
The total IDW disposal cost was $1,440. These costs may
vary significantly from site to site, depending on whether
the waste is classified as hazardous or nonhazardous and
whether  excess sample material  requiring disposal  is
generated.   Table 7-3  presents IDW costs for various
operational periods, assuming that waste generation rates
were   similar  to  those  encountered  during  the
demonstration.
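Assuming waste generation scales linearly with project duration, as Table 7-3 does, the disposal costs can be projected as follows (a sketch; fees from the text):

```python
drop_fee = 1_040   # mobilization and container drop-off, $/month
disposal = 400     # disposal of a 5-gallon soil waste container, $/month

idw_totals = {m: (drop_fee + disposal) * m for m in (1, 3, 6, 12, 24)}
print(idw_totals[1], idw_totals[24])  # 1440 34560
```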
Table 7-2. Labor Costs

                                  Months
 Item            1        3         6         12        24
 Technician      $6,300   $18,900   $37,800   $75,600   $151,200
 Supervisor      NA       NA        NA        NA        NA
 Quality Control NA       NA        NA        NA        NA
 Total           $6,300   $18,900   $37,800   $75,600   $151,200
Table 7-3. IDW Costs

                               Months
 Item        1        3        6        12        24
 Drop Fee    $1,040   $3,120   $6,240   $12,480   $24,960
 Disposal    $400     $1,200   $2,400   $4,800    $9,600
 Total       $1,440   $4,320   $8,640   $17,280   $34,560
7.2.6  Summary of RA-915+/RP-91C Costs

The  total  cost  for  performing   mercury analysis  is
summarized in Table 7-4.  This table reflects costs for
projects ranging from 1-24 months.  The rental option was
used for estimating  the  equipment  cost.  Table 7-5
summarizes total costs and the percentage of total costs
for the actual demonstration.
Table 7-4. Summary of Rental Costs for the RA-915+/RP-91C (d)

                                          Unit                         Months
 Item                    Quantity  Unit   Cost ($)     1         3         6        12        24
 Capital Equipment
  Monthly Rental (e)         1      NA     $3,500   $3,500   $10,500   $21,000   $42,000   $84,000
 Supplies
  Quartz Injectors (a)       1     each      $150       $0        $0      $150      $300      $600
  Solid SRM (b)              2     each      $250     $500      $500      $500    $1,000    $1,500
  Mercury Trap               1     each        NA      $65      $250      $500    $1,000    $2,000
  (all components)
 Total Supply Cost                                    $565      $750    $1,150    $2,300    $4,100
 Support Equipment (c)
  Table (optional) - weekly  1     each        $5      $20       $60      $120      $160      $160
  Chairs (optional) - weekly 2     each        $1      $10       $25       $40       $40       $40
  Tent (for inclement        1     each      $270     $800      $800      $800      $800      $800
  weather only) - weekly
 Total Support Equipment Cost                         $830      $885      $960    $1,000    $1,000
 Labor
  Field Technician           1     hour       $38   $6,300   $18,900   $37,800   $75,600  $151,200
  (person day)
 IDW
  Drop Fee                  NA     each    $1,040   $1,040    $3,120    $6,240   $12,480   $24,960
  Disposal                  NA     each      $400     $400    $1,200    $2,400    $4,800    $9,600
 Total IDW Costs                                    $1,440    $4,320    $8,640   $17,280   $34,560
 Total Cost                                        $18,935   $32,645   $69,715  $138,630  $274,930

(a)  For solid samples and SRMs; a set of 4 comes standard and is assumed to last 2 years, with breakage of one per 6 months.
(b)  Only for use with solid samples; assumes two SRMs are required (a low and a high standard) with a life expectancy of 1 year (some standards
     will have longer shelf lives).
(c)  Rental costs were used through the 3-month period for chairs and the 6-month period for the table. Purchase costs were used for longer periods.
     Purchase costs for the tent were used for all periods.
(d)  Other than unit costs, all costs are rounded to the nearest $5.
(e)  The instrument is available for weekly rentals at $1,500 per week.
Table 7-5. RA-915+/RP-91C Costs by Category

 Category             Category Cost ($)   Percentage of Total Costs
 Instrument                $1,500                  32.5%
 Supplies                    $500                  10.8%
 Support Equipment           $277                   6.0%
 Labor                       $900                  19.5%
 IDW Disposal              $1,440                  31.2%
 Total                     $4,617                 100.0%
The cost per analysis based upon 197 samples, when
renting the RA-915+/RP-91C, is $23.44 per sample. The
cost per analysis for the 197 samples, excluding instrument
cost, is $15.82 per sample.
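The per-sample arithmetic above can be checked directly; a short sketch using only the figures reported in Table 7-5:

```python
# Per-sample cost arithmetic from the demonstration totals (Table 7-5).
total_cost = 4617           # total demonstration cost ($)
instrument_cost = 1500      # instrument rental portion ($)
n_samples = 197

cost_per_sample = total_cost / n_samples
cost_excl_instrument = (total_cost - instrument_cost) / n_samples

print(round(cost_per_sample, 2))        # 23.44
print(round(cost_excl_instrument, 2))   # 15.82
```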

7.3     Typical Reference Method Costs
This section presents costs associated with the reference
method used  to analyze the demonstration samples for
mercury. Costs for other project analyses are not covered.
The referee laboratory utilized SW-846 Method 7471B for
all soil and sediment samples.  The  referee  laboratory
performed 421 analyses over a 21-day time period.
A typical mercury analysis cost, along with percent
moisture for dry-weight calculation, is approximately $35.
This cost covers sample management and preparation,
analysis, quality assurance, and preparation of a data
package. The total cost for 197 samples at $35 would be
$6,895. This is based on a standard turnaround time of
21 calendar days. The sample turnaround time from the
laboratory can be reduced to 14, 7, or even fewer calendar
days, with a cost multiplier from 125% to 300%,
depending on project needs and laboratory availability.
This results in a cost range from $6,895 to $20,685. The
laboratory cost does not include sample packaging,
shipping, or downtime caused to the project while awaiting
sample results.
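The laboratory cost range quoted above can be reproduced as follows, taking the stated range as running from the standard 21-day turnaround cost up to the 300% rush multiplier (an interpretive assumption about how the range was formed):

```python
# Reference-laboratory cost range for the 197 demonstration samples.
per_sample = 35          # typical cost per mercury analysis ($)
n = 197

standard = per_sample * n    # standard 21-day turnaround
rush_max = standard * 3      # 300% rush multiplier

print(standard, rush_max)    # 6895 20685
```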
                                              Chapter 8
                            Summary of Demonstration Results
As discussed previously in this  ITVR, the Ohio Lumex
RA-915+/RP-91C was evaluated by having the vendor
analyze 197 soil and sediment samples.  These 197
samples   consisted  of  high-,  medium-,  and   low-
concentration field samples from four sites, SRMs, and
spiked field samples.  Table 8-1 provides a breakdown of
the numbers of these samples for each sample type and
concentration range or source. Collectively, these samples
provided the different matrices, concentrations, and types
of mercury needed to perform a comprehensive evaluation
of the RA-915+/RP-91C.

8.1    Primary Objectives
The primary objectives of the demonstration centered
on evaluating the field instrument's performance with
respect to sensitivity, accuracy, precision, time for analysis,
and cost. Each of these objectives was discussed in detail
in previous chapters and is summarized in the following
paragraphs. The overall demonstration  results suggest
that the experimental design was successful for evaluation
of the Ohio Lumex RA-915+/RP-91C.  Quantitative results
were reviewed. The results from this field instrument were
found to be comparable to standard analyses performed by
the laboratory in terms of precision and of accuracy in
comparison to SRMs. Field sample analyses were not
found to be comparable, however, to referee laboratory
results. The collected data provide  evidence to  support
these statements.

The two primary sensitivity evaluations performed for this
demonstration were the MDL and PQL. Following
procedures established in 40 CFR Part 136, the MDL is
between 0.0053 and 0.042 mg/kg based on the results of
seven replicate analyses of low standards. The equivalent
MDL for the referee laboratory is 0.0026 mg/kg. The
calculated MDL is intended only as a statistical estimate
and not a true test of instrument sensitivity.
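The 40 CFR Part 136 procedure referenced above computes the MDL as the Student's t value (99th percentile, n-1 degrees of freedom) times the standard deviation of the replicates. A sketch follows; the seven replicate results are hypothetical values for illustration, not the demonstration data:

```python
# Sketch of the 40 CFR Part 136 MDL calculation from seven replicate
# analyses of a low-level standard. Replicate values are hypothetical.
import statistics

replicates = [0.050, 0.055, 0.048, 0.052, 0.057, 0.049, 0.053]  # mg/kg
t_99_6df = 3.143   # one-sided 99th-percentile t value, 6 degrees of freedom

s = statistics.stdev(replicates)
mdl = t_99_6df * s
print(round(mdl, 4))   # 0.0103 mg/kg for these hypothetical replicates
```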

The low standard calculations using MDL values suggest
that a PQL for the Ohio Lumex field instrument may be as
low as 0.027 mg/kg (5 times the lowest calculated MDL).
The referee laboratory PQL confirmed during the
demonstration is 0.005 mg/kg with a %D of <10%. The
average Ohio Lumex result for a tested sample with a
referee laboratory value of 0.06 mg/kg is 0.072 mg/kg,
a %D of 20%. This was the lowest sample concentration
tested during the demonstration that is close to, but not
below, the calculated PQL noted above. Both the MDL
and PQL were determined for soils and sediments.
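A short check of the PQL and %D arithmetic above:

```python
# PQL estimate (5x the lowest calculated MDL) and the %D at the lowest
# tested concentration, using the figures reported above.
mdl_low = 0.0053                 # lowest calculated MDL, mg/kg
pql = 5 * mdl_low                # 0.0265, reported as ~0.027 mg/kg

lab, field = 0.06, 0.072         # mg/kg, lowest tested sample
pct_d = 100 * abs(field - lab) / lab
print(round(pct_d))              # 20
```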

Accuracy was  evaluated by comparison to SRMs and
comparison to the referee laboratory analysis for field
samples. This included spiked field samples for evaluation
of additional concentrations not otherwise  available.   In
summary, Ohio Lumex data were within SRM 95%
prediction intervals 93% of the time, which suggests
significant equivalence to certified standards. The
comparison between the Ohio Lumex field data and the
ALSI results, however, suggests that the two data sets are
not the same. When a unified hypothesis test is performed
(which accounts for laboratory bias), this result is
confirmed. Ohio Lumex data were found to be both above
and below referee laboratory concentrations; therefore,
there is no implied or suggested bias. The number of Ohio
Lumex average values less than 30% different from the
referee laboratory results or SRM reference values,
however, was 19 of 33 different sample lots. Ohio Lumex
results, therefore, can often provide a reasonable estimate
for field determination, but may be affected by
interferences not identified by this demonstration. Because
the Ohio Lumex data compare favorably to the SRM
values,  the  differences  between  Ohio  Lumex and  the
referee   laboratory  are  likely  the  result  of  matrix
interferences.

The  precision  was determined  by analysis of replicate
samples. The precision of the Ohio Lumex field instrument
is better than the referee laboratory precision: the overall
average RSD is 22.3% for the referee laboratory
compared to the Ohio Lumex average RSD of 16.1%. This
is primarily because of the better precision obtained for the
SRM  analyses by Ohio  Lumex.   Both  the  laboratory
precision and  the Ohio  Lumex precision goals of 25%
overall RSD  were achieved.

Time measurements were based on the length of time the
operator spent performing  all phases  of the  analysis,
including setup, calibration, and sample analyses (including
all reanalysis). Ohio Lumex analyzed 197 samples in
1,275 minutes times 1.25 analysts over three days, which
averaged to 8.1 minutes per sample result. Based on this,
an operator could be expected to analyze 59 samples (8
hours x 60 minutes / 8.1 minutes/sample) in an 8-hour day.
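The throughput figures above follow from simple arithmetic:

```python
# Time-per-sample and daily throughput from the demonstration figures.
elapsed_min = 1275       # total elapsed analysis time, minutes
analysts = 1.25          # average number of analysts
n = 197                  # samples analyzed

min_per_sample = elapsed_min * analysts / n   # labor-minutes per result
print(round(min_per_sample, 1))               # 8.1

samples_per_day = int(8 * 60 / round(min_per_sample, 1))
print(samples_per_day)                        # 59 in an 8-hour day
```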
Cost of the Ohio Lumex sample analyses included capital,
supplies, labor, support equipment, and waste disposal.
The cost per sample was calculated both with and without
the cost of the instrument included. This was done
because the first sample requires that the instrument be
either purchased or rented, and as the sample number
increases, the cost per sample decreases. A comparison
of the field Ohio Lumex cost to off-site laboratory cost was
not made. To compare the field and laboratory costs
correctly, it would be necessary to include the expense
incurred to the project while waiting for analysis results to
return from the laboratory (potentially several mobilizations
and demobilizations, stand-by fees, and other aspects
associated with field activities).

Table 8-2 summarizes the results of the primary
objectives.

8.2     Secondary Objectives

Table 8-3 summarizes the results of the secondary
objectives.
Table 8-1.  Distribution of Samples Prepared for Ohio Lumex and the Referee Laboratory

                                                            Sample Type
 Site              Concentration Range          Soil   Sediment   Spiked Soil   SRM
 Carson River      Low (1-500 ppb)                3       10           7          7
 (Subtotal = 62)   Mid (0.5-50 ppm)               0        0           7         28
                   High (50->1,000 ppm)           0        0           0          0
 Puget Sound       Low (1 ppb - 10 ppm)          30        0          14         13
 (Subtotal = 67)   High (10-500 ppm)              0        3           7          0
 Oak Ridge         Low (0.1-10 ppm)              10        7           7         14
 (Subtotal = 51)   High (10-800 ppm)              3        6           0          4
 Manufacturing     General (5-1,000 ppm)         10        0           0          7
 (Subtotal = 17)
 Subtotal                                        56       26          42         73
Table 8-2. Summary of RA-915+/RP-91C Results for the Primary Objectives

Instrument Sensitivity
  Evaluation basis: MDL, per the method from 40 CFR Part 136.
  RA-915+/RP-91C: between 0.0053 and 0.042 mg/kg. Reference method: 0.0026 mg/kg.
  Evaluation basis: PQL, from low concentration SRMs or samples.
  RA-915+/RP-91C: < 0.06 mg/kg. Reference method: 0.005 mg/kg.

Accuracy
  Evaluation basis: Comparison to SRMs, field, and spiked samples covering the
  entire range of the instrument calibration.
  Results: Ohio Lumex data were within SRM 95% prediction intervals 93% of the
  time; 19 of 33 different sample lots were within 30% of the referee laboratory
  value.

Precision
  Evaluation basis: Determined by analysis of replicate samples at several
  concentrations.
  Results: Ohio Lumex overall average RSD: 16.1%.

Time per Analysis
  Evaluation basis: Timed daily operations for 2.5 days and divided the total
  time by the total number of analyses.
  Results: One technician performed half of the equipment setup and
  demobilization, most sample preparation, and all calibration checks and
  analyses. Individual analyses took 1 minute each, but the total time per
  analysis averaged approximately 8.1 minutes per sample.

Cost
  Evaluation basis: Costs were provided by Ohio Lumex and independent suppliers
  of support equipment and supplies. Labor costs were estimated based on a
  salary survey. IDW costs were estimated from the actual costs encountered at
  the Oak Ridge demonstration.
  Results: The cost per analysis based upon 197 samples, when renting the
  RA-915+/RP-91C, is $23.44 per sample. The cost per analysis for the 197
  samples, excluding capital cost, is $15.82 per sample. The total cost for
  equipment rental and necessary supplies during the demonstration is estimated
  at $4,617. The cost breakout by category is: capital costs, 32.5%; supplies,
  10.8%; support equipment, 6.0%; labor, 19.5%; and IDW, 31.2%.
Table 8-3. Summary of RA-915+/RP-91C Results for the Secondary Objectives

Ease of Use
  Evaluation basis: Field observations during the demonstration.
  Results: The RA-915+/RP-91C combination is reasonably easy to operate; lack
  of automation somewhat impairs the ease of use. Operation requires one field
  technician with a basic knowledge of chemistry acquired on the job or in a
  university, and training on the instrument.

Health and Safety Concerns
  Evaluation basis: Observation of equipment, operating procedures, and
  equipment certifications during the demonstration.
  Results: No significant health and safety concerns were noted during the
  demonstration. The only potential health and safety concerns identified were
  the generation of mercury vapors and the potential for burns with careless
  handling of hot quartz sample boats. The vendor provides a mercury filter as
  standard equipment; exercising caution and good laboratory practices can
  mitigate the potential for burns.

Portability of the Device
  Evaluation basis: Review of device specifications, measurement of key
  components, and observation of equipment setup and tear down before, during,
  and after the demonstration.
  Results: The RA-915+ air analyzer was easily portable, although the device,
  even when carried in the canvas sling, was not considered light-weight. The
  addition of the RP-91C and associated pump unit precludes this from being a
  truly field-portable instrument. The device and attachments can be
  transported in carrying cases by two people, but must then be set up in a
  stationary location. It was easy to set up, but the combined instrument is
  better characterized as mobile rather than field portable.

Instrument Durability
  Evaluation basis: Observation of equipment design and construction, and
  evaluation of any necessary repairs or instrument downtime during the
  demonstration.
  Results: The RA-915+/RP-91C combination was well designed and constructed
  for durability.

Availability of Vendor Instruments and Supplies
  Evaluation basis: Review of vendor website and telephone calls to the vendor
  after the demonstration.
  Results: The RA-915+/RP-91C combination is readily available for rental,
  lease, or purchase. Spare parts and consumable supplies can be added to the
  original instrument order or can be received within 24-48 hours of order
  placement. Standards are readily available from laboratory supply firms or
  can be acquired through Ohio Lumex.
                                             Section 9
                                           Bibliography
Anchor   Environmental. 2000.  Engineering  Design
  Report, Interim Remedial Action Log Pond Cleanup/
  Habitat  Restoration  Whatcom   Waterway  Site,
  Bellingham, WA. Prepared  for Georgia Pacific West,
  Inc. by Anchor Environmental, L.L.C., Seattle, WA. July
  31, 2000.

Confidential Manufacturing Site. 2002. Soil Boring  Data
  from a Remedial Investigation Conducted in 2000.

Ohio Lumex, 2001.  Portable Zeeman Mercury Analyzer:
  RA-915+ Analyzer; RP-91 and RP-91C Attachments.
  2001.

Rothchild, E.R., R.R. Turner, S.H. Stow, M.A. Bogle, L.K.
  Hyder, O.M. Sealand, H.J. Wyrick. 1984.  Investigation
  of Subsurface Mercury at the Oak Ridge Y-12 Plant.
  Oak Ridge National Laboratory, TN. ORNL/TM-9092.

U.S. Environmental Protection Agency. 1994. Region 9.
  Human  Health  Risk Assessment  and  Remedial
  Investigation Report - Carson River Mercury Site
  (Revised Draft).  December 1994.

U.S.  Environmental   Protection   Agency.   1995.
  Contaminants and  Remedial Options  at  Selected
  Metal-Contaminated  Sites. July  1995.  Washington
  D.C.  EPA/540/R-95/512.
U.S. Environmental Protection Agency.  1996.   Test
  Methods  for   Evaluating   Solid  Waste,
  Physical/Chemical Methods, SW-846 CD ROM, which
  contains updates for 1986, 1992, 1994,  and  1996.
  Washington DC.

U.S. Department of Energy.  1998.  Report  on the
  Remedial Investigation of the  Upper East Fork of
  Poplar Creek Characterization Area at the Oak Ridge
  Y-12 Plant, Oak Ridge, TN.  DOE/OR/01-1641&D2.

U.S.  Environmental   Protection   Agency.     1998.
  Unpublished.    Quality  Assurance  Project  Plan
  Requirements for Applied Research Projects, August
  1998.

U.S. Environmental Protection Agency. 2002a. Region
  9 Internet Web Site, www.epa.gov/region9/index.html.

U.S.   Environmental   Protection   Agency.  2002b.
  Guidance on  Data  Quality Indicators.   EPA  G-5i,
  Washington D.C., July 2002.

U.S. Environmental Protection  Agency. 2003.  Field
  Demonstration Quality Assurance Project Plan - Field
  Analysis of Mercury in Soil and Sediment. August
  2003. Washington D.C., EPA/600/R-03/053.

Wilcox,  J.W., Chairman.  1983.  Mercury at Y-12: A
  Summary of the  1983 UCC-ND Task Force Study.
  Report Y/EX-23, November 1983.

www.OhioLumex.com, 2003.
                                             Appendix A
                                     Ohio Lumex Comments
Accuracy and Precision

The accuracy of the instrument was tested under field
conditions, and this may have caused the loss of one
sample result. Also, one sample was entered in the data
sheet as 0.16 ug/kg instead of 160 ug/kg. Nevertheless,
the demonstrated accuracy (95% for SRM) and precision
(average RSD for the reference laboratory was 22.3%;
the average RSD for Ohio Lumex was 16.1%, or 7.6% for
SRM) of the Ohio Lumex instrument were better than
results obtained by the reference laboratory.

Method Detection  and Practical Quantitation Limits

The   method  detection  limits  (MDLs) and   practical
quantitation  limits (PQLs)  determined  by the  results of
testing were obtained for conditions specifically  set for the
instrument to expand the upper (high concentration) range
to 200 mg/kg. A simple change of instrument parameters
will enable the operator to change the MDL and PQL to
0.001 mg/kg and 0.005 mg/kg, respectively. A specifically
developed Pyro 915 attachment for ultra-low direct mercury
measurements enables one to achieve an MDL/PQL of
0.0001 mg/kg and 0.0005 mg/kg.
Automation

Since the time of the testing, Ohio Lumex has developed
a balance interface to automatically enter sample size into
a computer spreadsheet.

Auto Sampler - The turnaround time to analyze an
individual sample is 1 minute. More than 25 samples can
be manually processed in an hour over an 8-hour day, an
average of 2.4 minutes per sample. The time required just
to load an auto sampler would be up to 10 minutes per
sample. Also, addition of the auto sampler would affect the
reliability and portability of the system.

Portability

The instrument consists of two modules and can be easily
packed in one rolling pelican case with total weight of the
system not exceeding  60 pounds. No compressed gases
are required.  Set-up time  from  unpacking to operation is
within 1 hour.  We also have many customers using these
settings in the field in remote locations while using portable
power generators.
 This appendix was written solely by Ohio Lumex.  The statements presented in this appendix represent the developer's point of view and
 summarize the claims made by the developer regarding the RA-915+/RP-91C. Publication of this material does not represent EPA's approval
 or endorsement of the statements made in this appendix; performance assessment and economic analysis results for the RA-915+/RP-91C are
 discussed in the body of the ITVR.

                                              Appendix B
                                         Statistical Analysis
Two separate hypothesis tests were used to compare the
referee laboratory samples to the vendor tested samples.
This appendix details the equations and information for
both of  these  statistical analyses. For purposes of this
appendix, we  have chosen to  call  the test comparing
sample populations using a  separate calculation for each
sample  lot  the  "hypothesis  test,"  and  the  statistical
comparison  of the entire sample set  (all 33  separate
sample lots) analyzed by the vendor and the laboratory the
"unified  hypothesis test," also known  as an  "aggregate
analysis" for all of the sample lots.

Hypothesis Test

A hypothesis  test  is used  to determine  if two sample
populations  are  significantly different.  The  analysis is
performed based on  standard statistical calculations for
hypothesis testing.    This  incorporates a comparison
between the two sample populations assuming a specified
level of significance. For establishing the hypothesis test,
it was  assumed that  both  sample  sets  are  equal.
Therefore, if the null  hypothesis is  rejected,  then the
sample  sets are not considered equal.  This test was
performed on all sample lots  analyzed by both Ohio Lumex
and the referee laboratory. H0 and Ha, null and alternative
hypothesis respectively, were tested with a 0.01 level of
significance (LOS). The concern  related  to this test is that,
if two sample populations have highly variable data (poor
precision), then  the  null hypothesis may be  accepted
because of the test's inability to exclude poor precision as
a mitigating factor. Highly variable data results in wider
acceptance windows, and therefore, allows for acceptance
of the null hypothesis. Conclusions regarding this analysis
are  presented  in the main body of the  report.

To determine if the two sample sets are significantly
different, the absolute value of the difference between the
laboratory average x_L and the vendor average x_V is
compared to a calculated value u. When the absolute
value of the difference is greater than u, the alternative
hypothesis is accepted, and the two sets (laboratory and
vendor) are concluded to be different.

To calculate u, the variances for the laboratory data set
and the vendor data set are calculated by dividing the
squares of their standard deviations by the numbers of
samples in their data sets. The effective number of
degrees of freedom is then calculated.
Where:
        f    =  effective number of degrees of freedom
        V_L  =  variance for the laboratory results
        n_L  =  number of samples for the laboratory data set
        V_V  =  variance for the vendor results
        n_V  =  number of samples for the vendor data set.
The  degrees of freedom  (f) is  used to  determine the
appropriate "t" value and used to calculate u at the 0.01
level of significance using the following:
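The equations themselves did not survive reproduction; in the standard Welch-Satterthwaite form consistent with the definitions above (a reconstruction, so the report's exact expressions may differ slightly), they are:

```latex
V_L = \frac{s_L^2}{n_L}, \qquad V_V = \frac{s_V^2}{n_V}

f = \frac{\left(V_L + V_V\right)^2}{\dfrac{V_L^2}{n_L - 1} + \dfrac{V_V^2}{n_V - 1}}

u = t_{0.01,\,f}\,\sqrt{V_L + V_V}
```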
Unified Hypothesis Test

For a specified vendor, let Y_ij be the measured Hg
concentration for the jth replicate of the ith sample for
i = 1, 2, ..., I and j = 1, 2, ..., J_i. Let X_ij = log(Y_ij),
where log is the logarithm to the base 10. Define x_ilog to
be the average over all log replicates for the ith sample,
given by:

    x_ilog = (1/J_i) * sum over j of X_ij

Denote the estimate of the variance of the log replicates
for the ith sample to be:

    s_i^2 = (1/(J_i - 1)) * sum over j of (X_ij - x_ilog)^2

Now for the reference laboratory, let Y'_ij be the measured
Hg concentration for the jth replicate of the ith sample for
i = 1, 2, ..., I' and j = 1, 2, ..., J'_i. Denote the reference
laboratory quantities X'_ij, x'_ilog, and s'^2 defined in a
manner similar to the corresponding quantities for the
vendor.

Assumptions: Assume that the vendor measurements, Y_ij,
are independent and identically distributed according to a
lognormal distribution with parameters mu_i and sigma^2.
That is, X_ij = log(Y_ij) is distributed according to a normal
distribution with expected value mu_i and variance
sigma^2. Further, assume that the reference laboratory
measurements, Y'_ij, are independent and identically
distributed according to a lognormal distribution with
parameters mu'_i and sigma'^2.

The null hypothesis to be tested is:

    H_0: mu_i = mu'_i + delta, for some delta and i = 1, ..., I

against the alternative hypothesis that the equality does
not hold for at least one value of i.

The null hypothesis H_0 is rejected for large values of the
test statistic, which is computed from the per-sample
differences of the log means, x_ilog - x'_ilog, the bias term
delta, and the pooled variance s^2_pool, and which is
approximately a chi-square random variable with (I-1)
degrees of freedom. Critical values for the hypothesis test
are the upper percentile of the chi-square distribution with
(I-1) degrees of freedom obtained from a chi-square table.
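The calculation can be sketched computationally. The following is a reconstruction under the stated lognormal and equal-variance assumptions, not SAIC's actual code; the per-lot weighting used in the report may differ from this standard choice:

```python
# Reconstruction of the unified hypothesis test: per-lot differences of
# log10 means, a pooled within-lot variance, and a chi-square statistic.
import math
import statistics

def unified_test(vendor_lots, lab_lots, fit_delta=False):
    """Each argument is a list of lots; each lot is a list of >= 2
    replicate concentrations. Returns (chi-square, deg. of freedom, delta)."""
    logs_v = [[math.log10(y) for y in lot] for lot in vendor_lots]
    logs_l = [[math.log10(y) for y in lot] for lot in lab_lots]
    # Per-lot difference of log means, d_i = x_ilog - x'_ilog
    d = [statistics.mean(v) - statistics.mean(l) for v, l in zip(logs_v, logs_l)]
    # Pooled within-lot variance of the log replicates (equal variances assumed)
    num = sum((len(x) - 1) * statistics.variance(x) for x in logs_v + logs_l)
    den = sum(len(x) - 1 for x in logs_v + logs_l)
    s2_pool = num / den
    # Variance of each d_i under the common-variance assumption
    var_d = [s2_pool * (1.0 / len(v) + 1.0 / len(l))
             for v, l in zip(logs_v, logs_l)]
    # Optional single bias term delta (H20); zero for H10
    delta = 0.0
    if fit_delta:
        delta = (sum(di / vi for di, vi in zip(d, var_d)) /
                 sum(1.0 / vi for vi in var_d))
    chi2 = sum((di - delta) ** 2 / vi for di, vi in zip(d, var_d))
    dof = len(d) - (1 if fit_delta else 0)  # Table B-1 uses I and I-1
    return chi2, dof, delta
```

Reject the null hypothesis when the statistic exceeds the upper 99th-percentile chi-square critical value for the returned degrees of freedom (54.78 at 33 DF and 53.49 at 32 DF in the report).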

Results of Unified Hypothesis Test for Ohio Lumex

SAIC performed a unified hypothesis test analysis to
assess the comparability of analytical results provided by
Ohio Lumex and those provided by ALSI. Ohio Lumex
and ALSI both supplied multiple assays on replicates
derived from a total of 33 different sample lots, be they
field materials or reference materials, with sample lots 35
and 55 excluded because these were below the
instrument PQL. The Ohio Lumex and ALSI data from
these assays formed the basis of this assessment.

The statistical analysis is based on log-transformed
(logarithm base 10) data and uses a chi-square test for
equality of Ohio Lumex and ALSI population means for a
given sample lot. Equality of variances is assumed.

Initially, the null hypothesis tested was that, on average,
Ohio Lumex and ALSI would produce the same results
within a given sample lot. This hypothesis is stated as

 H10: (Ohio Lumex Lot log mean) = (ALSI Lot log mean)

H10 was rejected in that the chi-square statistic was
130.26, which exceeds the upper 99th percentile of the
chi-square distribution with 33 degrees of freedom, which
has a value of 54.78.

The null hypothesis was rejected in part because Ohio
Lumex results tended to exceed those from ALSI for the
same sample lot. To explore this effect, the null
hypothesis was revised to include a bias term in the
form of

 H20: (Ohio Lumex Lot log mean) = (ALSI Lot log mean)
                      + (delta),

where delta is a single value that does not change from
one sample lot to another, unlike the lot log means. H20
was also strongly rejected in that the chi-square statistic
was 101.46, which exceeded the upper 99th percentile of
the chi-square distribution with 32 degrees of freedom,
which has a value of 53.49. In this analysis, delta was
estimated to be 0.133 in logarithmic (base 10) space,
which indicates an average upward bias for Ohio Lumex
of 10^0.133 = 1.358, or about 36%.
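The back-transformation above is:

```python
# Converting the fitted log10 bias into a multiplicative factor.
delta = 0.1329           # estimated bias in log10 space (Table B-1)
factor = 10 ** delta
print(round(factor, 3))  # 1.358, i.e., about a 36% upward bias
```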

For both hypotheses, the large values of the chi-square
test statistics summarize the disagreement between the
Ohio Lumex and ALSI analytical  results.  Furthermore, a
review of the statistical analysis details indicates that the
overall discordance  between  Ohio  Lumex and ALSI
analytical results cannot be traced to the disagreement in
results for one  or two sample lots.

Summary information on these analyses is provided in
Table B-1. The p-value can be read as an attained
significance level: one normally sets a threshold in
advance (e.g., a 95% confidence level, which translates
to a p-value of 0.05) and compares the calculated p-value
against it. As noted in Table B-1, the p-values calculated
from the test statistics are so small (0.000000 when
rounded to six decimal places) that the two sample
populations are considered non-equivalent, consistent
with the large chi-square values.
Table B-1. Unified Hypothesis Test Summary Information

 Hypothesis   Total Sample Lots   Excluded Lots   DF   s_pool    Delta    Chi-square   P-value
 H10                 33              35, 55       33   0.03967   0.0000     130.26     0.000000
 H20                 33              35, 55       32   0.03967   0.1329     101.46     0.000000