June 2006
    Environmental Technology
    Verification Report


    MONITORINGSYSTEMS GMBH
    DIOXINMONITORINGSYSTEM
             Prepared by
             Battelle
            The Business of Innovation

         Under a cooperative agreement with

         U.S. Environmental Protection Agency

-------
                THE ENVIRONMENTAL TECHNOLOGY VERIFICATION
                                        PROGRAM
   U.S. Environmental Protection Agency
                                                                        Battelle
                                                                   The Business of Innovation
                      ETV Joint Verification Statement
       TECHNOLOGY TYPE:   Dioxin Emission Monitoring System

       APPLICATION:       Monitoring Incinerator Emissions

       TECHNOLOGY NAME:   DioxinMonitoringSystem

       COMPANY:           Monitoring Systems GmbH

       ADDRESS:           Schloss 2, A-2542            PHONE:  +43 664 2527239
                          Kottingbrunn, Austria        FAX:    +43 2252 70592

       WEB SITE:          www.dioxinmonitoring.com/
       E-MAIL:            office@dioxinmonitoring.com
The U.S. Environmental Protection Agency (EPA) has established the Environmental Technology Verification
(ETV) Program to facilitate the deployment of innovative or improved environmental technologies through
performance verification and dissemination of information. The goal of the ETV Program is to further
environmental protection by accelerating the acceptance and use of improved and cost-effective technologies.
ETV seeks to achieve this goal by providing high-quality, peer-reviewed data on technology performance to
those involved in the design, distribution, financing, permitting, purchase, and use of environmental
technologies. Information and ETV documents are available at www.epa.gov/etv.

ETV works in partnership with recognized standards and testing organizations, with stakeholder groups
(consisting of buyers, vendor organizations, and permitters), and with individual technology developers. The
program evaluates the performance of innovative technologies by developing test plans that are responsive to
the needs of stakeholders, conducting field or laboratory tests (as appropriate), collecting and analyzing data,
and preparing peer-reviewed reports. All evaluations are conducted in accordance with rigorous quality
assurance (QA) protocols to ensure that data of known and adequate quality are generated and that the results
are defensible.

The Advanced Monitoring Systems (AMS) Center, one of six technology areas under ETV, is operated by
Battelle in cooperation with EPA's National Exposure Research Laboratory.  The AMS Center evaluated the
performance of the Monitoring Systems GmbH DioxinMonitoringSystem in  monitoring emissions of
polychlorinated dibenzo-p-dioxins (PCDD) and polychlorinated dibenzofurans (PCDF). This verification
statement provides a summary of the test results.

-------
VERIFICATION TEST DESCRIPTION

The performance of the DioxinMonitoringSystem was evaluated in terms of relative accuracy (RA), range,
data completeness, and operational factors (ease of use, maintenance, and consumables/waste generated). RA
and range were determined by comparing DioxinMonitoringSystem results to those from Method 23
reference samples collected simultaneously. Range was determined from measurements over a variety of
defined operating conditions that produced differing levels of dioxins. Data completeness was assessed as the
percentage of maximum data return achieved by the DioxinMonitoringSystem over the test period.
Operational factors were evaluated by means of operator observations and records of needed maintenance,
vendor activities, and expendables used.

A 2.94 thousand British thermal unit per hour, 3-Pass Wetback Scotch Marine Package Boiler (SMPB),
manufactured by Superior Boiler Works, Inc., and located at the EPA Research Triangle Park facility, was
used for the verification test. During this verification test, the SMPB was fully instrumented with continuous
emission monitors for a variety of species including oxygen, carbon monoxide, carbon dioxide, water, and
hydrogen chloride. Reference samples were collected and  analyzed for dioxins using Method 23 with several
modifications.

QA oversight of verification testing was provided by Battelle and EPA. Battelle QA staff conducted a
technical systems audit, a performance evaluation audit, and a data quality audit of 10% of the test data.
Additionally, EPA QA staff conducted an independent technical systems audit.

This verification statement, the full report on which it is based, and the test/QA plan for this verification test
are all available at www.epa.gov/etv/centers/center1.html.

TECHNOLOGY DESCRIPTION

The following description of the DioxinMonitoringSystem is based on information provided by the vendor.
This technology description was not verified in this test.

The DioxinMonitoringSystem is a long-term sampling device for measuring the concentrations of PCDDs in
gas streams.  It is an automatic isokinetic sampler for measurement of PCDDs, PCDFs, and other persistent
organic pollutants. The system comprises (1) a stack-mounted dual probe system including automatic probe
switching, blowback, and cleaning, with particle filter and polyurethane foam (PUF) cartridge housing
attached and (2) a remote control unit for isokinetic sampling enabling automatic measurement control,
remote control and data download, standby/restart, and calibration. The control unit includes both menu-
driven software and a process computer. The computer monitors the function of all aggregates and registers
all data required for the subsequent evaluation of the samples taken. At regular intervals, data are stored on a
static random access memory (SRAM) card. The data on the SRAM card are later interrogated together with
the analysis results to ascertain the mass concentration.

The gas is sampled isokinetically from the gas stream by alternating the use of one of two titanium probes.
The collected gas is transferred to a titanium mixing chamber where it is diluted with dried and cooled air.
Thus, the sampled gas is cooled by keeping the dew point  below the gas  mixture temperature, which avoids
any condensation. The dry gas mixture then passes through a filter stack where the PCDDs are collected. The
filters are designed to collect the dust fraction and the gas  (or more exactly, the material passing through the
filter) fraction separately. The DioxinMonitoringSystem allows most  of the sampling to be conducted in an
unattended fashion after an initial run configuration by the operator. The device is configured
for the specific sampling location at installation, partly by the sampling institution or laboratory
that prepares and analyzes the cartridges and partly by the operator.

The system can also be configured as a single probe device. Both configurations can handle high dust
loadings (up to 150 milligrams per cubic meter) without change in performance, and flue gas velocities up to

-------
30 meters per second can be accommodated within the isokinetic control range of the overall system. The
system can also be configured to collect samples for determining heavy metals.
VERIFICATION RESULTS

Parameter Evaluated: Accuracy
Method of Evaluation: Comparison to Method 23 reference samples
Results:
                                    PCDDs       PCDFs       PCDD/Fs
    RA                              106%        18.4%       22.6%
    (RA)(a)                         (16.8%)     (17.8%)     (17.5%)
    Intermethod RSD                 85.4%       10.3%       9.7%
    (Intermethod RSD)(b)            (16.3%)     (10.4%)     (10.4%)
    Intramethod RSD                 10.0%       8.4%        8.4%

Parameter Evaluated: Range
Method of Evaluation: Comparison to Method 23 reference samples by concentration and sample
collection time
Results:
    • No dependence of accuracy on PCDD/F toxic equivalent (TEQ) over range of approximately
      1 to 6 nanograms TEQ/dry standard cubic meter
    • No dependence of accuracy on sample duration over range of 4 to 16 hours.

Parameter Evaluated: Data completeness
Method of Evaluation: Ratio of number of samples successfully collected to number of potential
samples that could have been collected
Results: 100% completeness in number of samples collected.

Parameter Evaluated: Ease of use
Method of Evaluation: Operator observations
Results:
    • Installation of the DioxinMonitoringSystem was completed by a representative of
      MonitoringSystems, GmbH, within 48 hours
    • Effectively operated after 1-2 hours of training in basic operation
    • Installation of sampling media and removal of sampling media completed in approximately
      5-15 minutes each
    • Less than 1% downtime

Parameter Evaluated: Maintenance
Method of Evaluation: Operator observations
Results: No maintenance was required during the verification test.

Parameter Evaluated: Consumables/waste generated
Method of Evaluation: Operator observations
Results: PUF cartridges were used in the sampling cartridges for sample collection.

(a) RA calculated using only congeners detected in both the DioxinMonitoringSystem and Method 23 samples.
(b) Intermethod relative standard deviation (RSD) calculated using only congeners detected in both the
    DioxinMonitoringSystem and Method 23 samples.
Original signed by Gregory A. Mack                 6/6/06
Gregory A. Mack                                    Date
Vice President
Energy, Transportation, and Environment Division
Battelle

Original signed by Lawrence W. Reiter              7/26/06
Lawrence W. Reiter                                 Date
Director
National Exposure Research Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
NOTICE: ETV verifications are based on an evaluation of technology performance under specific,
predetermined criteria and the appropriate quality assurance procedures. EPA and Battelle make no expressed or
implied warranties as to the performance of the technology and do not certify that a technology will always
operate as verified. The end user is solely responsible for complying with any and all applicable federal, state,
and local requirements. Mention of commercial product names does not imply endorsement.


-------
                                      June 2006
Environmental Technology Verification
                 Report

   ETV Advanced Monitoring Systems Center

        MonitoringSystems GmbH
         DioxinMonitoringSystem
                    by

                  Ken Cowen
                  Tom Kelly
                  Amy Dindal
                Zachary Willenberg
                  Karen Riggs
                   Battelle
               Columbus, Ohio 43201

-------
                                       Notice

The U.S. Environmental Protection Agency (EPA), through its Office of Research and
Development, has financially supported and collaborated in the extramural program described
here. This document has been peer reviewed by the Agency. Mention of trade names or
commercial products does not constitute endorsement or recommendation by the EPA for use.
                                          ii

-------
                                      Foreword

The U.S. Environmental Protection Agency (EPA) is charged by Congress with protecting the
nation's air, water, and land resources. Under a mandate of national environmental laws, the
Agency strives to formulate and implement actions leading to a compatible balance between
human activities and the ability of natural systems to support and nurture life. To meet this
mandate, the EPA's Office of Research and Development provides data and science support that
can be used to solve environmental problems and to build the scientific knowledge base needed
to manage our ecological resources wisely, to understand how pollutants affect our health, and to
prevent or reduce environmental risks.

The Environmental Technology Verification (ETV) Program has been established by the EPA to
verify the performance characteristics of innovative environmental technology across all media
and to report this objective information to permitters, buyers, and users of the technology, thus
substantially accelerating the entrance of new environmental technologies into the marketplace.
Verification organizations oversee and report verification activities based on testing and quality
assurance protocols developed with input from major stakeholders and customer groups
associated with the technology area. ETV consists of six verification centers. Information about
each of these centers can be found on the Internet at http://www.epa.gov/etv/.

Effective verifications of monitoring technologies are needed to assess environmental quality
and to supply cost and performance data to select the most appropriate technology for that
assessment. Under a cooperative agreement, Battelle has received EPA funding to plan,
coordinate, and conduct such verification tests for "Advanced Monitoring Systems for Air,
Water, and Soil" and report the results to the community at large. Information concerning this
specific environmental technology area can be found on the Internet at
http://www.epa.gov/etv/centers/center1.html.
                                           iii

-------
                                Acknowledgments

The authors wish to acknowledge the support of all those who helped plan and conduct the
verification test, analyze the data, and prepare this report. Many thanks to Dahman Touati of
ARCADIS and Dennis Tabor of U.S. Environmental Protection Agency (EPA) for their
contributions and to the Battelle staff who conducted the verification testing. We would also like
to thank Mr. Ernest Bouffard of the Connecticut DEP, Mr. Thomas Logan of USEPA, and Mr.
Todd Abel of the Chlorine Chemistry Council for their technical review of the test/quality
assurance plan and for their careful review of this verification report. We also thank the
following organizations for financial support of this verification test:

•  Chlorine Chemistry Council
•  U.S. EPA Office of Solid Waste and Emergency Response
•  U.S. EPA Office of Air Quality Planning and Standards
•  U.S. EPA Office of Research and Development.
                                          iv

-------
                                      Contents
                                                                                  Page
Notice	ii
Foreword	iii
Acknowledgments	iv
List of Abbreviations	vii
Chapter 1  Background	1
Chapter 2  Technology Description	2
Chapter 3  Test Design and Procedures	4
      3.1 Introduction	4
      3.2 Experimental Setup	5
           3.2.1 Test Facility	5
           3.2.2 Reference Samples	6
           3.2.3 DioxinMonitoringSystem Installation and Operation	8
      3.3 Test Design	9
           3.3.1 Relative Accuracy	9
           3.3.2 Range	10
           3.3.3 Data Completeness	10
           3.3.4 Operational Factors	11
Chapter 4  Quality Assurance/Quality Control	12
      4.1 Audits	12
           4.1.1 Performance Evaluation Audits	12
           4.1.2 Technical Systems Audits	13
           4.1.3 Audit of Data Quality	14
      4.2 Quality Assurance/Quality Control Reporting	14
      4.3 Data Review	14
Chapter 5  Statistical Methods and Reported Parameters	15
      5.1 Relative  Accuracy	15
      5.2 Range	16
      5.3 Data Completeness	16
      5.4 Operational Factors	16
Chapter 6  Test Results	17
      6.1 Relative  Accuracy	19
      6.2 Range	22
      6.3 Data Completeness	23
      6.4 Operational Factors	23
           6.4.1 Ease of Use	23
           6.4.2 Maintenance	24
           6.4.3 Consumables/Waste Generation	24
Chapter 7  Performance Summary	25
Chapter 8  References	26

-------
                                       Figures

Figure 2-1.  Photograph of DioxinMonitoringSystem Probe System	2
Figure 3-1.  Wetback Scotch Marine Package Boiler	5
Figure 3-2.  Illustration of Flue Gas Duct with Sampling Locations	6
Figure 3-3.  Installed DioxinMonitoringSystem Sampling Probe	8
Figure 3-4.  DioxinMonitoringSystem Control Unit	9
                                       Tables

Table 3-1. Test Run Summary	10
Table 4-1. Methods and Acceptance Criteria for PE Audit Measurements	13
Table 6-1. Summary of Test Runs and Testing Conditions	17
Table 6-2. Reference Method 23 Results	18
Table 6-3. Results from the Method 23 Reference Samples	19
Table 6-4. DioxinMonitoringSystem Results	20
Table 6-5. Summary of Results from the Method 23 Reference Samples and
          DioxinMonitoringSystem	21
Table 6-6. Relative Accuracy Results for the DioxinMonitoringSystem	22
Table 6-7. Summary of Percent Difference by Sampling Duration	22
Table 6-8. Activity Summary for DioxinMonitoringSystem	23
Table 7-1. Summary of Verification Test Results for DioxinMonitoringSystem	25
                                         vi

-------
                             List of Abbreviations

AMS             Advanced Monitoring Systems
APCS            air pollution control system
CEM             continuous emission monitor
dscm             dry standard cubic meter
EMS             emission monitoring system
EPA             U.S. Environmental Protection Agency
ETV             Environmental Technology Verification
HW             hot/wet
NIST             National Institute of Standards and Technology
PCDD           polychlorinated dibenzo-p-dioxins
PCDF            polychlorinated dibenzofurans
PE               performance evaluation
PUF             polyurethane foam
QA              quality assurance
QC              quality control
QMP             quality management plan
RA              relative accuracy
RSD             relative standard deviation
RTP              Research Triangle Park
SRAM           static random access memory
SMPB           Scotch Marine Packaged Boiler
TEQ             toxic equivalent
TSA             technical systems audit
                                        vii

-------

-------
                                      Chapter 1
                                     Background


The U.S. Environmental Protection Agency (EPA) supports the Environmental Technology
Verification (ETV) Program to facilitate the deployment of innovative environmental
technologies through performance verification and dissemination of information. The goal of the
ETV Program is to further environmental protection by accelerating the acceptance and use of
improved and cost-effective technologies. ETV seeks to achieve this goal by providing high-
quality, peer-reviewed data on technology performance to those involved in the design,
distribution, financing, permitting, purchase, and use of environmental technologies.

ETV works in partnership with recognized testing organizations; with stakeholder groups
consisting of buyers, vendor organizations, and permitters; and with the full participation of
individual technology developers. The program evaluates the performance of innovative
technologies by developing test plans that are responsive to the needs of stakeholders,
conducting field or laboratory tests (as appropriate), collecting and analyzing data, and preparing
peer-reviewed reports. All evaluations are conducted in accordance with rigorous quality
assurance (QA) protocols  to ensure that data of known and adequate quality are generated and
that the results are defensible.

The EPA's National Exposure Research Laboratory and its verification  organization partner,
Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center
recently evaluated the performance of the MonitoringSystems, GmbH, DioxinMonitoringSystem
in monitoring emissions of polychlorinated dibenzo-p-dioxins (PCDD) and polychlorinated
dibenzofurans (PCDF). This report presents the results of that verification test.

-------
                                      Chapter 2
                              Technology Description
The objective of the ETV AMS Center is to verify the performance characteristics of
environmental monitoring technologies for air, water, and soil. This verification report provides
results for the verification testing of the DioxinMonitoringSystem. Following is a description of
the DioxinMonitoringSystem, based on information provided by the vendor. The information
provided below was not verified in this test.

The DioxinMonitoringSystem is a long-term sampling device for measuring the concentrations
of PCDDs in gas streams. It is an automatic isokinetic sampler for measurement of PCDDs,
PCDFs, and other persistent organic pollutants.

The system comprises:

•  A stack-mounted dual probe system including automatic probe switching, blowback and
   cleaning, with particle filter and polyurethane foam (PUF) cartridge housing attached
   (shown in Figure 2-1).

•  A remote control unit for isokinetic sampling enabling automatic measurement control,
   remote control and data download, standby/restart, and calibration. Measurement data for
   each sample cartridge can be accessed after the sampling period.

The control unit includes both menu-driven software and a process computer. The system is
operated by five keys and a liquid crystal display screen. This screen is also used to set
parameters and retrieve important operational data. All data relevant for measurements are
stored in the form of parameters that can be released only by means of a key switch. The
computer monitors the function of all aggregates and registers all data required for the
subsequent evaluation of the samples taken. At regular intervals, data are stored on a static
random access memory (SRAM) card. The data on the SRAM card are later interrogated
together with the analysis results to ascertain the mass concentration.

Figure 2-1. Photograph of DioxinMonitoringSystem Probe System

-------
The gas is sampled isokinetically from the gas stream by alternating the use of one of two
titanium probes. The collected gas is transferred to a titanium mixing chamber where it is diluted
with dried and cooled air. Thus, the sampled gas is cooled by keeping the dew point below the
gas mixture temperature, which avoids any condensation. The dry gas mixture then passes
through a filter stack where the PCDDs are collected. The filters are designed to collect the dust
fraction and the gas (or more exactly, the material passing through the filter) fraction separately.
The collected samples are then retrieved and sent to a laboratory for analysis. The time required
for sample analysis will vary depending on the method employed and the laboratory response
time. Typical turnaround times for PCDD/F analysis are between two and four weeks. For this
verification test, the collected gas samples were analyzed in the same laboratory and by the same
method as the reference samples collected during the test.

The DioxinMonitoringSystem allows most of the sampling to be conducted in an unattended
fashion after an initial run configuration by the operator. The device is configured for the
specific sampling location at installation, partly by the sampling institution or laboratory that
prepares and analyzes the cartridges and partly by the operator.

The system can also be configured as a single probe device. Both configurations can handle high
dust  loadings (up to 150 milligrams per cubic meter) without change in performance, and flue
gas velocities up to 30 meters per second can be accommodated within the isokinetic control
range of the overall system. The system can also be configured to collect samples for
determining heavy metals.
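
As background on the isokinetic sampling mentioned above (an illustration, not vendor
information): sampling isokinetically means the gas velocity entering the probe nozzle matches
the local flue gas velocity, so the required sample flow is simply that velocity times the nozzle
open area. A minimal sketch, with a hypothetical nozzle diameter:

# Illustrative sketch: the sample flow needed for isokinetic sampling equals the local
# flue gas velocity times the nozzle open area. The nozzle size below is made up.
import math

def isokinetic_sample_flow_m3_per_h(stack_velocity_m_s, nozzle_diameter_m):
    nozzle_area_m2 = math.pi * (nozzle_diameter_m / 2.0) ** 2
    return stack_velocity_m_s * nozzle_area_m2 * 3600.0  # m3/h at stack conditions

# Example: 30 m/s flue gas (the stated upper limit) and a hypothetical 6 mm nozzle.
print(round(isokinetic_sample_flow_m3_per_h(30.0, 0.006), 2))  # ~3.05 m3/h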

The system can be controlled and periodically checked using a local area network interface or
remote access via the internet. Data can be downloaded through these links, and remote services
can be implemented.

-------
                                     Chapter 3
                            Test Design and Procedures
3.1 Introduction

EPA Method 23(1) is the certified extractive method used for quantifying dioxin emissions from
incinerators in the United States as well as in many other countries. This method is labor-
intensive and expensive, and it requires an extended time for subsequent laboratory analysis of
collected samples. As a result, Method 23 measurements are made infrequently, only for
compliance purposes and not for long- or short-term performance monitoring. Emerging
technologies are being developed to provide semi-continuous monitoring or long-term sampling
of dioxins and may have the potential to provide more information on dioxin source emissions
than the relatively few samples required under federal or state regulations. For example, in
Europe, mainly in Belgium and Germany, long-term sampling of PCDD/PCDFs has been used
for compliance measurements since 2000. However, the performance of these newly introduced
technologies has not been evaluated in the United States to determine their relative operational
capabilities.

The purpose of this verification test was to generate performance data on the
DioxinMonitoringSystem emission monitoring system. The test was conducted at EPA's
Research Triangle Park (RTP), North Carolina, campus over a period of two weeks in September
2005 and was supported by ARCADIS under a subcontract from Battelle. The accuracy and
range of the DioxinMonitoringSystem were determined through comparisons to a modified
version of the Method 23 integrated sampling method for PCDD/PCDFs, with modifications as
described in Section 3.2.2 of this report.(1) Other performance parameters such as data
completeness and operational factors were determined from operator observations.

This verification test was conducted according to procedures specified in the Test/QA Plan for
Verification of Dioxin Emission Monitoring Systems (EMSs),(2) and the Quality Management
Plan (QMP) for the ETV AMS Center.(3) As described in this report, the performance of the
DioxinMonitoringSystem was evaluated in terms of

•  Relative accuracy (RA),
•  Range,
•  Data completeness, and
•  Operational factors (ease of use, maintenance, and consumables/waste generated).

-------
RA and range were determined by comparing DioxinMonitoringSystem results to those from
reference samples collected simultaneously using Method 23 sampling trains. Range was
determined from measurements over a variety of defined operating conditions that produced
differing levels of PCDDs. Data completeness was assessed as the percentage of maximum data
return achieved by the DioxinMonitoringSystem over the test period. Operational factors were
evaluated by means of operator observations and records of needed maintenance, vendor
activities, and expendables used.
3.2 Experimental Setup
3.2.1  Test Facility

A 2.94 thousand British thermal unit per hour, 3-Pass Wetback Scotch Marine Package Boiler
(SMPB), manufactured by Superior Boiler Works, Inc., and located at the EPA RTP facility, was
used for the verification test. This boiler (Figure 3-1) is capable of firing natural gas or a variety
of fuel oils. In this test, the oil burner was used; this burner is a low-pressure, air-atomizing
nozzle that delivered a fine spray at an angle that ensured proper mixing with the air stream. The
boiler has 33 square meters of heating surface and generates up to 1,090 kilograms per hour of
saturated steam at pressures up to 15 pounds per square inch. Fuel flows were measured with a
liquid volume totalizer, and stoichiometric ratios were verified through oxygen (O2) and carbon
dioxide (CO2) emission concentrations.

Figure 3-1. Wetback Scotch Marine Package Boiler

During this verification test, the SMPB was fully instrumented with continuous emission
monitors (CEMs) for a variety of species including O2, carbon monoxide (CO), CO2, water
(H2O), and hydrogen chloride (HCl). Continuous emission monitoring of chemical species was
performed with two shared CEMs for the packaged boiler facility. One CEM bench included
four gas analyzers: high-range CO, low-range CO, O2, and CO2. HCl was measured by a
self-contained bench-scale CEM system (Bodenseewerk), which uses an Altech Hot/Wet (HW) sampling
system and a Perkin-Elmer MCS-100 Infrared Multi-Component Analyzer. The MCS is capable
of measuring up to eight compounds simultaneously, using gas filter correlation and single-beam
dual-wavelength techniques. The HW probe assembly samples flue gases, while maintaining
temperatures at elevated levels. The flue gas from the unit passes through a manifold to an air
pollution control system (APCS) consisting of a natural-gas-fired secondary combustion
chamber, a fabric filter, and an acid gas scrubber to ensure proper removal of pollutants. All
emission measurements are taken prior to the APCS. The SMPB facility was modified prior to
testing to accommodate all the requirements of the verification test. These modifications
included the addition of a section of duct equipped with several sampling ports at the exit of the
                                           5

-------
boiler to allow for the simultaneous installation of multiple dioxin EMSs and operation of
duplicate Method 23 sampling trains. Figure 3-2 shows a schematic illustration of the duct,
identifying the sampling locations for the reference sample trains and the
DioxinMonitoringSystem. As this figure shows, one Method 23 train sampled from a port
upstream in the flue gas flow from the DioxinMonitoringSystem's sampling port, and the other
sampled downstream.

Figure 3-2. Illustration of Flue Gas Duct with Sampling Locations

A surrogate chlorinated chemical (1,2-dichlorobenzene) and a source of metal atoms (copper
naphthenate) were added to the boiler fuel to promote dioxin formation for the EMS testing.(4) A
surrogate feed system was designed to safely tap the surrogate feed line to the fuel line just
before the burner nozzle. The feed system consisted of a 37-liter pressurized stainless steel tank,
in which the surrogate and the copper naphthenate were mixed.

Values for the stack gas composition  from the SMPB for each test run conducted during the
verification test are presented in Section 6.1 of this verification report.

3.2.2  Reference Samples

Reference samples were collected and analyzed  for dioxins using Method 23, with the following
modifications established before any sample collection took place:

•  Analysis was completed by high-resolution gas chromatography/low-resolution mass
   spectrometry.

-------
•  Mass locking was not used with low-resolution mass spectrometry.

•  The front and back halves of the reference samples were extracted and analyzed together
   rather than separately.

•  The internal, surrogate, and recovery standards included several that were not required in the
   standard method.

•  Extraction procedures called for in Method 23 were modified to allow more efficient
   extraction of mono- through tri-chlorinated dioxins and furans (see Section 4.1.2).

ARCADIS collected the reference method samples and coordinated their analysis, which was
conducted by EPA staff at the EPA RTP facility. To minimize potential bias caused by
interlaboratory analysis differences, the  DioxinMonitoringSystem samples were also analyzed by
EPA staff. EPA staff ensured that the analytical instrumentation was calibrated and the samples
were analyzed according to the requirements of the modified Method 23 and that the appropriate
QA/quality control (QC) activities were conducted according to the method. Records of all
calibrations and sample analyses were provided to Battelle and are maintained in the test files.

3.2.2.1  Reference Sample Collection
As shown in Figure 3-2, the Method 23  samples were collected at the two extreme locations of
the stack gas sampling section, to bracket the locations of the technologies being evaluated in
this verification test. The reference method sampling included pre-spiking the XAD-2 traps with
carbon-13 labeled PCDD/F pre-sampling surrogates. Both sampling trains consisted mainly of a
heated probe, heated box containing a cyclone and a filter, water-cooled condenser, water-cooled
XAD-2 cartridge, impinger train for water determination, leak-free vacuum line, vacuum pump,
and a dry gas and orifice meter with flow control valves and vacuum gauge. Temperatures were
measured and recorded in the hot box (set at 125°C), at the impinger train outlet, at the XAD-2
cartridge outlet (maintained to be below ambient temperature), and at the inlet and outlet of the
dry gas meter. Leak checks were conducted at the beginning and end of each sample run. Prior to
sampling, all glassware, probe materials, glass wool, and aluminum foil were cleaned following
the Method 23 cleaning procedure.

3.2.2.2  Sample Recovery
Following completion of each test run, each sampling train was recovered in a clean area; and
the cleanup procedure began as soon as  the probe was removed from the sample source location.
During the transportation between the test facility and the designated recovery area, both ends of
the heated probe and openings of the impinger assembly were sealed with aluminum foil or glass
caps.

The front-half and back-half trains were recovered separately but analyzed together since no
gas/solid phase PCDD/F speciation was required for this verification test. The probe and front
half of the filter housing for each sample train were rinsed with acetone followed by dichloro-
methane and collected in a single 250-milliliter  (mL) amber jar. The probe and front-half filter
housing were then rinsed with toluene and collected in a separate 250-mL amber jar. The filter
was recovered and placed in a Petri dish sealed with Teflon tape.

                                            7

-------
The back-half sample train, which consisted of an XAD-2 cartridge, the back-half filter housing,
glass connection, and condenser, was recovered separately. The XAD-2 resin cartridge from
each train was capped at both ends and wrapped in aluminum foil during transport. As with all
sample fractions, the XAD-2 resin cartridges remained refrigerated during storage and transport.
The back-half glassware was rinsed and collected in the same way as the front-half rinses. The
solvent rinse jars for both the front- and back-half sample trains were capped with Teflon-lined
caps, sealed with Teflon tape to prevent leakage, and stored in a refrigerated space before being
sent for analysis.

3.2.3 DioxinMonitoringSystem Installation and Operation

Figure 3-3 shows the DioxinMonitoringSystem sampling unit on the duct. Immediately prior to
each test run, a PUF sampling cartridge was installed in the DioxinMonitoringSystem sampling
unit. During the verification test, the DioxinMonitoringSystem was manually started and
programmed to stop automatically after completion of each test run. The
DioxinMonitoringSystem can also be programmed for automated start-up to allow for
unattended operation. After completion of each test run, the sampling cartridge was removed
and stored in a freezer until transport to the laboratory for analysis. Sampling data for each test
run were downloaded, printed out, and supplied to the laboratory for use in determining PCDD/F
concentrations.

Figure 3-3. Installed DioxinMonitoringSystem Sampling Probe

Figure 3-4 shows the control unit of the DioxinMonitoringSystem, which was located
approximately 2 meters from the sampling unit.

-------
Figure 3-4. DioxinMonitoringSystem Control Unit

3.3 Test Design

RA, range, data completeness, and operational factors for the DioxinMonitoringSystem were
evaluated.

3.3.1 Relative Accuracy

The RA of the DioxinMonitoringSystem was evaluated by comparing its results to simultaneous
results obtained by reference samples of the flue gas collected using Method 23. During the
verification test, a series of nine Method 23 test runs were conducted using duplicate Method 23
trains. The Method 23 trains sampled from ports located at each end of the sampling region
where the DioxinMonitoringSystem was installed, as shown in Figure 3-2. The reference
samples were recovered and submitted for analysis by the modified version of Method 23
described in Section 3.2. The PCDD/F concentrations determined by the reference methods
were compared to corresponding results from the
DioxinMonitoringSystem, averaged over the period of each Method 23 test run. During each of
the test runs, the boiler operation was maintained as constant as possible. However, the duration
of the sampling periods and the operating conditions of the boiler were changed from run to run
to provide a range of conditions under which the DioxinMonitoringSystem was evaluated. Two
sets of operating conditions were used for the test runs to generate expected high (5-10 ng
TEQ/dscm) and low (1-2 ng TEQ/dscm) PCDD/F concentrations. Test runs of various durations
were conducted under each set of operating conditions. Sampling periods of four hours were
used to assess short-term accuracy, whereas long-term accuracy was assessed from composite
samples collected over two 8-hour sampling periods on successive days (i.e., totaling 16 hours
per sample). Table 3-1 shows the sampling durations and boiler operating conditions for each of
the nine test runs. Two Method 23 trains were used to collect each reference sample during  each
test run. These trains sampled isokinetically from a single point in the gas flow, with one of the
trains sampling at each end of the sampling region.

Upon completion of each test run, the Method 23 trains were dismantled for sample recovery in
the field by ARCADIS staff, and all collected sample fractions were logged and stored for
transfer to the analytical laboratory. Subsequent to analysis, ARCADIS reviewed the data and
reported final PCDD/F concentrations from all trains in units of toxic equivalents per dry
standard cubic meter (TEQ/dscm), corrected to 7% O2. The results from the simultaneously
collected Method 23 trains were used to assess the degree of PCDD/F loss (if any) in the duct
between the two reference method sampling ports. Unless discrepancies of greater than 30%
were observed between the reference samples collected simultaneously for total measured TEQs,
the results from the reference method samples were averaged together to produce the final

-------
Table 3-1. Test Run Summary

                                                               Expected PCDD/F
Date                   Test Run    Sampling Duration           Concentration(a)
9/12/05                1           4 hours                     Low
9/13/05                2           4 hours                     Low
9/14/05 & 9/15/05      3, 4        16 hours (2 x 8 hours)      High
9/16/05                5           4 hours                     High
9/17/05                6           4 hours                     High
9/18/05 & 9/19/05      7, 8        16 hours (2 x 8 hours)      Low
9/20/05                9           8 hours                     High
(a)  Expected concentrations based on results of baseline testing. "High" corresponds to expected total PCDD/F
    TEQ of roughly 5-10 ng TEQ/dscm, and "low" corresponds to expected concentrations of roughly 1-2 ng
    TEQ/dscm.

reference data used for comparison to the DioxinMonitoringSystem results. If discrepancies of
greater than 30% were observed, the data were flagged and the samples treated as independent
samples for comparison to the DioxinMonitoringSystem.
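
To make the pairing rule above concrete, the following sketch shows one way the decision could
be implemented. It is an illustration only; the report does not give the exact formula, so the
discrepancy is assumed here to be the absolute difference in total measured TEQ expressed as a
percentage of the two-train mean, and the function name is hypothetical.

# Illustrative sketch: apply the >30% discrepancy rule used to decide whether the two
# simultaneous Method 23 trains are averaged or treated as independent samples.
def pair_reference_trains(teq_port1, teq_port7, threshold_pct=30.0):
    """teq_port1, teq_port7: total measured TEQ (ng TEQ/dscm) from the two trains."""
    mean_teq = (teq_port1 + teq_port7) / 2.0
    discrepancy_pct = abs(teq_port1 - teq_port7) / mean_teq * 100.0
    if discrepancy_pct <= threshold_pct:
        # Trains agree: average them into a single reference value.
        return {"flagged": False, "reference": [mean_teq]}
    # Trains disagree: flag the run and keep the trains as independent samples.
    return {"flagged": True, "reference": [teq_port1, teq_port7]}

# Example with the Test Run 1 totals from Table 6-3 (1.63 and 1.62 ng TEQ/dscm).
print(pair_reference_trains(1.63, 1.62))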

3.3.2  Range

Range was assessed in terms of RA over the range of measured PCDD/F concentrations and
sampling periods. The reference method samples were collected over a range of expected
PCDD/F concentrations to assess the degree of agreement of the DioxinMonitoringSystem with
the reference method. Based on results from baseline testing of the boiler conducted prior to the
verification test, the dopant injection rate and firing conditions were changed for different test
runs to achieve  different expected PCDD/F concentrations (i.e., high or low concentration).
Additionally, the duration of the test runs was varied to achieve a range of sampling periods from
4 to 16 hours. During each test run, the flue gas HCl level was used as an indicator of the
expected PCDD/F concentrations in the flue gas and the dopant injection rate was varied to
achieve different expected PCDD/F levels for the test runs.

3.3.3  Data Completeness

Data completeness was assessed based on the overall data return achieved by the
DioxinMonitoringSystem.  It was reported in terms of the percentage of acceptable samples
collected during the verification test and in terms of the percentage of time that the
DioxinMonitoringSystem was collecting samples compared with the Method 23 sampling
trains.
                                           10

-------
3.3.4 Operational Factors

Operational factors such as maintenance needs, data output, consumables used, ease of use, and
repair requirements were evaluated based on observations recorded by Battelle and facility staff,
and in some cases by the vendor. A laboratory record book maintained at the test facility was
used to enter daily observations on these factors.
                                            11

-------
                                      Chapter 4
                        Quality Assurance/Quality Control


QA/QC procedures were performed in accordance with the QMP for the AMS Center(3) and the
test/QA plan(2) for this verification test.


4.1 Audits
4.1.1  Performance Evaluation Audits

A performance evaluation (PE) audit was conducted to assess the quality of the critical
measurements associated with the reference sampling and analysis methods. In the PE audit,
critical measurements were checked by comparing them with appropriate National Institute of
Standards and Technology (NIST)-traceable standards, when available. Table 4-1 shows the
critical measurements that were audited, the audit procedures and acceptance criteria for the
audit comparisons, and the audit results. An initial PE audit of the Method 23  gas flow rate did
not meet the acceptance criterion. However, the flow transfer standard used for the audit was
found to be working improperly and therefore not appropriate for comparison. The audit was
repeated using a different flow transfer standard. The results of the second audit are presented in
the table.

The PE audit of the internal standard recovery was performed by spiking one blank Method 23
train with an NIST-traceable PCDD/F solution, provided by Battelle, and independent of the
internal standards used for the reference method  samples. The spiked train was not used to
collect a flue gas sample, but was recovered and  analyzed in the same manner as the other
Method 23 trains; and the  analytical results were compared with the spike amount to assess
recovery. The target criteria for this PE audit were 40% to 130% recovery of the internal
standards for the  tetra- through hexachlorinated compounds and 25% to 130% for the hepta- and
octachlorinated compounds. The actual recoveries were well within these limits, ranging from
101% to 120% for all compounds.
                                          12

-------
Table 4-1.  Methods and Acceptance Criteria for PE Audit Measurements

Critical                PE Audit Method                   Acceptance Criteria         Audit Results
Measurement
Method 23 gas           Compare to independent flow       ±5%                         2.2-3.4%
sample flow rate        measurement device                                            Pass
Method 23 stack         Compare to independent            ±2% absolute                0.0-0.55%
gas temperature         temperature measurement device    temperature                 Pass
Barometric              Compare to independent            ±1% absolute pressure       0.4%
pressure                pressure gauge                                                Pass
PCDD/F internal         Method spike with an              40 to 130% for tetra-       101 - 120%
standard recovery       independent PCDD/F standard       through hexachlorinated     Pass
                                                          compounds; and
                                                          25 to 130% for hepta-
                                                          and octachlorinated
                                                          compounds
PCDD/F surrogate        Field spike with an               70 to 130% recovery         91 - 107%
standard recovery       independent PCDD/F standard                                   Pass
The PE audit of the surrogate standard recovery was performed by spiking one blank XAD-2
cartridge with an NIST-traceable dioxin surrogate standard solution provided by Battelle, and
independent of the surrogate standards used for the reference method samples. This spiked
cartridge was extracted and analyzed in the same manner as the other cartridges. The target
criterion for this PE audit was 70 to 130% recovery of the surrogate standards. The actual
recoveries were well within these limits, ranging from 91% to 107% for all compounds.
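
As a simple illustration of the recovery check applied in these PE audits (a sketch, not part of the
report's procedures), percent recovery is the measured amount of the spiked standard divided by
the known spike amount, compared against the applicable acceptance window:

# Illustrative sketch: percent-recovery check for one spiked compound. The default
# window matches the 70-130% surrogate criterion quoted in the text; names are hypothetical.
def recovery_ok(measured_ng, spiked_ng, low_pct=70.0, high_pct=130.0):
    recovery_pct = 100.0 * measured_ng / spiked_ng
    return recovery_pct, low_pct <= recovery_pct <= high_pct

# Example: 0.95 ng measured against a 1.00 ng spike -> 95% recovery, within 70-130%.
print(recovery_ok(0.95, 1.00))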

4.1.2  Technical Systems Audits

The Battelle Quality Manager performed a technical systems audit (TSA) on September 13 and
14, 2005, to ensure that the verification test was being performed in accordance with the AMS
Center QMP,(3) the test/QA plan,(2) published reference methods, and any standard operating
procedures used by the test facility. In the TSA, the Battelle Quality Manager toured the test site,
observed Method 23 sampling and sample recovery, inspected documentation of reference
sample chain of custody, and reviewed laboratory record books. The Quality Manager also
checked standard certifications and Method 23 data acquisition procedures. A TSA report was
prepared, including a statement that no significant findings or corrective actions were identified.

A single deviation from the test/QA plan was documented as a result of the TSA. This deviation
involved differences between the extraction procedures used by the EPA laboratory and the
procedures in Method 23. The EPA laboratory used modified procedures that allowed for the
extraction and quantification of lower chlorinated PCDD/PCDFs  (e.g., mono- through
trichlorinated PCDD/PCDFs). The modified procedures did not impact the quality of the data for
this verification test.

Additionally, the EPA AMS Center Quality Officer conducted a TSA on September 14, 2005.
There were no significant findings or corrective actions identified during that audit.
                                           13

-------
4.1.3  Audit of Data Quality

At least 10% of the data acquired during the verification test were audited. Battelle's Quality
Manager, or designee, traced the data from the initial acquisition, through reduction and
statistical analysis, to final reporting, to ensure the integrity of the reported results. All
calculations performed on the data undergoing the audit were checked.
4.2 Quality Assurance/Quality Control Reporting

Each assessment and audit was documented in accordance with Section 3.3.4 of the QMP for the
ETV AMS Center.(3) Once the assessment report was prepared, the Battelle Verification Test
Coordinator ensured that a response was provided for each adverse finding or potential problem
and implemented any necessary follow-up corrective action. The Battelle Quality Manager
ensured that follow-up corrective action was taken. The results of the TSA were sent to the EPA.
4.3 Data Review

Data generated during this test were reviewed by a Battelle technical staff member within two
weeks of generating the data. The reviewer was familiar with the technical aspects of the
verification test, but was not the person who generated the data. The person performing the
review added his/her initials and the date to a hard copy of the record being reviewed.
                                           14

-------
                                      Chapter 5
                  Statistical Methods and Reported Parameters
The statistical methods presented in this chapter were used to verify the RA, range, and data
completeness of the DioxinMonitoringSystem during this verification test.
5.1 Relative Accuracy

The RA of the DioxinMonitoringSystem with respect to the reference sample results was
assessed as a percent bias, using Equation (1):
    RA = \frac{\left| \bar{d} \right| + \frac{t_{0.975}\,S_d}{\sqrt{n}}}{\overline{RM}} \times 100                               (1)

where:
d̄      =  the absolute value of the mean of the differences between the DioxinMonitoringSystem
           and reference sample results for each test run,
t0.975 =  the one-tailed t-value for the 97.5% confidence level,
Sd     =  the standard deviation of the differences between the DioxinMonitoringSystem and
           reference sample results for each test run,
n      =  the number of test runs, and
RM     =  the mean of the reference method results.
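
A minimal numerical sketch of Equation (1) is given below. It assumes the paired per-run results
are available as lists, uses n - 1 degrees of freedom for the t-value, and is illustrative rather than
part of the verification procedure.

# Illustrative sketch of Equation (1): relative accuracy (RA) from paired per-run
# results of the DioxinMonitoringSystem (dms) and the Method 23 reference (rm).
import numpy as np
from scipy import stats

def relative_accuracy(dms, rm):
    dms, rm = np.asarray(dms, dtype=float), np.asarray(rm, dtype=float)
    d = dms - rm                       # per-run differences
    n = len(d)                         # number of test runs
    t = stats.t.ppf(0.975, n - 1)      # one-tailed t-value, 97.5% confidence level
    return (abs(d.mean()) + t * d.std(ddof=1) / np.sqrt(n)) / rm.mean() * 100.0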

In addition to the RA, the intermethod relative standard deviation (RSD) was also calculated
according to Equation (2):

    RSD = \frac{100}{n} \sum_{i=1}^{n} \frac{s_i}{\bar{x}_i}                                                                     (2)
where

                                          15

-------
si     =  the standard deviation of the paired DioxinMonitoringSystem and reference method
          results for test run i,
x̄i     =  the average of the paired DioxinMonitoringSystem and reference method results for test
          run i, and
n      =  the number of test runs.

The intramethod RSD was also calculated using Equation (2), where the standard deviations and
averages were calculated from the duplicate reference method results for each test run.
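
The corresponding calculation for Equation (2) is sketched below. Because the printed form of
the equation did not survive cleanly, the sketch reads it as the average of the per-run relative
standard deviations defined by si and x̄i; that reading is an assumption.

# Illustrative sketch of Equation (2): RSD as the average of per-run relative standard
# deviations of paired results. For the intermethod RSD the pair is (DioxinMonitoringSystem,
# reference average); for the intramethod RSD it is the two duplicate Method 23 trains.
import numpy as np

def paired_rsd(pairs):
    """pairs: iterable of (value_a, value_b) for each test run; returns RSD in percent."""
    rsds = [np.std([a, b], ddof=1) / np.mean([a, b]) for a, b in pairs]
    return 100.0 * float(np.mean(rsds))

# Example with the duplicate-train totals for Test Runs 1 and 2 from Table 6-3.
print(paired_rsd([(1.63, 1.62), (1.19, 1.01)]))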
5.2 Range

The range of the DioxinMonitoringSystem is reported in terms of its bias relative to the reference
method, expressed both as a percent difference and absolute difference, under the variety of
boiler operating conditions and sampling durations used during the test runs.
5.3 Data Completeness

Data completeness was calculated as the percentage of the total possible data return over the
entire field period. The cause of any substantial incompleteness of data return was established
from operator observation or vendor records and noted in the discussion of data completeness
results.
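
As a brief illustration (not from the report), either convention described in Section 3.3.3 reduces
to a ratio expressed as a percentage:

# Illustrative sketch: data completeness as a percentage of the possible data return.
def completeness_pct(achieved, possible):
    """achieved/possible may be counts of acceptable samples or hours of sampling time."""
    return 100.0 * achieved / possible

# Example: 7 acceptable samples collected out of 7 possible test runs -> 100%.
print(completeness_pct(7, 7))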
5.4 Operational Factors

Operational factors were evaluated based on operator observations. No statistical comparisons of
operational factors were made.
                                           16

-------
                                      Chapter 6
                                     Test Results
The results of the verification test of the DioxinMonitoringSystem are presented below for each
of the performance parameters. Test runs were designed to be either 4- or 8-hour periods at high
or low PCDD/F concentrations. Table 6-1 presents a summary of the test runs that were
completed during the verification test along with a summary of the flue gas conditions.

Table 6-1.  Summary of Test Runs and Testing Conditions

Test                      Duration   Expected       Stack Temp.   O2 Conc.   CO2 Conc.   H2O Conc.
Run    Date               (hours)    PCDD/F Conc.   (°F)          (%)        (%)         (%)
1      9/12/2005          4          Low            312.0         4.28       12.85       11.0
2      9/13/2005          4          Low            313.5         4.72       12.77       10.8
3      9/14/2005(a)       8          High           305.5         4.30       12.98       11.1
4      9/15/2005(a)       8          High           309.5         5.38       12.22       11.0
5      9/16/2005          4          High           319.0         5.04       12.31       11.0
6      9/19/2005          4          High           316.5         5.09       12.23       10.8
7      9/20/2005(a)       8          Low            303.0         4.8        12.36       11.9
8      9/21/2005(a)       8          Low            305.5         3.12       13.35       11.7
9      9/22/2005          8          High           315.5         3.38       13.04       11.1
(a) The samples for Test Runs 3 and 4 and for Test Runs 7 and 8 were collected on a single cartridge for the
    DioxinMonitoringSystem and analyzed as a single 16-hour test run.
Table 6-2 lists the reference method results for each test run. The results are presented for the
Method 23 samples that were collected at the first sampling port (Port 1) and the seventh
sampling port (Port 7). The top portion of the table shows the readings for individual dioxin and
furan congeners. The lower portion of the table summarizes the TEQ values for each test run
according to PCDDs, PCDFs, and the total. All results were corrected to 7% O2.
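
For readers unfamiliar with the toxic equivalent (TEQ) convention used in Tables 6-2 through
6-5, a TEQ is the sum of each congener concentration weighted by its toxic equivalency factor
(TEF). The report does not state which TEF scheme was applied; the sketch below assumes the
WHO-1998 factors purely for illustration, with congener names following Table 6-2.

# Illustrative sketch: total TEQ as a TEF-weighted sum of congener concentrations.
# The WHO-1998 TEF values below are an assumption; the report does not name its scheme.
WHO_1998_TEF = {
    "2,3,7,8-TeCDD": 1.0, "1,2,3,7,8-PeCDD": 1.0,
    "1,2,3,4,7,8-HxCDD": 0.1, "1,2,3,6,7,8-HxCDD": 0.1, "1,2,3,7,8,9-HxCDD": 0.1,
    "1,2,3,4,6,7,8-HpCDD": 0.01, "OCDD": 0.0001,
    "2,3,7,8-TeCDF": 0.1, "1,2,3,7,8-PeCDF": 0.05, "2,3,4,7,8-PeCDF": 0.5,
    "1,2,3,4,7,8-HxCDF": 0.1, "1,2,3,6,7,8-HxCDF": 0.1,
    "2,3,4,6,7,8-HxCDF": 0.1, "1,2,3,7,8,9-HxCDF": 0.1,
    "1,2,3,4,6,7,8-HpCDF": 0.01, "1,2,3,4,7,8,9-HpCDF": 0.01, "OCDF": 0.0001,
}

def total_teq(concentrations_ng_per_dscm):
    """Sum of congener concentrations (ng/dscm) weighted by their assumed TEFs."""
    return sum(WHO_1998_TEF[name] * conc
               for name, conc in concentrations_ng_per_dscm.items())

# Example: the two PCDD congeners detected in Test Run 1 of Table 6-4 give
# 0.86*0.01 + 1.96*0.0001, which rounds to 0.01 ng TEQ/dscm.
print(round(total_teq({"1,2,3,4,6,7,8-HpCDD": 0.86, "OCDD": 1.96}), 2))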
                                           17

-------
     Table 6-2.  Reference Method 23 Results

                            Concentration [ng/dscm @ 7% O2]
                            Test Run 1      Test Run 2      Test Run 3      Test Run 4      Test Run 5
Compound                    Port 1  Port 7  Port 1  Port 7  Port 1  Port 7  Port 1  Port 7  Port 1  Port 7
2,3,7,8 - TeCDD             0.0     0.0     0.0     0.0     0.1     0.1     0.1     0.1     0.1     0.0
1,2,3,7,8 - PeCDD           0.2     0.2     0.1     0.1     0.3     0.3     0.3     0.3     0.3     0.2
1,2,3,4,7,8 - HxCDD         0.1     0.1     0.1     0.1     0.3     0.3     0.3     0.3     0.3     0.2
1,2,3,6,7,8 - HxCDD         0.1     0.1     0.1     0.1     0.3     0.3     0.3     0.3     0.3     0.3
1,2,3,7,8,9 - HxCDD         0.1     0.1     0.0     0.1     0.2     0.2     0.2     0.2     0.2     0.2
1,2,3,4,6,7,8 - HpCDD       0.5     0.5     0.4     0.4     1.6     1.8     2.0     2.0     1.8     1.4
1,2,3,4,6,7,8,9 - OCDD      0.8     0.8     0.7     0.6     3.0     3.3     4.6     4.5     3.2     2.6
2,3,7,8 - TeCDF             0.7     0.6     0.4     0.4     2.5     2.5     2.0     2.3     1.8     1.6
1,2,3,7,8 - PeCDF           0.8     0.8     0.6     0.5     3.2     3.4     2.9     3.4     3.0     2.4
2,3,4,7,8 - PeCDF           1.8     1.8     1.3     1.1     6.8     7.2     6.2     7.1     6.5     5.2
1,2,3,4,7,8 - HxCDF         1.6     1.6     1.2     1.1     6.1     6.8     6.5     7.3     7.2     5.7
1,2,3,6,7,8 - HxCDF         1.1     1.2     0.9     0.8     4.8     5.3     4.9     5.6     5.4     4.2
2,3,4,6,7,8 - HxCDF         0.9     0.9     0.6     0.5     3.3     3.7     3.2     3.8     3.6     2.7
1,2,3,7,8,9 - HxCDF         0.1     0.1     0.0     0.0     0.3     0.3     0.2     0.3     0.3     0.2
1,2,3,4,6,7,8 - HpCDF       3.2     3.5     2.6     2.4     12.7    13.7    15.9    16.7    15.5    12.2
1,2,3,4,7,8,9 - HpCDF       0.4     0.5     0.3     0.3     2.0     2.2     2.1     2.2     2.1     1.6
1,2,3,4,6,7,8,9 - OCDF      1.0     1.3     0.9     0.9     6.2     6.5     8.6     7.9     6.7     5.3

                            Concentration [ng TEQ/dscm @ 7% O2]
Total PCDD TEQ              0.22    0.23    0.17    0.14    0.42    0.46    0.42    0.44    0.42    0.35
Total PCDF TEQ              1.41    1.39    1.03    0.88    5.39    5.76    5.13    5.82    5.41    4.28
Total PCDD/F TEQ            1.63    1.62    1.19    1.01    5.81    6.22    5.55    6.26    5.84    4.63

     Table 6-2.  Reference Method 23 Results (continued)

                            Concentration [ng/dscm @ 7% O2]
                            Test Run 6      Test Run 7      Test Run 8      Test Run 9
Compound                    Port 1  Port 7  Port 1  Port 7  Port 1  Port 7  Port 1  Port 7
2,3,7,8 - TeCDD             0.0     0.1     0.0     0.0     0.0     0.0     0.0     0.0
1,2,3,7,8 - PeCDD           0.2     0.2     0.1     0.1     0.1     0.0     0.1     0.1
1,2,3,4,7,8 - HxCDD         0.2     0.2     0.1     0.1     0.1     0.1     0.2     0.2
1,2,3,6,7,8 - HxCDD         0.3     0.2     0.1     0.1     0.1     0.1     0.2     0.2
1,2,3,7,8,9 - HxCDD         0.1     0.1     0.1     0.0     0.0     0.1     0.1     0.1
1,2,3,4,6,7,8 - HpCDD       1.4     1.3     0.4     0.4     0.3     0.4     1.0     1.1
1,2,3,4,6,7,8,9 - OCDD      3.1     2.8     0.7     0.6     0.5     0.6     1.8     1.8
2,3,7,8 - TeCDF             1.6     1.4     0.4     0.4     0.2     0.2     1.6     1.5
1,2,3,7,8 - PeCDF           2.3     2.2     0.6     0.6     0.4     0.4     2.1     2.0
2,3,4,7,8 - PeCDF           5.4     4.9     1.3     1.2     1.0     0.9     4.6     4.4
1,2,3,4,7,8 - HxCDF         5.7     5.3     1.6     1.5     1.2     1.2     4.5     4.6
1,2,3,6,7,8 - HxCDF         4.3     4.1     1.2     1.1     0.9     0.9     3.4     3.4
2,3,4,6,7,8 - HxCDF         3.0     2.8     0.8     0.7     0.6     0.6     2.3     2.3
1,2,3,7,8,9 - HxCDF         0.2     0.2     0.1     0.1     0.1     0.0     0.2     0.2
1,2,3,4,6,7,8 - HpCDF       13.3    12.5    3.7     3.4     2.7     2.8     9.6     9.7
1,2,3,4,7,8,9 - HpCDF       1.4     1.4     0.4     0.3     0.3     0.3     1.4     1.5
1,2,3,4,6,7,8,9 - OCDF      4.8     4.5     1.1     1.0     0.9     0.8     4.3     4.1

                            Concentration [ng TEQ/dscm @ 7% O2]
Total PCDD TEQ              0.31    0.29    0.11    0.10    0.10    0.07    0.23    0.25
Total PCDF TEQ              4.43    4.08    1.13    1.07    0.83    0.81    3.71    3.60
Total PCDD/F TEQ            4.74    4.37    1.24    1.17    0.93    0.87    3.94    3.85
                                            18

-------
The TEQ values for each test run are also presented in Table 6-3, along with the calculated
percent difference between the results from the two Method 23 trains. With the exception of the
TEQ results for PCDDs in Test Run 8, the results from the two trains are all within 30%,
indicating no substantial biases based on the sampling port locations. Even for Test Run 8, the
large relative difference observed for the PCDDs is magnified because of the low absolute
concentrations of PCDDs in that run. The PCDFs for that test run agree well for the two trains,
indicating that there was no substantial bias between the ports for that run. The average of the
two trains' results was used in all cases for evaluation of the DioxinMonitoringSystem.
Table 6-3.  Results from the Method 23 Reference Samples

                      PCDD TEQ                      PCDF TEQ                   Total PCDD/F TEQ
Test Run    Port #1  Port #7  % Diff.     Port #1  Port #7  % Diff.     Port #1  Port #7  % Diff.
1             0.22     0.23    -5.5%        1.41     1.39     0.3%        1.63     1.62     0.6%
2             0.17     0.14    17.7%        1.03     0.88    16.1%        1.19     1.01    16.4%
3             0.42     0.46    -7.5%        5.39     5.76    -6.8%        5.81     6.22    -6.8%
4             0.42     0.44    -5.3%        5.13     5.82   -12.0%        5.55     6.26   -12.0%
5             0.42     0.35    18.9%        5.41     4.28    23.1%        5.84     4.63    23.1%
6             0.31     0.29     6.6%        4.43     4.08     8.1%        4.74     4.37     8.1%
7             0.11     0.10    12.0%        1.13     1.07     6.1%        1.24     1.17     5.8%
8             0.10     0.07    36.4%        0.83     0.81     6.3%        0.93     0.87     6.7%
9             0.23     0.25   -10.0%        3.71     3.60     2.4%        3.94     3.85     2.3%
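
The % Diff. values in Table 6-3 are consistent with the difference between the two trains taken
relative to their mean. The exact definition used in the report is not repeated here; the sketch
below is an assumption that happens to reproduce the Total PCDD/F TEQ column of the table.

    # Sketch of the train-to-train percent difference in Table 6-3, assuming
    # 100 * (Port 1 - Port 7) / mean(Port 1, Port 7). Values are the Total PCDD/F
    # TEQ results (ng TEQ/dscm @ 7% O2) from Table 6-3.

    port1 = [1.63, 1.19, 5.81, 5.55, 5.84, 4.74, 1.24, 0.93, 3.94]
    port7 = [1.62, 1.01, 6.22, 6.26, 4.63, 4.37, 1.17, 0.87, 3.85]

    for run, (p1, p7) in enumerate(zip(port1, port7), start=1):
        pct_diff = 100.0 * (p1 - p7) / ((p1 + p7) / 2.0)
        print(f"Test Run {run}: {pct_diff:+.1f}%")   # matches the Total PCDD/F % Diff. column
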
6.1 Relative Accuracy

Table 6-4 displays the analytical results of the DioxinMonitoringSystem samples for individual
dioxin and furan congeners, as well as the TEQ values for PCDDs, PCDFs, and the PCDD/F
totals. Note that a single composite sample was collected for Test Runs 3 and 4, as well as for
Test Runs 7 and 8. As with the reference method samples, these results have been corrected to
7% O2. In Table 6-5, the DioxinMonitoringSystem results are presented along with the averaged
result from the reference method for each test run. In this table, the reference method results for
Test Runs 3 and 4, and also for Test Runs 7 and 8, were each combined to represent a single
sample totaling 16 hours. The percent difference between the reference method results and the
DioxinMonitoringSystem results is shown for each test run. In five of the seven test runs, the
DioxinMonitoringSystem results were lower than the reference method results. The percent
differences range from -18.7% to 18.0% across the test runs.
Table 6-4.  DioxinMonitoringSystem Results

                         Concentration [ng/dscm @ 7% O2]
                             Test     Test     Test     Test     Test     Test     Test
Compound                     Run 1    Run 2    Run 3-4  Run 5    Run 6    Run 7-8  Run 9
2,3,7,8-TeCDD                 ND       ND       ND       ND       ND       ND       ND
1,2,3,7,8-PeCDD               ND       ND       0.25     ND       ND       0.06     0.13
1,2,3,4,7,8-HxCDD             ND       ND       0.21     ND       0.23     0.05     0.16
1,2,3,6,7,8-HxCDD             ND       ND       ND       ND       0.30     ND       0.28
1,2,3,7,8,9-HxCDD             ND       ND       0.20     ND       ND       0.04     0.12
1,2,3,4,6,7,8-HpCDD           0.86     0.42     1.55     1.55     1.13     0.34     0.95
1,2,3,4,6,7,8,9-OCDD          1.96     0.94     2.77     2.72     2.08     0.58     1.53
2,3,7,8-TeCDF                 0.84     0.42     2.11     1.39     1.28     0.30     1.37
1,2,3,7,8-PeCDF               1.04     0.54     3.08     2.42     1.96     0.52     1.85
2,3,4,7,8-PeCDF               2.57     1.27     6.05     5.28     4.42     1.11     4.15
1,2,3,4,7,8-HxCDF             1.77     1.15     6.21     5.40     4.58     1.45     4.29
1,2,3,6,7,8-HxCDF             1.36     0.86     6.10     4.09     3.51     1.09     3.21
2,3,4,6,7,8-HxCDF             1.22     0.75     3.12     2.81     2.56     0.73     2.21
1,2,3,7,8,9-HxCDF             ND       ND       0.26     0.27     0.18     0.06     0.19
1,2,3,4,6,7,8-HpCDF           4.05     2.34    12.66    12.82    10.51     3.07     8.78
1,2,3,4,7,8,9-HpCDF           0.92     0.46     1.76     1.64     1.18     0.32     1.21
1,2,3,4,6,7,8,9-OCDF          2.00     1.22     5.71     5.08     3.31     0.89     3.13

                         Concentration [ng TEQ/dscm @ 7% O2]
Total PCDD TEQ                0.01     0.00     0.31     0.02     0.06     0.08     0.20
Total PCDF TEQ                1.91     1.01     5.10     4.30     3.64     0.98     3.39
Total PCDD/F TEQ              1.92     1.01     5.41     4.32     3.70     1.06     3.59

ND - Not detected
Table 6-5.  Summary of Results from the Method 23 Reference Samples and DioxinMonitoringSystem

              Average Method 23        DioxinMonitoringSystem
              Total PCDD/F Results     Total PCDD/F Results       Difference        Percent
Test Run      (ng TEQ/dscm)            (ng TEQ/dscm)              (ng TEQ/dscm)     Difference
1                  1.62                     1.92                      0.29            18.0
2                  1.10                     1.01                     -0.09            -8.3
3 and 4(a)         5.96                     5.41                     -0.54            -9.1
5                  5.23                     4.32                     -0.91           -17.5
6                  4.55                     3.70                     -0.85           -18.7
7 and 8(a)         1.05                     1.06                      0.00             0.3
9                  3.89                     3.59                     -0.30            -7.7
(a) The samples for Test Runs 3 and 4 and 7 and 8 were collected on a single cartridge for the
    DioxinMonitoringSystem and analyzed as a single 16-hour test run.
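
The averaged Method 23 values in Table 6-5 can be traced back to the port-by-port results in
Table 6-3, with Test Runs 3/4 and 7/8 combined to match the 16-hour composite samples and
the percent difference taken relative to the Method 23 average. The sketch below illustrates that
bookkeeping; it is an assumption consistent with the tabulated values, which the report computed
from unrounded data.

    # Sketch of how the Table 6-5 comparison can be reconstructed from Table 6-3.
    # Method 23 totals (Total PCDD/F TEQ, ng TEQ/dscm) are averaged over the two ports,
    # runs 3/4 and 7/8 are then averaged to match the 16-hour composite samples, and the
    # percent difference is taken relative to the Method 23 result. Small discrepancies
    # with Table 6-5 reflect rounding, since the report used unrounded values.

    port_avg = {run: (p1 + p7) / 2 for run, (p1, p7) in enumerate(
        zip([1.63, 1.19, 5.81, 5.55, 5.84, 4.74, 1.24, 0.93, 3.94],
            [1.62, 1.01, 6.22, 6.26, 4.63, 4.37, 1.17, 0.87, 3.85]), start=1)}

    method23 = {
        "1": port_avg[1],
        "2": port_avg[2],
        "3 and 4": (port_avg[3] + port_avg[4]) / 2,
        "5": port_avg[5],
        "6": port_avg[6],
        "7 and 8": (port_avg[7] + port_avg[8]) / 2,
        "9": port_avg[9],
    }
    dms = {"1": 1.92, "2": 1.01, "3 and 4": 5.41, "5": 4.32,
           "6": 3.70, "7 and 8": 1.06, "9": 3.59}

    for run, ref in method23.items():
        diff = dms[run] - ref
        print(f"Run {run}: M23 {ref:.2f}, DMS {dms[run]:.2f}, "
              f"diff {diff:+.2f}, {100 * diff / ref:+.1f}%")
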
Table 6-6 shows the relative accuracy results for the DioxinMonitoringSystem, expressed as a
percent as calculated by Equation (1) (Section 5.1). The RA result for the combined PCDD/F
measurements is 22.6%. Separately, the RA is 106.0% for the PCDDs and 18.4% for the PCDFs.
None of the PCDD results were above 0.5 ng TEQ/dscm, so the PCDD RA is large (106.0%)
even though the absolute differences between the reference method and the
DioxinMonitoringSystem are small; as a result, those differences have little impact on the total
PCDD/F RA. The calculation of RA includes the absolute differences between the measurements
for the test runs as well as the standard deviation of the differences for all the runs. As a result,
the RA percentages reported in Table 6-6 are greater than the percent differences shown in Table
6-5. Furthermore, as seen in Table 6-4, in several instances congeners were not detected in the
analysis of the DioxinMonitoringSystem samples. The DioxinMonitoringSystem is typically used
to collect samples over periods of weeks rather than hours, so it is not unexpected that some
congeners were not detected in the collected samples. To remove the influence of non-detects,
the RA was also calculated using only those congeners that were detected in both the
DioxinMonitoringSystem and the Method 23 samples. These values are included parenthetically
in Table 6-6. In addition, the intermethod RSD of the differences between the
DioxinMonitoringSystem and the average of the Method 23 results is shown along with the
intramethod RSD between the two Method 23 trains. The intermethod RSD calculated excluding
non-detected congeners is presented parenthetically.
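
Equation (1) is not repeated here, but the reported values are consistent with the standard
relative accuracy formulation (mean absolute difference plus a 97.5% confidence coefficient on
the differences, divided by the mean reference value). The sketch below applies that assumed
formulation to the Table 6-5 totals.

    # Sketch of the relative accuracy (RA) calculation, assuming the standard formulation
    # RA = (|mean difference| + CC) / (mean reference value) * 100, with
    # CC = t(0.975, n-1) * s_d / sqrt(n). Inputs are the Total PCDD/F TEQ results from
    # Table 6-5 (ng TEQ/dscm); the rounded inputs give ~22.7%, versus 22.6% in Table 6-6.

    import math
    import statistics

    method23 = [1.62, 1.10, 5.96, 5.23, 4.55, 1.05, 3.89]   # averaged Method 23 results
    dms      = [1.92, 1.01, 5.41, 4.32, 3.70, 1.06, 3.59]   # DioxinMonitoringSystem results

    diffs = [d - r for d, r in zip(dms, method23)]
    n = len(diffs)
    mean_diff = statistics.mean(diffs)
    s_d = statistics.stdev(diffs)        # sample standard deviation of the differences
    t_975 = 2.447                        # two-sided 95% t-value for n - 1 = 6 degrees of freedom
    cc = t_975 * s_d / math.sqrt(n)      # confidence coefficient
    ra = 100 * (abs(mean_diff) + cc) / statistics.mean(method23)

    print(f"PCDD/F RA = {ra:.1f}%")
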
Table 6-6.  Relative Accuracy Results for the DioxinMonitoringSystem

Parameter             RA (%)             Intermethod RSD (%)    Intramethod RSD (%)
PCDD TEQ (n = 7)      106.0 (16.8)(a)    85.4 (16.3)(b)         10.0
PCDF TEQ (n = 7)       18.4 (17.8)(a)    10.3 (10.4)(b)          8.4
PCDD/F TEQ (n = 7)     22.6 (17.5)(a)     9.7 (10.4)(b)          8.4
(a) RA calculated using only congeners detected in both the DioxinMonitoringSystem and Method 23 samples.
(b) Intermethod RSD calculated using only congeners detected in both the DioxinMonitoringSystem and Method
    23 samples.
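
The RSD entries in Table 6-6 are consistent with computing, for each test run, the relative
standard deviation of the measurement pair (the two Method 23 trains for the intramethod value;
the DioxinMonitoringSystem versus the averaged Method 23 result for the intermethod value) and
pooling the seven runs as a root mean square. That definition is inferred from the reported
values rather than quoted from the report; a sketch for the Total PCDD/F TEQ column follows.

    # Sketch of one RSD definition consistent with Table 6-6: for each run, compute the
    # RSD of the measurement pair (standard deviation / mean), then pool the seven runs
    # as a root mean square. Inputs are Total PCDD/F TEQ values (ng TEQ/dscm).

    import math
    import statistics

    def pair_rsd(a: float, b: float) -> float:
        """Percent RSD of a two-value pair."""
        return 100 * statistics.stdev([a, b]) / statistics.mean([a, b])

    def pooled_rsd(pairs) -> float:
        """Root mean square of the per-run pair RSDs."""
        return math.sqrt(statistics.mean([pair_rsd(a, b) ** 2 for a, b in pairs]))

    # Intermethod: DioxinMonitoringSystem vs. averaged Method 23 result (Table 6-5).
    inter = list(zip([1.92, 1.01, 5.41, 4.32, 3.70, 1.06, 3.59],
                     [1.62, 1.10, 5.96, 5.23, 4.55, 1.05, 3.89]))
    # Intramethod: Port 1 vs. Port 7 trains (Table 6-3, runs 3/4 and 7/8 port values averaged).
    intra = [(1.63, 1.62), (1.19, 1.01), (5.68, 6.24), (5.84, 4.63),
             (4.74, 4.37), (1.085, 1.02), (3.94, 3.85)]

    print(f"Intermethod RSD ~ {pooled_rsd(inter):.1f}%")   # ~9.7%, matching Table 6-6
    print(f"Intramethod RSD ~ {pooled_rsd(intra):.1f}%")   # ~8.5% from rounded inputs; Table 6-6 reports 8.4%
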
6.2 Range

The range of the DioxinMonitoringSystem is reported in terms of percent difference from the
reference method under the variety of boiler operating conditions and sampling durations used
during the test runs. Table 6-5 shows that, overall, no clear pattern exists in terms of the percent
difference as a function of total TEQ concentration. The greatest absolute percent difference
between the DioxinMonitoringSystem and Method 23 results was 18.7%, and the smallest
absolute percent difference was 0.3%.

Table 6-7 summarizes the test runs by sampling duration. The average of the absolute values of
the individual percent differences for 4-hour test runs was 15.6%, and the average for the 8- and
16-hour test runs was 5.7%. The percent differences varied considerably within both  groups. For
example, the largest positive difference  (18.0%)  and the largest negative difference (-18.7%)
both occurred in the set of 4-hour samples. There was no apparent dependence of
DioxinMonitoringSystem accuracy relative to Method 23 on the length of the sampling run
during this test, since the observed differences were on the same order of magnitude as the
differences between the duplicate Method 23 trains.

Table 6-7.  Summary of Percent Difference by Sampling Duration

Duration     Test Run     % Difference
16 hr        3 and 4          -9.1
16 hr        7 and 8           0.3
8 hr         9                -7.7
> 4 hr Average Absolute % Diff:    5.7
4 hr         1                18.0
4 hr         2                -8.3
4 hr         5               -17.5
4 hr         6               -18.7
4 hr Average Absolute % Diff:     15.6
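
The group averages in Table 6-7 are simple means of the absolute percent differences; a short
check using the tabulated values:

    # Check of the duration-group averages in Table 6-7: the mean of the absolute
    # percent differences within each group.

    longer_runs = [-9.1, 0.3, -7.7]            # 16-hr, 16-hr, and 8-hr runs
    four_hr_runs = [18.0, -8.3, -17.5, -18.7]  # 4-hr runs

    for label, values in [("> 4 hr", longer_runs), ("4 hr", four_hr_runs)]:
        avg_abs = sum(abs(v) for v in values) / len(values)
        print(f"{label} average absolute % difference: {avg_abs:.1f}")   # 5.7 and 15.6
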
6.3 Data Completeness

Samples were successfully collected during all of the test runs, and the results of the analyses of
these samples are presented in Section 6.1. As a result, the data completeness for the
DioxinMonitoringSystem was 100% for the verification test.
6.4 Operational Factors

Table 6-8 summarizes the activities performed on the DioxinMonitoringSystem during the
verification test, the time required to perform each activity, and any downtime incurred in
completing those activities.

Table 6-8.  Activity Summary for DioxinMonitoringSystem

Date      Duration      Activity                                                Down Time
9/12/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/12/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/13/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/13/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/14/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/15/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/16/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/16/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/19/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/19/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/20/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/20/05   10 minutes    Failed leak test, inspected inner filter and removed    10 minutes
                        excess filter from seal
9/21/05    5 minutes    Sample recovery, data retrieval                         NA(a)
9/22/05   15 minutes    Sample installation, instrument set-up, diagnostics     NA(a)
9/22/05    5 minutes    Sample recovery, data retrieval                         NA(a)
(a) NA = Not applicable. Sample installation and recovery are performed outside of the sampling period.
6.4.1  Ease of Use

The DioxinMonitoringSystem was installed by a single representative of Monitoring Systems
GmbH and was completely ready for testing within 2 days after the start of installation.
Operation of the DioxinMonitoringSystem during the verification test was conducted by a
representative of Monitoring Systems GmbH. Instruction was given to representatives of Battelle
for approximately one hour on operation of the DioxinMonitoringSystem, including installation
and retrieval of sampling media, and programming of the system for automated sample
collection. This instruction was sufficient for basic operation of the DioxinMonitoringSystem.
More thorough instruction would be necessary for more advanced activities associated with the
system.

Installation and retrieval of the sampling media required approximately 5 to 15 minutes for each
process. The DioxinMonitoringSystem experienced approximately 10 minutes of downtime
during the verification test, which accounts for less than 1% of the total sampling time for all
test runs combined.
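
The less-than-1% figure follows from roughly 10 minutes of downtime against the combined
sampling time; a quick check assuming the nominal run durations from Table 6-7 (four 4-hour
runs, one 8-hour run, and two 16-hour composite runs):

    # Check of the downtime fraction: ~10 minutes of downtime against the nominal
    # combined sampling time (four 4-hr runs, one 8-hr run, two 16-hr composite runs).

    downtime_min = 10
    total_sampling_min = (4 * 4 + 8 + 2 * 16) * 60   # 56 hours = 3,360 minutes
    print(f"Downtime fraction: {100 * downtime_min / total_sampling_min:.2f}%")   # ~0.30%
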
6.4.2 Maintenance

For the purpose of this verification report, sample installation/recovery and system setup were
not considered to be maintenance activities. Outside of routine sample installation/recovery and
system setup, no maintenance was performed on the DioxinMonitoringSystem during the
verification test.

6.4.3 Consumables/Waste Generation

During the verification test, the DioxinMonitoringSystem required several standard consumable
materials. These included the PUF cartridges used in the sampling cartridge for sample
collection, as well as the solvents and dioxin standards used in the extraction and analysis of the
collected samples.
                                       Chapter 7
                               Performance Summary
Table 7-1 summarizes the results of this verification test for the DioxinMonitoringSystem.

Table 7-1.  Summary of Verification Test Results for DioxinMonitoringSystem

Parameter           Method of Evaluation              Results
Evaluated
Accuracy            Comparison to Method 23                                    PCDDs     PCDFs    PCDD/Fs
                    reference samples                 RA                       106%      18.4%    22.6%
                                                      (RA)(a)                  (16.8%)   (17.8%)  (17.5%)
                                                      Intermethod RSD          85.4%     10.3%    9.7%
                                                      (Intermethod RSD)(b)     (16.3%)   (10.4%)  (10.4%)
                                                      Intramethod RSD          10.0%     8.4%     8.4%
Range               Comparison to Method 23           • No dependence of accuracy on PCDD/F TEQ over a range
                    reference samples by                of approximately 1 to 6 ng TEQ/dscm
                    concentration and sample          • No dependence of accuracy on sample duration over a
                    collection time                     range of 4 to 16 hours
Data completeness   Ratio of number of samples        100% completeness in number of samples collected
                    successfully collected to
                    number of potential samples
                    that could have been collected
Ease of use         Operator observations             • Installation of the DioxinMonitoringSystem was completed
                                                        by a representative of Monitoring Systems GmbH within
                                                        48 hours
                                                      • Effectively operated after 1-2 hours of training in basic
                                                        operation
                                                      • Installation and removal of sampling media completed in
                                                        approximately 5-15 minutes each
                                                      • Less than 1% downtime
Maintenance         Operator observations             No maintenance was required during the verification test
Consumables/waste   Operator observations             PUF cartridges were used in the sampling cartridges for
generated                                             sample collection
(a) RA calculated using only congeners detected in both the DioxinMonitoringSystem and Method 23 samples.
(b) Intermethod RSD calculated using only congeners detected in both the DioxinMonitoringSystem and Method
    23 samples.
                                    Chapter 8
                                    References
1.  U.S. EPA Method 23—Determination of Polychlorinated Dibenzo-p-dioxins and
    Polychlorinated Dibenzofurans from Municipal Waste Combustors, U.S. Environmental
    Protection Agency, February 1991. Available at:
    http://www.epa.gov/ttn/emc/promgate/m-23.pdf.

2.  Test/QA Plan for Verification of Dioxin Emission Monitoring Systems (EMSs), Battelle,
    Columbus, Ohio, September 6, 2005.

3.  Quality Management Plan (QMP) for the ETV Advanced Monitoring Systems Center,
    Version 5.0, U.S. EPA Environmental Technology Verification Program, Battelle,
    Columbus, Ohio, March 2004.

4.  George C. Clark, Michael Chu, Dahman Touati, Barry Rayfield, Jon Stone, and
    Marcus Cooke, "A Novel Low-Cost Air Sampling Device (AmbStack Sampler) and
    Detection System (CALUX Bioassay) for Measuring Air Emissions of Dioxin, Furan, and
    PCB on a TEQ Basis Tested With a Model Industrial Boiler," Organohalogen Compounds,
    40 (1999).