PROMOTING IMPROVED AIR QUALITY THROUGH
ENVIRONMENTAL TECHNOLOGY VERIFICATIONS
Paper # 565656
Theodore G. Brna
and Jack R. Farmer
U. S. Environmental Protection Agency
Office of Research and Development
National Risk Management Research Laboratory
109 T. W. Alexander Drive (E305-01)
Research Triangle Park, NC 27711
Research Triangle Institute
Engineering and Technology Division
3040 Cornwallis Road
P. O. Box 12194
Research Triangle Park, NC 27709
ABSTRACT
In 1995, the U. S. Environmental Protection Agency's (EPA's) Office of Research and
Development began the Environmental Technology Verification (ETV) Program in response to
President Clinton's "Bridge to a Sustainable Future" and Vice President Gore's "National
Performance Review" to work with the private sector to establish a market-based verification
process available to all environmental technologies. In its 1995-2000 pilot period, up to 12 pilot
programs operated under the ETV Program to accelerate the commercialization of innovative or
improved technologies through independent third-party verification and reporting of their
performance. Normally, nonprofit organizations were selected competitively by EPA to manage
these programs whose verification activities followed ETV guidelines. These guidelines include:
a technology whose performance is to be verified must be commercially ready and voluntarily
offered by its vendor who agrees to cost-share testing, and testing is by an independent
organization that complies with the stakeholder-developed protocol approved by EPA.
Operational features of the ETV Program and accomplishments during its pilot period are
discussed, and lessons learned are summarized. Restructuring of the ETV Program into six
centers was begun in late 2000, and the new structure is noted briefly. Discussed in detail is the
Air Pollution Control Technology Verification Center and its operation by the Research Triangle
Institute. This includes its organization, operational features, technology areas of interest,
verifications completed, and work in progress.
INTRODUCTION

The Environmental Technology Verification (ETV) Program was begun in 1995 by the U. S. Environmental Protection Agency (EPA) as directed by the national environmental strategy in President Clinton's "Bridge to a Sustainable Future" and Vice President Gore's "National Performance Review" as a means of working with the private sector to establish a market-based verification process available to all environmental technologies.1 Under EPA's Office of Research and Development, the ETV Program verifies the performance of innovative or improved solutions to problems which threaten human health and the environment and is designed to accelerate the acceptance of verified technologies by the domestic and international marketplaces. In the ETV Program, verification means establishing the truth of a technology's performance under specific, predetermined criteria or protocols and adequate quality assurance (QA) procedures.1
The immediate goal of the ETV Program is to verify the performance characteristics of
commercially ready environmental technologies through evaluating objective and quality-assured
data and to provide the results to potential users. By the end of the 5-year pilot period in 2000, 12
ETV pilot programs were operational, each with the overall objective of accelerating the
commercialization of innovative or improved environmental technologies through third-party
verification and reporting of their performance. In late 2000, the ETV Program was reorganized
from 12 separate pilots into 6 verification centers. The Environmental Technology Evaluation
Center (EvTEC) pilot became independent and likely will continue its verification activities, some
jointly with ETV centers. The six centers are: Advanced Monitoring Systems; Air Pollution Control Technology; Greenhouse Gas Prevention Technology; Drinking Water Systems; Water Protection Technologies; and Pollution Prevention (P2), Recycling, and Waste Treatment. Although the P2, Recycling, and Waste Treatment Center was created, it did not receive any EPA funding in the current fiscal year. It will continue operating until residual funds are expended, and thereafter if additional funds are provided. Salient qualities of the ETV Program,
including some achievements and lessons learned, will be presented below. Additional
information on the ETV program may be obtained from the ETV web site:
http://www.epa.gov/etv and Reference 3.
The Air Pollution Control Technology Verification Center (APCTVC) began operation as a pilot
program in September 1997 with Research Triangle Institute (RTI) having been selected through
competition by EPA to operate this pilot. As a pilot program, it focused on particulate matter
(PM), nitrogen oxides (NOx), and hazardous air pollutant (HAP) control technologies. As a
center, it will continue work in these three technology areas but will also address emissions from
mobile sources, volatile organic compounds (VOCs), and indoor air products. Operational
features of the APCTVC, verifications completed, and work in progress will be discussed later.
ETV PROGRAM QUALITIES
The goal of the ETV Program is to verify the performance characteristics of commercially ready
technologies by independently obtaining objective and quality-assured data and to provide
credible performance information to potential technology users, such as buyers, permitters,
consultants, environmentalists, and the public.4 Thus, the Program does not use unmonitored
vendor-generated data as the sole basis for ETV, but may use these data in test planning. In
pursuing its goal, the Program's primary purpose has remained unchanged from its start in 1995:
performing technology tests to obtain credible performance data to help decision-makers in the
environmental marketplace. In this process, the Program does not rank, approve, or disapprove a
technology; its verification of a technology's performance provides quality-assured performance
data from independent, limited testing with clearly defined objectives. The Program does not
certify or guarantee a technology's performance or that the technology will always perform as
verified.
The ETV Program is voluntary, is nonregulatory (although permitters may use ETV data in their considerations), is restricted to commercially ready technologies, may share verification costs with vendors, and uses independent, third-party testing organizations. Stakeholder groups,
composed of technology vendors/manufacturers, users, permitters, consultants, and interested
others (environmentalists, insurers, venture capitalists, and professional/technical society
representatives) assist centers in seeking, selecting, and prioritizing technologies for testing and
publicizing test results. They may also aid in developing test protocols and reviewing test/quality
assurance (QA) plans and verification reports and statements. Stakeholder group membership is
balanced to promote fairness and dynamic to meet changing needs. Over 1100 stakeholders in 18
groups participated in the ETV Program by October 2001.5
QA is vital in ensuring credible data in the ETV Program, and its implementation is directed and
guided by the Program's Quality and Management Plan.6 This plan is the basic QA reference for
development of generic verification protocols (GVPs) and test/QA plans. Each center has its own
EPA-approved quality management plan (QMP) which also complies with ANSI/ASQC E4-1994,
"Specifications and Guidelines for Quality Systems for Environmental Data Collection and
Environmental Technology Programs."7
VERIFICATION PROCESS
Figure 1 shows the sequence of activities in the ETV process.
Figure 1: Environmental Technology Verification Process Flow Chart
[Flow chart: form partnerships → create stakeholder group → identify technology categories → identify vendors → develop generic test protocols → develop test/QA plans → conduct technology testing → evaluate data → write verification report]
Starting at the upper left of the figure, EPA selected its ETV partners through competitive
cooperative agreements with nonprofit entities, interagency agreements, or memoranda of
understanding with governmental agencies. EPA's partners then formed stakeholder groups,
selected and prioritized technology areas for verification, formulated quality management plans,
and identified technology vendors. Testing and reporting were performed following preparation
of GVPs and test/QA plans.
A technology is selected for verification testing by the center after input from the stakeholders.
Vendors of technologies in a selected technology area are solicited to apply to have the
performance of their technologies verified. Applications received from vendors are evaluated,
and vendors of selected technologies are invited to participate in planning for the verification test.
If the center and vendors agree to proceed with testing, the center arranges for a qualified
independent testing organization to prepare the test/QA plan for the voluntarily offered
technology for the specific test site using the GVP for guidance. After approval of the test/QA
plan by the center and EPA, the test is performed at the selected location, preferably where the
technology is being operated commercially. The testing organization evaluates the data using
data quality objectives in the test/QA plan and derives performance results. The center and EPA
also perform quality checks of the test data and results, and one or both may perform audits at the
test site. The testing organization drafts the verification report, may draft the verification
statement, and provides the report to the center. If the testing organization does not prepare the
verification statement, the center does so. The center coordinates the review and approval process
of the report and included statement. The final report with the verification statement (at the
option of the vendor), signed by an EPA official (usually a laboratory director) and a center
official, is published on the EPA and center web sites. Copies may be available from the vendor.
Outputs
During its pilot period (1995-2000), the ETV Program operated 12 pilot programs, which verified 111 technologies and produced 60 GVPs and 96 test/QA plans for verification testing. Table 1 gives information on the pilot programs and their verifications. Fifty-three additional verifications were completed in fiscal year 2001. Reference 5 lists the 164 technologies verified, while information on the centers formed in late 2000 is given in Reference 3.
Lessons Learned from the ETV Program's Pilot Period
For the ETV pilot period, the median cost and time required to complete a verification were $95,100 and 15.5 months, respectively, with the cost and time ranging from $36,900 to $283,000 and 5 to 33 months, respectively, for a single product test. Of these median amounts, the preparation of the test/QA plan cost $22,000, ranging from 0 to $167,000, while the time ranged from 0 to 24 months. To execute the test/QA plan, the median cost was $34,000 and ranged from $7,200 to $162,200, while the median time required was less than a month but ranged up to 18 months. The median cost and time for report preparation and approval were $27,500 (ranging from $6,700 to $93,000) and 7 months (ranging from 1 to 27 months). When several products were tested as a group, the median cost of the group test was higher than for a single product, but the cost per product was lower. Not considered in these data are the time and cost of EPA staff, volunteers serving as stakeholders, and those assisting in the development and review of protocols or reports. Also not considered are the future cost savings from the developed program infrastructure and from reuse of the test/QA plans and report formats produced in the pilot period.
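The cost figures above can be tallied in a short arithmetic sketch (values copied from the pilot-period data; because the stage medians come from different distributions, the component medians need not sum to the overall median):

```python
# Median pilot-period verification costs reported above (US$, single-product tests).
total_median = 95_100  # complete verification

component_medians = {
    "test/QA plan preparation": 22_000,  # range: $0 to $167,000
    "test/QA plan execution":   34_000,  # range: $7,200 to $162,200
    "report preparation":       27_500,  # range: $6,700 to $93,000
}

component_sum = sum(component_medians.values())
print(f"sum of stage medians: ${component_sum:,}")  # $83,500
print(f"overall median cost:  ${total_median:,}")   # $95,100
```

The gap between the two figures is expected: a median is not additive across stages the way a mean is.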
Feedback from participants in the ETV Program during its pilot period suggested that:

1. EPA's involvement in the ETV Program is critical to credible environmental technology verification. Participants felt that EPA's continued participation is needed to maintain its credibility and the value of performance verifications. When the Program began, EPA had expected the pilot programs to become self-sustaining at the end of their pilot periods.

Table 1. Operational ETV Pilot Programs, September 2000.

No. | Pilot Program | Start Date | Stakeholders | Verifications | EPA's Partner | Contacts: Partner (EPA)
1 | Advanced Monitoring Systems | 10/97 | Air: [illegible]; Water: 35 | 15 | Battelle Memorial Institute, Columbus, OH | Karen Riggs, 614-424-7379 (Robert Fuerst, 919-541-2221)
2 | Air Pollution Control Technology | 10/97 | 30 | 21 | Research Triangle Institute, Research Triangle Park, NC | Jack Farmer, 919-541-6909 (Ted Brna, 919-541-2683)
3 | Drinking Water Systems | 10/95 | 95 | 11 | NSF International, Ann Arbor, MI | Bruce Bartley, 734-769-5148 (Jeff A[illegible])
6 | Indoor Air Products | 10/98 | 179 | [illegible] | Research Triangle Institute, Research Triangle Park, NC | David Ensor, 919-54[illegible] (Les Sparks, 919-541-2458)
[illegible] | Pollution Prevention (P2) Innovative Coatings and Coating Equipment | 10/96 | 30 | 5 | Concurrent Technologies Corporation, Johnstown, PA | Brian Schweitzer, 814-269-2772 (Mike Kosusko, 919-541-2734)
[illegible] | P2 Metal Finishing Technologies | 6/98 | 30 | 1 | Concurrent Technologies Corporation, Largo, FL | Donn Brown, 727-549-7007 (Alva Daniels, 513-569-7693)
9 | P2, Recycling, and Waste Treatment Systems | 7/[illegible] | 22 | 3 | Cal/EPA Department of Toxic Substances Control, Sacramento, CA | Greg Williams, 916-322-1670 (Norma Lewis, 513-569-7665)
10 | Site Characterization and Monitoring Technologies | 10/94 | [illegible] | 44 | Sandia National Laboratories, Albuquerque, NM; Oak Ridge National Laboratory, Oak Ridge, TN | Wayne Einfeld, 505-845-[illegible]; Roger Jenkins, 865-576-8594 (Eric Koglin, 702-798-2432)
11 | Source Water Protection Technologies | 5/98 | 136 | 0 | NSF International, Ann Arbor, MI | Tom Stevens, 734-769-5347
2.	EPA's involvement adds value to the verifications. Vendors use EPA- and center-signed
verification statements as marketing tools containing independent, quality-assured
performance data on their technologies.
3.	Staffing by adequate and qualified EPA and partner personnel is essential to a successful
verification process. The ETV team approach with managers for media-specific technology areas functioned effectively under the director of the ETV Program. The centralized direction and coordination enabled consistent policies; timely and consistent communication; program
data collection, analysis, and reporting; and outreach, including operation of a web site to
publicize reports and program information.
4.	The verification process is more time consuming, costly, and complex than was expected.
Accurate verifications rely on adequate QA, and this needs to be made known to vendors.
Protocol and test/QA plan development can be major cost items for verifications.
5.	Quick reporting of verification test results enhances their value to vendors and users. Median
time for the public release of verification reports was 7 months after completing a test during
the pilot period. Vendors felt that rapid public release of reports aided their marketing and
recommended release of verification reports shortly after verification testing.
6. More effective marketing of the ETV Program to reach and inform technology vendors and potential users is needed. Users of environmental technologies need to recognize the value of technology verifications so that they may seek ETV data from vendors.
AIR POLLUTION CONTROL TECHNOLOGY VERIFICATION CENTER
The Center is operated by the nonprofit Research Triangle Institute (RTI), having begun in late 1997 as a pilot ETV program. Its purpose remains unchanged from its pilot period: to accelerate the use of new or improved commercially ready technologies through independent verification and reporting of their performance. As a pilot program, the Center focused on air pollution control technologies for particulate matter (PM), especially fine PM [particles with an aerodynamic diameter of 2.5 micrometers (µm) or less, PM2.5], nitrogen oxides (NOx), and hazardous air pollutants, especially volatile organic compounds (VOCs). More recently, the Center began work on controls for dust from unpaved roads and exhaust emissions from diesel engines.
Verification testing at an industrial site where the technology is operating is preferred, but testing of a product or its vital element(s) may occur in a laboratory if circumstances warrant. Usually the Center uses a contractor to prepare the test/QA plan using the GVP for guidance, conduct testing, and draft the test report. The Center and EPA approve the test/QA plan prior to testing, conduct QA audits, obtain peer reviews of the draft report, and approve the final report with its verification statement (if the vendor agrees to its inclusion) before its public release. A test report is prepared for all verification tests for public use, with or without a verification statement.
Center Verification Process
The Center uses its stakeholder advisory committee (SAC) to assist it in identifying, selecting, and prioritizing technologies for testing and to provide guidance in the overall operation of the Center. The general sequence of activities for the verification process follows that shown in Figure 1 after formation of the EPA/RTI partnership. After selecting a technology area for
verification, the Center uses a technical panel (TP), appointed by the Center's director and composed of knowledgeable personnel in the technology area of concern (most volunteering their time), to develop a generic verification protocol (GVP), which provides guidance in developing a test/QA plan for a specific technology to be tested. The Center appoints a facilitator who chairs each TP meeting (which is open to the public), presents a draft GVP with test objectives and QA provisions to the TP to initiate discussion, records input on the draft from the TP, and prepares minutes of the meeting for posting on the Center's web site. A copy of the revised draft GVP is sent to the TP members, Center, and EPA for review. Additional TP meetings are held and the GVP revised until the draft GVP meets the approval of the TP, Center, and EPA. The GVP remains a draft until verification testing is performed; it is then revised as needed based on information from the testing and finalized. The GVP in final form is posted on the ETV and Center web sites: http://www.epa.gov/etv and http://etv.rti.org/apct.8
Membership of the TP includes vendors who indicated an interest in having their technologies
tested. The Center solicits applications from vendors and holds a meeting with applicants after
the GVP has been drafted to discuss planning for testing. Technology tests are performed at a
single field site if feasible. Otherwise, a technology is tested at an industrial site where it is being
operated with the vendor arranging for testing at the site. The Center's test contractor maintains
coordination with the Center, the concerned vendor(s), and test site representative(s) during the
development of the test/QA plan, conduct of the test, and drafting of the test report.
Testing proceeds only after the test/QA plan has been approved by the Center and EPA. This
plan follows guidance in the pertinent GVP and uses input from the vendor, test site operator, and
the Center's staff and advisors. This plan gives the test objectives, procedures for attaining them
along with data quality objectives, and a schedule for the test and reporting of results. Reporting
of results includes findings of the QA audits and action taken to correct any deficiencies found.
The draft report and verification statement (at the vendor's option) after the Center's review are
submitted to the vendor by the Center for comment. The vendor's comments are considered in
revising the draft if needed before the Center sends it to EPA for review. The approved report with the verification statement (if present) signed by the director of EPA's National Risk Management Research Laboratory is returned to the Center for its director's signature. A copy of the verification report and signed (original) verification statement is provided to the vendor, EPA, and the Center. The report is also posted on the ETV (http://www.epa.gov/etv) and Center (http://etv.rti.org/apct) web sites, and the vendor may distribute copies.
As described above, the Center's verification process follows the sequence shown in Figure 1. Figure 2 shows an organization structure for verifying the performance of baghouse filtration products (BFPs).9 Similar structures apply for the ETV of other technologies, as suggested by the lower part of this figure. The Center's director provides direction to contractors performing verifications and involved in business/marketing. The director's relationship with EPA and QA/QC personnel is a cooperating and coordinating one.
Figure 2: Air Pollution Control Technology Verification Center Organization Highlighting Baghouse Filtration Products9
[Organization chart: the EPA ETV Program (APCT project officer: Theodore Brna) and EPA QA/QC (EPA quality manager: Paul Gro[illegible]) coordinate with the APCTVC (Director: Jack Farmer; Deputy Director: Douglas VanOsdell), which directs the business/marketing contractor (ETS, Inc.) and the verification testing contractors.]
The face air velocity at the POA was 0.61 m/s [2 feet per second (fps)]. The schematic of the test facility is shown in Figure 3, while the results for the range of performance of POAs for collecting solid particles are given in Figure 4 for new and existing facilities, respectively.
Figure 3: Test Facility for Paint Overspray Arrestors
[Diagram: top view of the test duct showing inlet filter bank, room air inlet, ASME nozzle, mixers upstream and downstream of the POA, optical particle counter (OPC) sampling points, a backup filter holder (used when dust-loading), flow control valve, blower, and exhaust through the outlet filter bank; end view of the aerosol sampling system; side view of the aerosol generation system.]
Anytime a manufacturer changes a product, including process changes in production of raw materials or arrestor assembly, the verification statement is no longer valid for the changed or new product. A verification test for the changed or new product will be required if a verification statement is desired. In the case of a POA, there is a reasonable probability of an unintentional product change occurring in a 12-month production cycle due to variations in assembly lines, media, and/or components. To address this product variability, it is assumed that sufficient changes could occur over a 12-month period that a new verification test is warranted. Thus, a new verification test will be required every 12 months for POAs bearing the same model number as a previously verified manufacturer's product.
Figure 4: Filtration Efficiency of Solid Particles for the Range of Performance of Paint Overspray Arrestors Tested and the National Emission Standards for Hazardous Air Pollutants for Aerospace Manufacturing and Rework Facilities
[Plots of filtration efficiency (%) versus particle diameter (µm), comparing the range of performance of the POAs tested with the NESHAP requirement, for new facilities (construction begun after 10/29/96) and for existing facilities.]
The filtration performance of 13 POAs was verified in three rounds of testing during 1999, 2000,
and 2001. Verification reports are posted on the EPA ETV and APCTVC web sites from their
effective dates until the verifications for the products expire.
BFPs or filter fabrics were tested to determine their performance in capturing fine and total PM
and the pressure drop in the 6-hour test period. This followed consideration by the Center's SAC
which recommended a test program. ETS, Inc. was contracted to lead the development of the
GVP and was assisted by the appointed TP. ETS located a German test method which, with some
modifications, had the desired elements. Since a vendor had a test facility similar to that desired,
ETS used this design in building its test facility. ETS also developed the test/QA plan which was
approved. Figure 5 is a diagram of the test facility used for the verification testing of BFPs.
Its current operation is for pulse-jet-cleaned filter material, but ETS is adapting it to test fabrics for reverse-air-cleaned baghouses.
Figure 5: Diagram of Test Apparatus for Baghouse Filtration Products9
[Diagram: dust is fed from a dust container into a vertical channel; the filter fixture and test filter sit in a horizontal extraction tube; a photometer or Andersen impactor, an optical particle counter, and an absolute filter or Andersen impactor sample downstream; hot-wire anemometers meter the clean-air and dirty-air pump flows; a cleaning system pulses the filter.]
The filter material to be tested has an exposed diameter of 0.14 m (5.5 in.) in the inlet of the
horizontal 0.15 m (6-in.) cylindrical extraction tube. Electrically neutralized dust particles are
carried by air in the vertical channel to the inlet of the extraction tube where air at a filtration
velocity (gas-to-cloth ratio) of 0.050 m/s (0.16 fps) drawn through the tube carries some of the
dust to and through the filter. Following conditioning and recovery periods, each filter is tested
continuously for 6 hours. The PM2.5 and total mass concentrations passing through the filter and
the pressure drop across the filter are measured as the performance parameters.
The results for 13 fabrics tested in the first two rounds of tests are shown in Figure 6. The
membrane fabric samples tested generally gave lower PM2.5 (first row) and total PM mass
(middle row) concentrations downstream of the filter and lower pressure drop (third row) across it
than did the nonmembrane filters. Considering these results and input from the nonmembrane BFP vendors, the Center requested ETS to perform testing at a filtration velocity and fabric
conditioning more representative of operating conditions for these fabrics. As a result of the ETS
tests, the GVP and test/QA plan were changed to reduce the filtration velocity to 0.033 m/s (0.11
fps), but the conditioning and recovery cycles for the 6-hour test remained unchanged. No test
data are yet available for BFPs to permit assessing the effect of the noted change.
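The two gas-to-cloth ratios quoted above can be cross-checked with a simple unit conversion (a sketch; the only assumption beyond the text is the meter-to-foot conversion factor):

```python
FT_PER_M = 3.28084  # feet per meter

def mps_to_fps(v_mps: float) -> float:
    """Convert a filtration velocity from m/s to ft/s."""
    return v_mps * FT_PER_M

# Gas-to-cloth ratios from the BFP testing described above.
print(f"original GVP: {mps_to_fps(0.050):.2f} fps")  # 0.16 fps
print(f"revised GVP:  {mps_to_fps(0.033):.2f} fps")  # 0.11 fps
```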
Figure 6: Results for Verification Tests of 13 Baghouse Filtration Products
[Bar charts of downstream PM2.5 concentration (0.1 mg/dscm), downstream total mass concentration (0.1 mg/dscm), and pressure drop (cm H2O) for products A-G (membrane fabrics) and H-M (nonmembrane fabrics).]
The GVP developed for BFP testing has been accepted by a committee of the American Society for Testing and Materials (ASTM) as the basis for a national standard. The GVP is also being considered by an international group for use in their countries.
The Xonon™ (a catalytic combustion system) was tested as a NOx control technology on a 1.5-
MW natural-gas-fired turbine operating at an electrical generating station in Santa Clara, CA.
Figure 7 compares the temperature profile of a conventional turbine combustor and Xonon™,
each having the same inlet combustor and outlet turbine temperatures, while Figure 8 shows the
temperatures for Xonon™ relative to other components of the gas turbine as well as the expected
exhaust NOx concentration. The lower combustion temperature of Xonon™ limits NOx
formation to very low values. Using the GVP developed earlier for NOx emission control and a
test/QA plan developed by RTI's contractor, Midwest Research Institute (MRI), a short-term
verification test of this technology at the Gianera power station was completed in July 2000.13
Results showed a mean NOx emission concentration of 1.13 parts per million by volume (ppmv)
of dry gas. This value was lower than the lowest best available control technology (BACT) and
lowest achievable emission rate (LAER) also shown for similar applications in Figure 9.
Information on the low NOx concentration measurement is given in Reference 14.
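The margin between the verified emission level and the regulatory benchmarks cited above can be expressed in a brief sketch (all values taken from the test results and Figure 9):

```python
mean_nox_ppmvd = 1.13              # verified mean NOx concentration (dry basis)
bact_low, bact_high = 9.0, 25.0    # Best Available Control Technology, ppmv
laer_low, laer_high = 3.0, 15.0    # Lowest Achievable Emission Rate, ppmv

# The verified level sits below even the most stringent benchmark value.
print(mean_nox_ppmvd < laer_low)   # True
print(f"margin vs. lowest LAER: {laer_low - mean_nox_ppmvd:.2f} ppmv")  # 1.87 ppmv
```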
Technical and performance audits executed by RTI and EPA for the verification test of Xonon™ were acceptable. The data quality objectives of the test met the requirements of the test/QA plan, and the audits of data quality were satisfactory.

Figure 7: Temperature Profiles of a Conventional Gas Turbine Combustor and Xonon™ (Courtesy of Catalytica Energy Systems, Inc.)
[Diagram: a conventional combustor's high flame temperature produces NOx, while the Xonon™ combustor reaches the same turbine inlet temperature at a lower combustion temperature.]

Figure 8: Operational Features of Xonon™ (Courtesy of Catalytica Energy Systems, Inc.)
[Diagram: preburner and main fuel feed the Xonon™ module at 1260 °C (2300 °F) ahead of the drive turbine, with 350 °C compressor discharge air and an expected exhaust NOx concentration of about 3 ppmv.]
Figure 9: Exhaust NOx Concentration from Gas Turbine in Xonon™ Tests13
[Plot of the range of measured NOx concentration (ppmvd, roughly 1.0-1.2) over Day 1 and Day 2, with a mean of 1.13 ppmvd; for comparison, Best Available Control Technology (BACT) = 9-25 ppmv and Lowest Achievable Emission Rate (LAER) = 3-15 ppmv.]
An emulsified fuel oil was tested for NOx control and other performance characteristics using an
EPA boiler in Research Triangle Park, NC, under direction of an Air Pollution Prevention and
Control Division staff member. The NOx control portion of the testing was deemed to be
verification testing as it met ETV guidelines and received partial ETV funding.
The selected performance parameters were attained in all of the verification tests performed. Technology users may require other performance factors to meet their needs, including
guarantees for energy use, operating and maintenance requirements, and safe operation.
Verifications Planned
Currently, the Center is developing GVPs for four technologies, in addition to one recently completed for retrofitting PM and other emission controls to diesel engines (a mobile source application). The GVPs under development concern dust suppression and soil stabilization for unpaved roads, the use of biofilters to control VOCs, and two more mobile source applications: selective catalytic reduction for NOx control, and the emissions related to the use of fuels/additives and lubricants/additives in diesel engines. The TPs expect to complete these GVPs early this summer.
The test/QA plan for a retrofitted emission control device for diesel engines is being prepared by
the Southwest Research Institute under contract to RTI. Testing is scheduled to begin in the
Summer of 2002. Since the mobile source testing is being coordinated with EPA's Office of
Transportation and Air Quality (OTAQ) which assisted in developing the pertinent GVP, the
completed verification is expected to earn credits in OTAQ's Voluntary Retrofit Program.
The GVP for dust suppression and soil stabilizers is being developed as a joint effort by the
APCTVC and the Environmental Technology Verification Center (EvTEC), one of the 12 pilot
ETV programs which became independent of the ETV Program after EvTEC's pilot period ended. Information for validating the dust suppression part of the GVP was obtained by MRI, RTI's testing organization, in late 2001 and early 2002 using seven suppressants applied by vendors to a test section of unpaved road at Fort Leonard Wood, MO. During this preliminary testing, a mobile dust sampler was towed by a vehicle on the test road and its collection of dust was compared with the profiling method used by MRI. Satisfactory correlation of the results from the device with those of the profiling method resulted in acceptance of the device for 1-year field tests at Maricopa County, AZ, and Fort Leonard Wood. The vehicle-drawn dust sampler is
much cheaper to operate and permits quicker data collection than the profiling method. These
field tests began in early June 2002, with the application of five dust suppressants by three
vendors at Fort Leonard Wood. Quarterly (seasonal) data collection will begin this fall and result
in verification reports in the Fall of 2003.
The GVP for biofiltration control of VOCs was begun in late 2001 and is projected for
completion in the summer of 2002. Microorganisms will be used to decompose organic
compounds into carbon dioxide and water. The GVP for closed biofiltration systems will
measure the removal efficiency and identify the associated resources needed to operate this
technology. Testing is expected to begin this summer.
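The removal efficiency a biofilter verification would report is the fraction of inlet VOC destroyed across the bed, computed from inlet and outlet concentrations. A minimal sketch follows; the concentrations used are illustrative, not verified test data.

```python
# Minimal sketch of the removal-efficiency figure a biofilter
# verification would report. Concentrations are illustrative values,
# not verified test data.

def removal_efficiency(c_in, c_out):
    """Removal efficiency in percent from inlet and outlet VOC
    concentrations (same units, e.g., ppmv)."""
    if c_in <= 0:
        raise ValueError("inlet concentration must be positive")
    return 100.0 * (c_in - c_out) / c_in

print(removal_efficiency(100.0, 8.0))  # -> 92.0
```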
CONCLUSIONS
In the 1995-2000 pilot period of the ETV Program, 111 technology verifications by independent
testing organizations were made using the 60 GVPs and 96 test/QA plans developed. Over 1100
stakeholders in 18 groups worked with the 12 pilot programs in the verification process (see
Figure 1). Feedback from vendors whose technologies were verified indicates that the Program
met its goal of providing credible performance information on environmental technologies to
accelerate their acceptance in the marketplace. Although the Program was intended to end after
the pilot programs became self-sustaining, vendors indicated that EPA's continued participation
is critical to the credibility of environmental verifications, adds significant value to them, and
that adequate staffing by qualified personnel is essential to the Program's success. The
verification process proved more expensive, time-consuming, and complex than expected, with
protocol development (especially in the absence of test methods) being a major
cost item. Faster reporting of test results and more effective outreach to attract more participants
in technology verifications were recommended to improve the Program.
The Air Pollution Control Technology Verification Center verified 21 technologies, developed 3
GVPs, and prepared 3 test/QA plans in the 3 years (1997-2000) it operated during the ETV
Program's pilot period. It has verified 29 technologies to date: 27 for the control of PM in
laboratory testing and the others for controlling NOx. The laboratory testing of paint overspray
arrestors used commercial and full-size filters. Results of baghouse filtration product testing
showed good PM capture and generally better performance by the membrane filters. A catalytic
combustion system on a 1500-kW gas turbine over a 2-day test during the Summer of 2000
yielded a mean NOx emission of 1.13 ppmv. This value is lower than both the best available
control technology (BACT) and lowest achievable emission rate (LAER) values reported.
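The reported 1.13 ppmv figure is the mean of periodic analyzer readings over the test, which would then be compared with the applicable permitting benchmarks. The sketch below illustrates that post-processing step; the individual readings and the BACT/LAER limits are placeholders, not the values actually measured or reported.

```python
# Illustrative post-processing of a NOx verification test: average the
# periodic analyzer readings and compare the mean with permitting
# benchmarks. Readings and limits are placeholders, not reported data.

readings_ppmv = [1.05, 1.20, 1.10, 1.18, 1.12]  # hypothetical readings

mean_nox = sum(readings_ppmv) / len(readings_ppmv)

bact_limit = 3.5   # ppmv -- hypothetical BACT benchmark
laer_limit = 2.0   # ppmv -- hypothetical LAER benchmark

below_bact = mean_nox < bact_limit
below_laer = mean_nox < laer_limit
print(f"mean = {mean_nox:.2f} ppmv; "
      f"below BACT: {below_bact}; below LAER: {below_laer}")
```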
Planning for testing other technologies is progressing rapidly, with testing in at least three new
areas expected to begin by this summer. These technologies involve retrofitted diesel engine
emission controls, dust suppressants and soil stabilizers for unpaved roads, and biofiltration to
reduce emissions of VOCs.
REFERENCES
1.	U. S. Environmental Protection Agency. Environmental Technology Verification
Program: Verification Strategy, Office of Research and Development, Washington, DC,
EPA/600/K-96/003 (NTIS PB97-160006), February 1997.
2.	U. S. Environmental Protection Agency. Environmental Technology Verification
Program, ETV web site: http://www.epa.gov/etv, Office of Research and Development.
3.	Brna, T. G., "Environmental Technology Verification: Credible Performance Data for
Technology Users," In Proceedings: 2002 International Conference on Incineration and
Thermal Treatment Technologies, New Orleans, LA, May 2002 (in press).
4.	U. S. Environmental Protection Agency. Environmental Technology Verification Program,
Office of Research and Development, Washington, DC, EPA/600/F-98/015, September
1998.
5.	U. S. Environmental Protection Agency. Environmental Technology Verification Program,
Quarterly Report, October 2001. Available at web site:
http://www.epa.gov/etv/dload/qr_oct01.pdf.
6.	U. S. Environmental Protection Agency. Environmental Technology Verification Program:
Quality and Management Plan for the Pilot Period (1995-2000), Office of Research and
Development, Cincinnati, OH, EPA/600/R-98/064, May 1998.
7.	American National Standards Institute/American Society for Quality Control
(ANSI/ASQC), American National Standard E4-1994, "Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental Technology
Programs," American Society for Quality Control, Milwaukee, WI, 1994.
8.	Research Triangle Institute. Air Pollution Control Technology Verification Center,
APCTVC web site: http://etv.rti.org/apct.
9.	ETS, Inc. and Research Triangle Institute. Generic Verification Protocol for Baghouse
Filtration Products, ETS, Inc., Roanoke, VA, and Research Triangle Institute, Research
Triangle Park, NC, October 2001. Available at web site:
http://etv.rti.org/apct/pdf/GVP_Revised.pdf.
10.	Federal Register. 40 Code of Federal Regulations, Part 63, March 27, 1998, Government
Printing Office, Washington, DC.
11.	Hanley, J. T., M. K. Owen, J. R. Farmer, and T. G. Brna, "Environmental Technology
Verification (ETV) of Paint Overspray Arrestors," Filtration and Separations Technologies
for 2000, 13th Annual Technical Conference and Exposition, American Filtration Society,
Myrtle Beach, SC, March 2000.
12.	Baghouse Filtration Products Reports, 2000 and 2001. Available at web site:
http://etv.rti.org/apct/documents.cfm.
13.	Catalytica Energy Systems, Inc. Xonon™ Cool Combustion System, 2001. Available at
web site: http://etv.rti.org/apct/documents.cfm.
14.	Clapsaddle, C. A., A. H. Trenholm, and A. M. Marshall, "Low-Concentration NOx Emission
Measurement," Abstract No. 248, Air and Waste Management Association Annual Meeting
and Exhibition, Orlando, FL, June 2001.
TECHNICAL REPORT DATA (EPA Form 2220-1)
Report No.: EPA/600/A-02/088
Title and Subtitle: Promoting Improved Air Quality Through Environmental Technology
Verifications
Authors: Theodore G. Brna (EPA) and Jack R. Farmer (RTI)
Performing Organization: Research Triangle Institute, P.O. Box 12194, Research Triangle
Park, North Carolina 27709
Sponsoring Agency: EPA, Office of Research and Development, Air Pollution Prevention and
Control Division, Research Triangle Park, NC 27711
Type of Report and Period Covered: Published paper; 1995-2002
Supplementary Notes: APPCD project officer is Theodore G. Brna, E305-01, 919/541-2683.
For presentation at AWMA Annual Meeting, Baltimore, MD, 6/23-27/02.
Key Words: Pollution, Waste Treatment, Verifying, Performance, Monitors, Potable Water,
Greenhouse Effect, Metal Finishing, Pollution Control, Technologies, Commercialization,
Indoor Air, Pollution Prevention (COSATI Field/Group: 13B, 14B, 14G, 08H, 04A, 13H)
Distribution Statement: Release to Public; Unclassified