IT3'02 Conference, May 13-17, 2002, New Orleans, Louisiana
ENVIRONMENTAL TECHNOLOGY VERIFICATION:
CREDIBLE PERFORMANCE DATA FOR TECHNOLOGY USERS
Theodore G. Brna
U. S. Environmental Protection Agency
Office of Research and Development
National Risk Management Research Laboratory
Air Pollution Prevention and Control Division
Research Triangle Park, North Carolina 27711
ABSTRACT
The Environmental Technology Verification (ETV) Program was begun by the U. S.
Environmental Protection Agency in 1995 to accelerate the commercialization of innovative or
improved environmental technologies through third-party verification of their performance.
During its pilot period, 1995-2000, as many as 12 pilot programs operated within a market-based
verification process built on nonprofit private/public partnerships. The ETV Program is a
voluntary (nonregulatory) program which is restricted to commercially ready technologies and
uses credible independent testing organizations to execute peer-reviewed test/quality assurance
plans. Accomplishments of the ETV Program are summarized and lessons learned during the
pilot period are discussed. Restructuring of the pilot programs into six centers began in late 2000
and is noted briefly. Discussed in detail is the Air Pollution Control Technology Verification
Center, including its organization, operational features, technology areas of interest, and
verifications performed.
INTRODUCTION
The Environmental Technology Verification (ETV) Program was begun in 1995 by the U. S.
Environmental Protection Agency (EPA) as directed by the national environmental strategy in
President Clinton's "Bridge to a Sustainable Future" and Vice President Gore's "National
Performance Review" to work with the private sector to establish a market-based verification
process available to all environmental technologies [1]a. Operated under the EPA's Office of
Research and Development (ORD), the ETV Program verifies the performance of innovative or
improved solutions to problems threatening human health and the environment. The verification
process aims to substantially accelerate the acceptance of the verified technologies into domestic
and international marketplaces.

a Numbers in brackets refer to items in REFERENCES.
Several factors influenced ORD to conduct a program to verify the performance of innovative or
improved commercially ready environmental technologies offered by the private sector. These
included: (1) acceptance of technologies is enhanced by independent, objective, and credible
performance information; (2) the private sector produces nearly all the new or improved
technologies purchased; and (3) independent evaluations of commercially ready technologies
"Numbers in brackets refer to items in REFERENCES.
1
-------
IT3'02 Conference, May 13-17,2002, New Orleans, Louisiana
that are innovative, cheaper, and more effective than current technologies foster market
penetration in a conservative, risk-avoiding marketplace. By the end of the pilot period in
October 2000, 12 ETV pilot programs were operational, each with the overall objective of
accelerating the commercialization of innovative or improved environmental technologies
through third-party verification and reporting of their performance. Restructuring of the ETV
Program in late 2000 resulted in six verification centers focusing on advanced monitoring; air
pollution control; greenhouse gas; drinking water; water protection; and pollution prevention
(P2), recycling, and waste treatment technologies. In fiscal year (FY) 2002, reduced funding for
the ETV Program resulted in no additional funding for the P2, recycling, and waste treatment
center; however, the center will continue to operate until residual funds are expended while
seeking additional funding.
ETV PROGRAM FEATURES
The goal of the ETV Program is to verify the environmental performance characteristics of
commercially ready technologies through evaluating objective and quality-assured data and to
provide results to potential technology users—buyers, permitters, consultants, and others—as
independent, credible information [2]. Consequently, the Program does not accept unmonitored
vendor-generated data as the basis for ETV, although it may use such data in planning for
testing. The Program's goal has remained unchanged from its inception in 1995, although the
Program has matured from its pilot phase. In pursuing this goal, the ETV Program's primary
performance evaluation objective is to obtain and provide essential data to the environmental
marketplace for its consideration in decision-making. The Program does not rank, approve, or
disapprove technologies; it provides quality-assured performance information gained by
independent, limited testing with clearly defined objectives.
As used in the ETV Program, verification means establishing or proving the truth of a
technology's performance under specific, predetermined criteria or protocols and adequate
quality assurance (QA) procedures [2]. Thus the ETV Program does not certify or guarantee that
a technology will always perform as verified. This follows because the technology may have
undergone changes due to adjustments, aging, or operation under conditions differing from those
prevailing in its verification test.
The ETV Program is voluntary, is restricted to commercially ready technologies, may share
verification costs with technology vendors, and uses credible independent testing organizations.
The centers rely on stakeholders to recommend technologies for testing as well as their
prioritization and to publicize the results of verification testing. Stakeholder groups are balanced
to include the interests of technology vendors/manufacturers, users, permitters, consultants, and
others such as environmentalists, insurers, and venture capitalists. Stakeholders may also assist
in the development and review of generic verification protocols (GVPs), test/QA plans, and
verification test reports. Over 1100 stakeholders in 18 groups have taken part in the ETV
Program, which had produced 164 verification reports and statements by October 2001. These
verified technologies are listed in the ETV Program Quarterly Report, October 2001 [3].
QA plays a vital role in assuring credible data in the ETV Program, and its implementation is
governed by the Program's Quality and Management Plan [4]. Each center has its own quality
management plan which includes essential features of the Program's plan and serves as a basic
reference in the development of GVPs and test/QA plans. Each center's plan complies with
ANSI/ASQC E4-1994, "Specifications and Guidelines for Quality Systems for Environmental
Data Collection and Environmental Technology Programs," [5] and is comparable to the
International Organization for Standardization (ISO) 9000 series of standards.
Test reports normally contain verification statements signed by an EPA official (often a
laboratory director) and a verification center official. These reports and generic verification
protocols are placed on internet web sites operated by the ETV Program (www.epa.gov/etv) and
the centers. Copies of the verification test reports may also be available from vendors whose
technologies have been verified.
Table I gives information on the 12 pilot programs operating during the October 1, 1995-
September 30, 2000 pilot period [2]. Note that two of the pilot programs were operational before
the start of the ETV pilot period, having begun under another EPA program before their
inclusion in the ETV Program. Brief information on these pilot programs as well as the ETV
Program is also given in Reference 2.
As noted earlier, restructuring of the ETV Program in late 2000 led to the establishment of six
centers. Three of the centers combined pilot programs (see Table II for the combined pilots) from
the Program's pilot period, while two pilot programs were ended. Trade organizations assumed
the responsibilities for testing using protocols developed in cooperation with the Indoor Air
Products pilot. Since the Environmental Technology Evaluation Center (EvTEC) was not
limited to media or focus area, it was expected to operate independently and self-sufficiently
after the ETV Program's pilot period ended.
Table II lists the centers with the former pilot programs they include and the number of outputs
(protocols and verifications completed) during the pilot period, as well as verifications during
FY 2001 (FY01). While not shown in Table II, the number of verifications during the
pilot period increased each fiscal year (from October 1 through September 30). The initial two
verifications occurred in FY97, and the maximum number of verifications reached 58 in FY00
before decreasing to 53 in FY01. Thus, the number of verifications in the pilot period totaled
111 [6], and the total for the ETV Program reached 164 by the end of FY01 [3].
ETV PROGRAM LESSONS LEARNED
During the pilot period, the 12 pilot programs spent $31.4 million on their start-up, management,
outreach, and verification activities. Most of this funding was from EPA (87%), while
EPA's partners were the next largest funding source (6%). Vendors/manufacturers of
technologies verified provided 4% of the funding, while others (such as contractors to EPA's
partners and groups interested in the performance of environmental technologies) contributed
3%. Considering that 111 technology verifications occurred in the pilot period, the average cost
of a technology verification was $282,000. This sum does not consider contributions of time by
EPA staff and volunteers serving as stakeholders and technology panelists in developing
protocols and reviewing reports. Nor does it credit, as assets to the Program's future, the
Table I. Operational ETV Pilot Programs, September 2000

No. | Pilot Program | Start Date | No. of Stakeholders | EPA's Partner | Contacts: Partner (EPA)
1 | Advanced Monitoring Systems | 10/97 | Air: 44; Water: 35 | Battelle Memorial Institute, Columbus, OH | Karen Riggs, 614-424-7379 (Robert Fuerst, 919-541-2221)
2 | Air Pollution Control Technology | 10/97 | 30 | Research Triangle Institute, Research Triangle Park, NC | Jack Farmer, 919-541-6909 (Ted Brna, 919-541-2683)
3 | Drinking Water Systems | 10/95 | 95 | NSF International, Ann Arbor, MI | Bruce Bartley, 734-769-5148 (Jeff Adams, 513-569-7835)
4 | Environmental Technology Evaluation Center | 10/96 | 176 | Civil Engineering Research Foundation, Washington, DC | William Kirksey, 202-842-0555 (Norma Lewis, 513-569-7665)
5 | Greenhouse Gas Technology | 12/97 | 118 | Southern Research Institute, Research Triangle Park, NC | Stephen Piccot, 919-806-3456 (Dave Kirchgessner, 919-541-4021)
6 | Indoor Air Products | 10/95 | 179 | Research Triangle Institute, Research Triangle Park, NC | David Ensor, 919-542-6735 (Les Sparks, 919-541-2458)
7 | Pollution Prevention (P2) Innovative Coatings and Coating Equipment | 10/96 | 30 | Concurrent Technologies Corporation, Johnstown, PA | Brian Schweitzer, 814-269-2772 (Mike Kosusko, 919-541-2734)
8 | P2 Metal Finishing Technologies | 6/98 | 30 | Concurrent Technologies Corporation, Largo, FL | Donn Brown, 727-549-7007 (Alva Daniels, 513-569-7693)
9 | P2, Recycling, and Waste Treatment | 7/95 | 22 | Cal/EPA Department of Toxic Substances Control, Sacramento, CA | Greg Williams, 916-322-3670 (Norma Lewis, 513-569-7665)
10 | Site Characterization and Monitoring Technologies | 10/94 | 16 | Sandia National Laboratories, Albuquerque, NM, and Oak Ridge National Laboratory, Oak Ridge, TN | Wayne Einfeld, 505-845-8314, and Roger Jenkins, 865-676-8594 (Eric Koglin, 702-798-2432)
11 | Source Water Protection Technologies | 5/98 | 136 | NSF International, Ann Arbor, MI | Tom Stevens, 734-769-5347 (Ray Frederick, 732-321-6627)
12 | Wet Weather Flow Technologies | 6/98 | 52 | NSF International, Ann Arbor, MI | John Schenk, 734-769-5786 (Mary Stinson, 732-321-6683)
Table II. Environmental Technology Verification Program Outputs, September 2001 [3, 6]
(See Table I for contacts.)

Center / Former Pilot Program | Generic Verification Protocols | Pilot Period Verifications (1995-2000) | Verifications FY 2001 | Comments
Advanced Monitoring Systems / 1. Advanced Monitoring | 2 | 15 | 21 | Pilot programs 1 and 10 of Table I combined in 2001 as a center
Advanced Monitoring Systems / 10. Site Characterization and Monitoring Technologies | 1 | 44 | 5 | (see above)
Air Pollution Control Technology / 2. Air Pollution Control Technology | 4 | 21 | 6 | Pilot program 2 changed to a center in 2001
Drinking Water Systems / 3. Drinking Water Systems | 33 | 11 | 12 | Pilot program 3 changed to a center in 2001
Greenhouse Gas Technology / 5. Greenhouse Gas Technology | 2 | 8 | 5 | Pilot program 5 changed to a center in 2001
Pollution Prevention (P2), Recycling, and Waste Treatment Technologies / 7. P2 Innovative Coatings and Coating Equipment | 4 | 5 | 0 | Pilot programs 7, 8, and 9 combined in 2001 as a center
Pollution Prevention (P2), Recycling, and Waste Treatment Technologies / 8. P2 Metal Finishing Technologies | 0 | 1 | 1 | (see above)
Pollution Prevention (P2), Recycling, and Waste Treatment Technologies / 9. P2, Recycling, and Waste Treatment Systems | 0 | 3 | 3 | (see above)
Water Protection Technologies / 11. Source Water Protection Technologies | 4 | 0 | 0 | Pilot programs 11 and 12 combined in 2001 as a center
Water Protection Technologies / 12. Wet Weather Flow Technologies | 7 | 0 | 0 | (see above)
Environmental Technology Evaluation Center / 4. Environmental Technology Evaluation Center | 1 | 1 | 1 | Pilot program ended after the pilot period, but the center now operates independently
6. Indoor Air Products | 2 | 2 | Not applicable | Pilot program ended by 2000
infrastructure, protocols, and test/QA plans developed during the pilot period.
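As a quick arithmetic check of the figures above, the following sketch (with the dollar total, funding shares, and verification count taken directly from the text) reproduces the funding breakdown and the average cost per verification:

```python
# Back-of-the-envelope check of the pilot-period funding figures quoted above.
total_spent = 31.4e6   # dollars spent by the 12 pilot programs, 1995-2000
shares = {"EPA": 0.87, "EPA's partners": 0.06,
          "Vendors/manufacturers": 0.04, "Others": 0.03}
verifications = 111    # technologies verified during the pilot period

for source, fraction in shares.items():
    print(f"{source}: ${fraction * total_spent / 1e6:.1f} million")

# Average cost per verification; prints ~$282,883, i.e., the ~$282,000 above.
print(f"Average cost: ${total_spent / verifications:,.0f}")
```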
Some programmatic lessons learned from feedback from participants in the Program include:
1. EPA's involvement in the ETV Program is critical to credible environmental verification.
While it was EPA's intent at the outset that the Program would become self-sustaining,
funded mainly by income from verification testing, participants believe that EPA's continued
involvement is needed to maintain the Program's credibility and the value of performance
information obtained in verification testing.
2. EPA's participation in the ETV program adds significant value to the verifications.
Verification statements approved by EPA are useful marketing tools for vendors and provide
independent, quality-assured performance data to permitters and users of the technology.
3. Adequate and qualified EPA and partner staffing and management are essential to the success
of this verification program. The ETV team approach with managers for media-specific
entities in each pilot program functioned well under a central director. Central coordination
facilitated the development and use of consistent policies; timely and consistent
communication; procurement and allocation of funding to the pilot programs; program data
collection, compilation, and reporting; and outreach, such as operation of a web site to
publicize protocols and verifications as well as program-related activities.
4. The verification process is more time consuming, costly, and difficult than was expected.
Realistic estimates of the time and cost of verification should be conveyed to participants in
the verification process. Protocol and test/QA plan development can be major cost elements
of verifications.
5. Rapid reporting enhances the value of verifications to vendors and users. Vendors whose
technologies were verified complained that the time between test completion and the
availability of a published verification report was too long. Median time for public release of
verification reports was 7 months, about half of which was used in the multi-tiered EPA
review and related report revision process. Vendors felt that rapid release of verification
reports after testing aided their marketing.
6. More effective marketing of the ETV Program is needed to reach technology vendors and
gain their participation in the ETV Program. Users need to be informed of the availability
and value of technology verifications so that they can request ETV verification reports from
vendors whose technologies are being considered for permit approval or purchase.
AIR POLLUTION CONTROL TECHNOLOGY VERIFICATION CENTER (APCTVC)
Features of this center are discussed in detail in Reference 7, while an organization chart is
shown in Figure 1 [8]. The operational features are briefly described below and are unchanged
from the pilot period. These features are also similar to those of the other pilots and centers.

[Figure 1. Air Pollution Control Technology Verification Center (APCTVC) Organization
Highlighting Baghouse Filtration Products (BFP) Air Pollution Control Technology [8]. The
organization chart shows the EPA ETV Program (APCT Project Manager: Theodore Brna); the
APCTVC (Director: Jack Farmer; Deputy Director: Douglas VanOsdell), with business/marketing
support from ETS, Inc.; quality management (EPA Quality Manager: Paul Groff; APCTVC
Quality Manager: Robert Wright; APCTVC BFP Quality Manager: Gene Tatsch); the Stakeholders
Advisory Committee; and the specific technology verifications (add-on NOx controls, paint
overspray arrestors, and baghouse filtration; APCTVC Task Leader: Jim Turner; BFP Task
Leader: John Mycock), supported by a technical panel, a BFP quality assurance officer, and
technology area technical staff.]
The major focus areas of the APCTVC are fine particulate matter (PM), nitrogen oxides (NOx),
volatile organic compounds (VOCs), and hazardous air pollutants, all being significant air
pollutants.
Verification Process
The Center uses its stakeholders advisory committee (SAC) to assist in selecting and prioritizing
technologies for testing and to provide guidance in the overall operation of the Center. After
recommendation by the SAC, the Center's director establishes technical panels (TPs) for
technology areas where there are interested verification participants, such as technology vendors
and users. Each TP consists of persons having specific expertise in the technology of concern
and serves in the development of a generic verification protocol (GVP). Each TP is chaired by a
Research Triangle Institute (RTI) staff member appointed by the Center's director, and the chair
drafts the GVP for review and approval first by the TP, then by the Center's director, and finally
by EPA. Both the Center's director and EPA's manager for the APCTVC are listed as contacts
under number 2 of Table I. The GVP as approved by EPA remains a draft until the initial
verification testing occurs so that changes may be made if needed.
Since the TP's membership includes vendors, the next step following the draft GVP's approval
is to determine which vendors want their technologies tested and will share testing costs. If a
group of vendors has similar technologies which can be tested at the same location, the location
is decided and development of the test/QA plan is begun by a qualified testing organization
selected by the APCTVC. This plan, which considers vendor inputs, is subject to approval by
the participating vendors, the APCTVC, and EPA. If a single technology's performance is to be
verified at an industrial site, the vendor arranges with the site's owner for this commercially
ready technology to be tested using the APCTVC's approved testing organization. Coordination
between the testing organization, site owner, and technology's vendor is maintained as needed
throughout the test project, as well as between the testing organization and the APCTVC, which
keeps EPA informed.
The testing of technology occurs only after the test/QA plan complying with the relevant GVP
has been approved by the parties noted above. QA checks are made during testing in compliance
with the ETV Program's QA requirements by the APCTVC or EPA or both. The reporting of
test results in verification reports documents the QA checks made, any deficiencies noted, and
corrective actions taken. It also indicates if the data quality objectives for the verification testing
were achieved. The APCTVC solicits comments from the technology's vendor on the draft
verification report and statement (included in the report at the option of the vendor) before these
documents are prepared for review by EPA. If the technology does not
meet its vendor's performance objective(s) during verification testing, a test report is published
on the ETV and APCTVC's web sites (without a verification statement if the technology's
vendor desires) after the report completes the EPA review process. If a verification statement is
included in the report, the director of the National Risk Management Research Laboratory signs
three original verification statements and returns the verification reports for the signature of the
APCTVC's director. The APCTVC distributes reports with signed verification statements to the
vendor and EPA center manager, retains a report for its files, and posts the report on the
appropriate web sites.
The APCTVC has completed verifications in two technology areas: fine PM (i.e., particles with
aerodynamic diameters of 2.5 micrometers or less, designated PM2.5) and NOx emissions
control. Prior to protocol development, the SAC considered the importance of the air pollution
problem being addressed, the availability of commercially ready technologies, the availability of
test methodologies, the willingness of vendors/manufacturers to share testing costs, the scale of
the technology to be tested and the testing time needed for meaningful results, and the
technology's potential market.
Technology Areas
The first round of PM tests during the ETV pilot period involved one filter model from each of
seven vendors of paint overspray arrestors, and testing was performed in accordance with the
prepared test/QA plan, which complied with what is now the APCTVC's "Generic Verification
Protocol for Paint Overspray Arrestors." This protocol contains all the requirements of EPA
Method 319: "Determination of Filtration Requirements for Hazardous Air Pollutants (NESHAP)
for Aerospace Manufacturing and Rework Facilities," which requires a face velocity of 0.61 m/s
(120 fpm) at the filter [9]. Tests of full-size filters [0.61 x 0.61 x 0.38 m (24 x 24 x 15 in.)]
occurred in RTI's laboratory test facility [10]. Six of the filters exceeded the filtration efficiency
requirements of Method 319 for both new and existing facilities; the seventh filter model was
withdrawn from testing after the vendor found that an incorrect model had been submitted. Thus
far, 13 arrestors bearing different model numbers have been verified in three rounds of testing as
having filtration efficiencies meeting the requirements of Method 319 for both existing and new
sources.
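At its core, the reported filtration efficiency is a size-resolved comparison of particle counts measured upstream and downstream of the arrestor. The sketch below illustrates only that calculation; the channel boundaries and counts are hypothetical and are not data from the ETV tests:

```python
# Size-resolved filtration efficiency from upstream/downstream optical
# particle counter (OPC) counts. All numbers are hypothetical illustrations.
channels_um = [(0.3, 0.5), (0.5, 1.0), (1.0, 2.0), (2.0, 5.0), (5.0, 10.0)]
upstream = [120500, 80400, 35200, 9100, 1200]   # counts per size channel
downstream = [48200, 16080, 3520, 180, 6]       # counts per size channel

for (lo, hi), up, down in zip(channels_um, upstream, downstream):
    efficiency = (1.0 - down / up) * 100.0      # percent of particles captured
    print(f"{lo:4.1f}-{hi:4.1f} um: {efficiency:5.1f}% efficient")
```

A verified arrestor's efficiency curve must meet or exceed the NESHAP requirement at every particle size, as shown for the best and worst performers in Figures 3 and 4.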
Figure 2 provides views of the test duct facility [10]. Figure 3 shows the filtration efficiency
results of the best and worst performing tested paint overspray arrestors compared with the
requirements for new aerospace manufacturing and rework facilities, while Figure 4 gives a
similar comparison for existing facilities. All arrestors represented exceeded Method 319's
requirements.
Baghouse filtration products (BFPs) or filter bag fabrics were tested in the second set of the fine
PM control verification tests performed. No standard test method existed for these fabrics, but
vendors indicated an interest in participating in developing a testing process and facility. After
the SAC's recommendation, RTI contracted with ETS, Inc. to develop a GVP and propose a
facility that would credibly test the fabrics. ETS located a German test method which met many
of the requirements sought. By modifying this method and using a test facility design similar to
that of a baghouse vendor, ETS in cooperation with the appointed TP produced a GVP which
met performance criteria [8]. ETS also developed the test/QA plan which was approved by the
APCTVC and EPA. A photograph of the test facility built by ETS at its laboratory in Roanoke,
VA, is shown in Figure 5 [11], and a diagram of the facility is in Figure 6 [8].
The filter material to be tested has an exposed diameter of 14 cm (5.5 in.) and is mounted
vertically at the entry of the horizontal circular duct shown in Figures 5 and 6. A dust approved
for verification testing is discharged into a vertical channel, and some of it is entrained by the
air drawn through the test filter. An Andersen impactor captures the various sized dust particles
that pass through the filter. The performance parameters of the filter are the particle
concentration at the outlet of the filter and the pressure drop across the filter.
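The sketch below shows how the two performance parameters could be reduced from the raw measurements: the outlet concentration follows from the impactor catch masses and the dry standard gas volume sampled, while the pressure drop is read directly across the filter. The stage masses, stage-to-size assignment, and sampled volume here are assumptions for illustration, not ETV data:

```python
# Outlet concentrations (mg/dscm) from hypothetical Andersen impactor data.
stage_catch_mg = [0.012, 0.020, 0.031, 0.015, 0.008]  # per-stage catch, mg
fine_stages = 3            # assumed: the last 3 stages collect PM2.5
volume_dscm = 1.7          # dry standard cubic meters sampled (assumed)

total_mass_conc = sum(stage_catch_mg) / volume_dscm
pm25_conc = sum(stage_catch_mg[-fine_stages:]) / volume_dscm

print(f"Total outlet concentration: {total_mass_conc:.3f} mg/dscm")
print(f"PM2.5 outlet concentration: {pm25_conc:.3f} mg/dscm")
```

These are the quantities plotted, along with average pressure drop in cm of water, in Figure 7.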
Three rounds of testing involving 15 different fabric materials have been conducted, but only 13
verification reports for the first two rounds are available on the ETV and RTI web sites as the
third round of testing was recently completed. Performance results for the filter materials tested
in the first round of testing are shown in Figure 7.
[Figure 2. Test Facility for Paint Overspray Arrestors [10]. Diagrams show the overall test duct
configuration (top view), with blower, inlet and outlet filter banks, upstream and downstream
mixers, aerosol generator, ASME nozzle, and optical particle counter (OPC); the aerosol
generation system (side view), with spray tower, syringe pump, atomizing air supply, HEPA
filter capsules, and aerosol charge neutralizer; and the aerosol sampling system (end view), with
interchangeable isokinetic sampling tips, OPC, multichannel analyzer, and data acquisition
computer.]
[Figure 3. Paint Overspray Filtration Efficiency for Best and Worst Performing Arrestors Tested
and National Emission Standards for Hazardous Air Pollutants (NESHAP) for Aerospace
Manufacturing and Rework Facilities for New Facilities (those whose construction began after
October 29, 1996); filtration efficiency (%) versus particle diameter (µm). (Courtesy of Research
Triangle Institute)]

[Figure 4. Paint Overspray Filtration Efficiency for Best and Worst Performing Arrestors Tested
and NESHAP for Aerospace Manufacturing and Rework Facilities for Existing Facilities;
filtration efficiency (%) versus particle diameter (µm). (Courtesy of Research Triangle Institute)]
[Figure 5. Photograph of Test Apparatus for Baghouse Filtration Product Tests [11], showing the
dust feed and rectangular channel, filter fixture and test filter, cylindrical extraction tube,
photometer, cleaning system, particle counter, and chart recorder.]
[Figure 6. Diagram of Test Apparatus for Baghouse Filtration Product Tests [8] (Courtesy of
Hosokawa Mikropul), showing dust feed from an external hopper, dust charge neutralizer,
rectangular channel, filter fixture and test filter, cylindrical extraction tube, raw- and clean-gas
sample ports, photometer, blow tube cleaning system, absolute filter and Andersen impactor,
backup filter, mass flow controllers, calibrated orifice, adjustable valves, and dirty-air and
clean-air pumps.]
[Figure 7. Results of Verification Tests for Nine Baghouse Filtration Products (Courtesy of
Research Triangle Institute). The chart compares membrane and nonmembrane vendors'
products in terms of outlet PM2.5 concentration (0.1 mg/dscm), total mass concentration
(0.1 mg/dscm), and average pressure drop (cm of water).]
Because membrane fabrics performed better than nonmembrane materials, the gas-to-cloth (G/C)
ratio was re-examined, as the value used in the tests was higher than that used for nonmembrane
filter bags in commercial baghouses. Testing by ETS resulted in modification of the test method
to reduce the G/C ratio from 180 to 120 m/hr (9.8 to 6.6 fpm) for both membrane and
nonmembrane materials, as the membrane materials were insensitive to flows over the range
tested. Results of tests using the reduced G/C ratio have not yet been reported.
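Since the G/C ratio is a superficial face velocity (volumetric flow per unit filter area), the unit conversion and the implied flow through the 14 cm test coupon can be checked directly. The sketch below derives these values; only the coupon diameter and the 180 and 120 m/hr ratios come from the text, and the flow figures are derived, not reported test settings:

```python
# G/C ratio arithmetic: unit conversion and implied flow through the coupon.
import math

diameter_m = 0.14                      # exposed filter diameter from the text
area_m2 = math.pi * (diameter_m / 2) ** 2

for gc_m_per_hr in (180.0, 120.0):     # original and reduced G/C ratios
    gc_fpm = gc_m_per_hr / 60.0 * 3.28084      # convert m/hr to ft/min
    flow_m3_per_hr = gc_m_per_hr * area_m2     # volumetric flow through coupon
    flow_lpm = flow_m3_per_hr * 1000.0 / 60.0  # liters per minute
    print(f"G/C = {gc_m_per_hr:5.0f} m/hr = {gc_fpm:4.1f} fpm "
          f"-> flow = {flow_m3_per_hr:.2f} m^3/hr ({flow_lpm:.0f} L/min)")
```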
A verification test of a NOx emission control technology was performed on a 1500 kW turbine
fueled by natural gas and equipped with the Xonon™ combustion system. Figure 8 illustrates the
Xonon™ cool combustion concept, while Figure 9 shows the gas temperatures through the
combustor. The Xonon™ system controls NOx emissions by using catalytic combustion (see the
combustor and its Xonon™ module in Figure 10) to limit combustion temperatures to values
corresponding to low NOx formation.
Using the GVP developed earlier for NOx emission control and a test/QA plan developed by
RTI's contractor, Midwest Research Institute (MRI), short-term testing of this turbine at the
Gianera power station in Santa Clara, CA, occurred during July 2000. As shown in Figure 11,
the mean NOx concentration at the turbine's exit over the 2-day test was 1.13 ppm by volume of
dry gas, which is lower than the lowest best available control technology (BACT) and lowest
achievable emission rate (LAER) values indicated on this figure. Information on the low
concentration NOx emission measurement is given in Reference 12.
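The comparison Figure 11 makes can be stated compactly: the verified mean sits well below even the lowest BACT and LAER benchmark values. In the sketch below, the per-run concentrations are hypothetical; only the 1.13 ppmvd mean and the benchmark ranges come from the text:

```python
# Compare the verified NOx mean with the quoted regulatory benchmarks.
runs_ppmvd = [1.10, 1.16, 1.12, 1.14]          # hypothetical 2-day test runs
mean_nox = sum(runs_ppmvd) / len(runs_ppmvd)   # ~1.13 ppmvd, as reported

bact_range = (9.0, 25.0)   # best available control technology, ppmv
laer_range = (3.0, 15.0)   # lowest achievable emission rate, ppmv

print(f"Mean NOx: {mean_nox:.2f} ppmvd")
print(f"Below lowest BACT value: {mean_nox < bact_range[0]}")
print(f"Below lowest LAER value: {mean_nox < laer_range[0]}")
```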
Technical and performance audits performed by RTI and EPA during the verification testing
discussed above gave acceptable results. In addition, the data quality objectives stated in the
test/QA plans were achieved, and the audits of data quality were satisfactory.
CONCLUSIONS
During its 5-year pilot period, the ETV Program verified 111 technologies using independent
third-party testing organizations to test the commercially ready technologies voluntarily offered
by vendors. Up to 12 pilot programs operated during this period to provide management and
direction in developing test protocols, verification testing, reporting, and outreach to pursue the
ETV Program's goal of providing credible information on environmental technologies to
accelerate their market acceptance. Feedback from participants indicated that EPA's
participation in the Program is critical to credible environmental verification and adds
significant value to verifications, and that adequate and qualified staffing is essential to the
Program's success. The verification process proved more costly, time-consuming, and difficult
than expected. Faster reporting of test results and more effective outreach activities to attract
more participants in the verification process were recommended to improve the Program.
Of the 27 technologies verified by the Air Pollution Control Technology Verification Center
since its beginning in 1997, 21 were verified while it operated as a pilot program through late
2000. The Center has verified technologies for fine PM and NOx emissions control, 25 of them
for PM control in laboratory testing. A field test of a natural-gas-fired turbine equipped with a
catalytic dry combustion system yielded an average NOx emission concentration of 1.13 ppm by
volume of exhaust gas.
[Figure 8. Comparison of the Xonon™ and Conventional Combustor Features (Courtesy of
Catalytica Energy Systems, Inc.). The conventional combustor's high flame temperature
produces NOx; the Xonon™ combustor burns the fuel without a high-temperature flame, near
the turbine inlet temperature.]

[Figure 9. Operational Features of the Xonon™ Combustor System (Courtesy of Catalytica
Energy Systems, Inc.). The schematic shows the preburner fuel, air, and main fuel inlets,
combustion at about 1260 °C (2300 °F), the compressor and drive turbine, and exhaust nitrogen
oxides of about 3 ppmv.]

[Figure 10. Cutaway View of the Xonon™ Combustion System (Courtesy of Catalytica Energy
Systems, Inc.), showing the preburner, air inlet, fuel injectors, mixing zone, Xonon™ module,
burnout zone, and combustor discharge.]

[Figure 11. Exhaust NOx Concentration from Gas Turbine in Xonon™ Tests (Courtesy of
Research Triangle Institute). The plot shows exhaust NOx concentration (about 1.0 to 1.2 ppm)
over the 2-day test; mean = 1.13 ppmvd, compared with best available control technology
(BACT) values of 9 to 25 ppmv and lowest achievable emission rate (LAER) values of 3 to
15 ppmv.]
REFERENCES
1. U. S. Environmental Protection Agency. Environmental Technology Verification
Program: Verification Strategy, Office of Research and Development, Washington, DC,
EPA/600/K-96/003 (NTIS PB97-160006), February 1997.
2. U. S. Environmental Protection Agency. Environmental Technology Verification Program,
Office of Research and Development, Washington, DC, EPA/600/F-98/015, September
1998.
3. U. S. Environmental Protection Agency. Environmental Technology Verification Program,
Quarterly Report, October 2001. Available at web site:
http://www.epa.gov/etv/dload/qr_oct01.pdf.
4. U. S. Environmental Protection Agency. Environmental Technology Verification Program;
Quality and Management Plan for the Pilot Period (1995-2000), Office of Research and
Development, Cincinnati, OH, EPA/600/R-98/064, May 1998.
5. American National Standards Institute/American Society for Quality Control
(ANSI/ASQC), American National Standard E4-1994, "Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental Technology
Programs," American Society for Quality Control, Milwaukee, WI, 1994.
6. U. S. Environmental Protection Agency. Environmental Technology Verification Program,
Quarterly Report, October 2000. Available at web site:
http://www.epa.gov/etv/dload/qr_oct00.pdf.
7. Farmer, J. R., D. W. VanOsdell, and T. G. Brna, "Verification Testing of Air Pollution
Control Technology," Paper No. 98-A1070, Air and Waste Management Association
Annual Meeting and Exhibition, San Diego, CA, June 1998.
8. ETS, Inc. and Research Triangle Institute. Generic Verification Protocol for Baghouse
Filtration Products, ETS, Inc., Roanoke, VA, and Research Triangle Institute, Research
Triangle Park, NC, October 2001. Available at web site:
http://etv.rti.org/apct/pdf/GVP_Revised.pdf.
9. Federal Register, 40 Code of Federal Regulations, Part 63, March 27, 1998, Government
Printing Office, Washington, DC.
10. Hanley, J. T., M. K. Owen, J. R. Farmer, and T. G. Brna, "Environmental Technology
Verification (ETV) of Paint Overspray Arrestors," Filtration and Separations Technologies
for 2000, 13th Annual Technical Conference and Exposition, American Filtration Society,
Myrtle Beach, SC, March 2000.
11. Mycock, J. C., S. M. Winemiller, J. H. Turner, J. R. Farmer, and T. G. Brna, "Baghouse
Filtration Products: An Update," Filtration and Separations Technologies for 2000, 13th
Annual Technical Conference and Exposition, American Filtration Society, Myrtle Beach,
SC, March 2000.
12. Clapsaddle, C. A., A. H. Trenholm, and A. M. Marshall, "Low Concentration NOx
Emission Measurement," Abstract No. 248, Air and Waste Management Association
Annual Meeting and Exhibition, Orlando, FL, June 2001.