August 16, 2000

EPA-SAB-EEC-00-012

Honorable Carol M. Browner
Administrator
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, NW
Washington, DC 20460

       Subject:       Review of EPA's Environmental Technology Verification Program

Dear Ms. Browner:

       The Technology Evaluation Subcommittee of the Science Advisory Board's (SAB)
Environmental Engineering Committee met March 6-8, 2000 in Washington, DC to review the degree
to which quality management is incorporated into the Environmental Technology Verification (ETV)
program. The SAB reviewed a proposed framework for this program in 1995 (SAB, 1995), the
Agency's Quality Program in 1998 (SAB, 1998), and its implementation in 1999 (SAB, 1999). Both the
Agency and the SAB agree on the major elements of the Agency's Quality System; the challenge has
been its implementation. This letter summarizes the evaluation of the Technology Evaluation
Subcommittee while the attached report describes the Subcommittee's views in greater detail.

       ETV is a pilot program designed to test different approaches to environmental technology
verification.  ETV has tested, or has tests underway for, 150 technologies. Because the technologies
addressed are diverse, as are their applications, ETV has made extensive use of stakeholders and
technical panels to design testing protocols that ensure the data quality needs of the data's customers
are met.

       Overall, the ETV program effectively incorporates the Board's 1995 recommendations; the
Subcommittee considers the ETV program fundamentally sound and valuable. Moreover, the ETV
program has successfully adopted major elements of the Agency's Quality System early, well, and with
enthusiasm. While additional Quality System requirements remain to be adopted by the ETV program,
what has already been implemented is commendable. The Subcommittee is optimistic that the ETV
program's proven success in implementing an effective Quality System in a geographically and
organizationally diverse and decentralized program will encourage other Agency  programs to embrace
the Quality System.

       In this letter, the Subcommittee would like to highlight three important issues.  First,
environmental protection still requires the use of effective technologies based on sound scientific
principles.  As environmental protection moves beyond command-and-control, end-of-pipe solutions to
more decentralized and sustainable approaches, the number and variety of decision-makers increase.
By encouraging the development and evaluation of innovative technologies, EPA broadens the options
available to decision-makers.  Programs such as ETV that address the market for credible information
on technologies aimed at reducing or eliminating environmental risks are important and will  remain so
for the foreseeable future.

       Second, while the overall framework for the implementation of the ETV program's  quality and
management plan is in place and functioning, three aspects of the Quality System require more
consistent implementation.

       a)     The ETV program must consistently employ a systematic data quality planning
              approach, such as the Data Quality Objective Process (EPA QA/G-4), during the
              development of generic test protocols and technology specific Test/QA plans.

       b)     The extent of verification testing should be a function of the inherent performance
              variability of a specific technology.

       c)     The ETV verification partners and their subcontractors must all fully comply with the
              Agency's quality system.

       Finally, one policy issue directly affects the ETV program's credibility: at any time prior to the
completion of verification testing, a technology vendor may voluntarily withdraw a product, provided
the government is reimbursed for any incurred costs. The availability of this
option means that, if the verification data demonstrate that the technology does not perform as
anticipated or advertised, the vendor may effectively "buy back" that data thereby canceling the
Agency's disclosure of the technology's performance.  In fairness to the ETV program, this policy does
not adversely affect the quality of any data actually issued through Agency approved technology
verification reports. Furthermore, when the ETV program itself was new and untried, this policy may
have encouraged vendor participation.  Finally, this option has only been invoked once in over 150
tests completed or underway.  Nevertheless, the Subcommittee is convinced that the prerogative of
vendors to suppress the disclosure of an unflattering technology evaluation through financial
reimbursement detracts from the overall credibility of the ETV program. The Subcommittee, therefore,
encourages a modification in program policy.

       In summary, the Subcommittee recognizes the ETV program as largely successful in
meeting its goal of generating credible and impartial verification data on environmental technology
performance.  The Subcommittee attributes this to the program's implementation of
the Agency's Quality System, the effective use of stakeholder advisory groups and the dedication of the
ETV program personnel.  The Subcommittee looks forward to your response to the advice contained
in this report.

                                    Sincerely,
                                           /s/

                                    Dr. Morton Lippmann, Interim Chair
                                    Science Advisory Board
       /s/                                                /s/

Dr. Hilary I. Inyang, Chair                           Dr. Michael J. McFarland, Chair
Environmental Engineering Committee         Technology Evaluation Subcommittee
Science Advisory Board                            Environmental Engineering Committee

                                         NOTICE
       This report has been written as part of the activities of the Science Advisory Board, a public
advisory group providing extramural scientific information and advice to the Administrator and other
officials of the Environmental Protection Agency.  The Board is structured to provide balanced, expert
assessment of scientific matters related to problems facing the Agency.  This report has not been
reviewed for approval by the Agency and, hence, the contents of this report do not necessarily
represent the views and policies of the Environmental Protection Agency, nor of other agencies in the
Executive Branch of the Federal government, nor does mention of trade names or commercial products
constitute a recommendation for use.
Distribution and Availability: This Science Advisory Board report is provided to the EPA
Administrator, senior Agency management, appropriate program staff, and interested members of the
public, and is posted on the SAB website (www.epa.gov/sab).  Information on its availability is also
provided in the SAB's monthly newsletter (Happenings at the Science Advisory Board).  Additional
copies and further information are available from the SAB Staff.

                                      ABSTRACT
       The Technology Evaluation Subcommittee of the Science Advisory Board's (SAB)
Environmental Engineering Committee reviewed the extent to which quality management is incorporated
in the Environmental Technology Verification (ETV) program.

       The Agency's Quality System and ANSI/ASQC E-4 provide an effective framework within
which the Environmental Technology Verification (ETV) program has established a multi-tiered quality
assurance oversight system.  Through the extensive use of stakeholder advisory groups, the ETV
program has ensured that the appropriate technology verification factors and level of quality
management are consistent with marketplace demands.

       The Subcommittee recommended that the generic test protocols and Test/QA plans be
improved by consistent employment of a systematic data quality planning process such as the DQO
process (EPA QA/G-4). Consistent use of a systematic data quality planning process will ensure that
future verification tests are designed to reflect the inherent variability in technology performance.

       To protect the credibility of the ETV program, verification partners and their subcontractors
must comply with the same quality assurance requirements adopted by the Agency.

Keywords:    environmental, technology, verification, quality

                            TABLE OF CONTENTS

1.  EXECUTIVE SUMMARY

2.  INTRODUCTION
       2.1    Background
       2.2    ETV Program - Overview
       2.3    Systematic Data Quality Planning Process
       2.4    Charge
       2.5    SAB Review Process

3.  RESPONSES TO SPECIFIC CHARGE QUESTIONS
       3.1    Is the use of the ANSI/ASQC E-4 consensus standard and the maintenance
              of an active quality assurance oversight implementation adequate to assure
              that quality management has been appropriately incorporated in the program?
              3.1.1  Findings
              3.1.2  Recommendations
       3.2    Is the level of quality incorporated into the example protocols adequate to
              appropriately portray commercial ready technology verification factors?
              3.2.1  Findings
              3.2.2  Recommendations
       3.3    Are these test protocols adequately comparable to assure that individual
              technologies tested in the future are fairly and comparably evaluated with
              those already verified?
              3.3.1  Findings
              3.3.2  Recommendations

4.  ADDITIONAL COMMENTS
       4.1    Does the environmental technology marketplace need an ETV program?
       4.2    What is the value added to environmental technologies that are verified
              through the ETV program?
       4.3    What factors should be considered in planning the future of the ETV program?
       4.4    Policy Factors Affecting Quality Assurance Credibility

REFERENCES

APPENDIX A - DOCUMENTS PROVIDED TO THE TECHNOLOGY EVALUATION SUBCOMMITTEE

APPENDIX B - SUMMARY OF ELEMENTS OF THE EPA QUALITY SYSTEM

APPENDIX C - ROSTERS

                            1. EXECUTIVE SUMMARY
       The Technology Evaluation Subcommittee of the Science Advisory Board's (SAB)
Environmental Engineering Committee met March 6-8, 2000 in Washington, DC to review the degree
to which quality management is incorporated into the Environmental Technology Verification (ETV)
program. In this review, the Subcommittee considered the applicability of the ANSI/ASQC E-4
standard; Agency policy, requirements, and guidance; and the experience of the Subcommittee.
Appendix A lists the documents reviewed by the Subcommittee, including the ETV quality and
management plan and quality documents for seven completed pilot studies.  The latter included
documents such as pilot quality and management plans, generic test protocols and technology specific
Test/QA plans.  Appendix B is a brief summary of selected documents in EPA's Quality System.
Please note that, in this report, the Subcommittee has used the word "Agency" when describing Agency
level activities and decisions, such as the Agency's Quality System, policies and requirements. The
Subcommittee has reserved the words "ETV program" for decisions and activities within that program.

       The Agency's Quality  System and ANSI/ASQC E-4 consensus standard provide an effective
framework within which the Environmental Technology Verification (ETV) program has established a
multi-tiered quality assurance oversight system. In summary, the ETV program has effectively
implemented major portions of the Agency's Quality System. ETV has made excellent use of
stakeholder advisory groups in establishing technology specific verification factors acceptable to both
users of and permit writers for environmental technologies. Because the stakeholder advisory groups
have been beneficial, the Subcommittee suggests that the Agency capture its experience in the form of
guidance on how future stakeholder advisory groups should be constituted and on their specific role in
verification test development.

       The Subcommittee recommends that the Agency consistently employ a structured data quality
planning process such as the Agency's DQO process (EPA QA/G-4) to develop generic test protocols
and Test/QA plans.  Use of a systematic data quality planning process ensures that the decision-
maker's needs are appropriately considered in the development of verification testing procedures and in
the reporting of verification test results. In this case, those needs include the environmental
marketplace's requirements for a minimum technology performance standard and for an understanding
of technology performance variability.

       Finally, the Subcommittee was alarmed to discover that some ETV verification partners and
their subcontractors are convinced that their ETV data collection activities are not subject to the same
quality assurance requirements adopted by the Agency.  The Agency's Quality System mandates that
specific quality requirements be implemented whenever environmental data are collected for or on
behalf of the EPA without regard to whether the data collection activity was funded under contract,
grant, or cooperative agreement. Without this element of quality assurance, the future credibility of the
ETV program is at risk. The Subcommittee therefore recommends that the ETV program enforce
these requirements on all verification partners.

                                 2.  INTRODUCTION
2.1    Background

       In the early 1990's, government and private sector groups determined that the lack of
independent and credible performance data was a major barrier to environmental protection because it
impeded the development and use of innovative environmental technology. Based on broad input from
technology developers, users and regulators, both the President's environmental technology strategy,
Bridge to a Sustainable Future, and the Vice President's Reinventing Government: National
Performance Review contained initiatives for an EPA program to accelerate the development and use
of environmental technology through objective verification and reporting of technology performance. In
1994, the Office of Research and Development's (ORD) National Risk Management Research
Laboratory (NRMRL) convened a workshop of ORD managers to formulate a plan for implementing
such a program. The plan was modified based on recommendations from the Science Advisory Board
(SAB) in May 1995 (SAB, 1995). The Agency formally established the Environmental Technology
Verification (ETV) program in October 1995.

       The ETV program is currently in the final year of a five-year pilot phase and must soon prepare
a report to Congress  that will define the future configuration of the program.  In October 1999, the
Agency requested that the SAB review the extent to which quality management is incorporated into the
ETV program.

2.2    ETV Program - Overview

       The Environmental Technology Verification (ETV) program has been established by the
Environmental Protection Agency (EPA) to document the performance characteristics of innovative
environmental technologies across all media and to report this information to the permitters, buyers and
users through the issuance of Agency approved technology verification reports. ETV is a voluntary
program designed to evaluate the performance of commercial-ready technologies; it is not a technology
approval or certification process.

       Management oversight of the ETV program is provided by the ETV management team, which
consists of approximately twenty (20) EPA Office of Research and Development (ORD) employees
assigned to two laboratories: the National Risk Management Research Laboratory (NRMRL) and the
National Exposure Research Laboratory (NERL). The ETV program funds and manages twelve pilot
projects, each representing a broad environmental technology group.  The implementation of each pilot
project is the responsibility of a third-party organization (i.e., a verification partner), which operates
under the auspices of the ETV management team. The verification partners, which are selected through
open competition, include private-sector testing, evaluation and research companies; state technology
evaluation programs; federal laboratories; and industry associations. To ensure that verification partners
have sufficient program flexibility in responding to the needs of the environmental technology
marketplace, cooperative and interagency agreements, rather than contracts, are used as the principal
funding mechanisms for verification partner services.

        The process of formulating a technology verification test involves the collective efforts of the
ETV management team, a specific verification partner (and its subcontractors), as well as
representatives of the environmental technology customers (e.g., stakeholder advisory groups).
Members of the stakeholder advisory groups are selected to represent the interests of states,
technology developers, technology buyers, consulting engineers, and financial institutions. Meeting
several times a year, these groups advise the ETV management team on testing priorities, information
needs to facilitate decision-making, test protocols to determine technology performance, and
information distribution methods that will effectively reach specific customer groups.

        The ETV program has adopted a multi-tiered planning approach for developing technology
verification tests.  The principal documents that describe the two levels of verification test development
are the generic verification protocol and the Test/Quality Assurance (i.e., Test/QA) plan.  The
generic verification protocol provides testing guidance for a particular technology category (e.g.,
drinking water particulate removal), but not for a specific product, facility, or test event.  The
generic protocol is designed to be sufficiently broad to direct the testing of various products within the
same technology category while providing the necessary framework for development of the more
detailed Test/QA plan.  The Test/QA plan provides specific instructions for the verification testing of a
single product (e.g., bag filters), or a group of products in the same technology category, during a
specific test event.  The Test/QA Plan is designed to provide sufficient detail to allow a third party to
reproduce the verification test results.

        The quality management functions and activities necessary to support the ETV program are
described in the ETV quality and management plan (QMP). The ETV QMP defines the procedures,
processes, inter-organization relationships and outputs that will assure that the quality of both the data
and programmatic elements of the ETV program are appropriate for generating information that
adequately portrays the performance of commercial-ready technologies. Since verification partners
bear much of the quality management responsibility, these organizations are required to develop and
implement their own quality management plans consistent with the quality assurance requirements found
in the ETV QMP. The compatibility of the ETV QMP and the quality management plans developed by
the verification partners assures that the appropriate levels of data collection, quality outputs and
customer responsiveness will be met.

2.3    Systematic Data Quality Planning Process

       The Agency's quality requirements specify that environmental data generated on behalf of or
funded by an EPA program must be supported by the use of a systematic data quality planning process.
In general, a systematic data quality planning process is a flexible planning tool whose complexity can
be modified to suit the specific circumstances of a data collection activity. Use of a systematic data
quality planning process assures that the type, quantity and quality of environmental data generated will
be appropriate for defensible decision-making.  For potential purchasers, permitters and developers of
environmental technology, a principal decision is to determine whether the type and quality of data
furnished through ETV verification reports adequately portray a technology's performance.

       Throughout this report, the Technology Evaluation Subcommittee recommends the consistent
use of a systematic data quality planning process for the development of both generic verification test
protocols and technology-specific Test/QA plans.  The structure of the systematic data quality planning
process should provide a convenient and consistent methodology for documenting the data collection
activities and for communicating the environmental data collection design to others. The Agency's Data
Quality Objectives (DQO) Process (EPA QA/G-4) is an example of a systematic data quality planning
process whose implementation will assure that data of an appropriate quality for defensible decision-
making are generated.

       The ETV program addresses diverse technologies being used by different groups for different
purposes.  Therefore the systematic data quality planning process employed for verification test
development must be flexible. Some technologies are designed to comply with one or more specific
regulations.  For other technologies no regulatory standard exists.

       Where technologies have been developed to meet regulatory requirements, the decision-maker
can often easily define the consequences of highly variable technology performance.  This makes it
easier to employ a systematic data quality  planning process to develop a statistically based verification
test plan. The following example is for illustration only. In the generic verification test protocol, the
decision-maker could establish that all drinking water particulate removal technologies must be
supported by a verification test design that limits the decision error rate in meeting a regulatory standard
(e.g., pathogen concentration) to no more than ten percent (i.e., 90% confidence level). Once this
generic data quality performance criterion was established, technology-specific verification tests (e.g.,
Test/QA plans) would be designed to reflect the variability of an individual class of drinking water
particulate removal technologies (e.g., bag filters). In other words, the technology-specific verification
tests (e.g., Test/QA plans) would be designed to ensure that a sufficient amount of technology testing
was conducted to support the stakeholders' claim that the performance of all ETV-verified drinking
water particulate removal technologies has an associated decision error rate of no more than ten
percent (90% confidence level).
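
       To make the arithmetic behind such a criterion concrete, consider the following sketch.  It is
purely illustrative: the function and every figure are hypothetical and are not drawn from any ETV
protocol.  It shows how a generic decision error limit, combined with a technology class's measured
performance variability, could translate into the number of verification samples a Test/QA plan must
call for.

        # Illustrative sketch only: translating a generic data quality
        # performance criterion (a tolerable decision error rate) plus
        # technology-specific performance variability into a verification
        # sample size.  All figures are hypothetical.
        from math import ceil
        from statistics import NormalDist

        def verification_sample_size(sigma, margin, decision_error=0.10):
            """Samples needed so that a one-sided test of mean performance
            against a regulatory standard keeps the decision error rate at
            or below the stated limit (normal approximation).  Here sigma is
            the technology class's performance standard deviation and margin
            is the smallest departure from the standard that matters to the
            decision-maker."""
            z = NormalDist().inv_cdf(1.0 - decision_error)  # ~1.28 at a 10% error rate
            return ceil((z * sigma / margin) ** 2)

        # A low-variability technology class (tightly controlled performance):
        print(verification_sample_size(sigma=0.5, margin=0.3))   # -> 5 samples
        # A high-variability class needs far more testing for the same criterion:
        print(verification_sample_size(sigma=1.5, margin=0.3))   # -> 42 samples

       Under these illustrative assumptions, tripling the performance variability increases the required
testing roughly ninefold, which is precisely why the sample size belongs in the technology-specific
Test/QA plan rather than in the generic protocol.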

       By taking a systematic data quality planning approach to statistical verification test design, the
ETV program assures the potential purchasers, users and permitters of environmental technology that
the furnished data are of a known and appropriate quality for decision-making.  Moreover, by explicitly
considering technology-specific performance variability in the verification test design, technology
developers and vendors can recognize an economic benefit resulting from the improvement of overall
product reliability.

       Not all technologies that contribute to the protection of public health and the environment are
governed by regulations. Where the environmental technologies are not intended for use in regulatory
compliance programs, decision-makers will need to put more effort into defining their information
needs.  Clearly articulated decisions and a description of the data needed to support them are
necessary inputs to any systematic data quality planning approach. These inputs provide the basis for
developing statistically based verification tests. The ETV program's use of stakeholder groups is an
especially rich source of information in these situations.

2.4    Charge

       The focus of the present SAB review is the  degree to which quality management is
incorporated into the ETV program. In the months  leading up to the SAB meeting, the Agency and the
Board agreed upon a charge consisting of the following three questions:

       a)      Is the use of the ANSI/ASQC E-4  consensus standard and the maintenance of an
               active quality assurance  oversight implementation adequate to assure that  quality
               management has been appropriately incorporated  in the program?

       b)      Is the level of quality incorporated into the example protocols adequate to appropriately
               portray commercial ready technology verification  factors?

       c)      Are these test protocols  adequately comparable to assure that individual technologies
               tested in the future are fairly and comparably evaluated with those already verified?

2.5    SAB Review Process

       The SAB Subcommittee was recruited from the Environmental Engineering Committee (EEC)
and its consultants following discussions between the EEC Chair and the Agency. The Subcommittee
met in public session on March 6, 7 and 8, 2000 in Washington, DC. This report is based upon written
comments prepared before and during the meeting  by Subcommittee members. It was approved by
the Environmental Engineering Committee at a public conference call meeting May 3, 2000 and
subsequently by the SAB's Executive Committee, May 30, 2000.

           3. RESPONSES TO SPECIFIC CHARGE QUESTIONS
3.1    Is the use of the ANSI/ASQC E-4 consensus standard and the maintenance of an
active quality assurance oversight implementation adequate to assure that quality
management has been appropriately incorporated in the program?

       3.1.1   Findings

       The American Society for Quality Control and the American National Standards Institute jointly issued
Specifications and Guidelines for Quality Systems for Environmental Data Collection and
Environmental Technology Programs (ANSI/ASQC E-4) in 1994. This consensus standard
provides the minimum criteria for quality systems for environmental programs and their associated
management systems.  Although the standard provides the foundation upon which quality systems can
be developed, ANSI/ASQC E-4 does not specify in detail how quality systems should be
implemented.  Therefore, adoption of the standard does not necessarily ensure that quality management
will be appropriately incorporated into a data collection program.

       The SAB has previously concurred with the Agency's interpretation and implementation of the
ANSI/ASQC E-4 consensus standard in the Agency's Quality System (SAB, 1998 and 1999). The
Subcommittee now specifically endorses the use of ANSI/ASQC E-4 in establishing a multi-tiered
quality assurance oversight system for the Environmental Technology Verification (ETV) program.  The
Subcommittee is confident that the Agency's Quality System provides an effective framework for
generating technology specific data of the appropriate quality for decision-making. This framework
includes program and lower level quality and management plans (QMP), generic technology protocols,
and project specific quality assurance (Test/QA) plans.

       The ETV quality and management plan (QMP) was developed to conform to both the
ANSI/ASQC E-4 consensus standard and Agency requirements (EPA Order 5360.1 "Policy and
Program Requirements to Implement the Mandatory Quality Assurance Program" and EPA QA/R-2
"EPA Requirements for Quality Management Plans"). The  QMP defines the roles and responsibilities
of Agency quality managers with respect to the verification partners and their subcontractors.  The
verification partners and subcontractors, in turn, are required to develop and implement their own
QMPs, which describe the quality systems employed during technology verification testing.  As a result
of the partnership arrangement, the quality of the final product (i.e., technology verification information)
depends upon the design and implementation of the QMPs developed by the ETV program, its
verification partners and their subcontractors.

       To document quality and to maintain continuous quality improvement as required by the
Agency's Quality System, the ETV program has established a quality assessment process that consists
of management system reviews, technical system reviews, performance evaluations and data audits.
The objective of the assessment process is to ensure that an adequate and appropriate level of quality
assurance oversight is maintained for identifying and correcting defects in the quality management
system and/or project data prior to the publication of a verification report.

       The Subcommittee fully supports the ETV program's decision to employ the ANSI/ASQC E-4
consensus standard in development of its quality and management plan.  However, the Subcommittee is
concerned by some elements of the ETV program's implementation of its quality and management plan
as illustrated by the following observations.

       a)     The ETV quality and management plan does not clearly describe the process by which
              quality assurance oversight workloads are assigned to the Agency quality managers.  In
              half of the twelve (12) example verification protocols and Test/QA plans reviewed, the
              responsibility for quality management oversight was assigned to a branch chief while, in
              the remaining six projects, quality management oversight was the responsibility of a
              divisional manager.  The Subcommittee has two concerns. One is whether the resulting
              workloads allow adequate oversight for all plans. The other is the need to maintain a
              consistent perspective in quality  assurance decision-making.

       b)     The Subcommittee endorses the ETV program's current approach for selection of
              verification partners, which emphasizes technical capabilities as well as an established
              history of implementing appropriate levels of quality assurance.  The Subcommittee
              recognizes that the verification partner selection process must be flexible to guarantee
              that technology customer needs are accommodated. However, to ensure consistency
              in the future development and implementation of quality management across technology
              areas, the program should develop verification partner selection criteria.

       c)     The ETV quality and management plan neglects to address the issue of verification data
              ownership, use and disclosure.  To maintain favorable marketplace support of the ETV
              program, the Subcommittee encourages the ETV program to carefully review the
              consequences of the contractual arrangements into which both verification partners and
              environmental technology vendors enter. Technical credibility requires not only that
              technology testing results be objective and representative of those a disinterested
              third party would find, but also that contractual arrangements guarantee that
              technical data are properly managed by the Agency and its verification partners.

       One Subcommittee member expressed concern that the productivity of the Environmental
Technology Verification Program was so valuable to the environmental marketplace that it should not
be impeded by imposing quality requirements in excess of what is currently being implemented.  The
remainder of the Subcommittee believed that, because the work performed by ETV is so important, full
compliance with the requirements of the Agency's Quality System was essential for the future credibility
of the program. The general form of this concern was addressed in the Science Advisory Board
Review of the Implementation of the Agency-Wide Quality System (SAB, 1999):

              The Subcommittee believes that the Agency's senior management and
       Congress must recognize that initially, as the Quality System is implemented,
       there is the potential that the quality of products and services will improve at the
       expense of the total amount of work performed. The  benefits of a Quality System
       have been argued to be free of costs, but the validity of this assumption is based
       on the amortization of costs over the longer term.

       3.1.2 Recommendations

       The Subcommittee supports the ETV program's decision to use the ANSI/ASQC E-4
consensus standard as a framework for the ETV program's quality and management plan.  The
Subcommittee believes that maintaining proper and consistent quality assurance oversight is essential to
the success of the program. Because the organizational positions and workloads vary among those
who are currently providing such oversight, the Subcommittee believes it possible that the thoroughness
and impact of the oversight may vary as well. Consistency in form is not necessary if consistency in
function can be established and maintained.  However, the Subcommittee suggests that, to ensure the
quality and consistency of the oversight function, the ETV program carefully  consider the following.

       Because quality assurance oversight workload levels have the potential to impact quality
assurance decisions, the Agency should consider workload issues as well as  the technical expertise and
functional duties of the organizational reporting level when assigning quality assurance oversight
responsibility.

       The Agency should consider an appropriate mechanism to provide consistency in quality
assurance decision-making. Although the obvious approach would be assignment of primary quality
assurance management responsibility to the same organization level irrespective of the technology
undergoing verification testing, the ETV Program could find a  different and effective solution.

3.2    Is the level of quality incorporated into the example protocols adequate to
appropriately portray commercial ready technology verification factors?

       3.2.1 Findings

       The ETV quality and management plan describes the process employed for developing the
generic verification testing protocols.  The Subcommittee supports the Agency's requirement that
verification data quality criteria be established through the use  of a systematic data quality planning
process such as the Agency's Data Quality Objective (DQO) process (EPA QA/G-4) or an equivalent
methodology.  The Subcommittee endorses the ETV program's extensive use of stakeholder advisory
groups for identification of appropriate technology verification factors as well as for providing practical
guidance for addressing the range of field conditions that technologies are likely to encounter.

       The test protocols are structurally generic. Additional information is needed to identify all key
verification and site-specific factors for specific technologies; therefore, the ETV quality and
management plan requires the development of technology specific Test/QA plans. The Test/QA plan is
meant to be the equivalent of the Agency's Quality Assurance Project Plan (QAPP), which is described
in Agency requirements and guidance (EPA QA/R-5 and EPA QA/G-5). The Test/QA plan provides
a second level of assurance that the technology verification test will be performed in a manner that
should generate objective and useful information of known quality. To ensure that verification tests
accommodate the permutations and range of conditions that technologies are likely to encounter in the
field, the Subcommittee recommends that a  structured data quality planning process such as the DQO
process (EPA QA/G-4) be employed for the development of Test/QA plans.

       The ETV program did not consistently consider technology performance variability in the
development of the generic technology verification protocols. By establishing data quality performance
criteria through the stakeholder advisory process, the extent of verification testing necessary to satisfy
the data quality needs of the environmental technology marketplace can be impartially determined and
formulated into the specific Test/QA plan. Using the drinking water treatment protocols as an example,
ETV might verify the ability  of several technologies to reduce the concentration of waterborne
pathogens.  In this case, the stakeholder advisory group could establish data quality performance
criteria for all drinking water technologies subject to this protocol. Once data quality performance
criteria are established (e.g., tolerable limits on decision errors) for a general category of environmental
technologies (e.g., drinking water particulate removal), a verification Test/QA plan can be developed
that objectively reflects the inherent performance variability of a specific technology (e.g., filtration,
coagulation/flocculation, etc.). By explicitly defining a minimum data quality performance standard in
the generic technology verification protocols and by accounting for technology performance variability
in the Test/QA plans, the ETV program not only ensures that verification data of a known and
appropriate quality are being consistently generated, but also that future environmental technologies will be
objectively and impartially evaluated.

       The number of samples to be evaluated during verification testing is a function of both the data
quality performance criteria established by the  stakeholder  advisory group and the inherent variability of
the specific technology undergoing verification testing.  Accounting for variability in the design of
environmental data collection activities is not only consistent with the Agency's Quality System (EPA
Order 5360.1, EPA QA/R-5), it is necessary to portray the performance of commercial-ready
technologies.  This SAB report employs the term data quality performance standard to mean the
tolerable limits on decision errors as described by Step 6 of the Data Quality Objectives (DQO)
Process. Without clearly defining a data quality performance standard that considers technology
performance variability, potential users of any verified technology will find it impossible to anticipate its
performance.  Furthermore, by establishing data quality performance criteria in the development of
verification tests, future verification costs can be minimized through eliminating the collection of
unnecessary, duplicative or overly precise data.
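
       As a further illustration (again a hypothetical sketch, not an ETV procedure), the outputs of
DQO Step 6, a tolerable exceedance rate and a decision error limit, directly bound the smallest
defensible number of test runs in a simple pass/fail demonstration:

        # Hypothetical illustration of a zero-failure demonstration plan:
        # the stakeholder-chosen tolerable exceedance rate and decision
        # error limit (DQO Step 6 outputs) fix the number of compliant
        # test runs required before a verified claim can be supported.
        from math import ceil, log

        def zero_failure_runs(max_exceedance_rate, decision_error=0.10):
            """Smallest n with (1 - p)**n <= alpha: if all n runs comply,
            one can state with confidence 1 - alpha that the true
            exceedance rate is below max_exceedance_rate."""
            return ceil(log(decision_error) / log(1.0 - max_exceedance_rate))

        print(zero_failure_runs(0.20))  # -> 11 runs (20% rate, 90% confidence)
        print(zero_failure_runs(0.05))  # -> 45 runs for a tighter 5% rate

       Criteria stated this explicitly make both over-testing (data collected beyond what the criteria
require) and under-testing (too few runs to support the verified claim) easy to recognize.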

       Finally, the Subcommittee, the Agency, and ETV program management agree that the ETV
verification partners and their subcontractors are contractually bound to comply with the Agency's
Quality System requirements. Not all verification partners are fully aware that, even though cooperative
and interagency agreements rather than contracts are being employed for their services,
Agency policy clearly states that all work performed by extramural organizations on behalf of or funded
by the EPA that involves the collection or use of environmental data in Agency programs shall be
implemented in accordance with an Agency-approved quality assurance project plan (QAPP)
developed from a systematic planning process (EPA QA/R-5).  The environmental data collection
activities for which an Agency-approved QAPP is required include all work performed through
contracts, interagency agreements, assistance agreements (e.g.,  cooperative agreements, grants, etc.)
and in response to statutory or regulatory requirements and consent agreements negotiated as part of
enforcement actions (EPA QA/R-5, EPA QA/G-5).

       3.2.2  Recommendations

       For various ETV protocol categories, small teams of Subcommittee members reviewed specific
example protocols.  This was done to obtain a sense for the implementation of the quality system at the
level where data are actually generated, but not for the purpose of critiquing the individual protocols.
Subcommittee teams reviewed the relevant documents for these categories:

       a)     Advanced Monitoring Systems Pilot
       b)     Air Pollution Control Technologies Pilot
       c)     Drinking Water Systems Pilot
       d)     Greenhouse Gas Technologies Pilot
       e)     Indoor Air Products Pilot
       f)      P2 Innovative Coatings & Coating Equipment
       g)     Site Characterization & Monitoring Technologies

The following recommendations are based, in part, on the results of those reviews.

       Continue Stakeholder Advisory Groups - First, the Subcommittee commends the Agency
for its use of stakeholder advisory groups for identifying appropriate verification factors during generic
protocol development.  It is particularly challenging to involve large numbers of people of diverse
backgrounds and varying levels of technical expertise in the development of technical documents.  It is
an ambitious and largely successful process that provides a useful example to others wishing to
undertake similar efforts. The Subcommittee recommends that the Agency continue its use of
stakeholder groups.

       Fully Implement a Systematic Planning Process - Second, the translation of the verification
factors and generic protocols into  specific plans varies among the technology categories.  In some
cases it appears that a systematic planning process, such as the DQO process,  was employed to
establish data quality criteria.  In other cases a systematic planning process was only partially employed,
or there were flaws in its application, with the result that gaps appeared in the logic tying the needs
expressed by the stakeholders to the actual testing performed.

       When the DQO process is not completely implemented, it is impossible to determine whether
the testing was designed to provide sufficient data of known quality to allow potential users or
purchasers of verified technology to reach defensible decisions regarding anticipated technology
performance. To achieve an appropriate level of quality to adequately portray commercial ready
technology verification factors, the Subcommittee provides the following recommendations for Agency
consideration.

        Step 6 of the DQO process (Appendix B) requires that the principal decision-maker specify a
data quality performance criterion (e.g., tolerable limits on decision errors) for a particular technology
category.  By establishing this standard in the generic test protocol, the inherent performance variability
of a particular technology (or group of technologies) may be used to develop a Test/QA plan whose
implementation would generate data of a known and appropriate level of quality for decision-making.
Test/QA plans generating verification data of a known and consistent quality would have resulted if all
test plans (or equivalent documents) had fully applied a systematic data quality planning process, such
as the DQO process.

       The full implementation of a systematic planning process is important primarily for the reasons
stated above.  There are additional reasons why it is important. Vendors pay for the testing; some
companies can afford this more easily than others.  If data are required above and beyond what is
necessary to meet the stakeholder's needs, then the verification process presents a burden to
companies with fewer resources.  If insufficient testing is done to meet the stakeholder's needs,  then the
entire market is misinformed about the true capabilities of the technology.

       To ensure that all factors are considered and that the generic test protocols are interpreted and
applied appropriately, ANSI/ASQC E-4 and the Agency's Quality System require that a systematic data
quality planning process be implemented for ensuring that the correct type and amount of data are
collected during verification testing. The Subcommittee recommends that the ETV program always
employ the Agency's Data Quality Objectives (DQO) process or a comparable systematic planning process to
define data quality criteria prior to verification testing.

       To enable potential technology purchasers and users to anticipate the performance of a verified
technology with confidence, the Subcommittee recommends that the verification testing design be
formulated based on both the data quality standard established by the stakeholders and the inherent
variability of the individual technology.

        To ensure that the implementation of the generic test protocols generates data of sufficient quality
for defensible decision-making, the Subcommittee recommends that the Agency clearly document how
information supplied by stakeholder advisory groups is translated into technology specific sampling
strategies.

       Enforcement of Agency Quality Assurance Policies for Verification Partners - To
establish a consistent level of quality assurance in all data collection activities, the Subcommittee
strongly recommends that the ETV program enforce Agency quality assurance policies by requiring that
verification partners generate and implement an EPA-approved quality assurance project plan (QAPP),
as described in EPA QA/R-5, prior to the collection of any technology verification data.

3.3    Are these test protocols adequately comparable to assure that individual technologies
tested in the future are fairly and comparably evaluated with those already verified?

       3.3.1 Findings

       The purpose in maintaining generic test protocols is to assure that the environmental technology
purchasing and permitting community can reach appropriate and defensible decisions. The generic test
protocols are characterized by a consistent set of metrics that is designed to facilitate comparisons
between similar technologies. Because it is impossible to anticipate all site-specific conditions under
which an environmental technology may be operated, development of generic test protocols that permit
objective and fair comparisons of related technologies is difficult. Moreover, experience with past
operating conditions and matrices may not be sufficient for extrapolating to new environmental
technologies and/or operating conditions.

       Stakeholder advisory groups were identified by the ETV program as principal participants in
the formulation of generic test protocols. These groups, which represent the various interests of the
environmental technology marketplace, advise the ETV program on verification testing priorities and
information needs for assessing environmental technology performance.

       Despite the influence that stakeholder advisory groups have on the development of generic test
protocols, the process by which these groups are constituted was not fully described by Agency
documentation.  The Subcommittee recognizes that, because of the broad diversity in environmental
technologies to be verified, the number and type of interests reflected in the stakeholder advisory group
membership will vary significantly between technology categories. However, to ensure that future
generic test protocols are developed impartially, the process employed to establish the composition of
the stakeholder advisory groups and their role in generic test protocol development should be
transparent and consistently applied from one technology category to the next.

       Although the use of generic test protocols, by themselves, is insufficient to ensure comparability
of verified technologies with those to be tested in the future, the use of Test/QA plans in support of the
generic test protocols should provide sufficient program flexibility to permit a fair comparison of future
technologies to those already verified.

        3.3.2 Recommendations

       Guidance for Constituting Stakeholder Advisory Groups - The Subcommittee commends
the Agency for its use of a multi-tiered verification-testing program.  In general, this approach to
technology verification testing provides a flexible framework for establishing a set of key metrics that
can facilitate the comparison of previously verified technologies with those that will undergo verification
testing in the future.  By considering the data needs of environmental technology users through
stakeholder advisory groups, the ETV program has effectively captured the essential performance
criteria requirements of the environmental technology marketplace. However, to ensure that future
technologies will be fairly compared to those technologies already verified, the Subcommittee offers the
following recommendation for Agency consideration.

       To provide consistency in the process used to develop future generic test protocols and
Test/QA plans, the Subcommittee recommends that the ETV program provide guidance on how future
stakeholder advisory groups should be constituted and on their specific role(s) in verification test
development. The Subcommittee is not advocating a rigid and formulaic approach to advisory group
formation resulting in an unwarranted consistency between advisory groups for different users of
different technologies. The Subcommittee recommends that the ETV program capture what is known
about how to do this well, then put it in a document that will assist others. As part of this task, the
Agency may wish to consider developing an explication of the verification process.

                           4.  ADDITIONAL COMMENTS


       This chapter addresses issues that were not included in the charge.

4.1    Does the environmental technology marketplace need an ETV program?

       The scarcity of independent and credible technology verification information is one critical
barrier to the use of innovative environmental technologies.  Therefore, the verification testing
information that is provided by the ETV program fulfills an essential need of the environmental
technology marketplace. The ETV program also contributes to the efficient operation of the
environmental technology marketplace by reducing the transaction costs associated with researching
and testing environmental technologies for all potential purchasers and users.

4.2    What is the value added to environmental technologies that are verified through the
ETV program?

       At the core of the ETV program is responsiveness to the environmental technology customers
and suppliers. Responsiveness to the environmental technology customer is successfully demonstrated
in the ETV program through the use of stakeholder advisory groups.  The stakeholder advisory groups
not only provide suggestions as to the minimum level of verification testing but also provide guidance as
to the most effective format for presenting verification test results. Similarly, ETV program
responsiveness to environmental technology suppliers is established by providing a process for
objective and credible technology verification, which improves the market for environmental technology
products.

4.3    What factors should be considered in planning the future of the ETV program?

       The extensive use of stakeholder advisory groups is a principal factor in the current success of
the ETV program.  The stakeholder advisory groups provide assurance that the program goals are
aligned with the needs of the technology users.   To ensure future program success, the Subcommittee
recommends that vendors, technology users, citizens' groups as well as local,  state and federal
government agencies be encouraged to continue their active participation in the development of generic
test protocols as well as Test/QA plans.

        The ETV program is expected to evolve from a primarily government-supported verification
program to a privately funded one.  Nevertheless, the Agency's verification partners identified a
continued and substantial EPA oversight role as an important factor in maintaining the program's
overall technical credibility.  Given the importance the marketplace attaches to the Agency's continued
oversight role, the Agency should carefully evaluate to what extent it can withdraw from the overall
management of the ETV program without adversely impacting its marketplace acceptance.

       Finally, there is a continuing need to minimize the perception that ETV verification reports
constitute an indirect acknowledgment of technology success.  The ETV program has emphasized that
it is a voluntary program designed to increase market efficiency by providing an independent and
credible source of verification information. However, the increased use of verification testing data by
permitting agencies in making technology performance decisions may be viewed by the marketplace as
a de facto technology certification process. Therefore, continued vigilance will be required to avoid the
perception that the ETV program is providing an unfair advantage to specific segments of the
environmental technology marketplace.

4.4    Policy Factors Affecting Quality Assurance Credibility

       Continued private sector support is critical for the future success of the ETV program. This
support is based, in part, on the flexibility of the ETV program. Not all vendors are as ready for testing
as they may initially believe, and the Subcommittee supports the Agency's decision to allow
environmental technology vendors to review preliminary verification results during the pre-
testing phase.

       Once testing has begun, however, the situation is different and the Subcommittee disagrees with
the policy of allowing environmental technology vendors the opportunity to withdraw their products
from full verification testing following full financial reimbursement of any costs incurred by the Agency or
its verification partners. This results in the subsequent cancellation of the verification test report.
Providing technology vendors the prerogative to "buy back" unfavorable data obtained during actual
tests presents an appearance problem that adversely affects the credibility of the ETV program. The
Subcommittee therefore recommends that the Agency change this policy.

                                 REFERENCES

American Society for Quality Control, Energy and Environmental Quality Division, Environmental
       Issues Group. 1994. Specifications and Guidelines for Quality Systems for Environmental
       Data Collection and Environmental Technology Programs (ANSI/ASQC E-4).

EPA Order 5360.1 CHG 1. 1998. Policy and Program Requirements to Implement the Mandatory
       Quality Assurance Program. USEPA, Washington, DC. Original order issued April 1984.

EPA Requirements for Quality Management Plans (EPA QA/R-2). Available through
       http://www.epa.gov/quality1/qa_docs.html

EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5). Available through
       http://www.epa.gov/quality1/qa_docs.html

Guidance for Quality Assurance Project Plans (EPA QA/G-5). Available through
       http://www.epa.gov/quality1/qa_docs.html

Guidance for the Data Quality Objectives (DQO) Process (EPA QA/G-4). Available through
       http://www.epa.gov/quality1/qa_docs.html

SAB. 1995. EPA's Environmental Technology Innovation and Commercialization Enhancement
       Program (EnTICE) (EPA-SAB-EEC-95-016). August 21, 1995.

SAB. 1998. Science Advisory Board Review of the Agency-Wide Quality Management Program
       (EPA-SAB-EEC-LTR-98-003). July 24, 1998.

SAB. 1999. Science Advisory Board Review of the Implementation of the Agency-Wide Quality
       System (EPA-SAB-EEC-LTR-99-002). February 25, 1999.

   APPENDIX A - DOCUMENTS PROVIDED TO THE TECHNOLOGY
                           EVALUATION SUBCOMMITTEE
       The following documents were provided to the Technology Evaluation Subcommittee on
       compact disk.

Introduction
Note to Reviewers
SAB Charge
Environmental Technology Verification: Review of Quality Management Implementation
ETV Quality Management Plan

Advanced Monitoring Systems Pilot
Pilot Summary
Protocol for SAB Review
Generic Test/QA Plan for Verification of Portable NO/NO2 Emission Analyzers
Generic Test/QA Plan for Verification of On-Line Turbidimeters
Background Material
NO/NO2 Emission Analyzer Technology Profile
Test/QA Plan for Verification of Portable NO/NO2 Emission Analyzers
Test/QA Plan for Verification of On-line Turbidimeters
Portable Emission Analyzer Verification Report
Pilot Quality Management Plan

Air Pollution Control Technologies Pilot
Pilot Summary
Protocol for SAB Review
Generic Verification Protocol for Paint Overspray Arresters
Background Material
Test/QA Plan for Paint Overspray Arresters
Paint Overspray Arrester Verification Report
Pilot Quality Management Plan

Drinking Water Systems Pilot
Pilot Summary
Protocol for SAB Review
Protocol for Equipment Testing for Physical Removal of Microbiological and Particulate Contaminants
(this protocol contains accompanying test plans)
Background Material
Pilot Quality Management Plan

Greenhouse Gas Technologies Pilot
Pilot Summary
Protocol for SAB Review
Natural Gas Compressor Leak Mitigation Technologies
Background Material
Compressor Rod Leak Mitigation Technology Profile
Test/QA Plan for Static Pac® System
Verification Report for Static Pac® System
Pilot Quality Management Plan

Indoor Air Products Pilot
Pilot Summary
Protocol for SAB Review
Large Chamber Test Protocol for Measuring Emissions of VOCs and Aldehydes
Background Material
Test Plan for Emissions of VOCs and Aldehydes from Commercial Furniture
Verification Report: Emissions of VOCs and Aldehydes from Commercial Furniture

P2 Innovative Coatings & Coating Equipment Pilot
Pilot Summary
Protocol for SAB Review
HVLP Equipment Generic Testing and Quality Assurance Protocol
Background Material
HVLP Spray Guns Testing and Quality Assurance Project Plan
HVLP Spray Gun Verification Report
Pilot Quality Management Plan

Site Characterization & Monitoring Technologies Pilot
Pilot Summary
Protocol for SAB Review
Verification Test Design Elements: Evaluation of PCB Field Analytical Techniques
Background Material
Technology Profile
Evaluation of PCB Field Analytical Techniques—Technology Demonstration Plan
PCB Field Analytical Techniques Verification Report
Pilot Quality Management Plan

Update of Pilots that are not Presenting Protocols
P2 Metal Finishing Technologies Pilot
P2, Recycling and Waste Treatment Systems Pilot
Source Water Protection Technologies Pilot
Wet Weather Flow Technologies Pilot
EvTEC (Independent Entity) Pilot

APPENDIX B - SUMMARY OF ELEMENTS OF THE EPA QUALITY
                                        SYSTEM
    The Agency's quality policy is consistent with ANSI/ASQC E-4 and is defined in EPA Order
    5360.1 CHG 1 (1998), the Quality Manual, and the organizational components designed for
    policy implementation, as described in the Agency's Quality System (EPA QA/G-0). The
    Quality System provides the framework for planning, implementing, and assessing the work an
    organization performs to carry out required quality assurance and quality control.

    EPA has a comprehensive system of tools for managing its data collection and use activities to
    assure data quality. The management tools used at the organizational level of the EPA Quality
    System include Quality Management Plans and Management System Reviews. The technical
    tools used at the project level of the EPA Quality System include the Data Quality Objectives
    Process, Quality Assurance Project Plans, Standard Operating Procedures, Technical
    Assessments, and Data Quality Assessment.

    At the management level, the Quality System requires that organizations prepare a Quality
    Management Plan (QMP). The QMP provides an overview of responsibilities and lines of
    authority with regard to quality issues within an organization. Therefore, not only does ETV
    have a QMP, but the verification partners and subcontractors are required to develop and
    implement their own QMPs. The ETV program calls these documents Quality and
    Management Plans.

    Organizations with QMPs review their own performance and develop a Quality Assurance
    Annual Report and Work Plan (QAARWP) that provides information on the previous
    year's QA/QC activities and those planned for the current year. The QAARWP functions as an
    important management tool at the organizational level, as well as at the Agency-wide level when
    QAARWP-supplied information is compiled across organizations.

    At longer, multi-year intervals, EPA conducts periodic Management System Reviews
    (MSRs) of organizations. An MSR consists of a site visit; a draft report that details findings
    and recommended corrective actions; consideration of the reviewed organization's formal
    response to the draft report; and the authoring of a final report.

    At the project level, the data life cycle of planning, implementation, and assessment becomes
    important. The data life cycle begins with systematic planning. EPA recommends that this
    required planning be conducted using the Data Quality Objectives (DQO) Process. The
    DQO Process includes seven steps (an illustrative sketch follows the list):

    1.     State the problem
    2.     Identify the decision
    3.     Identify the inputs to the decision
    4.     Define the study boundaries
    5.     Develop a decision rule
    6.     Specify tolerable limits on decision errors
    7.     Optimize the design
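
    To make the planning step concrete, the following minimal sketch (in Python) shows one
    way the outputs of the seven DQO steps might be recorded for a hypothetical on-line
    turbidimeter verification test. Every name and value in the sketch is an illustrative
    assumption, not EPA or ETV guidance.

        # Hedged illustration only: recording the outputs of the seven DQO
        # steps for a hypothetical turbidimeter verification test. All names
        # and values are assumptions, not EPA or ETV guidance.
        from dataclasses import dataclass

        @dataclass
        class DataQualityObjectives:
            problem: str        # Step 1: state the problem
            decision: str       # Step 2: identify the decision
            inputs: list        # Step 3: identify the inputs to the decision
            boundaries: str     # Step 4: define the study boundaries
            decision_rule: str  # Step 5: develop a decision rule
            error_limits: dict  # Step 6: tolerable limits on decision errors
            design_notes: str   # Step 7: optimize the design

        dqo = DataQualityObjectives(
            problem="Verify the field accuracy of an on-line turbidimeter",
            decision="Does the instrument perform as the vendor claims?",
            inputs=["paired instrument/reference readings", "calibration records"],
            boundaries="0-100 NTU range over a 30-day field deployment",
            decision_rule="Verify the claim if mean relative error is at most 5%",
            error_limits={"false_acceptance": 0.05, "false_rejection": 0.10},
            design_notes="Replicate count chosen to meet the stated error limits",
        )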

    The Quality Assurance Project Plan (QAPP) is the principal output of the DQO Process
    and is the project-specific blueprint for obtaining data appropriate for decision-making. The
    QAPP translates the DQOs into performance specifications and QA/QC procedures for the
    data collectors. In the ETV program the QAPPs are known as Test/QA plans; these provide
    a second level of assurance that the technology verification test will be performed in a manner
    that generates objective and useful information of known quality.
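
    As a hedged illustration of this translation, the short sketch below (continuing the
    hypothetical turbidimeter example) expresses DQOs as measurement performance
    specifications with a simple QC check; the indicator names and numerical limits are
    assumptions, not ETV requirements.

        # Hedged illustration: translating DQOs into performance specifications
        # and a QC check of the kind a Test/QA plan might tabulate. Indicator
        # names and numerical limits are assumptions for illustration only.
        performance_specs = {
            "precision_rsd_pct": 10.0,  # ceiling on relative std. dev. of replicates
            "bias_pct": 5.0,            # ceiling on deviation from a reference standard
            "completeness_pct": 90.0,   # floor on fraction of planned data obtained
        }

        def meets_spec(indicator, measured):
            """Return True if a measured quality indicator satisfies its specification."""
            limit = performance_specs[indicator]
            if indicator == "completeness_pct":
                return measured >= limit  # completeness is a floor
            return measured <= limit      # precision and bias are ceilings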

    The final step in the data life cycle is the Data Quality Assessment (DQA), which determines
    whether the acquired data meet the assumptions and objectives of the systematic planning
    process that resulted in their collection. In other words, the DQA determines whether the data
    are usable because they are of the quantity and quality required to support Agency decisions.
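
    The DQA step can be pictured as a comparison of achieved quality indicators against the
    limits set during planning, as in this minimal sketch; all numbers are fabricated solely to
    illustrate the comparison.

        # Minimal DQA illustration: do the achieved quality indicators meet the
        # limits set during planning? All numbers are illustrative assumptions.
        planned = {"precision_rsd_pct": 10.0, "bias_pct": 5.0}  # ceilings from planning
        achieved = {"precision_rsd_pct": 7.2, "bias_pct": 4.1}  # illustrative outcomes

        usable = all(achieved[k] <= limit for k, limit in planned.items())
        print("Data are of known quality and usable for the decision:", usable)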

                       APPENDIX C - ROSTERS

       Technology Evaluation Subcommittee
       Environmental Engineering Committee (FY00)
       Executive Committee

                     U.S. Environmental Protection Agency
                             Science Advisory Board
                     Environmental Engineering Committee
                     Technology Evaluation Subcommittee

CHAIR
Dr. Michael J. McFarland, Associate Professor, Utah State University, River Heights, UT

EEC MEMBERS
Dr. Edgar Berkey, Vice President and Chief Science Officer, Concurrent Technologies Corporation,
      Pittsburgh, PA

Dr. Calvin C. Chien, Senior Environmental Fellow, E. I. DuPont Company, Wilmington, DE

Dr. Barry Dellinger, Patrick F. Taylor Chair and Professor of Chemistry, Louisiana State University,
      Baton Rouge, LA

Dr. John P. Maney, President, Environmental Measurement Assessment, Hamilton, MA

SAB CONSULTANT
Dr. Gordon Kingsley, Assistant Professor, School of Public Policy, Georgia Institute of Technology,
      Atlanta, GA

Science Advisory Board Staff
Ms. Kathleen White Conway, Designated Federal Official (DFO), Committee Operations Staff,
      Science Advisory Board (1400A), 1200 Pennsylvania Avenue, NW, US Environmental
      Protection Agency, Washington, DC 20460

Ms. Mary Winston, Management Assistant, Committee Operations Staff,  Science Advisory Board
      (1400A),  1200 Pennsylvania Avenue, NW, US Environmental Protection Agency,
      Washington, DC 20460

             U.S. ENVIRONMENTAL PROTECTION AGENCY
                             Science Advisory Board
                 Environmental Engineering Committee (FY00)

CHAIR
Dr. Hilary I. Inyang, Director, Center for Environmental Engineering and Science Technologies
       (CEEST), University of Massachusetts, Lowell, MA

MEMBERS
Dr. Edgar Berkey, Vice President and Chief Science Officer, Concurrent Technologies Corporation,
       Pittsburgh, PA

Dr. Calvin C. Chien, Senior Environmental Fellow, E. I. DuPont Company, Wilmington, DE

Dr. Barry Dellinger, Patrick F. Taylor Chair and Professor of Chemistry, Louisiana State University,
       Baton Rouge, LA

Mr. Terry Foecke, President, Waste Reduction Institute, St. Paul, MN

Dr. Nina Bergan French, President, SKY+, Napa, CA

Dr. Domenico Grasso, Rosemary Bradford Hewlett Professor and Chair, Picker Engineering
       Program, Smith College, Northampton, MA

Dr. Byung Kim, Staff Technical Specialist, Ford Motor Company, Scientific Research Laboratories,
       Dearborn, MI

Dr. John P. Maney, President, Environmental Measurements Assessment, Hamilton, MA

Dr. Michael J. McFarland, Associate Professor, Utah State University, River Heights, UT

Science Advisory Board Staff
Ms. Kathleen White Conway, Designated Federal Official (DFO), Committee Operations Staff,
        Science Advisory Board (1400A), 1200 Pennsylvania Avenue, NW, US Environmental
        Protection Agency, Washington, DC 20460

Ms. Mary Winston, Management Assistant, Committee Operations Staff, Science Advisory Board
       (1400A), 1200 Pennsylvania Avenue, NW, US Environmental Protection Agency,
       Washington, DC 20460

                      U.S. ENVIRONMENTAL PROTECTION AGENCY
                                SCIENCE ADVISORY BOARD
                                  EXECUTIVE COMMITTEE
                                         FY2000

INTERIM CHAIR
Dr. Morton Lippmann, Professor, Nelson Institute of Environmental Medicine, New York University
       School of Medicine, Tuxedo, NY

MEMBERS
Dr. Henry A. Anderson, Chief Medical Officer, Wisconsin Division of Public Health, Madison, WI

Dr. Richard J. Bull, President, MoBull Consulting, Inc., Kennewick, WA

Dr. Maureen L. Cropper, Principal Economist, The World Bank, Washington, DC

Dr. Kenneth W. Cummins, Senior Advisory Scientist, California Cooperative Fishery Research Unit
       and Adjunct Professor, Fisheries Department, Humboldt State University, Arcata, CA

Dr. Linda Greer, Senior Scientist, Natural Resources Defense Council, Washington, DC

Dr. Hilary I. Inyang, University Professor and Director, Center for Environmental Engineering,
       Science and Technology (CEEST), University of Massachusetts Lowell, Lowell, MA

Dr. Janet A. Johnson, Senior Radiation Scientist, Shepherd Miller, Inc., Fort Collins, CO

Dr. Roger E. Kasperson, University Professor and Director, The George Perkins Marsh Institute,
       Clark University, Worcester, MA

Dr. Joe L. Mauderly, Director & Senior Scientist, Lovelace Respiratory Research Institute,
       Albuquerque, NM

Dr. M. Granger Morgan, Head, Department of Engineering & Public Policy, Carnegie Mellon
       University, Pittsburgh, PA

Dr. William Randall Seeker, Senior Vice President, General Electric Energy and Environmental
       Research Corp., Irvine, CA

Dr. William H. Smith, Professor of Forest Biology, Yale University, New Haven, CT

Dr. Robert N. Stavins, Albert Pratt Professor of Business and Government, Faculty Chair,
       Environment and Natural Resources Program, John F. Kennedy School of Government,
       Harvard University, Cambridge, MA

Dr. Mark J. Utell, Professor of Medicine and Environmental Medicine, University of Rochester
       Medical Center, Rochester, NY

Dr. Terry F. Young, Senior Consulting Scientist, Environmental Defense Fund, Oakland, CA

LIAISON FOR CHILDREN'S HEALTH PROTECTION ADVISORY COMMITTEE
Mr. J. Thomas Carrato, Assistant General Counsel, Regulatory Affairs, Monsanto Company, St.
       Louis, MO

LIAISON FOR SCIENCE ADVISORY PANEL
Dr. Ronald Kendall, Director & Professor, The Institute of Environmental & Human Health, Texas
       Tech University/Texas Tech University Health Sciences Center, Lubbock, TX

LIAISON FOR ORD BOARD OF SCIENTIFIC COUNSELORS
Dr. Costel D. Denson, Professor of Chemical Engineering, University of Delaware, Newark, DE

SCIENCE ADVISORY BOARD STAFF
Dr. Donald G. Barnes, Staff Director/Designated Federal Officer, Science Advisory Board (1400A),
       1200 Pennsylvania Avenue, NW, US Environmental Protection Agency, Washington, DC
       20460

Ms. Betty Fortune, Office Assistant, Committee Operations Staff, Science Advisory Board
       (1400A), 1200 Pennsylvania Avenue, NW, US Environmental Protection Agency,
       Washington, DC 20460

-------
       United States Environmental Protection Agency
       Science Advisory Board (1400A)
       Washington, DC
       EPA-SAB-EEC-00-012
       August 2000
       www.epa.gov/sab

       REVIEW OF EPA's ENVIRONMENTAL TECHNOLOGY
       VERIFICATION PROGRAM

       REVIEW BY THE TECHNOLOGY EVALUATION SUBCOMMITTEE
       OF THE ENVIRONMENTAL ENGINEERING COMMITTEE (EEC)