EPA Report No: 600/R-08/009
                                                   January 2008
ENVIRONMENTAL TECHNOLOGY
      VERIFICATION PROGRAM
  QUALITY MANAGEMENT PLAN
           National Risk Management Research Laboratory
              National Exposure Research Laboratory
               Office of Research and Development
              U.S. Environmental Protection Agency
                    Cincinnati, Ohio 45268

                         US Environmental Protection Agency
                   Environmental Technology Verification Program
                          QUALITY MANAGEMENT PLAN
                                     Approvals*
Sally C. Gutierrez, Director, National Risk Management Research Laboratory
    Signed by Sally C. Gutierrez, 07/30/07

Teresa M. Harten, Director, Environmental Technology Verification Program
    Signed by Teresa Harten, 12/09/07

Patricia Erickson, Acting Director, NRMRL Land Remediation and Pollution Control Division
    Signed by Trish Ericson, 08/08/07

James Goodrich, Acting Director, NRMRL Water Supply and Water Resources Division
    Signed by James Goodrich, 08/23/07

Daniel Heggem, Acting Director, NERL Environmental Sciences Division
    Signed by Daniel Heggem, 07/13/07

Frank T. Princiotta, Director, NRMRL Air Pollution Prevention and Control Division
    Signed by Frank T. Princiotta, 07/16/07

Roy Fortmann, Ph.D., Acting Director, NERL Human Exposure and Atmospheric Sciences Division
    Signed by Roy Fortmann, 07/11/07

* Original approval signatures are kept on file.
                               ACKNOWLEDGMENTS

The first draft of the ETV QMP was developed for the pilot period (1995-2000) by a team of
writers consisting of the following quality assurance staff members from the U.S. Environmental
Protection Agency Office of Research and Development National Risk Management Research
Laboratory and National Exposure Research Laboratory: Sam Hayes, Lora Johnson, Ann Vega,
and Jeff Worthington.  Subsequent revisions included significant input in the form of comments
from members of the Environmental Technology Verification Team. Verification organizations
also provided comments. The team of writers was joined by the following EPA staff in
development of the final document: Nancy Adams, Penelope Hansen, Linda Porter, and Shirley
Wasson.

The first revision, completed in 2002, reflected the conversion of the pilot projects to ETV
centers targeted to broader classes of technologies. It was developed by Shirley Wasson,
Teresa Harten, and Nancy Adams with input from the team.

The second revision was completed in 2007 and addressed the inclusion of ESTE projects in the
ETV program, delegation of QA review responsibilities to verification organizations, and
revision of the existing data policy. It was prepared by Robert Wright, Evelyn Hartzell, and
Teresa Harten with input from the ETV team and the verification organizations.
                             TABLE OF CONTENTS


DOCUMENTS AND GENERAL TERMS	vii

ABBREVIATIONS AND ACRONYMS	xv

INTRODUCTION	 1

PART A:   MANAGEMENT SYSTEMS	5

1.0   MANAGEMENT AND ORGANIZATION	5
   1.1    ETV quality policy	5
   1.2    Organization structure	5
   1.3    ETV customer identification and ETV customer needs and expectations	13
   1.4    Management negotiation with verification organizations on constraints	14
   1.5    Resources	 15
   1.6    Authority to stop work  for safety and quality consideration	15

2.0   QUALITY SYSTEM AND DESCRIPTION	 16
  2.1    Authorities and conformance to ANSI/ASQC E4-1994 quality standard	16
  2.2    Quality system documents	 16
  2.3    Quality system scope	 17
  2.4    Quality expectation for products and services	17
  2.5    Quality procedures documentation	 17
  2.6    Quality controls	 18
  2.7    Quality system audits (QSAs)	 18

3.0   PERSONNEL QUALIFICATION AND TRAINING	19
  3.1    Personnel training and qualification procedures	19
  3.2    Formal qualifications and certifications	 19
  3.3    Technical management and training	20
  3.4    Retraining	20
  3.5    Personnel job proficiency	20

4.0   ETV VERIFICATION ORGANIZATION SELECTION	22
  4.1    Planning and control of selection process	22
  4.2    Technical and quality requirements	22
  4.3    Quality specification/conformance	23
  4.4    Peer review of extramural agreements	23
  4.5    Conformance of verification testing efforts	23

5.0   DOCUMENTS AND RECORDS	24
  5.1    Scope	24
  5.2    Preparation, review, approval, and distribution	24
  5.3    Documents and records storage and obsolete documents and records	25

6.0    COMPUTER HARDWARE AND SOFTWARE	27
  6.1    General procedures	27
  6.2    Scope of ETV computer hardware/software procedures	27
  6.3    Configuration testing	28
  6.4    Measurement and testing equipment configurations	28
  6.5    Change assessments - configurations, components, and requirements	28
  6.6    ETV website and ETV database roles and responsibilities	29

7.0    PLANNING	30
  7.1    Systematic planning process	30
  7.2    Planning document review	31

8.0    IMPLEMENTATION OF WORK PROCESSES	32
  8.1    Implementation	32
  8.2    Procedures	32
  8.3    Oversight	32

9.0    ASSESSMENT AND RESPONSE	33
  9.1    Numbers and types of assessments	33
  9.2    Procedures	34
  9.3    Personnel qualifications, responsibility, and authority	35
  9.4    Response	35

10.0     QUALITY IMPROVEMENT	36
  10.1     Annual review for quality improvement	36
  10.2     Detecting and correcting  quality system problems	36
  10.3     Cause-and-effect relationship	36
  10.4     Root cause	37
  10.5     Quality improvement action	37

PART B:   COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA	38

1.0    VERIFICATION PLANNING AND SCOPING	38
  1.1    Systematic planning of the verification test	38
  1.2    Systematic planning for verification testing	39

2.0    DESIGN OF TECHNOLOGY VERIFICATION TESTS	41
  2.1    Design process	41
  2.2    GVPs and test/QA plans: planning documents from the design process	43

3.0    IMPLEMENTATION OF PLANNED OPERATIONS	48
  3.1    Implementation of planning	48
  3.2    Services and items	48
  3.3    Field and laboratory samples	49
  3.4    Data and information management	49
4.0    ASSESSMENT AND RESPONSE	50
  4.1    Assessment types	50
  4.2    Assessment frequency	51
  4.3    Response to assessment	52

5.0    ASSESSMENT AND VERIFICATION OF DATA USABILITY	52
  5.1    Data verification and validation	52
  5.2    Existing data	52
  5.3    Reports reviewed	52

REFERENCES	54

APPENDIX A: U.S. EPA RECORDS CONTROL SCHEDULE	56

APPENDIX B: WHAT CONSTITUTES SUCCESS FOR ETV?	57
  B.1    Timing	57
  B.2    Outcomes	57

APPENDIX C: EXISTING DATA POLICY	58
  C.1    Background	58
  C.2    Requirements for reviewing and using existing data	59

APPENDIX D: RECOMMENDED LANGUAGE FOR SOLICITATION OF VERIFICATION
ORGANIZATIONS	69
                    DOCUMENTS AND GENERAL TERMS

Annual ETV progress report
   The report developed on an annual basis to report implementation of the ETV program.

ANSI/ASQC E4-1994
   American National Standard: Specifications and Guidelines for Quality Systems for
   Environmental Data Collection and Environmental Technology Programs is the national
   consensus standard for quality management systems for environmental programs. It is the
   Agency standard and is applicable to extramural agreements.

ANSI/ASQC E4-2004
   Specifications and Guidelines for Quality Systems for Environmental Data Collection and
   Environmental Technology Programs. It replaces ANSI/ASQC E4-1994 and is more
   compatible with ISO 9000.

Assessment
   The evaluation process used to measure the performance or effectiveness of a system and its
   elements.  As used here, assessment is an all-inclusive term used to denote any of the
   following: quality systems audit, technical systems audit, and audit of data quality. Guidance
   may be found at www.epa.gov/quality/qa_docs.html.

Audit of data quality
   An examination of a set of data after  it has been collected and 100% verified by project
   personnel, consisting of tracing at least 10% of the test data from original recording through
   transferring, calculating, summarizing and reporting. (Note: "10% of the test data" means a
   random selection of 10% of the data from all of the measured parameters.) It is documented
   in a data audit report. The goal is to determine the usability of test results, as defined during
   the design process. Guidance may be found at www.epa.gov/quality/qa_docs.html.

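   For illustration only (this is not part of any ETV procedure; the record fields and helper name
   below are hypothetical), one way to draw the random 10% of the data from every measured
   parameter for tracing is sketched here in Python:

       # Minimal sketch, assuming records are simple dictionaries with a "parameter" field.
       import math
       import random
       from collections import defaultdict

       def select_adq_sample(records, fraction=0.10, seed=None):
           """Randomly select at least `fraction` of the records for every measured parameter."""
           rng = random.Random(seed)
           by_parameter = defaultdict(list)
           for record in records:
               by_parameter[record["parameter"]].append(record)
           selected = []
           for parameter, group in sorted(by_parameter.items()):
               n = max(1, math.ceil(len(group) * fraction))  # at least one record per parameter
               selected.extend(rng.sample(group, n))
           return selected

       # Example: the selected records would then be traced from original recording through
       # transfer, calculation, summary, and the reported values.
       records = [{"parameter": "turbidity", "id": i, "value": 0.3 + 0.01 * i} for i in range(50)]
       records += [{"parameter": "pH", "id": i, "value": 7.0 + 0.02 * i} for i in range(30)]
       audit_subset = select_adq_sample(records, seed=1)
       print(f"{len(audit_subset)} of {len(records)} records selected for tracing")
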
Certify
   To guarantee a technology as meeting a standard or performance criteria into the future.
   Synonyms are ensure, warrant, and guarantee. ETV does not certify technologies.

Data quality indicators
   Quantitative and qualitative measures of principal quality attributes including precision,
   accuracy, representativeness, comparability, completeness and sensitivity. One way to
   employ DQIs is as a means of specifying quality goals or criteria which, if achieved, will
   provide an indication that the resulting data are expected to meet DQOs. Used in this way,
   DQIs provide a metric against which the performance of a program can be measured during
   the implementation and/or assessment phases of a verification test.
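
   For illustration only (the applicable GVP or test/QA plan defines the actual DQIs and their
   acceptance criteria; the function names and example values below are hypothetical), two of the
   commonly quantified indicators might be computed as follows:

       # Minimal sketch of two quantitative data quality indicators.

       def completeness_percent(valid_results: int, planned_results: int) -> float:
           """Completeness: percentage of planned measurements that yielded valid results."""
           return 100.0 * valid_results / planned_results

       def relative_percent_difference(primary: float, duplicate: float) -> float:
           """Precision for a pair of duplicate measurements, expressed as RPD."""
           return 100.0 * abs(primary - duplicate) / ((primary + duplicate) / 2.0)

       # Example: 188 of 200 planned analyses were valid; a duplicate pair read 5.2 and 5.6 mg/L.
       print(f"completeness = {completeness_percent(188, 200):.1f}%")          # 94.0%
       print(f"duplicate RPD = {relative_percent_difference(5.2, 5.6):.1f}%")  # 7.4%
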
Data quality objectives
   The qualitative and quantitative statements derived from the DQO process that clarify a
   study's technical and quality objectives, define the appropriate type of data, and specify
   tolerable levels of potential decision errors that will be used as the basis for establishing the
   quality and quantity of data needed to support decisions. Guidance may be found at
   www.epa.gov/quality/qa_docs.html.

Document
   An instruction, specification, or plan containing information on how the ETV program
   functions, how specific tasks are to be performed, or how specific products or services are to
   be provided.  Examples of documents include the ETV QMP, the center/project QMPs,
   GVPs, and test/QA plans.

DQO process
   A systematic planning process that clarifies test objectives and establishes a basis for the
   types, quality, and quantity of data required to support customers' decisions for use of a
   technology. It provides a method for establishing DQOs for an individual technology
   verification test. Guidance may be found at www.epa.gov/quality/qa_docs.html.

Environmental and Sustainable Technology Evaluations
   An ETV program element that expands ETV's ability to respond directly to EPA's need for
   credible performance information on innovative and commercial-ready technologies with
   potential to address high-risk environmental problems. It continues to maintain the quality
   assurance, cost-sharing, and stakeholder involvement that are fundamental operating
   principles of ETV. Under ESTE, the prioritized categories for verification are chosen by the
   EPA Office of Research and Development (ORD) with program office and/or regional office
   support. Project managers from ORD will direct the verifications using contractor support.
   All environmental technology categories are considered under ESTE, with the exception of
   remediation technologies which are covered under the EPA Superfund Innovative
   Technology Evaluation (SITE) Program.

Environmental data
   Any measurements or information that describe environmental processes, location, or
   conditions; ecological or health effects and consequences; or the performance of
   environmental technology. For EPA, environmental data include information collected
   directly from measurements, produced from models, and compiled from other sources such
   as data bases or the literature.

Environmental technology
   An all-inclusive term that is used to describe pollution control devices and systems, waste
   treatment processes and storage facilities, and site remediation technologies and their
   components that may be used to remove pollutants or contaminants from, or to prevent them
   from entering, the environment. Examples include wet scrubbers (air), soil washing (soil),
   granulated activated carbon unit (water), and filtration (air, water). Usually, this term applies
   to hardware-based systems; however, it can also apply to methods or techniques used for
   pollution prevention, pollutant reduction, or containment of contamination to prevent further
   movement of the contaminants, such as capping, solidification or vitrification,  and biological
   treatment.

Environmental Technology Verification
   This EPA program develops generic verification protocols and verifies the performance of
   innovative environmental technologies that have the potential to improve protection of
   human health and the environment. It was created to accelerate the entrance of new
   environmental technologies into the domestic and international marketplace. It also verifies
   monitoring and treatment technologies relevant for water security.

EPA directors of quality assurance
   QA directors for the EPA ORD laboratories, the National Exposure Research Laboratory and
   the National Risk Management Research Laboratory

EPA quality managers
   The EPA QA employees who are designated by EPA line management to manage QA efforts
   on behalf of the ETV center project officer or ESTE project manager. This is usually the
   Division's QA Manager.

EPA line management
   The management structure (i.e., branch chief, division director, and laboratory director) to
   whom each ETV center project officer or ESTE project manager reports.

EPA review/audit reports
   The quality records that are developed by EPA as  a result of conducting assessments during
   ETV implementation.

ESTE project
   An environmental technology verification that is initiated by an ORD researcher,
   collaborating with program office or regional office partners under the ESTE effort.

ESTE project manager
   The EPA employee who is designated by EPA line management to serve as the lead for an
   individual ESTE project.

ETV centers
   The organizations under the Environmental Technology Verification (ETV) Program that are
   responsible for verifying the performance of innovative technologies that have the potential
   to improve protection of human health and the environment. The centers, under the lead of
   an ETV center project officer from EPA, work with the verification organizations. ETV
   currently operates five centers that test and evaluate the performance of environmental
   technology in all environmental media: air, water, and land.
ETV center project officer
   The EPA employee who is designated by EPA line management to serve as the lead for an
   individual ETV center.

ETV coordination staff
   The EPA employees who work directly with the ETV director to coordinate and implement
   outreach and evaluation activities at the program level.

ETV extramural agreement
   The contractual record that is developed by the EPA and signed by the verification
   organization.

ETV director
   The EPA employee who is designated by EPA ORD to lead the ETV team.

ETV team
   EPA employees actively working on the ETV program; the ETV director, ETV coordination
   staff, ETV center project officers, ESTE project managers, EPA directors of quality
   assurance (currently these are the NRMRL and NERL Directors of Quality Assurance), and
   the EPA quality managers are core members.

ETV test objective
   The stated objective(s) of each test. Verification organizations use the DQO process or
   systematic planning to establish test objectives and test measurement quality criteria.

ETV verification
   A verification test that is performed only by the EPA ETV program.

ETV verification report
   The report of the result of an individual verification test and/or accepted existing data.

ETV verification statement
   A summary statement, developed by the verification organization and approved by the EPA
   laboratory director, which reports individual technology performance.

ETV webmaster
   The person designated by EPA line management with responsibility for establishing and
   maintaining the ETV website.

Evaluate
   To carefully examine and judge the efficacy of a technology; to submit technologies for
   testing under conditions of observation and analysis. Synonyms are measure, estimate,
   classify, and test. ETV does evaluate technologies.
Existing data
   Data or information that a project plans to use but that have not been newly generated by
   that project. Existing data may also be known as secondary data or non-direct measurements.

Generic verification protocol
   (Also known as guideline document, generic test protocol, or protocol).  This document is
   developed, modified, or selected to promote uniform testing procedures by the verification
   organization for a single class of technologies. Adequate documentation of a robust GVP
   may allow the development of abbreviated individual test/QA plans which incorporate the
   GVP by reference. GVPs may retain draft status until verification testing is performed; they
   are then finalized, building on the testing experience.

Independent assessment
   An assessment performed by a qualified individual, group, or organization that is not part of
   the organization directly performing and accountable for the work being assessed.

Internal assessment
   An assessment of work conducted by individuals, groups, or  organizations directly
   responsible for overseeing and/or performing the work.

Laboratory director
   The directors of EPA ORD laboratories, the National Exposure Research Laboratory and the
   National Risk Management Research Laboratory.

Metadata
   Data that describe and define other data; what makes metadata different from ordinary data
   is how they are used. Metadata describe a data set, including the content, quality, and
   condition of the data, and answer questions such as: Who owns the data? How were the data
   collected? How current are the data? Metadata are a set of facts about data and other
   information elements; they are everything except the data themselves.

Office of Research and Development Assistant Administrator
   The administrative employee who directs EPA's Office of Research & Development.

Peer review
   A documented critical review, which is conducted by qualified individuals (or organizations)
   who are independent of those who performed the work but who are collectively equivalent in
   technical expertise (i.e., peers) to those who performed the original work.  A peer review is
   conducted to ensure that activities are technically adequate, competently performed, and
   properly documented and that they satisfy established quality requirements.  It is an in-depth
   assessment of the assumptions, calculations, extrapolations, alternative interpretations,
   methodology, acceptance criteria, and  conclusions pertaining to the specific major scientific
   and/or technical work product and of the documentation that supports them.
Performance evaluation audit
   A quantitative evaluation of a measurement system that usually involves the measurement or
   analysis of a reference material of known value or composition.
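
   For illustration only (the acceptance criterion shown is hypothetical; actual criteria come from
   the applicable GVP or test/QA plan), the core calculation of a performance evaluation audit
   might look like this:

       # Minimal sketch: compare a measurement system's result for a reference material
       # against the certified value and express the bias as a signed percent difference.

       def percent_difference(measured: float, certified: float) -> float:
           return 100.0 * (measured - certified) / certified

       measured_conc = 48.2    # analyzer response to the reference material (e.g., ppm)
       certified_conc = 50.0   # certified concentration of the reference material (e.g., ppm)
       bias = percent_difference(measured_conc, certified_conc)
       print(f"bias = {bias:+.1f}% (example criterion: within +/-10%)")  # bias = -3.6%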

Program support contractor
   The contractor selected to assist with the generation of the annual report.

Quality assurance
   An integrated system of management activities involving planning, implementation,
   documentation, assessment, reporting, and quality improvement to ensure that a process,
   item, or service is of the type and quality that is needed and expected.

Quality control
   The system of technical activities that measures the attributes and performance of a process,
   item, or service against defined standards to verify that they meet stated requirements;
   operational techniques and activities that are used to fulfill  requirements for quality.

Quality management plan
   The specific policies and procedures that have been established for managing quality-related
   activities in the ETV program. It is the "blueprint" that defines an organization's QA policies
   and procedures; the criteria for and areas of QA application; and the different QA-related
   roles, responsibilities, and authorities of personnel.

Quality system
   A structured and documented management system describing the policies, objectives,
   principles, organizational authority, responsibilities, accountability, and implementation plan
   of an organization for ensuring quality in its work processes, products (items), and services.
   It provides the framework for planning, implementing, documenting, and assessing work
   performed by the organization and for carrying out required QA and QC activities.

Quality system audit
   An on-site review of the implementation of a verification organization's quality system as
   documented in the approved QMP. This review is used to verify the existence of, and
   evaluate the adequacy of, the internal quality system. It is the qualitative assessment of data
   collection operations and/or organization(s) to evaluate the adequacy of the prevailing quality
   management structure, policies, practices, and procedures for obtaining the type and quality
   of data needed. Guidance may be found at www.epa.gov/quality/qa_docs.html.

Raw data
   All data and information recorded in  support of analytical and process measurements made
   during planning, testing, and assessing environmental technology including records such as:
   computer printouts, instrument run charts, standards preparation records, field log records,
   technology operation logs, and monitoring records. Raw data include the ETV test files (all
   records, including raw data) and the technical data and associated quality control data that
   support the summarized data and the conclusions in each ETV verification report.

Record
   A statement of data and facts pertaining to a specific event, process, or product, that provides
   objective evidence that an activity has occurred. All books, papers, maps, photographs,
   machine readable materials, or other documentary materials, regardless of physical form or
   characteristics, made or received by the EPA or a verification organization or their
   designated representative for the ETV program are records. Examples include verification
   statements and reports, raw and summary data tables, data notebooks, audit reports, and
   stakeholder meeting minutes.

Stakeholder groups
   Groups set up for each ETV center and ESTE project consisting of representatives of any or
   all of the following groups: buyers and users of technology, technology developers/vendors,
   consulting engineers, the finance and export communities, government permitters,
   regulators, first responders, emergency response and disaster planners, public interest
   groups, and other groups interested in the performance of innovative environmental technologies.

Standard operating procedures
   Procedures describing routine verification activities including sample collection, analytical
   testing, and associated verification processes.

Technical systems audit
   A qualitative on-site evaluation of sampling and/or measurement systems.  The objective of
   the TSA is to assess and document acceptability of all facilities, maintenance, calibration
   procedures, reporting requirements, sampling and analytical activities, and quality control
   procedures. An approved test/QA plan provides the basis for the TSA.  Guidance may be
   found at www.epa.gov/quality/qa_docs.html.

Test measurement
   Those critical measurements that must be made during the course of a verification test to
   evaluate achievement of the ETV test objective.

Test/QA plan
   The plan developed by a verification organization for each individual test of a technology or
   technology class. Therefore, the test/QA plan may include more than one technology. The
   test/QA plan provides the experimental approach with clearly stated test objectives and
   associated quality objectives for the related measurements.  The test/QA plan may incorporate
   or reference existing GVPs or provide the basis for refining draft GVPs. Guidance may be
   found at www.epa.gov/quality/qa_docs.html.

Verification (data verification)
   The process of evaluating the completeness, correctness, and conformance/compliance of a
   specific data set against the method, procedural, or contractual requirements. Guidance may
   be found at www.epa.gov/quality/qa_docs.html.
Verification (ETV verification)
   Establishing or proving the truth of the performance of a technology under specific,
   predetermined criteria, test/QA plans or GVPs, and adequate data QA procedures.

Verification organizations
   The public and private sector organizations holding cooperative or interagency agreements or
   contracts to assist EPA in implementing the ETV program.

Verification organization manager
   The person designated by the verification organization to manage the ETV center and ESTE
   project and serve as the chief point of contact with the EPA.

Verification organization quality manager
   The person designated by the verification organization to manage QA for the ETV center and
   ESTE project on behalf of the verification organization manager.

Verification organization quality management plan
   The documented procedures for quality-related activities developed and implemented by the
   verification organization to assure quality in the work processes and services developed for
   ETV. If the verification organization has a current quality system that conforms to EPA
   quality requirements, additional quality system elements do not need to be developed.

Verification report
   Contains the technology description, how the tests were conducted, performance verification
   test results, statistics and summary. The report is developed for each verified technology.

Verification statement
   A statement signed by the EPA laboratory director and the VO manager. The statement
   contains the summary (2 to 7 pages) of the test results for a given environmental technology.

Verification test
   The simultaneous performance evaluation of one or more similar environmental technologies
   that is documented in a single test/QA plan and in one or more verification statements and
   verification reports. Multiple verification tests, each with its own test/QA plan, may be
   conducted at different times under a  single GVP.

Verify
   To establish or prove the truth of the performance of a technology under specific,
   predetermined criteria or protocols and adequate data quality assurance procedures.
   Synonyms are confirm, corroborate,  substantiate, and validate. ETV does verify
   technologies.
                    ABBREVIATIONS AND ACRONYMS

   ACE      any credible evidence
   ADQ      audit of data quality
   ANSI     American National Standards Institute
   ASQ      American Society for Quality
   CA       cooperative agreements
   CMD     Contracts Management Division
   DEP      data evaluation panel
   DQO      data quality objective
   EPA      U.S. Environmental Protection Agency
   ESTE     Environmental and Sustainable Technology Evaluations
   ETV      environmental technology verification
   FBO      FedBizOpps (formerly Commerce Business Daily)
   FRC      Federal Records Center
   FTE      full time equivalent
   GAD      Grants Administration Division
   GVP      generic verification protocol
   IAG      interagency agreement
   IGE      independent government estimate
   IQGs      Information Quality Guidelines
   ISO      International Organization for Standardization
   NERL    ORD's National Exposure Research Laboratory
   NRMP    National Records Management Program
   NRMRL  ORD's National Risk Management Research Laboratory
   OAQPS   EPA's Office of Air Quality Planning and Standards
   OMIS     ORD's Management Information System
   ORD      EPA's Office of Research and Development
   OSHA    Occupational Safety and Health Administration
   PEA      performance evaluation audit
   PARS     EPA's Performance Appraisal and Recognition System
   PO       project officer
   QA      quality assurance
   QAPP    quality assurance project plan
   QSA      quality systems audit
   QC       quality control
   QMP      quality management plan
   SOP      standard operating procedure
   SOW     statement of work
   TSA      technical systems audit
                                 INTRODUCTION

Background
The Environmental Technology Verification Program (ETV) was established by the U.S.
Environmental Protection Agency (EPA) to evaluate the performance characteristics of
innovative environmental technologies across all media and to report objective performance
information to the permitters, buyers, and users of environmental technology. ETV evolved in
response to the following mandates:

-   a 1995 Presidential directive to EPA in Bridge to a Sustainable Future: National
    Environmental Technology Strategy, to "work with the private sector to establish a market-
    based verification process . . . which will be available nationally for all environmental
    technologies within three years."
-   goals articulated in the Administration's Reinventing Government: A Performance Review,
    which directed EPA to begin a comprehensive environmental technology verification
    program no later than October 1995.
-   Congressional appropriation language, contained in the FY96 and FY97 budgets, that the
    Agency fund technology verification activities at the $10 million level in each year.

To comply with these directives, EPA's Office of Research and Development (ORD) established
a five-year pilot program to evaluate alternative operating parameters and determine the overall
feasibility of a technology verification program. ETV began the five-year pilot period in October
1995. At the conclusion of the pilot period, the Agency prepared a Report to Congress containing
an evaluation of the results of the pilot program and recommendations for its future operation.

Credible, high-quality performance information is one of the tenets of ETV.  Therefore, the
highest appropriate level of QA is used throughout the program. The EPA's Office of Research
and Development, under which ETV operates, has implemented an Agency-wide quality system
to assure that  activities conducted in EPA research laboratories and other facilities or at facilities
being operated on behalf of or in cooperation with the EPA are supported by data of known and
acceptable quality for their intended use. Each of the ORD laboratories involved in ETV, the
National Risk Management Research Laboratory (NRMRL) and the National Exposure Research
Laboratory (NERL), operate under laboratory-specific quality management plans (QMPs). The
ETV QMP is  consistent with the policies expressed in the individual laboratory QMPs and is
intended to provide an overarching, uniform quality system for all aspects of the ETV program.

Program Description
Developers of innovative environmental technology report numerous impediments to
commercialization. Among those most frequently mentioned is the lack of acceptance of
technology developer/vendor performance claims. The success of the pilot program shows that
objective, independently acquired, high-quality performance data and operational information on
new technologies significantly facilitate the use, permitting, financing, export, purchase, and
general marketplace acceptance of such technologies. ETV provides these data and information
to the customer groups that require them to accelerate the real world implementation of improved
technology. Improved technology more thoroughly, rapidly, and efficiently protects human
health and the environment. It is important to stress that the product of ETV is high-quality data
and information, not technology approval or endorsement. Although there is substantial EPA
involvement in guiding and administering this program, ETV does not provide EPA endorsement
or certification of commercial products.

At the conclusion of the pilot period the Agency internally reviewed the performance and
operation of the program to assess its future direction and scope. The ETV Director
recommended consolidation of the program into six technology centers:
       Advanced Monitoring Systems Center (AMS)
       Air Pollution Control  Technology Center (APCT)
   -   Greenhouse Gas Technology Center (GHG)
   -   Drinking Water Systems Center (DWS)
   -   Water Quality Protection Center (WQP)
   -   Pollution Prevention, Recycling and Waste Treatment Center (P2,R,WT)

During 2000 and 2001 the first five centers above were established. The sixth center was not put
in place due to a lack of adequate funding to support it. However, the Coatings and Coating
Equipment Pilot Program, which was part of the P2,R,WT, continues to operate.

Environmental and Sustainable Technology Evaluations (ESTE)
In 2005, the program introduced a new way of operating to expand ETV's ability to respond
immediately and directly to high priority Agency problems. Under the new ETV program
element, Environmental and Sustainable Technology Evaluations (ESTE), the verifications are
initiated by ORD researchers, collaborating with program office or regional office partners.
ESTE maintains the quality assurance, cost sharing and stakeholder involvement of the original
ETV program, with the categories for verification chosen by  the Agency. Office of Research
and Development (ORD) project managers direct the verifications using contractor support. A
three-step process is followed.

In Step One, technology categories for verification are chosen through an ORD competitive
process. All environmental technology categories, except remediation technologies, are
considered. A review committee recommends prioritized categories for verification.

In Step Two, funding is provided to the ESTE project manager to conduct a scoping study which
identifies the commercial-ready technologies and their vendors within the category, confirms
collaborative partners and identifies others, develops a multi-interest stakeholder group to  assist
in developing and reviewing a GVP and/or test/QA plan, develops and approves the GVP  and/or
test/QA plan, and develops the budget for actual verification.

In Step Three, the ESTE project manager with contractor assistance conducts testing and
finalizes peer-reviewed and quality-assured verification reports and statements. ESTE projects
may also involve technology transfer of verification results, and follow-up with the technology
user and permitting community to determine usefulness of the verification information in
decision-making. Projections to document policy-making and pollutant reduction outcomes of
the research may also be conducted under ESTE.

Operation of the ETV Centers and ESTE Projects
The verification organizations work with or for EPA through an extramural agreement (a
cooperative agreement (CA), an interagency agreement (IAG), or an existing contract). Each
agreement is overseen by an EPA project officer, who may also be the ETV center project
officer or ESTE project manager. EPA provides substantial oversight through an active QA
program. Each verification organization is contractually required to fully implement EPA QA
requirements for planning, auditing, and documenting the testing and reporting activities. If the
verification organization intends to perform verifications by contracting or sub-contracting with
other organizations, EPA quality requirements and all of the controls incumbent upon the
verification organization that are specified in Section 4.1 pass through to the contractor or sub-
contractor, and the verification organization is responsible for ensuring that these controls are in
place.  Qualified reviewers review the technical aspects of the test/QA plans and final reports.

Program and Quality Management Document
The ETV QMP is the program and quality management document being used by ETV to guide
its operation.  It uses the structure, policies, and standards established in the American National
Standard Specifications and Guidelines for Quality Systems for Environmental Data Collection
and Environmental Technology Programs (ANSI/ASQC E4-1994). This latter document, ". . .
describes a basic set of mandatory specifications and non-mandatory guidelines by which a
quality system for programs involving environmental data collection and environmental
technology can be planned, implemented, and assessed". Based on the structure and standards of
ANSI/ASQC E4-1994, the ETV QMP contains the definitions, procedures, processes, inter-
organizational relationships, and outputs that assure the quality of both the data and the
programmatic elements of ETV. Part A of the ETV QMP contains the  specifications and
guidelines that are applicable to common or routine quality management functions and activities
necessary to support the ETV program. Part B contains the specifications and guidelines that
apply to test-specific activities involving the generation,  collection, analysis, evaluation, and
reporting of environmental data.

The ETV QMP is designed to play a major role in clearly delineating the roles and
responsibilities of all the diverse and important participants. The ETV program is
organizationally complex. Within EPA, the program is coordinated through ORD's ETV Team,
consisting of staff from ten branches located in five divisions in two laboratories, NRMRL and
NERL, including the QA staff assigned to each organizational element. There are also numerous
outside organizations involved through the extensive stakeholder process, the verification
organizations who bear most of the QA responsibilities, and testing and consulting companies
hired by verification organizations to conduct field and laboratory work. Finally, EPA program
offices and regions are increasingly involved in outreach activities, as are other Federal agencies
and states.

Each verification organization uses the ETV QMP and ANSI/ASQC E4-1994 to create ETV
center-specific or ESTE project-specific QMPs that assure that the testing and evaluation efforts
carry the appropriate level of QA to meet the needs of the users of the performance information.
These QMPs are submitted to EPA for review and approval at the outset of the operation and are
reviewed annually with the review of the ETV QMP. The annual reviews incorporate lessons
learned from the experiences of the ETV centers and ESTE projects and feedback from the
program's customers, and they accommodate any policy or programmatic changes.

Note: If a verification organization already has a QMP for an ETV center that conforms to the
ETV QMP (this document) and ANSI/ASQC E4-1994, it need not prepare another QMP
specifically for an ESTE project.  Some modifications may be needed, however, to address
differences that may be introduced by using a different extramural mechanism (e.g., contracts) to
perform an ESTE project. In general these modifications may be handled using an addendum to
the QMP or within an ESTE joint QMP/test/QA plan.
                     PART A:   MANAGEMENT SYSTEMS

Part A of the ETV QMP contains the specifications and guidelines that are applicable to common
or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV QMP contains the specifications and guidelines that apply to test-specific
environmental activities involving the generation, collection, analysis, evaluation, and reporting
of test data.

Note: The italicized text following the section headings refers to requirements of ANSI/ASQC
E4-1994 and the Environmental Technology Verification Program Policy Compendium.

                 1.0    MANAGEMENT AND ORGANIZATION

1.1    ETV quality policy
The Office of Research and Development shall establish and implement a quality policy to ensure that the Environmental
Technology Verification (ETV) program produces the type and quality of program outputs needed and expected by ETV
clients.

The EPA Office of Research and Development's (ORD) quality policy for the Environmental
Technology Verification (ETV) program is established as follows:

    The quality system for the overall ETV program seeks to be consistent with industry
   consensus standards. Each verification organization shall implement a valid and approved
   quality system. The Agency's required quality system for cooperative agreements and
   contracts is ANSI/ASQC E4-1994. Each verification test will be performed according to
   planned and documented, pre-approved test/QA plans. All technical statements in ETV
   verification reports shall be supported by the appropriate data.

1.2    Organization structure
The relevant organizations, functional responsibilities, levels of accountability and authority, and lines of communication
shall be formally defined in the quality system and approved by the EPA laboratory directors responsible for the quality of
work performed by or in cooperation with each EPA laboratory.

The overall organizational structure of the ETV program graphically presents lines of
accountability, authority, and communication. The general functional responsibilities for the
major organizational units are specified in the structure.  See the organization chart for ETV
verifications in Figure 1. Note: the relationship of the stakeholder group to the rest of the ETV
organization may vary from that depicted in Figure 1 for different ESTE projects.

1.2.1   Assistant administrator for ORD and the EPA administrator responsibilities:
       provide overall program direction
       serve in a program leadership role with Congress, other agencies of the Executive
       Branch, and the general public

1.2.2   ORD laboratory directors' responsibilities:
   -   approve and implement annual program budgets and resource allocations
   -   allocate laboratory personnel and other resources to accomplish ETV's goals

   Currently, the following responsibilities are assumed by the NRMRL lab director:
   -   appoints the ETV director
   -   selects ESTE verification projects, unless delegated to the ETV director
   -   approves all ETV verification reports and statements
   -   approves the ETV QMP
   -   ensures that appropriate program quality system assessments (see Table 9.1) are
       implemented

1.2.3   EPA line management's responsibilities:
   -   allocate appropriate personnel and other necessary resources to support the ETV centers
       and ESTE projects associated with the division
   -   appoint an EPA quality manager for each ETV center and ESTE project
   -   approve the ETV QMP
   -   provide oversight via administrative and technical review of ETV center and ESTE
       project outputs and products prior to public release
   -   review and approve verification reports and statements

1.2.4   ETV director responsibilities:
   -   leads the ETV team by providing communication opportunities, e.g., periodic conference
       calls, meetings, and training to both the team and the verification organizations
       coordinates the overall ETV program, including design of multi-year strategies, operating
       principles, implementation activities, and annual budgets
   -   communicates ETV team and program activities, progress, outputs, and recommendations
       to EPA, Congress, agencies in the Executive Branch, customer groups, and the general
       public
       reviews and approves the annual report
   -   maintains an up-to-date ETV website containing materials relevant to the program and to
       each ETV center and ESTE project
       manages overall ETV program outreach activities to ensure that stakeholder and
       customer groups are knowledgeable about the existence and use of ETV generated data
   -   collects data on operational parameters and program outputs to continuously evaluate the
       ETV program and make recommendations to management and the Congress on its
       present and future operation.
       reviews, approves, and assists in revision of ETV QMP
       ensures ETV QMP is implemented in the ETV program
   -   reviews CA, IAG, and contracting packages (e.g., statements of work [SOWs],
       independent government estimates [IGEs], and approval packages), including ESTE task
       orders and work assignments
   -   reviews ETV verification reports and statements and GVPs
   [Figure 1, the ETV program organizational chart, appears here as a graphic. It depicts the lines of
   accountability, authority, and communication among the EPA organizations (the NERL and NRMRL
   laboratory directors and directors of quality assurance, EPA line management, the ETV director, the
   ETV center project officers/ESTE project managers, and the ETV coordination staff), the private-sector
   verification organizations (VO manager, VO quality manager, and their subcontractors and analytical
   laboratories), and the stakeholders and technology developers/vendors.]

       1.  LSAS - Laboratory Support Accountability Staff
       2.  ET&V - Environmental Technology Assessment & Verification
       3.  ESTE - Environmental and Sustainable Technology Evaluation

Figure 1. ETV Program Organizational Chart *
   (* The organizational chart does not necessarily reflect the lines of responsibility; it is a summary.
A detailed explanation of the roles and responsibilities is provided in Section 1.2.)

1.2.5   ETV team responsibilities:
   -   establishes mutually acceptable program-level strategies and protocols
   -   participates in development of overarching ETV outreach and outcomes strategies
       communicates ETV center-specific and ESTE project-specific progress, issues,
       difficulties, and lessons learned
   -   meets to discuss program objectives, seek collegial guidance, and evaluate success
   -   reviews ETV QMP
       prepares annual report

1.2.6   ETV coordination staff responsibilities:
   -   oversee and perform programmatic outreach, including web site development, conference
       exhibitions, publications (journal articles, fact sheets, brochures,  etc.), and
       workshops/outreach events
   -   compile and  document ETV outcomes, in coordination with ETV centers and ESTE
       project staff
       evaluate and communicate programmatic progress, issues, difficulties, and lessons
       learned
   -   document and interpret ETV policies and procedures
       host team training events, meetings, and conference calls
       review ETV QMP and track implementation
   -   prepare annual report

1.2.7   ETV center project officer responsibilities:
   -   oversees verification organizations
   -   has overall responsibility for the quality of verification tests conducted by the ETV center
       verification organization
   -   reviews and approves verification organization QMP, ETV verification reports and
       statements, and generic verification protocols (GVPs)
   -   reviews and approves test/QA plans. The responsibility for reviewing and approving
       certain test/QA plans may be delegated to the verification organization quality manager.
       Section 2.2.2.1, Part B discusses the delegation process for test/QA plans.
   -   communicates requirements for and oversees the verification organization quality system
       and ETV policies and procedures
   -   arranges for peer review of verification organization proposals and ETV verification
       reports and statements
   -   attends and/or conducts regular meetings with stakeholders
   -   reviews minutes from stakeholder meetings
   -   oversees the production and approval process of GVPs, test/QA plans, and ETV verification
       reports and statements
   -   assists with ETV outreach activities for the assigned ETV center
   -   participates in ETV team activities, including the development of the annual report and
       input to the ETV online database
   -   ensures that appropriate center-specific assessments (see Table 9.1) are implemented
   -   reviews independent QA document reviews and assessment reports by the EPA quality
       manager
   -   reviews verification organization internal QA reviews and assessment reports (e.g.,
       test/QA plan reviews, internal assessment reports, audits of data quality)
   -   assists in verification outcomes projection at the start of the verification process and helps
       to track outcomes
   -   communicates ETV results and outcomes to EPA line management, EPA program
       offices, states, local government, and the public.

1.2.8   ESTE project manager responsibilities:
   -   attends and/or conducts regular meetings with stakeholders
   -   reviews minutes from stakeholder meetings
   -   manages the oversight and conduct of verification activities
   -   oversees verification organizations
   -   has overall responsibility for the quality of verification tests conducted by the ESTE
       project
   -   selects technologies for verification based on input from EPA labs, regions, and program
       offices, technology developers/vendors, and other stakeholders
   -   communicates requirements for and oversees the verification organization quality system
       and ETV policies and procedures
   -   arranges for peer review of verification organization proposals and ETV verification
       reports and statements
   -   reviews and approves verification organization QMP
   -   oversees the production and approval process of test/QA plans, GVPs, and ETV verification
       reports and statements
   -   reviews and approves test/QA plans, GVPs, and verification reports and statements
   -   reviews independent QA document reviews and assessment reports by the EPA quality
       manager
   -   reviews verification organization internal QA reviews and assessment reports (e.g.,
       test/QA plan reviews, internal assessment reports, audits of data quality) and initiates
       corrective actions
   -   reviews and approves existing data submitted by third-party testing organizations or
       technology developers/vendors
   -   conducts outreach activities for the ESTE project
   -   participates in ETV team activities, including the development of the annual report and
       input to the ETV online database
   -   ensures that project-specific assessments (see Table 9.1) are implemented
   -   projects outcomes at the start of the verification process and tracks actual outcomes after
       completion of verification tests
   -   communicates ETV results and outcomes to EPA line management, EPA program
       offices, states, local government, and the public.

1.2.9   ETV center verification organization responsibilities:
   -   establishes, attends, and/or conducts meetings of stakeholders
       mediates  and facilitates the stakeholders' selection of technology focus areas
       documents stakeholder meetings with minutes to reflect discussions and decisions
   -   maintains communication with EPA to assure mutual understanding and conformance
       with EPA quality procedures and ETV policies and procedures
       manages the oversight and conduct of verification activities
       assures that QA/QC procedures are incorporated into all aspects of each verification test
   -   develops  DQOs based on input from stakeholders and ETV center project officer
   -   develops  a center QMP, test/QA plans, and GVPs based on input from stakeholders, ETV
       center project officer, and developers/vendors
       ensures that all subcontractors and analytical labs conform to the requirements of the
       GVP and/or test/QA plan
       solicits technology developer/vendor proposals or developer/vendor products
       selects technologies for verification based on input from stakeholders and technology
       developers/vendors
   -   develops  agreements for verification tests with technology developers/vendors and other
       collaborators, including cost-sharing agreements
       conducts  ETV verification tests and other ETV activities within their documented and
       approved QMP, GVPs, and test/QA plans
   -   reviews internal QA reviews and assessment reports (e.g., test/QA plan reviews, internal
        assessment reports, audits of data quality) and initiates corrective actions
   -   reviews independent QA document reviews and assessment reports by EPA quality
       manager
       prepares ETV verification reports on verification tests
   -   prepares a three-to-five page ETV verification statement at the completion of each
       verification test
       appoints a verification organization quality manager who is independent of those
       generating project information
   -   reviews and internally approves existing data submitted by third-party testing
       organization and technology developers/vendors
       provides input for quarterly and annual reports
       assists with ETV outreach activities for the assigned ETV center
   -   participates in meetings, conference calls and other activities with the ETV team
   -   projects and tracks verification outcomes
       submits a written request to the EPA center project officer and the EPA quality manager,
       who then forwards it to the EPA director of quality assurance, that the responsibility for
       reviewing and approving test/QA plans be delegated to the verification organization quality
       manager.

1.2.10 ESTE project verification organization responsibilities:
    -   communicates with EPA to assure mutual understanding and conformance with EPA quality procedures and expectations and ETV policies and procedures
    -   establishes, attends, and/or conducts meetings of stakeholders
    -   mediates and facilitates the stakeholders' input on technology focus areas
    -   documents stakeholder meetings with minutes to reflect discussions and decisions
    -   assures that QA procedures are incorporated into all aspects of each ETV project
    -   solicits technology developer/vendor proposals or developer/vendor products
    -   develops agreements for verification tests with technology developers/vendors and other collaborators
    -   develops DQOs based on input from stakeholders and ESTE project manager
    -   develops, conducts, and/or oversees test/QA plans in cooperation with technology developers/vendors
    -   develops GVPs based on input from stakeholders, ESTE project manager, and developers/vendors
    -   ensures that all subcontractors and analytical labs conform to the requirements of the GVP and test/QA plan
    -   conducts ETV verification tests and other ETV activities within their documented and approved QMP
    -   prepares ETV verification reports on verification tests
    -   prepares a three-to-five page ETV verification statement at the completion of each technology verification
    -   appoints a verification organization quality manager who is independent of those generating project information
    -   projects outcomes at start of verification process and tracks actual outcomes of verification tests
    -   reviews and approves internal QA reviews and assessment reports (e.g., test/QA plan reviews, internal assessment reports, audits of data quality)
    -   reviews independent QA document reviews and assessment reports by EPA quality manager
    -   reviews and internally approves existing data submitted by third-party testing organization and technology developers/vendors

1.2.11  Subcontractor or analytical laboratory responsibilities:
    -   maintains communication with ETV center/ESTE project verification organization to assure mutual understanding and conformance with requirements of GVP and test/QA plan
    -   performs QA/QC procedures during technical or analytical activities
    -   conducts technical or analytical procedures in support of ETV center/ESTE project verification tests under technical oversight of ETV center/ESTE project verification organization
    -   reports technical or analytical results with associated QC check results to ETV center/ESTE project verification organization

1.2.12  Stakeholders' group responsibilities may include the following:
    -   attends stakeholder meetings
    -   reviews minutes from stakeholder meetings
    -   assists in development of GVPs, test/QA plans, and DQOs
    -   assists in prioritizing the types of technologies to be verified
    -   reviews project-specific procedures, test/QA plans, GVPs, and ETV verification reports emerging from the ETV center or ESTE project
    -   assists in the definition and conduct of outreach activities appropriate to the technology area and customer groups
    -   serves as information conduits to the particular constituencies that each member represents
    -   assists in verification outcomes projection at start of verification process and helps track outcomes, especially regulatory outcomes following verification process

1.2.13  Developers/vendors responsibilities:
    -   review GVPs and test/QA plans
    -   review verification reports and verification statements
    -   provide comment section input to verification reports and verification statements
    -   direct third-party testing organization to submit existing data to verification organization

1.2.14  EPA directors of quality assurance responsibilities:
    -   develop and implement the ETV quality system at the direction of the ETV director and in coordination with EPA quality managers and the ETV team
    -   document the ETV quality system in the ETV QMP with input from the ETV director and EPA quality managers
    -   review and update, if necessary, the ETV QMP annually in cooperation with the ETV director and EPA quality managers
    -   work with EPA quality managers to ensure implementation of the ETV QMP
    -   provide current copies of the ETV QMP to the appropriate participants in the ETV program
    -   communicate quality issues and information to the ETV team in a timely manner
    -   conduct internal quality systems audits (QSAs) of the ETV program
    -   work with the EPA quality manager on the delegation of responsibility for the approval of test/QA plans to a verification organization's quality manager when a written request for delegation has been submitted by the VO
    -   may approve or reject the delegation of test/QA plan review and approval responsibilities to a verification organization's quality manager, after discussion with the EPA quality manager, provided that EPA has previously reviewed and approved a test/QA plan for a very similar technology, that the verification organization demonstrates it has a functioning quality system in place with the same rigor and integrity as the EPA quality system, and that the verification organization quality manager is independent of the environmental data collection process. EPA retains the right to review and approve any QAPP prepared by verification organizations under this delegated responsibility.

1.2.15  EPA quality manager responsibilities:
    -   assists EPA directors of quality assurance in documenting the ETV quality system in the ETV QMP
    -   communicates ETV quality system requirements, quality procedures, and quality issues to the assigned ETV center project officer or ESTE project manager and verification organization
    -   oversees verification organization QA activities to verify conformance to the QA requirements of the ETV QMP (this document)
    -   reviews and approves the verification organization QMP
    -   performs and documents independent QSAs for each ETV center and ESTE project to verify conformance to the quality requirements of this document
    -   reviews and approves verification organization GVPs, test/QA plans, and verification statements and reports (see Table 5.1 for details)
    -   may review, on a routine basis, verification organization internal quality records (e.g., test/QA plan reviews, internal assessment reports, audits of data quality)
    -   performs and documents independent technical systems audits (TSAs) and performance evaluation audits (PEAs) of verification tests, as appropriate, to verify conformance to the QA requirements of the applicable GVP and/or test/QA plan
    -   provides assistance to ETV center and ESTE project personnel in resolving QA issues
    -   reviews existing data submitted by third-party testing organization and technology developers/vendors and approves its use in an ETV verification report or statement as part of the review of the verification package
    -   discusses with the EPA director of quality assurance the delegation of the approval of certain routine test/QA plans (TQAPs)

1.2.16  Verification Organization Quality Manager
    -   is independent of the environmental data collection process
    -   is responsible for ensuring that the verification organization and its subcontractors and analytical labs have quality systems in compliance with the ETV QMP (this document), and that the verification organization complies with its own documented quality system and the applicable test/QA plan
    -   communicates ETV quality system requirements, quality procedures, and quality issues to the assigned ETV center project officer or ESTE project manager, to the verification organization, and to the subcontractor or analytical laboratory
    -   oversees verification organization QA activities, including QA oversight of the implementation of test/QA plans, to verify conformance to the quality provisions of this document
    -   oversees subcontractor and analytical lab QA activities to verify conformance to the QA requirements of the GVP and/or test/QA plan
    -   prepares the verification organization QMP, which conforms to the QA requirements of this document
    -   reviews and updates, if necessary, the VO QMP annually
    -   performs and documents internal reviews of GVPs, test/QA plans, and verification statements and reports and internally approves their release
    -   if responsibility has been delegated by the EPA director of quality assurance, provides the final QA approval of test/QA plans prior to the start of testing
    -   performs and documents internal QSAs to verify conformance to the quality requirements of the verification organization QMP
    -   performs and documents internal TSAs and PEAs of verification tests, as appropriate, to verify conformance to the quality requirements of the applicable test/QA plan
    -   performs and documents audits of data quality (ADQs) on 10% of test data (see the illustrative sketch following this list)
    -   reviews and internally approves existing data submitted by third-party testing organization and technology developers/vendors
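
As a purely illustrative aid (not an ETV requirement or prescribed procedure), the minimal sketch below shows one hypothetical way a verification organization quality manager might draw a reproducible 10% sample of test records for an ADQ; the file name, record layout, and function name are assumptions.

# Illustrative sketch only (not an ETV-prescribed procedure): select roughly 10% of test
# records for an audit of data quality (ADQ). The file name and layout are hypothetical.
import csv
import random

def select_adq_sample(records, fraction=0.10, seed=42):
    """Return a reproducible random sample covering the requested fraction of the records."""
    if not records:
        return []
    sample_size = max(1, round(len(records) * fraction))
    return random.Random(seed).sample(records, sample_size)

if __name__ == "__main__":
    with open("test_data.csv", newline="") as f:   # hypothetical raw test data file
        rows = list(csv.DictReader(f))
    audit_rows = select_adq_sample(rows)
    print(f"Selected {len(audit_rows)} of {len(rows)} records for the ADQ.")

A fixed random seed is used here only so that the selection can be documented and reproduced during later assessments.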

1.2.17 Subcontractor or Analytical Laboratory Quality Manager
    -   oversees internal QA activities to verify conformance with verification organization GVP and/or test/QA plan
    -   reviews internal QA check results to ensure that verification organization quality acceptance criteria are attained

The ETV website presents a current listing of the ETV centers and ESTE projects, including the
names, company affiliations, and phone numbers of the ETV center project officers, ESTE
project managers, and verification organization managers.

1.3    ETV customer identification and ETV customer needs and expectations
The ETV director, ETV center project officers, ESTE project managers, and verification organizations are responsible for
coordinating the identification of customers and communicating the needs of the internal and external customers to ensure
that ETV work products satisfy their needs.

1.3.1  External customers (i.e., outside EPA) include, but are not limited to:
    -   federal, state, and local government permitting/regulatory agencies
    -   public and private sector buyers and users of technology
    -   developers/vendors of technology
    -   the consulting engineering community that recommends technologies to buyers
    -   international marketers and the financial and insurer communities
    -   Congress

In a general sense, needs and expectations of external customers include:
    -   ETV verification reports and ETV verification statements supported by objective and reliable data, provided in a timely manner
    -   a justifiable, documented approach to selecting technologies for testing
    -   characterization by stakeholders of the quality/uncertainty of the verification test results
    -   a practical approach to testing which provides efficient, timely, well-documented, and cost-effective verification tests
    -   full disclosure of all testing results, including those which do not verify the technology manufacturer's claims
    -   user-friendly documents (e.g., easy to read and to implement)
    -   technology operation consistent with statements in the ETV verification report

For each ETV center and ESTE project, needs and expectations of external customers are
defined and documented in the minutes of stakeholders meetings. The process to define these
needs and expectations includes:
    -   discussions between the ETV center project officers or ESTE project managers, verification organizations, and stakeholders
    -   development by ETV center project officers or ESTE project managers, verification organizations, and stakeholders of verification test objectives and data quality objectives (DQOs) prior to testing

1.3.2   Internal customers of the ETV program are EPA staff who are responsible for execution
of the ETV program in accordance with the expectations of Congress and the Administration.
These customers include EPA and ORD  senior managers who expect conformance with
management and quality policies of the Agency.

Other EPA staff in the regions and headquarters benefit from the program in the following areas:
    -   data of known and useful quality
    -   expedited use of improved environmental technologies
    -   ETV testing accomplished on a wide variety of technologies
    -   user-friendly documents (e.g., easy to read and to implement)
    -   development of appropriate GVPs and test/QA plans

1.4    Management negotiation with verification organizations on constraints
When necessary, appropriate EPA management shall negotiate acceptable measures of quality and success when
constraints of time, costs, or other problems affect the verification organization capability to fully satisfy customer needs
and expectations.

When constraints of time, costs, or other problems significantly affect the verification
organization capability to fully satisfy the ETV quality system needs and expectations, or when
problems are identified through the EPA QA function, the verification organization will notify
the project officer and negotiations will proceed according to agreement terms.

1.5    Resources
The laboratory directors shall provide adequate resources to the EPA directors of quality assurance, ETV center project
officers, ESTE project managers, and EPA quality managers to enable them to plan, implement, assess, and improve the
overall ETV program and quality system effectively.

Laboratory directors take the following actions to achieve the above policy:
    -   provide full-time equivalent (FTE) allotment of ETV center project officers or ESTE project managers
    -   provide FTE allotment of QA and other support personnel at each laboratory's geographical location
    -   provide sufficient travel funds for each ETV center and ESTE project for an appropriate level of oversight and independent assessments, determined based on the needs of the ETV center and ESTE project. Needs may include attendance at ETV team meetings, verification tests, and meetings with technology developers/vendors and stakeholders. Travel for assessments is based on the requirements of Part A, Table 9.1.
    -   provide for maintenance of communication lines between ORD laboratory directors, the ETV team, and the ETV director.

1.6    Authority to stop work for safety and quality consideration
The verification organization shall stop  unsafe work and work of inadequate quality,  or shall delegate the authority to do
so to others.

The following procedures are necessary to stop unsafe work and work of inadequate quality:
    -   the verification organizations shall ensure compliance with all federal, state, and local health and safety policies during the performance of the verification tests. This includes obtaining appropriate permits.
    -   the verification organization quality system shall identify one or more individuals who may issue a stop work order in the event that unsafe work or work of inadequate quality is identified.
    -   ETV center project officers, ESTE project managers, or EPA quality managers shall contact the authorized individual(s) in the event that work of inadequate quality is discovered.
    -   in extreme instances, ETV center project officers or ESTE project managers may ask the Grants Administration Division (GAD) or the Contracts Management Division (CMD) to intervene if the verification organization does not implement their approved QMP.
                2.0    QUALITY SYSTEM AND DESCRIPTION
A quality system shall be planned, established, documented, implemented, and assessed as an integral part of an ETV
management system for environmental technology verification programs defined by ETV quality policy.

Development and subsequent endorsement of this plan by the ETV director and EPA line
management are  evidence that the ETV quality system is planned, established, documented,
implemented, and assessed as an integral part of an EPA ETV management system.

2.1    Authorities and conformance to ANSI/ASQC E4-1994 quality standard
The ETV quality system shall address applicable parts of ANSI/ASQC E4-1994 and shall include the organizational
structure, policies and procedures, responsibilities, authorities, resources, and guidance documents.

The authorities for developing appropriate quality systems for ETV are EPA Order 5360.1 A2
(Policy and Program Requirements for the Mandatory Agency-wide Quality System), Federal
Register, 40 CFR Parts 30 and 33, and Higher-level Contract Quality Requirements.

This plan complies with ANSI/ASQC E4-1994, Specifications and Guidance for Quality
Systems for Environmental Data Collection and Environmental Technology Programs, the
Agency standard. ANSI/ASQC E4-1994 is comparable to the International Organization for
Standardization (ISO) 9000 standards series, as shown in the comparison table provided in
Annex B-5 to ANSI/ASQC E4-1994. Another acceptable quality system model is ISO 17025,
General Requirements for the Competence of Testing and Calibration Laboratories.

The ETV quality system addresses each applicable individual "specification" provided in the
published quality standard, ANSI/ASQC E4-1994, using the policies and procedures in this plan,
as appropriate.

Verification organizations develop QMPs that are consistent with both ANSI/ASQC E4-1994
(and/or ISO 9001:2000) and the ETV QMP (this document). Joint QMP/test/QA plans can be
prepared for ESTE projects.

Note: If a verification organization already has a QMP for an ETV center that conforms to this
document and ANSI/ASQC E4-1994,  it need not prepare another QMP specifically for an ESTE
project. Some modifications may be needed, however, to address differences that may be
introduced by using a different extramural mechanism (e.g., contracts) to perform an ESTE
project. In general these modifications may be handled using an addendum to the verification
QMP or within an ESTE joint QMP/test/QA plan.

2.2    Quality  system documents
The ETV quality system shall be described in a QMP that is reviewed and approved by the ETV director and EPA line
management.

The ETV quality system is described in this QMP.  The ETV team develops and implements the
quality system, both internally and through oversight of the verification organizations. The ETV
director, ORD laboratory directors, and appropriate ORD line management review and approve
the ETV QMP and subsequent revisions to the plan, as policy for the ETV program. Verification
organization quality systems (which are consistent with ANSI/ASQC E4-1994) are described in
a written ETV center QMP or ESTE project joint QMP/test/QA plan, and are reviewed and
approved by verification organization management, the ETV center project officer, ESTE project
manager, and EPA quality manager. Subsequent revisions are reviewed in a similar manner.
These documents conform to EPA Requirements for Quality Management Plans (EPA QA/R-2).

2.3    Quality system scope
The ETV quality system description shall identify in general terms those items, programs, or activities to which it applies.

This quality system description applies to the following:
    -   the EPA ETV program
    -   selection and oversight of verification organizations
    -   review and approval of verification organization QMPs
    -   ETV products (e.g., test/QA plans, reports, ETV verification statements)
    -   planning, implementation, and assessment activities supporting ETV verification activities

2.4    Quality expectation for products and services
The ETV quality system shall include provisions to ensure that products or results of the environmental programs defined
by the ETV program are of the type and quality needed and expected by ETV clients.

The preeminent products of the ETV program are the environmental technology verification
reports and statements issued by EPA and the verification organization. Provisions to ensure that
these products and other results of the ETV program are of the quality expected include:
    -   products are reviewed as described in Part A, Section 5.0.
    -   QSAs and technical assessments are conducted as described in Part A, Section 9.0.
    -   technical assessments may include field and laboratory audits, performance evaluation audits, and audits of data quality.

2.5    Quality procedures documentation
Following approval of the ETV QMP, management elements of the quality system shall be implemented as described.

Verification organizations must operate the ETV centers and ESTE projects under  a written and
EPA-approved QMP that is based on ANSI/ASQC E4-1994 and/or the provisions of this plan.
Verification organizations provide evidence of compliance before verification activities  begin,  as
required by the extramural agreement signed by the verification organization. The ETV center
project officer or ESTE project manager is responsible for obtaining a copy of the verification
organization QMP for his own review and forwarding the document to the EPA quality manager
for review and approval  prior to planning verification tests.
2.6   Quality controls
The ETV quality system description shall define when and how controls are to be applied to specific technical or
technology testing efforts and shall outline how these efforts are planned, implemented, and assessed.

2.6.1   ETV program controls include:
    -   existing EPA policies and procedures for selection and administration of verification organization efforts in ETV
    -   an approved ETV QMP
    -   quality management, quality assurance, and quality control procedures as part of the extramural agreement

Specifically, the data produced by ETV centers  and ESTE projects are controlled such that
verified data will be published in verification reports, regardless of the outcome of the testing.

2.6.2   ETV center-specific and ESTE project-specific controls include:
    -   GVPs and specific test/QA plans developed and approved prior to testing
    -   oversight by the EPA quality managers of the implementation process and follow-up to any finding of nonconformance
    -   technical assessments and QSAs
    -   specified quality control requirements in the extramural agreement

ETV center-specific and ESTE project-specific procedures for planning, implementation, and
assessment are described in the verification organization quality system. Procedures for
planning, implementing, and assessing the overall ETV quality system are detailed in Sections
7.0, 8.0, and 9.0 in Part A and in Part B.

2.7   Quality system audits  (QSAs)
At regular intervals (at least annually) the ETV quality system shall be reviewed and its description updated, if
appropriate, to reflect changes in the organization as well as changes in ETV quality policy.

The EPA directors of quality assurance perform an internal QSA of the ETV program every 3-5
years in accordance with the process outlined in Part A, Section 9.0. The assessment report
provides input into any update of the QMP. The EPA quality managers shall perform a QSA of
each center/project, preferably in the first year after the start of the center/project but no later than
the second year. Subsequent QSAs of each center/project shall be performed every three years or
as needed.

           3.0    PERSONNEL QUALIFICATION AND TRAINING

3.1    Personnel training and qualification procedures
Personnel performing work shall be trained and qualified based on appropriate requirements prior to the start of the work
or activity.

3.1.1   ETV center project officers and ESTE project managers are selected based on:
    -   educational background and/or a degree that is directly relevant to the technology area for the ETV center or ESTE project
    -   work experience specific to the technology area
    -   experience in program management
    -   participation in required training for project officer responsibilities on extramural agreements, as documented in training records

3.1.2   EPA quality managers are selected based on:
    -   educational background and/or a degree relevant to the verification tests and programs
    -   familiarity with the ETV quality management system and quality requirements, as demonstrated by work experience with the ETV program or on-the-job training on the ETV program, ETV QMP, and center QMP
    -   experience in quality management
    -   freedom from personal and external barriers to independence and from bias and influences that could affect objectivity, organizational independence, and ability to maintain an independent attitude and appearance

3.1.3   Verification organization personnel
Key participants working directly for or on behalf of the verification organization in support of
the center/project and/or individual verification tests are selected by the verification organization
and evaluated by the EPA. Evaluation criteria for key personnel will vary, but typically include
a consideration of the following:
    -   educational background and/or a degree(s) relevant to technical areas represented in the center/project
    -   work experience related to the technology areas represented in the center/project
    -   experience in quality management systems

The verification organization quality manager is independent of the environmental data
collection process.

The verification organization QMP will document training and qualification procedures for
verification organization personnel.

3.2    Formal qualifications and certifications
The need to require formal qualification or certification of personnel performing certain specialized activities shall be
evaluated and implemented where necessary.
ETV program management, quality management, and center/project management require no
formal qualification or certification other than where applicable:
    -   EPA project officer training and extramural agreement training (or work assignment manager training as appropriate)
    -   appropriate Occupational Safety and Health Administration (OSHA) courses

Formal qualification or certification of personnel performing specialized activities for each
center/project or for specific test/QA plans is addressed on a center/project-specific or test/QA
plan-specific basis. Verification organizations maintain records of the qualification or
certification of such personnel.

NOTE: Requirements for formal qualifications or certification may be based on applicable
federal, state, or local requirements associated with a particular test. Examples of possible
certifications include but are not limited to drinking water plant operator certification,
professional engineering registration, and certification of industrial hygienists.

3.3   Technical management and training
Appropriate technical and management training, which may include classroom and on-the-job, shall be performed and
documented.

EPA line management is responsible for appropriate technical  and management training for staff
working on the ETV program. Such training will be documented in each individual's training
file.

Verification organizations are responsible for personnel training and qualification procedures for
each ETV center and ESTE project or for specific test/QA plans. Verification organizations
maintain the training records  (available for review by EPA).

The ETV team and verification organizations will be trained at meetings which occur
approximately once a year. At these meetings, the team develops  policy, and shares information
and lessons learned. The directors of quality assurance provide training on the requirements of
the ETV QMP during the periodic workshops organized by the ETV director.

3.4   Retraining
When job requirements change, the need for retraining to ensure continued satisfactory job proficiency shall be evaluated.

The need for retraining EPA ETV staff is evaluated on an annual basis by the appropriate EPA
line management. Evaluating the need for and performing retraining of verification organization
staff is the responsibility of the verification organization management.

3.5   Personnel job proficiency
Evidence of personnel job proficiency shall be documented and maintained for the duration of the technology test or
activity affected, or longer if required.

3.5.1   ETV center project officers and ESTE project managers - The existing performance
standards of the ETV center project officer and ESTE project managers may already include
tasks consistent with the following items. These items should be considered for specific
identification in the performance standards:
    -   active participation in the ETV team; communicating center/project issues, lessons learned, required reports, and appropriate assistance to members of the ETV team and management
    -   developing solicitations and/or managing CAs, IAGs, and contracts
    -   facilitating stakeholders group activities
    -   ensuring development of and contributing to GVPs and test/QA plans
    -   providing a leadership role to ensure technologies are selected consistently with the ETV QMP (this document)
    -   serving as a communication link between EPA and the verification organization, in particular, providing information and documents to support the ETV website
    -   reviewing draft and final ETV verification reports and other center/project documents
    -   reporting to program management on the completeness and validity of the ETV verification statement prior to report issuance
    -   ensuring the timely delivery of complete and consistent ETV products and services

Evidence of personnel job proficiency is found in the human resources module in ORD's
Management Information System (OMIS) for tracking training, and in EPA's Performance
Appraisal and Recognition System (PARS) for satisfactory job performance.

Note: Evaluations are the responsibility of the appropriate supervisor and are not a record of the
ETV program.

3.5.2   EPA quality managers -The existing performance standards of the EPA quality managers
may already include tasks consistent  with the following items. These items should be considered
for specific identification in the performance standards:
    -   review and approval of verification organization center/project QMPs
    -   performance and documentation of independent QSAs of verification organization quality system
    -   performance and documentation of independent TSAs and PEAs
    -   review and approval of GVPs, test/QA plans, and verification reports and statements

Note: Evaluations are the responsibility of the appropriate supervisor and are not a record of the
ETV program.

3.5.3   Verification organization staff- Verification organizations document and maintain
records (such as annual performance  reviews) of personnel job proficiency for work performed
directly in support of the verification organization ETV activities.

Note: Evaluations are the responsibility of the verification organization and are not a record of
the ETV program.
         4.0    ETV VERIFICATION ORGANIZATION SELECTION

4.1    Planning and control of selection process
Funding of extramural agreements associated with the ETV program shall be planned and controlled to ensure that the
quality of verification tests is known, documented, and meets technical requirements and acceptance criteria of the clients.

The ETV program is designed to investigate ways to facilitate the verification and use of
environmental technology. The ETV program secures verification organizations through
appropriate instruments governed by rules found in Title 31, Section 6303, of the US Code, and
in EPA Order 5700.1 (Policy for Distinguishing between Assistance and Acquisition).

Planning to select verification organizations requires:
    -   assessing and prioritizing environmental technology categories to be verified by each center/project (i.e., defining the scope of each center/project in terms of technology areas to be tested by that organization)
    -   establishing ANSI/ASQC E4-1994 as an applicable quality management standard
    -   issuing solicitations
    -   selecting the appropriate verification organization based on their experience and proficiency
    -   managing the selection process to ensure the quality of verification tests, which includes:
        —  implementing controls stipulated in EPA policies and procedures for extramural agreements
        —  establishing specific language in each solicitation requiring development and implementation of a quality system consistent with the ETV quality system and ANSI/ASQC E4-1994. Suggested language is given in Appendix D.
        —  reviewing the applicant's proposed quality system to verify that it meets the solicitation requirements and provides for quality of verification tests which will be known, documented, and which will meet technical requirements.

4.2    Technical and quality requirements
Extramural agreement solicitation documents shall contain information clearly describing the technical and quality
requirements associated with the verification testing.

Technical and quality requirements expressed in the solicitation include technical evaluation
criteria for technical skills and experience of staff members, and demonstrated experience in the
development of quality systems relevant to ETV.  The policy in EPA Order 5360.1 A2 pertaining
to extramural agreements requires that a verification  organization develop  and receive EPA
approval for a QMP consistent with the ETV QMP (this document) and ANSI/ASQC E4-1994
prior to conducting verification tests. If the verification organization intends to perform
verifications by contracting or sub-contracting with other organizations, all of the controls
incumbent upon the verification organization specified in Section 4.1 pass through to the
contractor or sub-contractor. Verification organization agreements with subcontractors and
analytical labs shall include language requiring compliance with this document, the verification
organization's QMP, and other relevant QA requirements.

4.3    Quality specification/conformance
Extramural agreement solicitation documents shall specify the ETV quality requirements for which the verification
organization is responsible and how the verification organization conformance to client requirements shall be verified.

ETV quality requirements for which the verification organization is responsible are specified in
this ETV QMP. During verification organization selection, the applicant proposals and written
responses to the requirements are reviewed for conformance to the Solicitation specifications.
After a verification organization is selected, the EPA quality manager and the ETV center project
officer or ESTE project manager review and approve written quality system documents (e.g.,
QMPs, GVPs, and test/QA plans) for conformance to the EPA and ETV quality policies and
procedures. (Note: The EPA director of quality  assurance may delegate responsibility for
reviewing and approving test/QA plans to the verification organization quality manager provided
that the verification organization demonstrates that it has a functioning quality system that has
the same rigor and integrity as the EPA quality system and that the verification organization
quality manager is independent of the environmental data collection process. EPA retains the
right to review and approve any QAPP prepared by verification organizations under this
delegated responsibility.)

4.4    Peer review of extramural agreements
Extramural award documents shall be reviewed for accuracy and completeness by qualified personnel prior to award.

Peer review is an integral part of EPA's project planning, implementation, and assessment
process.  Solicitation packages are internally peer reviewed prior to their issuance. Responses to
the solicitation undergo a peer review process which supports the award of the extramural
agreement.  ESTE project task order solicitations that are performed under existing EPA
contracts may not require peer reviews.

4.5    Conformance of verification  testing efforts
Appropriate measures shall be established to ensure that the verification testing efforts satisfy all terms and conditions of
the extramural agreement. Verification organizations shall have a demonstrated capability to meet all terms and
conditions.

Once a verification organization has been  selected, measures to ensure continued conformance to
terms and conditions in the extramural agreement are implemented as described in Part A,
Sections 8, 9, and 10.
                        5.0    DOCUMENTS AND RECORDS

5.1   Scope
Procedures shall be established, controlled, and maintained for identifying, preparing, reviewing, approving, revising,
collecting, indexing, filing, storing, maintaining, retrieving, distributing, and disposing of pertinent quality documents and
records. Such procedures shall be applicable to all forms of documents and records, including printed and electronic
media. Measures shall be taken to ensure that users understand the documents to be used. Documents and records
requiring control shall be identified.

A document is an instruction, specification, or plan containing information on how the ETV
program functions, how specific tasks are to be performed, or how specific products or services
are to be provided. Examples include the ETV QMP, the center/project QMPs, GVPs, and
test/QA plans. A record is a statement of data and facts pertaining to a specific event, process, or
product, that provides objective evidence that an activity has occurred. Examples include
verification statements and reports, raw and summary data tables, data notebooks, audit reports,
and stakeholder meeting minutes.  Documents and records to which this policy applies include:
    -   ETV QMP (this document)
    -   extramural agreement records, contracts
    -   QMPs for ETV centers and ESTE projects or joint QMP/test/QA plans for ESTE projects
    -   minutes of stakeholder meetings (summary for the record)
    -   GVPs
    -   test/QA plans for all verification tests, including standard operating procedures (SOPs)
    -   raw data (all written and electronic data generated when tests are conducted)
    -   existing data used in verification reports and statements
    -   ETV verification reports (comprehensive reports on a technology verification project) and ETV verification statements (summary statement for an individual verification test)
    -   annual report, including a summary of EPA and verification organization QA activities
    -   independent assessments of the verification organization quality system and technical systems
    -   verification organization internal reports of reviews and audits
    -   reports of internal assessments of the ETV quality system

Information in this section applies to both electronic and printed documents and records, as well
as original documents and records developed on behalf of the ETV program that are required to
demonstrate the quality of information and data provided in ETV verification reports.

5.2   Preparation, review, approval, and distribution
Sufficient documents and records shall be specified, prepared, reviewed, authenticated, and maintained to reflect the
achievement of the required quality for completed work and/or to fulfill any statutory requirements. Documents used to
perform work shall be identified and kept current for use by personnel performing the work. Documents, including
revisions, shall be reviewed by qualified personnel for conformance with technical requirements and quality system
requirements and approved for release by authorized personnel.

Table 5.1 lists the pertinent quality documents and records for ETV, the person(s) responsible for
preparing and updating these documents and records, the reviewers, those given approval
authority for each record type, and the distribution plan. In Table 5.1, where a procedure is not
applicable (e.g., a document is not subject to approval), N/A is entered in the table. All reviewers
and approving officials receive copies of the documents and records they review/approve; the
Distribution column in Table 5.1 lists only those individuals who receive final copies, in addition
to the reviewers and approving official. For  revised documents, these same review, approval, and
distribution pathways are followed. Unless otherwise noted, material placed on the ETV website
is available for public inspection, comment,  and use.

5.3   Documents and records storage and obsolete documents and records
Obsolete or superseded documents shall be identified and measures shall be taken to prevent their use, including removal
from the workplace and from the possession of users when practical. Maintenance of records shall include provisions for
retention, protection, preservation, traceability, and retrievability. While in storage, records shall be protected from
damage, loss, and deterioration. Retention times for records shall be determined based on extramural agreement and
statutory requirements, or, if none stated, as specified by the EPA director and EPA line management.

Obsolete records should be clearly marked as such. These records may be retained in the
workplace for historical reference, or they may be removed to archival storage. ETV will follow
ORD's Records Management Policy (see Appendix A ), which addresses requirements for
indexing, filing, maintaining, retrieving, and disposing of documents and records from all
extramural financial agreements. The current minimum requirement is that all records be kept for
seven years after the final payment on an extramural agreement.

Any ETV record that is the result of a contract or an agreement falls under EPA Records Control
Schedule 003 or Schedule 202 (see Appendix A and http://www.epa.gov/records/) and carries the
corresponding retention schedule (10 years for grants and other agreements, or 6 years and 3
months after close-out for contracts). SITE-funded verification records are kept for thirty years
per SITE record schedules and are maintained under the SITE program.
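
For illustration only (retention is governed by the cited schedules and the terms of the extramural agreement, not by this sketch), a retention end date can be computed from the close-out or final-payment date as shown below; the helper name and example date are hypothetical.

# Illustrative sketch only: compute retention end dates from the schedules cited above.
# The helper name and example close-out date are hypothetical; actual retention is governed
# by the EPA Records Control Schedules and the extramural agreement.
from datetime import date

def retention_end(start, years, months=0):
    """Return the approximate end of the retention period (day clamped to 28 to stay valid)."""
    total_months = start.year * 12 + (start.month - 1) + years * 12 + months
    year, month_index = divmod(total_months, 12)
    return date(year, month_index + 1, min(start.day, 28))

closeout = date(2008, 1, 31)  # hypothetical close-out / final-payment date
print("Grants and other agreements (10 years):", retention_end(closeout, years=10))
print("Contracts (6 years, 3 months):", retention_end(closeout, years=6, months=3))
print("SITE-funded records (30 years):", retention_end(closeout, years=30))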
TABLE 5.1 Documents and Records Management Scheme
Record Type
ETV quality management
plan
CA/IAG/contract records
VO quality management
plan
Minutes of stakeholder
meetings
Generic verification
protocol
Test/QA plan*
(including SOPs)
Raw data
Existing data
ETV verification report
and ETV verification
statement
Annual report
EPA center and ESTE
project QA reviews and
assessment reports
VO internal QA
document reviews and
assessment reports
Independent ETV
program reviews and
assessment reports
Email/Memos related to
programmatic activities
Preparation/Updating
ETV directors of quality
assurance
EPA quality managers
ETV project officer
ESTE project manager
VO manager
VO manager
VO quality manager
VO manager
VO manager
VO manager
VO manager
Third-party testing
organization
VO manager
ETV team
ETV coordination staff
VO managers
EPA quality manager
VO quality manager
EPA directors of quality
assurance
NA
Review and Recommend
ETV team
VO managers
ETV director
EPA quality manager
ETV project officer
ESTE project manager
Stakeholders
VO quality manager
Developer/vendor
Stakeholders
VO quality manager
Developer/vendor
Stakeholders
N/A

ETV director
Stakeholders
VO quality manager
Developer/vendor
VO managers
ETV project officer
ESTE project manager
VO quality manager
VO manager
VO manager
ETV project officer
ESTE project manager
EPA quality manager
ETV director
EPA laboratory directors
EPA line management
EPA quality manager
NA
Review and Approval
ETV director
EPA laboratory directors
EPA line management
EPA line management
ETV project officer
ESTE project manager
EPA quality manager
N/A
EPA quality manager
ETV project officer
ESTE project manager
EPA quality manager
ETV project officer
ESTE project manager
VO quality manager
VO quality manager
EPA quality manager
VO manager
ETV project officer
ESTE project manager
EPA laboratory directors
EPA quality manager
ETV project officer
ESTE project manager
EPA line management
ETV director
N/A
N/A
N/A
NA
Finals distributed to:
ETV director or designee
for posting to web site
ETV/ESTE project files
ETV director or designee
for posting to web site
ETV director or designee
for posting to web site
ETV director or designee
for posting to web site
(draft and final versions)
ETV director or designee
for posting to web site
Developer/vendor
VO project files (EPA
can request copies)
VO project files (EPA
can request copies)
ETV director or designee
for posting to web site
EPA laboratory directors
ETV/ESTE project files
VO project files (EPA
can request copies)
ETV program files
Person receiving
correspondence
VO = verification organization
N/A = not applicable.

* The EPA quality manager (QAM), after discussion with the EPA Director of QA, may delegate this responsibility to the verification organization quality manager.
See Section 2.2.2.1, Part B for the TQAP delegation of authority.
              6.0    COMPUTER HARDWARE AND SOFTWARE

6.1    General procedures
Computer software and computer hardware configurations used in the ETV program shall be
installed/tested/used/maintained/controlled/documented to meet users' requirements and shall conform to this quality
policy and applicable consensus standards and/or data management criteria.

At the program level, ETV does not expect to develop software. At the center/project level, if
verification organizations intend to develop software to support their ETV process (or an
individual test/QA plan), they should have procedures in place as specified here. If the
verification organization uses only commercial software for office operations (e.g., word
processing software, spreadsheet software), it is unlikely that they would need specific
procedures for assessing software quality. Part A, Sections 6.2 through 6.6, apply only to
software and software/hardware configurations developed specifically for the ETV program.

The following are the ETV program procedures which ensure that each center/project controls
the quality of all computer hardware/software configurations for the program:
    -   the ETV center project officer or ESTE project manager and the verification organization discuss and agree upon the computer hardware and software requirements for the center/project and/or for specific test/QA plans;
    -   once decisions are finalized, the verification organization supplies evidence of meeting all requirements before data collection, reduction, or validation procedures begin;
    -   for software developed for ETV programs, the verification organization tests all applications and configurations using a test data set or by running a shakedown test of the system to ensure all applications/configurations are operating to specifications (see the illustrative sketch following this list). The verification organization must show evidence of a system to maintain, control, and document such software and hardware configurations. This includes, but is not limited to: resources to correct any hardware/software failure with minimal downtime to the program, tracking upgrades/revisions to software or configuration changes, documenting software names, versions, and copyright dates, and complete documentation of the code. Complete documentation of code includes the written code with comments structured in a modular form.
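
As an informal illustration of such shakedown testing (not an ETV-prescribed procedure), a verification organization might exercise a custom data-reduction routine against a small test data set with hand-calculated results before applying it to verification data. The routine name and values below are hypothetical.

# Illustrative shakedown test only: verify a hypothetical data-reduction routine against
# a small test data set with hand-calculated expected results before it is used on ETV data.
import math

def reduce_concentrations(raw_readings, blank):
    """Hypothetical reduction step: subtract the blank and average the corrected readings."""
    corrected = [r - blank for r in raw_readings]
    return sum(corrected) / len(corrected)

def run_shakedown():
    # Test data set with a known, hand-calculated expected value.
    raw_readings = [10.2, 9.8, 10.0]
    blank = 0.5
    expected = 9.5  # (9.7 + 9.3 + 9.5) / 3
    result = reduce_concentrations(raw_readings, blank)
    assert math.isclose(result, expected, rel_tol=1e-9), f"shakedown failed: {result} != {expected}"
    print(f"Shakedown passed: reduced value {result} matches expected {expected}")

if __name__ == "__main__":
    run_shakedown()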

6.2    Scope of ETV computer hardware/software procedures

Computer software and computer hardware/software configurations covered by ETV's quality policy include, but are not
limited to:
    -   operation or process control of environmental technology systems (including automated data acquisition and laboratory instrumentation)
    -   databases containing environmental data

Computer software and computer hardware/software configurations covered by the ETV QMP
(this document) include all agreed upon, center/project-specific applications or configurations.
These applications or configurations include, but are not limited to:
    -   evaluating and reducing environmental data
    -   reporting environmental data
    -   databases containing environmental data

6.3   Configuration testing

Computer hardware/software configurations shall be tested prior to actual use and the results shall be documented and
maintained.

On a center/project level, the verification organization conducts tests of the computer
hardware/software configuration using a standard set of testing conditions.

NOTE: The verification organization is required to have a system to document all testing of
computer hardware/software configurations, as required by Part A, Section 6.1. A test data set or
a standard set of testing conditions should be developed on a center/project basis or on a test/QA
plan-specific basis. Maintenance testing should be easily trackable and retrievable.
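
One hypothetical way to keep such testing trackable and retrievable (the file and field names below are illustrative assumptions, not ETV requirements) is to append each configuration test to a simple log:

# Illustrative sketch only: append each hardware/software configuration test to a CSV log
# so that testing and maintenance retesting remain trackable and retrievable.
# The file name and field names are hypothetical.
import csv
from datetime import date
from pathlib import Path

LOG = Path("configuration_test_log.csv")
FIELDS = ["date", "software_name", "version", "test_conditions", "result", "tester"]

def log_configuration_test(software_name, version, test_conditions, result, tester):
    """Append one configuration test record, writing a header row if the log is new."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "software_name": software_name,
            "version": version,
            "test_conditions": test_conditions,
            "result": result,
            "tester": tester,
        })

log_configuration_test("data-reduction tool", "1.2", "standard test data set A", "pass", "VO analyst")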

6.4   Measurement and testing equipment configurations

Computer hardware/software configurations integral to measurement and testing equipment that are calibrated for a
specific purpose do not require further testing unless:
       the scope of the software usage changes OR
       modifications are made to the hardware/software configuration.

On a center/project level, verification organizations perform the following procedures (as
provided in the verification organization QMP).

Whenever computer hardware/software configurations integral to measurement and testing
equipment  are calibrated for a specific  purpose, further testing is not normally performed unless
the scope of the software usage changes or modifications are made to the hardware/software
configuration.

In the event either of the above mentioned changes occurs, the verification organization retests
the changes as described in Part A Sections 6.1 and 6.3. Retesting is documented to the same
extent  as the original application/configuration.

6.5   Change assessments - configurations, components, and requirements
Changes to hardware/software configurations, components, or program requirements shall be assessed to determine the
impact of the change on the technical and quality objectives of the ETV program supported.

The verification organization is responsible for assessing the changes,  determining the need for
testing, and reporting the assessments to the ETV center project officer or ESTE project
manager.
6.6    ETV website and ETV database roles and responsibilities
The ETV website shall be operated in such a way that it serves all ETV participants and customers through prompt and
accurate posting of ETV information and documents.

The ETV center project officers and ESTE project managers, or alternate(s) designated in writing
by these managers, are responsible for promptly sending the following information to the ETV
director or coordination staff designee:
    -   general fact sheets and brochures
    -   meeting announcements and summaries
    -   verification organization QMPs
    -   GVPs (indicating draft or final)
    -   test/QA plans (indicating draft or final)
    -   ETV verification statements
    -   ETV verification reports

The ETV center project officers, ESTE project managers, or verification organization designees
are responsible for regularly inputting and/or updating the following types of information
contained in the online ETV database:
    -   vendor contact information and technology information
    -   technology categories list
    -   center contact information
    -   stakeholder member list(s)
    -   upcoming calendar of events and/or outreach activities
    -   monthly report (ETV center project officers and ESTE project managers only)
    -   vendor solicitations
    -   quarterly tracking document updates (tracking report, protocols list, test/QA plans list, outreach documents list, and conference participation list)
    -   annual cost questionnaire
    -   testing events
    -   quality assurance: QMPs, audits, and other QA information
                                   7.0    PLANNING

7.1    Systematic planning process

A systematic planning process shall be established, implemented, controlled, and documented to:
    -   identify the customer(s), and their needs and expectations
    -   identify the technical and quality goals that meet the needs and expectations of the customer
    -   translate the technical and quality goals into specifications that shall produce the desired result
    -   consider any cost and schedule constraints within which technology test activities are required to be performed
    -   identify acceptance criteria for the results or measures of performance by which the results shall be evaluated and customer satisfaction shall be determined.

7.1.1   Systematic planning process established for ETV is conducted as follows:
    -   EPA establishes the number and type of ETV centers and ESTE projects necessary to
       comply with the Presidential mandate to cover all environmental technologies;
       EPA laid out basic program parameters for ETV in 1997 in the Environmental
       Technology Verification Program Verification Strategy. Since then the program planning
       has been guided by the Pollution Prevention Research Strategy and the Pollution
       Prevention and New Technology Multiyear Plan (2003 draft).   In 2006,  ORD drafted the
       Sustainability Research Strategy and the Science and Technology for Sustainability
       Multiyear Plan (2008-20012), which also guide the program;
    -   based upon the ETV QMP (this document), the ETV director, in consultation with the
       ETV  team and EPA line management, designs an annual budget.
       appropriate ORD personnel are appointed to fill ETV positions. Selection and
       information on qualifications are presented in Part A, Sections 3.1 and 3.2;
    -   EPA  line management provides resources and planning to support the duties of EPA
       staff,  such as training and travel. Training is discussed in Part A, Sections 3.3  and 3.4;
       verification organizations are selected to manage the ETV centers or verification tests for
       ESTE projects in conformance with Part A, Section 4.0. EPA's requirements for the
       appropriate extramural agreement (e.g.,  CA, IAG, contract) are met;
       after  selection,  the verification organization, in consultation with the ETV center project
       officer, establishes stakeholders groups that contain representatives of customer groups of
       concern to that center's areas;
    -   the verification organization develop GVPs and/or test/QA plans for verification tests and
       present them to the stakeholders for review and comment and to the ETV center project
       officer or ESTE project manager and the EPA quality manager  for review and approval;
    -   the ETV center project officer or ESTE project manager, the EPA quality manager, the
       verification organization, and the stakeholders group hold at least one joint meeting
       annually to:
       —  identify, revise, and/or clarify the technical and quality goals of the work to be
           accomplished
       —  translate the technical  and quality goals into written specifications that will be used to
           produce the desired result
        —  consider any cost and schedule constraints within which test activities are required to
           be performed
       —  develop qualitative measures of performance by which the results will be accepted
       —  determine testing priorities and evaluate customer satisfaction.
    -   the verification organization takes minutes of each meeting and distributes them to
        participants for comment. Minutes of stakeholders meetings are incorporated within the
        records management scheme described in Part A, Section 5.0.

7.1.2   Implementation of the systematic planning process
Planning is accomplished through frequent meetings among participants and through posting
initial planning documents and stakeholders meeting minutes on the ETV website. Procedures
for planning at the center/project level and at the verification test level are addressed in Part B.
Procedures for implementing the planning process are detailed below:
    -   customer identification - ETV customers are identified in Part A, Section 1.0;
    -   technical and quality goals identification - in addition to the goals identification which
        occurs at stakeholder meetings mentioned in Section 7.1.1, these are identified during
        planning meetings with senior management, conference calls with ETV participants, and
        meetings with EPA quality professionals and technical staff;
    -   technical and quality goal specifications - the ETV director works with the ETV center
        project officers, ESTE project managers, EPA directors of quality assurance, and other
        quality professionals to translate technical and quality goals of the overall program into
        the ETV QMP;
    -   cost and schedule constraints - these are discussed during planning meetings with senior
        management and considered yearly for allocation to ETV centers and ESTE projects;
    -   measures of performance - the ETV director develops measures of performance for ETV,
        which are evaluated by the ETV team and verification organizations throughout the
        course of the program. Appendix B contains the most current measures of performance.

7.1.3   Systematic planning process controls include:
    -   development and implementation of written procedures (GVPs and test/QA plans)
    -   requirement of minutes of stakeholders group meetings
    -   review of verification organization work efforts by the ETV center project officer or
        ESTE project manager and EPA quality manager

7.1.4   Systematic planning process documentation includes the ETV QMP, the verification
organization QMPs, GVPs, test/QA plans,  the Pollution Prevention Research Strategy, the
Pollution Prevention and New Technology Multiyear Plan (2003 draft), Sustainability Research
Strategy, and the Science and Technology for Sustainability Multiyear Plan (2008-2012).

7.2   Planning document review

All planning documentation shall be reviewed and approved for implementation by authorized personnel before the
specific work commences. Such documentation includes but is not limited to test/QA plans and GVPs.

Planning document review is discussed in Part A, Section 5.0.


              8.0    IMPLEMENTATION OF WORK PROCESSES

8.1    Implementation
Work shall be performed according to approved planning and technical documents.

The planning for the implementation of the EPA management and quality work processes is
contained in Part A, Section 7.0. The individual center's/project's work is performed according to
planning documents written by the center/project. All technology verification work shall occur
according to GVPs and test/QA plans developed and agreed upon by EPA, the verification
organization, and the developer/vendor. The authors, reviewers, and approvers of these
documents are specified in Part A Section 5.0, Table 5.1.

The approved GVPs and test/QA plans shall be present on the site of testing, and the work shall
be implemented in accordance with them. During the work phase, modifications to plans and
procedures shall be documented, and the modifications shall be incorporated into the final GVPs
and test/QA plans. The authors, reviewers, and approvers of changes to these documents are the
same as for the original documents and are specified in Part A, Section 5.0, Table 5.1.

Verification organizations are responsible for implementing their work processes in accordance
with their QMPs and with GVPs and test/QA plans. They are responsible for ensuring that
subcontractors and analytical labs conform to the requirements of the GVPs and test/QA plans.

8.2    Procedures
Procedures shall be developed, documented, and implemented for appropriate routine, standardized, special, or critical
operations. Operations needing procedures shall be identified. The form, content, and applicability shall be addressed,
and the reviewers and approvers shall be specified.

Procedures for the overall operation of the ETV program are contained in the ETV QMP and in
other appropriate EPA policies (e.g., extramural agreement, records management). The
individual ETV centers and ESTE projects shall identify and document those operations in their
centers/projects requiring procedures as discussed in Part B. Procedures shall be written in a
format that can be readily comprehended by the user and shall contain sufficient detail and
clarity to ensure that results are achieved effectively. Appropriate operations documents, authors,
reviewers, and approvers are specified in Part A, Section 5.0, Table 5.1.

8.3    Oversight
Implementation of work shall be accomplished with a level of management oversight and inspection commensurate with
the importance of the program and the  intended use of the results, and shall include the routine measurement of
performance against established technical and quality specifications.

EPA line management has responsibility for oversight of verification work processes as
discussed in Part A Section 1.0. Verification organization oversight and responsibilities for the
verification work processes are given in their QMPs.


                      9.0   ASSESSMENT AND RESPONSE

9.1    Numbers and types of assessments
Assessments shall be planned, scheduled, and conducted to measure the effectiveness of the implemented quality
management systems. Several types of assessments are available for this purpose. Management shall determine during the
planning stage the appropriate types of assessment activities. Assessments shall include an evaluation to determine and
verify whether technical requirements, not just procedural compliance, are being implemented effectively.

QSAs and TSAs shall be used to measure the effectiveness of the implemented center/project
quality management systems and technical systems. PEAs shall  be used to evaluate performance
of the center/project technical systems. ADQs shall be used to assess reported data quality. The
types of assessments are defined in Part B, Section 4.0.

Verification organization quality managers perform internal assessments of verification
organizations. EPA quality managers perform independent assessments of verification
organizations. EPA directors of quality assurance perform independent assessments of the ETV
program.

Note: General Auditing Guidelines: Because of the high visibility of ETV testing, the systematic
planning should provide for sufficient auditing to ensure the integrity of the data. The
assessments shown in Table 9.1 and the minimum frequency are commensurate with the
importance of the ETV program and the intended use of the verification results. The target
minimums are a TSA on every test by the verification organization, and at least once per year per
center by EPA. If a test is capable of being quantitatively audited and if suitable PEA samples
can be prepared, PEAs should be performed on every test by the verification organization and
once per year for each center/project by EPA. An  ADQ must be performed by the verification
organization on a random selection of 10% of all data from  each test.  An ADQ should be
performed at least once per year for each center/project by EPA. In the case of continuously
monitoring instruments operating over long  periods of time, a representative amount of the data
(suggest 10%) may be audited.
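For illustration only, the random selection of 10% of the data called for in an internal ADQ could be
drawn with a short script such as the sketch below. The file name, column layout, and seed are
hypothetical and are not specified by the ETV QMP; the fraction audited and the records selected
would be documented in the ADQ report.

```python
import csv
import random

def select_adq_sample(data_file, fraction=0.10, seed=20080101):
    """Randomly select a fraction of data records for an audit of data quality.

    A fixed seed makes the selection reproducible so the audit trail can show
    exactly which records were traced from raw data through reporting.
    """
    with open(data_file, newline="") as f:
        records = list(csv.DictReader(f))

    sample_size = max(1, round(len(records) * fraction))
    rng = random.Random(seed)
    return rng.sample(records, sample_size)

if __name__ == "__main__":
    # Hypothetical file of measured parameters from one verification test.
    audited = select_adq_sample("verification_test_data.csv")
    print(f"{len(audited)} records selected for the ADQ")
```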

In cases where the minimum frequency for these assessments appears to be excessive, the
professional judgment of the EPA quality managers will prevail. For example, the EPA quality
managers may make an exception to the minimum frequency if multiple verification tests of very
similar environmental technologies are being conducted under the same GVP within a relatively
short span of time and if the results of the first assessment indicate that the verification test was
implemented as described in the test/QA plan and that the measurement data attained their
DQOs. In this circumstance, the EPA quality manager may decide  that assessments at less than
the minimum frequency are effective for monitoring the quality of the multiple, but very similar,
verification tests. (Also, see Part B, Section 4.2 for information about assessment frequency.)
Table 9.1 Assessments

Program-Level Assessments

Quality systems audit
    Auditors: EPA directors of quality assurance
    Responsible for corrective action: ETV program management
    Basis for audit: ETV QMP
    Minimum frequency: Every 3-5 years
    Reason for assessment: Assess management practices for ETV program
    Report reviewed by: EPA laboratory directors; ETV director

Center- or Project-Level Assessments

Quality systems audits (internal)
    Auditors: VO quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Center or project QMP
    Minimum frequency: Within one year after approval of the initial center QMP, then as
        requested by EPA
    Reason for assessment: Assess quality management practices of VO
    Report reviewed by: VO managers

Quality systems audits (independent)
    Auditors: EPA quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Center or project QMP
    Minimum frequency: Within one year after approval of the initial center QMP, then every
        three years or as needed
    Reason for assessment: Assess quality management practices of VO
    Report reviewed by: VO managers; VO quality managers; ETV project officers; ESTE project
        managers; EPA directors of QA; ETV director

Technical systems audits (internal)
    Auditors: VO quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Test/QA plans
    Minimum frequency: Once per test (see Part A, Section 9.1 for exception)
    Reason for assessment: Assess technical quality of verification tests
    Report reviewed by: VO managers; EPA quality managers; ETV project officers; ESTE project
        managers

Technical systems audits (independent)
    Auditors: EPA quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Test/QA plans
    Minimum frequency: Yearly for each center/project (see Part A, Section 9.1 for exception)
    Reason for assessment: Assess technical quality of verification tests
    Report reviewed by: VO managers; VO quality managers; ETV project officers; ESTE project
        managers; EPA directors of QA

Performance evaluation audits (internal)
    Auditors: VO quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Test/QA plans
    Minimum frequency: Each test, if feasible (see Part B, Section 4.2 for exception)
    Reason for assessment: Assess measurement performance
    Report reviewed by: VO managers; EPA quality managers; ETV project officers; ESTE project
        managers

Performance evaluation audits (independent)
    Auditors: EPA quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Test/QA plans
    Minimum frequency: Yearly for each center/project, as appropriate and feasible (see Part B,
        Section 4.2 for exception)
    Reason for assessment: Assess measurement performance
    Report reviewed by: VO managers; VO quality managers; ETV project officers; ESTE project
        managers; EPA directors of QA

Audits of data quality (internal)
    Auditors: VO quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Raw data and summary data
    Minimum frequency: At least 10 percent of all of the verification data
    Reason for assessment: Assess data calculations and reporting
    Report reviewed by: VO managers; EPA quality managers; ETV project officers; ESTE project
        managers

Audits of data quality (independent)
    Auditors: EPA quality managers
    Responsible for corrective action: Verification organizations
    Basis for audit: Raw data and summary data
    Minimum frequency: Yearly for each center/project (see Part B, Section 4.2 for exception)
    Reason for assessment: Assess data calculations and reporting
    Report reviewed by: VO managers; VO quality managers; ETV project officers; ESTE project
        managers

9.2    Procedures
Assessments shall be performed according to written and approved procedures, based on careful planning of the scope of
the assessment and the information needed. Assessment results shall be documented and reported to management.
Management shall review the assessments.

Assessments shall be planned according to the scope of the assessment and the information
needed. Suitable written procedures for planning and conducting audits shall be contained in the
center/project QMPs of the verification organizations and in EPA guidance documents (i.e.,
Guidance on Assessing Quality Systems, EPA QA/G-3, and Guidance on Technical Audits and
Related Assessments, EPA QA/G-7). Assessments are based on interviews, on the physical
examination  of objective evidence, on results of analysis of PEA samples, and on the
examination of the documentation of past performance. The basis for QSAs of verification
organizations is the center/project QMPs.  The basis for TSAs of ETV verification tests is the
test/QA plans. Results are documented in audit reports, which are reviewed by appropriate
management as described in Table 9.1.

Assessments shall occur when verification testing is at a stage where auditing is feasible.
Assessments are most effective early in a verification test so that corrective action may be taken
before the project has been completed and before all environmental data have been collected. It
is better for an auditor to report early, while the verification test can still generate better data,
than to report late, when corrective action is no longer possible.

9.3   Personnel qualifications, responsibility, and authority
Personnel conducting assessments shall have the appropriate technical or management skills to perform the assigned
assessment. Management shall determine and document the level of competence,  experience, and training necessary to
ensure the capability of personnel conducting assessments. The responsibilities and authorities of personnel conducting
assessments shall be clearly defined and documented, particularly in regard to authority to suspend or stop work in
progress upon detection and identification of an immediate adverse condition affecting the quality of results or the health
and safety of personnel.

EPA or verification organization management determines and documents the level of
competence, experience, and training of their respective audit  personnel during hiring and
periodic performance reviews. Qualified audit personnel, as listed  in Table 9.1, have access to the
appropriate management personnel and documents required to perform their audit duties. They
must be organizationally independent of the center/program that they are auditing. They have the
responsibility and authority to:
    -   identify and document problems that affect quality of verification results
    -   propose recommendations for resolving problems that affect quality of verification work
        processes or results
    -   independently confirm implementation and effectiveness of solutions

If the auditors identify a significant problem affecting verification  data quality, ETV center
project officers and ESTE project managers have the authority to request of the verification
organization manager that work be stopped until the problem is addressed. If the auditors
identify a problem where the health and safety of personnel are in  danger, they have the
responsibility to bring it to the immediate attention of appropriate EPA management, verification
organization management, and onsite testing personnel.

9.4   Response
Responses to adverse conclusions from the findings and recommendations of assessments shall be made in a timely
manner. Conditions needing corrective action shall be identified and the appropriate response made promptly. Follow-up
action shall be taken and documented to confirm the implementation and effectiveness of the response action.

When the recommendations and conclusions from the findings of assessments are adverse, a
response from the verification organization detailing the corrective action shall be expected
within 10 working days of receiving the audit report. The auditors shall follow up with
appropriate documentation to confirm the implementation and effectiveness of the corrective
action.

                        10.0   QUALITY IMPROVEMENT

10.1  Annual review for quality improvement
A quality improvement process shall be established and implemented to continuously develop and improve the ETV
Quality System.

The ETV director and EPA directors of quality assurance review the ETV QMP annually and
recommend improvements to the plan.

The EPA directors of quality assurance recommend and negotiate quality improvements with the
ETV team during the annual meeting and through other business communication channels (e.g.,
e-mail, teleconferences).

10.2  Detecting and correcting quality system problems
Procedures shall be established and implemented to prevent as well as detect and correct problems that adversely affect
quality during all phases of technical and management activities.

ETV center project officers, ESTE project managers, and EPA quality managers report problems
in any of the following areas to EPA line management, the EPA directors of quality assurance,
and the ETV director:
    -   adequacy of the ETV quality system
    -   consistency of the quality system
    -   implementation of the quality system
    -   correction of quality system procedures
    -   completeness of documented information
    -   quality of data
    -   quality of planning documents
    -   implementation of the work process

EPA line managers respond promptly to address correction of the quality problem.

10.3  Cause-and-effect relationship
When problems are found to be significant, the relationship between cause and effect and the root cause shall be
determined.

The following are general procedures for determining cause-and-effect relationships. Specific
procedures are found in the individual verification organization QMP. When problems are
significant, the verification organization quality manager determines and documents the
relationship between cause and effect, and when possible, determines and documents the root
cause of the problem. The verification organization quality manager provides this information to
the verification organization manager so corrective action can be authorized and implemented.
The corrective action may include changes to a GVP, a test/QA plan, the management system or
the verification organization's quality system as documented in its QMP.

Note: The verification organization quality managers, in accordance with their quality systems,
are continually reviewing and assessing their centers/projects for conformance with their quality
documents (i.e., QMP, GVP, test/QA plan). At the program level, independent assessment
reports from the individual centers/projects are monitored and evaluated by the EPA directors of
quality assurance for trends or recurring problems that are indicative of significant problems
affecting the ETV program as a whole. Any such situation is immediately communicated to the
ETV director. The ETV director shares the information and any corrective actions with the ETV
center project officers and ESTE project managers.

10.4  Root cause
The root cause should be determined before permanent preventative measures are planned and implemented.

To guard against implementing ineffective changes, EPA personnel ensure when possible that
root causes are determined before preventative measures are planned and implemented.

10.5  Quality improvement action
Appropriate actions shall be planned, documented, and implemented in response to findings in a timely manner.

In the event that a significant problem is identified that requires a structural change to the ETV
program, the ETV director will initiate discussions with the appropriate EPA line management to
correct the deficiency. Under extreme circumstances, the EPA directors of quality assurance may
initiate discussions with the laboratory director.
  PART B:   COLLECTION AND EVALUATION OF ENVIRONMENTAL
                                         DATA

Part A of the ETV QMP contains the specifications and guidelines that are applicable to common
or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV QMP contains the specifications and guidelines that apply to test-specific
environmental activities involving the generation, collection, analysis, evaluation, and reporting
of test data.

              1.0    VERIFICATION PLANNING AND SCOPING

The work of the ETV program at the project level is to verify the performance of commercial-
ready technologies. As discussed in Part A Section 7.0, the planning process begins with the
Statement of Work (SOW) contained in the solicitation. The successful applicant becomes the
verification organization for the center.

1.1   Systematic planning of the verification test
All work involving the generation, acquisition, and use of environmental data shall be planned and documented. The type
and quality of environmental data needed for their intended use shall be identified and documented using a systematic
planning process. The test-specific planning must involve the key users and customers of the data. ETV center project
officers and ESTE project managers should guide planning activities and ensure that participants are informed of and
understand completely the requirements of each test.

The programmatic planning for verification of commercial-ready technologies is discussed in
Part A Section 7.1.1. This section continues the discussion of systematic planning at the
center/project level.

Verification organizations, working with the ETV center project officers or ESTE project
managers, begin a systematic process to plan the individual verification tests. Systematic
planning may be accomplished through the DQO process (see Guidance for the Data Quality
Objectives Process, EPA QA/G-4). The planners perform the following actions:
    -   refine the scope of their respective technology areas
    -   determine interest in verification from the manufacturers of commercial-ready
        technologies within the defined scope of the technology areas
    -   convene stakeholder groups, including representatives of verification customer groups, to
        provide input during the planning process
    -   develop DQOs for verification tests based on input from stakeholders and ETV center
        project officers or ESTE project managers
    -   mediate and facilitate the stakeholders' identification of technology focus areas
    -   select technologies for verification based on input from stakeholders and technology
        developers/vendors
    -   prepare GVPs, which are developed to promote uniform testing procedures for similar
        environmental technologies, and/or test/QA plans to verify the performance of the specific
        environmental technologies
    -   coordinate the review and revision of the GVPs and/or test/QA plans (see the review and
        approval scheme in Part A, Section 5.0) keeping in mind both customer and EPA
        objectives for verification tests
    -   solicit developer/vendor agreements to participate in verification of their products based
        on the GVP and/or test/QA plan (some iteration of the two previous points frequently
        occurs here as the developer/vendors review and request revision of portions of the GVPs
        and/or test/QA plan)

ESTE project managers, working with verification organizations for ESTE projects, begin a
systematic process to plan the individual verification tests. Systematic planning may be
accomplished through the DQO process (see Guidance for the Data Quality Objectives Process,
EPA QA/G-4). The  planners perform the following actions:
    -   refine the scope of their respective technology areas
    -   determine interest in verification from the manufacturers of commercial-ready
        technologies within the defined scope of the technology areas
    -   convene stakeholder groups, containing representatives of verification customer groups,
        which provide input during the planning process
    -   develop DQOs for verification tests based on input from stakeholders and the verification
        organization
    -   mediate and facilitate the stakeholders' selection of technology focus areas
    -   select technologies for verification based on input from EPA labs, regions, and program
        offices and technology developers/vendors
    -   prepare GVPs, which are developed to promote uniform testing for similar environmental
        technologies, and/or test/QA plans to verify the performance of specific environmental
        technologies (Note: Although not required, some ESTE projects may develop GVPs.)
    -   coordinate the review and revision of the GVPs and/or test/QA plans (see the review and
        approval scheme in Part A, Section 5.0) keeping in mind both customer and EPA
        objectives for verification tests
    -   solicit developer/vendor agreements to participate in verification of their products based
        on the GVP and/or test/QA plan (some iteration of the two previous points frequently
        occurs here as the developer/vendors review and request revision of portions of the GVP
        and/or test/QA plan)

The GVPs and/or test/QA plans describe the experimental approach, with  clearly stated test
objectives  and associated DQOs for the related measurements.

1.2   Systematic planning for verification testing
    -   organizations that participate in the test shall participate in the planning.
    -   the scope and objectives of the verification testing and the desired action or result from the work shall be
        defined.
    -   the data to be collected to achieve verification shall be identified, and the QA and QC requirements to establish
        the quality of the data shall be defined.
    -   verification tests shall undergo a design process.
    -   verification tests shall be documented.
    -   equipment, operators, and skill levels required for the verifications shall be identified.
    -   any constraints (e.g., time and budget) shall be identified.
    -   conditions that will suspend work shall be identified.
    -   assessment tools shall be determined.
    -   methods and procedures for storing, retrieving, analyzing, and reporting the data shall be identified.
    -   methods and procedures for minimizing, characterizing, and disposing of hazardous waste generated during the
        test shall be identified.

1.2.1   Planning personnel
The verification organization shall coordinate test planning among the participating
organizations including EPA, the stakeholders, the vendors, and any testing organizations and
laboratories participating in the test. The verification organization, with the concurrence and
oversight of the ETV center project officer or ESTE project manager, shall identify the planning
roles of the various players, and shall conduct planning activities by shared communication via
teleconference, video conference, and in-person meetings, as appropriate, and within the
constraints of the budget.

1.2.2 Purpose, scope and objectives
The purpose of this testing is to verify the performance of commercial-ready technologies.
Another objective is to develop an efficient method for testing commercial-ready technologies.
Many of the centers accomplish this objective by preparing GVPs whereby the performance of
similar technologies can be verified in the future using the same or similar test/QA plan(s). The
characteristics of individual technologies and the specifics of individual tests are described in
individual test/QA plan(s), which incorporate the GVP by reference.  For some tests the
technologies are sufficiently similar that more than one product in the same technology area is
tested under the same test/QA plan. Depending on the technology and the test, technologies may
be tested on multiple occasions. The testing experience may be used to refine the GVP and/or
test/QA plan.

1.2.3   Data to be collected and design of experiment
During planning of the technology verification test, the process,  environmental, laboratory,
response, and QA data to be collected are  identified. Also identified are testing organizations,
test personnel, skill levels, methods, procedures, and equipment unique to each verification test.
Planning is integrated into design as discussed in Part B,  Section 2.0.

1.2.4   Documentation and reporting
Records generated during the verification  tests are  listed in Part A, Section 5.0. Records consist
of both paper and electronic records. Electronic methods  for storing, retrieving, analyzing, and
reporting the data are generally commercially available programs for word processing,
spreadsheet, or database processing, or commercial software developed especially for data
collection and processing on a specific instrument  or piece of equipment. Centers may also
develop software/hardware configurations, as appropriate, in their technology verification tests.
The use of computer hardware and software is discussed  in Part A, Section 6.0. Paper records
such as field notebooks, bench sheets, field data sheets, custody sheets, and instrument printouts
are part of the raw data test record and are kept with the study records.

1.2.5   Assessments
The assessment tools and minimum frequencies of assessments for the verification tests are
identified in Part A, Section 9.0. The definitions of the assessment tools and suggested
frequencies are given in Part B, Section 4.0.


1.2.6   Constraints, suspension of work, waste minimization and disposal
Verification organizations work under the constraints of time and resources communicated to
them by the EPA ETV director and the ETV center project officer or ESTE project manager.
When constraints are determined by the verification organization to affect quality, the resolution
of the problem proceeds as described in Part A, Section 1.5. Circumstances under which work
can be suspended are discussed in Part A, Section 1.7. If waste is generated as part of the
verification testing, the verification organization seeks to minimize the amount, and disposes of
it in accordance with applicable local, state, and federal laws.


          2.0    DESIGN OF TECHNOLOGY VERIFICATION TESTS


2.1   Design process
The design shall incorporate those activities pertaining to verification of performance identified during the planning
process, establish test specifications, and identify appropriate controls. The design shall include
    -   selection of field sampling or testing equipment, and its operational parameters, as appropriate
    -   selection of field sampling or testing methods, as appropriate
    -   sample types, numbers, quantities, handling, packaging, shipping, and custody, if applicable
    -   sampling locations, storage, and holding times, if applicable
    -   selection of analytical methods, quality measures of performance, and analysis providers, if applicable
    -   requirements for calibration standards and performance evaluation samples, as appropriate
    -   requirements for field and/or laboratory QA/QC activities
    -   requirements for qualifications of testing, sampling, and/or analysis personnel
    -   protection of health and safety of test personnel and the public
    -   readiness reviews prior to data collection
    -   assessments required, including technical and performance audits, audits of data quality, and assessments of data
        use limitations
    -   data reporting requirements
    -   methods for validating and verifying the data
    -   requirements for data security, archival, and retention
    -   integration of time and schedule constraints
    -   procedures for minimization or disposal of wastes generated during verification activities


2.1.1  Designers
The design of an ETV verification  test is provided by a team that includes stakeholders, EPA
staff, verification organization staff, and vendors.  The output of the design process is a test/QA
plan that details the planned tests and documents the rationale, assumptions, and personnel
involved. EPA provides guidance for writing test/QA plans in Guidance on Quality Assurance
Project Plans, EPA QA/G-5. For some classes of technologies, the output may be a GVP.


2.1.2   Objectives
The goal of an ETV verification test is the production of high-quality testing data for use by a
decision maker in determining the  appropriateness of the technology for the intended use.  In
designing technology performance verification operations, designers use a modification of the
DQO process (see Guidance for the Data Quality Objectives Process, EPA QA/G-4) for those
ETV verification tests for which quantitative goals can be specified.  A modification is required
because the DQO process is geared toward making a decision. The ETV program does not
participate in making ranking decisions regarding the technologies. It provides unbiased
reporting of performance through testing. The objective of the design process is to identify and
harmonize all components necessary to conduct a successful test.

2.1.3   Design process and components
The planning process considers selection of test parameters, availability of test equipment,
availability of testing personnel, optimal test procedures, and the necessary and sufficient data
quality indicators for test measurements. The verification test design takes into account
constraints of time, scheduling, and resources.

The product of the design process is a test/QA plan, which has the following characteristics:
    -   documents the process and assumptions used for planning, as well as those persons
        responsible for the planning;
    -   specifies the field and laboratory tests to be conducted, the baseline parameters, the
        number of replicate tests, and the controls;
    -   specifies field and laboratory equipment and optimal operating parameters;
    -   specifies sampling methods, sample types, numbers, quantities, handling, packaging,
        shipping, and custody if the testing involves samples; specifies sample locations, storage
        conditions, and holding times;
    -   incorporates analysis methods, quantitative measures of performance, calibration
        standards, calibration check standards, and performance evaluation samples, as
        appropriate and as identified in the planning process;
    -   establishes QC check acceptance criteria to ensure attainment of DQOs (a simple example
        of such a check is sketched after this list);
    -   includes methods and procedures to ensure the test produces data of known and
        acceptable quality;
    -   incorporates any other field or laboratory QA/QC activities identified by planners;
    -   specifies the requirements for qualifications of technical staff responsible for obtaining,
        analyzing, and evaluating the data;
    -   incorporates protection of the health and safety of testing personnel and the public;
    -   incorporates procedures for the minimization and disposal of generated wastes.
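As referenced in the list above, the following is a minimal sketch of a QC check acceptance
criterion. The +/-10 percent limit, function name, and example values are hypothetical; actual
acceptance limits are those established in the GVP or test/QA plan.

```python
def calibration_check_ok(measured, certified, acceptance_limit_pct=10.0):
    """Return True if a calibration check standard meets its acceptance criterion.

    percent difference = 100 * (measured - certified) / certified
    The acceptance limit (a hypothetical +/-10 percent here) would be set in the
    test/QA plan to ensure the DQOs can be attained.
    """
    percent_difference = 100.0 * (measured - certified) / certified
    return abs(percent_difference) <= acceptance_limit_pct

# Example: a 50.0 mg/L check standard measured at 53.2 mg/L (6.4 % difference)
# fails a +/-5 % limit but passes a +/-10 % limit.
print(calibration_check_ok(53.2, 50.0, acceptance_limit_pct=5.0))   # False
print(calibration_check_ok(53.2, 50.0))                             # True
```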

2.1.4   Assessments
Assessments  incorporated into the design include self-assessments (internal audits) by the
verification organization and independent assessments by EPA. The assessments identified in the
planning process are incorporated into the design. The type and minimum number of assessments
are identified in Part A,  Section 9.0. A suggested schedule of assessments is given in Part B,
Section 4.0.

2.1.5   Validating, reporting, securing, and archiving data
Data are verified by the  data collectors and independently validated by technical assessors as
indicated under Audits of Data Quality in Part A, Table 9.1. Data are reported in ETV
verification reports and ETV verification statements. Data records are stored as discussed in Part
A, Section 5.0 and in Appendix A.
2.2    GVPs and test/QA plans: planning documents from the design process
Planning documents from the design process include GVPs and test/QA plans.

Writing planning documents is generally a lengthy process involving iterations of review and
revision. Authors should be knowledgeable of the activity and the equipment described in the
planning documents. Two types of planning documents have been identified as the core
documentation needed for operation of an ETV center or ESTE project: the GVP and/or the
test/QA plan. The GVP is meant to promote uniform testing for a single center and, therefore, is
considered a more general document. The test/QA plan contains the specific information needed
to conduct a verification test.  In addition, SOPs provide detailed instructions for work processes,
such as sampling, analysis, and data analysis, that are conducted during a verification test.

2.2.1   Generic verification protocols
GVPs can provide the necessary framework for development of the more detailed test/QA plan.
The specific content and level of detail given in GVPs may vary between centers. For some
centers, the GVP may be so detailed that the test/QA plan  may require very little additional
information. For other centers, GVPs may not be appropriate because each of their test/QA plans
is very different from others being done by that center. Given the variable nature of the GVP, no
specific format has been proposed. GVPs are not developed for all of the technology categories
or focus areas that are verified by ETV.  Some ETV centers and ESTE projects only develop
GVPs after some testing has occurred.

The following issues may be addressed in the GVP:
    -   general description of the center
    -   responsibilities of all involved organizations
    -   experimental design
    -   equipment capabilities and description
    -   description and use of field test sites
    -   description and use of laboratory test sites
    -   DQOs for verification tests
    -   QA/QC procedures
    -   use of existing data
    -   data handling
    -   requirements for other documents
    -   health and safety
    -   references

The QA/QC section of the GVP typically describes the activities that verify the quality  and
consistency of the work and provides data quality descriptors, such as accuracy, precision,
representativeness, completeness, comparability, and detection limit, as appropriate. Preparation
and use of appropriate QA procedures such as QC samples, blanks, split and spiked samples, and
PEA samples to verify performance of the technology being tested can be described. Frequency
of calibrations and QC checks and the rationale for them can be described. Procedures for
reporting QC data and results can be given. The person responsible for each QA activity and the
person responsible for identifying and taking corrective action can be specified. However, if
these items vary between tests within a given center, the more appropriate document in which to
describe them may be the test/QA plan.
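For illustration, the data quality descriptors named above are commonly reported as simple
statistics such as percent recovery (accuracy), relative standard deviation (precision), and percent
completeness. The sketch below shows one way these indicators might be computed; the function
names and example values are hypothetical and are not drawn from any GVP or test/QA plan.

```python
from statistics import mean, stdev

def percent_recovery(measured_spike, unspiked, spike_added):
    """Accuracy indicator: recovery of a spiked sample, in percent."""
    return 100.0 * (measured_spike - unspiked) / spike_added

def relative_std_dev(replicates):
    """Precision indicator: relative standard deviation of replicate results, in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

def completeness(valid_results, planned_results):
    """Completeness indicator: valid results obtained versus results planned, in percent."""
    return 100.0 * valid_results / planned_results

# Hypothetical QC results for one measured parameter.
print(round(percent_recovery(measured_spike=12.1, unspiked=2.0, spike_added=10.0), 1))  # 101.0
print(round(relative_std_dev([9.8, 10.1, 10.0, 9.7]), 1))
print(round(completeness(valid_results=47, planned_results=50), 1))  # 94.0
```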

The GVP may cite documents or procedures that explain, extend, and/or enhance the GVP, such
as related procedures, the published literature, or methods manuals. The specific location of any
reference not readily available from a full citation in the reference section should be given (as in
a facility-specific SOP) or attached to the GVP.

2.2.2  Test/QA plans

EPA Order 5360.1 A2 requires that quality systems provide for:
    " (7) Approved Quality Assurance Project Plans (QAPPs), or equivalent documents defined by the QMP, for all
    applicable projects and tasks involving environmental data with review and approval having been made by the
    EPA QAM (or authorized representative defined in the QMP). QAPPs must be approved prior to any data
    gathering work or use, except under circumstances requiring immediate action to protect human health and the
    environment or operations conducted under police powers."

Test/QA plans contain the following elements as given in Guidance on Quality Assurance
Project Plans, EPA QA/G-5. In the ETV QMP, a test/QA plan is identical to a quality assurance
project plan (QAPP).  The GVP,  if one exists, may be incorporated in the test/QA plan by
reference. Not all elements listed in EPA QA/G-5 are appropriate to  every test. The test/QA plan
will note and explain those elements that are not applicable. EPA takes a graded approach to the
level of detail expected in a test/QA plan. For highly visible programs such as ETV, a higher
level of detail is expected.

Group A: Project  Management
This group of QAPP elements covers the general areas of project management, project history
and objectives, and roles and responsibilities of the participants. The following nine elements
ensure that the project's goals are clearly stated, that all participants understand the goals and  the
approach to be used, and that project planning is documented:
    -   A1 Title and Approval Sheet
    -   A2 Table of Contents and Document Control Format
    -   A3 Distribution List
    -   A4 Project/Task Organization and Schedule
    -   A5 Problem Definition/Background
    -   A6 Project/Task Description
    -   A7 Quality Objectives and Criteria for Measurement Data
    -   A8 Special Training Requirements/Certification
    -   A9 Documentation and Records

Group B: Measurement/Data Acquisition
This group of QAPP elements covers all of the aspects of measurement system design and
implementation, ensuring that appropriate methods for sampling, analysis, data handling, and QC
are employed and will be thoroughly documented:
    -   B1 Sampling Process Design (Experimental Design)
    -   B2 Sampling Methods Requirements
    -   B3 Sample Handling and Custody Requirements
    -   B4 Analytical Methods Requirements
    -   B5 Quality Control Requirements
    -   B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
    -   B7 Instrument Calibration and Frequency
    -   B8 Inspection/Acceptance Requirements for Supplies and Consumables
    -   B9 Data Acquisition Requirements (Non-Direct Measurements)
    -   B10 Data Management

Existing data can be addressed in QAPP Element B9. EPA Order 5360.1 A2 specifies that a
QAPP (or equivalent document) be prepared for any project that generates or uses environmental
data to support EPA's objectives. Chapter 3 of Guidance on Quality Assurance Project Plans,
EPA QA/G-5 provides guidance on the preparation of test/QA plans for projects that involve the
use of existing data. The existing data policy for the ETV program is given in Appendix C of the
ETV QMP (this document). The policy includes a format for QAPPs that exclusively deal with
existing data.

Group C: Assessment/Oversight
The purpose of assessment is to ensure that the QAPP is implemented as prescribed. This group
of QAPP elements addresses the activities for assessing the effectiveness of the implementation
of the project and the associated QA/QC activities:
    -   C1 Assessments and Response Actions
    -   C2 Reports to Management

Group D: Data Validation and Usability
Implementation of Group D elements ensures that the individual data elements conform to the
specified criteria, thus enabling reconciliation with the project's objectives. This group of
elements covers the QA activities that occur after the data collection phase of the project has
been completed:
    -   D1 Data Review, Validation, and Verification Requirements
    -   D2 Validation and Verification Methods
    -   D3 Reconciliation with DQOs

2.2.2.1  Delegation of review and approval  of Test/QA plans

Upon a written request from the verification organization, EPA directors of quality assurance
may delegate the responsibility for the review and approval of similar test/QA plans for a
specific technology class to a verification organization. The following procedure will be
followed for delegation:

    •   Any delegation of test/QA plan review and approval responsibilities from EPA to a
       verification organization is contingent upon the verification organization's demonstration
       that (1) it has a functioning quality system that has the same rigor and integrity as the
       EPA quality system and (2) that the verification organization quality manager is
        independent of the environmental data collection process. Such a demonstration may be
        accomplished during the independent quality systems audits of verification organizations
       by EPA quality managers (see Section 4.1.1).

    •   The scope of the delegation will be for test/QA plans that are similar to and of the same
       technology class as a test/QA plan that has been previously reviewed and approved by
       EPA. The delegation will not extend across technology classes.

    •   Based on a discussion with the EPA Center QA Manager regarding review of the
        previously approved test/QA plan and of the verification organization's demonstration as
        described above, the EPA directors of quality assurance may approve or reject the
        delegation of the test/QA plan review and approval responsibilities.

    •   The verification organization  shall retain the original test/QA plans, the copies of their
       internal reviews, and the revised test/QA plans so that EPA can conduct an independent
       review of these documents,  either simultaneously or after the fact.

    •   After the responsibility has been delegated to a verification organization, if the EPA
        directors of quality assurance determine that the quality of the verification organization's
        internal reviews does not meet EPA's expectations, EPA reserves the right to rescind
        the delegation and/or cease signing the verification statements.

2.2.3   Standard operating procedures
If another level of detail is required for describing test activities, for example the operation of an
instrument, an SOP may be written and attached to the test/QA plan. The following topics, from
Guidance for Development of Standard Operating Procedures (SOPs), EPA QA/G-6, may be
included (or a reference provided) in  the SOP:
    -   title page
    -   table of contents
    -   procedures - the following are topics that may be appropriate for inclusion in technical
        SOPs, but not all will apply to every procedure or work process detailed:
       — scope & applicability (describing the purpose of the process or  procedure and any
          organizational or regulatory requirements),
       — summary of method (briefly summarizing the procedure),
       - definitions (identifying any acronyms, abbreviations, or specialized terms used),
       — health & safety warnings (indicating operations that could result in personal injury or
          loss of life and explaining what will happen if the procedure is  not followed or is
          followed incorrectly; listed here and at the critical steps in the procedure),
       — cautions (indicating activities that could result in equipment damage, degradation of
          sample, or possible invalidation of results; listed here and at the critical steps in the
          procedure),
       — interferences (describing any component of the process that may interfere with the
          accuracy of the final product),
       — personnel qualifications (denoting the minimal experience the SOP follower should
          have to complete the task  satisfactorily, and citing any applicable requirements, like
           certification or "inherently governmental function"),
        —  equipment and supplies (listing and specifying, where necessary, equipment,
          materials, reagents, chemical standards, and biological specimens),
       —  procedure (identifying all pertinent steps, in order, and materials needed to
          accomplish the procedure such as:
          — instrument or method calibration and  standardization
          — sample collection
          — sample handling and preservation
          — sample preparation and analysis (such as extraction, digestion, analysis,
              identification, and counting procedures)
          — troubleshooting
          — data acquisition, calculations and data reduction requirements (such as listing any
              mathematical steps to be followed)
          — computer hardware and software (used to store field sampling records, manipulate
              analytical results, and/or report data),  and
       —  data and records management (e.g., identifying any forms to be used, reports to be
          written, and data and record storage information).
    -   quality control and quality assurance
       —  QC activities to allow self-verification of the quality and consistency of the work
       —  appropriate QC procedures (such as calibrations, recounting, reidentification), QC
          material (such as blanks - rinsate, trip, field, or method; replicates; splits; spikes; and
          PEA samples) that are required to demonstrate successful performance of the method
       —  frequency of required calibration and QC checks and discuss the rationale for
          decisions
       —  limits/criteria for QC data/results and actions required when QC data exceed QC
          limits or appear in the warning
       —  procedures for reporting QC data and results.
           3.0    IMPLEMENTATION OF PLANNED OPERATIONS

3.1    Implementation of planning
Environmental data operations shall be implemented according to the approved planning documents. Deviations shall be
documented and reported to and evaluated by management. Approved changes shall be made and distributed to test
personnel to replace previous versions of the documents.

Technology performance verifications are implemented according to the GVPs and/or test/QA
plans prepared during planning. During implementation, changes are incorporated, reviewed and
approved according to the scheme discussed in Part A, Section 5.0. Test personnel have access to
the approved planning documents, approved changes to planning documents, and all referenced
documents. The final GVPs and/or test/QA plans are posted on the ETV web page for future use
for similar technology verifications.

All implementation activities are documented.  Suitable documents are bound notebooks, field
and laboratory data sheets, spreadsheets, computer records, and output from instruments (both
electronic and hardcopy). All  documentation is developed as described in the planning
documents. All implementation activities are traceable to the planning documents and to test
personnel.

3.2    Services and items
Only qualified and accepted services and items shall be used in the performance verification operations. Acceptance shall
be identified on the items themselves and/or in documents traceable to the items. Tools, gauges, instruments, and other
sampling, measuring, and testing equipment used for activities affecting quality shall be controlled as required and, at
specified intervals, calibrated to maintain accuracy within specified limits. Documentation of calibration shall be
maintained and shall be traceable to the equipment. Periodic preventative and corrective maintenance of equipment shall
be performed, and it shall be recalibrated prior to use.

ETV program services are delivered by the verification organizations, which are accepted via the
solicitation, proposal, and extramural agreement process as discussed in Part A, Section 4.0.

Qualified and accepted services and  items used in testing are provided for in the verification
organization quality systems. The ETV center QMP and ESTE project QMP or joint
QMP/test/QA plan contain provisions for acceptance of services and items, and documentation
of acceptance. Control of equipment, calibration to maintain accuracy within specified limits,
maintenance, and documentation is the responsibility of the verification organization. The
verification organization verifies that the tools, gauges, instruments, and any other sampling,
measuring, and testing equipment used for activities affecting quality are controlled as required
by the planning documents, and calibrated at specified intervals to maintain accuracy within
specified limits. Equipment found to be out-of-specification is not used without documented
repair and reassessment of performance. All maintained and repaired equipment is recalibrated
as necessary before it is used for measurement work.
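As one hypothetical illustration of how a verification organization might document that equipment
remains within its specified calibration interval, the sketch below checks a calibration date against
an interval. The equipment, dates, and interval shown are illustrative only; any actual tracking
procedure is defined in the verification organization QMP and the planning documents.

```python
from datetime import date, timedelta

def calibration_current(last_calibrated, interval_days, as_of=None):
    """Return True if equipment is within its specified calibration interval."""
    as_of = as_of or date.today()
    return as_of <= last_calibrated + timedelta(days=interval_days)

# Hypothetical flow meter calibrated 2007-11-15 with a 90-day interval,
# checked before a test on 2008-01-20: still within its interval.
print(calibration_current(date(2007, 11, 15), 90, as_of=date(2008, 1, 20)))  # True
```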

Oversight is the responsibility of EPA, and is conducted through review and acceptance of the
verification organization quality system documents, the center/project QMP, and through
independent audits. All of the requirements for quality of goods and services on verification
organizations pass through to their subcontractors, including compliance with the center or
verification organization QMP. Verification agreements with subcontractors and analytical labs
should include this and other relevant quality requirements. Verification organizations must
provide oversight of subcontractors and analytical labs to ensure QA requirements are met.

3.3    Field and laboratory samples
Handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples shall be performed
according to required specifications, protocols, or procedures to prevent damage, loss, deterioration, artifacts, or
interference. Sample chain of custody shall be tracked and documented.

If samples are taken in the field, they are to be handled according to procedures in the test/QA
plan. The oversight responsibility of EPA  is to determine that the approved QMPs and test/QA
plans contain adequate procedures for handling, storage,  cleaning, packaging, shipping, and
preservation of field and laboratory samples to prevent damage, loss, deterioration, artifacts, or
interference. The verification organization must provide adequate chain of custody procedures.

3.4    Data and information management
Data or information management, including transmittal, storage, validation, assessment, processing, and retrieval, shall
be performed in accordance with the approved instructions, methods, and procedures.

ETV program records and the procedures for handling them are listed in Part A, Section 5.0.
                     4.0    ASSESSMENT AND RESPONSE

4.1   Assessment types

4.1.1  Quality Systems Audits
A quality systems audit (QSA) is an on-site review of the implementation of a verification
organization quality system as documented in its center/project approved QMP.  This review is
used to verify the existence of, and to evaluate the adequacy of, the quality system. A QSA may
be an internal assessment or an independent assessment. Because QSAs most effectively lead to
the timely correction of identified problems when they are conducted early, they should be
performed in the year following the start of a new center/project. See Part A, Section 4.2 and
Section 9.0 for required assessment frequency. Guidance for conducting QSAs may
be found in EPA's Guidance on Assessing Quality Systems (EPA QA/G-3).

4.1.2  Technical Systems Audits
A technical systems audit (TSA) is a qualitative on-site evaluation of sampling and/or
measurement systems.  The objective of the TSA is to assess and document acceptability of all
facilities, maintenance, calibration procedures, reporting requirements, sampling and analytical
activities, and quality control procedures.  An approved test/QA plan provides the basis for the
TSA. Internal TSAs are conducted by the verification organization quality managers and
independent TSAs are conducted by EPA quality managers as required by Part A, Section 9.0.
Assistance for the TSA may be available to EPA quality managers from QA support contractors.
TSAs are most useful when conducted early in the life cycle of a project when corrective actions
(if necessary) can be performed that will minimize any loss of data. Guidance for conducting
TSAs may be found in EPA's Guidance on Technical Audits and Related Assessments for
Environmental Data Operations (EPA QA/G-7).

4.1.3  Audits of Data Quality
An audit of data quality (ADQ) is an examination of a set of data after it is collected and 100%
verified by project personnel, consisting of tracing at  least 10% of the data from original
recording through transferring, calculating, summarizing and reporting. (Note: "10% of the data"
means a random selection of 10% of the data from all of the measured parameters.) Assessing
whether the data quality indicator (DQI) goals specified in the test/QA plan were met requires a
detailed review of the recording, transferring, calculating, summarizing, and reporting of the
data.  ADQs are conducted as required by Part A, Section 9.0. Internal ADQs are conducted by
the verification organization quality managers.  Independent ADQs are conducted by EPA
quality managers.  Assistance for the ADQ may be available to EPA quality managers from  QA
support contractors. Guidance for conducting ADQs may be found in EPA QA/G-7.
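
The following sketch is illustrative only and is not an ETV-prescribed procedure; it shows one
way an auditor might draw the random 10% sample for an ADQ and check each traced value
against its original recording. The record layout, parameter names, and tolerance are
hypothetical assumptions for the example.

    # Illustrative ADQ sampling sketch (hypothetical data layout; not an ETV requirement)
    import random

    def select_adq_sample(records, fraction=0.10, seed=None):
        """Randomly select a fraction of records from each measured parameter for tracing."""
        rng = random.Random(seed)
        sample = {}
        for parameter, rows in records.items():
            n = max(1, round(len(rows) * fraction))  # trace at least one record per parameter
            sample[parameter] = rng.sample(rows, n)
        return sample

    def trace_matches(raw_value, reported_value, tolerance=1e-6):
        """True if the reported value agrees with the original recording within a tolerance."""
        return abs(raw_value - reported_value) <= tolerance

    # Hypothetical records: parameter -> list of (record id, raw value, reported value)
    records = {
        "pH": [(1, 7.01, 7.01), (2, 6.98, 6.99), (3, 7.05, 7.05)],
        "turbidity_NTU": [(1, 0.45, 0.45), (2, 0.52, 0.52)],
    }

    for parameter, rows in select_adq_sample(records, seed=1).items():
        for record_id, raw, reported in rows:
            status = "traced OK" if trace_matches(raw, reported) else "discrepancy - document and resolve"
            print(parameter, "record", record_id, status)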

4.1.4  Performance Evaluation Audits
A performance evaluation audit (PEA) is a quantitative evaluation of a measurement system.
Although each measurement in a test program could be subjected to a performance evaluation,
the critical measurements (designated in the test/QA plan) are more commonly evaluated. An
evaluation of a measurement system usually involves the measurement or analysis of a reference
material of known value or composition.  The value or composition of reference materials must
be certified or verified prior to use, and the certification or verification must be adequately
documented. Ideally, the identity of the reference material is disguised so that the operator or
analyst will treat the material no differently than a test program sample.  PEAs are conducted as
required in Part A, Section 9.0. Guidance for conducting PEAs may be found in Guidance on
Technical Audits and Related Assessments, EPA QA/G-7.
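
As an illustration of the quantitative comparison a PEA typically involves, the short sketch
below computes percent recovery of a certified reference value and checks it against an
acceptance window; the reference value, measured result, and 90-110 percent limits are
assumptions for the example, not values taken from any ETV test/QA plan.

    # Illustrative PEA calculation (hypothetical values and acceptance limits)
    def percent_recovery(measured, certified):
        """Percent recovery of the certified reference value."""
        return 100.0 * measured / certified

    def within_acceptance(measured, certified, lower=90.0, upper=110.0):
        """True if recovery falls inside the acceptance window given in the test/QA plan."""
        return lower <= percent_recovery(measured, certified) <= upper

    # Hypothetical audit sample: certified at 50.0 mg/L, instrument reports 47.6 mg/L
    print(round(percent_recovery(47.6, 50.0), 1))   # 95.2
    print(within_acceptance(47.6, 50.0))            # True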

4.2    Assessment frequency
Activities performed during technology verification performance operations that affect the quality of the data shall be
assessed regularly, and the findings reported to management to ensure that the requirements stated in the GVPs and the
test/QA plans are being implemented as prescribed.

Because of the high visibility of ETV testing, systematic planning should provide sufficient
auditing to ensure the integrity of the data. The types and minimum frequency of assessments for
the ETV programs are listed in Part A, Section 9.0. The target minimum types and numbers of
assessments for verification tests are the following:
   -   QSAs - internal assessments by the verification organization within one year after
       approval of the initial QMP, then as requested by EPA; and independent assessments by
       EPA within one year after approval of the initial QMP, then every three years or as needed
   -   TSAs - internal assessments by the verification organization for each test; and
       independent assessments by EPA at a minimum of once per year per center/project
   -   PEAs - if a test is capable of being quantitatively audited and suitable PEA samples can
       be prepared, internal assessments by the verification organization for each test; and
       independent assessments by EPA yearly for each center/project
   -   ADQs - internal assessments by the verification organization of at least 10% of all the
       verification data from each test; and independent assessments by EPA at a minimum of
       once per year per center/project

In cases where the target minimums appear to be excessive to the EPA quality managers, their
professional judgment will prevail.  Additional assessments may be included in individual
test/QA plans. Assessments by the verification organization will occur at a continuous and
stable level, as provided in Table 9.1.

Assessments shall occur when  verification testing is at a stage where auditing is feasible.
Assessments are most effective early in a verification test so that corrective action may be taken
before the project has been completed and before all environmental data have been collected.
Early reporting by an auditor allows a verification test to generate better data; reporting later,
when corrective action is no longer possible, does not.

ETV center project officers, ESTE project managers, and EPA quality managers may receive and
review, on a routine basis, verification organization internal assessment reports and
subcontractor internal assessment reports provided by verification organizations.

4.3   Response to assessment
Appropriate corrective actions shall be taken and their adequacy verified and documented in response to the findings of
the assessments. Data found to have been taken from non-conforming equipment shall be evaluated to determine its
impact on the quality of the data. The impact and the action taken shall be documented.

Assessments are conducted according to procedures contained in the verification organization
quality systems or the quality procedures available to EPA personnel, as discussed in Part A,
Section 9.0. Findings are provided in audit reports. Responses to adverse findings  are required
within 10 working days of receiving the audit report. Follow-up by the auditors and
documentation of response are required.

     5.0    ASSESSMENT AND VERIFICATION OF DATA USABILITY

5.1    Data verification and validation
Data obtained during verification tests shall be assessed, verified, and qualified according to their intended use (as
verification performance data). Any limitations on this intended use shall be expressed (quantitatively to the extent
practicable) and shall be documented in the ETV verification report.

Data are verified by the data collector.  The goal of data verification is to ensure and document
that the data are what they purport to be (i.e., the reported results reflect what actually was
done). When deficiencies in the data are identified, then those deficiencies should  be
documented for the data user's review  and, where possible, resolved by corrective action. Data
verification applies to activities in the field as well as in the laboratory. Data verification
procedures are specified in the center/project QMP. Validated data are reported in ETV
verification reports and ETV verification statements along with any limitations on the data and
recommendations for limitations on data usability.  All validated data arising from testing under
the ETV program are disclosed in verification reports,  even if the technology did not perform to
the expectations of the technology provider. EPA provides guidance on data verification and data
validation in Guidance on Environmental Data Verification and Data Validation, EPA QA/G-8.
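
As a purely illustrative sketch of one data verification check (the raw values, reported value,
and tolerance are hypothetical), a reviewer can recompute a reported summary statistic directly
from the raw records and flag any disagreement as a deficiency to be documented and, where
possible, resolved:

    # Illustrative data verification check (hypothetical bench-sheet values)
    from statistics import mean

    def verify_reported_mean(raw_values, reported_mean, tolerance=0.01):
        """Recompute the mean from raw records and compare it with the reported value."""
        recomputed = mean(raw_values)
        return recomputed, abs(recomputed - reported_mean) <= tolerance

    raw = [12.1, 11.8, 12.4, 12.0]                 # values taken from the raw records
    recomputed, agrees = verify_reported_mean(raw, reported_mean=12.08)
    print(recomputed, "agrees with report" if agrees else "deficiency - document for the data user")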

5.2   Existing data
Any data obtained from sources that did not use a quality system equivalent to the ANSI/ASQC E4-1994 Standard shall be
assessed according to approved and documented procedures.

Existing data may be used for planning and other purposes (e.g., to augment verification testing),
subject to rules set up by each center. Although the centers need to establish the quality of these
data, as required under EPA Order 5360.1, no ETV-wide guidelines are necessary for the use of
existing data for purposes other than for use in ETV verification reports and statements. Existing
data used in ETV verification reports and  statements that are  collected outside the ETV program
are subject to rigorous scrutiny according  to the procedure in Appendix C.

5.3   Reports reviewed
ETV verification reports containing data and reporting the results of technology verification performance shall be
reviewed independently (i.e., by others than those who produced the data or the reports) to  confirm that the data or results
are presented correctly. These reports shall be approved by management prior to release, publication, or distribution.

The procedure for ETV verification report and ETV verification statement review and approval
is given in Part A, Section 5.0. ETV verification reports and statements are peer-reviewed through the
EPA ORD peer review process. ETV verification statements are signed by the respective EPA
laboratory directors and the verification organization representative.

                                  REFERENCES

American National Standard Specifications and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-
1994. Milwaukee WI: American Society for Quality, 1994.

Bridge to a Sustainable Future: National Environmental Technology Strategy, US GPO S/N
061-000-00839-0. Washington DC: Executive Office of the President, April 1995.

Environmental Technology Verification Program Policy Compendium. Cincinnati OH: U.S.
Environmental Protection Agency, 2004.

Environmental Technology Verification Program Quality and Management Plan for the Pilot
Period (1995-2000), EPA/600/R-98/064. Cincinnati OH: U.S. Environmental Protection
Agency, 1998.

Environmental Technology Verification Program Verification Strategy, EPA/600/K-96/003.
Washington DC: U.S. Environmental Protection Agency, 1997.

General Requirements for the Competence of Testing and Calibration Laboratories, ISO/IEC
17025. Geneva,  Switzerland: International Organization  for Standardization (ISO), 1999.

Guidance for the Data Quality Objectives Process, EPA  QA/G-4, EPA/600/R-96/055.
Washington DC: U.S. Environmental Protection Agency, 2000.

Guidance for Quality Assurance Project Plans, EPA  QA/G-5, EPA/240/B-01/003. Washington
DC: U.S. Environmental Protection Agency, 2001.

Guidance for the Preparation of Standard Operating Procedures (SOPs) for Quality Related
Documents, EPA QA/G-6, EPA/240/B-01/004. Washington DC: U.S. Environmental Protection
Agency, 2001.

Guidance on Assessing Quality Systems, EPA QA/G-3, EPA/240/R-03/002. Washington DC:
U.S. Environmental Protection Agency, 2003.

Guidance on Environmental Data Verification and Data Validation, EPA  QA/G-8, EPA/240/R-
02/004. Washington DC: U.S. Environmental Protection Agency, 2002.

Guidance on Technical Audits and Related Assessments for Environmental Data Operations,
EPA QA/G-7. Washington DC: U.S. Environmental Protection Agency, 2000.

Guidance for Data Quality Assessment: Practical Methods for Data Analysis, EPA QA/G-9,
EPA/600/R-96/084. Washington DC: U.S. Environmental Protection Agency, 2000.

Integrated Information and Quality Management Plan (IIQMP) for the National Exposure
Research Laboratory and National Center for Computational Toxicology (NERL/NCCT),
Document Control Number NERL/NCCT IIQMP No. 1. Research Triangle Park NC: U.S.
Environmental Protection Agency, 2005.

Higher-level Contract Quality Requirements, Federal Acquisition Regulation (FAR) Clause
52.246-11. Washington DC: General Services Administration, 1999.

Quality Management Plan for the National Risk Management Research Laboratory (NRMRL),
Document Control Number NRMRLQA 001 rev 1, Cincinnati OH: U. S. Environmental
Protection Agency, 2002.

Policy and Program Requirements for the Mandatory Agency-wide Quality System. EPA Order
5360.1 A2. Washington DC: U.S. Environmental Protection Agency, 2000.

Policy for Distinguishing between Assistance and Acquisition. EPA Order 5700.1. Washington
DC: U.S. Environmental Protection Agency, 1994.

Pollution Prevention Research Strategy, EPA/600/R-98/123. Washington DC: U.S.
Environmental Protection Agency, 1998.

Pollution Prevention and New Technology Multiyear Plan (Draft). Washington DC: U.S.
Environmental Protection Agency, 2003.

Preparation Aids for the Development of Category II Quality Assurance Project Plans,
EPA/600/8-91/004. Cincinnati OH: U.S. Environmental Protection Agency, 1991.

Quality Management Principles, Geneva, Switzerland: International Organization for
Standardization (ISO),  2000. Available: www.iso.ch/iso/en/iso9000-14000/iso9000/qmp.html.

Quality Management Systems—Requirements. ISO 9001:2000. Geneva Switzerland:
International Organization for Standardization (ISO), 2000.

Reinventing Government: A Performance Review.

Sustainability Research Strategy (External Review Draft), Washington DC: U.S. Environmental
Protection Agency, 2006.

Science and Technology for Sustainability Multi-Year Plan (FY2008-FY2012) (External Review
Draft), Washington DC: U.S. Environmental Protection Agency, 2006.

       APPENDIX A: U.S. EPA RECORDS CONTROL SCHEDULE

Applicable records schedules include the following:

EPA Series No.     Title

003               Grants and Other Program Support Agreements
006               Program Management Files (Agency-wide All Programs)
185               Quality Assurance Project Plans (Agency-wide All Programs)
202               Contract Management Records (Agency-wide All Programs except
                  Superfund Site Specific)
258               Final Deliverables and Reports (Agency-wide All Programs)

Consult the National Records Management Program (NRMP) website
(http://www.epa.gov/records/index.htm) for the most recent information on EPA records
management.

        APPENDIX B:  WHAT CONSTITUTES SUCCESS FOR ETV?

B.1          Timing
   -  For ETV centers, no more than one year for verification organization selection.
   -  For ETV centers, no more than one year after verification organization selection for
      completing the organizational phase (i.e., stakeholder selection, technology prioritization,
      initial GVP and/or test/QA plan development, approval of GVPs and/or test/QA plans).
   -  For each ETV center and ESTE project verification test, no more than twelve months
      between vendor agreement and draft final report (excluding the duration of the test).
   -  No more than two months for EPA approval and one month for publication.

B.2          Outcomes
   -  EPA client offices use ETV information in mandatory and voluntary compliance and
      technology assistance programs.
   -  A significant number of states accept ETV data for permitting.
   -  A significant number of consulting engineers use ETV data for making technology
      recommendations.
   -  A significant number of vendors report a positive experience in ETV.
   -  Vendors return to test additional technologies under ETV.
   -  Applications for testing exceed ETV capacity.
   -  Website surveys show positive responses, and website use increases over time.
   -  Vendor sales data and technology use data.

                   APPENDIX C: EXISTING DATA POLICY

C.1          Background

Consistent with the Agency's commitment to increasing efficiency, ETV seeks to identify
optimal methods to verify environmental technologies without compromising quality or ETV's
independence. One consistent request made by ETV's stakeholders and others has been that
existing data (i.e., data collected from outside of ETV) be used for ETV verification. This
suggestion is reinforced by the programs of individual states, as well as those of other countries,
that routinely consider previously collected data in the verification of vendor claims for a
technology.  It is also consistent with EPA Order 5360.1, which states:

    "establishes the policy and program requirements for the preparation and implementation of organizational or
    programmatic management systems pertaining to quality and contains minimum requirements for the
    mandatory Agency-wide Quality System."

 The Order requires that the Quality System provide for:

    "(8) Assessment of existing data, when used to support Agency decisions or other secondary purposes, to verify
    that they are of sufficient quantity and adequate quality for their intended use."

Guidance for Quality Assurance Project Plans, EPA QA/G-5 defines existing data as  follows:

    "Existing data are data or information that you plan to use that have not been newly generated by your project.
    They may also be known as secondary data or non-direct measurement."

Compelling arguments exist for using qualified existing data to replace some or all of the data
that could be generated under ETV testing and evaluation. Because resources are limited and
verification testing is both time-consuming and costly, ETV is able to test a limited number of
technologies per year. Further, verification tests can, at best, show the performance of the
technology under only limited conditions and limited time periods. Since many technologies
have been tested numerous times both before and after reaching a commercially viable stage  of
development, it is possible that existing data could be  used to increase and enhance the scope of
center verification tests, with acceptably reproducible  and accurate results. A well-developed
policy for using existing data could also allow the ETV program to establish reciprocity
agreements and exchange data with other testing organizations interested in verifying technology
performance using an existing GVP or test/QA plan.1 These agreements could leverage ETV's
ability to verify innovative technologies, increasing the number of technologies verified.

Recognizing that it is neither prudent nor cost-effective to ignore existing data, ETV establishes
by this appendix its guidelines for using existing data.
1 This includes GVPs or test/QA plans that are jointly developed with other testing organizations and subsequently "adopted" by
ETV.

C.2          Requirements for reviewing and using existing data

Because the consequences of a serious verification decision error can include the verification of
fraudulent claims, litigation, and loss of credibility for ETV, the verification organizations and
EPA, it is essential that the data considered as a replacement for verification testing undergo a
rigorous process of evaluation using stringent criteria. Thus, in addition to specific requirements
for reviewing existing data and ensuring that the origin and quality of the data on which the
verification statement rests are known and documented, this appendix also includes minimum
acceptance criteria for using existing data to replace verification testing. Table C-1 contains
list of documentation that must be provided to the verification organization to support an existing
data review.

              Table C-1. Documentation Needed for an Existing Data Review
 Documentation to confirm the third-party testing organization's independence from the vendor
 Documentation to assess the adequacy of the quality system employed by the third-party testing
 organization responsible for collecting the data
 Copies of the protocols and test plans used to collect the existing data
 A data report containing the data, including associated metadata, that the vendor proposes to be
 used in place of verification testing. This report will clearly identify which performance
 objectives/factors found in the test/QA plan are requested to be verified using the existing data.
 This report must  also: (1) explain any qualifications to these data; (2) identify which data were
 excluded and provide an explanation regarding how and why the data were excluded; (3)
 identify any organization-specific, legal, or security specifications for the use of the data (e.g.,
 confidential business information); and (4) address other requirements specific to the center.
 A complete list of all the data and QA/QC data obtained by the third-party testing organization*
 Letters from the vendor and the third-party testing organization stating that they have disclosed
 all relevant data that were obtained during the testing of the technology and that the third-party
 testing organization has accurately reported the quality system employed during testing.
 A letter from the vendor stating that it has accurately reported the relationship between the
 vendor and third-party testing organization.
* A complete set of the data obtained by third-party testing organization will not need to be provided. The vendor
must ensure, however, that the verification organization or EPA technical and QA staff (as appropriate) will be able
to visit and review the data at the third-party testing facility upon request. The verification organization will be
responsible for storing only the data the verification organizations actually used to verify the technology's
performance for a minimum of seven years after the final payment of the extramural agreement to the verification
organization.

Because technical and monetary resources are limited, centers/projects and EPA need to
carefully assess whether an existing data review is an appropriate use of verification organization
and EPA resources. Priority consideration will be given to:

-  Data submitted for review that were generated by a testing organization that has an
   agreement with EPA establishing reciprocity requirements with ETV.
-  Vendors that want to have data reviewed for a previously verified technology that has either
   been: 1) tested under different test/operating conditions; or 2) updated and retested to
   demonstrate changes in the performance of the technology relative to the initial verification.
-  Vendors that applied too late to be included in the most recent verification testing of a
   specific technology category, assuming no additional rounds of testing are anticipated.

The vendor should also be prepared to cover the total cost of the existing data review, including
the generation of the test/QA plan and any QA reviews and assessments that may need to be
performed.

The following subsections address the major requirements for an existing data review.

C.3.1  EPA personnel must be notified when an ETV center or an ESTE project is
   considering using existing data to replace verification testing.

EPA personnel, including the ETV director and appropriate EPA QA managers, must be notified
when an ETV center or an ESTE project is considering using existing data to partially or totally
replace verification testing.

C.3.2  A test/QA plan, or an amendment to an appropriate existing GVP and/or test/QA
   plan, will be used to direct  the data review effort.

EPA Order 5360.1  A2 specifies  that a QAPP, or equivalent document defined by an
organization's QMP, must be prepared for any project that generates or uses environmental data
to support EPA's objectives. To  comply with this requirement either a test/QA plan or an
amendment to an existing GVP or test/QA plan must be developed to direct the data evaluation
effort. This plan will need to go  through a review and approval cycle that meets ETV and
center/project QMP requirements.

Project personnel should refer to Guidance for Quality Assurance Project Plans, EPA QA/G-5
and Guidance on Environmental Data Verification and Data Validation, EPA QA/G-8, when
developing and reviewing the test/QA plan. Chapter 3 of EPA QA/G-5 contains guidance for
projects using existing data and identifies possible changes that could be made to different
project plan elements (e.g., for project management, data generation and acquisition, assessment
and oversight, and  data validation and usability) when existing data are used. EPA QA/G-8
contains guidance to help organizations conduct data verification and data validation activities.

The test/QA plan must clearly identify the intended use of the existing data (e.g., to verify the
performance of a technology through the evaluation of specific performance objectives/factors)
and the data quality specifications (e.g., DQOs) used to determine whether the data are of
sufficient quality to support their intended use. The plan should also identify the specific
procedures that will be followed to perform and document the review and verification process,
including the following:

   -   the procedures to be used by the verification organization to assess the third-party testing
       organization's independence from the vendor and the controls that were in place to
       prevent the vendor from influencing the outcome of testing, since only data collected
       objectively and independently of the vendor may be used to replace verification testing
   -   the process to be used by the verification organization to assess whether the quality
       system employed by the third-party testing organization (laboratory or agency)
       responsible for collecting the data meets the requirements established for ETV
   -   the procedures to be used by the verification organization to confirm that the GVP or
       test/QA plan used to collect the existing data was "equivalent" to an existing GVP or
       test/QA plan3
   -   the specific acceptance criteria associated with using the existing data to verify the
       performance of the technology (e.g., specific performance objectives/factors identified in
       the GVP and test/QA plan used to collect the existing data), including the data quality
       specifications
   -   the procedures to be used by the verification organization to assess whether the existing
       data meet ETV's data quality specifications (e.g., DQOs) referenced in the test/QA plan
   -   other important features of data quality, such as the level of peer review and the quantity
       of data that are flagged
   -   the roles, responsibilities, and qualifications of the different parties involved in
       reviewing the existing data

The following format can be used for test/QA plans that deal exclusively with existing data:

Section 1.0, Project Objectives, Organization, and Responsibilities
    1.1 The purpose of the study shall be clearly stated.
   1.2 Project objectives shall be clearly stated.
   1.3 The existing data needed to satisfy the project objectives shall be identified.
       Requirements relating to the type of data, the age of data,  geographical representation,
       temporal representation, and technological representation, as applicable, shall be
       specified.
   1.4 The planned approach for evaluating project objectives, including formulas, units,
       definitions of terms, and statistical analysis, if applicable, shall be included.
   1.5 Responsibilities of all project participants shall be identified, meaning that key personnel
       and their organizations  shall be identified, along with the designation of responsibilities
       for planning, coordination, data gathering, data analysis, report preparation, and quality
       assurance, as applicable.
3 This will not need to be addressed if an existing GVP or test/QA plan was used to collect the existing data.

Section 2.0, Sources of Existing Data
    2.1 The source(s) of the existing data must be specified.
    2.2 The rationale for selecting the source(s) identified shall be discussed.
    2.3 The sources of the existing data will be identified in any project deliverable.

Section 3.0, Quality of Existing Data
    3.1 Quality requirements of the existing data must be specified. These requirements must be
       appropriate for their intended use. Accuracy, precision, representativeness, completeness,
       and comparability need to be addressed, if applicable. (If appropriate, a related QAPP
       containing this information can be referenced.)
    3.2 The procedures for determining the quality of the existing data shall be described.

Section 4.0, Data Reporting, Data Reduction, and Data Validation
    4.1 Data reduction procedures specific to the project shall be described, including
       calculations and equations.
    4.2 The data validation procedures used to  ensure the reporting of accurate project data
       shall be described.
    4.3 The expected product document that will be prepared shall be specified (e.g., journal
       article, final report, etc.).

If an appropriate existing GVP or test/QA plan is not available for the technology category under
consideration, and thus not available to serve as the basis for performing an existing data review,
an ETV center or ESTE project manager can either:
   -   refuse to review the data based on the fact that an appropriate GVP or test/QA plan does
       not currently exist, or
   -   develop (with stakeholder and EPA input) a preliminary list of performance
       objectives/factors for evaluation and possible data needs, assuming that EPA, the
       verification organization, stakeholders, and the vendor support the eventual generation of
       a GVP or test/QA plan for that technology category.

If, after reviewing the preliminary list of performance objectives/factors and data needs, the
vendor is still interested in pursuing an existing data review, the verification organization  will
proceed with developing a GVP for that technology  category with input from the stakeholders
and ETV center management or EPA ESTE project management.4 The new GVP or test/QA plan
will serve as the basis for evaluating whether the data/technology meets the same data quality
specifications as the data collected in a comparable ETV/ESTE verification test. The vendor will
be liable for some or all of the cost of protocol development.
4 In general, EPA center management consists of the ETV center project officer and the EPA quality manager, and EPA ESTE
project management consists of the ESTE project manager and the EPA quality manager.

C.3.3  The data were objectively and independently collected by a third-party testing
   organization.

The verification organization must be provided with written documentation by the third-party
testing organization demonstrating its independence from the vendor. This information will need
to be made available to the public, preferably through an electronic link to a public information
domain.

 The verification organization must confirm that the third-party testing organization that
collected the data is independent of the vendor and that controls were in place to prevent the
vendor from influencing the outcome of the testing. The verification organization will also need
to determine whether the third-party testing facility is free from any outside influences —
monetary, organizational, commercial, or otherwise — which could potentially be perceived as
biasing the integrity of the test and the impartiality of the third-party testing organization.  If the
existing data were collected by the verification organization but not during an ETV-related
effort, the verification organization will need to demonstrate that no organizational or other conflict
exists that could be perceived as biasing the verification organization's ability to impartially
review the existing data. The verification organization will need to provide EPA with a written
explanation supporting this claim, which EPA personnel, including the ETV center project
officer or ESTE project manager, the ETV director and appropriate EPA QA personnel, will
review to determine whether the verification organization has successfully demonstrated that no
actual or potential bias exists. If EPA believes that the verification organization has not
successfully addressed this issue, the existing data cannot be used for verification unless an
impartial third-party with the capabilities to perform the review (e.g., another verification
organization) can be identified.

The consequences of inaccurately or falsely reporting the relationship between the vendor and
the third-party testing organization may include the revocation of the verification, removal of the
verification statement and report from the ETV Web site, and the release of a broad
announcement by the verification organization stating that the verification has been revoked.

C.3.4  The quality system employed by the third-party testing organization during the
   collection of the data  meets ETV requirements.

Only data collected under a well-defined, documented quality system will be considered for
verification. The verification organization must be provided with written documentation by the
third-party testing organization describing the quality system employed during the generation of
the existing data. The verification organization will review this documentation to determine
whether the quality system meets ETV requirements. Quality systems that conform to, or are
modeled after, ANSI/ASQC Standard E4-1994 or ISO Standard 9001:2000 have been determined
to meet ETV requirements. When a testing organization's quality system is only modeled after
one of these standards, a point-by-point analysis may need to be performed to determine whether
the quality system used to collect the existing data meets ETV's quality system requirements
(i.e., ANSI/ASQC Standard E4-1994 or ISO Standard 9001:2000). A quality systems audit and/or
technical systems audit
may also be performed as part of the existing data review, at EPA's or the verification
organization's request.

In the event that the verification organization, EPA center management, or EPA ESTE project
management determines that the documentation is insufficient to assess whether the third-party
testing organization's quality system or technical systems meet ETV QA requirements, the
verification organization, EPA center management, or EPA ESTE project management will either:
    -   discontinue the existing data review based on the fact that appropriate quality system or
        technical system documentation was not provided, or
    -   perform an independent on-site assessment of the third-party testing organization's
        quality system and/or technical systems.

Such assessments will parallel EPA's assessments of verification organizations and the
verification organizations' assessments of their subcontractors.

The consequences of inaccurately or falsely reporting the quality system employed by the third-
party testing organization may include the revocation of the verification, removal of the
verification statement and report from the ETV Web site, and the release of a broad
announcement by the verification organization stating that the verification has been revoked.

C.3.5  The data were collected using either an existing GVP or test/QA plan or an
    "equivalent" protocol or test/QA plan developed outside of ETV that clearly and
    comprehensively describes the procedures used to test the technology, collect and
    analyze the data, and ensure data quality

The verification organization, the ETV center project officer or ESTE project manager, and EPA
QA managers must be provided with copies of the GVP, the test/QA plan and/or other written
documentation used by the third-party testing organization to generate the existing data. If the
testing organization used an existing GVP or test/QA plan to collect the existing data, this
documentation will be reviewed to determine whether the testing organization properly
implemented the GVP or test/QA plan during the tests.

If the third-party testing organization did not use an existing GVP or test/QA plan, a point-by-
point comparison must be made by the verification organization quality manager between the
"equivalent" test plan used to collect the existing data and the existing GVP or test/QA plan. This
review will assess whether the procedures in the testing organization's test plan meet the data
quality and usability requirements referenced in the corresponding ETV documents. This review
will assess all the QAPP elements listed in  Section 2.2.2 of Part B of the ETV QMP.
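
The point-by-point comparison amounts to confirming that every required element is addressed.
The sketch below only illustrates that bookkeeping; the element names are hypothetical stand-ins
for the QAPP elements listed in Section 2.2.2 of Part B, not an authoritative list.

    # Illustrative element-coverage check (hypothetical element names)
    required_elements = {
        "project organization", "sampling design", "analytical methods",
        "QC requirements", "data review and validation",
    }
    # Elements documented in the third-party "equivalent" test plan
    equivalent_plan_elements = {
        "project organization", "sampling design", "analytical methods", "QC requirements",
    }

    missing = required_elements - equivalent_plan_elements
    if missing:
        print("Elements not addressed by the equivalent plan:", sorted(missing))
    else:
        print("All required QAPP elements are addressed.")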

When similar but different sampling and analytical methods were used by the third-party testing
organization (e.g., to collect, analyze, review, or reduce the data), method validation needs to be
provided to confirm that the methods will be adequate for the intended use of the data. Project
personnel should refer to Section 2.2.4 of Guidance for Quality Assurance Project Plans, EPA
QA/G-5 for information about method validation data. After this validation is completed, the
verification organization must document the dissimilarities, whether the verification organization
considers the methods to be equivalent and why, and possible impacts these dissimilarities could
have on the data and evaluation of the performance objectives/factors.

C.3.6  The data are of sufficient quality and quantity to verify the technology's
    performance, as determined by the performance objectives/factors, QA/QC
    requirements, and data quality specifications referenced in the test/QA plan.

The third-party testing organization must provide a data package/report that is of sufficient
quantity and quality to support the evaluation of the technology's performance. The verification
organization should also receive letters from both the vendor and the third-party testing
organization stating that they have disclosed all relevant data that were obtained during the
testing of the technology. These letters are designed to ensure that partial data sets reflecting
unusually high levels of performance (e.g., data sets that are not representative of the system's
true or typical performance) are not sent for review.

During the existing data review, the verification organization will determine whether the dataset
is complete, preferably during an on-site visit to the third-party testing organization. The
verification organization will assess whether sufficient information is available on the source of
the data and their quality specifications. The verification organization will also confirm that the data appear to have been
collected following the protocols and test/QA plans provided by the vendor or the third-party
testing organization.

The verification  organization must also perform an independent ADQ for 10 percent of the
existing data. This random selection of data points is traced from the raw data set to the final
report. Section 4.0 of Part B of the ETV QMP discusses ADQ requirements. This audit can either
be performed at the verification organization's facility or the third-party testing organization's
facility, as convenient for both parties.

The verification  organization should also document how well the data meet the acceptance
criteria in the test/QA plan for different performance objectives and whether the QA/QC
requirements and data quality specifications in the test/QA plan have been met. This
documentation should also describe which performance objectives were adequately addressed by
the existing data and which were not, as well as any caveats that must be attached to the data.
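
For example, completeness is one data quality indicator commonly documented at this step; the
calculation below is illustrative only, and the counts and the 90 percent goal are assumptions
rather than values taken from any test/QA plan.

    # Illustrative completeness check (hypothetical counts and goal)
    def completeness(valid_results, planned_results):
        """Completeness DQI: percentage of planned results that are valid and usable."""
        return 100.0 * valid_results / planned_results

    goal = 90.0                                   # hypothetical goal from a test/QA plan
    achieved = completeness(valid_results=171, planned_results=180)
    print("Completeness %.1f%% - %s the %.0f%% goal"
          % (achieved, "meets" if achieved >= goal else "does not meet", goal))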

The verification  organization will be responsible for storing only the raw data actually used to
verify the technology's performance for a minimum of seven years after the final payment of the
extramural agreement to the verification organization. If the data were reviewed under an
unfunded agreement with EPA, the verification organization will be responsible for storing the
raw data used to verify the technology's performance for a minimum of seven years after the
verification report is issued. The consequences of submitting false, inaccurate, or incomplete
data packages may include the revocation of the verification, removal of the verification
statement and report from the ETV Web site, and the release of a broad announcement by the
verification organization stating that the verification has been revoked.

C.3.7  The minimum acceptance criteria identified in Table C-2 must be satisfied to use
    existing data in a verification report and statement.

After completing the review, the verification organization must determine whether the criteria in
Table C-2 have been met. If these criteria have been met, the existing data can be used, either
alone or in combination with verification testing results, to verify the performance of the
technology. The verification organization must ensure that all of the critical measurements and
data quality specifications identified in the existing GVP or test/QA plan are fully addressed
before developing a verification statement and report.  "Partial" verification reports and
statements must not be generated.5

In addition to reporting on the performance of the technology as determined during the existing
data review, the verification report must identify which performance objectives were addressed
by the existing data and which were not. The report must also include any caveats that need to be
attached to the data. The EPA quality manager and the ETV center project officer or ESTE
project manager will review the data and determine whether the data can be used to replace
verification testing during their review of the verification report.6 EPA center management and ESTE project
management will need to confirm that the criteria outlined in Table C-2 have been met before
EPA can approve the verification report or statement.

                Table C-2. Minimum Acceptance Criteria for Existing Data
                                          Criteria
  Data were collected objectively by an independent third-party testing organization.
  Data were collected using GVPs and test/QA plans provided to the verification organization
  The quality system employed by the third-party testing organization during the collection of
  the data meets ETV requirements.
  The GVP or test/QA plan used to collect the data is "equivalent"* to the existing ETV
  verification protocol or test/QA plan.
  The data are quality assured and meet the minimum QA/QC requirements and data quality
  specifications referenced in the test/QA plan.
  The data meet the acceptance criteria referenced in the test/QA plan.
  The data are of sufficient quality and quantity to verify the technology's performance, as
  determined by the performance objectives/factors referenced in the test/QA plan.
* Equivalence will be determined relative to the critical measurements and data quality specifications.
5 Some of the ETV test/QA plans or protocols may state that the vendor or center can choose: (1) a subset of critical
factors/objectives; or (2) a subset of specific conditions/sites. This may need to be considered during the existing data review.

6 EPA's involvement in the existing data review may occur earlier in the data review and reporting process, at the
discretion of the EPA quality manager, the ETV center project officer, or the ESTE project manager.


C.3.8  Data Evaluation Panel Review

If the verification organization determines that the criteria listed in Table C-2 have been met, a Data Evaluation Panel
(DEP) may be formed at the verification organization's or EPA's request. The DEP will consist
of at least four qualified members: a verification organization representative7, the ETV center
project officer or ESTE project manager, the EPA quality manager (and/or an appropriate
designee, if the EPA quality manager performed portions  of the data review), and an outside
expert. (Note: The outside expert can be an EPA employee who is not a member of the ETV
team.) The members of the DEP must be credible, experienced, knowledgeable, and qualified in
the technical area critical to the technology being evaluated. DEP members must also be free of
any real or perceived conflict of interest with the commercial vendor of the technology they are
evaluating and cannot have been involved in the collection of the data being evaluated. The ETV
center project officer or ESTE project manager, EPA QA representative, and the outside expert
also cannot have been directly involved in the actual existing data review.

After the DEP is formed, the verification organization will forward the following to the panel:
    -   The test/QA plan (including deviations)
    -   A data review report documenting the verification organization's findings and whether
       the criteria in Table C-2 have been met, including  how well the data meet the acceptance
       criteria and data quality specifications (e.g., DQOs) referenced in the test/QA plan.
    -   Appropriate sections of the original data package.

The panel will review these materials and determine:
    -   whether they agree with the verification organization's conclusions/recommendations, and
    -   whether they believe that these conclusions/recommendations are adequately supported
        by the review performed by the center.

The DEP's evaluation shall follow the procedures and criteria developed by the verification
organization and EPA for other technology verifications conducted by the center.

The verification organization will be given the opportunity to respond with opposing
recommendations, either alone or after  contacting the third-party testing organization for
clarification. The verification organization can only proceed with verifying the technology using
existing data if the DEP concurs with its recommendation.
7 Although a verification organization representative will serve as a member of the DEP, this representative will not be
responsible for evaluating/commenting on the quality of the existing data review and conclusions/recommendations generated
during the data review process. This representative will instead answer any questions that other DEP members have and respond to
their comments.


C.3.9  References

American National Standard Specifications and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-
1994. Milwaukee WI: American Society for Quality, 1994.

Quality Management Systems—Requirements. ISO 9001:2000. Geneva Switzerland:
International Organization for Standardization (ISO), 2000.

Guidance for Developing Quality Systems for Environmental Programs (QA/G-1), EPA/240/R-
02/008. Washington DC: U.S. Environmental Protection Agency, 2002.

Guidance on Environmental Data Verification and Data Validation (QA/G-8), EPA/240/R-
02/004, Washington DC: U.S. Environmental Protection Agency, 2002.

Guidance for Quality Assurance Project Plans (EPA QA/G-5), EPA/240/R-02/009, Washington
DC: U.S. Environmental Protection Agency, 2002.

Policy and Program Requirements for the Mandatory Agency-wide Quality System. EPA Order
5360.1 A2. Washington DC: U.S. Environmental Protection Agency, 2000.

 APPENDIX D: RECOMMENDED LANGUAGE FOR SOLICITATION OF
                     VERIFICATION ORGANIZATIONS

Verification organizations in the ETV program are solicited via CAs, IAGs, or contracts.
Appropriate language must be incorporated into the solicitation and/or the award/agreement
documentation by the Grants Administration or Contracts Management. The following language
and supporting documentation are recommended to be included in the solicitation for the
verification organization, whether competitive or non-competitive.

Quality Assurance Requirements

The awardee shall comply with the following:

Before award, the proposal shall include a copy of the offeror's QMP describing the quality
system that provides the framework for planning, implementing, and assessing work performed
to carry out the required QA and QC activities.

After award, the awardee must submit a QMP prepared in accordance with the EPA
Requirements for Quality Management Plans (QA/R-2) and the requirements as described in the
latest version of the ETV QMP. The  center QMP must be approved by the ETV center project
officer or ESTE project manager and  EPA quality manager before testing begins.