U.S. ENVIRONMENTAL PROTECTION AGENCY

ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

FOR METAL FINISHING POLLUTION PREVENTION TECHNOLOGIES

GENERIC VERIFICATION PROTOCOL

For

Aqueous Cleaner Recycling Technologies

Revision 0

February 22, 2002

Concurrent Technologies Corporation is the Verification Partner for the EPA ETV Metal
Finishing Pollution Prevention Technologies Center under EPA Cooperative Agreement
No. CR826492-01-0.


TITLE: Generic Verification Protocol for Aqueous Cleaner Recycling Technologies

ISSUE DATE: February 22, 2002

DOCUMENT CONTROL

This document shall be maintained by Concurrent Technologies Corporation in accordance with
the EPA Environmental Technology Verification Program Quality and Management Plan for the
Period 1995-2000 (EPA/600/R-98/064). Document control elements include unique issue
numbers, document identification, numbered pages, document distribution records, tracking of
revisions, a document master filing and retrieval system, and a document archiving system.

ACKNOWLEDGMENT

This is to acknowledge Valerie Whitman for her help in preparing this document.

Concurrent Technologies Corporation is the Verification Partner for the EPA ETV Metal
Finishing Pollution Prevention Technologies Center under EPA Cooperative Agreement
No. CR826492-01-0.


Environmental Technology Verification Program for Metal Finishing Pollution Prevention
Technologies Generic Verification Protocol for Aqueous Cleaner Recycling Technologies

PREPARED BY:

____________________________	Date
CTC Project Manager

APPROVED BY:

____________________________	Date
CTC ETV-MF Program Manager

____________________________	Date
EPA ETV Center Manager

TABLE OF CONTENTS

1.0 INTRODUCTION

2.0 TECHNOLOGY DESCRIPTION
2.1	Theory of Operation
2.2	Technology Description
2.3	Test Site Description
2.4	Previous Testing

3.0 TEST DESIGN
3.1	Data Quality Objectives (DQO)
3.2	Critical and Non-Critical Measurements
3.3	Test Matrix
3.4	Sample Collection and Handling
3.4.1	Process Measurements and Information Collection
3.5	Analytical Procedures
3.6	Cost Evaluation
3.7	Waste Reduction

4.0 QUALITY ASSURANCE/QUALITY CONTROL REQUIREMENTS
4.1	Quality Assurance Objectives
4.2	Data Reduction, Validation, and Reporting
4.2.1	Internal Quality Control Checks
4.2.2	Calculation of Data Quality Indicators
4.2.2.1	Precision
4.2.2.2	Accuracy
4.2.2.3	Completeness
4.2.2.4	Comparability
4.2.2.5	Representativeness
4.2.2.6	Sensitivity
4.3	Quality Audits

5.0 PROJECT MANAGEMENT
5.1	Organization/Personnel Responsibilities
5.2	Test Plan Modification

6.0 HEALTH AND SAFETY PLAN
6.1	Hazard Communication
6.2	Emergency Response Plan
6.3	Hazard Controls Including Personal Protective Equipment
6.4	Lockout/Tagout Program
6.5	Material Storage
6.6	Safe Handling Procedures

7.0 WASTE MANAGEMENT

8.0 TRAINING

9.0 REFERENCES

10.0 DISTRIBUTION

LIST OF APPENDICES

APPENDIX A:	Test Plan Modification
APPENDIX B:	ETV-MF Operation Planning Checklist
APPENDIX C:	Job Training Analysis Form
APPENDIX D:	ETV-MF Project Training Attendance Form



ACRONYMS & ABBREVIATIONS

COC	Chain of Custody
CTC	Concurrent Technologies Corporation
DOT	Department of Transportation
DQO	Data Quality Objectives
EHS	Environmental, Health and Safety
EPA	U.S. Environmental Protection Agency
ERP	Emergency Response Plan
ETV-MF	Environmental Technology Verification Program for Metal Finishing Pollution Prevention Technologies
GVP	Generic Verification Protocol
ID	Identification
IDL	Instrument Detection Limit
JTA	Job Training Analysis
LM	Laboratory Manager
MDL	Method Detection Limit
MRL	Method Reporting Limit
MSDS	Material Safety Data Sheet
NRMRL	National Risk Management Research Laboratory
OSHA	Occupational Safety and Health Administration
P	Percent Recovery
PARCCS	Precision, Accuracy, Representativeness, Comparability, Completeness, and Sensitivity
PPE	Personal Protective Equipment
PQL	Practical Quantification Limit
QA	Quality Assurance
QC	Quality Control
QMP	Quality Management Plan
Ref.	Reference
RPD	Relative Percent Difference
SR	Sample Result
SSR	Spiked Sample Result
U.S.	United States



1.0 INTRODUCTION

The purpose of this generic verification protocol (GVP) is to document the objectives,
procedures, and other aspects of testing that shall be utilized during verification testing of
aqueous cleaner recycling technologies. This GVP has been prepared in conjunction with
the U.S. Environmental Protection Agency's (EPA's) Environmental Technology
Verification Program for Metal Finishing Pollution Prevention Technologies (ETV-MF).
The objective of this program is to identify promising and innovative pollution
prevention technologies through EPA-supported performance verifications. The ETV-
MF Center prepares a test plan for testing individual technologies at a metal finishing site
where the technology is installed. The results of verification tests are documented in
verification reports that provide objective performance data to metal finishers,
environmental permitting agencies, and industry consultants. Verification statements,
which are executive summaries of verification reports, are prepared and signed by the
EPA National Risk Management Research Laboratory (NRMRL) Director and the CTC
ETV-MF Program Manager. After one or more technologies of a class have been tested,
a GVP is prepared to guide the development of future test plans. Verification of two
aqueous cleaner recycling technologies (microfiltration and microbiological digestion)
has been performed and forms the basis for this GVP.

Under the ETV Program, verification testing is conducted only on commercial-ready
technologies. As defined by EPA, commercial-ready technologies are either in use or
ready for full-scale production. This does not include technologies at the bench or pilot
scale, or those in the research and development stage.

Aqueous cleaners are widely used in the metal finishing industry to prepare parts for
subsequent processing. The cleaners used vary widely in composition, and choosing a
cleaner for a particular application is complex. Some of the factors considered are the
materials of the part, the types and amounts of the soils to be removed, the degree of
cleanliness required, the amount of time available for cleaning, and the available
methods for disposal of the used cleaner. A cleaner that is successful in one application
may be unsuited for other applications.

As a cleaner bath is used, soils are removed from parts and are retained by the cleaning
bath. The accumulation of soils limits the useful life of the cleaning bath, since there is
a limit to the amount of soil the bath can retain prior to soils being redeposited on the
parts. However, the constituents of the bath that perform the cleaning operation are still
present. Therefore, when used cleaner baths are discarded, useful chemicals are
discarded along with the soils. These discarded chemicals add to the treatment burden
of the metal finishing operation.

If the soils can be removed from the cleaning bath, the useful life of the bath can be
greatly extended, reducing the amount of waste requiring treatment and disposal. The
methods for removing soil while preserving the beneficial components of a cleaning
bath will vary with the soil and the cleaner. Microfiltration, biological digestion,
precipitation, and other methods have been successfully used.


Verification testing of methods to recycle aqueous cleaners will be different for different
recycling technologies. However, in broad terms, the goal of verification testing of
aqueous cleaner recycling technologies will be to measure the efficiency of soil removal
from the cleaning bath and to verify the extent to which the recycling technology
removes beneficial cleaning components.

This generic verification protocol has been structured based on a format developed for
ETV-MF projects. This document describes the intended approach and explains plans for
testing with respect to areas such as test methodology, procedures, parameters, and
instrumentation. Also included are quality assurance/quality control (QA/QC)
requirements for testing that will ensure the accuracy of data, the use of proper data
interpretation procedures, and an emphasis on worker health and safety considerations.
The following sections (sections 2 through 10) are required to be included in all
verification test plans specific to technologies for recycling aqueous cleaners.

2.0 TECHNOLOGY DESCRIPTION

2.1	Theory of Operation

The theory of operation of the technology shall be described. In general, aqueous cleaner
recycling technologies operate by removing contaminants from the working bath in order
to extend bath life. This has been accomplished by many commercially available
separation technologies. The basic requirement for aqueous cleaner recycling is that the
separation method removes the contaminants of interest with little or no effect on the
constituents of the cleaner. Additionally, the materials of construction of the recycling
technology must be compatible with the chemistry of the cleaner and the working
environment.

2.2	Technology Description

A detailed description of the recycling technology, as installed at the test site, shall be
provided. Pictures or flow diagrams are helpful in describing the metal finishing process
as well as how the technology interfaces with the process.

2.3	Test Site Description

A description of the test site shall be provided. Information on expected pollutants and
concentrations, flow rates, number of process lines, square feet processed per day, and
hours of operation are helpful.

2.4	Previous Testing

Summarize any previous testing done with the technology, including the type of
application and results. A review of existing manufacturer or third party performance test
results will assist in planning the verification testing. Cite any literature searches that
have been conducted and include the scope and quality of available data. Existing reports


will provide a starting point for setting test conditions. The information should include
the following:

•	Test conditions
•	Description of the aqueous cleaner solution
•	Key operating parameters
•	Operating range
•	Performance results
•	QA/QC procedures/techniques
•	Technology/application sensitivities
•	Interferences

Summarize the available information rather than providing an extensive set of data or
other details. If applicable or vital to the results reported for the verification, portions of
this information can be placed into the appendix of the test plan. However, if the
information can be found in a publicly available document (e.g., QA standard, methods,
guidance documents, government report or trade journal), it is only necessary to
summarize it and include a reference to sources of such information.

3.0 TEST DESIGN

In general, there are four objectives in evaluating aqueous cleaner recycling technologies:

1)	Determine the efficiency of bath contaminant removal.

2)	Determine the amount of bath constituents removed by the technology.

3)	Determine the cost of using the technology.

4)	Determine the reduction of waste caused by the technology.

3.1	Data Quality Objectives (DQO)

The systematic planning elements of the data quality objectives process identified in
"Guidance for the Data Quality Objectives Process" (EPA QA/G-4, August 2000) shall
be utilized during preparation of verification test plans. The verification project team,
composed of representatives from the verification organization, technology vendor, test
site, analytical laboratory, and EPA, jointly develops the test objectives; critical and non-
critical measurements; test matrix; sample quantity, type and frequency; analytical
methods; and QA objectives to arrive at an optimized test designed to verify the
performance of the technology.

3.2	Critical and Non-Critical Measurements

Measurements that will be taken during testing are classified as either critical or non-
critical. Critical measurements are those that are necessary to achieve the primary
project objectives.


3.3	Test Matrix

The test matrix is dependent upon the technology undergoing verification. In general,
technologies operate as either flow-through technologies (for example, filtration) or in
situ technologies (for example, biological digestion). The design of the matrix is either
event driven, condition driven, or time driven. In the case of flow-through technologies,
samples shall be taken from the influent, product, and waste streams. Sampling for in
situ technologies shall be of the working bath, possibly at various points of the system.
When verifying an in situ technology, it is also necessary to take samples to determine
the rate of contaminant introduction to the bath. In order to assess variability of the
system, a minimum of four sampling events (days, runs, etc.) should be scheduled.

3.4	Sample Collection and Handling

Prior to the start of testing, the variability of the streams to be sampled should be
evaluated, either by a review of records or a preliminary sampling episode. Streams with
a high degree of variability will require composite sampling, while steady state streams
can use grab sampling.

At the time of sampling, each sample container shall be labeled with the date, time, and
sample identification (ID) number. Samples to be analyzed at an off-site laboratory shall
be accompanied by a chain of custody (COC) form that the verification Project Manager
will generate. The COC form will provide the following information (a record-structure
sketch follows the list):

•	project name
•	project address
•	sampler's name
•	sample numbers
•	date/time samples were collected
•	sample matrix
•	required analyses
•	appropriate COC signatures
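
Where a test plan captures COC information electronically, the fields above map naturally onto a simple record structure. The following Python sketch is illustrative only; the class and field names are hypothetical and are not part of this protocol.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ChainOfCustodyRecord:
        # Fields mirror the COC items listed above; names are illustrative.
        project_name: str
        project_address: str
        sampler_name: str
        sample_numbers: List[str]
        collection_date_time: str        # date/time samples were collected
        sample_matrix: str               # e.g., influent, product, or waste stream
        required_analyses: List[str]
        coc_signatures: List[str] = field(default_factory=list)  # custody transfers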

All samples shall be transported in appropriate sample transport containers (e.g., coolers
with packing and blue ice). The transport containers shall be secured with tape to ensure
sample integrity during the delivery process to the analytical laboratory. The verification
Project Manager or designee will perform sampling and labeling, and ensure that samples
are properly secured and shipped to the laboratory for analysis per regulations required
by the Department of Transportation (DOT) and the Occupational Safety and Health
Administration (OSHA).

3.4.1 Process Measurements and Information Collection

Process measurements and information collection shall be conducted to provide
the data required in supporting the test objectives. Additionally, process data are
collected to indicate proper operation of the technology. Typically, samples of


the cleaner are taken and analyzed for contaminants and cleaner constituents.
Additional measurements may include process conditions such as temperature,
flow rate, amount of work processed, etc. Calibration information for any
equipment used to collect data should be included in the individual test plan.

3.5	Analytical Procedures

Chemical analyses of the samples shall be conducted to evaluate the effectiveness of the
technology in removing contaminants from the cleaner and preserving cleaner
constituents. Particular methods will depend on the cleaner being used and the soils
being cleaned. Whenever possible, standard EPA analysis methods shall be used.
Analytical laboratories used must be accredited by the National Environmental
Laboratory Accreditation Program. The test plan should include sample amount,
preservation, container required, and hold time for each method used.

3.6	Cost Evaluation

In order to evaluate the costs associated with a technology, various areas will require
evaluation. They may include the following: consumable costs (chemicals, filters, etc.),
energy costs (heating, pumps, etc.), labor costs, and possibly others. These costs can be
obtained from the test site records and the technology vendor. When possible, these costs
should be compared to the process used prior to installation of the technology.
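
As a simple illustration of the comparison described above, the following Python sketch totals hypothetical annual cost categories for the baseline process and the recycling technology; all names and figures are invented for the example.

    def annual_operating_cost(consumables, energy, labor, other=0.0):
        """Sum the cost categories named above, in dollars per year."""
        return consumables + energy + labor + other

    # Hypothetical figures for the pre-installation process and the technology
    baseline = annual_operating_cost(consumables=12000, energy=3000, labor=8000)
    with_recycling = annual_operating_cost(consumables=7000, energy=3500, labor=6000)
    print(baseline - with_recycling)  # projected annual savings: 6500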

3.7	Waste Reduction

The amount of waste generated by the technology should be evaluated. This is generally
calculated from the amount of rinsewater required and the bath dump and remake
frequency, but different technologies may reduce waste in other ways. When possible,
this should be compared to the process used prior to the installation of the technology.
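
For example, where waste reduction is driven mainly by bath dump frequency, the calculation might look like the following sketch; the bath volume and dump frequencies are hypothetical.

    def annual_dump_volume(bath_volume_gal, dumps_per_year):
        """Waste volume from bath dumps alone; rinsewater use is tracked separately."""
        return bath_volume_gal * dumps_per_year

    # Hypothetical: recycling extends bath life from 12 dumps/year to 3
    before = annual_dump_volume(500, 12)  # 6,000 gal/year
    after = annual_dump_volume(500, 3)    # 1,500 gal/year
    print(before - after)                 # 4,500 gal/year of waste avoided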

4.0 QUALITY ASSURANCE/QUALITY CONTROL REQUIREMENTS

QA/QC activities shall be performed according to the applicable section of the
Environmental Technology Verification Program Metal Finishing Technologies Quality
Management Plan (ETV-MF QMP) [Ref. 1].

4.1 Quality Assurance Objectives

The first QA objective is to ensure that the process operating conditions and test methods
are maintained and documented throughout each test and laboratory analysis of samples.
The second QA objective is to use standard test methods (where possible) for laboratory
analyses. Data quality objectives for precision, accuracy, and completeness for each
analysis method must be determined prior to testing.


4.2 Data Reduction, Validation, and Reporting
4.2.1 Internal Quality Control Checks

Raw Data Handling. Raw data are generated and collected by laboratory analysts
at the sampling site. These include original observations, printouts, and readouts
from equipment for sampling, standards, and reference QC analyses. Data may
be collected both manually and electronically. At a minimum, the date, time,
sample ID, raw signal or processed signal, and/or qualitative observations shall be
recorded. Comments to document unusual or non-standard observations shall be
included in the data package submitted by the laboratory to the verifying
organization.

Raw data are typically processed manually by the analyst, automatically by an
electronic program, or electronically after being entered into a computer. The
analyst shall be responsible for scrutinizing the data according to laboratory
precision, accuracy, and completeness policies. Raw data bench sheets and
calculation or data summary sheets shall be kept together for each sample batch.
From the standard operating procedure and the raw data bench files, the steps
leading to a final result may be traced.

Data Package Validation. The generating analyst will assemble a preliminary data
package, which shall be initialed and dated. This package shall contain all QC
and raw data results, calculations, electronic printouts, conclusions, and
laboratory sample tracking information. A second analyst will review the entire
package and check sample and storage logs, standard logs, calibration logs, and
other files, as necessary, to ensure that all tracking, sample treatments, and
calculations are correct. After the package is reviewed in this manner, a
preliminary data report shall be prepared, initialed, and dated. The entire package
and final report shall be submitted to the Laboratory Manager (LM) for review.

The LM shall be ultimately responsible for all final data released from the
laboratory. The LM or designee will review the final results for conformance to
task QA objectives. If the LM or designee suspects an anomaly or non-
concurrence with expected or historical performance values, or with task
objectives for test specimen performance, the raw data and the analysis
procedures shall be reviewed. If suspicion about data validity still exists after
internal review of laboratory records, the LM shall authorize a re-test. If sufficient
sample is not available for re-testing, a re-sampling shall occur. If the sampling
window has passed, or re-sampling is not possible, the LM shall flag the data as
suspect. The LM signs and dates the final data package.

Data Reporting. The final report shall contain the laboratory sample identification,
date reported, date analyzed, the analyst, the standard operating procedure used
for each parameter, the process or sampling point identification, the final result,


and the results of all QA/QC analyses (field duplicates, matrix spike, and matrix
spike duplicates).

4.2.2 Calculation of Data Quality Indicators

Analytical performance requirements are expressed in terms of precision,
accuracy, representativeness, comparability, completeness, and sensitivity
(PARCCS). Summarized below are definitions and QA objectives for each
PARCCS parameter.

The influent, effluent, and waste streams are different matrices. Therefore, a field
duplicate, matrix spike and matrix spike duplicate from all three streams shall be
analyzed for every ten samples collected from these streams.

The following sections identify the formulae used to calculate the PARCCS
parameters.

4.2.2.1	Precision

Precision is a measure of the agreement or repeatability of a set of
replicate results obtained from duplicate analyses made under identical
conditions. Precision is estimated from analytical data and cannot be
measured directly. The precision of a duplicate determination can be
expressed as the relative percent difference (RPD), and calculated as:

RPD = |X1 - X2| / [(X1 + X2)/2] x 100%

where:

X1 = larger of the two observed values
X2 = smaller of the two observed values

Multiple determinations shall be performed for each test on the same test
specimen.
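
For reference, a minimal Python sketch of the RPD calculation above, with a worked duplicate pair (values are hypothetical):

    def relative_percent_difference(x1, x2):
        """RPD = |X1 - X2| / ((X1 + X2)/2) x 100, per the formula above."""
        return abs(x1 - x2) / ((x1 + x2) / 2) * 100

    # Duplicate analyses of 10.2 and 9.8 mg/L agree within RPD = 4.0 percent
    print(relative_percent_difference(10.2, 9.8))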

4.2.2.2	Accuracy

Accuracy is a measure of the agreement between an experimental
determination and the true value of the parameter being measured.
Accuracy is estimated through the use of known reference materials or
matrix spikes. It is calculated from analytical data and is not measured
directly. Spiking of reference materials into a sample matrix is the
preferred technique because it provides a measure of the matrix effects on


analytical accuracy. Accuracy, defined as percent recovery (P), is
calculated as:

P = [(SSR - SR) / SA] x 100%

where:

SSR	=	spiked sample result

SR	=	sample result (native)

SA	=	concentration added to the spiked sample

Analyses shall be performed with periodic calibration checks with
traceable standards to verify instrumental accuracy. These checks shall be
performed according to established procedures in the laboratory(ies)
contracted for this verification test. Analysis of spiked samples shall be
performed to determine percent recoveries as a means of checking method
accuracy.
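
A minimal Python sketch of the percent recovery calculation above, with a worked matrix spike example (values are hypothetical):

    def percent_recovery(ssr, sr, sa):
        """P = (SSR - SR) / SA x 100, per the formula above."""
        return (ssr - sr) / sa * 100

    # Native sample at 2.0 mg/L, spiked with 5.0 mg/L, measured at 6.8 mg/L
    print(percent_recovery(ssr=6.8, sr=2.0, sa=5.0))  # 96.0 percent recovery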

4.2.2.3	Completeness

Completeness is defined as the percentage of measurements judged to be
valid compared to the total number of measurements made for a specific
sample matrix and analysis. Completeness is calculated using the
following formula:

Completeness = (Valid Measurements / Total Measurements) x 100%

Experience on similar projects has shown that laboratories typically
achieve about 90 percent completeness. QA objectives will be satisfied if
the percent completeness is 90 percent or greater.
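
A worked example of the completeness calculation, using hypothetical measurement counts:

    def completeness(valid_measurements, total_measurements):
        """Completeness = valid / total x 100, per the formula above."""
        return valid_measurements / total_measurements * 100

    # 54 of 60 measurements judged valid meets the 90 percent objective
    print(completeness(54, 60))  # 90.0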

4.2.2.4	Comparability

Comparability is a qualitative measure designed to express the
confidence with which one data set may be compared to another. Sample
collection and handling techniques, sample matrix type, and analytical
method all affect comparability. Comparability is limited by the other
PARCCS parameters because data sets can be compared with confidence
only when precision and accuracy are known. Comparability will be
achieved in this technology verification by the use of consistent methods
during sampling and analysis and by traceability of standards to a reliable
source.

4.2.2.5	Representativeness

Representativeness refers to the degree to which the sample represents the
properties of the particular wastestream being sampled. For the purposes


of this demonstration, representativeness shall be determined by
submitting identical samples (field duplicates) to the laboratory for
analysis. The samples will be representative if the relative percent
difference between the sample and the field duplicate is similar to or less
than the precision (laboratory duplicates) calculation of the sample.

4.2.2.6 Sensitivity

Sensitivity is the measure of the concentration at which an analytical
method can positively identify and report analytical results. The
sensitivity of a given method is commonly referred to as the detection
limit. Although there is no single definition of this term, the following
terms and definitions of detection shall be used for this program.

Instrument Detection Limit (IDL) is the minimum concentration that can
be distinguished from instrument background noise.

Method Detection Limit (MDL) is a statistically determined
concentration. It is the minimum concentration of an analyte that can be
measured and reported with 99 percent confidence that the analyte
concentration is greater than zero as determined in the same or a similar
matrix. (Because of the lack of information on analytical precision at this
level, sample results greater than the MDL but less than the practical
quantification limit (PQL) shall be laboratory qualified as "estimated.")

MDL is defined as follows for all measurements:

MDL = t(n-1, 1-α = 0.99) x s

where:

MDL = method detection limit
t(n-1, 1-α = 0.99) = Student's t-value for a one-sided 99 percent
confidence level and a standard deviation
estimate with n-1 degrees of freedom
s = standard deviation of the replicate analyses
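
A short Python sketch of the MDL calculation above; it assumes SciPy is available for the one-sided Student's t-value, and the replicate values are hypothetical.

    from statistics import stdev
    from scipy.stats import t

    def method_detection_limit(replicates):
        """MDL = t(n-1, 1-alpha = 0.99) x s, per the formula above."""
        n = len(replicates)
        s = stdev(replicates)          # standard deviation, n-1 degrees of freedom
        return t.ppf(0.99, n - 1) * s  # one-sided 99 percent Student's t-value

    # Seven hypothetical replicate analyses of a low-level spike, in mg/L
    print(method_detection_limit([0.52, 0.49, 0.55, 0.50, 0.48, 0.53, 0.51]))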

Method Reporting Limit (MRL) is the concentration of the target analyte
that the laboratory has demonstrated the ability to measure within
specified limits of precision and accuracy during routine laboratory
operating conditions. (This value is variable and highly matrix-dependent.
It is the minimum concentration that will be reported without
qualifications by the laboratory.)


4.3 Quality Audits

Technical System Audits. The verification organization may perform a technical systems
audit during the verification test. The EPA QA Manager may conduct an audit to assess
the quality of the verification test.

Internal Audits. In addition to the internal laboratory quality control checks, internal
quality audits shall be conducted to ensure compliance with written procedures and
standard protocols.

Corrective Action. Corrective action for any deviations to established QA and QC
procedures during verification testing shall be performed according to section 2.10,
Quality Improvement, of the ETV-MF QMP [Ref. 1].

Laboratory Corrective Action. Examples of non-conformances include invalid calibration
data, inadvertent failure to perform method-specific QA, process control data outside
specified control limits, failed precision and/or accuracy indicators, etc. Such non-
conformances shall be documented on a standard laboratory form and provided along
with the results to the verification organization. Corrective action shall involve taking all
necessary steps to restore a measuring system to proper working order and summarizing
the corrective action and results of subsequent system verifications on a standard
laboratory form. Some non-conformances are detected while analysis or sample
processing is in progress and can be rectified in real time at the bench level. Others may
be detected only after a processing trial and/or sample analyses are completed. Typically,
the LM detects these types of non-conformances. In all cases of non-conformance, the
LM shall consider sample re-analysis or instrument calibration verification as sources of
corrective action. If insufficient sample is available or the holding time has been
exceeded, the LM shall contact the Verification Project Manager to discuss generating
new samples. In all cases, a non-conformance shall be rectified before sample processing
and analysis continues.

5.0 PROJECT MANAGEMENT

5.1 Organization/Personnel Responsibilities

The Verification Project Team that will conduct the evaluation of the system shall be
identified by the CTC ETV-MF Program Manager. The verification organization will
have ultimate responsibility for all aspects of the technology evaluation. The Verification
Project Manager shall be assigned by the verification organization. The Verification
Project Manager and/or his staff designee shall be on-site throughout the test period and
will conduct or oversee all sampling and related measurements. The CTC ETV-MF QA
Manager shall approve the test plan and determine the requirement for a technical system
audit. Additional members of the project team include representatives from the
technology vendor, the test site, and the analytical service laboratory.


5.2 Test Plan Modification

In the course of verification testing, it may become necessary to modify the test plan due
to unforeseen events. These modifications shall be documented using a Test Plan
Modification Request (Appendix A), which is submitted to the verification organization
for approval. Upon approval, the modification request shall be assigned a number,
logged, and transmitted to the requestor for implementation.

6.0 HEALTH AND SAFETY PLAN

The Health and Safety Plan provides guidelines for recognizing, evaluating, and
controlling health and physical hazards during the verification test. More specifically, the
Plan specifies the training, materials, and equipment necessary for assigned personnel to
protect themselves from hazards created by chemicals and any waste generated by the
process. Test site plans can be used if available. If a test site plan is not available, one
must be developed.

6.1	Hazard Communication

All personnel assigned to the project shall be informed of the potential hazards, signs
and symptoms of exposure, methods or materials to prevent exposures, and procedures to
follow if there is contact with a hazardous substance. All appropriate Material Safety
Data Sheet (MSDS) forms shall be available for chemical solutions used during testing.

6.2	Emergency Response Plan

An Emergency Response Plan (ERP) protects employees, assigned project personnel, and
visitors in the event of an emergency at the facility. All assigned personnel shall be
provided with information about the plan during the initial training, and the plan shall be
accessible to them for the duration of the project.

6.3	Hazard Controls Including Personal Protective Equipment

All assigned project personnel shall be provided with appropriate personal protective
equipment (PPE) and any training needed for its proper use, considering their assigned
tasks. The use of PPE shall be covered during training as indicated in section 8.0.

6.4	Lockout/Tagout Program

The Lockout/Tagout Program safety requirements shall be reviewed prior to testing, and
relevant lockout/tagout provisions implemented as required. Lockout/tagout safety must
be practiced if electrical, pressure, or other sources of energy must be installed or
disconnected during verification testing.


6.5	Material Storage

Any materials used during the project shall be kept in proper containers and labeled
according to Federal, state, and local laws. Proper storage of the materials shall be
maintained based on associated hazards. Spill trays or similar devices shall be used as
needed to prevent material loss to the surrounding area. The test site Hazard
Communication Program is a source of information on these requirements.

6.6	Safe Handling Procedures

All chemicals and wastes or samples shall be transported on-site in non-breakable
containers used to prevent spills. Spill kits shall be strategically located in the project
area. These kits contain various sizes and types of sorbents for emergency spill clean-up.
Emergency spill clean-up shall be performed according to the host facility ERP.

7.0 WASTE MANAGEMENT

If waste is generated in the course of verification testing, waste handling, storage, and
disposal should be covered by the host facility's waste permit. If not, special
accommodations must be made, including contacting the local regulatory authority.

8.0 TRAINING

Environmental, health, and safety (EHS) training shall be coordinated with the test site.
All verification program personnel shall undergo EHS training prior to initiating the
verification test.

Also, the ETV-MF Job Training Analysis (JTA) Plan [Ref. 2] shall be utilized to identify
additional training requirements relating to quality control, worker safety and health, and
environmental issues. The purpose of this JTA Plan is to outline the overall procedures
for identifying the hazards, quality issues, and training needs. This JTA Plan establishes
guidelines for creating a work atmosphere that meets the quality, environmental, and
safety objectives of the verification program. The JTA Plan describes the method for
studying verification project activity and identifying training needs. The ETV-MF
Operation Planning Checklist (Appendix B) shall be used as a guideline for identifying
potential hazards, and the Job Training Analysis Form (Appendix C) shall be used to
identify training requirements. After completion of the form, applicable training shall be
performed. Training shall be documented on the ETV-MF Project Training Attendance
Form (Appendix D).

9.0 REFERENCES

1) Concurrent Technologies Corporation. "Environmental Technology Verification
Program Metal Finishing Technologies (ETV-MF) Quality Management Plan, Rev.
1." March 26, 2001.


2)	Concurrent Technologies Corporation. "Environmental Technology Verification
Program Metal Finishing Technologies (ETV-MF) Pollution Prevention
Technologies Pilot Job Training Analysis Plan." May 10, 1999.

3)	EPA Office of Research and Development. "Preparation Aids for the Development
of Category IV Quality Assurance Project Plans." EPA/600/8-91/006, February
1991.

10.0 DISTRIBUTION

Distribution of the verification test plan to all participants (the verification organization,
technology vendor, test site, analytical laboratory, and EPA) is required. Distribution of
the test plan will occur after the test plan has been signed by the verification organization,
the CTC Project Manager, the CTC ETV-MF Program Manager, the U.S. EPA ETV
Center Manager, the test site, and the technology vendor.

APPENDIX A
Test Plan Modification


Test Plan Modification

In the course of verification testing, it may become necessary to modify the test plan due to
unforeseen events. The purpose of this procedure is to provide a vehicle whereby the necessary
modifications are documented and approved.

The Test Plan Modification Request form is the document to be used for recording these
changes. The following paragraphs provide guidance for filling out the form to ensure a
complete record of the changes made to the original test plan. The form appears on the next
page.

The person requesting the change should record the date and project name in the form's heading.
Program management will provide the request number.

Under Original Test Plan Requirement, reference the appropriate sections of the original test
plan, and insert the proposed modifications in the section titled Proposed Modification. In the
Reason section, document why the modification is necessary; this is where the change is
justified. Under Impact, give the impact of not making the change, as well as the consequences
of making the proposed modification. Among other things, the impact should address any
changes to cost estimates and project schedules.

The requestor should then sign the form and obtain the signature of the project manager. The
form should then be transmitted to the CTC ETV-MF Program Manager, who will either approve
the modification or request clarification. Upon approval, the modification request shall be
assigned a number, logged, and transmitted to the requestor for implementation.


TEST PLAN MODIFICATION REQUEST

Date:	 Number:	 Project:	

Original Test Plan Requirement:	

Proposed Modification:	

Reason:

Impact:

Approvals:

Requestor:	

Project Manager:	
Program Manager:


APPENDIX B

ETV-MF Operation Planning Checklist



ETV-MF Operation Planning Checklist

The ETV-MF Project Manager must complete this form prior to initiation of verification testing.
If a "yes" is checked for any item below, an action must be specified on the Job Training
Analysis Form to resolve the concern.

Project Name: 		Expected Start Date: 	

ETV-MF Project Manager: 	

Will the operation or activity involve the following? (Check Yes or No; initial and date each
item in the "Initials & Date Completed" column when the concern is addressed.)

•	Equipment requiring specific, multiple steps for controlled shutdown? (E.g., in case of
	emergency, does equipment require more than simply pressing a "Stop" button to shut off
	power?) Special procedures for emergency shutdown must be documented in the Test Plan.
•	Equipment requiring special fire prevention precautions (e.g., Class D fire extinguishers)?
•	Modifications to or impairment of building fire alarms, smoke detectors, sprinklers or other
	fire protection or suppression systems?
•	Equipment lockout/tagout or potential for dangerous energy release? Lockout/tagout
	requirements must be documented in the Test Plan.
•	Working in or near confined spaces (e.g., tanks, floor pits) or in cramped quarters?
•	Personal protection from heat, cold, chemical splashes, abrasions, etc.? Use the Personal
	Protective Equipment Program specified in the Test Plan.
•	Airborne dusts, mists, vapors and/or fumes? Air monitoring, respiratory protection, and/or
	medical surveillance may be needed.
•	Noise levels greater than 80 decibels? Noise surveys are required. Hearing protection and
	associated medical surveillance may be necessary.
•	X-rays or radiation sources? Notification to the state and exposure monitoring may be
	necessary.
•	Welding, arc/torch cutting or other operations that generate flames and/or sparks outside of
	designated weld areas? Follow Hot Work Permit Procedures identified in the Test Plan.
•	The use of hazardous chemicals? Follow the Hazard Communication Program, MSDS
	Review for Products Containing Hazardous Chemicals. Special training on handling
	hazardous chemicals and spill clean-up may be needed. Spill containment or local
	ventilation may be necessary.
•	Working at a height of six feet or greater?
•	Processing or recycling of hazardous wastes? Special permitting may be required.
•	Generation or handling of waste?
•	Work to be conducted before 7:00 a.m., after 6:00 p.m. and/or on weekends? Two people
	must always be in the work area together.
•	Contractors working in CTC facilities? Follow the Hazard Communication Program.
•	Potential discharge of wastewater pollutants?
•	EHS aspects/impacts and legal and other requirements identified?
•	Contaminants exhausted either to the environment or into buildings? Special permitting or
	air pollution control devices may be necessary.
•	Any other hazards not identified above (e.g., lasers, robots, syringes)? Please indicate with
	an attached list.

The undersigned responsible party certifies that all applicable concerns have been indicated in
the "yes" column, necessary procedures shall be developed, and applicable personnel will
receive required training. As each concern is addressed, the ETV-MF Project Manager will
initial and date the "Initials & Date Completed" column above.

ETV-MF Project Manager:	 	 	

(Name)	(Signature)	(Date)


APPENDIX C

Job Training Analysis Form



Job Training Analysis Form

ETV-MF Project Name: 	

Basic Job Step	Potential EHS Issues	Potential Quality Issues	Training

(Blank rows are provided on the form for completion.)

ETV-MF Project Manager:	

Name	Signature	Date

APPENDIX D

ETV-MF Project Training Attendance Form



ETV-MF Project Training Attendance Form

ETV-MF Project:	

Date Training Completed	Employee Name (Last, First)	Training Topic	Test Score (If applic.)

(Blank rows are provided on the form for completion.)

ETV-MF Project Manager: 	

Name	Signature	Date