EPA
United States
Environmental Protection
Agency
EPA Microbiological Alternate Test
Procedure (ATP) Protocol for Drinking
Water, Ambient Water, Wastewater, and
Sewage Sludge Monitoring Methods
September 2010
-------
U.S. Environmental Protection Agency
Office of Water (4303T)
1200 Pennsylvania Avenue, NW
Washington, DC 20460
EPA-821-B-10-001
-------
DISCLAIMER
Neither the United States Government nor any of its employees, contractors, or their employees makes any
warranty, expressed or implied, or assumes any legal liability or responsibility for any third party's use of
or the results of such use of any information, apparatus, product, or process discussed in this report, or
represents that its use by such party would not infringe on privately owned rights. Mention of company
names, trade names, or commercial products in this protocol does not constitute endorsement or
recommendation for use.
Questions concerning this report should be addressed to:
Robin K. Oshiro
Engineering and Analysis Division (4303T)
U.S. EPA Office of Water, Office of Science and Technology
1200 Pennsylvania Avenue, NW
Washington, DC 20460
oshiro.robin@epa.gov or OSTCWAMethods@epa.gov
-------
FOREWORD
This document describes a process for seeking EPA approval of microbiological alternate test procedures
(ATPs) or new methods for use in monitoring drinking water, ambient water, wastewater, and sewage
sludge (biosolids). This document serves as a supplement to the ATP guidelines at 40 CFR 136.4, 136.5,
and 141.27.
This ATP protocol describes a process for conducting side-by-side method comparisons and for
conducting quality control (QC) acceptance criteria-based method studies for EPA-approved reference
methods with QC acceptance criteria. Additionally, in some cases the revised protocol provides
applicants an opportunity to demonstrate comparability by meeting QC acceptance criteria associated
with the EPA-approved reference methods for different combinations of analyte and determinative
technique.
Under EPA's ATP program, any person may apply for approval of the use of an ATP or new method to
test for a regulated analyte. EPA anticipates that the standardized procedures described herein should
generally expedite the approval of ATPs, encourage the development of innovative technologies, and
enhance the overall utility of the EPA-approved methods for compliance monitoring under the National
Pollutant Discharge Elimination System (NPDES) permit program and national primary drinking water
regulations (NPDWRs).
This document is not a legal instrument and does not establish or affect legal obligations under Federal
regulations. EPA reserves the right to change this protocol without prior notice.
Questions or comments regarding this document should be directed to:
Robin K. Oshiro
Engineering and Analysis Division (4303T)
U.S. EPA Office of Water, Office of Science and Technology
1200 Pennsylvania Avenue, NW
Washington, DC 20460
oshiro.robin@epa.gov or OSTCWAMethods@epa.gov
-------
TABLE OF CONTENTS
SECTION 1.0 INTRODUCTION 4
1.1 Background and Objectives 4
1.2 Types of Applications 5
1.2.1 Limited Use 5
1.2.2 Nationwide Use 5
1.3 Types of Studies 6
1.4 Scope of Alternate Test Procedures 6
1.4.1 EPA-Approved Reference Methods 6
1.4.2 Modifications to Sample Preparation Techniques 7
SECTION 2.0 APPLICATION 8
2.1 Submission Addresses and Approval Authority 8
2.2 Application Information 8
2.3 Reason for ATP 10
2.4 Standard EPA Method Format 10
2.5 Method Comparison Table 10
2.6 Method Development Information 10
2.7 Study Plan 11
2.8 Study Report 11
2.9 Proprietary Information in Applications 11
SECTION 3.0 METHOD FORMAT 13
3.1 Scope and Application 13
3.2 Summary of Method 13
3.3 Method Definitions 13
3.4 Interferences 13
3.5 Safety 14
3.6 Equipment and Supplies 14
3.7 Reagents and Standards 14
3.8 Sample Collection, Preservation, and Storage 14
3.9 Quality Control 14
3.10 Calibration and Standardization 14
3.11 Procedure 14
3.12 Data Analysis and Calculations 15
3.13 Method Performance 15
3.14 Pollution Prevention 15
3.15 Waste Management 15
3.16 References 15
3.17 Tables, Diagrams, Flowcharts, and Validation Data 15
SECTION 4.0 STUDY PLAN 16
4.1 Background 16
4.2 Objectives 16
4.3 Study Design 16
4.4 Coordination 17
4.4.1 Study Management 17
4.4.2 Technical Approach 17
4.5 Data Reporting 17
-------
SECTION 5.0 QUALITY ASSURANCE/QUALITY CONTROL 18
5.1 Quality Assurance 18
5.2 Quality Control 18
5.2.1 Analyst Counting Variability 20
5.2.2 Autoclave Sterilization Verification 20
5.2.3 Dilution/Rinse Water Blanks 20
5.2.4 Incubator/Water Bath Temperatures 20
5.2.5 Initial Demonstration of Capability 20
5.2.6 Initial Precision and Recovery 21
5.2.7 Matrix Spike and Matrix Spike Duplicate Samples 21
5.2.8 Media Sterility Checks 21
5.2.9 Method Blank 21
5.2.10 Ongoing Demonstration of Capability (ODC) Samples 22
5.2.11 Ongoing Precision and Recovery (OPR) Samples 22
5.2.12 Positive/Negative Controls 22
5.2.13 Preparation Blanks (PB) 23
5.2.14 Refrigerator/Freezer Temperatures 23
5.2.15 Sample Processing Equipment Sterility Checks 23
SECTION 6.0 STUDY DESIGN 24
6.1 Side-by-Side Comparison Studies 24
6.1.1 Number of Laboratories 24
6.1.2 Number of Samples 24
6.1.3 Verification of Results 27
6.2 QC Acceptance Criteria-Based Comparison Studies 28
6.2.1 Number of Laboratories 28
6.2.2 Number of Matrices 28
6.2.3 Number of Replicates per Matrix 28
SECTION 7.0 SAMPLE PREPARATION AND ANALYSIS 29
7.1 Collection of Samples for Analysis 29
7.1.1 Source Water Characterization 29
7.2 Sample Spiking and "Stressing" Procedures for Bacteriological Methods 29
7.2.1 Drinking Water: Spiking and Chlorine-Stressing 30
7.2.2 Preparation of Enumerated Spiking Suspension 32
7.2.3 Log Phase Growth Curve 33
7.2.4 Commercially Available Enumerated Spikes 34
7.3 Spiking Procedures for Virus Methods 34
7.3.1 Cell Monolayer Propagation 34
7.3.2 Propagation of Virus Stock Suspension 34
7.3.3 Titering of the Virus Stock Suspension 34
7.4 Spiking Procedures for Cryptosporidium and Giardia 34
7.5 Analysis of Samples 35
7.5.1 Side-by-Side Comparison Studies 35
7.5.2 QC Acceptance Criteria-Based Comparison Studies 35
7.6 Verification of Results 36
7.6.1 Verification of Results from Bacteriological Methods 36
7.6.2 Verification of Results from Virus Methods 37
7.6.3 Verification of Results from Cryptosporidium and Giardia Methods 37
-------
SECTION 8.0 REVIEW OF STUDY RESULTS 38
8.1 Assessment of Compliance with Approved Study Plan 38
8.2 Data Review 38
8.3 Data Validation 38
8.4 Development of Descriptive Statistics 38
8.4.1 Mean Recovery 38
8.4.2 Precision 39
8.4.3 False Positive Rates, False Negative Rates, Sensitivity, and Specificity 39
8.5 Statistical Assessment of Method Comparability 41
8.5.1 Presence/Absence Methods 41
8.5.2 Quantitative Methods 42
8.5.3 QC Acceptance Criteria-Based Comparison Studies 47
8.6 Method Recommendation and Approval 49
SECTION 9.0 STUDY REPORT 50
9.1 Background 50
9.2 Study Objectives and Design 51
9.3 Study Implementation 51
9.4 Data Reporting and Validation 51
9.5 Results 51
9.6 Data Analysis and Discussion 52
9.7 Conclusions 52
9.8 Appendix A - Method 52
9.9 Appendix B - Study Plan 52
9.10 Appendix C - Supporting Data 52
9.10.1 Raw Data 52
9.10.2 Electronic Data Reporting 53
9.10.3 Example Calculations 53
9.11 Appendix D - Supporting References 53
SECTION 10.0 REFERENCES 54
APPENDIX A: GLOSSARY 56
APPENDIX B: ATP APPLICATION FORM 67
APPENDIX C: APPLICATION INVENTORY FORM 70
APPENDIX D: DATA ELEMENTS AND EXAMPLE BENCH SHEETS 73
-------
SECTION 1.0 INTRODUCTION
1.1 Background and Objectives
In accordance with the Clean Water Act (CWA) and Safe Drinking Water Act (SDWA), the U.S.
Environmental Protection Agency (EPA) promulgates guidelines establishing test procedures (analytical
methods) for data gathering and compliance monitoring under National Pollutant Discharge Elimination
System (NPDES) permits and national primary drinking water regulations (NPDWRs). The approved test
procedures can be found in the Code of Federal Regulations (CFR) at 40 CFR Part 136 for wastewater and
ambient water and 40 CFR Part 141 for drinking water. In addition, EPA's regulations at 40 CFR 136.4,
136.5, and 40 CFR 141.27, allow entities to apply for Agency permission to use an alternate test procedure
(ATP) in place of an EPA-approved reference method. Figure 1.1 below summarizes the ATP or new
method review process within EPA. These regulations are the basis for the Agency's alternate test
procedure (ATP) program for water methods that is administered by the Office of Water, Office of Science
and Technology, Director of Analytical Methods.
An ATP is a modification of an EPA-approved reference method or a procedure that uses the same
determinative technique (i.e., the physical and/or chemical process used to determine the identity and
concentration of an analyte) and measures the same analyte(s) of interest as the EPA-approved reference
method. The use of a different determinative technique to measure the same analyte(s) of interest as an
EPA-approved reference method is considered a new method.
Under the ATP program, an organization or individual may apply for approval of an ATP or new method
to be used as an alternate to an EPA-approved reference method. The applicant is generally responsible for
characterizing method performance of its proposed alternate test procedure prior to submission to the ATP
program. EPA can provide assistance to applicants in the development of a study plan to demonstrate
comparability with the EPA-approved reference method. Figure 1.1 summarizes the ATP or new method
review process within EPA. The Agency reviews the ATP package, approves or disapproves the
application, and, for nationwide applications, will generally propose to include successful ATPs in the
CFR (unless the ATP is for limited use or constitutes a minor modification - See Appendix A).
Figure 1-1. Summary of the ATP or New Method Review Process
[Figure 1-1 is a flowchart summarizing how an ATP application is routed through EPA review. Acronyms used in the figure: AW = Ambient Water; DW = Drinking Water; LU = Limited-use; NW = Nationwide-use; OW = Office of Water; RA = Regional Administrator; WW = Wastewater]
-------
The protocol illustrates how applicants can demonstrate comparability by meeting quality control (QC)
acceptance criteria associated with EPA-approved reference methods for which those criteria have been
developed.
The ATP program provides laboratories and regulated facilities with an opportunity to enhance compliance
monitoring and encourages the use of innovative technologies. Approval for an ATP or new method may
be sought when, for example, the alternate procedure reduces analytical costs, overcomes matrix
interferences problems, improves laboratory productivity, or reduces the amount of hazardous materials
used and/or produced in the laboratory.
Any person or organization may apply to gain approval for the use of an ATP or new method for
determination of a specific constituent that is regulated under the NPDES permit program or the NPDWRs.
An ATP applicant generally may demonstrate comparability of its proposed ATP or new method with the
EPA-approved reference method using the procedures described in this protocol. Other possible method
comparison procedures include those provided by organizations such as ASTM (Reference 10.5), AOAC
International (Reference 10.1), and ISO (Reference 10.8).
1.2 Types of Applications
The types of applications submitted may depend on the intended use of the ATP. Methods intended for
use in demonstrating compliance with the NPDES permit program (wastewater or ambient water ATPs)
may be submitted for approval for limited-use (single laboratory) or for nationwide-use (all laboratories).
Because only the Administrator has the authority to approve an alternate analytical technique for SDWA
purposes, EPA will generally consider proposed methods intended for use in demonstrating compliance
with NPDWRs (drinking water ATPs) that are submitted for approval for nationwide-use only.
1.2.1 Limited Use
The primary intent of the limited-use ATP is to allow use of an ATP or new method by a single laboratory.
Limited-use ATPs can be applied to one or more matrix types, excluding drinking water matrices; limited-
use applications generally will not apply to Office of Ground Water and Drinking Water (OGWDW) ATP
applications (Reference 10.18). If a method developer intends to apply the method to more than one
matrix type, method studies should be conducted on each matrix type. Generally, nine different
wastewater types should be analyzed to demonstrate the ATP or new method will be applicable to most
other matrix types. If method modifications are within the specified flexibility of the EPA-approved
reference method and all QC acceptance criteria are met, it generally will not be necessary to submit the
modification to the ATP program.
1.2.2 Nationwide Use
The primary intent of a nationwide-use ATP is to allow use of an ATP or new method by all regulated
entities and laboratories for one or more matrix types including drinking water. Nationwide-use approval
allows vendors to establish that new devices and reagents produce results that are acceptable for
compliance monitoring purposes, and allows environmental laboratories across the United States to apply
new technologies or modified techniques throughout their chain of laboratories to one or more matrix
types. If a method developer intends to apply the method to more than one matrix type, method studies
should be conducted on each matrix type. Generally, nine different wastewater types should be analyzed
to demonstrate the ATP or new method will be applicable to most other wastewaters. If method
modifications are within the specified flexibility of the EPA-approved reference method and all QC
acceptance criteria are met, it generally will not be necessary to submit the modification to the ATP
program.
-------
1.3 Types of Studies
The type of study most useful in seeking approval of an ATP or new method generally depends on whether
or not the EPA-approved reference methods contain QC acceptance criteria. There are two basic types of
studies described in this protocol:
• Side-by-Side Method Comparison Study. A side-by-side method comparison study generally
consists of parallel testing of an ATP or new method alongside an EPA-approved reference method to
determine whether the performance of the new or modified method is acceptable compared to the
reference method.
• QC Acceptance Criteria-Based Method Comparison Study. For EPA-approved reference methods
that contain (or are supplemented with) QC acceptance criteria for most combinations of analyte(s) and
determinative technique(s), the goal of the study is for the applicant to demonstrate that its ATP or new
method is able to meet the QC acceptance criteria of the EPA-approved reference method (or other
EPA-specified document) for the applicable combination of analyte and determinative technique
through a QC acceptance criteria-based comparison study.
Specific guidelines for the studies can be found in Section 5.0: Quality Control, Section 6.0: Study Design,
Section 7.0: Sample Preparation and Analysis, and Section 9.0: Review of Study Results.
1.4 Scope of Alternate Test Procedures
This protocol for demonstration of comparability, submission, and approval of an ATP or new method
offers flexibility to modify EPA-approved reference methods. Generally, an applicant should demonstrate
and document that the modified method produces results better than or equal to those produced by an
appropriate EPA-approved reference method for the applicable combination of analyte and determinative
technique.
1.4.1 EPA-Approved Reference Methods
The ATP process is based on comparing the performance of an ATP or new method to an EPA-approved
reference method through a side-by-side comparison study or a QC acceptance criteria-based comparison
study. Method comparability is demonstrated when results produced by an ATP meet or exceed the
performance criteria associated with the EPA-approved reference method. Table 1-1 below lists the EPA-
approved reference methods for the analytes covered by this protocol. This table will be updated as
necessary as additional pathogens are added to the list or advances in technology merit a change in the
EPA-approved reference method. When performing a study, the applicant should use the reference
method that uses the same determinative technique (e.g., MF, MPN) as the ATP for the analyte(s) of
interest. If the applicant is validating a new method, which generally will use a determinative technique
that is not currently approved for use with the analyte(s) of interest, then the applicant should consult EPA
prior to commencing the study to determine the most appropriate reference method.
Table 1-1. EPA-Approved Reference Methods

| Analyte | Method Format¹ | EPA-Approved Reference Method²,³ | 40 CFR Citation |
| Total coliforms | MPN | SM 9221B | 136.3, 141.21, 141.74 |
| Total coliforms | MF | SM 9222B | 136.3, 141.21, 141.74 |
| Total coliforms | Presence/Absence | SM 9221D | 141.21 |
| Fecal coliforms | MPN | USEPA Methods 1680/1681 | 136.3, 141.21, 141.74, 503.8(b) |
| Fecal coliforms | MF | SM 9222D | 136.3, 141.21, 141.74, 503.8(b) |
| E. coli | MPN | SM 9221F | 136.3, 141.21, 141.74 |
| E. coli | MF | For CWA use: USEPA Method 1603; for SDWA use: mEndo + NA-MUG | 136.3, 141.21, 141.74 |
| HPC | Pour Plate | SM 9215B | 141.74 |
| Fecal streptococcus | MPN | SM 9230B | 136.3 |
| Fecal streptococcus | MF | SM 9230C | 136.3 |
| Enterococcus | MPN | SM 9230B | 136.3, 141.402 |
| Enterococcus | MF | USEPA Method 1600 | 136.3, 141.402 |
| Salmonella | MPN | USEPA Method 1682 | 136.3, 503.8(b) |
| Enteric virus | Plaque Assay | EPA Document⁴ | 503.8(b) |
| Helminth ova | Microscopy | EPA Document⁵ | 503.8(b) |
| Aeromonas | MF | USEPA Method 1605 | 141.40 |
| Coliphage | Plaque Assay | USEPA Method 1602 | 141.402 |
| Coliphage | Two-Step Enrichment | USEPA Method 1601 | 141.402 |
| Cryptosporidium | Filtration/IMS/FA | USEPA Methods 1622/1623 | 136.3, 141.74 |
| Giardia | Filtration/IMS/FA | USEPA Method 1623 | 136.3, 141.74 |

¹ MPN = most probable number; MF = membrane filtration; IMS/FA = immunomagnetic separation/fluorescent antibody.
² "SM" refers to Standard Methods for the Examination of Water and Wastewater. For the edition(s) approved for a specific method, consult the CFR sections referenced in the column headed "40 CFR Citation" (References 10.2, 10.3, and 10.4).
³ Not all methods have been approved for use in all matrices. Please see the "40 CFR Citation" column for the specific methods approved for a specific matrix.
⁴ See References 10.12 and 10.13.
⁵ See References 10.12 and 10.19.
1.4.2 Modifications to Sample Preparation Techniques
A sample preparation technique is any technique in the analytical process conducted at the laboratory that
precedes the determinative technique (i.e., the physical and/or chemical process by which measurement of
the identity and concentration of an analyte is made). Sample preparation techniques include the
procedures, equipment, reagents, etc., that are used in the preparation and cleanup of a sample for analysis.
Laboratories generally may modify sample preparation techniques, provided the modification is not
explicitly prohibited in the EPA-approved reference method that is being modified and provided the
modification can be demonstrated to produce results equal or superior to results produced by an EPA-
approved reference method for each combination of analyte and determinative technique.
-------
SECTION 2.0 APPLICATION
ATP applications should be submitted in triplicate to EPA to facilitate the review process. The
application consists of a completed ATP application form (a sample application form is provided in
Appendix B) with any attachments. Electronic submissions are also generally acceptable and often may
accelerate the review process.
2.1 Submission Addresses and Approval Authority
A summary of ATP submission information and approval authorities is provided in Table 2-1.
Table 2-1. Submission of Alternate Test Procedure Applications

| Level of Use | Applicant | Submit Application To¹ | Approval Authority |
| Limited Use for Wastewater or Ambient Water | EPA Regional laboratories | EPA Regional Administrator (Regional ATP Coordinator) | EPA Regional Administrator |
| Limited Use for Wastewater or Ambient Water | States, commercial laboratories, individual dischargers, or permittees in States that do not have the authority to administer Clean Water Act and Safe Drinking Water Act monitoring programs | EPA Regional Administrator (Regional ATP Coordinator) | EPA Regional Administrator |
| Limited Use for Wastewater or Ambient Water | States, commercial laboratories, individual dischargers, or permittees in States that have the authority to administer Clean Water Act and Safe Drinking Water Act monitoring programs | Director of State Agency issuing the NPDES permit² | EPA Regional Administrator |
| Nationwide Use for Drinking Water, Wastewater, Ambient Water, Sewage Sludge (Biosolids) | All applicants | ATP Program Coordinator, EPA Headquarters | EPA Administrator |

¹ See Appendix E for EPA addresses.
² The Regional Administrator or the Director of the State Agency issuing the NPDES permit may choose to forward limited-use applications to the Director of Analytical Methods, Attn: ATP Program Coordinator, for an approval recommendation. Generally, the Regional Administrator or the Director of the State Agency issuing the NPDES permit will forward a copy of the approval to the Director of Analytical Methods, Attn: ATP Program Coordinator.
Generally, upon receipt, the application will be assigned an identification number, and a confirmation
letter referencing this identification number will be sent to the applicant. The applicant should use the
identification number in all future communications concerning the application.
2.2 Application Information
The following information should be provided on the ATP application form (Appendix B):
• Name, mailing address, phone number, and email address of the applicant
• Date of submission of the application
-------
• Method number, title, and revision date of the ATP or new method submitted for review
• The analyte(s) included in the ATP or new method submitted for review
• The matrix or matrices to which the ATP applies
• EPA-approved reference method used for demonstration of comparability
• Type of application (i.e., wastewater, drinking water, ambient water, point source categories regulated
at 40 CFR Parts 400-499)
• The level of use desired (i.e., limited use or nationwide use)
• Type of study (side-by-side comparison or QC acceptance criteria-based comparison study)
• Applicant's NPDES permit number, the issuing agency, the type of permit and the discharge serial
number (if applicable)
The following items should be submitted as attachments to the initial application:
• Reason for proposing the ATP or new method
• The proposed ATP or new method prepared in standardized format (Section 3.0)
• A method comparison table that gives a side-by-side comparison of the steps of the proposed ATP or
new method and the EPA-approved reference method (Section 2.5)
• Method development information including preliminary method performance studies
• Study plan for EPA review and comment (Section 4.0)
A study plan is generally not needed with the application if an applicant is unsure whether or not a
modification is allowed within the method-specified flexibility. In such cases, the applicant may request
that EPA determine the usefulness of a full ATP comparability assessment based on the other information
submitted with the application. From this information, EPA can determine whether a full ATP assessment
will be helpful, whether the proposed modification is considered to be a minor modification (i.e., employs
the same chemistry and/or biological principles as the EPA-approved reference method to determine the
presence/absence or to quantify the amount of the target organism in a sample), or whether the proposed
modification is considered to be within the specified flexibility of the EPA-approved reference method.
The elements of a complete application are presented in Table 2-2. A list of the information discussed in
detail in Sections 2.3 to 2.8 is provided in Appendix C. EPA will generally seek all application
information and attachments before the application is considered complete.
Table 2-2. Application Information
Application Information
Completed application form
Reason for ATP
Method in EPA format
Method comparison table
Method development information
Study plan (to be approved by EPA before proceeding with study)
Study report (final report generally considered part of a complete application)
Note: Although the application process generally begins with the initial submission of ATP materials, the
application is not usually considered to be complete until the final study report has been submitted, and all
EPA questions on the report have been resolved.
-------
2.3 Reason for ATP
The entity that proposes an ATP should indicate why the ATP is being proposed. Examples include, but
are not limited to, the following:
• The ATP improves method performance
• The ATP provides equivalent method components for a lower cost
• The ATP enables laboratories to perform analyses more efficiently
• The ATP successfully overcomes some or all of the interferences associated with the EPA-approved
reference method
• The ATP significantly reduces the amount of hazardous wastes generated by the laboratory
• The ATP provides another means for measuring a contaminant (i.e., provides more choices)
2.4 Standard EPA Method Format
A method description is needed for all ATPs including minor modifications. In accordance with the
standard EPA method description format advocated by EPA's Environmental Monitoring Management
Council (EMMC), methods should contain 17 specific topical sections in a designated order. The 17
sections are listed in Section 3.0 of this document. The method description should also contain a date and
the ATP case number so that the method version is properly identified. Additional numbered sections may
be inserted starting with Section 11.0, Procedure, as appropriate for a particular method. For detailed
information on the EPA format for proposed methods, see the Guidelines and Format document (Reference
10.14).
2.5 Method Comparison Table
As part of the application, the applicant should provide a two-column table comparing the proposed ATP
or new method to the EPA-approved reference method. The two-column method comparison table should
include the number and title of each method, the latest revision date of the proposed ATP, and a detailed
discussion of each of the 17 topics specified by the standard EPA method format (as applicable). Each
topic should be discussed on a separate row in the method comparison table. The applicant should
highlight any differences between the proposed ATP and the EPA-approved reference method.
2.6 Method Development Information
Before EPA reviews the study plan and works with applicants on ATP studies, the applicant should
provide data on performance of the modified or new method in the water matrix for which the ATP or new
method is being applied. These data may have been generated during method development by the vendor,
or through independent tests by third-party laboratories. Examples include, but are not limited to, replicate
spiked reagent water or replicate spiked matrix water tests. It is the responsibility of the applicant to
provide sufficient data to demonstrate that the ATP or new method performs sufficiently at a preliminary
level in the matrix of interest to merit evaluation of an ATP. If sufficient data are not available, EPA may
request that additional studies be conducted prior to the review of the ATP study plan.
In addition to data, the following descriptive method information will facilitate EPA's evaluation of the
ATP application:
• The purpose and intended use of the method
• The analytical basis for the method, noting any relationship of the method to other existing analytical
methods and indicating whether the method is associated with a sampling method
10
-------
• Method limitations and an indication of any means of recognizing cases where the method may not be
applicable to specific matrix types (e.g., turbidity greater than 50 NTU)
• The basic steps involved in performing the test and data analysis
• Options to the method, if applicable
This information also will aid EPA in preparing the docket and the preamble for the proposed rule that will
be published in the Federal Register if EPA proposes to approve the ATP for nationwide use.
2.7 Study Plan
Prior to conducting any study, the applicant should submit a study design for EPA review and comment.
A detailed procedure (Section 3.0) for the new method or the modification should be included as an
attachment to the study plan. Generally, EPA will evaluate the study plan to ensure that the appropriate
data quality objectives identified in this protocol are defined and addressed. EPA comments will be
incorporated into the study design and this process will be repeated until EPA has approved the study
design. Generally, the study plan should contain the elements listed below:
• Background
• Objectives
• Study Design
• Coordination
• Data Reporting
These elements are further described in Section 4.0.
2.8 Study Report
The applicant should conduct a study and provide a comprehensive study report with the ATP or new
method application. The study report should include the following elements:
• Background
• Study Objectives and Design
• Study Implementation
• Data Reporting and Validation
• Results
• Data Analysis and Discussion
• Conclusions
• Appendix A - Method
• Appendix B - Study Plan
• Appendix C - Supporting Data
• Appendix D - Supporting References
These elements are further described in Section 9.0.
2.9 Proprietary Information in Applications
All information provided to the Federal government is subject to the requirements of the Freedom of
Information Act. Therefore, any proprietary information submitted with the proposed ATP application
should be marked as confidential. However, EPA prefers that supporting documentation labeled as
confidential business information not be submitted as part of the ATP application. If proprietary
11
-------
information is determined to be essential to the application, EPA staff will request the information and will
handle such information according to the regulations in subparts A and B of 40 CFR Part 2.
Specifically, in accordance with 40 CFR §2.203, a business that submits information to EPA may assert a
business confidentiality claim covering the information by placing on (or attaching to) the information at
the time it is submitted to EPA, a cover sheet, stamped or typed legend, or other suitable form of notice
employing language such as trade secret, proprietary, or company confidential. Allegedly confidential
portions of otherwise non-confidential documents should be clearly identified by the business, and may be
submitted separately to facilitate identification and handling by EPA. If the business desires confidential
treatment only until a certain date or until the occurrence of a certain event, the notice should so state.
Please be advised, however, that any methods proposed in the Federal Register cannot be claimed as
confidential business information.
If a claim of business confidentiality is not made at the time of submission, EPA will make such efforts as
are administratively practicable to associate a late claim with copies of previously submitted information in
EPA files. However, EPA cannot ensure that such efforts will be effective due to the nature of application
review that may already be in progress.
ATP study designs, ATP study data and method performance characteristics developed from ATP study
data are not confidential business information, and may become publically available.
12
-------
SECTION 3.0 METHOD FORMAT
Because alternate test procedures may be approved by EPA as comparable to the reference methods, and
may be implemented by multiple laboratories, it is important that the written procedures include all of the
information necessary to use the technique in the laboratory, including but not limited to: reagents and
equipment, sample collection and preservation procedures, quality control, and a detailed description of the
procedure. Updated versions of this information are also needed for minor method modifications. The
information described below should be provided in the method. In addition, EPA recommends the
Environmental Monitoring Management Council (EMMC) format described below be used for all ATP
applications (i.e., including minor method modifications).
Sections 3.1 through 3.17 provide a list of EMMC method sections and a general description of the type of
information that should be included in each section. The date and revision number and ATP case number
of the method should be included on the cover page. In addition, the date should be included as a footer
on each page of the method. A detailed description of method format guidelines, as well as an example of
a formatted method, is provided in Reference 10.14. The detailed information in Reference 10.14 is
provided as guidance for the method write-up and as such, specific suggestions for font size, margins, etc.
are optional.
3.1 Scope and Application
Include a list of target organisms (by common name), taxonomic group and their CAS registry numbers or
other accepted numbering systems (if available), the matrices to which the method applies, a generic
description of method sensitivity (the minimum number of organisms the method can detect per unit
volume or mass, if known), and the data quality objectives that the method is designed to meet or the
monitoring programs that the method is designed to support.
3.2 Summary of Method
Summarize the method in a few paragraphs. The purpose of the summary is to provide a succinct
overview of the method procedure to aid the reviewer or data user in understanding the method and how
the results are generated. Include a general description of the method procedure, sample volume, type of
media used, preparation steps, incubation time and temperatures, and the techniques used for qualitative or
quantitative determinations.
3.3 Method Definitions
Provide definitions of terms that are necessary to understand how the method is used or what the results
represent. This should include a definition of the target organism or group of organisms, relative to the
determinative step of the method. For extensive lists of definitions, this section may simply refer to a
glossary attached at the end of the method document.
3.4 Interferences
This section should discuss any known method interferences such as toxic materials, particulates, non-
target organisms, etc. If known interferences in the reference method are not interferences in the alternate
method, this also should be clearly stated.
13
-------
3.5 Safety
This section should discuss only those safety issues specific to the method and beyond the scope of routine
laboratory practices. Target analytes or reagents that pose specific health, toxicity, or safety issues should
be addressed in this section.
3.6 Equipment and Supplies
For critical equipment that may affect the performance of the method, cite the manufacturer, model name,
and catalog or product number of the equipment that was used to develop or validate the method; note that
equivalent equipment can be used, if applicable. Use generic language for standard laboratory glassware
and disposables.
3.7 Reagents and Standards
Provide sufficient details on the concentration and preparation of reagents and standards to allow the work
to be duplicated, but avoid lengthy discussions of common procedures. If only pre-prepared proprietary
reagents can be used, specify this. Include catalog and/or product numbers where appropriate. Indicate
shelf life of packaged materials and special storage specifications.
3.8 Sample Collection, Preservation, and Storage
Provide information on sample collection, preservation, shipment, storage conditions, and holding times.
If effects of holding time were specifically evaluated, provide reference to relevant data.
3.9 Quality Control
Describe specific quality control (QC) measures that enable one to establish the sensitivity, specificity,
false positive rates, false negative rates, bias, and precision of measurements using the method, and to
demonstrate that the measurements are free from contamination. Specific QC measures may include positive and negative
controls, duplicate samples, method blanks, and media sterility checks. Indicate which QC measures are
appropriate initially, before a laboratory uses the method, and which are appropriate on an ongoing basis.
Indicate frequencies for each QC measure and list minimum specifications or acceptance ranges (see
Section 5.0). Indicate corrective actions that should be taken when QC measures are not met. Define all
terms in the method definitions section.
3.10 Calibration and Standardization
Discuss initial calibration specifications for instruments used in the method (e.g., water baths, refrigerators,
thermometers, balances, pH meters, microscopes, etc.). Indicate frequency of such calibrations; refer to
performance specifications; and indicate corrective actions that should be taken when performance
specifications are not met. This section may also include procedures for calibration, verification, or
continuing calibration, or these steps may be included in the procedure section.
3.11 Procedure
Provide a detailed description of the sample processing and analysis steps. Avoid unnecessarily restrictive
instructions, but provide sufficient detail for manual procedures so that analysts in other laboratories
can perform the method consistently. Ranges should be provided for temperature requirements, time
requirements, etc.
14
-------
3.12 Data Analysis and Calculations
Identify qualitative and quantitative aspects of the method. List criteria for the identification of target
organism(s) and interpretation of results for all steps of the method, including criteria for presumptive and
confirmed results. Provide equations used to derive final sample results. Provide discussion of estimating
detection limits, recoveries, specificity, false positive/false negative rates, etc., if appropriate.
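For illustration only, the sketch below shows the kind of final-result equation this section is expected to document, using the conventional membrane filtration calculation; the function name and example values are hypothetical and are not drawn from any specific EPA-approved method.

```python
# Illustrative sketch only: an example of the type of calculation a method's
# "Data Analysis and Calculations" section should spell out. The function name
# and the example values are hypothetical.

def mf_result_per_100ml(colonies_counted: int, volume_filtered_ml: float) -> float:
    """Conventional membrane filtration result:
    CFU/100 mL = (target colonies counted x 100) / sample volume filtered (mL)."""
    return colonies_counted * 100.0 / volume_filtered_ml

# Example: 27 target colonies counted on a filter through which 50 mL of sample
# was passed gives 54.0 CFU/100 mL.
print(mf_result_per_100ml(27, 50.0))
```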
3.13 Method Performance
Provide detailed information on method performance, including data on precision, bias (for quantitative
methods), specificity, detection limits (including the method by which they were determined and matrices
to which they apply), and statistical procedures used to develop performance specifications (i.e., recovery,
precision, specificity, false positive/false negative rates, etc.). Where performance is tested relative to the
reference method, provide a summary of the side-by-side comparison of performance versus reference
method specifications.
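As a hedged illustration of the qualitative performance measures named above, the sketch below computes sensitivity, specificity, and false positive/false negative rates from verified results; the counts are hypothetical, and the definitions used in an actual ATP study should follow Section 8.4.3 of this protocol.

```python
# Illustrative sketch only: conventional definitions of qualitative performance
# measures computed from verified results. The counts below are hypothetical.

def qualitative_performance(true_pos: int, false_pos: int, true_neg: int, false_neg: int):
    """Return (sensitivity, specificity, false positive rate, false negative rate),
    each expressed as a percentage."""
    sensitivity = 100.0 * true_pos / (true_pos + false_neg)
    specificity = 100.0 * true_neg / (true_neg + false_pos)
    false_pos_rate = 100.0 * false_pos / (false_pos + true_neg)
    false_neg_rate = 100.0 * false_neg / (false_neg + true_pos)
    return sensitivity, specificity, false_pos_rate, false_neg_rate

# Example: of 100 verified-positive samples, 95 were reported positive; of 100
# verified-negative samples, 90 were reported negative.
print(qualitative_performance(true_pos=95, false_pos=10, true_neg=90, false_neg=5))
```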
3.14 Pollution Prevention
Describe aspects of this method that minimize or prevent pollution that may be attributable to the reference
method.
3.15 Waste Management
Describe how waste is minimized and how samples and waste are properly disposed of.
3.16 References
Include source documents, publications, etc.
3.17 Tables, Diagrams, Flowcharts, and Validation Data
Additional information may be presented at the end of the method. Lengthy tables may be included here
and referred to elsewhere in the text by number. Diagrams should be included only for new or unusual
equipment or aspects of the method.
15
-------
SECTION 4.0 STUDY PLAN
Applicants should submit a study design for EPA review, comment, and approval prior to conducting the
side-by-side comparison study or the QC acceptance criteria-based comparison study. This process
protects the applicant by providing written approval of the study design before resources are spent to
conduct the study. The ATP program is intended to be flexible, and thus EPA may modify the study
design for a particular proposed method. Data from studies conducted without EPA review and approval
may not meet EPA's criteria, and may not adequately address the applicant's study objectives. A detailed
procedure (Section 3.0) for the ATP or new method should be included as an attachment to the study plan.
EPA will generally evaluate the study plan to verify that the appropriate data quality objectives identified
in this protocol are defined and addressed. EPA comments are incorporated into the study design. This
review/revision process is repeated until EPA has approved the study design.
Generally, the study design should include the information described in Sections 4.1 through 4.5.
4.1 Background
This section of the study plan should include the following information:
• A statement identifying the ATP as a new method or a modification of a reference method
• The EPA program(s) to which the ATP or new method applies (e.g., drinking water, wastewater,
ambient water, point source categories regulated at 40 CFR Parts 400-499, etc.)
• A short (one paragraph) summary of the ATP or new method
• The organization and method number of the reference method if applicable
• A description of the reasons for the extent of the modification, the logic behind the technical approach
to the modification, and the result of the modification
• The matrices (e.g., finished water, wastewater, ambient water, etc.), matrix types (e.g., turbidity greater
than 10 NTU, etc.), and/or media to which the ATP or new method is believed to be applicable
• A list of the analytes measured by the ATP or new method, including the corresponding CAS registry
number (if available) or other identification number
4.2 Objectives
Include a description of the new method or modification, describe the goals of the study, and define data
quality objectives.
4.3 Study Design
The following information should be included in the study design:
• Laboratories that will participate in the study (Sections 5.0 and 6.0)
• Number and type of samples to be analyzed (Section 6.0)
• Description of the matrices that will be used (Sections 6.0 and 7.0)
• Description of the spikes that will be used (Section 7.0)
• Description of the spiking procedure (Section 7.0)
• Positive and negative control organisms (Section 5.0)
• Quality control procedures that will be followed (Section 5.0)
16
-------
4.4 Coordination
Describe how the study will be coordinated, how the spikes will be shipped to the laboratories, and who
will compile the data for submission. Data compilation should not be performed by any of the analysts
conducting the sample analyses.
4.4.1 Study Management
This section of the study plan should include the following information:
• The organization responsible for managing the study
• The laboratories, facilities, and other organizations that will participate in the study
• A delineated study schedule including, but not limited to, sample collection, start of sample analysis,
interpretation of sample results, completion of study, etc.
4.4.2 Technical Approach
This section of the study plan should include the following:
• A description of how sample matrices and participating laboratories will be selected
• A description of how samples will be collected and distributed
• The numbers and types of analyses to be performed by the participating laboratories
• A description of sample spiking procedures
• A description of how analyses are to be performed
4.5 Data Reporting
List the data elements that will be collected and provide sample bench sheets (see Appendix D) that will be
used to record raw data during the study. Raw data should be submitted as an attachment to the Study
Report (Section 9.0). A list of the supporting data that should be included as Appendix C of the study
report is given in Section 9.10.1. This supporting information includes QC data, instrument logs, media
preparation records, and other parameters. Address the statistical analysis of the study results that will be
performed, if the statistical analyses will differ from those described in Section 8.0. Please note, however,
that EPA's evaluation of method performance will generally be based on the statistical analyses described
in Section 8.0.
17
-------
SECTION 5.0 QUALITY ASSURANCE/QUALITY CONTROL
For side-by-side comparison studies in which only one laboratory is performing analyses, the comparability
study should be conducted at an independent laboratory and the laboratory should be certified to perform
microbiological analyses under EPA's drinking water laboratory certification program. A laboratory with a
vested interest in the method, instrumentation, apparatus, reagents, media, or associated kits may not
participate in the side-by-side comparison study.
For QC acceptance criteria-based comparison studies, at least three independent laboratories should
participate. A laboratory with a vested interest in the ATP also may participate in the study, but in such
instances at least three independent laboratories should still participate and the majority of the
participating laboratories should be independent laboratories. All laboratories should be certified to
perform microbiological analyses under EPA's drinking water certification program or, if a laboratory
located outside of the U.S. is included, a comparable certification program. At least three of the
independent laboratories participating in the study should be certified for microbiological analyses under
EPA's drinking water certification program. If more than three laboratories participate, the majority of the laboratories should be
certified for microbiological analyses under EPA's drinking water certification program.
5.1 Quality Assurance
The laboratory should have a comprehensive quality assurance (QA) program in place and operating at all
times during the performance of the comparability study. General QA program guidance is provided at
http://www.epa.gov/quality/qs-docs/r2-final.pdf (Reference 10.11). The laboratory should adhere closely
to all QA and quality control (QC) measures in this protocol as well as the QC measures in the method(s).
Laboratory QA/QC criteria for facilities, personnel, and laboratory equipment are included in Standard
Methods 9020-Quality Assurance (Reference 10.4) and the U.S. EPA Manual for the Certification of
Laboratories Analyzing Drinking Water, Fourth Edition (March 1997) (Reference 10.13).
The laboratory should adhere to standard laboratory practices for cleanliness and environment, and to the
methods for glassware and apparatus, reagents, solvents, and safety. Additional guidelines regarding
general laboratory procedures should generally be followed, as specified in Sections 4 and 5 of the
Handbook for Analytical Quality Control in Water and Wastewater Laboratories, EPA-600/4-79-019
(Reference 10.19).
5.2 Quality Control
Laboratories participating in a comparability study should perform all QC procedures specified in the
methods, except where explicitly stated in the approved study plan. The QC procedures listed in Table 5-1
and described below should be performed, as appropriate, based on the technique used in the method.
Other QC procedures may be necessary, based on the approved study plan. The
laboratory should maintain records to define the quality of the data that are generated. The laboratory should
maintain a record of the date and results of all QC sample analyses. Laboratories should maintain reagent
and material lot numbers along with samples analyzed using each of the lots. Laboratories should also
maintain media preparation records.
Table 5-1 lists quality control measures for each laboratory participating in side-by-side comparison studies
and QC acceptance criteria-based comparison studies. Detailed descriptions of the QC measures are
provided in Sections 5.2.1 to 5.2.15. If contamination is detected in any of the blanks or sterility checks
described below, the source of contamination should be identified and corrected. The blank/sterility check
18
-------
and all samples associated with that contaminated blank/sterility check should be re-prepared and reanalyzed.
Measures taken to eliminate contamination should be reported.
Table 5-1. Quality control measures for each laboratory involved in the study

| Quality Control Measure | Frequency | Suggested for Side-by-Side Comparison Study | Suggested for QC Acceptance Criteria-Based Comparison Study |
| Analyst counting variability (Section 5.2.1) | 2 samples per study | ✓ | ✓ |
| Autoclave sterilization verification (Section 5.2.2) | Within one week prior to the start of the study | ✓ | ✓ |
| Dilution/rinse water blanks (Section 5.2.3) | 1 per every 20 samples or 1 per day of study, whichever is greater | ✓ | ✓ |
| Incubator/water bath temperatures (Section 5.2.4) | 2 times per day when used in study | ✓ | ✓ |
| Initial demonstration of capability (IDC) (Section 5.2.5) or initial precision and recovery (IPR) (Section 5.2.6) | Each laboratory participating in the study should generate acceptable IDC/IPR data | | ✓ |
| Matrix spike/matrix spike duplicate (Section 5.2.7) | Per approved study plan | | ✓ |
| Media sterility checks (Section 5.2.8) | 1 per each batch of media used in the study or per each test run, whichever is greater | ✓ | ✓ |
| Method blank (Section 5.2.9) | Per approved study plan | ✓ | ✓ |
| Ongoing demonstration of capability (ODC) (Section 5.2.10) or ongoing precision and recovery (OPR) (Section 5.2.11) | Included as part of the study; not needed as separate QC | | Optional |
| Positive and negative controls (Section 5.2.12) | 1 positive control and 1 negative control for each media/stain used in the study; see Section 5.2.12 for more details | ✓ | ✓ |
| Preparation blanks (Section 5.2.13) | Frequency depends on the type of method; see Section 5.2.13 for more details | ✓ | ✓ |
| Refrigerator/freezer temperatures (Section 5.2.14) | Once per day when used in study | ✓ | ✓ |
| Sample processing equipment sterility checks (Section 5.2.15) | Prior to the analysis of samples; see Section 5.2.15 for more details | ✓ | ✓ |
19
-------
5.2.1 Analyst Counting Variability
If the laboratory has two or more analysts, each should count colonies, plaques, or positive wells on the
same plate/tray from one positive field sample per month. Compare each analyst's count of the colonies,
plaques, or wells. Counts should fall within 10% between analysts. If counts fail to fall within 10% of
each other, analysts should perform additional sets of counts, until the number of target colonies, plaques,
or positive wells counted fall within 10% between analysts for at least three consecutive samples. If there
is only one analyst replicate counts should be done and be within 5% of original counts.
5.2.2 Autoclave Sterilization Verification
Autoclave sterilization verification should be performed within one week of the start of the study by
placing Bacillus stearothermophilus spore suspensions or strips inside glassware. Autoclave at 121°C for
15 minutes. Place Bacillus stearothermophilus spore suspensions in trypticase soy broth tubes and
incubate at 55°C for 48 hours. Check for growth to verify that sterilization was adequate. If sterilization
was inadequate, determine appropriate time for autoclave sterilization. Repeat spore test. The laboratory
should have historical data verifying that at a minimum, autoclave sterilization is performed on a monthly
basis.
5.2.3 Dilution/Rinse Water Blanks
The laboratory should analyze dilution/rinse water blanks to demonstrate freedom from contamination. An
aliquot of dilution/rinse water, processed exactly like a field sample, should be analyzed for every day of
the study or for every 20 samples, whichever is more frequent, and observed for contamination with the
agent of interest.
5.2.4 Incubator/Water Bath Temperatures
Incubator or water bath temperatures should be measured and recorded two times per day when in use.
Temperatures should be taken at least 4 hours apart and should be within the range of the desired
temperature as specified in each method. Thermometers used to measure "in-use" temperatures should be
calibrated yearly against an NIST traceable thermometer.
5.2.5 Initial Demonstration of Capability
Laboratories participating in a QC acceptance criteria-based comparison study should have successfully
performed an initial demonstration of capability (IDC) test for the new or modified method under
evaluation in the study. An IDC test is performed when QC acceptance criteria are available for the
evaluation of precision or recovery, but not both. The laboratory should perform an IDC as specified in the
method to demonstrate acceptable performance. The laboratory should complete any additional analyses
as specified in the study plan.
For the IDC test, the laboratory spikes and analyzes reference matrix (e.g., reagent water, buffered water,
etc.) samples to demonstrate acceptable performance with the method prior to the analysis of field samples.
The number of samples involved in the IDC varies by method. If the results of the IDC test meet all IDC
acceptance criteria (e.g., RSD, minimum number of samples positive, etc.), system performance will
generally be acceptable. If any of the IDC test results fail to meet the acceptance criteria, system
performance will generally be unacceptable. In this event, the laboratory should identify and correct the
problem and repeat the test. IDC tests should be accompanied by a method blank (Section 5.2.9).
20
-------
5.2.6 Initial Precision and Recovery
Laboratories participating in a QC acceptance criteria-based comparison study should have successfully
performed an initial precision and recovery (IPR) test for the method using the modified version of the
method under evaluation in the study. An IPR test is performed when QC acceptance criteria are available
for the evaluation of both precision and recovery. The laboratory should perform an IPR as specified in the
method to demonstrate acceptable performance. The laboratory should complete any additional analyses
as specified in the study plan.
For the IPR test, the laboratory spikes and analyzes reference matrix (e.g., reagent water, buffered water,
etc.) samples to establish the laboratory's ability to generate acceptable precision and recovery prior to the
analysis of field samples. Using results of the analyses, the laboratory calculates mean percent recovery
and relative standard deviation (RSD) of the recoveries for the analyte(s) and compares them with the
corresponding limits for the IPR test criteria in the method. If the RSD and the mean percent recovery
meet the acceptance criteria, system performance is acceptable and analysis of samples may begin. If the
RSD or the mean percent recovery is unacceptable, system performance will generally be unacceptable. In
this event, the laboratory should identify and correct the problem and repeat the test. IPR tests should be
accompanied by method blank tests (Section 5.2.9).
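The sketch below illustrates how mean percent recovery and RSD might be computed from IPR replicates; the spike level, replicate counts, and function name are hypothetical, and the acceptance limits to compare against are those given in the method being evaluated.

```python
# Illustrative sketch only: computing mean percent recovery and relative standard
# deviation (RSD) for an IPR test. The spike level and replicate counts below are
# hypothetical; compare the results with the IPR acceptance criteria in the method.

import statistics

def mean_recovery_and_rsd(observed_counts, spike_level):
    """Mean percent recovery and RSD (sample standard deviation of the percent
    recoveries divided by their mean, times 100)."""
    recoveries = [100.0 * obs / spike_level for obs in observed_counts]
    mean_recovery = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_recovery
    return mean_recovery, rsd

# Example: four reference-matrix replicates spiked at a nominal 50 organisms each.
mean_rec, rsd = mean_recovery_and_rsd([42, 47, 39, 45], spike_level=50)
print(f"Mean recovery = {mean_rec:.1f}%, RSD = {rsd:.1f}%")
```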
5.2.7 Matrix Spike and Matrix Spike Duplicate Samples
Matrix spike (MS) and matrix spike duplicate (MSD) samples are spiked matrix water samples analyzed by
the laboratory to verify acceptable method performance in the matrix being monitored. During routine
performance of the method, MS/MSD samples are analyzed by the laboratory for the first sample of any
new matrix that will be monitored, and on the 21st sample thereafter. During a QC acceptance criteria-
based comparison study for some methods (e.g., Cryptosporidium and Giardia methods), the laboratory
should analyze MS samples as part of routine laboratory QC; however, these analyses may not be necessary
during this study because MS/MSD samples using blinded spiking suspensions distributed by the study
coordinator will be used for the study.
5.2.8 Media Sterility Checks
Before using newly prepared media, a representative portion of each media batch needs to be checked for
sterility. The laboratory should test media sterility by incubating one unit (tube or plate) from each batch
of medium specified in the method or per each test run, whichever is more frequent, at the appropriate
temperature for the length of the method-specified incubation time and observing for growth.
5.2.9 Method Blank
Method blanks are reagent water blanks or other blanks (including but not limited to buffered water, tap
water, etc., depending on the method) analyzed to demonstrate freedom from contamination. Method
blanks should be analyzed at the frequency specified in the approved study plan. For QC acceptance
criteria-based comparison studies, the laboratory needs to analyze a method blank with the IPR and IDC
tests. During routine performance of the method, the laboratory should analyze at least one method blank
per every 20 test samples or per every week. During a QC acceptance criteria-based comparison study for
some methods (e.g., Cryptosporidium and Giardia methods), the laboratory should analyze method blanks
as part of routine laboratory QC, but method blanks also should be shipped to the laboratory as double-
blind samples by the study coordinator.
5.2.10 Ongoing Demonstration of Capability (ODC) Samples
Ongoing demonstration of capability samples are spiked reference matrix (e.g., reagent water, buffered
water, etc.) samples that are analyzed to demonstrate that the analytical system is in control on an ongoing
basis.
5.2.11 Ongoing Precision and Recovery (OPR) Samples
Ongoing precision and recovery samples are spiked reference matrix (e.g., reagent water, buffered water,
etc.) samples that are analyzed by the laboratory to verify that method performance criteria are being met.
During a performance based comparison study for some methods (e.g., Cryptosporidium and Giardia
methods), the laboratory should analyze OPR samples as part of routine laboratory QC because generally
OPR samples using blinded spiking suspensions distributed by the spiking coordinator will be used for the
study.
5.2.12 Positive/Negative Controls
Positive and negative controls are target and non-target organisms processed to ensure the laboratories are
familiar with the identification of the target organism and to ensure that confirmation test results are
appropriate. One hundred (each) positive and negative control tests should be conducted. These results
should be obtained from completed tests (e.g., obtaining a sample at 12 hours from a test that takes 24
hours to complete is unacceptable).
5.2.12.1 Positive/Negative Culture Controls (Culture-Based Methods)
Positive and negative culture controls refer to cultures that, when analyzed exactly like field samples,
produce a known positive or a known negative result, respectively, for a given type of media. One positive
culture control and one negative culture control should be prepared and analyzed for every media
(including confirmation media) used in the method whenever a new batch of medium or reagents is used,
every day of the study, or every 20 samples, or as specified in the method, whichever is more frequent.
Each control should be carried through the entire procedure and should exhibit the expected positive or
negative result.
5.2.12.2 Positive/Negative Staining Controls (Cryptosporidium and Giardia Methods)
A positive staining control for Cryptosporidium and Giardia methods is a slide containing positive antigen
or intact Cryptosporidium oocysts and Giardia cysts that is stained using the same procedure as used
for field samples or test samples. A negative staining control is a slide containing only phosphate buffered
saline (PBS) that is stained using the same procedure as used for field samples or test samples. The
laboratory should prepare and examine positive and negative staining controls with each batch of slides the
laboratory prepares during the study. Positive staining controls should exhibit acceptable fluorescence and
negative staining controls should not exhibit fluorescence.
5.2.12.3 Positive/Negative Staining Controls (Other)
At a minimum, the laboratory should prepare and examine a positive and negative control using the same
procedure as used for field or test samples whenever a new batch of media or reagents is used, every day
of the study, or every 20 samples, whichever is more frequent. Each control should be carried through the
entire procedure and should exhibit the expected positive or negative result.
5.2.13 Preparation Blanks (PB)
5.2.13.1 Membrane Filter Preparation Blank (PB-MF)
If membrane filtration is used, at the beginning and the end of each filtration series, a PB-MF is performed
by filtering 20-30 mL of dilution water through the membrane filter and testing for growth. If the control
indicates contamination with the target organism, all data from affected samples should be rejected. A
filtration series ends when 30 minutes or more elapse between sample filtrations.
5.2.13.2 Multiple-Tube Fermentation Test Preparation Blank (PB-MTF)
If a multiple-tube fermentation test is used, a volume of sterile buffered water that is analyzed exactly like
a field sample should be analyzed for every day of the study or every 20 samples, whichever is more
frequent. The preparation blank should be incubated with the sample batch and observed for growth of the
target organism. If the control indicates contamination with the target organism, all data from affected
samples should be rejected. If buffered water is not used for dilutions, only the multiple-tube fermentation
media should be included.
5.2.13.3 Other Preparation Blank (PB-Other)
A volume of sterilized water (e.g., reagent grade as defined in Specification D 1193, Annual Book of
ASTM Standards) that is analyzed exactly like a field sample should be analyzed for every day of the study
or every 20 samples, whichever is more frequent. The preparation blank should be incubated with the
sample batch and observed for growth of the target organism. If the control indicates contamination with
the target organism, all data from affected samples should be rejected.
5.2.14 Refrigerator/Freezer Temperatures
Refrigerator and freezer temperatures should be measured and recorded once per day when in use.
Refrigerator temperature should be maintained at 1°C to 4°C. Freezer temperatures should be maintained
at -15°C to -20°C. Special freezers capable of long-term storage of cultures or virus should be maintained
at -70°C to -80°C. Thermometers used to measure "in-use" temperatures should be calibrated yearly
against an NIST traceable thermometer.
5.2.15 Sample Processing Equipment Sterility Checks
A representative portion of non-disposable items such as sample containers, blender jars, etc., used to
collect or process samples should be checked for sterility prior to use in analyses. To test for sterility add
approximately 500 mL (or appropriate volume based on the size of the equipment being used) of a sterile
non-selective broth (e.g., tryptic soy, trypticase soy, or tryptone broth) to the non-disposable item and
incubate at 35°C ± 0.5°C for 24 hours and check for growth. Depending on the incubation times specified
in the method, the length of the incubation time for the sample processing equipment sterility check may
be increased.
SECTION 6.0 STUDY DESIGN
This section provides a description of the study design that is important for assessing comparability using
either side-by-side method comparison studies (Section 6.1) or QC acceptance criteria-based studies
(Section 6.2). The sections below address the number of laboratories, number of matrices, and the number
of samples that should be analyzed in the studies.
6.1 Side-by-Side Comparison Studies
The reference methods to be compared to the proposed method for drinking water presence/absence ATPs
are LTB/BGLB for total coliforms and LTB/EC-MUG for E. coli. For drinking water membrane filter
methods, the reference method for total coliforms is m-Endo and for E. coli, it is nutrient agar with MUG.
Methods are compared using the following parameters:
• Recovery. Does the new method have similar, better or worse recoveries of the target organism as the
reference method?
• Precision. Are the recoveries by the new method significantly less or more variable than the reference
method?
• False positive rate/specificity. Is the new method significantly more likely or less likely to detect
non-target organisms or other sample constituents that would be reported as the target organism by the
analyst when compared to the reference method?
• False negative rate/sensitivity. Is the new method significantly more likely or less likely to exhibit
non-detects for samples with the target organism or to exhibit results that are biased low when
compared to the reference method?
To generate these parameters, samples are analyzed by a single laboratory (Section 6.1.1). The number of
samples (and matrix types) used in the study is determined using a historical EPA standard (Section 6.1.2.1).
6.1.1 Number of Laboratories
A single laboratory should be used for a side-by-side comparability study. The study should be conducted
in a single, independent laboratory with no conflict of interest; the laboratory selected cannot be
the method developer's laboratory and cannot be affiliated with the method developer.
6.1.2 Number of Samples
The following standards generally provide the minimum number of samples that should be analyzed.
Additional data are generally acceptable and may be very helpful when reviewing an ATP or new method.
6.1.2.1 Method Comparison Study Design Summary
Table 6-1 provides a summary of the method comparability requirements for nationwide microbiological
ATPs and new methods.
Table 6-1. Summary of Side-by-Side Method Comparison Study Design for Nationwide Microbiological Alternate Test Procedures

Table 6-1 summarizes, for each regulatory program (the guidelines establishing test procedures for the analysis of pollutants in wastewater and sewage sludge under the National Pollutant Discharge Elimination System (40 CFR 136.3), the guidelines establishing test procedures for the analysis of pollutants in ambient water (40 CFR 136.3), the Total Coliform Rule (40 CFR 141.21), the Surface Water Treatment Rule (40 CFR 141.74), the Long Term Surface Water Treatment Rule (40 CFR 141.704), and the Ground Water Rule), the following study design elements: the matrix; the type of test (quantitative or presence/absence); the regulated analytes (e.g., total coliforms, fecal coliforms, E. coli, enterococci, Salmonella, Cryptosporidium, Giardia, heterotrophic bacteria, and coliphage); the approved techniques (e.g., MPN, MF, presence/absence, enzyme substrate, and pour plate); the comparability study sample source type and number; the natural source of organisms used for spiking and its number; the number of replicates; the verification requirements for the reference method; the minimum comparability results for the reference and proposed methods; and the minimum proposed method results for the specificity study (false positive and false negative). The principal study design values from the table are restated in the sections that follow and in the table footnotes below.
1) For studies conducted in drinking and surface water matrices for drinking water purposes, the comparability study should be conducted in a laboratory certified under the drinking water laboratory certification program. A method developer having a vested interest in the method, instrumentation, apparatus, reagents, media, or associated kits should not perform the comparability study analyses in the applicant's laboratory.
2) The specificity study may be conducted in the method developer's laboratory. Specificity data is not required for the reference method.
3) For wastewater validation studies, unspiked wastewater samples should be tested. No additional natural sources of organisms are required.
4) For coliform methods, a single drinking water sample source is spiked with 10 different wastewaters to create 10 different test samples. For HPC methods, a single drinking water source sample is spiked with 10 different surface waters (e.g., ambient water, receiving water, source water) to create 10 different test samples.
5) For quantitative reference methods, 10 typical colonies (sheen colonies) must be verified per replicate for a total of 400 colony verifications (10 samples x 4 randomly chosen out of 20 replicates x 10 sheen colonies). For presence/absence tests, all positive samples must be verified (up to 200 samples).
6) For presence/absence reference methods, verification of atypical colonies is not required. Method developers may verify atypical colonies to provide additional data.
7) Verification of the proposed method is not required.
8) Readers should refer to Section 1.4.1 of the document for a list of approved reference methods.
9) Where applicable (e.g., Class A matrices).
10) The comparison study may also apply to analytes not currently regulated by the Agency in standards if there are approved methods for that analyte in the specific matrix, for example, E. coli and enterococci in wastewater and fecal coliforms and Salmonella in sewage sludge (biosolids). Interested entities should inquire about other analytes if there are approved methods published for them in Federal Register final rulings (40 CFR parts 136 or 141).
11) Enumerated spiking suspensions of Cryptosporidium are commercially available flow cytometer-sorted spiking suspensions, which have lower variability than manually prepared spikes.
Number of Matrices
Ten different water/wastewater matrices from ten (10) different geographically diverse locations should be
included to obtain, as much as is practical, a good representation of the wide range of water types with an
even wider range of target organisms to which the method should appropriately respond. Each water or
wastewater sample should be collected in sufficient volume to complete all replicate analyses of sample or
dilution volumes by both the ATP or new method and the EPA-approved reference method. For ambient
water studies, the turbidity of at least one matrix should be greater than 10 NTU. Generally, for matrices
other than finished drinking water or ambient water, matrix composition will be addressed on a study-
specific basis.
Number of Samples and Replicates
Twenty replicate analyses should be performed by each method for each of the 10 matrices for a total of
200 replicate analyses per method. The replicate analyses should be performed on the same day for both
the proposed and reference methods.
Verification of Results
For quantitative reference methods, verify 10 typical colonies from 4 randomly chosen replicates (of the 20
replicates) of each of the 10 samples for a total of 400 colony verifications. For presence/absence tests, all
positive samples should be verified for up to 200 samples. See Section 6.1.3 for additional information
pertaining to the verification of results. Calculations for false positive rates and false negative rates are
described in Section 8.4.3.
6.1.3 Verification of Results
6.1.3.1 False Positives
To assess whether the false positive rates are significantly different between methods, replicates known to
contain non-target organisms that could be falsely identified as the target organism should be analyzed by
both the ATP or new method and the EPA-approved reference method. The determination that the samples
do not contain the target organism
should be based on a third independent standard method (see Section 7.6) rather than by the EPA-
approved reference method being used in the comparison. This is because it should not be assumed that
the accepted method has a false positive or negative rate of zero.
For drinking water, total coliform specificity is determined by reaction with LTB/BGLB media in
accordance with the total coliform definition described in Standard Methods. For side-by-side
comparison studies, 100 presence/absence broth cultures which are positive for total coliforms by the
proposed method must be tested for specificity with LTB/BGLB media. Side-by-side comparisons of
membrane filter methods must test 200 colonies determined to be total coliforms by the proposed method
for their reaction with LTB/BGLB media. Presence/absence drinking water methods must be tested for
positive E. coli reactions by testing 100 positive proposed ATP method cultures for their reaction in
LTB/EC-MUG media. Membrane filter methods must determine positive specificity by testing 200
isolated colonies which the proposed method identifies as E. coli positive for reaction in LTB/EC-MUG
media.
6.1.3.2 False Negatives
To assess whether the false negative rates are significantly different between methods, replicates known to
contain target organisms should be analyzed by both the ATP or new method and the EPA-approved
reference method. The determination that the samples contain the target organism should be based on a
third independent standard method (see
Section 7.6) rather than by the EPA-approved reference method being used in the comparison. This is
because it should not be assumed that the accepted method has a false positive or negative rate of zero.
For total coliforms in drinking water, LTB/BGLB media are used to test cultures which are negative by the
proposed method. For side-by-side comparison studies, 100 presence/absence broth cultures which are
negative for total coliforms by the proposed method must be tested for specificity with LTB/BGLB media.
Side-by-side comparisons of membrane filter methods must test 200 atypical colonies determined to not be
total coliforms by the proposed method for their reaction with LTB/BGLB media. Presence/absence
drinking water methods must be tested for negative E. coli reactions by testing 100 negative proposed ATP
method cultures for their reaction in LTB/EC-MUG media. Membrane filter methods must determine false
negative specificity by testing 200 isolated atypical colonies which the proposed method identifies as E.
coli negative for reaction in LTB/EC-MUG media.
6.2 QC Acceptance Criteria-Based Comparison Studies
QC acceptance criteria-based comparison studies are generally conducted to demonstrate that the ATP or
new method is able to meet the QC acceptance criteria of the EPA-approved reference method. In some
instances, the quality control (QC) acceptance criteria specified in a method may not be sufficient to
demonstrate comparability between the ATP and the EPA-approved reference method and a side-by-side
comparison study should be conducted. Generally, EPA will make this decision based on review of the
application materials.
6.2.1 Number of Laboratories
A minimum of three laboratories should be used for a QC acceptance criteria-based comparison study. All
three laboratories should meet all QC acceptance criteria in the EPA-approved reference method. If more
than three laboratories participate, at least 75% of the participating laboratories should meet all QC
acceptance criteria in the EPA-approved reference method.
6.2.2 Number of Matrices
For all QC acceptance criteria-based comparison studies, there should be one matrix per laboratory. For
ambient water studies, the turbidity of at least one matrix should be greater than 10 NTU. Generally, for
matrices other than finished drinking water or ambient water, matrix composition should be addressed on a
study-specific basis.
6.2.3 Number of Replicates per Matrix
The number of replicates per matrix and laboratory should be specific to the QC criteria to which the
results will be compared. Generally, the number of reagent results should be 4 and the number of source
water results per matrix should be 2.
SECTION 7.0 SAMPLE PREPARATION AND ANALYSIS
7.1 Collection of Samples for Analysis
Each sample should be collected in sufficient volume to complete all replicate analyses by both the ATP or
new method and the EPA-approved reference method. Samples should be spiked (if necessary) and
analyzed as soon as possible after collection.
7.1.1 Source Water Characterization
Source water characterization information should be collected at the time of sample collection. This
information will be useful in characterizing the matrix to identify potential interferences and generally
includes, but is not limited to, the following:
• Sample collection location
• Source of water (e.g., ground water, stream, river, lake, etc.)
• Plant treatment processes (if sample is collected from a water treatment facility)
• Temperature
• pH
• Turbidity
• Total organic carbon (TOC)
• Free and total disinfectant residual at time of sample collection
• Heterotrophic plate count (HPC) unless heterotrophic bacteria are the target analytes, in which case
total coliforms should be measured
In addition to this information, data on the concentration of potential interferences (e.g., competing
bacteria, interfering chemicals, etc.) in the water collected for analysis may be necessary. Known
interferences should be discussed in the method and addressed in the study plan. The final results, as well
as bench sheets, log sheets, instrument printouts, and any associated quality control analyses should be
submitted as part of Appendix C of the study report (Section 9.10).
7.2 Sample Spiking and "Stressing" Procedures for Bacteriological Methods
Depending on the matrix and the analyte of interest, it may not always be necessary to spike samples prior
to analysis. Range finding analyses should be performed to assess the ambient concentration of target
organism(s) in the matrix of interest to determine whether sample spiking will be useful. Samples are
collected and dilutions are prepared in order to enumerate the number of organisms in the sample without
qualifiers (i.e., less than, greater than, or too numerous to count). In addition, range finding can also be
used to obtain environmental isolates for use in sample spiking or to determine the concentration of target
organism(s) in a spiking suspension or spiked sample.
If samples are spiked, environmental isolates should be used, as pure strains may exhibit different recovery
and precision characteristics than natural flora. NELAC (Reference 10.9) and ATCC (Reference 10.6)
recommend that bacterial cultures be transferred monthly and passed no more than five times before
returning to the original culture.
Sections 7.2.1 through 7.2.4 below, detail several different procedures for sample spiking and stressing.
7.2.1 Drinking Water: Spiking and Chlorine-Stressing
Drinking water samples should be spiked and stressed with chlorine as described in Sections 7.2.1.1 and
7.2.1.2 below.
7.2.1.1 Drinking Water Spiking
Because finished drinking water does not typically contain the target analyte(s) of interest, finished
drinking water should be spiked. The drinking water should be oxidant and reductant free. It can be
dechlorinated by exposure to UV light, by holding at room temperature for 24 to 48 hours, or by filtration
through a granular activated carbon (GAC) filter. Additionally, the drinking water should not contain
coliform bacteria or
more than 500 CFU/mL of heterotrophic bacteria so that it will not interfere with the determination of the
reduction of coliform bacteria during chlorine stressing, or coliform medium reactions during experiments.
For the evaluation of drinking water samples, a single finished drinking water matrix is spiked with target
organism(s) from other matrices, such as non-chlorinated secondary sewage effluents or polluted surface
waters. Sewage effluent generally has the advantage of providing a wide range of strains, whereas surface
waters typically have the advantage of providing organisms more variable in quality. Depending on the
study design (Section 6.0), the source of spiking suspensions (i.e., wastewaters) should come from
geographically dispersed sites.
To prepare spiked drinking water samples:
(1) Collect at least five liters of each non-chlorinated secondary sewage effluents or polluted surface
water to be used as spiking suspensions.
(2) Perform range finding analyses as described above on each spiking suspension to determine
target organism density as soon as possible after receipt of the sample to ensure that target
organism density and diversity are not reduced. Use sterile oxidant and reductant-free drinking
water described above as the diluent in range finding studies.
(3) For each spiking suspension, spike a sufficient volume of drinking water with a sufficient
volume of spiking suspension (based on range finding) to obtain 10³-10⁵ target organisms/100
mL. Please note: A single drinking water, with negligible concentrations of oxidants and
reductants (e.g., chlorine and sodium thiosulfate, respectively), should be used for the entire
study. Refrigerate all spiked drinking waters at 1°C - 4°C for use in preliminary chlorination
study and comparability study. High oxidant or reductant levels in the drinking water could
interfere with organism stressing.
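The spike volume in step (3) follows directly from the range-finding density. The Python sketch below illustrates the calculation with assumed values for the suspension density, sample volume, and target density; it ignores the small volume added by the spike itself.

```python
# Minimal sketch of the spiking arithmetic in step (3). Densities and volumes
# are illustrative; actual values come from the range-finding analyses.

suspension_density = 2.0e6   # target organisms/100 mL in the spiking suspension (assumed)
sample_volume_ml = 2000.0    # volume of drinking water to be spiked (assumed)
target_density = 1.0e4       # desired target organisms/100 mL in the spiked sample

organisms_needed = target_density * sample_volume_ml / 100.0
spike_volume_ml = organisms_needed / (suspension_density / 100.0)
print(f"Spike {spike_volume_ml:.1f} mL of suspension into {sample_volume_ml:.0f} mL of drinking water")
# -> Spike 10.0 mL of suspension into 2000 mL of drinking water
```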
7.2.1.2 Preliminary Chlorination Study to Determine Appropriate Exposure Time
Microorganisms in the spiked drinking water samples (from Section 7.2.1.1) should be stressed by
chlorination at ambient temperatures under conditions similar to those in drinking water treatment
facilities. The goal of chlorinating the spiked samples is to simulate drinking water treatment by reducing
the number of organisms in the spiked drinking water samples from 10³-10⁵ target organisms/100 mL to
1-10 chlorine-stressed target organisms/100 mL for most probable number methods or to 20-100 chlorine-
stressed target organisms/100 mL for membrane filtration methods (i.e., a 2-4 log removal).
After chlorination, no two samples are expected to produce the same levels of injured (stressed) target
organisms because the disinfection process is impacted by physical and biological factors. These include:
the type of spiked drinking water sample to be disinfected (e.g., spiked with sewage effluent or polluted
source water), the initial concentration of the target organism(s), the chlorine demand of the spiked sample,
the type and concentration of chlorinating agent, the exposure time, the sample mixing, pH, and
temperature. As a result, a preliminary chlorination study should be performed to establish the exposure
time necessary to reduce the number of organisms in the spiked drinking water samples from 10³-10⁵
target organisms/100 mL to 1-10 chlorine-stressed target organisms/100 mL (or a 2-4 log removal).
During this preliminary chlorination exposure time study, the physical and biological parameters in Section
7.1.1 should be carefully monitored and recorded for each sample.
The exposure time is directly dependent upon the initial concentration of the target organism present, the
matrices' chlorine demand, and the form of chlorine present. For testing, the spiked sample is generally
exposed to 2.0-2.5 mg total chlorine/L over a range of times such as 10, 20, and 30 minutes to reduce
the density of target organisms from 10³-10⁵ CFU/100 mL to 1-10 CFU/100 mL of sample. However, the
period of exposure of a sample with a low chlorine demand may be significantly shorter than 20-30
minutes.
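The log removal achieved by a given exposure time can be checked directly from the pre- and post-chlorination enumerations. The Python sketch below uses illustrative densities to show the calculation against the 2-4 log removal and 1-10 organisms/100 mL targets.

```python
# Minimal sketch: check whether a chlorination exposure achieved the 2-4 log
# removal target. Densities are illustrative (target organisms/100 mL).
import math

initial_density = 5.0e4   # density before chlorination (assumed)
stressed_density = 6.0    # density after exposure and dechlorination (assumed)

log_removal = math.log10(initial_density / stressed_density)
in_target_range = 2.0 <= log_removal <= 4.0 and 1 <= stressed_density <= 10
print(f"Log removal = {log_removal:.2f}; meets 2-4 log and 1-10/100 mL targets: {in_target_range}")
```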
Suggested Preliminary Chlorination Study
(1) Determine and record the total residual and free residual chlorine concentrations using an EPA-
approved N,N-diethyl-p-phenylenediamine (DPD) colorimetric method (e.g., Standard Method
4500-Cl G) initially, at midpoint, and at the end of the exposure time just prior to
dechlorination.
(2) For each exposure time, place 2 L of each spiked drinking water sample (from Section 7.2.1.1)
in a glass container.
(3) Add an appropriate volume of a diluted solution of reagent grade sodium hypochlorite (e.g., a
1:20 dilution of 5% (w/v) stock solution) to achieve the desired level of chlorinating agent and
stir the sample continuously during exposure to chlorination. If the spiked sample has an
appreciable chlorine demand (e.g., spiked with a primary effluent or a sewage sample), add
dilute sodium hypochlorite solution until a total residual chlorine level between 2.0 and 2.5
mg/L is maintained in the absence of free chlorine. If a sample has a low chlorine demand,
avoid over-stressing or killing the organisms by prolonged exposure to free residual chlorine.
The free residual chlorine concentration should not exceed 0.5-1.0 mg/L.
(4) Stop the chlorine oxidation (dechlorinate) at the end of the exposure period by adding 0.8 mL of
a 10% (w/v) sodium thiosulfate solution/L sample.
(5) Enumerate the target organism density in an aliquot of the spiked, chlorine-stressed,
dechlorinated drinking water sample using the appropriate EPA-approved reference method
from Table 1-1. Different dilutions may be used for total coliform and E. coli enumerations.
(6) For each exposure time, repeat Steps 3-5, above.
7.2.1.3 Suggested Procedure for Chlorination and Dilution of Samples for Comparability Study
(1) Determine and record the total residual and free residual chlorine concentrations initially, at
midpoint, and at the end of the exposure time just prior to dechlorination using an EPA-
approved N,N-diethyl-p-phenylenediamine (DPD) colorimetric method (e.g., Standard Method
4500-Cl G).
(2) Place a sufficient volume of each spiked drinking water sample in a glass container to perform
sufficient repeat analyses at multiple dilutions. Use oxidant and reductant-free drinking water
described in Section 7.2.1.1 as the diluent.
(3) Immediately prior to chlorination, perform enumeration to determine target organism density as
described in 7.2.1.1 (2). This value should be used to determine the log reduction due to
chlorination.
(4) Add reagent-grade sodium hypochlorite to achieve the same concentration, as in the preliminary
chlorination exposure study.
(5) To reduce the density of target organisms from 10³-10⁵ CFU/100 mL to 1-10 CFU/100 mL,
chlorinate each spiked drinking water sample for the appropriate time, based upon the
preliminary chlorination exposure study. Stir the sample continuously during the chlorination.
(6) Stop the chlorine oxidation (dechlorinate) at the end of the exposure period by adding 0.8 mL of
a 10% (w/v) sodium thiosulfate solution/L sample.
(7) Enumerate target organism density in an aliquot of the spiked, chlorine-stressed, dechlorinated
drinking water sample using the appropriate EPA-approved reference method from Table 1-1.
(8) Refrigerate the spiked, chlorine-stressed, dechlorinated drinking water samples at 1-4°C for use
in comparability testing.
(9) Read the plates to determine the approximate density of the target organisms. Use these results
to estimate the appropriate dilution necessary to reach the target organism density of 1-10
CFU/100 mL.
(10) Evaluate three dilutions of each spiked drinking water sample. Different dilutions may be used
for total coliform and E. coli enumerations since these likely occur in the spike source at
different concentrations. Samples should be diluted with the same, original, oxidant-free and
reductant-free drinking water, as necessary to reach a target organism density of 1-10 target
organisms/100 mL for most probable number methods or 20-100 target organisms/100 mL for
membrane filtration methods. Make the dilution and at least two others that bracket the target
density, for example, half and double that dilution. One of these dilutions should contain the
desired 1-10 target organisms per 100 mL. Immediately conduct the comparability analyses
with each sample using these three dilutions.
Please note: For the evaluation of presence/absence methods, the data used for comparison should be from
the dilution(s) which produces results closest to an equal number of positive and negative results for the
reference method. A 25% - 75% split in responses (in either direction) should be sought. A 100%
positive response will be considered unacceptable. For comparability, the evaluated results for the ATP or
new method should be from the same dilution as the reference method. If one of the dilutions does not
produce an acceptable split in positive and negative results for the reference method, the applicant should
return to the original, spiked sample.
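The dilution selected in step (10), and the bracketing dilutions around it, can be estimated from the enumerated density of the chlorine-stressed sample. The Python sketch below uses an assumed measured density and targets the middle of the 1-10 organisms/100 mL range for a most probable number method.

```python
# Minimal sketch of the dilution selection in step (10). The measured density
# is illustrative; the goal is 1-10 target organisms/100 mL for MPN methods.

measured_density = 450.0   # target organisms/100 mL in the chlorine-stressed sample (assumed)
desired_density = 5.0      # midpoint of the 1-10 organisms/100 mL target range

dilution_factor = measured_density / desired_density              # e.g., a 90-fold dilution
bracketing_factors = [dilution_factor / 2, dilution_factor, dilution_factor * 2]
for f in bracketing_factors:
    print(f"1:{f:.0f} dilution -> about {measured_density / f:.1f} organisms/100 mL")
```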
7.2.2 Preparation of Enumerated Spiking Suspension
This dilution scheme is adapted from Standard Methods for the Examination of Water and Wastewater,
19th Edition, Section 9020 B (Reference 10.3). This entire process should be performed quickly to avoid
loss of viable organisms. There should be approximately 10¹⁰ organisms per slant. Therefore, dilution
bottles "A" through "E," below, contain approximately 10¹⁰, 10⁸, 10⁶, 10⁴, and 10³ organisms per dilution
bottle, respectively. Depending on the growing conditions, these numbers may vary. As a result, until
experience has been gained, more dilutions should be filtered to determine the appropriate dilution.
Inoculate bacterial culture onto the entire surface of several nutrient agar slants with a slope approximately
6.3 cm long in a 125 x 16 mm screw-cap tube. Incubate for 24 ± 2 hours at 35°C ± 0.5°C.
From the slant that has the best growth, prepare serial dilutions using four dilution bottles with 99 mL of
sterile buffered dilution water (bottles A, B, C, and D) and one dilution bottle containing 90 mL of sterile
buffered dilution water (bottle E).
Pipette 1 mL of buffered dilution water from bottle "A" to one of the slants. Emulsify the growth on the
slant by gently rubbing the bacterial film with the pipette, being careful not to tear the agar. Pipette the
suspension back into dilution bottle "A." Repeat this procedure a second time to remove any remaining
growth on the agar slant, without disturbing the agar.
Make serial dilutions as follows:
(1) Shake bottle "A" vigorously and pipette 1 mL to bottle "B" containing 99 mL buffer
(2) Shake bottle "B" vigorously and pipette 1 mL to bottle "C" containing 99 mL buffer
(3) Shake bottle "C" vigorously and pipette 1 mL to bottle "D" containing 99 mL buffer
(4) Shake bottle "D" vigorously and pipette 10 mL to bottle "E" containing 90 mL buffer; this
should result in a final dilution of approximately 10 organisms/mL. If it is more convenient for
your laboratory, an acceptable alternative to the dilution scheme presented for this step is to
pipette 11 mL of dilution D into dilution bottle E, which contains 99 mL of dilution water.
Filter 1-5 mL portions in triplicate from bottles "D" and "E" according to standard membrane filtration
methods to determine the number of CFU in the dilutions. The recommended target dilution and spike
volume depends on the method and the target analyte(s). Typically, dilutions should be stored at 1-5°C
and may be used throughout the day they are prepared; however, storage conditions should be adjusted as
necessary since both storage conditions and viability may vary from organism to organism.
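The expected density in each bottle follows directly from the transfer volumes above. The Python sketch below tracks an assumed starting population of approximately 10¹⁰ organisms through the series; actual counts will vary with growth conditions, which is why bottles "D" and "E" are filtered.

```python
# Minimal sketch of the expected densities through the Section 7.2.2 dilution
# series, assuming roughly 1e10 organisms recovered from the slant into bottle A.

organisms = {"A": 1e10}                       # total organisms in bottle A (assumed)
organisms["B"] = organisms["A"] * (1 / 100)   # 1 mL of A into 99 mL of buffer
organisms["C"] = organisms["B"] * (1 / 100)   # 1 mL of B into 99 mL of buffer
organisms["D"] = organisms["C"] * (1 / 100)   # 1 mL of C into 99 mL of buffer
organisms["E"] = organisms["D"] * (10 / 100)  # 10 mL of D into 90 mL of buffer

for bottle, n in organisms.items():
    volume_ml = 100.0                         # each bottle holds approximately 100 mL
    print(f"Bottle {bottle}: ~{n:.0e} organisms total, ~{n / volume_ml:.0e} per mL")
```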
7.2.3 Log Phase Growth Curve
Inoculate 100 mL of broth media with a single isolated colony and incubate organisms at optimum
temperature. If possible, shake at 200 RPM during incubation.
Take optical density (OD) measurements at 550-600 nm at 30 minute intervals for the first 8 hours and
record readings. In addition to OD readings at 30 minute intervals, using aseptic technique, remove a
0.1 mL portion of the culture and make a series of 1:10 dilutions in sterile buffer. Initially plate 0.1 mL of
the 1.0, 10⁻¹, and 10⁻² dilutions in duplicate and incubate overnight at optimum temperature. Count
colonies and record data. As the optical density increases, evaluate serial dilutions to accommodate the
increased numbers of bacteria (i.e., when the optical density exceeds 1.0 then plate 0.1 mL from dilutions
10⁻¹, 10⁻², and 10⁻³).
Based on OD and the CFU/mL results, the OD of fresh cultures (in log-phase growth) can be used to
determine the concentration of bacteria in the tubes by a simple graphic representation of the combined OD
and CFU/mL results.
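One way to construct this graphic relationship is to fit the logarithm of the plate-count results against the OD readings. The Python sketch below does so with illustrative paired readings; the fit is only meaningful over the range of OD values actually measured for the organism and medium in use.

```python
# Minimal sketch: relate optical density to culture density from growth-curve
# data. The paired readings below are illustrative.
import numpy as np

od_readings = np.array([0.12, 0.25, 0.48, 0.80, 1.10])   # OD at 550-600 nm (assumed)
cfu_per_ml = np.array([2e7, 6e7, 2e8, 7e8, 1.5e9])       # plate-count results (assumed)

# Fit log10(CFU/mL) as a linear function of OD.
slope, intercept = np.polyfit(od_readings, np.log10(cfu_per_ml), 1)

def estimate_density(od):
    """Estimate CFU/mL for a fresh log-phase culture from its OD."""
    return 10 ** (slope * od + intercept)

print(f"OD 0.60 -> about {estimate_density(0.60):.1e} CFU/mL")
```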
Commercially available McFarland standards may be used to determine the bacterial density instead of
actually doing a growth curve within the lab. In order to determine bacterial densities using McFarland
standards, the OD of the standards is compared to the OD of the log phase culture.
7.2.4 Commercially Available Enumerated Spikes
One time use, commercially available spiking suspensions may be obtained from a variety of vendors.
Prepare spiking suspensions according to manufacturer's instructions. It should be noted that a different
spiking volume than that recommended by the manufacturer may be necessary to achieve the target
density.
7.3 Spiking Procedures for Virus Methods
7.3.1 Cell Monolayer Propagation
In order to propagate virus stock suspensions, cell monolayers should be propagated. The type of assay
being performed, cytopathic effect (CPE) or plaque assay (PA), determines the size of the flask and the
number of days the monolayer incubates prior to use for the assay. Inoculate T75 or T25 flasks with 1 x 10
cells containing cell specific growth medium. Flasks for CPE should incubate for 5-7 days at 37°C ±
0.5°C with 95% relative humidity (and 5% CO2 concentration, if necessary). One CPE flask should be
stained with crystal violet and microscopically checked to ensure a 95% confluent cell monolayer prior to
use in the assay. Flasks used for PA incubate for 7-10 days under the same conditions as the monolayers
used for CPE. PA flasks should also be checked to ensure 95% confluency before use for assay.
7.3.2 Propagation of Virus Stock Suspension
For cytopathic effect (CPE) analysis, utilizing the appropriate cell line (e.g., Buffalo green monkey kidney cells; BGMK),
inoculate the cell monolayer with virus and incubate culture flasks at 37°C ± 0.5°C until the entire
monolayer has been destroyed by virus replication (approximately 72-96 hours). Freeze (to approximately
-80°C) and thaw the flasks three times, then pool contents of the flasks and spin at 10,000 x g for 30
minutes. Filter supernatant using a 0.2 µm pore nylon filter to remove any remaining cell debris. The
filtrate is the virus stock suspension.
7.3.3 Titering of the Virus Stock Suspension
Titer the virus stock suspension (from Section 7.3.2) by performing a plaque assay. Inoculate monolayers
with 0.2 mL of serially diluted viral stock suspension. After rocking and rinsing the monolayers, add the
agar overlay to the monolayer. Incubate the culture flasks for seven days while reading the number of
plaques each day for the entire seven days. After the viral stock suspension has been titered, appropriate
volumes of the suspension can be used to spike test matrices to obtain plaque forming units (PFU)/mL.
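The titer calculation divides the plaque count by the product of the inoculum volume and the dilution plated. The Python sketch below uses an assumed plaque count and dilution.

```python
# Minimal sketch of the plaque-assay titer calculation. The plaque count and
# dilution are illustrative; 0.2 mL of each serial dilution is inoculated per flask.

plaques_counted = 42      # plaques on the countable flask (assumed)
dilution = 1e-5           # dilution of the stock inoculated into that flask (assumed)
inoculum_volume_ml = 0.2

titer_pfu_per_ml = plaques_counted / (inoculum_volume_ml * dilution)
print(f"Virus stock titer: {titer_pfu_per_ml:.1e} PFU/mL")   # 2.1e+07 PFU/mL
```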
7.4 Spiking Procedures for Cryptosporidium and Giardia
Enumerated spiking suspensions are needed for initial and ongoing precision and recovery (IPR and OPR)
samples (often referred to as positive controls) and matrix spike (MS) samples. Flow cytometer-sorted
organisms are necessary for these spiking suspensions, rather than manual techniques. Flow cytometer-
sorted spikes generally are characterized by lower variability than manually enumerated spikes.
Spiking suspensions should be prepared using unstained organisms that have not been irradiated, heat-
fixed or formalin-fixed. Immediately before sorting spiking suspensions, initial calibration of the flow
cytometer should be performed by conducting a series of sequential sorts directly onto membranes or well
slides. These initial sorts should be stained and counted microscopically to verify the accuracy of the
system. When sorting the spiking suspensions, ongoing calibration samples should also be prepared and
counted at regular intervals. The mean of the ongoing calibration counts should be used as the estimated
spike dose. Flow-cytometer-sorted spiking suspensions should be used by the expiration date noted on the
suspension. Flow-cytometer-sorted spiking suspensions containing live organisms should be used within
two weeks of the preparation date. General procedures for preparing flow-cytometer-counted spikes for
Cryptosporidium and Giardia can be found in EPA Methods 1622 and 1623 (April 2001).
A potential commercial source of flow-sorted Cryptosporidium and Giardia spiking suspensions is:
Wisconsin State Laboratory of Hygiene
Flow Cytometry Unit
2601 Agriculture Drive
Madison, WI 53718
Phone: (608)224-6260
Fax: (608)224-6213
The Wisconsin State Laboratory of Hygiene prepares and distributes live Cryptosporidium parvum oocysts
and Giardia intestinalis cysts that have not been treated to reduce viability.
7.5 Analysis of Samples
Samples should be analyzed in accordance with the EPA-approved study plan. Any deviations from the
study plan may suggest the need for additional analyses and potentially, rejection of data generated from
the study. If any deviations from the approved study plan are necessary prior to or during the study, the
applicant should consult EPA and receive approval for the modification to the study plan. Deviations from
the approved study plan should be documented in the study report (Section 9.0).
Unless otherwise approved in the study plan, the minimum assay size is as specified in the EPA method
(100 mL).
Example analysis schemes are provided in Sections 7.5.1 and 7.5.2 below. Generally, the analysis
schemes will differ depending on the type of organism and the type of method. The study plan (Section
4.0) should detail the order in which samples will be analyzed.
7.5.1 Side-by-Side Comparison Studies for both the reference method and the proposed
method
Method Blank 1
Sample 1 up to Sample 20
Method Blank 2
Positive Control
Negative Control
Method Blank 3
Media Sterility Checks
7.5.2 QC Acceptance Criteria-Based Comparison Studies
Replicate 1 for IPR
Replicate 2 for IPR
Replicate 3 for IPR
Replicate 4 for IPR
Method Blank
Matrix Spike
Matrix Spike Duplicate
Unspiked Matrix Sample
7.6 Verification of Results
The number of positive and negative results to be verified in a side-by-side comparison study is discussed
in Section 6.1.3. Sections 7.6.1-7.6.3 below discuss the types of independent standards that may be used
for the verification of results from bacteriological methods in Clean Water Act applications, total coliform
methods in drinking water applications, virus methods, and Cryptosporidium and Giardia methods,
respectively.
7.6.1 Verification of Results from Bacteriological Methods
7.6.1.1 Biochemical Tests that May be Used for Verifications in Clean Water Act Applications
Catalase Lysine decarboxylase
ONPG Methyl Red and Voges Proskauer (MRVP)
Indole Triple sugar iron (TSI)
Coagulase Lysine iron agar (LIA)
Esculin Urease
Citrate Sugars (e.g., Trehalose, lactose, mannitol, and sorbitol)
Oxidase
It is recommended that multiple biochemical tests be utilized to verify colony identification. The above list
of biochemical tests is not exhaustive. Applicants may refer to EPA methods for appropriate verification
tests. The choice of which biochemical tests to use is based on type of organism (i.e., gram stain results)
being identified/verified. Biochemical tests used for verification should be discussed in the study plan
(Section 4.0). It may be appropriate to perform a gram stain prior to biochemical identification. A
description of the biochemical tests listed above, as well as additional tests is provided in Standard
Methods for the Examination of Water and Wastewater, Method 9225.
7.6.1.2 Commercially Available Biochemical Testing Products
Commercial biochemical test systems incorporate multiple biochemical tests to allow for identification to
the genus and/or species level, which may be difficult when using individual biochemical tests prepared in
house.
Commercial biochemical test systems are available in two formats: systems that depend on the analyst to
manually interpret the results (e.g., API® strips, BBL crystal, and Enterotube) and systems that automate
the interpretation of results (e.g., Vitek® and Biolog).
7.6.1.3 Biochemical Tests for Verification of Total Coliform and E. coli Bacteria in Drinking Water
(Total Coliform Rule, Surface Water Treatment Rule, Ground Water Rule)
Verify total coliform bacteria by growth/gas production in LTB medium followed by inoculation of
positive LTB cultures into BGLB (as described in Standard Methods for the Examination of Water and
Wastewater, Method 9221B). One hundred (100) presence/absence cultures of the proposed method that
are positive for total coliform growth must be tested for their reaction in LTB/BGLB. One hundred (100)
proposed method cultures that are negative for total coliform growth must be tested in LTB/BGLB for their
reaction. One hundred (100) presence/absence cultures that are positive for E. coli must be tested for their
reaction in LTB/EC-MUG. One hundred (100) proposed method cultures that are negative for E. coli must
be tested for their reaction in LTB/EC-MUG. Two hundred (200) total coliform colonies from the
proposed membrane filter method must be tested for their reaction in LTB/BGLB. Two hundred (200)
non-total coliform colonies must be picked from proposed membrane filter methods and tested in
LTB/BGLB for their reaction. Two hundred (200) E. coli colonies must be picked from the proposed
membrane filter methods and be tested for their reaction in LTB/EC-MUG. Two hundred (200) non-E. coli
colonies must be picked from the proposed membrane filter method and be tested for their reaction in
LTB/EC-MUG.
7.6.2 Verification of Results from Virus Methods
Viral protocol testing confirmations tend to be method-specific; therefore a detailed description of the
confirmation/verification procedure should be included in the study plan and reviewed on a study-specific
basis.
7.6.3 Verification of Results from Cryptosporidium and Giardia Methods
Verification of results may not be necessary for Cryptosporidium and Giardia IFA methods beyond the
analyst's microscopic examination of the organism. If verification is necessary, a detailed description of
the confirmation/verification procedure should be included in the study plan and reviewed on a study-
specific basis.
SECTION 8.0 REVIEW OF STUDY RESULTS
Generally, upon receipt of the applicant's data, EPA will perform the following reviews discussed in
Sections 8.1 to 8.5:
(1) Assessment of compliance with the approved study plan
(2) Data review
(3) Data validation
(4) Development of descriptive statistics
(5) Statistical assessment of method comparability
Methods that are deemed acceptable will generally be recommended for approval (Section 8.6).
8.1 Assessment of Compliance with Approved Study Plan
Generally, EPA will review the study report and associated data to ensure the study was conducted
according to the approved study plan. The applicant should explain and justify (possibly with additional
studies) any deviations from the study plan. Deviations from the approved study plan that occur without
prior approval from EPA may result in the rejection of some or all study data.
8.2 Data Review
Upon receipt of the applicant's data, EPA will generally verify that all raw data described in Section 9.10
are present and complete. Generally, all calculations used in the method will be verified. This may
include calculations used for spiking enumeration, preliminary or presumptive stages of the method, and
the determination of the final result.
8.3 Data Validation
After verifying data completeness and reviewing all calculations, EPA will generally verify that all
measurements were performed in accordance with the method. This may include, but is not limited to, the
following:
• Temperature logs for incubator/water bath/refrigerator
• Media preparation records
• Sample incubation times
• Associated QC samples (as described in Section 5.0)
o Method blanks
o Preparation blanks
o Sterility checks
o Positive and negative controls
8.4 Development of Descriptive Statistics
8.4.1 Mean Recovery
To determine if matrix characteristics affect method performance, mean recoveries should be calculated
separately for each matrix and method. Mean recoveries should also be calculated for each method over all
matrices. Methods having an extended read time (e.g., 24 ± 2 hours and 48 ± 3 hours for LTB/BGLB)
should base mean recoveries on the sum of the confirmed true positives of the replicate cultures which
gave positive reactions at any time during the incubation. Individual positive replicate cultures of the
proposed or the reference method may appear positive at earlier read times and negative later due to
overgrowth by non-target bacteria, or may appear negative at earlier read times due to a growth lag of the
target bacteria.
8.4.2 Precision
Precision can be expressed both on an absolute scale (i.e., standard deviation) and on a relative scale (i.e.,
relative standard deviation). The RSD (sometimes referred to as coefficient of variation) is calculated as
the standard deviation divided by the mean, expressed as a percent. For the purpose of summarizing the
data, both standard deviations and RSDs should be calculated and the one which is most appropriate for
assessing comparability will be used in analyses. Generally, RSDs are most appropriate for summarizing
precision when variability increases as concentration increases.
To give an indication of the effect of multiple matrices on precision, standard deviations should be
calculated separately for each matrix and method. Standard deviations also should be calculated for each
method over all matrices.
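These summary statistics can be generated with a short script. The Python sketch below computes the mean recovery, standard deviation, and RSD separately for each matrix and over all matrices, using illustrative percent-recovery values.

```python
# Minimal sketch: per-matrix and overall mean recovery, standard deviation,
# and RSD for one method. Recovery values (percent) are illustrative.
import statistics

recoveries_by_matrix = {               # percent recovery per replicate (assumed)
    "matrix_1": [88, 95, 102, 91],
    "matrix_2": [110, 97, 105, 99],
}

all_recoveries = []
for matrix, values in recoveries_by_matrix.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    print(f"{matrix}: mean = {mean:.1f}%, SD = {sd:.1f}, RSD = {100 * sd / mean:.1f}%")
    all_recoveries.extend(values)

overall_mean = statistics.mean(all_recoveries)
overall_sd = statistics.stdev(all_recoveries)
print(f"all matrices: mean = {overall_mean:.1f}%, RSD = {100 * overall_sd / overall_mean:.1f}%")
```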
8.4.3 False Positive Rates, False Negative Rates, Sensitivity, and Specificity
False positive (FP) and false negative (FN) rates of approved and reference methods should be evaluated
when assessing comparability. An independent standard (described in Section 7.6) may sometimes be
necessary to confirm positives and negatives of both the ATP or new method and the EPA reference
method. For drinking water method evaluations, confirmed positives and negatives for the proposed
method should be determined as described in section 7.6.1.3. The confirmation methods used should be
discussed in the validation study plan.
False positive rates for methods with extended read times should be based on the sum of the number of
confirmed true positives appearing at any time during the incubation period. False negatives should be
based on cultures which appeared positive at any time during the incubation period but were never
confirmed as positive at any time during the extended incubation read time.
Generally, performance of the ATP or new method and EPA-approved reference methods will be defined
in terms of false positive rates and false negative rates. For the purposes of the ATP protocol, false
positive rates and false negative rates are equivalent to (1- Specificity) and (1-Sensitivity), respectively.
Specificity is defined as the percent of negative samples correctly identified as negative, and sensitivity is
defined as the percent of positive samples correctly identified as positive (see equations on the next page).
In order to calculate estimates of false positives, false negatives, sensitivity, and specificity for each
method, 2-by-2 tables for each matrix, and over all matrices, should be set up as follows in Table 8-1.
Table 8-1. Standard Format for 2-by-2 Tables

ATP or New Method vs. Independent Standard

                            Independent Standard (+)    Independent Standard (-)    Total
ATP or New Method (+)       TP1                         FP1                         TP1 + FP1
ATP or New Method (-)       FN1                         TN1                         FN1 + TN1
Total                       TP1 + FN1                   FP1 + TN1                   TP1 + FP1 + TN1 + FN1

EPA-Approved Reference Method vs. Independent Standard

                                      Independent Standard (+)    Independent Standard (-)    Total
EPA-Approved Reference Method (+)     TP2                         FP2                         TP2 + FP2
EPA-Approved Reference Method (-)     FN2                         TN2                         FN2 + TN2
Total                                 TP2 + FN2                   FP2 + TN2                   TP2 + FP2 + TN2 + FN2
Estimates of sensitivity, specificity, false positive rates, and false negative rates as percentages for the two
methods should be calculated as follows:

Sensitivity_i = [TP_i / (TP_i + FN_i)] x 100%

Specificity_i = [TN_i / (TN_i + FP_i)] x 100%

False positive rate_i = [FP_i / (TN_i + FP_i)] x 100% = [1 - TN_i / (TN_i + FP_i)] x 100% = 100% - Specificity_i

False negative rate_i = [FN_i / (TP_i + FN_i)] x 100% = [1 - TP_i / (TP_i + FN_i)] x 100% = 100% - Sensitivity_i

Where:
F = False
N = Negative
P = Positive
T = True
i = the specified method (i = 1 for new method, i = 2 for reference method)
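Applied to a single method's confirmed counts, the calculations above reduce to a few lines. The Python sketch below uses illustrative counts for TP, FP, FN, and TN.

```python
# Minimal sketch of the rate calculations above for one method's 2-by-2 table.
# The counts are illustrative.

TP, FP, FN, TN = 180, 6, 14, 194   # confirmed against the independent standard (assumed)

sensitivity = 100.0 * TP / (TP + FN)
specificity = 100.0 * TN / (TN + FP)
false_negative_rate = 100.0 - sensitivity
false_positive_rate = 100.0 - specificity

print(f"sensitivity = {sensitivity:.1f}%  specificity = {specificity:.1f}%")
print(f"false negative rate = {false_negative_rate:.1f}%  false positive rate = {false_positive_rate:.1f}%")
```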
8.5 Statistical Assessment of Method Comparability
8.5.1 Presence/Absence Methods
For presence/absence methods, the chi-square test and Breslow-Day test will generally be used to compare
the percent of false positives and false negatives in the proposed method analyses with the percent of false
positives and false negatives in the reference method analyses.
Generally, tests for significant differences in false positive and false negative rates between methods will
be run using the results summarized in Table 8-1 in Section 8.4. To perform the chi-square and Breslow-
Day tests, it may help to first rearrange the data as follows in Tables 8-2 and 8-3:
Table 8-2. False Negative Rate Comparison

Result      New Method    Reference Method    Total
True +      TP1           TP2                 TP1 + TP2
False -     FN1           FN2                 FN1 + FN2
Total       TP1 + FN1     TP2 + FN2           TP1 + TP2 + FN1 + FN2

Table 8-3. False Positive Rate Comparison

Result      New Method    Reference Method    Total
False +     FP1           FP2                 FP1 + FP2
True -      TN1           TN2                 TN1 + TN2
Total       FP1 + TN1     FP2 + TN2           FP1 + FP2 + TN1 + TN2
Before running the chi-square tests, positives and negatives should be confirmed using the independent
standard method. For drinking water methods, positives and negatives should be confirmed as described
in section 7.6.1.3. The chi-square test will be used to determine whether the false positive rate and false
negative rate have a statistically significant difference between the ATP or new method and the EPA-
approved reference method. Methods with an extended read time should statistically compare the positives
and negatives for the beginning and end of read times to the number of true and false positives and true
and false negatives obtained for the whole incubation period to determine if the full range of read times
gives acceptable recovery.
8.5.1.1 Assessing Method Differences (Chi-Square Test)
In order to assess whether the false positive and false negative rates differ between methods, chi-square
tests should be run over all matrices; additional tests in each matrix also may be necessary depending on
the presence of matrix interactions (see Section 8.5.1.2). For false negative rates, the chi-square test(s)
indicates whether the proportions of positive samples correctly identified as positive by the two methods
are significantly different, and for false positive rates, the chi-square test(s) indicates whether the
proportions of negative samples correctly identified as negative by the two methods are significantly
different.
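The test itself can be run on the Table 8-2 or Table 8-3 counts with standard statistical software. The Python sketch below compares false positive rates between the two methods using scipy's chi-square test of independence on a 2-by-2 table with illustrative counts.

```python
# Minimal sketch: chi-square comparison of false positive rates between the
# proposed and reference methods, using the Table 8-3 layout. Counts are illustrative.
from scipy.stats import chi2_contingency

#             new method   reference method
false_pos = [5,            9]
true_neg  = [195,          191]

table = [false_pos, true_neg]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, p = {p_value:.3f}")
# p >= 0.05 would generally be read as no significant difference in false positive rates.
```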
8.5.1.2 Assessing Method/Matrix Interactions (Breslow-Day Test)
In order to assess whether there are false positive and false negative rate differences between methods, it is
also necessary to establish if there is a matrix effect on these parameters. The effect of matrices on false
positive and false negative rate differences between methods is assessed using the Breslow-Day test
(Reference 10.7). The Breslow-Day test is used to test whether there is an interaction between matrix and
method in terms of the likelihood of a false positive result (for specificity) and in terms of the likelihood of
a false negative result (for sensitivity). If a significant interaction is found between method and matrix,
then the chi-square tests for a difference between methods for that attribute should be done separately for
each matrix. Otherwise, separate chi-square tests for each matrix are not necessary, and a single chi-square
test can be run using the data from all matrices.
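As an illustrative sketch only, assuming the statsmodels package is available, the Breslow-Day test could be run on per-matrix 2 x 2 tables as shown below; all counts are hypothetical.

    # Illustrative only: Breslow-Day test for a method-by-matrix interaction.
    # Each 2 x 2 table is one matrix: rows are methods (new, reference);
    # columns are false positives and true negatives (for specificity).
    import numpy as np
    from statsmodels.stats.contingency_tables import StratifiedTable

    tables = [
        np.array([[4, 46], [2, 48]]),   # matrix 1 (hypothetical counts)
        np.array([[6, 44], [5, 45]]),   # matrix 2
        np.array([[3, 47], [4, 46]]),   # matrix 3
    ]

    result = StratifiedTable(tables).test_equal_odds(adjust=False)  # Breslow-Day test
    print(f"Breslow-Day statistic = {result.statistic:.3f}, p = {result.pvalue:.3f}")
    # A significant result suggests the chi-square comparison should be run separately
    # for each matrix rather than once over all matrices.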
8.5.1.3 Method Comparability Conclusions
A decision on whether the ATP or new method is comparable to the EPA-approved reference method will
generally be made based on the results of the false positive and false negative rate comparisons.
Generally, the decision on acceptability of the new method should be made based on the chi-square test
using data from all matrices. However, if the results of the Breslow-Day test indicate that there is a
significant interaction between method and matrix for false positive and/or false negative rates, then further
review should be made for those matrices which yielded a higher false positive and/or false negative rate
for the proposed method.
If the results of the chi-square tests indicate that the false positive and false negative rates of the ATP or
new method are not significantly different from the EPA reference method, this generally will be
interpreted as not having enough evidence to conclude that the performance of the ATP or new method is
worse than the EPA-approved reference method. However, if the results of the chi-square test indicate that
the false positive and/or false negative rates of the new method are significantly greater than those of the EPA-
approved reference method, this will generally be interpreted as worse performance and would lead to
rejection of the ATP or new method.
8.5.2 Quantitative Methods
8.5.2.1 Testing for Normality
Many of the statistical analyses used to assess differences in recovery and precision between methods
require that certain assumptions about the data be met. Because the validity of these assumptions affects which statistical tests will be used to assess recovery and precision differences, testing these assumptions is a necessary first step. There are two assumptions that should be evaluated prior to comparing the two
methods statistically. One assumption is that the variability of the replicates is constant for each matrix
and method. This assumption is discussed in Section 8.5.2.2. The other main assumption that should be
met is that the data follow a normal distribution.
An assessment of normality can be done either graphically or with statistical tests, or both. A normal
distribution looks like a bell curve (i.e., symmetric around the mean). A graphical assessment can be done
using a histogram, stem-and-leaf, or Normal probability plot. Appropriate statistical tests include the
Shapiro-Wilk test, D'Agostino test, and Kolmogorov-Smirnov test. These tests are computationally intensive,
and it may be necessary to use software (e.g., SAS) to run them. The Shapiro-Wilk test is generally
inappropriate for testing the normality assumption of a data set with greater than 50 results.
For environmental data (especially field data, though sometimes spiked data as well), the distribution of
results will often be positively skewed (i.e., with a few unusually high results). A common corrective action is the logarithmic transformation, base e (natural log). If the data are
positively skewed, this transformation should be attempted, assuming that none of the results are negative
or equal to zero. When zero-valued results are present in a skewed data set, a common approach is to add
a small constant to every result prior to transformation. However, the arbitrary choice of this constant will
have a large effect on the transformed data (the log of 1 and the log of 0.1 are very different, for example),
and this approach is not recommended. The non-parametric tests discussed later in this guidance are the appropriate alternatives.
If no zero-valued results are present in a positively skewed data set, then the log-transformed data should
be tested for normality. If the log-transformed data do follow a Normal distribution, then all analyses
should be done using the log-transformed results.
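For illustration only, the normality check and the natural-log transformation described above could be performed as follows; the recovery values are hypothetical.

    # Illustrative only: Shapiro-Wilk test on raw and ln-transformed results.
    import numpy as np
    from scipy.stats import shapiro

    recoveries = np.array([12.0, 15.5, 9.8, 22.1, 11.4, 48.7, 13.9, 17.2, 10.6, 14.8])

    w_raw, p_raw = shapiro(recoveries)           # raw (positively skewed) data
    w_log, p_log = shapiro(np.log(recoveries))   # natural-log transformed data

    print(f"raw data: W = {w_raw:.3f}, p = {p_raw:.3f}")
    print(f"ln data:  W = {w_log:.3f}, p = {p_log:.3f}")
    # If the raw data fail the normality test but the ln-transformed data do not,
    # subsequent analyses would be carried out on the ln-transformed results.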
8.5.2.2 Evaluating Precision
The decision of which statistical test to use to evaluate precision will generally be based on the results of the normality test described in Section 8.5.2.1. If the assumption of normality is met, then the F-test for
differences in variance should be used. If the assumption of normality is not met, a non-parametric test
such as the Conover Squared-Rank test should be used. The F-test is based on comparisons of variances
(i.e., the squared standard deviations). The Conover Squared-Rank test is based on ranking absolute
deviations from the mean. Figure 8-1 provides a summary of the procedure for evaluating precision.
Prior to testing whether there is a precision difference between methods, the precision of the different
matrices should be compared for each method.
If the results of the F-test indicate that variances differ significantly by matrix, or if the results of the
Conover Squared-Rank test indicate that absolute deviations from the mean differ significantly by matrix,
then comparisons of precision between methods should be done separately for each matrix. Otherwise, a
single comparison of method precision can be done using the data from all matrices. If the results of the F-
test indicate that there is not enough evidence to conclude that the variances differ significantly by matrix,
then the pooled within-matrix variance (i.e., the average of the matrix variances for the given method)
should be calculated for each method. The F-test should then be used to compare the pooled within-matrix
variance for the two methods. If the Conover Squared-Rank test was used and there was not enough
evidence to conclude that the mean absolute deviation differed significantly between matrices, then the
Conover Squared-Rank test should be used to compare the mean absolute deviations from the overall
method mean for the two methods.
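As an illustrative sketch only, an F-test comparing the variances of two methods could be carried out as follows; the data are hypothetical.

    # Illustrative only: F-test for a difference in variance (precision) between two methods.
    import numpy as np
    from scipy.stats import f

    new_method = np.array([10.2, 11.8, 9.5, 12.4, 10.9, 11.1])
    ref_method = np.array([10.6, 10.9, 11.2, 10.4, 11.0, 10.7])

    var_new = np.var(new_method, ddof=1)
    var_ref = np.var(ref_method, ddof=1)
    F = var_new / var_ref
    df_new, df_ref = len(new_method) - 1, len(ref_method) - 1

    # Two-sided p-value for the variance-ratio test
    p_one_sided = f.sf(F, df_new, df_ref) if F > 1 else f.cdf(F, df_new, df_ref)
    p_two_sided = min(1.0, 2 * p_one_sided)
    print(f"F = {F:.3f}, two-sided p = {p_two_sided:.3f}")
    # A significant result would indicate a precision difference between the methods.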
Figure 8-1. Evaluating Precision

[Flowchart summary: If the data are normally distributed, compare variances between matrices for each method using the F-test; otherwise, compare absolute deviations from the method mean using the Conover squared-rank test. If matrices do not differ significantly, pool variances over matrices for each method and compare the pooled variances (or mean absolute deviations) for the two methods; if the new method is significantly less precise, the outcome is failure, otherwise evaluate recovery. If matrices differ significantly, compare precision separately for each matrix; if the new method is at least as precise for at least 80% of matrices, evaluate recovery, otherwise the outcome is failure.]
8.5.2.3 Evaluating Recovery
Comparisons of mean recovery of the different methods should be done using either an analysis of variance
(ANOVA) model or non-parametric test such as the Wilcoxon-Mann-Whitney (WMW) test. If the
assumption of normality is not met, or if the precision differs between methods or matrices based on the
precision evaluation described in the previous section, the WMW test should be used. If all assumptions
are met, the ANOVA model should be used. Figure 8-2 provides a summary of the procedure for
evaluating recovery.
If an ANOVA model is used, method-by-matrix interactions should be tested for significance first using
the corresponding ANOVA F-test. If the method-by-matrix interaction is significant, it would mean that
the difference between methods is not the same for each matrix. In this case, no conclusion regarding an
overall difference between the recoveries of methods can be made. If the interaction is not significant, then
the test for a significant difference between methods can be run in the ANOVA model, using the F-test for
a significant method main effect.
If the WMW test is used, between-matrix and between-laboratory variability cannot be separated from
replicate variability. Therefore, significant interactions between method and laboratory or matrix cannot be
tested when the WMW test is used. Instead, the WMW test only assesses an overall difference between
methods. The model can be run separately for each matrix, but the diminished power will limit the value
of this approach. It is therefore recommended that a single WMW test be run comparing method
recoveries, without stratifying by or controlling for matrix type.
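For illustration only, the ANOVA model with a method-by-matrix interaction and the single WMW comparison could be set up as shown below, assuming the pandas, statsmodels, and SciPy packages; the data frame and its values are hypothetical.

    # Illustrative only: two-way ANOVA (method x matrix) for recovery, plus a WMW fallback.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols
    from scipy.stats import mannwhitneyu

    df = pd.DataFrame({
        "recovery": [85, 90, 88, 78, 82, 80, 91, 87, 89, 79, 83, 81],  # hypothetical
        "method":   ["new"] * 6 + ["ref"] * 6,
        "matrix":   ["A", "A", "B", "B", "C", "C"] * 2,
    })

    # ANOVA with a method-by-matrix interaction term
    model = ols("recovery ~ C(method) * C(matrix)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))

    # Non-parametric alternative: a single WMW test disregarding matrix
    u_stat, p_value = mannwhitneyu(df.loc[df["method"] == "new", "recovery"],
                                   df.loc[df["method"] == "ref", "recovery"],
                                   alternative="two-sided")
    print(f"WMW U = {u_stat:.1f}, p = {p_value:.3f}")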
Figure 8-2. Evaluating Recovery

[Flowchart summary: If the assumptions for ANOVA are not met, run a single WMW test comparing method recovery, disregarding matrices; if recovery is significantly lower for the new method, the outcome is failure, otherwise evaluate false positive and false negative rates. If the assumptions are met and there is no significant method-by-matrix interaction, run an ANOVA model testing for a method difference while controlling for matrix; if recovery is significantly lower for the new method, the outcome is failure, otherwise evaluate false positive and false negative rates. If there is a significant interaction, run the ANOVA model separately for each matrix and count the matrices where the new method is equal to or significantly better; if recovery is significantly lower for the new method for at least 20% of matrices, the outcome is failure, otherwise evaluate false positive and false negative rates.]
8.5.2.4 False Positive Rates, False Negative Rates, Sensitivity, and Specificity
Comparisons of false positive rates, false negative rates, sensitivity, and specificity should be conducted
according to the methods described in Sections 8.5.1.1 and 8.5.1.2.
8.5.2.5 Method Comparability Conclusions
If there is not enough evidence to conclude that there is a significant interaction between method and
matrix for recovery, or if there is not enough evidence to conclude that there is a statistically significant
difference in precision between matrices for either method, then it can be concluded that any differences between the methods are consistent across matrices. Therefore, lack of sufficient evidence to conclude a significant difference between methods will generally be interpreted as equal or better performance, and might lead to acceptance of the ATP or new method, pending review of false positive and false negative rates. However, a statistically significant result indicating that the ATP or new method performs worse will generally lead to rejection of the new method.
In cases where significant interactions between matrix and method are found when assessing recovery, or in cases where differences in precision between matrices are found for at least one method (and therefore a comparison of method precision could not be done over all matrices), some judgment will be necessary in deciding whether the proposed method should be deemed acceptable. The decision should be based on the attribute (i.e., recovery or precision) for each matrix. As a general rule, if there was not enough evidence to conclude that the new method was worse than the EPA-approved reference method for that attribute for at least 80% of the matrices used in the study, then the new method can generally be recommended for approval, pending review of false positive and false negative rates.
If there is not enough evidence to conclude that the new method is worse than the EPA-approved reference
method for at least 80% of the matrices for both precision and recovery, the false negative rate and false
positive rate should next be compared, based on the methods described in sections 8.5.1.1 and 8.5.1.2. If
the results of the chi-square test indicate that there is not enough evidence to conclude that the false
negative rate or false positive rate of the ATP or new method is worse than that of the EPA-approved
reference method, this will be interpreted as equal or better performance and might lead to acceptance of
the new method. However, if the results of the chi-square test indicate that the false negative rate or false
positive rate of the new method is significantly greater than that of the EPA-approved reference method,
this will generally be interpreted as worse performance for 100% of the matrices used in the study, and
would lead to rejection of the ATP or new method.
For example, suppose side-by-side testing is done in one laboratory for ten different matrices. For
recovery, there is not enough evidence to conclude that the new method is significantly worse than the
EPA-approved reference method for eight of the ten matrices. For precision, there is not enough evidence
to conclude a significant matrix effect, and there is not enough evidence to conclude a significant
difference between methods. There is not enough evidence to conclude the false negative rate or false
positive rate of the ATP or new method is worse than the EPA-approved reference method. Under these
circumstances, the ATP or new method may be recommended for approval. However, if there is not enough evidence to conclude that the new method is worse than the approved method for only seven of the ten matrices for recovery, then the ATP or new method will not, generally, be recommended for approval.
8.5.3 QC Acceptance Criteria-Based Comparison Studies
For methods with QC acceptance criteria, necessary calculations and QC criteria are typically provided in
the method. Generally, QC criteria will include both recovery and precision specifications. Recovery
criteria take the form of a recovery interval for either a single result or mean of multiple results. Precision
criteria will generally be a maximum standard deviation, RSD, or RPD. Details on calculations that may
be necessary for the comparisons are described below.
8.5.3.1 Recovery
Recovery results will generally be necessary for data spiked into both reagent water (IPR, IDC, etc.) and
source water (MS, MSD, etc.). For reagent water, percent recovery should be calculated as follows:
% Recovery = (Result / Spike) * 100%
Where,
Result = the amount recovered from the sample after spiking
Spike = the estimated amount spiked into the sample.
For source water, percent recovery should be calculated as follows:
% Recovery = ((Result - Background) / Spike) * 100%
Where,
Result = the amount recovered from the sample after spiking,
Spike = the estimated amount spiked into the sample, and
Background = the estimated background amount measured in the sample prior to spiking.
For some QC criteria, it may be necessary to calculate the mean of the sample recoveries prior to
comparing the results to the criteria. Generally, this will be specified with that given method.
8.5.3.2 Precision
Calculation of an RSD, RPD, or standard deviation is generally necessary for precision criteria. For source
water data, the calculation should be based on the recovered amount, rather than the percent recovery.
This is because the calculated percentage will be inflated if the background amount is large compared to
the total amount recovered. For reagent water-spiked data, where the background count is zero, the
calculation can be made either using the recovered amount or the percent recovery.
Where the precision criteria are to be based on two results, the precision criterion will generally be an
RPD. The equation for an RPD is given below:
RPD = |Amount1 - Amount2| / ((Amount1 + Amount2) / 2) * 100%
Where,
Amount1 and Amount2 = the two recovered amounts.
Where the precision criterion is to be based on more than two results, the precision criterion will generally
be either a standard deviation or RSD. An RSD is calculated based on the equation below:
RSD = (SD / Mean) * 100%
Where,
SD = the standard deviation of all recovered amounts
Mean = the mean of all recovered amounts
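For illustration only, the recovery, RPD, and RSD calculations in Sections 8.5.3.1 and 8.5.3.2 could be implemented as simple helper functions; the function names and sample values are hypothetical.

    # Illustrative only: percent recovery, RPD, and RSD helpers.
    import statistics

    def percent_recovery(result, spike, background=0.0):
        # Background defaults to zero for reagent water spikes.
        return (result - background) / spike * 100.0

    def rpd(amount1, amount2):
        # Relative percent difference of two recovered amounts.
        return abs(amount1 - amount2) / ((amount1 + amount2) / 2.0) * 100.0

    def rsd(amounts):
        # Relative standard deviation (%) of a set of recovered amounts.
        return statistics.stdev(amounts) / statistics.mean(amounts) * 100.0

    print(percent_recovery(result=48, spike=50))                 # reagent water spike
    print(percent_recovery(result=63, spike=50, background=12))  # source water spike
    print(rpd(48, 52))
    print(rsd([48, 52, 50, 47, 51]))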
8.5.3.3 Presence/Absence Criteria
QC criteria for presence/absence methods may be defined as false positive and/or false negative rates,
sensitivity and/or specificity, or as a specified proportion of positive and/or negative results out of a given
number of samples. All calculations should be specific to the given situation and should be defined
explicitly therein.
8.6 Method Recommendation and Approval
Generally, after completion of the technical and statistical reviews for nationwide-use applications, the
Director of Analytical Methods, Attn: ATP Program Coordinator (see Table 2-1 and Appendix E) will
prepare a recommendation for approval/disapproval of the ATP or new method and notify the applicant of
the recommendation.
If the data evaluation demonstrates that the applicant's method performs at least as well as the EPA-
approved reference method, the Director of Analytical Methods, Attn: ATP Program Coordinator will
generally recommend approval to the Office of Ground Water and Drinking Water (OGWDW) or
appropriate office, which begins the regulation development process. Regulation development includes a
Federal Register notice proposing to approve an ATP, public comment on the proposed method, and
(depending on public comment) a final rule published in the Federal Register that approves the method.
Generally, the regulation development process may take one year or more.
For limited-use ambient water or wastewater applications the Regional Administrator (see Table 2-1 and
Appendix E) will generally prepare a recommendation of approval/disapproval of the ATP or new method
and notify the applicant of the recommendation after completion of the technical and statistical reviews.
SECTION 9.0 STUDY REPORT
Laboratories or other organizations responsible for developing ATPs or new methods should document the
results of the side-by-side comparison study or QC acceptance criteria-based comparison study in a formal
study report that contains the elements described in this section.
The information and supporting data included in the study report should be sufficient to enable EPA to
evaluate the performance of the ATP or new method and make a decision on whether it is comparable to
the reference method. The applicant is responsible for ensuring that all method-specified criteria are met
by the laboratory(ies) involved in the study and that the study report contains all data from the
laboratory(ies). A copy of all comparison study data should be maintained at the participant laboratory(ies)
or other organization responsible for developing the ATP or new method.
Like the study plan developed and approved by EPA before the study was performed, the study report
contains background information and describes the study design. However, the study report also details
the process and results of the study, provides an analysis and discussion of the results, and presents study
conclusions. The approved study report should also identify and discuss any deviations from the study
plan that were made in implementing the study, and the study plan should be appended to and referenced
in the study report. Significant deviations from the study plan without prior EPA approval could result in
the rejection of the study data.
The study report should be organized into the following sections:
• Background
• Study Objectives and Design
• Study Implementation
• Data Reporting and Validation
• Results
• Data Analysis and Discussion
• Conclusions
• Appendix A - Method
• Appendix B - Study Plan
• Appendix C - Supporting Data
• Appendix D - Supporting References
Details on the information that should be included in each of these sections are provided below, in Sections
9.1 through 9.11.
9.1 Background
This section of the study report should describe the ATP or new method that was tested, and identify the
organization responsible for developing the ATP or new method. The background section of the study
report should include the following information:
• A method summary
• The organization, method number, and title for the ATP or new method
• The method number or title and publication number (given in 40 CFR parts 136, 141, and 405-503)
for the EPA-approved reference method that is being used for demonstrating method comparability
(i.e., the reference method)
• A description of the nature of the ATP (e.g., alternate media, alternate concentration technique, etc.)
• The matrices, matrix types, and/or media to which the ATP or new method is believed to be applicable
• The analyte(s) measured by the ATP or new method, including corresponding CAS Registry or other
identification numbers, when available
9.2 Study Objectives and Design
This section of the study report should identify the overall objectives and data quality objectives of the
study and briefly describe the study design. This information should be consistent with the study
objectives and design specified in the approved study plan. Any study limitations should be identified.
The approved study plan should be appended to the study report.
9.3 Study Implementation
This section of the study report should describe the methodology and approach undertaken in the study.
This section should include the following information:
• The organization that was responsible for managing the study
• The laboratories, facilities, and other organizations that participated in the study, how participating laboratories were selected, and the role of each organization involved in the study
• The type of study performed (i.e., side-by-side or QC acceptance criteria-based comparison study)
• The study schedule that was followed
• A brief description of how sample matrices were chosen, including, for QC acceptance criteria-based
comparison studies, a statement of compliance with recommendations for matrix type selection
• A description of any preliminary testing conducted prior to the side-by-side or QC acceptance criteria-
based comparison study (e.g., method validation, physical and chemical assessment of the matrices,
preliminary range-finding analyses)
• The numbers and types of analyses performed by the participating laboratories
• A description of how samples were collected, distributed, and stored
• The source and strain of the organism used for sample spiking
• If spikes were quantified, a description of how estimated true spike values were determined, along with all supporting data
• The type of water used in the preparation of sample dilutions if not specified by the method (e.g.,
reagent water, phosphate buffered water, phosphate buffered saline, etc.)
• Any problems encountered with samples, spiking organisms, equipment, etc. and their subsequent
resolution
• Any communications with EPA relevant to the study, such as clarification of the study design or
approved changes to the study plan
• Any deviations from the study plan and their impact on study performance and/or results
9.4 Data Reporting and Validation
This section of the study report should describe the procedures that were used to report and validate study
data. While EPA generally does not use a standard format for analytical data submission, a list of
necessary data elements and an example bench sheet may be found in Appendix D of this document.
9.5 Results
This section of the study report presents the study results. Results may be presented in a summary table
that lists the recovery or concentration of each sample, by test method, laboratory, and matrix.
For QC acceptance criteria-based comparison studies, results should indicate the QC test associated with
each sample (e.g. IPR, method blank, MS/MSD, unspiked matrix sample). Raw data and example
calculations should be submitted, and should be included in an appendix to the study report (see Section
9.10.1).
9.6 Data Analysis and Discussion
This section of the study report provides a statistical analysis and discussion of the study results.
Recovery, precision, false positive rates, false negative rates, specificity, and sensitivity, as appropriate,
should be calculated by test method, laboratory, and matrix, and summarized in a tabular format that
includes the mean, standard deviation, and relative standard deviation. The discussion should address any
discrepancies between the results and comparability guidelines, or, for QC acceptance criteria-based
comparison studies, any discrepancies between the results and the QC acceptance criteria of the EPA-
approved reference method.
9.7 Conclusions
This section of the study report should describe the conclusions drawn from the study based on the data
analysis discussion. The section should contain a statement(s) regarding achievement of the study
objective(s).
9.8 Appendix A - Method
The ATP or new method, prepared in accordance with EPA's Guidelines and Format document (Section
3.0 and Reference 10.14), should be appended to the study report.
9.9 Appendix B - Study Plan
The study plan approved by EPA (Section 4.0) should be appended to the study report.
9.10 Appendix C - Supporting Data
The study report should be accompanied by raw data, quality control information, and example
calculations that support the summary results presented in the report.
9.10.1 Raw Data
Appendix C of the study report should include sufficient raw data so that an independent reviewer can
verify each determination and calculation performed by the laboratory or the study coordinator. This
verification consists of tracing all steps of the method to the final result reported. The raw data are,
generally, method-specific and may include but are not limited to the following:
• Sample numbers or other identifiers used by the laboratory
• Sample collection dates and times
• Verification that method-specified QC procedures were met for test samples and all associated QC
samples
• Analysis dates and times for all steps in the method
• Sample volume
• Any measurements of ancillary parameters (i.e., temperature, pH, turbidity, percent solids, etc.)
• Results for all intermediate steps in the method
• Preliminary data steps to determine the final result
• Final result
• If appropriate, quantitation reports, data system outputs, and other data to link the raw data to the
results reported
• Laboratory bench sheets and copies of all pertinent logbook pages for all sample preparation and
cleanup steps, and for all other parts of the determination
• Temperature logs for water baths, incubators, refrigerators, etc.
• Media preparation information
• If appropriate, direct instrument readouts and other data to support the final results
Raw data are generally needed for all samples, positive and negative controls, sterility checks,
verifications, blanks, matrix spikes and duplicates, and other QC analyses specified in the EPA-approved
reference method. Data should be organized so that a microbiologist can clearly understand how the
analyses were performed. The names, titles, addresses, and telephone numbers of the analysts who
performed the analyses and of the quality assurance officer who verified the analyses should be provided.
For instruments involving data systems, raw data on magnetic tape or disk should be made available on
request.
9.10.2 Electronic Data Reporting
In addition to the hard copy raw data, applicants should also submit data in electronic format (Excel
spreadsheet, or equivalent) so that EPA can create a database of study results. EPA anticipates that this
database will facilitate automated review and statistical analysis of study results. The information included
in electronic format may include: laboratory, analyst, method, sample type, sample number, date and time
of analysis, volume analyzed, replicate number, raw data, and calculated results. The applicant should
discuss an appropriate electronic format with EPA prior to data submission.
9.10.3 Example Calculations
Generally, the study report should provide example calculations that will allow the data reviewer to
determine how the laboratory used the raw data to arrive at the final results. Useful examples include both
detected analytes and undetected analytes. If the laboratory or the method employs a standardized
reporting level for undetected analytes, this should be made clear in the example, as should adjustments for
sample volume, etc.
9.11 Appendix D - Supporting References
Hard copies of all references and supporting documentation for the ATP or new method should be attached
to the study report as an appendix. The list of references may contain links to websites, or documents
available on-line. However, a hard copy should be submitted with the final study report.
SECTION 10.0 REFERENCES
10.1 AOAC. 1999. Qualitative and Quantitative Microbiology Guidelines for Methods Validation,
Journal of AOAC International, Vol. 82, No. 2.
10.2 APHA. 1998. Standard Methods for the Examination of Water and Wastewater. 20th Edition.
American Public Health Association. 1015 15th Street, NW, Washington, DC 20005.
10.3 APHA. 1995. Standard Methods for the Examination of Water and Wastewater. 19th Edition.
American Public Health Association. 1015 15th Street, NW, Washington, DC 20005.
10.4 APHA. 1992. Standard Methods for the Examination of Water and Wastewater. 18th Edition.
American Public Health Association. 1015 15th Street, NW, Washington, DC 20005.
10.5 ASTM. 1999. D4855-91: Standard Practice for Comparing Test Methods, ASTM Standards on
Precision and Bias for Various Applications. 1999 Annual Book of ASTM Standards: Water and
Environmental Technology, Volume 7.02. 100 Barr Harbor Drive, West Conshohocken, PA
19428.
10.6 ATCC. http://www.atcc.org
10.7 Fleiss, J. L. 1981. Statistical Methods for Rates and Proportions, 2nd ed., John Wiley & Sons, New York, NY.
10.8 ISO. 2001. CD17994. Water Quality - Criteria for the Establishment of Equivalence Between
Microbiological Methods, Final Version, June 15, 2001.
10.9 National Environmental Laboratory Accreditation Conference. 2001. National Environmental
Laboratory Accreditation Conference: Quality Systems. Approved May 25, 2001, effective July 1,
2003 unless otherwise noted. Appendix D - Essential Quality Control Requirements, section D.3,
pp D-15 to D-16.
10.10 Title 40, Code of Federal Regulations, Sections 136.4, 136.5, and 141.27.
10.11 USEPA. 2001. EPA Requirements for Quality Management Plans. USEPA Office of
Environmental Information. EPA/240/B-01/002.
10.12 USEPA. 1999. Environmental Regulations and Technology: Control of Pathogens and Vector Attraction in Sewage Sludge. USEPA Office of Research and Development. EPA/625/R-92/013,
revised October 1999.
10.13 USEPA. 2005. Manual for the Certification of Laboratories Analyzing Drinking Water: Criteria
and Procedures Quality Assurance, Fifth Edition. USEPA Office of Ground Water and Drinking
Water. EPA-815-R-05-004.
10.14 USEPA. 1996. Guidelines and Format for Methods to be Proposed at 40 CFR Part 136 or Part
141. USEPA Office of Science and Technology. EPA-821-B-96-003.
10.15 USEPA. 1995a. Presence/Absence Membrane Filter Methods for Finished Waters, USEPA
Protocol for Alternate Test Procedures for Coliform Bacteria in Compliance with Drinking Water
Regulations, Version 1.2, December 1995. USEPA Office of Research and Development,
Cincinnati, OH.
10.16 USEPA. 1995b. Quantitative Membrane Filter Methods, USEPA Protocol for Alternate Test
Procedures for Coliform Bacteria in Compliance with Water and Wastewater Regulations, Version
1.0, December 1995. USEPA Office of Research and Development. Cincinnati, OH.
10.17 USEPA. 1995c. Presence/Absence Liquid Culture Methods for Finished Waters, USEPA Protocol
for Alternate Test Procedures for Coliform Bacteria in Compliance with Drinking Water
Regulations, Version 1.2, December 1995. USEPA Office of Research and Development.
Cincinnati, OH.
10.18 USEPA. 1989. Memorandum: Analytical Methods for Compliance and Limited Alternate Test
Procedures Approvals. December 27, 1989.
10.19 USEPA. 1979. Handbook for Analytical Quality Control in Water and Wastewater Laboratories.
EPA-600/4-79-019. Environmental Monitoring and Support Laboratory, Cincinnati, OH. March
1979.
APPENDIX A: GLOSSARY
40 CFR part 136 — Title 40, part 136 of the Code of Federal Regulations. This part specifies
approved test procedures for the analysis of pollutants regulated under the Clean Water Act.
40 CFR part 141 — Title 40, part 141 of the Code of Federal Regulations. This part specifies EPA's National Primary Drinking Water Regulations pursuant to the Safe Drinking Water Act; Subpart C of 40 CFR part 141 lists analytical methods required for monitoring under the Act.
95% confidence interval — A statistical interval indicating a 95% probability that the parameter value is enclosed within the given data interval.
Acceptable version — An acceptable version is a method that is either identical to the approved
method or exercises the flexibility explicitly allowed in the method. See "minor
modification."
Accuracy — The degree of agreement between an observed value and an accepted reference
value. Accuracy includes random error (precision) and systematic error (recovery) that are
caused by sampling and analysis.
Aliquot — A representative portion of a sample.
Ambient water—Ambient water refers to any fresh, marine, or estuarine surface water used for
recreation; propagation of fish, shellfish, or wildlife; agriculture; industry; navigation; or as
source water for drinking water facilities.
Analysis of variance (ANOVA) — A study of the effect of a set of qualitative variables on a
quantitative response variable, based on a decomposition of the variance of the response
variable.
Analyte — The target organism or class of organisms that are measured by the method.
Analyte of concern — An analyte designated by EPA to adversely affect or have the potential to
adversely affect human health, the environment, aesthetics, or the senses. Analytes of concern
are listed in approved methods.
Approved method — A testing procedure (analytical method) promulgated at 40 CFR parts 136,
141, 405-500, and other parts of the CFR that support EPA's water programs.
Average percent recovery — The average of the recovery, expressed as percent. See recovery.
Bias — A systematic or persistent distortion of a measurement process that deprives the result of
representativeness; i.e., the expected sample measurement is different than the sample's true value.
A data quality indicator. (QAMS)
Blank— See "method blank."
Bulk sample — A large sample that is aliquoted into smaller volumes prior to analyses.
Calibration — The process of establishing the relationship between the concentration or
amount of material introduced into an instrument or measurement process and the output signal.
Calibration verification — Means of establishing that the instrument performance remains within
pre-established limits.
Code of Federal Regulations — A codification of the general and permanent rules published in
the Federal Register by the Executive departments and agencies of the Federal Government.
Comparability test — See side-by-side comparison.
Confidence interval — The numerical interval constructed around a point estimate of a
population parameter, combined with a probability statement (the confidence coefficient) linking
it to the population's true parameter value. If the same confidence interval construction
technique and assumptions are used to calculate future intervals, they will include the unknown
population parameter with the same specified probability. (EMMC)
Confirmed counts — Organism counts that have been verified to ensure proper identification.
Conover Squared-Rank test — A nonparametric test for equality of variability, based on the
joint squared ranks of deviations from the means. (SPRENT)
Contract laboratory — Private, academic, or commercial laboratory under contract to EPA or
other organization to perform testing.
D'Agostino test — A statistical test for determining whether a given set of results follow a
normal or log-normal distribution. Best used for datasets with at least 50 results. (GILBERT)
Data quality objective — Qualitative and/or quantitative statement of the overall level of
uncertainty that a decision-maker is willing to accept in results or decisions derived from
environmental data. Data quality objectives provide the statistical framework for planning and
managing environmental data operations consistent with the data user's needs. (EMMC)
Determinative technique — The physical and/or chemical process by which measurement of the
identity and concentration of an analyte is made.
Differential medium — A solid culture medium that makes it easier to distinguish colonies of the
target organism.
Dilution/rinse water blank— An aliquot of dilution/rinse water that is treated exactly like a
sample and carried through all portions of the procedure until determined to be negative or
positive. The dilution/rinse water blank is used to determine if the sample has become
contaminated by the introduction of a foreign microorganism through poor technique.
Discharge — Generally, any spilling, leaking, pumping, pouring, emitting, emptying or
dumping (40 CFR 109.2; 110.1; 116.3); also, see "discharge of a pollutant" (40 CFR 122.2); the
medium that is spilled, leaked, pumped, poured, emitted, emptied, or dumped.
Discharge of pollutant — Any addition of any pollutant or combination of pollutants to
(1) waters of the U.S. from any point source or (2) to the waters of the contiguous zone or the
ocean from any point source other than a vessel or other floating craft which is being used as a
means of transportation (40 CFR 122.2; 401.11)
Duplicate — A second sample collected from the same sampling point at the same time the
original sample is collected and analyzed exactly like the original sample. Duplicate samples
can be used as a measure of sample variability.
Effluent — A medium that flows out of a point source, e.g., the discharge from a sewage treatment
plant.
Enrichment — Using a culture medium to enhance growth of the target organism prior to
isolation of that organism.
Explicit flexibility — Modifications that are explicitly allowed in an approved method.
F distribution — A type of sampling distribution for a random variable; the ratio of two chi-square random variables, each divided by its respective degrees of freedom.
F-test — In an Analysis of Variance, a test for the equality of factor level means (such as for
different methods) or for the presence of an interaction between two factors (such as method and
matrix). (RICE) (ASTM)
Facility — A plant or group of plants within a single location that is regulated under the CWA
and/or SDWA. A single facility may have multiple water supplies, discharges, waste streams, or
other environmental media that are subject to compliance monitoring. For example, a single
facility within the Pulp, Paper, and Paperboard industrial category may have a direct
discharge, an indirect discharge, and an in-process waste stream, all of which are subject to
compliance monitoring.
False negative — A target organism incorrectly identified as a non-target organism or not
identified at all using the method of interest.
False positive — A non-target organism incorrectly identified as the target organism using the
method of interest.
False negative error rate — The proportion of target organisms incorrectly identified as a non-
target organism or not identified at all using the method of interest, equal to (1 - Sensitivity). In
statistical testing, the rate at which one falsely accepts a statistical hypothesis (such as that a
difference between methods does not exist) based on a statistical test, when the hypothesis is
actually false. Abbreviated as β, and also referred to as the Type II error rate. (ASTM)
False positive error rate — The proportion of non-target organisms incorrectly identified as the
target organism using the method of interest, equal to (1 - Specificity). In statistical testing,
the rate at which one falsely rejects a statistical hypothesis (such as that a difference
between methods does not exist) based on a statistical test, when the hypothesis is actually
true. Abbreviated as α, and also referred to as the Type I error rate. (ASTM)
Federal Register — A daily publication that provides a uniform system for publishing
Presidential and Federal agency documents. Documents published in the Federal Register make
changes to the CFR to keep the CFR current. (OFR)
Guidelines and Format — The document titled Guidelines and Format for Methods to be
Proposed at 40 CFR Parts 136 and 141; available from the National Technical Information
Service (NTIS), U.S. Department of Commerce, Springfield, Virginia, 22161 (703-487-4600) as
NTIS publication PB96-210448.
Histogram — A bar diagram of the distribution of a set of analytical results. The range of
values is categorized into sets of subintervals, or bins, and the number of results within each bin
is displayed as the height of the bars. (BERRY)
Industrial category — A category listed in 40 CFR parts 405-503.
Industrial subcategory— A subcategory defined at 40 CFR parts 405-503.
Initial demonstration of capability — A test performed to establish the ability to demonstrate
control over the analytical system and to demonstrate acceptable performance.
Initial precision and recovery — The analysis of a minimum of four spiked reagent water
samples under the same conditions as will be used for analysis of environmental samples. The
IPR is used to demonstrate that a laboratory is able to produce reliable results with the method
prior to analysis of environmental samples.
Interaction — The situation where the effect of one variable (such as method type) on a
dependent variable (such as recovery) is affected by the value of a third variable (such as matrix).
Interference — A positive or negative effect on a measurement caused by a substance other
than the one being investigated. (QAD)
Interlaboratory — Occurring in multiple laboratories.
Intralaboratory — Occurring within a single laboratory.
Kolmogorov-Smirnov test — A statistical test for determining whether a given set of results
follow a normal distribution or any other specified distribution. (GILBERT)
Limited use — Use of a method by a single regulated entity or laboratory for analysis of one or
more matrix types.
Log-normal — A distribution of a random variable X such that the natural logarithm of X is
normally distributed.
Log-phase — Bacterial growth phase in which the logarithm of the bacterial biomass increases
linearly with time.
Main effect — Situation where a variable (such as method type) has a consistent effect on a
dependent variable (such as recovery).
Matrix— The component or substrate that contains the analytes of interest.
Matrix effect — Variability in the analytical performance of a method that can be attributed to
the type of sample analyzed.
Matrix spike — A sample prepared by adding a known mass of target analyte to a specified
amount of a sample matrix for which an independent estimate of target analyte concentration is
available. A matrix spike is used, for example, to determine the effect of the matrix on a
method's recovery efficiency. (QAMS)
Matrix spike duplicate — A replicate of the matrix spike to test precision. The MS/MSD is
used in combination to test the precision of an analysis. (QAD)
Matrix type — A sample medium with common characteristics across a given industrial
category or subcategory. For example, C-stage effluents from chlorine bleach mills, effluent
from the continuous casting subcategory of the iron and steel industrial category, POTW sludge,
and in- process streams in the Atlantic and Gulf Coast Hand-shucked Oyster Processing
subcategory are each a matrix type. For the purposes of this initiative all drinking waters
constitute a single matrix type.
May — This action, activity, or procedural step is neither required nor prohibited.
May not — This action, activity, or procedural step is prohibited.
Measurement quality objective — Critical level which, if exceeded, is considered to append additional, and possibly unacceptable, measurement uncertainty to the corresponding data.
Method — A body of procedures and techniques for performing a task (e.g. sampling,
characterization, and quantitation) systematically presented in the order in which they are to be
executed. (QAD)
Method blank— An aliquot of reagent water or designated matrix that is treated exactly as a
sample, including exposure to all glassware, equipment, solvents, and procedures that are used
with samples. The method blank is used to determine if analytes or interferences are present
in the laboratory environment, the reagents, or the apparatus.
Method-defined analyte — An analyte without a specific, known composition where the analytical
result depends totally on the measurement procedure.
Method modification — A change made to an approved method.
Method validation — A process by which a laboratory or vendor establishes the
performance of a new method or substantiates the performance of a method modification.
Methods and Criteria— The document titled: Analysis of Pollutants in Municipal Water and
Industrial Wastewater: Test Procedures and Quality Control Acceptance Criteria, available
from the National Technical Information Service (NTIS), U.S. Department of Commerce,
Springfield, Virginia, 22161 (703-487-4600) as NTIS publication PB 96-210463, and
incorporated by reference into this part.
Mid-point response factor — The response factor at the concentration at which calibration is
verified.
Minor modification — A modified method that has been reviewed by EPA and has been
determined to be technically equivalent to a method approved for use in compliance monitoring.
A minor modification employs the same chemistry and/or biological principles as the approved
method to determine the presence/absence or to quantify the amount of the target organism in a
sample. Supporting data may be necessary to demonstrate that a minor modification will yield
results equivalent to those obtained using the approved method but does not require approval as
an alternate test procedure through proposal and promulgation in the Federal Register.
Modified method — An approved method that has been modified to change a front-end
technique. EPA will judge a modified method to be: 1) an acceptable version or minor
modification of a previously promulgated method, which does not require approval as an ATP or
2) a significantly different method which requires an application for an ATP approval.
Nationwide use — Use of a method by all regulated entities and laboratories for analysis of one
or more matrix types.
Navigable waters — All waters of the United States, including the territorial seas. (40 CFR
110.1)
Negative control — A non-target organism processed to ensure the laboratories are familiar
with the identification of the target organism and to ensure that confirmation test results are
appropriate.
New method — A method that employs a determinative technique for an analyte of concern that
differs from determinative techniques employed for that analyte in methods previously approved at
40 CFR part 136 or 141.
Nonparametric — A type of statistical analysis for which no assumptions about the underlying
distribution of the data are necessary.
Non-selective media — An enrichment medium that allows most bacteria to grow.
Normal probability plot— A graphical depiction of the distribution of a set of analytical
results. A normal probability plot is a scatter plot depicting the observed results compared to the
expected results based on a normal distribution. If the observed data follow a normal
distribution, the graph will display a line at a 45 degree angle from the x-axis. Also known as a
Q-Q plot.
Ongoing demonstration of capability — The laboratory needs to demonstrate that the analytical
system is in control on an ongoing basis through the analysis of ODC samples (positive
control/positive control duplicate).
Ongoing precision and recovery—A reagent water sample method blank spiked with known
quantities of analytes. The OPR is analyzed exactly like a sample. Its purpose is to assure that
the results produced by the laboratory remain within the limits specified within the method for
precision and recovery.
Other approved methods — Promulgated methods that are not designated as a reference method,
but continue to carry the same regulatory status.
Physical phase — The physical phase of a sample matrix (e.g., air, water, soil).
Positive control — A target organism that is analyzed to ensure that the laboratory is
performing the method acceptably and that the media is providing appropriate results.
Power — The probability that a statistical test will conclude that a difference (for example,
between methods) exists, when a difference truly does exist. Equal to 1 - β.
Precision — The degree to which a set of observations or measurements of the same property,
usually obtained under similar conditions, conform to themselves; a data quality indicator.
Precision is usually expressed as standard deviation, variance, or range, in either absolute or
relative terms.
The precision obtainable from an environmental measurement method may be estimated from
replicate analyses of subsamples taken from the same (homogenous) sample. Generally
speaking, the more carefully one executes the various steps of a method and controls the
variables affecting the method's capability, the more precise will be the results. The use of a
non-homogeneous sample will compound the precision estimate with the sample variability.
Preparation — Processing performed on a sample prior to analysis, e.g. extraction, concentration,
cleanup, etc.
Presumptive counts — Numbers of organisms based on results that have not been confirmed or
verified.
Procedures — A set of systematic instructions for performing an activity. (QAD)
Promulgated method — A method that has been published or incorporated by reference into
40 CFR parts 136, 141, 405-500, or other parts that support EPA's water programs (i.e., an
approved method).
Promulgation — Publication of a final rule in the FR.
Public water system (PWS) — A system for the provision to the public of piped water for human
consumption, if such system has at least fifteen service connections or regularly serves an
average of at least twenty-five individuals daily at least 60 days out of the year. Such term
includes (1) any collection, treatment, storage, and distribution facilities under control of the
operator of such system and used primarily in connection with such system, and (2) any collection
or pretreatment storage facilities not under such control which are used primarily in connection
with such system. A public water system is either a "community water system" or a "non-
community water system."
Quality assurance — An integrated system of activities involving planning, quality control,
quality assessment, reporting, and quality improvement to ensure that a product or service meets
defined standards of quality with a stated level of confidence. (QAD)
Quality control — The overall system of technical activities whose purpose is to measure and
control the quality of a product or service so that it meets the needs of users. The aim is to
provide quality that is satisfactory, adequate, dependable, and economical. (QAD)
QC acceptance criteria — Performance specifications developed from validation data and
used to control the limits within which an analytical method is operated.
QC acceptance criteria-based comparison study — A study performed to evaluate the
performance of a modified method against the quality control acceptance criteria of a reference
method.
Range finding — Preliminary analyses conducted to assess the ambient concentration of the
target organism in a matrix to be used in a study or preliminary analyses of spiked samples
involving few, if any, replicates, to assess method performance to identify the spike dose to be
used in a study.
Raw data — Data that have not been processed.
Reagent water — Water conforming to Specification D 1193, Annual Book of ASTM
Standards, or specifications in Standard Methods 9020B.4.d.
Recovery — The total amount of the analyte found in the sample divided by the amount of the
analyte added into the sample as a spike.
Reference method — A method that serves as a standard against which method modifications
can be compared.
Regulated entity — Permittees, PWSs, POTWs, and other entities responsible for compliance
with provisions of the CWA or SDWA
Relative percent difference (RPD) — An estimate of the variability of two numbers expressed in
relative terms. Calculated as the absolute value of the difference of the two numbers, divided by
their mean:
RPD = |A - B| / ((A + B) / 2) * 100%
Equal to the relative standard deviation of the two numbers multiplied by the square root of 2.
Relative standard deviation (RSD) — The standard deviation expressed as a percentage of the mean (100σ/x̄); i.e., the coefficient of variation.
Replicate — Multiple samples collected from the same sampling point at the same time and analyzed
exactly the same way. Replicate samples can be used as a measure of sample variability.
Sample matrix— See "matrix."
Sample preparation technique — Any technique in the analytical process that precedes the
determinative technique, including all procedures, equipment, solvents, etc. that are used in the
preparation and cleanup of a sample for analysis. Sample preparation techniques do not include
conditions and/or procedures for the collection, preservation, shipment, and storage of the
sample.
Sensitivity — In presence/absence testing, sensitivity is the proportion of target organisms in the
sample that were correctly detected by the method of interest.
Shapiro-Wilk test — A statistical test for determining whether a given set of results follow a
normal or log-normal distribution. Best used for datasets with at most 50 results. (GILBERT)
Side-by-side comparison — Parallel testing of a new or modified method and a reference
method to determine whether the performance of the new or modified method is acceptable
compared to the reference method.
Specificity — In presence/absence testing, the proportion of non-target organisms in the sample
that were correctly identified as not being the target organism by the method of interest.
Spike — The process of adding a known amount of target analyte to a sample; used to
determine the recovery efficiency of the method. (QAD)
Spiking suspension—Diluted stock suspension containing the organism(s) of interest at a
concentration appropriate for spiking samples.
Standard deviation (σ) — The measure of the dispersion of observed values expressed as the positive square root of the sum of the squares of the difference between the individual values of a set and the arithmetic mean of the set, divided by one less than the number of values in the set. For a total of n numbers x1, ..., xn with mean x̄:

σ = sqrt[ Σ(xi - x̄)² / (n - 1) ]
Statistical power — See power.
Stem-and-leaf plot — A graphical depiction of a set of analytical results that conveys
information about the shape of the distribution while retaining the numerical information. The
stem-and-leaf plot separates the digits of the values as leaves (the last digit of the values) and
stems (the remaining digit of the values). The individual results are grouped according to the
stems, and the leaves are listed separately in a format similar to a histogram. (RICE)
Stock suspension — A concentrated suspension containing the organism(s) of interest that is
obtained from a source that will attest to the host source, purity, authenticity, and viability of the
organisms.
Study plan — A study design submitted for EPA review, comment, and approval prior to
conducting the side-by-side or QC acceptance criteria-based method comparability study.
This process protects the applicant by providing written approval of the study design before
resources are spent to conduct the study. Data from studies conducted without EPA review
and approval may not adequately address the applicant's study objectives. A detailed procedure
for the ATP or new method should be included as an attachment to the study plan. EPA will
evaluate the study plan to verify that the appropriate data quality objectives identified in this
protocol are defined and addressed. EPA comments are incorporated into the study design and
this review process is repeated until EPA has approved the study design.
Study report — A formal report developed by laboratories or other organizations responsible for
developing ATPs or new methods documenting the results of the side-by-side or QC acceptance
criteria-based method comparability study. The information and supporting data needed in
the study report should be sufficient to enable EPA to evaluate the performance of the ATP or
new method and make a decision on whether it is comparable or superior to the reference
method. The approved study report should also identify and discuss any deviations from the
study plan that were made in implementing the study, and the approved study plan should be
appended to and referenced in the study report.
Summary results — Overall study statistics (not sample-specific results).
Target organism — The organism the method is designed to detect.
Validate — Reliably assess the performance (bias and precision) of a method in a reference
matrix (such as reagent water) and the matrix in which the validated method will be used (such as
drinking water, surface water, or municipal wastewater effluent).
Validation, single-laboratory — Assessment of method performance (see "validate") in one
laboratory.
Validation, interlaboratory — Assessment of method performance (see "validate") at multiple
laboratories.
Variance — A measure of the dispersion of a set of values. The sum of the squares of the
difference between the individual values of a set and the arithmetic mean of the set, divided by
one less than the number of values in the set. (The square of the sample standard deviation) (QAD)
Wilcoxon-Mann-Whitney (WMW) test — A nonparametric analysis used to determine
whether the medians from two levels of a given factor (such as method) differ from each
other, based on the ranks of all results. (SPRENT)
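For illustration only, SciPy provides this rank-based comparison if the analysis is done in Python; the paired-study results below are hypothetical:

    from scipy import stats

    # Hypothetical counts from the reference method and the modified method
    reference_method = [34, 41, 29, 52, 38, 45, 60, 33]
    modified_method = [30, 36, 27, 48, 35, 40, 55, 31]

    statistic, p_value = stats.mannwhitneyu(reference_method, modified_method, alternative="two-sided")
    # A small p-value suggests the two sets of results come from distributions
    # with different central tendencies.
    print(statistic, p_value)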
The above definitions are referenced to the following organizations:
ASTM ASTM D 4855-91. Standard Practice for Comparing Test Methods
BERRY Berry, D. A. and B. Lindgren. Statistics: Theory and Methods. Wadsworth, Belmont, CA, 1990.
EMMC Environmental Monitoring Management Council
GILBERT Gilbert, R. O. Statistical Methods for Environmental Pollution Monitoring. Van Nostrand Reinhold, New York, 1987.
NELAC QS National Environmental Laboratory Accreditation Conference, Quality Systems
OFR Office of Federal Register
QAD Quality Assurance Division, National Center for Environmental Research and
Quality Assurance, Office of Research and Development, USEPA
RICE Rice, J. A. Mathematical Statistics and Data Analysis, Second Edition. Wadsworth,
Belmont, CA, 1995.
SPRENT Sprent, P. Applied Nonparametric Statistical Methods, Second Edition. Chapman
& Hall, London, 1993.
-------
APPENDIX B
ATP APPLICATION FORM
-------
EPA Office of Water
Alternate Test Procedure or New Method Preliminary Application Form
for Microbiological Analytes
Applicant Name and
Address:
Date Application Submitted:
Type of Application (circle appropriate application type): Alternate Test Procedure   New Method
Method Number:
Title of Method:
Revision Date:
EPA-Approved Reference Method:
Analyte(s):
Applicable Matrices (circle all that apply): Ambient Water   Biosolids   Drinking Water   Wastewater
Level of Use: Limited Use   Nationwide
Study Design (circle appropriate study design): Side-by-Side Comparison Study   QC Acceptance Criteria-Based Comparison Study
ATP Case Number (EPA Use Only):
FOR LIMITED USE APPLICATIONS ONLY
ID Number of Existing or
Pending Permit:
Issuing Agency:
Type of Permit:
Discharge Serial Number:
Attachments (check all that apply):
☐ Justification for ATP
☐ Alternate Test Procedure or New Method (in standard EPA format)
☐ Method Comparison Table
☐ Study Plan
☐ Other:
Submit Application and Attachments in Triplicate
-------
EPA Office of Water
Alternate Test Procedure or New Method Final Application Form
for Microbiological Analytes
Applicant Name and Address:
Date Application Submitted:
Type of Application (circle appropriate application type): Alternate Test Procedure   New Method
Method Number:
Title of Method:
Revision Date:
EPA-Approved Reference Method:
Analyte(s):
Applicable Matrices (circle all that apply): Ambient Water   Biosolids   Drinking Water   Wastewater
Level of Use: Limited Use   Nationwide
Study Design (circle appropriate study design): Side-by-Side Comparison Study   QC Acceptance Criteria-Based Comparison Study
ATP Case Number (EPA Use Only):
FOR LIMITED USE APPLICATIONS ONLY
ID Number of Existing or Pending Permit:
Issuing Agency:
Type of Permit:
Discharge Serial Number:
Attachments (check all that apply):
☐ Justification for ATP
☐ Alternate Test Procedure or New Method (in standard EPA format)
☐ Method Comparison Table
☐ Study Plan
☐ Study Report
☐ Other:
Submit Application and Attachments in Triplicate
-------
APPENDIX C
APPLICATION INVENTORY FORM
-------
1. Completed application form.
Includes the name and address of the applicant; the date of submission of the application;
the method number, title, and revision date; the EPA-approved reference method; the
analyte(s) for which the ATP or new method is proposed; the type of application; applicable
matrices; study design; level of use; NPDES permit information, if applicable; and the
attachments submitted with the application.
(Section 2.2; Appendix C)
2. Justification for ATP or new method.
Brief justification for why the ATP or new method is being proposed.
(Section 2.3)  ☐
3. Method in EPA format.
☐ Scope and application (Section 3.1)
☐ Summary of method (Section 3.2)
☐ Definitions of method (Section 3.3)
☐ Interferences (Section 3.4)
☐ Safety (Section 3.5)
☐ Equipment and supplies (Section 3.6)
☐ Reagents and standards (Section 3.7)
☐ Sample collection, preservation, and storage (Section 3.8)
☐ Quality control (Section 3.9)
☐ Calibration and standardization (Section 3.10)
☐ Procedure (Section 3.11)
☐ Data analysis and calculations (Section 3.12)
☐ Method performance (Section 3.13)
☐ Pollution prevention (Section 3.14)
☐ Waste management (Section 3.15)
☐ References (Section 3.16)
☐ Tables, diagrams, flowcharts, and validation data (Section 3.17)
4. Method comparison table.
A two-column table comparing the proposed ATP or new method with the EPA-approved
reference method. This table should include the number and title of each method, the
latest revision date of the ATP or new method, and a detailed discussion of each of the
method sections listed in Section 3.0. Each topic should be discussed in a separate row
of the table and the applicant should highlight any differences between the ATP or
new method and the EPA-approved reference method.
(Section 2.5)
5. Study plan.
☐ Background (Section 4.1)
☐ Objectives (Section 4.2)
☐ Study design (Section 4.3)
☐ Coordination (Section 4.4)
☐ Data reporting (Section 4.5)
6. Study report.
☐ Background (Section 9.1)
☐ Study objectives and design (Section 9.2)
☐ Study implementation (Section 9.3)
☐ Data reporting and validation (Section 9.4)
☐ Results (Section 9.5)
☐ Data analysis and discussion (Section 9.6)
☐ Conclusions (Section 9.7)
☐ Appendix A: Method (Section 9.8)
☐ Appendix B: Approved study plan (Section 9.9)
☐ Appendix C: Supporting data (Section 9.10)
☐ Appendix D: Supporting references (Section 9.11)
-------
APPENDIX D
DATA ELEMENTS AND EXAMPLE BENCH SHEETS
-------
Data Elements
The data elements listed below should be reported on the bench sheets or in the lab notebook for each
method, as applicable. EPA will review the information during the data validation process to ensure the
method-specific QC measures are met, as agreed to in the approved study plan.
• Laboratory name
• Method number
• Media
• Procedure
• Matrix
• Sample collection date/time
• Dates and times for all method steps associated with holding times or incubation times
• Analyst initials for each processing step in the method
• Presumptive results for all applicable media
• Confirmed/completed results for all applicable media
• All measured volumes
• Dilution information
• Final result per units of measurement
Example Bench Sheets
Example bench sheets for the following EPA-approved reference methods are included in this appendix:
• Aeromonas (USEPA Method 1605)
• Cryptosporidium (USEPA Methods 1622 and 1623)
• E. coli (SM 9221F, SM 9222G)
• Enterococci (SM 9230C)
• Fecal coliforms (SM 9221E, 9222D)
• Fecal streptococcus (SM 9230B, SM 9230C)
• Giardia (USEPA Method 1623)
• Total coliforms (SM 9221B, SM 9222B)
Note: Additional example bench sheets or electronic copies of the attached bench sheets are available upon
request.
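The membrane filtration bench sheets that follow record both the number of colonies counted on each filter and a density per 100 mL of sample. For illustration only, a minimal sketch of that conversion; the function name and values are hypothetical:

    def colonies_per_100_ml(colonies_on_filter, volume_filtered_ml):
        """Scale a colony count on a single filter to a density per 100 mL of sample."""
        return colonies_on_filter * 100.0 / volume_filtered_ml

    # Example: 42 colonies counted after filtering 10.0 mL -> 420 colonies per 100 mL
    print(colonies_per_100_ml(42, 10.0))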
-------
Laboratory:
Sample Collection Date:
Multiple-Tube Fermentation: Total Coliform, Fecal Coliform, E. coli
(SM 9221B, SM 9221E, SM 9221F)
Sample Collection Time:
Sampling Point:
LTB: Replicate 1 (record 24 hr and 48 hr LTB readings)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:

Please enter date and time for the following:
  LTB start:
  LTB 24 hr read:
  LTB 48 hr read:
  BGB 24 hr read (from 24 hr LTB):
  EC-MUG 24 hr read (from 24 hr LTB):
  BGB 48 hr read (from 24 hr LTB):
  BGB 24 hr read (from 48 hr LTB):
  EC-MUG 24 hr read (from 48 hr LTB):
  BGB 48 hr read (from 48 hr LTB):

LTB: Replicate 2 (record 24 hr and 48 hr LTB readings)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:

LTB: Replicate 3 (record 24 hr and 48 hr LTB readings)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:

Comments:

BGB: Replicate 1 (record 24 hr and 48 hr BGB readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Final tube combination:    Total coliforms/100 mL:

EC-MUG: Replicate 1 (record 24 hr fecal and 24 hr E. coli readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Fecal tube combination:    Fecal/100 mL:
  E. coli tube combination:    E. coli/100 mL:

BGB: Replicate 2 (record 24 hr and 48 hr BGB readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Final tube combination:    Total coliforms/100 mL:

EC-MUG: Replicate 2 (record 24 hr fecal and 24 hr E. coli readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Fecal tube combination:    Fecal/100 mL:
  E. coli tube combination:    E. coli/100 mL:

BGB: Replicate 3 (record 24 hr and 48 hr BGB readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Final tube combination:    Total coliforms/100 mL:

EC-MUG: Replicate 3 (record 24 hr fecal and 24 hr E. coli readings, from the 24 hr LTB and from the 48 hr LTB)
  Analyst Initials:    Read Temp:
  Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Fecal tube combination:    Fecal/100 mL:
  E. coli tube combination:    E. coli/100 mL:
-------
Membrane Filtration: mEndo/NA-MUG
(SM 9222B/SM 9222G)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
mEndo incubation start temperature (°C):
mEndo incubation end temperature (°C):
mEndo incubation start date/time:
mEndo incubation end date/time:
NA-MUG incubation start temperature (°C):
NA-MUG incubation end temperature (°C):
NA-MUG incubation start date/time:
NA-MUG incubation end date/time:
Replicate number 1
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Total coliforms: No. colonies per filter; Total coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 2
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Total coliforms: No. colonies per filter; Total coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 3
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Total coliforms: No. colonies per filter; Total coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL
-------
Membrane Filtration: mFC/NA-MUG
(SM 9222D/SM 9222G)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
mFC incubation start temperature (°C):
mFC incubation end temperature (°C):
NA-MUG incubation start temperature (°C):
NA-MUG incubation end temperature (°C):
mFC incubation start date/time:
mFC incubation end date/time:
NA-MUG incubation start date/time:
NA-MUG incubation end date/time:
Replicate number 1
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal coliforms: No. colonies per filter; Fecal coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 2
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal coliforms: No. colonies per filter; Fecal coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 3
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal coliforms: No. colonies per filter; Fecal coliforms per 100 mL
    E. coli: No. colonies per filter; E. coli per 100 mL
-------
E. coli Membrane Filtration: mTEC
(EPA 1103.1, SM 9213D)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
mTEC incubation start temperature (°C):
mTEC incubation end temperature (°C):
Urease substrate incubation start date/time:
mTEC incubation start date/time:
mTEC incubation end date/time:
Urease substrate incubation end date/time:
Replicate number 1
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 2
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 3
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL
-------
E. coli Membrane Filtration: Modified mTEC
(EPA 1603)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
modified mTEC incubation start temperature (°C):
modified mTEC incubation end temperature (°C):
modified mTEC incubation start date/time:
modified mTEC incubation end date/time:
Replicate number 1
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 2
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL

Replicate number 3
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    E. coli: No. colonies per filter; E. coli per 100 mL
-------
Multiple-Tube Fermentation: Fecal streptococcus
(SM 9230B)
Laboratory:
Sample Collection time:
Sample Collection Date:
Sampling point:

Replicate Number 1
  Azide Dextrose Broth (ADB) (record 24 hr ADB read and 48 hr ADB read)
    Analyst Initials:
    Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Comments:
  BEA Plates (record 24 hr BEA read from 24 hr ADB and 24 hr BEA read from 48 hr ADB)
    Analyst Initials:
    Positive at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
    Final tube combination:
    Fecal streptococcus/100 mL:

Replicate Number 2
  Azide Dextrose Broth (ADB) (record 24 hr ADB read and 48 hr ADB read)
    Analyst Initials:
    Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Comments:
  BEA Plates (record 24 hr BEA read from 24 hr ADB and 24 hr BEA read from 48 hr ADB)
    Analyst Initials:
    Positive at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
    Final tube combination:
    Fecal streptococcus/100 mL:

Replicate Number 3
  Azide Dextrose Broth (ADB) (record 24 hr ADB read and 48 hr ADB read)
    Analyst Initials:
    Positive tubes at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
  Comments:
  BEA Plates (record 24 hr BEA read from 24 hr ADB and 24 hr BEA read from 48 hr ADB)
    Analyst Initials:
    Positive at 10 mL:    1.0 mL:    0.1 mL:    0.01 mL:    0.001 mL:
    Final tube combination:
    Fecal streptococcus/100 mL:
ADB incubation start date/time:
Start temp:
ADB incubation end date/time:
End temp:
BEA incubation start date/time:
(From 24 hr ADB)
Start temp:
BEA incubation end date/time:
(From 24 hr ADB)
End temp:
BEA incubation start date/time:
(From 48 hr ADB)
Start temp:
BEA incubation end date/time:
(From 48 hr ADB)
End temp:
-------
Membrane Filtration: Fecal Streptococcus
(SM 9230C)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
mEnterococcus incubation start temperature (°C):
mEnterococcus incubation end temperature (°C):
mEnterococcus incubation start date/time:
mEnterococcus incubation end date/time:
Replicate number 1
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal streptococcus: No. colonies per filter; Fecal streptococcus per 100 mL

Replicate number 2
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal streptococcus: No. colonies per filter; Fecal streptococcus per 100 mL

Replicate number 3
  Analyst initials:
  For each sample volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Fecal streptococcus: No. colonies per filter; Fecal streptococcus per 100 mL
-------
Membrane Filtration: Enterococcus (mE-EIA)
(EPA 1106.1, SM 9230C)
Laboratory:
Sample collection date:
Sample collection time:
Sampling point:
mE incubation start temperature (°C):
mE incubation end temperature (°C):
EIA incubation start temperature (°C):
EIA incubation end temperature (°C):
mE incubation start date/time:
mE incubation end date/time:
EIA incubation start date/time:
EIA incubation end date/time:
Replicate number 1
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococci per 100 mL

Replicate number 2
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococci per 100 mL

Replicate number 3
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococci per 100 mL
-------
Laboratory:
Membrane Filtration: Enterococcus (mEI)
Sample collection time:
Sample collection date:
mEI incubation start temperature (°C):
mEI incubation end temperature (°C):
Sampling point:
mEI incubation start date/time:
mEI incubation end date/time:
Replicate number 1
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococcus per 100 mL

Replicate number 2
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococcus per 100 mL

Replicate number 3
  Analyst initials:
  For each volume filtered (100 mL, 10.0 mL, 1.0 mL, 0.1 mL), record:
    Enterococcus: No. colonies per filter; Enterococcus per 100 mL
-------
Batch-specific Cover Sheet:
Method 1605 (Aeromonas) -ADA with Vancomycin
Note: Please complete one sheet per week of analysis.
Laboratory name:
Section 1: Media Preparation Information
1. Kovac's expiration date
2. Oxidase dry slides expiration date
3. Date of nutrient agar plate preparation
4. Date of nutrient agar slant preparation
5. Date nutrient agar slants for positive controls and matrix spikes were inoculated
6. Time nutrient agar slants for positive controls and matrix spikes were inoculated

Section 2: Sample Processing Information
7. Dilution preparation date (for IDC, ODC, and MS/MSD samples)
8. Dilution preparation time (for IDC, ODC, and MS/MSD samples)
9. Analyst preparing dilutions (for IDC, ODC, and MS/MSD samples)
10. Sample spiking date
11. Sample spiking time
12. Analyst spiking samples
13. Analyst performing filtration
14. Funnel decontamination method

Section 3: Sample Analysis Information
15. Incubator temperature at start date (°C)
16. Incubator temperature at read date (°C)
17. ADA-V incubation start date/time
18. ADA-V read date/time
19. ADA-V read analyst
20. Nutrient agar plate incubation start date/time
21. Nutrient agar plate read date/time
22. Nutrient agar plate read analyst
23. Oxidase confirmation date/time
24. Oxidase read analyst
25. Trehalose incubation start date/time
26. Trehalose read date/time
27. Trehalose read analyst
28. Tryptone (indole) incubation start date/time
29. Tryptone (indole) read date/time
30. Tryptone (indole) read analyst
-------
QC Checklist:
Method 1605 (Aeromonas) - ADA with Vancomycin
Note: Please complete one sheet per week of analysis. Please circle the appropriate response and provide supporting information as
necessary.
Laboratory name:
Batch identification:
1. Did all method blanks (dilution/rinse water) exhibit the appropriate response?   Yes   No
   If no, please list contaminated method blank(s) and associated samples.
2. Did ADA-V media sterility check exhibit the appropriate response?   Yes   No
   If no, please explain.
3. Did nutrient agar plate media sterility check exhibit the appropriate response?   Yes   No
   If no, please explain.
4. Did nutrient agar slant media sterility check exhibit the appropriate response?   Yes   No
   If no, please explain.
5. Did trehalose media sterility check exhibit the appropriate response?   Yes   No
   If no, please explain.
6. Did tryptone media sterility check exhibit the appropriate response?   Yes   No
   If no, please explain.
7. Did the unspiked reagent water sample exhibit the appropriate response?   Yes   No
   If no, please explain.
8. Did negative controls for oxidase, trehalose, and indole exhibit the appropriate responses?   Yes   No
   If no, please explain.
-------
Sample-specific Data Report Form:
Method 1605 (Aeromonas) - ADA With Vancomycin
Laboratory name:
Section 1: Sample information
1. Sample number:
2. Utility:
3. Sampling point:
4. QC Analysis or Matrix (please circle one): filtered sterility check, direct streak, streak with filter, reagent water, dilution/rinse
5. Volume (mL) of spike (QC samples only):
6. Volume filtered (mL):
7. Dilution bottle (for laboratory-prepared QC samples only, please circle one): D1   D2
8. If this report form is for a method blank, please indicate samples that are associated with this method blank:
Please note: It is important to record the number of colonies for each presumptively positive morphological type so that the final density of Aeromonas per sample can be reported based on percent confirmation of each colony type.
Section 2: Sample results (rows provided for up to 10 colony types; two example entries shown)
ADA-V colony description (this section is optional):
  Colony type 1: Color: pale yellow; Morphology*: 1358; size: 2-4 mm
  Colony type 2: Color: dark yellow; Morphology*: 1358; size: 1-3 mm
No. of presumptive positive colonies for this colony type: 60 (type 1); 5 (type 2)
No. of presumptive colonies submitted to confirmation for this colony type: 2 (type 1); 1 (type 2)
Nutrient agar colony description (this section is optional):
  Colony type 1: Color: off-white; Morphology*: 1358; size: 2-4 mm
  Colony type 2: Color: off-white; Morphology*: 1358; size: 1-2 mm
No. oxidase positive per colony type: 2 of 2 (type 1); 1 of 1 (type 2)
No. trehalose positive per colony type: 2 of 2 (type 1); 1 of 1 (type 2)
No. indole positive per colony type: 1 of 2 (type 1); 1 of 1 (type 2)
*Morphology choices (list all that apply): (1) Round, (2) oval, (3) symmetric, (4) asymmetric, (5) shiny, (6) dull, (7) translucent, (8) opaque, (9) grainy, (10) fuzzy, (11) other
Section 3: Calculations (Use one row for each presumptive positive colony color and morphology. If more than five colony types, please attach another sheet.)
A. No. presumptive positive colonies for each colony type: 60 (type 1); 5 (type 2)
B. No. of each colony type submitted to confirmation: 2 (type 1); 1 (type 2)
C. How many submitted colonies per colony type confirmed (oxidase positive, ferments trehalose, and produces indole)? 1 (type 1); 1 (type 2)
A * (C/B) = D, Calculated no. of confirmed Aeromonas per colony type: D1 = 30; D2 = 5; D3 = ; D4 = ; D5 =
D1 + D2 + D3 + D4 + D5 = Total confirmed Aeromonas per sample: 30 + 5 = 35
Section 5: Comments
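For illustration only, the Section 3 calculation above can be expressed as a short Python sketch; the function name is hypothetical and the values are those shown on the example form:

    def confirmed_aeromonas(colony_types):
        """Sum A * (C / B) over colony types: each presumptive count scaled by its confirmed fraction."""
        return sum(presumptive * (confirmed / submitted)
                   for presumptive, submitted, confirmed in colony_types)

    # Colony type 1: A = 60 presumptive, B = 2 submitted, C = 1 confirmed -> D1 = 30
    # Colony type 2: A = 5 presumptive, B = 1 submitted, C = 1 confirmed -> D2 = 5
    print(confirmed_aeromonas([(60, 2, 1), (5, 1, 1)]))  # 35.0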
-------
Sample-specific Data Report Form:
Method 1605 (Aeromonas) - ADA With Vancomycin
Laboratory name:
Section 1: Sample information
1. Sample number:
2. Utility:
3. Sampling point:
4. QC Analysis or Matrix (please circle one): filtered sterility check, direct streak, streak with filter, reagent water, dilution/rinse
5. Volume (mL) of spike (QC samples only):
6. Volume filtered (mL):
7. Dilution bottle (for laboratory-prepared QC samples only, please circle one): D1   D2
8. If this report form is for a method blank, please indicate samples that are associated with this method blank:
Please note: It is important to record the number of colonies for each presumptively positive morphological type so that the final density of Aeromonas per sample can be reported based on percent confirmation of each colony type.
Section 2: Sample results (rows provided for up to 10 colony types)
For each colony type, record:
  ADA-V colony description (this section is optional): Color; Morphology*; size (mm)
  No. of presumptive positive colonies for this colony type
  No. of presumptive colonies submitted to confirmation for this colony type
  Nutrient agar colony description (this section is optional): Color; Morphology*; size (mm)
  No. oxidase positive per colony type
  No. trehalose positive per colony type
  No. indole positive per colony type
"Morphology choices (list all that apply): (1) Round, (2) oval, (3) symmetric, (4) asymmetric, (5) shiny, (6) dull, (7) translucent, (8) opaque, (9) grainy, (10) fuzzy, (11) other
Section 3: Calculations (Use one row for each presumptive positive colony color and morphology. If more than five colony types, please attach another sheet.)
A. No. presumptive positive colonies for each colony type:
B. No. of each colony type submitted to confirmation:
C. How many submitted colonies per colony type confirmed (oxidase positive, ferments trehalose, and produces indole)?
A * (C/B) = D, Calculated no. of confirmed Aeromonas per colony type: D1 = ; D2 = ; D3 = ; D4 = ; D5 =
D1 + D2 + D3 + D4 + D5 = Total confirmed Aeromonas per sample:
Section 5: Comments
-------
Laboratory Name:
Laboratory ID:
Method 1622/23 Bench Sheet
1. Client sample number
2. Internal laboratory sample ID (if applicable)
3. Date and time of sample receipt
4. Received by
5. Temperature of sample and condition of sample upon arrival
6. Storage location and storage temperature
7. Sample turbidity, in NTU
8. Sample type (IPR, method blank, field sample, OPR, MS, PT sample)
9. Spiking suspension number (for IPR, OPR, MS, and PT samples only)
10. Estimated number of oocysts/cysts spiked (for IPR, OPR, MS, and PT samples only)
11. Spiking date and time
12. Sample volume spiked, in L
13. Sample filtration start date and time
14. Type of filter used (Envirochek, Envirochek HV, FiltaMax, CrypTest, other [specify]) and lot number:
15. Name of analyst performing filtration
16. Sample volume filtered, to nearest 1/4 L (do not include rinse volume)
17. Did filter clog?
18. Elution date and time (must be performed within 96 hours of sample collection/filtration):   Crypto:   Giardia:
19. Elution procedure: D wrist shaker D FiltaMax wash station stomacher D backflush/sonication
20. Name of analyst performing elution
21. Elution buffer: Elution buffer lot number and expiration date:
22. Concentration procedure (centrifugation, FiltaMax concentrator, other [specify])
23. Name of analyst performing concentration
24. Pellet volume after concentration, in mL
25. (a) Total volume of resuspended concentrate; (b) volume transferred to IMS (in mL)
26. Number of subsamples processed independently through the remainder of the method
27. IMS system used (Dynal anti-Cryptosporidium, Dynal GC-Combo, other [specify]) and lot number
27. Name of analyst performing IMS procedure
28. Slide(s) used (Meridian, Dynal, other [specify]) and lot number
29. Date and time sample applied to slide(s) to dry (must be completed same working day as Row 18)
30. Detection kit used (Merifluor, AquaGlo, Crypt-a-Glo, Giardi-a-Glo, other [specify]) and lot number
31. Analyst performing staining procedure
32. Staining completion date and time (must be complete within 72 hours of Row 29)
33. Total number of oocysts and cysts counted in sample (sum of counts in subsamples, if applicable):   Crypto:   Giardia:
Comments:
These steps must be completed in one working day
HV=high volume
NTU=nephelometric turbidity unit
IPR=initial precision and recovery
OPR=ongoing precision and recovery
MS=matrix spike
PT=performance test
-------
Laboratory Name:
Laboratory ID:
Method 1622/1623 Cryptosporidium Report Form
Client sample number:
10-mL subsample ID (if packed pellet > 0.5 mL):
Analyst:
Object
located
by FA
No.
1
2
3
4
5
6
7
8
9
10
Shape
(oval
or
round)
Size
LxW
(µm)
DAPI-
Light blue internal
staining, no
distinct nuclei,
green rim
(A)
Internal laboratory sample ID (if applicable):
Volume examined (in L) on this slide:
Positive staining control acceptable YES NO
Negative staining control acceptable YES NO
DAPI +
Intense blue
internal staining
(B)
Total FA number from this slide:
DAPI -: Total number (A):
DAPI +: Total number (B):
DAPI +: Total number (C):
Total count DAPI + (C) that show structure by D.I.C. (F):
Number of
nuclei stained
sky blue
(C)
D.I.C.
Empty oocysts
(D)
Oocysts with
amorphous
structure
(E)
Oocysts with internal structure (F)
Number of sporozoites
Examination completion date:
Examination completion time (must be complete within 7 days of staining):
D.I.C. - Total number of empty oocysts (D):
D.I.C. - Total number of oocysts with amorphous structure (E):
D.I.C. - Total number of oocysts with internal structure (F):
-------
Laboratory Name:
Laboratory ID (if applicable):
Method 1623 Giardia Report Form
Client sample number:
10-mL subsample ID (if packed pellet > 0.5 mL):
Analyst:
Object
located
by FA
No.
1
2
3
4
5
6
7
8
9
10
Shape
(oval
or
round)
Size
LxW
(µm)
DAPI-
Light blue
internal staining,
no distinct
nuclei, green rim
(A)
Internal laboratory sample ID (if applicable):
Volume examined (in L) on this slide:
Pos. staining control acceptable YES NO
Neg. staining control acceptable YES NO
DAPI +
Intense blue
internal
staining
(B)
Total FA number from this slide:
DAPI-: Total number (A):
DAPI+: Total number (B):
DAPI+: Total number (C):
Total number DAPI + (C) that show structure by D.I.C. (F):
Number of
nuclei stained
sky blue
(C)
D.I.C.
Empty cysts
(D)
Cysts with
amorphous
structure
(E)
Cysts with internal structure (F)
Number of
nuclei
Median
body
Axonemes
Examination completion date:
Examination completion time (must be complete within 7 days of staining):
D.I.C.: Total number of empty cysts (D):
D.I.C.: Total number of cysts with amorphous structure (E):
D.I.C.: Total number of cysts with one internal structure (F):
D.I.C.: Total number of cysts with >one internal structure (F):
------- |