Battelle
The Business Of Innovation
Environmental Technology
Verification Program
Advanced Monitoring
Systems Center
Generic Verification Protocol
for Verification of
Online Turbidimeters
GENERIC
VERIFICATION PROTOCOL
for
Verification of
Online Turbidimeters
Version 1.0
June 4, 2012
Prepared by
Battelle
505 King Avenue
Columbus, OH 43201-2693
SECTION A
PROJECT MANAGEMENT
A1 VENDOR APPROVAL PAGE
ETV Advanced Monitoring Systems Center
Generic Verification Protocol
for Verification of
Online Turbidimeters
Version 1.0
June 4, 2012
APPROVAL:
Name
Company
Date
The U.S. Environmental Protection Agency, through its Office of Research and Development, funded and managed,
or partially funded and collaborated in, the research described herein. It has been subjected to the Agency's peer
and administrative review. Any opinions expressed in this report are those of the author(s) and do not necessarily
reflect the views of the Agency; therefore, no official endorsement should be inferred. Any mention of trade names
or commercial products does not constitute endorsement or recommendation for use.
A2 TABLE OF CONTENTS
Section                                                                                Page
PROJECT MANAGEMENT 1
A1 Vendor Approval Page 1
A2 Table of Contents 2
A3 List of Acronyms and Abbreviations 4
A4 Distribution List 5
A5 Verification Test Organization 6
A6 Background 13
A7 Verification Test Descriptions and Schedule 15
A8 Quality Objectives and Criteria for Measurement Data 17
A9 Special Training/Certification 18
A10 Documentation and Records 19
MEASUREMENT AND DATA ACQUISITION 21
B1 Experimental Design 21
B2 Sampling Method Requirements 29
B3 Sample Handling and Custody Requirements 29
B4 Analytical Method Requirements 30
B5 Quality Control Requirements 30
B6 Instrument/Equipment Testing, Inspection, and Maintenance 30
B7 Instrument Calibration and Frequency 31
B8 Inspection/Acceptance of Supplies and Consumables 32
B9 Non-Direct Measurements 33
B10 Data Management 33
ASSESSMENT AND OVERSIGHT 35
C1 Assessments and Response Actions 35
C2 Reports to Management 37
DATA VALIDATION AND USABILITY 39
D1 Data Review, Verification, and Validation Requirements 39
D2 Verification and Validation Methods 39
D3 Reconciliation with User Requirements 40
REFERENCES 42
APPENDIX
Appendix A EPA Method 180.1
Appendix B Recirculation System Schematic [to be completed for final QAPP]
Appendix C Example Data Sheets [to be completed for final QAPP]
Figures
Figure 1. Organizational Chart
Tables
Table 1. Estimated Verification Testing Schedule 17
Table 2. DQI and Criteria for Critical Supporting Measurements 19
Table 3. Summary of Tests and Testing Frequency 22
Table 4. Criteria for QC and Critical Measurements for Turbidimeter Testing 32
Table 5. Summary of Data Recording Process 34
Table 6. Summary of Quality Assessment and Control Reports 38
A3 LIST OF ACRONYMS AND ABBREVIATIONS
%RSD percent relative standard deviation
ADQ audit of data quality
AMS Advanced Monitoring Systems
ATP Alternate Test Procedure
COA certificates of analysis
DI deionized
DQIs data quality indicators
EPA U.S. Environmental Protection Agency
ETV Environmental Technology Verification
L liter
LRB laboratory record book
NTU nephelometric turbidity unit
MQO measurement quality objective
PEA performance evaluation audit
QA quality assurance
QAPP Quality Assurance Project Plan
QM Quality Manager
QC quality control
QMP Quality Management Plan
RMO Records Management Office
RSD relative standard deviation
SDVB styrene divinylbenzene
TSA technical systems audit
VTC Verification Test Coordinator
IR infrared
PC personal computer
A4 DISTRIBUTION LIST*
U.S. Environmental Protection Agency (EPA)
EPA Advanced Monitoring Systems (AMS) Center Project Officer
EPA AMS Center Quality Manager (QM)
Battelle
Battelle AMS Center Manager
Battelle Verification Test Coordinator (VTC)
Battelle AMS Center QM
Battelle Technical Staff
* Once vendors agree to participate in a verification test in this technology category, this generic
protocol will be modified to be specific for the technology(ies) to be verified and then reviewed,
finalized, and distributed to the following:
Vendor(s)
Peer Reviewers, at least one EPA Office of Water reviewer and one non-EPA reviewer
Reference Laboratory, if applicable
Test Collaborators (e.g., water utilities), if applicable
A5 VERIFICATION TEST ORGANIZATION
This protocol provides generic procedures for implementing a verification test for the
performance of online turbidimeters. The verification tests described in this document will be
conducted under the Environmental Technology Verification (ETV) Program. Verification tests
will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems
(AMS) Center through a cooperative agreement with the EPA. The scope of the AMS Center
covers verification of monitoring technologies for contaminants and natural species in air, water,
and soil.
Quality Assurance (QA) oversight will be provided by the Battelle AMS Center QM and by the
EPA AMS Center QM at their discretion. Based on the procedures outlined in this document, it
is anticipated that verifications performed based on this generic protocol will be EPA Category
III verification tests. The final determination will be made by the EPA AMS Center QM once the
generic protocol is modified to be specific to the technology(ies) being verified. The organization
chart in Figure 1 identifies the responsibilities of the organizations and individuals associated
with these verification tests. Roles and responsibilities are defined further below.
A5.1 Battelle
Battelle's AMS Center VTC. Battelle's AMS Center VTC will have overall responsibility for
ensuring that the technical, schedule, and cost goals established for the verification tests are met.
Specifically, the VTC will:
• Assemble a team of qualified technical staff to conduct the verification tests,
• Hold a kick-off meeting approximately one week prior to the start of the verification
tests to review the critical logistical, technical, and administrative aspects of the
verification tests and confirm responsibility for each aspect of the verification test,
• Direct the team (e.g., Battelle testing staff and vendor) in performing the verification
tests in accordance with the Quality Assurance Project Plan (QAPP),
• Ensure that all quality procedures specified in the QAPP and in the AMS Center
Quality Management Plan1 (QMP) are followed,
Figure 1. Organizational Chart
[Figure 1 shows the reporting relationships among Battelle Management, the Battelle AMS Center
Manager, the Battelle Verification Test Coordinator, Battelle Testing Staff, the Battelle AMS Center
Quality Manager, Battelle QA Staff, the EPA AMS Center Project Officer, the EPA AMS Center
Quality Manager, AMS Center Stakeholders, Water Utilities, and Vendor(s) Representative(s).]
• Maintain real-time communication with the Battelle AMS Center Manager, Battelle
AMS Center QM, EPA AMS Center Project Officer, and EPA AMS Center QM on
any potential or actual deviations from the QAPP,
• Prepare the draft and final QAPP, verification report, and verification statements,
• Provide test data, including data from the first day of testing, to the Battelle AMS
Center Manager, Battelle AMS Center QM, EPA AMS Center Project Officer, and
EPA AMS Center QM,
• Conduct a technical review of all test data. Designate an appropriate Battelle
technical staff member to review data generated by the VTC,
• Revise the draft QAPP, verification report, and verification statements in response to
reviewers' comments,
• Document and prepare any deviations to the QAPP that may occur during testing,
• Address any comments from reviewers regarding testing or the deviations,
• Respond to any issues raised in assessment reports and audits, including instituting
corrective action as necessary,
• Serve as the primary point of contact for vendor(s) representative(s),
• Coordinate distribution of the final QAPP, verification report(s), and statement(s),
and,
• Establish a budget for the verification tests and manage staff to ensure the budget is
not exceeded.
Battelle's AMS Center Manager. Battelle's manager for the AMS Center will:
• Review the draft and final QAPP,
• Review the draft and final verification report and verification statements,
• Ensure that necessary Battelle resources, including staff and facilities, are committed
to the verification tests,
• Ensure that confidentiality of sensitive vendor information is maintained,
• Ensure that testing staff respond to QAPP deviations and any issues raised in
assessment reports, audits, or from test staff observations, and that any necessary
corrective actions have been implemented,
• Maintain communication with EPA's AMS Center Project Officer and QM, and
• Facilitate a stop work order if Battelle or EPA QA staff discover adverse findings that
will compromise data quality or test results.
Battelle Testing Staff. Battelle Testing Staff will support the VTC in conducting these
verification tests. Battelle Testing Staff will:
• Assist in planning for the tests, and making arrangements for the receipt of and
training on the technologies,
• Attend the verification test kick-off meeting, as requested,
• Assist vendor staff as needed during technology receipt and training,
• Participate in training provided by the vendor(s), as requested,
• Conduct verification testing following all aspects of the ETV AMS Center QMP as
well as this QAPP,
• Record qualitative observations about the maintenance and operation of the
technology(ies) during testing,
• Ensure that the data from the technology(ies) are immediately reviewed for quality,
and compiled, recorded, and transmitted to the VTC, on the first day of testing and
thereafter on at least a weekly basis,
• Notify the VTC of any QAPP deviations and institute corrective action as necessary,
• Support the VTC in the preparation of the QAPP, report, and verification statements,
as necessary, and
• Support the VTC in responding to any issues raised in assessment reports and audits
related to technical performance, statistics, or data reduction as needed.
Battelle's AMS Center Quality Manager. The Battelle QM or a designated QA Officer will:
• Review the draft and final QAPP,
• Attend the verification test kick-off meeting and lead the discussion of the QA
elements of the kick-off meeting checklist,
• Prior to the start of verification testing, verify the presence of applicable training
records, including any vendor training on test equipment,
• Prepare audit checklists,
• Conduct a technical systems audit at least once near the beginning of each
verification test,
• Conduct audits to verify data quality,
• Prepare and distribute an audit report for each audit,
• Verify that audit responses for each audit finding and observation are appropriate and
that corrective action has been implemented effectively,
• Communicate to the VTC and/or technical staff the need for immediate corrective
action if an audit identifies QAPP deviations or practices that threaten data quality,
• Provide a summary of the QA/quality control (QC) activities and results for the
verification reports,
• Review the draft and final verification report and verification statements,
• Maintain real-time communication with the VTC on QA activities, audit results, and
concerns,
• Recommend a stop work order if audits indicate that data quality or safety is being
compromised,
• Work with the VTC and Battelle's AMS Center Manager to resolve data quality
concerns and disputes,
• Delegate QA activities to other Battelle quality staff as needed to meet project
schedules, and
• Review and approve QAPP amendments, deviations and audit reports.
A5.2 Vendor(s)
The vendor's responsibilities are as follows:
• Review and provide comments on the draft QAPP,
• Approve the final QAPP prior to test initiation,
• Provide the technology to be tested for evaluation during the verification tests,
• Provide all equipment/supplies/reagents/consumables needed to operate their
technology for the duration of the verification tests,
• Supply a representative to train Battelle staff in operation of their technology and
provide written consent for Battelle staff to operate their technology during
verification testing,
• Provide written instructions for routine calibration, operation, and maintenance of
their technology, and
• Review and provide comments on the draft verification report and statement for their
technology.
A5.3 EPA
EPA's responsibilities are based on the requirements stated in the "Environmental Technology
Verification Program Quality Management Plan"2 (ETV QMP). The roles of specific EPA
verification staff are as follows:
EPA AMS Center Project Officer. The EPA AMS Center Project Officer will:
• Review the draft QAPP,
• Approve the final QAPP,
• Review and approve deviations to the approved final QAPP,
• Appoint a delegate to review and approve deviations to the approved final QAPP in
his absence, so that testing progress will not be delayed,
• Review the first day of data from the verification tests and provide immediate
comments if concerns are identified,
• Review the draft verification report and statements,
• Oversee the EPA review process for the verification report and statements, and
• Coordinate the submission of verification report(s) and statement(s) for final EPA
approval.
EPA AMS Center Quality Manager. The EPA AMS Center QM will:
• Review the draft QAPP,
• Review deviations to the approved final QAPP,
• Review the first day of data from the verification tests and provide immediate
comments if concerns are identified,
• Perform, at the EPA AMS Center QM's option, one external technical systems audit
and/or audit of data quality during the verification tests,
• Notify the EPA AMS Center Manager of the need for a stop work order if the
external audit indicates that data quality or safety is being compromised,
• Prepare and distribute an assessment report summarizing results of any external
audits, and
• Review the draft verification report(s) and statement(s).
A5.4 Test Facilities
Portions of this verification test will be conducted at the facilities of different water utilities. The
roles of specific water utilities participating in this verification test are as follows:
• Allow facility access to vendor(s), Battelle, and EPA representatives during the
scheduled verification test including set-up and tear-down operations,
• Define facility health and safety requirements to Battelle, EPA, and vendor staff who
may visit the testing facility,
• Provide adequate working space during verification test,
• Provide access to adequate water flow, and
• Provide sufficient power for the simultaneous operation of all test equipment and
technology(ies) being verified.
A5.5 Verification Test Stakeholders
Stakeholders for the generic protocol included:
• Rick Sakaji, East Bay Municipal Water District
• Steve Wendelken and Derek Losh, EPA Office of Water.
A QAPP will be developed based on this generic protocol. The responsibilities of verification test
stakeholders who will contribute to the QAPP include:
• Participate in technical panel discussions (when available) and/or review an outline of
the verification tests to provide input to the test design,
• Review and provide input to the QAPP, and
• Review and provide input to the verification report and verification statements.
The names and affiliations of the verification test stakeholders will be listed in the final QAPP.
One of the verification test stakeholders will be from EPA's Office of Water. If the vendor will
be utilizing ETV data to have the turbidimeter recognized under the Alternative Test Procedure
(ATP) program, the Office of Water representative will be one that is involved with the ATP
program.
A6 BACKGROUND
A6.1 Technology Need
The ETV Program's AMS Center conducts third-party performance testing of commercially-
available technologies that detect or monitor natural species or contaminants in air, water, and
soil. Stakeholder committees of buyers and users of such technologies recommend technology
categories, and technologies within those categories, as priorities for testing. Among the
technology categories recommended for testing are turbidimeters. An ETV AMS Center test/QA
plan for online turbidimeters was originally published in 1999
(http://www.epa.gov/nrmrl/std/etv/vt-ams.html#Turbidimeters)3. Four technologies were verified
under this test/QA plan. This generic protocol builds on the original test/QA plan for
turbidimeters and adds elements of testing consistent with current approvals of online
turbidimeters under the EPA ATP program.
The technologies tested under this plan are commercial turbidimeters capable of real-time
monitoring of the low-level turbidity necessary to reliably assess compliance with current
drinking water regulations. In such applications these turbidimeters can provide real-time
continuous monitoring of water quality and allow early warning of potential non-compliance
conditions, whereas grab sample analysis by standard methods is both time-consuming and non-
continuous.
In order for turbidimeters to be used for compliance monitoring, the technology and method
must gain acceptance under the EPA ATP program
(http://water.epa.gov/scitech/methods/cwa/atp/). This acceptance is based on the performance of
the vendor's turbidimeter against an EPA Method 180.1-compliant turbidimeter. This protocol
describes generic testing procedures to evaluate the performance of a turbidimeter that would be
submitted by a vendor for ETV testing. The verification test will involve comparison to an online
turbidimeter which is compliant with EPA Method 180.1 (Appendix A).
A6.2 Technology Description
This section will describe the specific technology(ies) identified for ETV testing. This section
will be updated for the final version of the QAPP based on the participating technology(ies).
What follows is an example of what might be included in this section; the text should be
accompanied by figures, as appropriate, that illustrate the principles of technology operation.
The online turbidimeter technologies to be verified rely upon 90° light scattering (i.e.,
nephelometry), or forward scattering, as a means of water quality characterization. These
technologies are capable of continuous monitoring and can be designed either for use directly
in-line by immersion in the sample stream, or alternatively, in a by-pass mode of operation. In
the case of by-pass turbidimeters, the sample stream is drawn from a larger source stream and is
directed through the nephelometer for subsequent analysis, whereas the immersion
turbidimeters are designed for operation through direct submersion in the source water stream.
Although the overall design requirements are significantly different, the basic components of
these technologies are similar.
In general, these technologies contain at least the following components:
• Light source
• Optics
• Detector.
Typically the light sources for these technologies belong to one of two distinct groups.
Historically, a filtered, broadband source has been used for turbidity measurement. This type of
source consists of a tungsten lamp operated at a color temperature between 2200 and 3000 K.
More recently, narrow wavelength sources, including light emitting diodes (LEDs) and lasers,
with intensity maxima in the IR wavelength range have been introduced as an alternative light
source for these measurements. The technologies to be tested employ one or more light sources
which fit into these categories and can be configured in single or multiple beam arrangements.
Optics in these technologies are used for focusing of the incident source beam and collection of
the scattered light. The detectors used are generally either photomultiplier tubes or photodiode
assemblies and are chosen to match the spectral output of the light source with the peak detector
response. The technologies generally provide a digital output which can be processed remotely to
allow continuous, in-situ monitoring capabilities.
A7 VERIFICATION TEST DESCRIPTION AND SCHEDULE
A final QAPP derived from this generic protocol will provide a plan for generating performance
data for online turbidimeters. The data generated are intended to provide organizations and users
interested in turbidimeter performance with information on the tested technology(ies) in
comparison to turbidimeters compliant with EPA Method 180.1.
The overall objective of the verification test is to provide quantitative verification of the
performance of online turbidimeters under realistic operational conditions. These technologies
are commonly used for water quality monitoring in water treatment facilities and to help ensure
compliance with drinking water regulations. For these applications, the turbidimeters must be
accurate (±10%) relative to the reference measurement (in this case, a Method 180.1 compliant
turbidimeter) used for reporting, and must be precise (±10%). Since these technologies are
intended for use online for compliance purposes, they should be reliable and exhibit stability to
avoid frequent or unscheduled offline maintenance. The verification test is designed to address
and quantify these performance characteristics.
A7.1 Verification Test Description
Since turbidity is a measurement of light scattering, a number of factors can influence the
responses of these technologies to a given sample solution. Instrumental design, including light
source selection and geometric differences, may result in significant differences between the
responses of the technology(ies) being verified and the reference measurements. Further
differences may result from the variable nature of both the size and composition of particles
typically found in water streams, relative to those in standard formazin or SDVB solutions.
These issues will be evaluated in this verification test by utilizing a variety of samples in the test
design.
Additionally, to assess the response of these technologies to both prepared solutions and to real
world water samples, verification will involve both offline and online tests. The offline test will
include challenging the technologies with a series of prepared standards or other test solutions to
verify performance under well controlled conditions. The online test will assess performance
under realistic operating conditions by monitoring a sample stream in at least three municipal
treatment facilities under typical normal operation.
Testing will consist of analyzing surface water, ground water, and fortified deionized (DI) water
samples using both the turbidimeter undergoing verification as well as an online reference
turbidimeter which is EPA Method 180.1 compliant. The reference turbidimeter must be
specified in the final QAPP. The turbidimeters will be tested online in at least three water
utilities and "offline" in a laboratory. Side-by-side measurements of turbidity using both the
turbidimeter undergoing verification and a Method 180.1-compliant reference turbidimeter on
the respective plant effluent will be conducted at each water utility. Offline testing of the
turbidimeters will require the use of a sample recirculation system, similar to that used in the
1999 ETV test of online turbidimeters conducted at Battelle3 with the exception that grab
sampling ports will not be required. In the final QAPP, Appendix B should describe and
illustrate the sample recirculation system to be used in testing.
Turbidimeters will be verified for the following performance parameters (attributes):
• Accuracy,
• Precision,
• Data completeness, and
• Operational and sustainability factors.
A7.2 Proposed Testing Schedule
Table 1 shows an estimated schedule of testing and data analysis/reporting activities to be
conducted in a verification test designed using this generic protocol. Data from the verification
testing should be immediately checked by the testing staff. For each technology, data should be
compiled, recorded, and transmitted to the VTC on the first day of testing and on a weekly basis
thereafter so that any data quality issues can be rapidly identified. The VTC should post the first
day of testing data for QA and EPA review within five days of test initiation and the remaining
data every two weeks thereafter. Unaudited data should include the disclaimer "has not been
reviewed by Battelle QM."
Table 1. Estimated Verification Testing Schedule
[Table 1 maps the following tasks against Months 1 through 6 of the test: Finalize QAPP, Test
Preparation, Testing at Battelle, Testing at Water Utility 1, Testing at Water Utility 2, Testing at
Water Utility 3, Draft report, Final report, Technical Systems Audits, and Audit of Data Quality.
The month-by-month schedule markings could not be recovered from the source layout.]
A7.3 Testing Facilities
At least three water utilities will participate in the online testing of turbidimeters for
verification. Online testing must include at least one surface water source and one ground
water source. Offline testing is anticipated to be conducted at Battelle's Columbus, OH facility,
although another laboratory could be utilized if accommodations for a sample recirculation
system are available to support testing.
The vendor must train Battelle staff and participating staff at each utility in the operation of their
turbidimeter. Battelle staff trained in the operation of the turbidimeter to be verified will set up
the turbidimeter for online operation at the testing site. It is anticipated that the same operator
from each participating laboratory will operate both the turbidimeter being verified and the
reference turbidimeter during testing.
A8 QUALITY OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA
The objective of these verification tests is to verify the performance of online turbidimeters
against an EPA Method 180.1 compliant reference turbidimeter. The verification tests will also
rely upon operator observations to assess other performance characteristics of the turbidimeters
including data completeness, ease of use, and maintenance requirements.
Data quality indicators (DQIs) ensure that these verification tests provide suitable data for a
robust evaluation of performance. DQIs have been established for flow meter accuracy and
reference turbidimeter accuracy vs. an independent standard. The DQIs were established to
ensure that data used to support the quantitative performance evaluations of turbidimeters are of
sufficient quality. The DQI and quantitative acceptance criteria for these supporting
measurements are defined in Table 2. Quantitative performance parameters for vendor
technology performance are discussed in Section B.
Additionally, the verification tests rely in part on observations of the Battelle testing staff for
assessment of the performance of the turbidimeters being tested. The requirements for these
observations are described in the discussion of documentation requirements and data review,
verification, and validation requirements for these verification tests.
The Battelle QM or designee will perform a technical systems audit (TSA) of laboratory testing
activities to augment these QA/QC requirements. A TSA will be performed at Battelle during
offline testing and at one participating utility during online testing and will occur within the first
week of each testing phase. The EPA QM also may conduct an independent TSA at the EPA
QM's discretion.
A9 SPECIAL TRAINING/CERTIFICATION
Documentation of training related to technology testing, data analysis, and reporting is
maintained for all Battelle technical staff in training files at their respective locations. The
Battelle QM may verify the presence of appropriate training records prior to the start of testing.
The vendors will be required to train technical staff from Battelle and each participating utility
prior to the start of testing. Battelle will document this training with a consent form, signed by
the vendor, which states which staff have been trained to use the vendor's turbidimeter. In the
event that other staff members are required to use the technologies, they will be trained by the
Battelle staff that were trained by the vendors. Battelle technical staff supporting these
verification tests have a minimum of a Bachelor's degree in a scientific field or equivalent work
experience.
Table 2. DQI and Criteria for Critical Supporting Measurements

Technology               DQI                         Method of Assessment               Frequency                MQO     Corrective Action
Flow Meter               Flow Meter Accuracy         Stopwatch and graduated cylinder   Once                     ±10%    Recalibrate or replace
Reference Turbidimeter   Reference Method Accuracy   Formazin or SDVB standard          Daily prior to testing   ±10%    Recalibrate

(MQO = Measurement Quality Objective)
A10 DOCUMENTATION AND RECORDS
The documents for these verification tests will include the final QAPP, vendor instructions,
reference methods, the verification report, verification statement, and audit reports. The project
records will include certificates of analysis (COA), chain-of-custody forms, laboratory record
books (LRB), data collection forms, electronic files (both raw data and spreadsheets), and QA
audit files. The final QAPP should include the forms to be used for online and offline data
collection. All of these documents and records will be maintained at the laboratory, with the
participating utilities, or in the VTC's office during the tests. At the conclusion of testing, all
raw data and test records will be provided to the VTC. All test records and copies of supporting
records from the participating utilities (and laboratory(ies), if not Battelle) will be transferred to
permanent storage at Battelle's Records Management Office (RMO) at the conclusion of the
verification tests. Electronic documents and records will also be uploaded to a SharePoint site
designated for these tests and will be provided to EPA upon request. All Battelle LRBs are
stored indefinitely by Battelle's RMO; other project-related data are stored for 10 years. EPA
will be notified before disposal of any files. Section B10 further details the data recording
practices and responsibilities.
All data generated during the conduct of this project will be recorded directly, promptly, and
legibly in ink. All data entries will be dated on the date of entry and signed or initialed by the
person entering the data. Any changes in entries will be made so as not to obscure the original
entry, will be dated and signed or initialed at the time of the change and shall indicate the reason
for the change. Project-specific data forms will be developed prior to testing to ensure that all
critical information is documented in real time. The draft forms will be provided to the Battelle
QM for review prior to use so that appropriate changes, if any, can be made.
SECTION B
MEASUREMENT AND DATA ACQUISITION
B1 EXPERIMENTAL DESIGN
The verification tests described in this generic protocol address verification of turbidimeters by
evaluating the following performance factors:
• Accuracy,
• Precision,
• Data completeness, and
• Operational and sustainability factors.
To assess the response of these technologies to both prepared solutions and to real world water
samples, verification will involve both online and offline tests. The online test will assess long-
term performance under realistic operating conditions by monitoring a sample stream in a
municipal treatment facility under typical normal operation. The offline test will include
challenging the technologies with a series of prepared test solutions to verify performance under
well controlled conditions. Comparisons will be made to an approved EPA Method 180.1
alternative online turbidimeter to assess performance relative to this standard reference method.
Both online and offline testing will involve continuous monitoring of turbidity by multiple
technologies. Throughout the testing period, a PC-based data acquisition system or data logger
will be used to collect measurements from the online turbidimeters at preset intervals, as needed.
If a turbidimeter has its own data logging capability, then that capability will be used to record
the data.
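To illustrate the data acquisition approach described above, the following is a minimal, hypothetical
sketch of a PC-based logging loop that polls a turbidimeter at a preset interval and writes timestamped
readings to a CSV file. The serial port, baud rate, and output format are illustrative assumptions only;
the actual interface will follow the vendor's instructions and the final QAPP.

# Illustrative sketch only: a minimal PC-based logging loop for an online turbidimeter.
# The serial port, baud rate, and output format below are assumptions, not vendor specifications.
import csv
import time
from datetime import datetime

import serial  # pySerial

PORT = "COM1"        # hypothetical port name
BAUD = 9600          # hypothetical baud rate
INTERVAL_S = 60      # one reading per minute, consistent with the online test design

def log_readings(outfile="turbidity_log.csv", n_readings=1440):
    """Record n_readings timestamped turbidity values (1440 = 24 hours at one per minute)."""
    with serial.Serial(PORT, BAUD, timeout=5) as dev, open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "turbidity_ntu"])
        for _ in range(n_readings):
            raw = dev.readline().decode(errors="ignore").strip()
            try:
                ntu = float(raw)        # assumes the instrument emits a plain numeric reading
            except ValueError:
                ntu = float("nan")      # flag unparseable output for later review
            writer.writerow([datetime.now().isoformat(timespec="seconds"), ntu])
            time.sleep(INTERVAL_S)

if __name__ == "__main__":
    log_readings()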
Data will be evaluated in terms of accuracy and precision for the turbidimeter undergoing
testing. Accuracy will be determined as the degree of agreement of the turbidimeter undergoing
testing with a reference turbidimeter; and precision will be determined as the degree of
repeatability between successive measurements of the same sample. Any seemingly large
differences between the turbidimeter undergoing testing and the reference turbidimeter (i.e., an
approved EPA Method 180.1 alternative online turbidimeter) noted during testing will be
reported immediately to the vendor so that corrective action can be taken, as necessary. Table 3
presents a summary of the tests to be performed. The verification test will be conducted during
an approximate 6 month timeframe. Throughout the verification tests, each turbidimeter will be
operated by Battelle staff or water utility staff that have been trained by the vendor.
Table 3. Summary of Tests and Testing Frequency

Phase: Online
  Performance Parameter: Accuracy (Comparability)
  Objective: Determine the degree of agreement between trends in the data from the EPA reference method
  Comparison Based On: EPA 180.1 accepted alternative online turbidimeter results
  Testing Frequency: 3 utilities
  Number of Data Points: Total of 3 surface and groundwater runs

Phase: Offline
  Performance Parameter: Accuracy (Percent Error)
  Objective: Determine the degree of agreement with the EPA reference method using formazin or SDVB solutions
  Comparison Based On: EPA 180.1 accepted alternative online turbidimeter results
  Testing Frequency: 10 replicates low NTU solution and 10 replicates high NTU solution
  Number of Data Points: 20 replicates

Phase: Offline
  Performance Parameter: Precision (RSD)
  Objective: Determine the degree of repeatability between successive measurements of the same sample
  Comparison Based On: Technology results
  Testing Frequency: 10 replicates low NTU solution and 10 replicates high NTU solution
  Number of Data Points: 20 replicates

Phase: Both phases
  Performance Parameter: Data Completeness
  Objective: Overall amount of data returned by each technology
  Comparison Based On: Technology results
  Testing Frequency: Once, based on overall data return achieved
  Number of Data Points: Variable, depending on frequency of data collection by technology
B1.1 Online Testing
The online test phase will focus on assessing the accuracy and performance of the turbidimeters
under realistic operating conditions, through the monitoring of typical sample streams. It is
expected that natural meteorological occurrences will contribute to the variability of water
quality in the treatment facility, and therefore provide a natural range of turbidity over which
technology performance can be characterized.
The online phase will involve monitoring a sample stream of variable turbidity within the
treatment facilities. Testing will be conducted at a minimum of three water utility facilities, and at least one
surface water sample and one ground water sample will be evaluated. Thus, at least three
technology data sets will be collected during the online testing phase. One water utility may
provide both a surface water and a groundwater sample, but from separate plants in the city. The
specific location and number of water utilities to be included in the verification test will be
defined in the final QAPP. The online turbidimeters will operate in parallel from the same
source, not in series.
An online Method 180.1-compliant reference turbidimeter installed in parallel to the
technologies being verified will be used to assess accuracy (comparability). Both the
turbidimeter undergoing verification and the reference turbidimeter will be connected to the
same water effluent line for analysis. The turbidimeters will either be directly connected to the
effluent line within 5 feet of each other or connected to the same port in an effluent line using a
y-connector to split the effluent evenly into both turbidimeters, depending on the configuration at
the water utility. Both turbidimeters will operate for 24 hours, collecting data once per minute,
or at the output rate of the turbidimeter. After 24 hours, the turbidimeter undergoing verification
will be turned off to preserve the data for downloading. Reference turbidimeter readings
collected by the water utility during this time period will be provided to Battelle within 5 days of
collection. One aim of the verification test is to assess the real-world variability of the
technologies being tested. To that end, measurements which appear anomalous relative to
comparable data will be retained in the data set. If an assignable cause can be identified, this
cause will be described in the verification report.
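For the 24-hour online comparisons, the technology's log and the utility's reference readings will need
to be paired in time before plotting or statistical comparison. The sketch below is illustrative only; the
file and column names are assumptions, and the actual data handling will be defined in the final QAPP.

# Illustrative sketch only: pair the verified turbidimeter's log with the utility's reference
# turbidimeter readings on timestamp. File and column names are assumptions for illustration.
import pandas as pd

def align_logs(tech_csv="turbidity_log.csv", ref_csv="reference_log.csv"):
    tech = pd.read_csv(tech_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    ref = pd.read_csv(ref_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    # Pair each technology reading with the nearest-in-time reference reading, allowing
    # at most 60 seconds of mismatch (readings are nominally one per minute).
    paired = pd.merge_asof(
        tech, ref, on="timestamp",
        suffixes=("_tech", "_ref"),
        direction="nearest",
        tolerance=pd.Timedelta("60s"),
    )
    return paired.dropna(subset=["turbidity_ntu_ref"])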
B1.2 Offline Testing
The offline phase of the test will be aimed at assessing the accuracy and precision of the
turbidimeters relative to the standard methods under controlled conditions. These parameters
will be tested in an offline recirculation system that will enable testing of known formazin or
styrene divinylbenzene (SDVB) solutions by the technologies being verified and a Method
180.1-compliant online reference turbidimeter. Offline testing will be conducted at Battelle's
laboratory or a laboratory with an appropriate recirculation system as shown in Appendix B [to
be included in the final QAPP].
Ten separate solutions of formazin in DI water will be prepared in individual 10 liter (L)
containers using formazin or SDVB primary stock solution(s) purchased from a commercial
supplier. These formazin or SDVB stock solution(s) will be diluted to the appropriate
concentration in DI water. A high nephelometric turbidity unit (NTU) and a low NTU fortified
DI water sample will be prepared and evaluated on each turbidimeter during offline analysis.
Concentrations to be tested will be approximately 100 milli-NTU (mNTU, i.e., 1/1,000 of a
nephelometric turbidity unit), designated the 'low' NTU level, and 800 mNTU, designated the
'high' NTU level. Ten replicates of each
fortified DI water sample (low NTU and high NTU) will be tested simultaneously on both the
technology being verified and a reference turbidimeter. Because online turbidimeters require
that a continuous flow be maintained for proper operation, a sample recirculation system will be
used to introduce the fortified samples to the turbidimeters.
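As an illustration of the dilution arithmetic (C1V1 = C2V2) behind preparing the low and high NTU
fortified samples, the sketch below computes the stock volume needed for a 10 L batch. The 4000 NTU
stock concentration is an assumed example value; the actual value will come from the supplier's
certificate of analysis.

# Illustrative sketch only: C1*V1 = C2*V2 dilution arithmetic for the fortified DI samples.
# The 4000 NTU stock concentration is an assumed example; use the value from the supplier's COA.
def stock_volume_needed(stock_ntu, target_ntu, batch_volume_l):
    """Volume of stock solution (liters) needed to reach target_ntu in batch_volume_l."""
    return target_ntu * batch_volume_l / stock_ntu

STOCK_NTU = 4000.0   # assumed commercial formazin stock concentration (NTU)
BATCH_L = 10.0       # 10 L container, per the offline test design

for label, target_ntu in [("low (100 mNTU)", 0.100), ("high (800 mNTU)", 0.800)]:
    v_stock_ml = stock_volume_needed(STOCK_NTU, target_ntu, BATCH_L) * 1000.0
    print(f"{label}: add {v_stock_ml:.2f} mL of stock and dilute to {BATCH_L:.0f} L with DI water")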
B1.2.1 Recirculating Test System for Offline Testing
The recirculation system used for this verification test will be designed and built at Battelle or
the selected laboratory or location used for offline analysis. Any laboratory or location used for
offline analysis will need to be able to supply and build a sufficient recirculation system. The
selected location of offline testing will be defined in the final QAPP. The recirculation system
will be designed and built to minimize the number of flow obstructions and potential sources of
turbulence. Any valves used in the recirculation system will be either two or three way, full bore,
ball valves. The recirculation pump will be a standard centrifugal pump and will have sufficient
flow and pressure capabilities to meet the requirements of all the turbidimeters being verified. A
flow meter will be installed downstream of each turbidimeter, and if needed, a pressure gauge
will be installed downstream of the turbidimeters. In general, the tubing will be a flexible plastic
material appropriate for high-purity applications, and the diameter will be at least 1/2 inch to allow
adequate flow for all the turbidimeters. Similarly, most connections will be made using hard
plastic compression fittings, although in some cases, tubing of smaller diameter or different
material, or other fittings may be used for certain portions of the system. Each turbidimeter will
be installed per the recommendations of the respective vendor. Before the test, the recirculation
system will be checked by laboratory staff to ensure system integrity, including proper flow
through the system, and adequate pumping capacity for recirculation. The flow rate
requirements will be defined in the final QAPP.
B1.2.2 Detailed Procedure for Offline Testing
The turbidimeters will operate in parallel from the same source, not in series, so that the
turbidimeter(s) being verified and the EPA-approved Method 180.1 compliant reference
turbidimeter are testing the same solution simultaneously. Fortified samples will be prepared as
described in Section B1.2. Each test solution will be introduced individually to each
turbidimeter at approximately the same flow rate (±10%). Flow meters will be used to monitor
the flow rates of the circulation system. Testing will proceed according to the following steps:
1. Unspiked DI water will be pumped to each of the turbidimeters.
2. Once continuous flow is established through each of the turbidimeters and the readings
have stabilized, a reading will be recorded from each turbidimeter onto datasheets. This
reading will represent the baseline turbidity before the formazin or SDVB spike solution
is added.
3. A formazin or SDVB stock solution will be added to the recirculation system at the
desired concentration (low NTU or high NTU).
4. The fortified water will be pumped to each of the turbidimeters. Once the turbidimeter
readings have stabilized, a reading will be recorded from each turbidimeter onto
datasheets. This reading will represent the measured turbidity after the formazin or
SDVB spike.
5. Readings will not be recorded from either turbidimeter, before or after the spike, until both
turbidimeter readings have stabilized. Stable is defined as a change in instrument readouts
of < 10% over 5 minutes (see the sketch following this list).
6. Once a fortified replicate sample has been measured, both turbidimeters will be flushed
clean using a container filled with clean, unspiked DI water.
7. As part of the flushing procedure, the turbidimeter sample chamber will be emptied at
least 4 times to ensure that no residual fortified sample remains. Flush is sufficient when
the readings for each unit return to within 5% of background (DI only) levels.
8. After both turbidimeters have been flushed clean, another replicate sample will be
introduced to both turbidimeters following the same procedure previously described
(Steps 3 - 6).
9. Ten replicates of each standard concentration level will be evaluated by the turbidimeters.
Results from the turbidimeters being evaluated will be recorded by the operator on data
sheets supplied by Battelle or automatically by any data-logging system supplied with the
turbidimeter or used by the participating water utilities.
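The stabilization criterion in Step 5 and the flush criterion in Step 7 can be expressed as simple checks
on the recorded readings. The following sketch is illustrative only; the data structure and example
readings are assumptions.

# Illustrative sketch only: the stabilization (Step 5) and flush (Step 7) criteria expressed as
# checks on recorded readings. Readings are (elapsed_seconds, ntu) tuples; values are hypothetical.
def is_stable(readings, window_s=300, max_change=0.10):
    """True if readings within the last window_s seconds vary by less than max_change (fraction)."""
    t_end = readings[-1][0]
    window = [ntu for t, ntu in readings if t >= t_end - window_s]
    if len(window) < 2 or min(window) <= 0:
        return False
    return (max(window) - min(window)) / min(window) < max_change

def is_flushed(current_ntu, background_ntu, tol=0.05):
    """True if the reading has returned to within tol (5%) of the DI-only background level."""
    return abs(current_ntu - background_ntu) <= tol * background_ntu

# Example: the last five minutes of readings vary by well under 10%, so the unit is stable,
# and a post-flush reading of 0.052 NTU is within 5% of a 0.050 NTU background.
readings = [(0, 0.82), (60, 0.81), (120, 0.80), (180, 0.80), (240, 0.81), (300, 0.82)]
print(is_stable(readings), is_flushed(0.052, 0.050))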
B1.3 Data Completeness
No additional test procedures will be carried out specifically to address data completeness. This
parameter will be assessed based on the overall data return achieved by each technology (Section
B1.5.3).
B1.4 Operational and Sustainability Factors
Operational and sustainability factors such as waste generated, maintenance needs, calibration
frequency, data output, consumables used, power requirements, hazardous components, ease of
use, repair requirements, and sample throughput will be evaluated based on operator
observations. Battelle testing staff and testing staff from any participating utilities will document
observations in a LRB or data sheets. Examples of information to be recorded include the daily
status of diagnostic indicators for the technology, use or replacement of any consumables, the
duration and causes of any technology down time or data acquisition failure, operator
observations about technology startup, ease of use, clarity of the vendor's instruction manual,
user-friendliness of any needed software, overall convenience of the technologies and
accessories/consumables, or the number of samples that could be processed per hour or per day.
Battelle will summarize these observations to aid in describing the technology performance in
the verification report on each technology.
B1.5 Statistical Evaluation
The statistical methods and calculations used for evaluation of the quantitative performance
parameters are described in the following sections.
B1.5.1 Accuracy
For offline testing, the relative accuracy of the results of the turbidimeter undergoing verification
with respect to the Method 180.1 compliant turbidimeter results will be assessed. Relative
accuracy will be determined for the standard formazin or SDVB solution using a percent error
calculation, where the absolute difference between the average reference turbidimeter results and
average turbidimeter undergoing verification results is divided by the average reference
turbidimeter results. Accuracy results will be evaluated in regard to any criteria defined by EPA
as part of the verification test to determine the turbidimeter's acceptance against Method 180.1.
\text{Relative Percent Error} = \frac{\left| \text{Average Reference Turbidimeter Results} - \text{Average Technology Results} \right|}{\text{Average Reference Turbidimeter Results}} \times 100 \qquad (1)
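For illustration, the sketch below applies Equation 1 to a hypothetical set of replicate readings; the
values shown are not test data.

# Illustrative sketch only: Equation 1 applied to hypothetical replicate readings (not test data).
import numpy as np

reference = np.array([0.79, 0.81, 0.80, 0.82, 0.80])    # reference turbidimeter readings (NTU)
technology = np.array([0.75, 0.78, 0.77, 0.79, 0.76])   # turbidimeter being verified (NTU)

relative_percent_error = abs(reference.mean() - technology.mean()) / reference.mean() * 100
print(f"Relative percent error: {relative_percent_error:.1f}%")   # 4.2% for these example values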
The purpose of the online portion of the test will not be to determine if the Method 180.1
compliant reference turbidimeter and the turbidimeter undergoing verification provide the same
turbidity readings at each interval, but to determine if the turbidimeter undergoing verification is
tracking the same changes that the reference turbidimeter is reporting across the measurement
period. Therefore, for the ground and surface water data from the public utilities, accuracy will
be assessed by plotting the raw data for both turbidimeters on the same graph to determine how
well the measurements track each other. Averages and standard deviations of the data for each
turbidimeter will be reported. Based on calculations performed in the Hach FT660 protocol5,
comparisons between the reference turbidimeter and turbidimeter undergoing verification will be
conducted using non-parametric tests as appropriate.
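As an illustration of this online assessment, the sketch below plots the paired series and applies one
possible non-parametric paired comparison (the Wilcoxon signed-rank test). The choice of test and the
column names are assumptions to be confirmed in the final QAPP; the input is the aligned data frame
from the earlier sketch.

# Illustrative sketch only: plot the paired online series and apply one possible non-parametric
# paired comparison (Wilcoxon signed-rank). The test choice and column names are assumptions.
from scipy.stats import wilcoxon

def compare_online(paired):
    """paired: DataFrame with 'timestamp', 'turbidity_ntu_tech', and 'turbidity_ntu_ref' columns."""
    ax = paired.plot(x="timestamp", y=["turbidity_ntu_ref", "turbidity_ntu_tech"],
                     ylabel="Turbidity (NTU)", title="Reference vs. technology readings")
    ax.figure.savefig("online_tracking.png")

    for label in ("ref", "tech"):
        col = paired[f"turbidity_ntu_{label}"]
        print(f"{label}: mean {col.mean():.3f} NTU, SD {col.std():.3f} NTU")

    stat, p = wilcoxon(paired["turbidity_ntu_ref"], paired["turbidity_ntu_tech"])
    print(f"Wilcoxon signed-rank: statistic = {stat:.1f}, p = {p:.3f}")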
B1.5.2 Precision
Precision will be evaluated using the replicate results for the fortified DI water samples.
Precision will be reported in terms of the percent relative standard deviation (%RSD) of a group
of measurement replicates. Readings from the spiked replicate samples will be blank (i.e.,
background)-corrected using the initial, before-spike measurements made on each replicate. The
average, standard deviation, and %RSD will be calculated using these blank-corrected values for
each turbidimeter at each spike level. Equations 2 and 3 will be used to calculate precision:
S = \sqrt{\frac{\sum_{k=1}^{n} \left( M_k - \bar{M} \right)^2}{n-1}} \qquad (2)

where S is the standard deviation, n is the number of replicate samples, M_k is the technology
measurement for the k-th sample, and \bar{M} is the average technology measurement of the replicate
samples.

\text{RSD}(\%) = \frac{S}{\bar{M}} \times 100 \qquad (3)
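A minimal illustration of Equations 2 and 3, using hypothetical blank-corrected replicates:

# Illustrative sketch only: Equations 2 and 3 applied to hypothetical blank-corrected replicates.
import numpy as np

before_spike = np.array([0.02, 0.03, 0.02, 0.02, 0.03])   # baseline (DI only) readings, NTU
after_spike = np.array([0.81, 0.83, 0.80, 0.82, 0.81])    # fortified sample readings, NTU

corrected = after_spike - before_spike       # blank-corrected replicate values
mean = corrected.mean()
s = corrected.std(ddof=1)                    # Equation 2: sample standard deviation (n - 1)
rsd_percent = s / mean * 100                 # Equation 3: percent relative standard deviation
print(f"mean = {mean:.3f} NTU, S = {s:.4f} NTU, %RSD = {rsd_percent:.1f}%")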
B1.5.3 Data Completeness
Data completeness will be assessed based on the overall data return achieved by the technology
during the testing period. For each technology, this calculation will use the total number of
apparently valid data points divided by the total number of data points potentially available from
all testing. The causes of any incompleteness of data return will be established from operator
observations, and noted in the discussion of data completeness results. The goal for data
completeness is 100%. Any problems with the data will be brought to the attention of the VTC.
The VTC will first work with the vendor to resolve any data issues. Data issues which remain
will be discussed with the Battelle QM and AMS Center Manager, and EPA Project Officer and
QM, as necessary.
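For illustration, the data completeness calculation reduces to a simple ratio; the counts below are
hypothetical.

# Illustrative sketch only: data completeness as valid points over possible points (hypothetical counts).
valid_points = 1412        # apparently valid readings returned by the technology
possible_points = 1440     # readings potentially available (e.g., 24 hours at one per minute)
completeness = valid_points / possible_points * 100
print(f"Data completeness: {completeness:.1f}%")   # 98.1% for these example counts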
B1.6 Reporting
The statistical comparisons described above will be conducted separately for each technology,
and information on the operational performance will be compiled and reported. One verification
report and one verification statement will be prepared for each technology. The verification
report will present the test procedures and test data, as well as the results of the statistical
evaluation of those data.
Operational aspects of the technologies will be recorded by testing staff during and immediately
following testing and will be summarized in the verification report. For example, descriptions of
the data acquisition procedures, use of vendor-supplied proprietary software, consumables used,
repairs and maintenance needed, and the nature of any problems will be presented in the report.
The verification report will briefly describe the ETV program, the AMS Center, and the
procedures used in verification testing. The results of the verification tests regarding technology
performance will be stated quantitatively. The draft verification report will be reviewed by the
vendor, EPA, and other peer reviewers. The resulting review comments will be addressed in a
subsequent revision of the report, and the peer review comments and responses will be tabulated
to document the peer review process and submitted to EPA. The reporting and review process
will be conducted according to the requirements of the ETV/AMS Center QMP.1
B2 SAMPLING METHOD REQUIREMENTS
No discrete grab samples will be collected for this test and therefore the use of traditional sample
collection and handling methods is not applicable. All samples generated and analyzed for this
test will be in situ samples and tested by in-line technologies. Water effluent will be tested as-is
from each participating water utility.
Formazin or SDVB solutions, standards for use as calibration standards for the reference
turbidimeters, and the material used for the performance evaluation audit (PEA) will be
purchased from a commercial supplier (i.e., Hach Company, Loveland, CO). When available,
stock solutions of the correct turbidity needed for calibration will be purchased. When not
available, the standard solution will be prepared through the dilution of a purchased formazin or
SDVB solution using distilled, deionized water. Diluted standard solutions will be prepared
within 24 hours of their use and stored at 25±3°C. For long-term storage, the
purchased standards will be stored as recommended by the vendor. Excess and waste solutions
will be disposed of in accordance with the site procedures. When not in use, the glassware used
for preparation and storage of these solutions will be kept scrupulously clean.
B3 SAMPLE HANDLING AND CUSTODY REQUIREMENTS
No discrete grab samples will be collected for this test and therefore the use of traditional sample
handling and custody procedures is not applicable. All solutions used in offline testing will be
prepared at Battelle. The receipt of standards used for testing will be documented.
B4 ANALYTICAL METHOD REQUIREMENTS
An EPA Method 180.1-compliant online turbidimeter will be used as the reference technology for
this verification test. If not available otherwise, this turbidimeter will be rented for testing and
operated by Battelle (or other laboratory) for the offline tests and will be supplied and operated
by the participating water utilities for the online testing. Testing using the reference turbidimeter
will follow manufacturer's recommendations and EPA Method 180.1. Once the specific
turbidimeter(s) is identified, detailed operational requirements will be defined in the final QAPP.
Appropriate data logging instruments will be used during offline testing to record results from
the reference turbidimeter. Results will be recorded by the individual water utilities and supplied
to Battelle for the online testing.
B5 QUALITY CONTROL REQUIREMENTS
Quality control procedures will follow the requirements described in this protocol, the final
QAPP, EPA Method 180.1, the ETV QMP, and any vendor-specified requirements for analysis
using their turbidimeters. All standard values and equipment calibrations for these technologies
will be documented in the study records. DQIs are defined in Table 2. Potential QC samples
and measurement quality objectives (MQOs) are defined in Table 4.
B6 INSTRUMENT/EQUIPMENT TESTING, INSPECTION, AND MAINTENANCE
Battelle staff will operate and maintain the turbidimeters as directed by the vendor during staff
training and as noted in the technology operating manuals. The vendor will be consulted if
issues with the technologies arise. The reference turbidimeter will be operated and maintained
per the manufacturer's instructions or applicable testing facility SOPs by Battelle and the water
utility staff. Critical measurements and MQOs related to operating the turbidimeters are
included in Table 4.
B7 INSTRUMENT CALIBRATION AND FREQUENCY
Each reference turbidimeter used in testing will be calibrated before any testing begins in
accordance with the procedures described in the manufacturer's instrument manual to ensure that
the instrument is working properly. The calibration will be verified daily using a check standard
in the mid-range of the initial calibration prior to testing. Calibration standards must bracket the
NTU range being tested. Standard solutions necessary for calibration of the reference
turbidimeter will be purchased from a commercial vendor. When available, the standards used in
the calibration, or calibration check, will be purchased with the appropriate turbidity value for
direct evaluation. Otherwise, the standard solution will be prepared through subsequent dilution
of stock formazin or SDVB solution with DI water using Class A volumetric glassware.
Calibration for each participating technology will be performed according to the vendor's
instructions.
Each flow meter will be factory calibrated and will be checked once during the verification test
by measuring the time required for a known volume of liquid to pass through each individual
meter. If the calibration check indicates an error in excess of 10%, the meter
will be recalibrated, when feasible, or replaced.
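As an illustration of this flow meter check, the sketch below converts a timed volume into a flow rate
and compares it with the meter reading against the 10% limit; the numbers are hypothetical.

# Illustrative sketch only: flow meter check by timing a known volume; the values are hypothetical.
def flow_check(volume_ml, elapsed_s, meter_reading_ml_per_min, limit=0.10):
    measured = volume_ml / elapsed_s * 60.0     # mL/min implied by stopwatch and graduated cylinder
    error = abs(measured - meter_reading_ml_per_min) / measured
    action = "acceptable" if error <= limit else "recalibrate or replace"
    return measured, error, action

measured, error, action = flow_check(volume_ml=500.0, elapsed_s=62.0, meter_reading_ml_per_min=520.0)
print(f"measured {measured:.0f} mL/min, error {error:.1%} -> {action}")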
Table 4. Acceptance Criteria for Quality Control Samples and Turbidimeter Calibration

Test: Reference turbidimeter calibration (if needed)
  Method of Assessment: Initial calibration linearity test, or as specified by manufacturer
  Frequency: Initially, prior to testing

Test: Reference turbidimeter calibration check
  Method of Assessment: Formazin or SDVB check standard
  Frequency: Daily prior to use

Test: Reference turbidimeter reagent blank
  Method of Assessment: Deionized water
  Frequency: Prior to each test and between offline replicate test solutions

Test: Reference turbidimeter quality control sample
  Method of Assessment: AMCO-AEPA-1 standard solution or independent formazin or SDVB solution
  Frequency: Once prior and quarterly

Test: [Turbidimeter being verified] calibration
  Method of Assessment: Per vendor
  Frequency: Once or per vendor's instructions

Test: [Turbidimeter being verified] reagent blank
  Method of Assessment: Deionized water
  Frequency: Prior to each test and between offline replicate test solutions

[The Measurement Quality Objective (MQO) column of Table 4 could not be fully recovered from the
source; only the value 0.90, associated with the initial calibration, is legible.]
B9 NON-DIRECT MEASUREMENTS
Non-direct measurements will not be used during these verification tests.
B10 DATA MANAGEMENT
Various types of data will be acquired and recorded electronically or manually by Battelle staff
and staff from participating utilities during these verification tests. All manually-recorded data,
such as solution preparation records and results from supporting analyses, will be recorded
according to Section B10. Table 5 summarizes the types of data to be recorded. All
maintenance activities, repairs, calibrations, and operator observations relevant to the operation
of the monitoring systems being tested will be documented by Battelle staff or staff from
participating utilities in an LRB or on data sheets. Report formats will include all necessary data
to allow traceability from the raw data to final results.
Records received by or generated by any Battelle staff or staff from participating utilities during
testing will be reviewed by a Battelle staff member within five days of receipt or generation,
respectively, before the records are used to calculate, evaluate, or report verification results. If a
Battelle staff member generated the record, this review will be performed by a Battelle technical
staff member involved in the verification test, but not the staff member who originally received
or generated the record. The review will be documented by the person performing the review by
adding his/her initials and date to the hard copy of the record being reviewed. Some of the
checks that will be performed include:
• QC samples and calibration standards were analyzed according to the QAPP and
the acceptance criteria were met. Corrective action for exceedances was taken,
• 100% of hand-entered and/or manually calculated data were checked for accuracy,
• Calculations performed by software are verified at a frequency sufficient to ensure
that the formulas are correct, appropriate, and consistent,
• For each cut and paste function, the first and last data values were verified against the
source data,
• Data are reported in the units specified in the QAPP, and
• Results of QC samples are reported.
Calculations to be checked include any statistical and concentration calculations described in the
QAPP. A dedicated shared folder within the ETV AMS Center SharePoint site will be
established for all project records.
Battelle will provide technology test data (including records, data sheets, and notebook records)
from the first day of testing within five days of generation to EPA for simultaneous review.
Thereafter, the data will be provided to EPA every two weeks. The goal of this data delivery
schedule is prompt identification and resolution of any data collection or recording issues. These
data will be labeled as preliminary and may not have had a QA review before their release.
Table 5. Summary of Data Recording Process

Data to Be Recorded | Where Recorded | How Often Recorded | By Whom | Disposition of Data
Dates and details of test events | ETV LRBs or data forms | Start/end of test event | Battelle staff or staff from participating utilities | Used to organize/check test results; manually incorporated in data spreadsheets as necessary
Technology operator/analyst, data collection and analysis dates, sample volume and/or time, sample description | ETV LRBs or electronically | When performed | Battelle staff or staff from participating utilities | Incorporated in verification report as necessary
Technology and reference test calibration information, reagent and test solution information | ETV LRBs or electronically | When performed | Battelle staff or staff from participating utilities | Incorporated in verification report as necessary
Turbidimeter readings | ETV LRBs or electronically | Each measurement initiated by testing staff | Battelle staff or staff from participating utilities | Converted to spreadsheet for statistical analysis and comparisons
SECTION C
ASSESSMENT AND OVERSIGHT
C1 ASSESSMENTS AND RESPONSE ACTIONS
Every effort will be made in these verification tests to anticipate and resolve potential problems
before the quality of performance is compromised. One of the major objectives of the QAPP is
to establish mechanisms necessary to ensure this. Internal quality control measures described in
the final QAPP (which is peer reviewed by a panel of outside experts), implemented by the
technical staff, and monitored by the VTC, will provide information on data quality on a day-to-day
basis. The responsibility for interpreting the results of these checks and resolving any potential
problems resides with the VTC, who will contact the Battelle AMS Center Manager, Battelle
AMS Center QM, EPA AMS Center Project Officer, and EPA AMS Center QM if any
deviations from the QAPP are observed. The VTC will describe the deviation in a
teleconference or by email, and once a path forward is determined and agreed upon with EPA,
the deviation form will be completed. Technical staff have the responsibility to identify
problems that could affect data quality or the ability to use the data. Any problems that are
identified will be reported to the VTC. Technical staff and the VTC will work with the Battelle
QM to resolve any issues. Action will be taken by the VTC and Battelle testing staff to identify
and appropriately address the issue, and minimize losses and correct data, where possible.
Independent of any EPA QA activities, Battelle will be responsible for ensuring that the
following audits are conducted as part of these verification tests.
C1.1 Performance Evaluation Audit
A PEA will be conducted to verify the accuracy of the reference turbidimeter readings, which will
be the basis for determining technology accuracy. A separate PEA will be conducted for each
reference turbidimeter used in testing.
The PEA will be conducted after routine maintenance and calibration of the reference
turbidimeters by analyzing a standard formazin or SDVB solution and comparing the results to a
reference that is independent of standards used during the test (i.e., AMCO-AEPA-1 standard
solution). Agreement between the formazin or SDVB solution turbidity readings and AMCO-
AEPA-1 must be within 10% for each reference turbidimeter. If this criterion is not met, the
reference turbidimeter must be recalibrated.
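For illustration only, the following Python sketch shows one way the PEA acceptance criterion could be evaluated. The readings are hypothetical, and expressing the 10% agreement as a percent difference relative to the AMCO-AEPA-1 value is an assumption, since this section does not specify the basis of the comparison.

    # Illustrative sketch of the Section C1.1 PEA acceptance check; the readings are
    # hypothetical, and expressing agreement as a percent difference relative to the
    # AMCO-AEPA-1 value is an assumption.
    def pea_percent_difference(reference_reading_ntu, amco_aepa_ntu):
        return 100.0 * abs(reference_reading_ntu - amco_aepa_ntu) / amco_aepa_ntu

    if pea_percent_difference(reference_reading_ntu=19.2, amco_aepa_ntu=20.0) <= 10.0:
        print("PEA criterion met")
    else:
        print("Recalibrate the reference turbidimeter and repeat the PEA")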
C1.2 Technical Systems Audits
The Battelle QM or designee will perform a TSA at Battelle during offline testing and at one
participating utility during online testing. The purpose of these audits is to ensure that the
verification tests are being performed in accordance with the AMS Center QMP1 and the QAPP.
The Battelle QM will compare actual test procedures to those specified or referenced in this plan,
and review data acquisition and handling procedures. The Battelle QM or designee will prepare
a project-specific checklist based on the QAPP requirements to guide the TSA, which will
include a review of the test location and general testing conditions, observation of the testing
activities, and a review of test documentation. The Battelle QM will also check data acquisition procedures,
and confer with testing staff. The Battelle QM will prepare an initial TSA report and will submit
the report to the EPA QA Manager (with no corrective actions documented) and VTC within 10
business days after completion of the audit. A copy of each final TSA report (with corrective
actions documented) will be provided to the EPA AMS Center Project Officer and QM within 20
business days after completion of the audit. At EPA's discretion, EPA QA staff may also
conduct an independent on-site TSA during the verification tests. The TSA findings will be
communicated to technical staff at the time of the audit and documented in the TSA reports.
C1.3 Data Quality Audits
Because this is an EPA QA Category III test, the Battelle QM or designee will audit at least 10% of the
sample results data acquired in the verification tests and 100% of the calibration and QC data
versus the QAPP requirements. Two Audits of Data Quality (ADQs) will be conducted for this
project: Data collected on the first day of testing for each technology will be audited within 10
business days of receipt and assessed using a project-specific checklist. The remaining data will
be audited at the conclusion of testing and will be completed within 10 business days of receipt
of all test data. During these audits, the Battelle QM, or designee, will trace the data from initial
acquisition (as received from the vendor's technology), through reduction and statistical
comparisons, to final reporting. All calculations performed on the data undergoing the ADQ will
be checked. Data must undergo a 100% validation and verification by technical staff (i.e., VTC
or designee) before it is assessed as part of the ADQ. All QC data and all calculations performed
on the data undergoing the audit will be checked by the Battelle QM or designee. Results of
each ADQ will be documented using the checklist and reported to the VTC and EPA within 10
business days after completion of the audit. A final ADQ that assesses overall data quality,
including accuracy and completeness of the technical report, will be prepared as a narrative and
distributed to the VTC and EPA within 10 business days of completion of the audit.
C1.4 QA/QC Reporting
Each assessment and audit will be documented in accordance with Section 3.3.4 of the AMS
Center QMP.1 The results of all audits will be submitted to EPA within 10 business days as
noted above. Assessment reports will include the following:
• Identification of Findings and Observations,
• Recommendations for resolving problems,
• Response to adverse findings or potential problems,
• Confirmation that solutions have been implemented and are effective, and
• Citation of any noteworthy practices that may be of use to others.
C2 REPORTS TO MANAGEMENT
During the laboratory evaluation, any QAPP deviations will be reported immediately to EPA.
The Battelle QM and/or VTC, during the course of any assessment or audit, will identify to the
technical staff performing experimental activities any immediate corrective action that should be
taken. A summary of the required assessments and audits, including a listing of responsibilities
and reporting timeframes, is included in Table 6. If serious quality problems exist, the Battelle
QM will notify the AMS Center Manager, who is authorized to stop work. Once the assessment
reports have been prepared, the VTC will ensure that a response is provided for each adverse
finding or potential problem and will implement any necessary follow-up corrective action. The
Battelle QM will ensure that follow-up corrective action has been taken. The QAPP and final
report are reviewed by the EPA AMS Center QM and the EPA AMS Center Project Officer.
Upon final review and approval, both documents will then be posted on the ETV website
(www.epa.gov/etv).
Table 6. Summary of Quality Assessment and Control Reports1

Assessment | Prepared By | Report Submission Timeframe | Submitted To
Technology Offline Testing TSA (within the first week of testing) | Battelle | 10 business days after TSA is complete2; TSA response is due to QM within 10 business days; TSA responses will be verified by the QM and provided to EPA within 20 business days | EPA ETV AMS Center
Technology Online Testing TSA (within the first week of testing) | Battelle | 10 business days after TSA is complete2; TSA response is due to QM within 10 business days; TSA responses will be verified by the QM and provided to EPA within 20 business days | EPA ETV AMS Center
ADQ (Day 1 data) for each technology | Battelle | ADQ will be completed within 10 business days after receipt of first data set | EPA ETV AMS Center
ADQ (Remaining data and verification report) | Battelle | ADQ will be completed within 10 business days after completion of the verification report review | EPA ETV AMS Center

1 Any QA checklists prepared to guide audits will be provided with the audit report.
2 A separate TSA report will be prepared for each technology; the report submission timeframe is the
same for each.
SECTION D
DATA VALIDATION AND USABILITY
D1 DATA REVIEW, VERIFICATION, AND VALIDATION REQUIREMENTS
The key data review and data verification requirements for these tests are stated in Section B10
of this protocol. In general, the data review requirements specify that data generated during
these tests will be reviewed by a Battelle technical staff member within five days of generation
of the data. The reviewer will be familiar with the technical aspects of the verification test but
will not be the person who generated the data. This process will serve both as the data review
and the data verification, and will ensure that the data have been recorded, transmitted and
processed properly. Furthermore, this process will ensure that the monitoring systems data were
collected under appropriate testing conditions.
The data validation requirements for these tests involve an assessment of the quality of the data
relative to the DQIs and MQOs for these tests referenced in Tables 2 and 4. Any deficiencies in
these data will be flagged and excluded from any statistical comparisons, unless these deviations
are accompanied by descriptions of their potential impacts on the data quality.
D2 VERIFICATION AND VALIDATION METHODS
Data verification is conducted as part of the data review as described in Section B10 of this
protocol. A visual inspection of handwritten data will be conducted to ensure that all entries
were properly recorded or transcribed, and that any erroneous entries were properly noted (i.e.,
single line through the entry, with an error code, such as "wn" for wrong number, and the initials
of the recorder and date of entry). Electronic data from technology, if applicable, and any other
analytical equipment used during the test will be inspected to ensure proper transfer from the
data logging system. All calculations used to transform the data will be reviewed to ensure the
accuracy and the appropriateness of the calculations. Calculations performed manually will be
reviewed and repeated using a handheld calculator or commercial software (e.g., Excel).
Calculations performed using standard commercial office software (e.g., Excel) will be reviewed
by inspection of the equations used for the calculations and verification of selected calculations
by handheld calculator. Calculations performed using specialized commercial software (i.e., for
analytical instrumentation) will be reviewed by inspection and, when feasible, verified by
handheld calculator, or standard commercial office software.
To ensure that the data generated from these tests meet the goals of the tests, a number of data
validation procedures will be performed. Sections B and C of this protocol provide a description
of the validation safeguards employed for these verification tests. Data validation efforts include
the completion of QC activities and the performance of two TSAs as described in Section
C. The data from these tests will be evaluated relative to the MQOs described in Sections A and
B of this protocol. Data failing to meet these criteria will be flagged in the data set and not used
for evaluation of the technology, unless these deviations are accompanied by descriptions of their
potential impacts on the data quality.
An ADQ will be conducted by the Battelle QM to ensure that data review, verification, and
validation procedures were completed, and to assure the overall quality of the data.
D3 RECONCILIATION WITH USER REQUIREMENTS
The purpose of these verification tests is to verify the performance of turbidimeters compared to
an online turbidimeter which is compliant with EPA Method 180.1. To meet the requirements of
the user community, input on the tests described in the final QAPP will be provided by external
experts. Additional performance data regarding operational characteristics of the evaluated
turbidimeters will be collected by verification test personnel. To meet the requirements of the
user community, these data will include thorough documentation of the performance of the
technologies during the verification tests. The data review, verification, and validation
procedures described above will assure that data meeting these requirements are accurately
presented in the verification reports generated from this test, and will assure that data not
meeting these requirements will be appropriately flagged and discussed in the verification
reports.
This protocol and the resulting ETV verification report will be reviewed by the vendor, EPA, and
expert peer reviewers. The reviews of the QAPP will help to improve the design of the
verification tests and the resulting report such that they better meet the needs of potential users of
these technologies.
SECTION E
REFERENCES
1. Battelle, Quality Management Plan for the ETV Advanced Monitoring Systems Center,
Version 8.0, U.S. EPA Environmental Technology Verification Program, prepared by
Battelle, Columbus, Ohio, April 2011.
2. U.S. EPA, Environmental Technology Verification Program Quality Management Plan, EPA
Report No. EPA/600/R-08/009, U.S. Environmental Protection Agency,
Cincinnati, Ohio, January 2008.
3. "Test/QA Plan for Verification of Online Turbidimeters," U.S. Environmental Protection
Agency Environmental Technology Verification Program, prepared by Battelle, Columbus,
OH, June 1999.
4. EPA Method 180.1 Turbidity (Nephelometric), Methods for the Determination of Inorganic
Substances in Environmental Samples EPA-600-R-93-100. 1993.
5. Method Validation Study Plan, Hach Method 10133, Determination of Turbidity by Laser
Nephelometry Revision 2, as part of the DynCorp Memorandum (Subject: D99-0002:Hach
Filter Trak Turbidity Method 10133), Hach Company, January 10, 2000.
APPENDIX A
EPA Method 180.1
METHOD 180.1
DETERMINATION OF TURBIDITY BY NEPHELOMETRY
Edited by James W. O'Dell
Inorganic Chemistry Branch
Chemistry Research Division
Revision 2.0
August 1993
ENVIRONMENTAL MONITORING SYSTEMS LABORATORY
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
CINCINNATI, OHIO 45268
METHOD 180.1
DETERMINATION OF TURBIDITY BY NEPHELOMETRY
1.0 SCOPE AND APPLICATION
1.1 This method covers the determination of turbidity in drinking, ground, surface,
and saline waters, domestic and industrial wastes.
1.2 The applicable range is 0-40 nephelometric turbidity units (NTU). Higher
values may be obtained with dilution of the sample.
2.0 SUMMARY OF METHOD
2.1 The method is based upon a comparison of the intensity of light scattered by
the sample under defined conditions with the intensity of light scattered by a
standard reference suspension. The higher the intensity of scattered light, the
higher the turbidity. Readings, in NTU's, are made in a nephelometer
designed according to specifications given in Sections 6.1 and 6.2. A primary
standard suspension is used to calibrate the instrument. A secondary standard
suspension is used as a daily calibration check and is monitored periodically
for deterioration using one of the primary standards.
2.1.1 Formazin polymer is used as a primary turbidity suspension for water
because it is more reproducible than other types of standards
previously used for turbidity analysis.
2.1.2 A commercially available polymer primary standard is also approved
for use for the National Interim Primary Drinking Water Regulations.
This standard is identified as AMCO-AEPA-1, available from Advanced
Polymer Systems.
3.0 DEFINITIONS
3.1 Calibration Blank (CB) -- A volume of reagent water fortified with the same
matrix as the calibration standards, but without the analytes, internal
standards, or surrogate analytes.
3.2 Instrument Performance Check Solution (IPC) -- A solution of one or more
method analytes, surrogates, internal standards, or other test substances used
to evaluate the performance of the instrument system with respect to a defined
set of criteria.
3.3 Laboratory Reagent Blank (LRB) -- An aliquot of reagent water or other blank
matrices that are treated exactly as a sample including exposure to all
glassware, equipment, solvents, reagents, internal standards, and surrogates
that are used with other samples. The LRB is used to determine if method
analytes or other interferences are present in the laboratory environment, the
reagents, or the apparatus.
3.4 Linear Calibration Range (LCR) -- The concentration range over which the
instrument response is linear.
3.5 Material Safety Data Sheet (MSDS) -- Written information provided by
vendors concerning a chemical's toxicity, health hazards, physical properties,
fire, and reactivity data including storage, spill, and handling precautions.
3.6 Primary Calibration Standard (PCAL) -- A suspension prepared from the
primary dilution stock standard suspension. The PCAL suspensions are used
to calibrate the instrument response with respect to analyte concentration.
3.7 Quality Control Sample (QCS) -- A solution of the method analyte of known
concentrations that is used to fortify an aliquot of LRB matrix. The QCS is
obtained from a source external to the laboratory, and is used to check
laboratory performance.
3.8 Secondary Calibration Standards (SCAL) — Commercially prepared, stabilized
sealed liquid or gel turbidity standards calibrated against properly prepared
and diluted formazin or styrene divinylbenzene polymers.
3.9 Stock Standard Suspension (SSS) — A concentrated suspension containing the
analyte prepared in the laboratory using assayed reference materials or
purchased from a reputable commercial source. Stock standard suspension is
used to prepare calibration suspensions and other needed suspensions.
4.0 INTERFERENCES
4.1 The presence of floating debris and coarse sediments which settle out rapidly
will give low readings. Finely divided air bubbles can cause high readings.
4.2 The presence of true color, that is the color of water which is due to dissolved
substances that absorb light, will cause turbidities to be low, although this
effect is generally not significant with drinking waters.
4.3 Light absorbing materials such as activated carbon in significant concentrations
can cause low readings.
5.0 SAFETY
5.1 The toxicity or carcinogenicity of each reagent used in this method has not
been fully established. Each chemical should be regarded as a potential health
hazard and exposure should be as low as reasonably achievable.
5.2 Each laboratory is responsible for maintaining a current awareness file of
OSHA regulations regarding the safe handling of the chemicals specified in
this method. A reference file of Material Safety Data Sheets (MSDS) should be
made available to all personnel involved in the chemical analysis. The
preparation of a formal safety plan is also advisable.
5.3 Hydrazine Sulfate (Section 7.2.1) is a carcinogen. It is highly toxic and may be
fatal if inhaled, swallowed, or absorbed through the skin. Formazin can
contain residual hydrazine sulfate. Proper protection should be employed.
6.0 EQUIPMENT AND SUPPLIES
6.1 The turbidimeter shall consist of a nephelometer, with light source for
illuminating the sample, and one or more photo-electric detectors with a
readout device to indicate the intensity of light scattered at right angles to the
path of the incident light. The turbidimeter should be designed so that little
stray light reaches the detector in the absence of turbidity and should be free
from significant drift after a short warm-up period.
6.2 Differences in physical design of turbidimeters will cause differences in
measured values for turbidity, even though the same suspension is used for
calibration. To minimize such differences, the following design criteria should
be observed:
6.2.1 Light source: Tungsten lamp operated at a color temperature between
2200-3000°K.
6.2.2 Distance traversed by incident light and scattered light within the
sample tube: Total not to exceed 10 cm.
6.2.3 Detector: Centered at 90° to the incident light path and not to exceed
±30° from 90°. The detector, and filter system if used, shall have a
spectral peak response between 400 nm and 600 nm.
6.3 The sensitivity of the instrument should permit detection of a turbidity
difference of 0.02 NTU or less in waters having turbidities less than 1 unit.
The instrument should measure from 0-40 units turbidity. Several ranges may
be necessary to obtain both adequate coverage and sufficient sensitivity for low
turbidities.
6.4 The sample tubes to be used with the available instrument must be of clear,
colorless glass or plastic. They should be kept scrupulously clean, both inside
and out, and discarded when they become scratched or etched. A light
coating of silicon oil may be used to mask minor imperfections in glass tubes.
They must not be handled at all where the light strikes them, but should be
provided with sufficient extra length, or with a protective case, so that they
may be handled. Tubes should be checked, indexed and read at the
orientation that produces the lowest background blank value.
6.5 Balance — Analytical, capable of accurately weighing to the nearest 0.0001 g.
6.6 Glassware — Class A volumetric flasks and pipets as required.
7.0 REAGENTS AND STANDARDS
7.1 Reagent water, turbidity-free: Pass deionized distilled water through a 0.45 μm
pore size membrane filter, if such filtered water shows a lower turbidity than
unfiltered distilled water.
7.2 Stock standard suspension (Formazin):
7.2.1 Dissolve 1.00 g hydrazine sulfate, (NH2)2·H2SO4 (CASRN 10034-93-2), in
reagent water and dilute to 100 mL in a volumetric flask. CAUTION:
carcinogen.
7.2.2 Dissolve 10.00 g hexamethylenetetramine (CASRN 100-97-0) in reagent
water and dilute to 100 mL in a volumetric flask.
7.2.3 In a 100 mL volumetric flask, mix 5.0 mL of each solution (Sections 7.2.1
and 7.2.2). Allow to stand 24 hours at 25 ±3°C, then dilute to the mark with
reagent water.
7.3 Primary calibration standards: Mix and dilute 10.00 mL of stock standard
suspension (Section 7.2) to 100 mL with reagent water. The turbidity of this
suspension is defined as 40 NTU. For other values, mix and dilute portions of
this suspension as required.
7.3.1 A new stock standard suspension (Section 7.2) should be prepared each
month. Primary calibration standards (Section 7.3) should be prepared
daily by dilution of the stock standard suspension.
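As an illustration of the dilution arithmetic only (this example is not part of Method 180.1), the following Python sketch computes the volume of the 40 NTU primary calibration suspension from Section 7.3 needed to prepare a lower-value standard at a chosen final volume; the target value and flask size shown are hypothetical.

    # Illustrative dilution arithmetic for Section 7.3 (not part of Method 180.1):
    # C1*V1 = C2*V2, with C1 fixed at the 40 NTU primary calibration suspension.
    def volume_of_40_ntu_suspension_ml(target_ntu, final_volume_ml):
        return target_ntu * final_volume_ml / 40.0

    # Example: a 10 NTU standard prepared in a 100 mL Class A volumetric flask
    print(volume_of_40_ntu_suspension_ml(target_ntu=10.0, final_volume_ml=100.0))  # 25.0 mL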
7.4 Formazin in commercially prepared primary concentrated stock standard
suspension (SSS) may be diluted and used as required. Dilute turbidity
standards should be prepared daily.
7.5 AMCO-AEPA-1 Styrene Divinylbenzene polymer primary standards are
available for specific instruments and require no preparation or dilution prior
to use.
7.6 Secondary standards may be acceptable as a daily calibration check, but must
be monitored on a routine basis for deterioration and replaced as required.
8.0 SAMPLE COLLECTION, PRESERVATION AND STORAGE
8.1 Samples should be collected in plastic or glass bottles. All bottles must be
thoroughly cleaned and rinsed with turbidity free water. Volume collected
should be sufficient to insure a representative sample, allow for replicate
analysis (if required), and minimize waste disposal.
8.2 No chemical preservation is required. Cool sample to 4°C.
8.3 Samples should be analyzed as soon as possible after collection. If storage is
required, samples maintained at 4°C may be held for up to 48 hours.
9.0 QUALITY CONTROL
9.1 Each laboratory using this method is required to operate a formal quality
control (QC) program. The minimum requirements of this program consist of
an initial demonstration of laboratory capability and analysis of laboratory
reagent blanks and other solutions as a continuing check on performance. The
laboratory is required to maintain performance records that define the quality
of data generated.
9.2 INITIAL DEMONSTRATION OF PERFORMANCE.
9.2.1 The initial demonstration of performance is used to characterize
instrument performance (determination of LCRs and analysis of QCS).
9.2.2 Linear Calibration Range (LCR) — The LCR must be determined
initially and verified every six months or whenever a significant change
in instrument response is observed or expected. The initial
demonstration of linearity must use sufficient standards to insure that
the resulting curve is linear. The verification of linearity must use a
minimum of a blank and three standards. If any verification data
exceeds the initial values by ±10%, linearity must be reestablished. If
any portion of the range is shown to be nonlinear, sufficient standards
must be used to clearly define the nonlinear portion.
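The following Python sketch is illustrative only and is not part of Method 180.1; it shows one way the ±10% linearity verification described above could be tabulated, assuming the comparison is made between each verification standard's response and its initial calibration response. The standards and responses are hypothetical.

    # Illustrative tabulation of the Section 9.2.2 linearity verification (not part of
    # Method 180.1); the standards and instrument responses below are hypothetical.
    initial = {1.0: 1.02, 10.0: 9.9, 40.0: 40.3}        # NTU standard: initial response
    verification = {1.0: 1.05, 10.0: 11.2, 40.0: 39.6}  # same standards, later responses

    for standard, initial_response in initial.items():
        change = 100.0 * abs(verification[standard] - initial_response) / initial_response
        status = "acceptable" if change <= 10.0 else "reestablish linearity"
        print("%.2f NTU standard: %.1f%% change (%s)" % (standard, change, status))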
9.2.3 Quality Control Sample (QCS) — When beginning the use of this
method, on a quarterly basis or as required to meet data-quality needs,
verify the calibration standards and acceptable instrument performance
with the preparation and analysis of a QCS. If the determined
concentrations are not within ±10% of the stated values, performance of
the determinative step of the method is unacceptable. The source of
the problem must be identified and corrected before continuing with
on-going analyses.
9.3 ASSESSING LABORATORY PERFORMANCE
9.3.1 Laboratory Reagent Blank (LRB) — The laboratory must analyze at least
one LRB with each batch of samples. Data produced are used to assess
contamination from the laboratory environment.
9.3.2 Instrument Performance Check Solution (IPC) — For all determinations,
the laboratory must analyze the IPC (a mid-range check standard) and
a calibration blank immediately following daily calibration, after every
tenth sample (or more frequently, if required) and at the end of the
sample run. Analysis of the IPC solution and calibration blank
immediately following calibration must verify that the instrument is
within ±10% of calibration. Subsequent analyses of the IPC solution
must verify the calibration is still within ±10%. If the calibration cannot
be verified within the specified limits, reanalyze the IPC solution. If the
second analysis of the IPC solution confirms calibration to be outside
the limits, sample analysis must be discontinued, the cause determined
and/or in the case of drift the instrument recalibrated. All samples
following the last acceptable IPC solution must be reanalyzed. The
analysis data of the calibration blank and IPC solution must be kept on
file with the sample analyses data. NOTE: Secondary calibration
standards (SCAL) may also be used as the IPC.
9.3.3 Where additional reference materials such as Performance Evaluation
samples are available, they should be analyzed to provide additional
performance data. The analysis of reference samples is a valuable tool
for demonstrating the ability to perform the method acceptably.
10.0 CALIBRATION AND STANDARDIZATION
10.1 Turbidimeter calibration: The manufacturer's operating instructions should be
followed. Measure standards on the turbidimeter covering the range of
interest. If the instrument is already calibrated in standard turbidity units, this
procedure will check the accuracy of the calibration scales. At least one
standard should be run in each instrument range to be used. Some
instruments permit adjustments of sensitivity so that scale values will
correspond to turbidities. Solid standards, such as those made of lucite blocks,
should never be used due to potential calibration changes caused by surface
scratches. If a pre-calibrated scale is not supplied, calibration curves should be
prepared for each range of the instrument.
11.0 PROCEDURE
11.1 Turbidities less than 40 units: If possible, allow samples to come to room
temperature before analysis. Mix the sample to thoroughly disperse the solids.
Wait until air bubbles disappear then pour the sample into the turbidimeter
tube. Read the turbidity directly from the instrument scale or from the
appropriate calibration curve.
11.2 Turbidities exceeding 40 units: Dilute the sample with one or more volumes
of turbidity-free water until the turbidity falls below 40 units. The turbidity of
the original sample is then computed from the turbidity of the diluted sample
and the dilution factor. For example, if 5 volumes of turbidity-free water were
added to 1 volume of sample, and the diluted sample showed a turbidity of 30
units, then the turbidity of the original sample was 180 units.
11.2.1 Some turbidimeters are equipped with several separate scales. The
higher scales are to be used only as indicators of required dilution
volumes to reduce readings to less than 40 NTU.
Note: Comparative work performed in the Environmental Monitoring
Systems Laboratory - Cincinnati (EMSL-Cincinnati) indicates a
progressive error on sample turbidities in excess of 40 units.
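For illustration only (not part of Method 180.1), the following Python sketch reproduces the dilution calculation of Section 11.2 using the worked example from the text above.

    # Illustrative Section 11.2 calculation (not part of Method 180.1): turbidity of the
    # original sample from a diluted reading; volumes reproduce the worked example above.
    def original_turbidity_ntu(diluted_reading_ntu, sample_volumes, dilution_water_volumes):
        dilution_factor = (sample_volumes + dilution_water_volumes) / sample_volumes
        return diluted_reading_ntu * dilution_factor

    # One volume of sample plus five volumes of turbidity-free water, diluted reading 30 NTU
    print(original_turbidity_ntu(30.0, sample_volumes=1.0, dilution_water_volumes=5.0))  # 180.0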
12.0 DATA ANALYSIS AND CALCULATIONS
12.1 Multiply sample readings by appropriate dilution to obtain final reading.
12.2 Report results as follows:
NTU           Record to Nearest:
0.0 - 1.0     0.05
1 - 10        0.1
10 - 40       1
40 - 100      5
100 - 400     10
400 - 1000    50
>1000         100
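As an illustration only (not part of Method 180.1), the following Python sketch applies the reporting convention above, rounding a final reading to the nearest increment for its NTU range; the example readings are hypothetical.

    # Illustrative application of the Section 12.2 reporting convention (not part of
    # Method 180.1): round a final reading to the nearest increment for its NTU range.
    def report_ntu(value):
        # (upper bound of range, rounding increment) pairs from the table above
        ranges = [(1.0, 0.05), (10.0, 0.1), (40.0, 1.0), (100.0, 5.0),
                  (400.0, 10.0), (1000.0, 50.0), (float("inf"), 100.0)]
        for upper_bound, increment in ranges:
            if value <= upper_bound:
                return round(round(value / increment) * increment, 4)

    print(report_ntu(0.273))  # 0.25
    print(report_ntu(183.0))  # 180.0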
13.0 METHOD PERFORMANCE
13.1 In a single laboratory (EMSL-Cincinnati), using surface water samples at levels
of 26, 41, 75, and 180 NTU, the standard deviations were ±0.60, ±0.94, ±1.2,
and ±4.7 units, respectively.
13.2 The interlaboratory precision and accuracy data in Table 1 were developed
using a reagent water matrix. Values are in NTU.
14.0 POLLUTION PREVENTION
14.1 Pollution prevention encompasses any technique that reduces or eliminates the
quantity or toxicity of waste at the point of generation. Numerous
opportunities for pollution prevention exist in laboratory operation. The EPA
has established a preferred hierarchy of environmental management techniques
that places pollution prevention as the management option of first choice.
Whenever feasible, laboratory personnel should use pollution prevention
techniques to address their waste generation. When wastes cannot be feasibly
reduced at the source, the Agency recommends recycling as the next best
option.
14.2 The quantity of chemicals purchased should be based on expected usage
during its shelf life and disposal cost of unused material. Actual reagent
preparation volumes should reflect anticipated usage and reagent stability.
14.3 For information about pollution prevention that may be applicable to
laboratories and research institutions, consult "Less is Better: Laboratory
Chemical Management for Waste Reduction," available from the American
Chemical Society's Department of Government Regulations and Science Policy,
1155 16th Street N.W., Washington, D.C. 20036, (202) 872-4477.
15.0 WASTE MANAGEMENT
15.1 The U.S. Environmental Protection Agency requires that laboratory waste
management practices be conducted consistent with all applicable rules and
regulations. Excess reagents, samples and method process wastes should be
characterized and disposed of in an acceptable manner. The Agency urges
laboratories to protect the air, water and land by minimizing and controlling
all releases from hoods, and bench operations, complying with the letter and
spirit of any waste discharge permit and regulations, and by complying with
all solid and hazardous waste regulations, particularly the hazardous waste
identification rules and land disposal restrictions. For further information on
waste management consult the "Waste Management Manual for Laboratory
Personnel," available from the American Chemical Society at the address listed
in Section 14.3.
16.0 REFERENCES
1. Annual Book of ASTM Standards, Volume 11.01 Water (1), Standard D1889-
88A, p. 359, (1993).
2. Standard Methods for the Examination of Water and Wastewater, 18th Edition,
pp. 2-9, Method 2130B, (1992).
17.0 TABLES, DIAGRAMS, FLOWCHARTS AND VALIDATION DATA
TABLE 1. INTERLABORATORY PRECISION AND ACCURACY DATA
Number of Values Reported | True Value (T) | Mean (X) | Residual for X | Standard Deviation (S) | Residual for S
373 | 0.450 | 0.4864 |  0.0027 | 0.1071 | -0.0078
374 | 0.600 | 0.6026 | -0.0244 | 0.1048 | -0.0211
289 | 0.65  | 0.6931 |  0.0183 | 0.1301 |  0.0005
482 | 0.910 | 0.9244 |  0.0013 | 0.2512 |  0.1024
484 | 0.910 | 0.9919 |  0.0688 | 0.1486 | -0.0002
489 | 1.00  | 0.9405 | -0.0686 | 0.1318 | -0.0236
640 | 1.36  | 1.3456 | -0.0074 | 0.1894 |  0.0075
487 | 3.40  | 3.2616 | -0.0401 | 0.3219 | -0.0103
288 | 4.8   | 4.5684 | -0.0706 | 0.3776 | -0.0577
714 | 5.60  | 5.6984 |  0.2952 | 0.4411 | -0.0531
641 | 5.95  | 5.6026 | -0.1350 | 0.4122 | -0.1078
REGRESSIONS: X = 0.955T + 0.054, S = 0.074T + 0.082
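For illustration only (not part of Method 180.1), the following Python sketch applies the regression equations above to the first row of Table 1; it simply reproduces the residual calculation implied by the table.

    # Illustrative use of the Table 1 regression equations (not part of Method 180.1):
    # predicted mean and standard deviation at a given true value, and the residual
    # for a reported mean.
    def predicted_mean(true_ntu):
        return 0.955 * true_ntu + 0.054

    def predicted_std_dev(true_ntu):
        return 0.074 * true_ntu + 0.082

    # First row of Table 1: T = 0.450 NTU, reported mean X = 0.4864
    t, x = 0.450, 0.4864
    print(round(x - predicted_mean(t), 4))   # residual for X, approximately 0.0027
    print(round(predicted_std_dev(t), 4))    # predicted S, approximately 0.1153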
APPENDIX B
Recirculation System Schematic
[to be completed for final QAPP]
APPENDIX C
Example Data Sheets
[to be completed for final QAPP]