Battelle
. . . Putting Technology To Work
Environmental Technology
Verification Program
Advanced Monitoring
Systems Center
Generic Verification Protocol
for Long-Term Deployment of
Multi-Parameter Water
Quality Probes/Sondes

GENERIC VERIFICATION PROTOCOL
FOR
LONG-TERM DEPLOYMENT OF
MULTI-PARAMETER WATER QUALITY PROBES/SONDES
July 2002
Prepared by
Battelle
505 King Avenue
Columbus, OH 43201-2693

Table of Contents

1.	Introduction
	1.1	Environmental Technology Verification Background
	1.2	Test Objective
	1.3	Test Applicability
2.	Technology Description
3.	Verification Approach
	3.1	Scope of Testing
	3.2	Experimental Design
	3.3	Reference Methods
	3.4	Test Facility
	3.5	Roles and Responsibilities
		3.5.1	Battelle
		3.5.2	Vendors
		3.5.3	EPA
		3.5.4	Test Facility
4.	Test Procedures
	4.1	Site Selection
	4.2	Multi-Parameter Water Probe Deployment
	4.3	Saltwater Testing
	4.4	Freshwater Testing
	4.5	Mesocosm Testing
	4.6	Multi-Parameter Water Probe Calibration
	4.7	Reference Methods
		4.7.1	pH
		4.7.2	Turbidity
		4.7.3	Dissolved Oxygen
		4.7.4	Nitrate
		4.7.5	Chlorophyll A
		4.7.6	Conductivity
		4.7.7	Temperature
5.	Materials and Equipment
	5.1	Reagents
	5.2	Sampling Equipment and Handling
	5.3	Reference Equipment
6.	Quality Assurance/Quality Control
	6.1	Calibration
	6.2	Field Quality Control
	6.3	Sample Custody
	6.4	Audits
		6.4.1	Performance Evaluation Audits
		6.4.2	Technical Systems Audits
		6.4.3	Audit of Data Quality
		6.4.4	Assessment Reports
	6.5	Corrective Action
7.	Data Handling and Reporting
	7.1	Documentation and Records
	7.2	Data Review
	7.3	Statistical Procedures
		7.3.1	Pre- and Postcalibration Results
		7.3.2	Relative Bias
		7.3.3	Precision
		7.3.4	Linearity
		7.3.5	Inter-Unit Reproducibility
	7.4	Reporting
8.	Health and Safety
9.	References

List of Figures

Figure 1. Mesocosm Tank
Figure 2. Organization Chart for Multi-Parameter Water Probe Verification
Figure 3. Major Bodies of Water Leading into the Test Area

List of Tables

Table 1. Schedule for the Multi-Parameter Water Probe Test
Table 2. Expected Ranges of Water Characteristics at Example Test Sites
Table 3. Sample Analysis Location
Table 4. Schedule of Reference Method Sample Events on Each Day of Testing at the Saltwater Site
Table 5. Schedule of Reference Method Sample Events on Each Day of Testing at the Freshwater Site
Table 6. Schedule for Mesocosm Sample Collection
Table 7. Maximum Holding Time
Table 8. Replicate Analysis Results
Table 9. Expected Values for Field Blanks
Table 10. Summary of Performance Evaluation Audits
Table 11. Summary of Data Recording Process

Acronyms
AMS	Advanced Monitoring Systems
CCEHBR	Center for Coastal Environmental Health and Biomolecular Research
DO	dissolved oxygen
EPA	United States Environmental Protection Agency
ETV	Environmental Technology Verification
L	liter
NIST	National Institute of Standards and Technology
NOAA	National Oceanic and Atmospheric Administration
NTU	nephelometric turbidity unit
QA	quality assurance
QMP	Quality Management Plan
SOP	standard operating procedure
TSA	technical systems audit
1 Introduction
1.1 Environmental Technology Verification Background
This generic verification protocol provides detailed procedures for implementing a
verification test of multi-parameter water quality probes/sondes that continuously measure water
quality parameters. Verification tests are conducted under the auspices of the U.S.
Environmental Protection Agency (EPA) through its Environmental Technology Verification
(ETV) program. The purpose of the ETV program is to provide objective and quality-assured
performance data on environmental technologies, so that users, developers, regulators, and
consultants can make informed purchase and application decisions about these technologies.
ETV does not imply approval, certification, or designation by EPA, but rather provides a
quantitative assessment of the performance of a technology under specified test conditions.
The verification tests are coordinated by Battelle, of Columbus, Ohio, which is EPA's
partner in the ETV Advanced Monitoring Systems (AMS) Center. The scope of the AMS Center
covers verification of monitoring technologies for contaminants and natural species in air, water,
and soil. In performing verification tests, Battelle follows the procedures specified in this test
protocol and complies with the requirements in the "Quality Management Plan for the ETV
Advanced Monitoring Systems Center" (QMP).(1)
1.2 Test Objective
The purpose of verification tests of multi-parameter water quality probes/sondes is to
evaluate their performance under realistic operating conditions. Specifically, these probes are
deployed in a location or locations similar to those that would be used by members of the water
monitoring community, and the probes are evaluated by comparing their measurements with
reference measurements. For example, a verification might require deploying probes in
laboratory, freshwater, and saltwater environments for a 2½-month field test in which the probes
are operated continuously for periods up to 30 days. During such time, water quality parameters
such as turbidity, chlorophyll A, nitrate, conductivity, temperature, dissolved oxygen (DO), and
pH are measured both by the probes (when applicable) and by reference methods. In the
laboratory environment, these parameters are controlled, while in the freshwater and saltwater
phases of the verification, these parameters are not controlled. During each phase, assessments of
performance are based upon comparisons to the reference results and include determinations of
accuracy, precision, linearity, and inter-unit reproducibility. Different locations, target analytes,
and test periods may be accommodated, if appropriate for the water probes being tested, by
specifying those features of the verification in the test/quality assurance (QA) plan for the test.
1.3 Test Applicability
This generic protocol is applicable to verification testing of probes that operate
unattended in lakes, rivers, coastal areas, estuaries, bays, and other fresh, salt, or brackish bodies
of water and that continuously measure one or more water quality parameters, such as turbidity,
chlorophyll A, nitrate, conductivity, temperature, DO, or pH. In accordance with the intent of
the ETV program, the probes tested are commercially available and not developmental products
or prototypes. No enhancements of a commercially available product can be used; this includes
any special anti-fouling coatings or paints that are not part of the standard product.
2 Technology Description
The probes to be tested under this protocol typically consist of a sensor or sensors in a
rugged housing at the end of a tethered line. The probes are portable and usually must be
tethered to a buoy, dock, piling, or similar structure. While some may be capable of wireless
transmission of data, many probes require that stored data be physically downloaded by the user.
The multi-parameter water probes verified under this protocol must be able to undergo
the testing explained in Chapter 4. In general, probes must be able to measure two or more of the
parameters listed in Section 1.3 in both salt and freshwater. The probe must be deployable, in the
sense that the probe must be able to make the water quality measurements without the assistance
or intervention of an operator. A probe must be able to store the measured water quality values
for a minimum of two weeks at an hourly sampling rate and must be able to sample at depths
between 1 and 15 feet.
3 Verification Approach
3.1 Scope of Testing
The objective of the verification test derived from this generic protocol is to establish the
performance capabilities of multi-parameter water probes under operating conditions that are
realistic in terms of type of water body, depth, duration of unattended operation, etc., as well as
in a laboratory or controlled setting. To achieve this goal, the verification test involves three
phases. In the first phase, the probes are tested in a saltwater location. The second phase takes
place at a freshwater location. In each of these two phases, the probes monitor the naturally
occurring levels of each parameter. These phases of 30 sampling days each are used to determine
how well the probes compare with the reference methods while being continuously deployed in a
field setting. The third phase takes place in a laboratory or controlled environment. During this
week-long phase, the probes are tested over target parameter ranges that are partially controlled.
The turbidity and conductivity are adjusted while recording the response of the probes. In all
tests, two units of each probe are operated side by side to make inter-unit comparisons.
3.2 Experimental Design
The verification test is designed to assess the performance of multi-parameter water
probes relative to reference methods that may consist of using either a grab sample and
laboratory analysis or another real-time probe. Collaboration with a partner organization is
highly recommended. For example, a test conducted under this protocol was coordinated with
the National Oceanic and Atmospheric Administration (NOAA) through the Center for Coastal
Environmental Health and Biomolecular Research (CCEHBR). The test described in this
protocol follows the design of that performed at or near CCEHBR facilities in Charleston, South
Carolina. The approach to the verification test is summarized below, and the statistical
methodology for establishing performance parameters is described in Section 7.3.
The first phase of the test shall occur at a saltwater site such as CCEHBR and last
approximately one month. CCEHBR, for example, has direct access to Charleston Harbor, which
is a tidally dominated body of water that receives some riverine input, with salinities ranging
from 20 to 35 parts per thousand. The South Carolina Department of Natural Resources has
several piers and docks that can be used to deploy the instruments. Also, other areas in close
proximity can be used if the instruments need to be deployed away from dock and boat activity.
Many types of land use (including residential, industrial, urban, and dredge spoil) in the area
surrounding Charleston Harbor can affect overall water quality.
The second phase of the test shall occur at a freshwater site and last approximately one
month. A five-acre freshwater pond named Lake Edmunds, located approximately one mile from
the CCEHBR, exemplifies an appropriate site.
The third phase shall take place over a one-week period at a facility such as CCEHBR's
Mesocosm Facility. The test facility should contain modular estuarine mesocosms, consisting of
a 300-liter tank containing elevated sediment trays and stream channels. Each sediment tray
should be arranged so that an elevated salt marsh surface is formed. The sediment trays contain
sediment, salt marsh vegetation, and benthic communities. Stream channels contain
phytoplankton, zooplankton, and endemic macrofaunal species. Another component of the
mesocosm is a reservoir or sump that provides tidal water to the system through a pump system
controlled by a timer. Twice daily, seawater is pumped up into the mesocosm tank from the
sump to simulate a flood tide. After six hours of flooding tide, the seawater is allowed to drain
back into the sump, simulating an ebb tide for another six hours. Mesocosms used for this test
can be classified as "tidal" or "estuarine." Figure 1 shows a single mesocosm tank.
A suggested schedule for the various testing activities is given in Table 1. In each phase,
each vendor's probes are positioned as close to each other as possible so that inter-unit
comparisons can be made. In addition, the probes from all vendors are placed near each other so
that parameters such as photosynthesis and mixing are as similar as possible.
Figure 1. Mesocosm Tank

3.3 Reference Methods
During a verification test, various analytical
methods are used to monitor turbidity, chlorophyll A,
nitrate, conductivity, temperature, DO, and pH.
Temperature, pH, DO, and conductivity are monitored in
real time with devices that are collocated with the probes
being verified. Turbidity, chlorophyll A, and nitrate
concentrations are measured using laboratory analysis of
collected samples. Turbidity is measured using a benchtop
ratio turbidity meter, chlorophyll A is measured by
fluorometry, and nitrate is measured colorimetrically.
Table 1. Schedule for the Multi-Parameter Water Probe Test

Activity                              Day Number
Vendor setup for saltwater site            1
Begin saltwater test                       8
End saltwater test                        39
Vendor setup for freshwater test          40
Begin freshwater test                     50
End freshwater test                       82
Vendor setup for mesocosm test            86
Begin mesocosm test                       91
End mesocosm test                         95
Vendor removal of equipment               98
3.4 Test Facility
CCEHBR exemplifies the requirements of a test facility for this verification. Specifically,
a test facility must be capable of providing a secure and realistic location for deploying the
multi-parameter water probes, must have standard operating procedures (SOPs) or written
methods in place for the reference measurements, must have trained personnel capable of performing
these activities according to those SOPs, and must have documented QA procedures in place.
Documentation of the staff training, SOPs, and other pertinent materials is provided to Battelle
prior to test initiation.
3.5 Roles and Responsibilities
The verification test is coordinated and supervised by Battelle personnel. Staff from the
test facility participate in this test by operating the reference equipment, collecting the water
samples, downloading the data from the multi-parameter water probes, and informing Battelle
staff of any problems encountered. Vendor representatives install, maintain, and operate their
respective technologies throughout the test unless they give written consent to Battelle or the test
facility to carry out these activities. QA oversight is provided by the Battelle Quality Manager
and the EPA ETV Quality Manager at his/her discretion. The chart in Figure 2 shows the
organization of responsibilities for Battelle, the vendor companies, EPA, and the test facility.
Specific responsibilities are detailed below.
3.5.1 Battelle
The Battelle Verification Test Coordinator has the overall responsibility for ensuring that
the technical, schedule, and cost goals established for the verification test are met. The
Verification Test Coordinator shall
•	Prepare the draft test/QA plan, verification reports, and verification statements
•	Revise the draft test/QA plan, verification reports, and verification statements in
response to reviewers' comments
•	Coordinate distribution of the final test/QA plan, verification reports, and verification
statements
•	Coordinate testing, measurement parameters, and schedules at the testing site
Figure 2. Organization Chart for Multi-Parameter Water Probe Verification

(The chart links Battelle Management, the Battelle AMS Center Manager, the Battelle Quality
Manager, the Battelle Verification Testing Leader, the Battelle Verification Test Coordinator,
and Battelle Testing Staff with the EPA AMS Center Manager, the EPA ETV Quality Manager,
Test Facility Management, Test Facility Testing Staff, and Vendor Representatives.)
•	Ensure that all quality procedures specified in the test/QA plan and in the QMP are
followed
•	Respond to any issues raised in assessment reports and audits, including instituting
corrective action as necessary
•	Serve as the primary point of contact for vendor and test facility representatives
•	Establish a budget for the verification test and monitor staff effort to ensure that the
budget is not exceeded
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained
•	Coordinate with sample analysis laboratory to ensure timely reporting of results.
The Verification Testing Leader for the AMS Center provides technical guidance and
oversees various stages of the verification test and shall
•	Support the Verification Test Coordinator in preparing the test/QA plan and
organizing the testing and budgeting for the verification activities
•	Review the draft test/QA plan
•	Review the draft verification reports and statements
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained.
Battelle's AMS Center Manager shall
•	Review the draft test/QA plan
•	Review the draft verification reports and statements
•	Ensure that necessary Battelle resources, including staff and facilities, are committed
to the verification test
•	Support the Verification Test Coordinator in responding to any issues raised in
assessment reports and audits
•	Maintain communication with EPA's AMS Center and ETV Quality Managers
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained.
Battelle's Quality Manager for the verification test shall
•	Review the draft test/QA plan
•	Conduct a technical systems audit (TSA) once during the verification test
•	Audit at least 10% of the verification data
•	Prepare and distribute an assessment report for each audit
•	Verify implementation of any necessary corrective action
•	Issue a stop work order if self-audits indicate that data quality is being compromised
or if proper safety practices are not followed; notify the Battelle AMS Center
Manager if a stop work order is issued
•	Provide a summary of the audit activities and results for the verification reports
•	Review the draft verification reports and statements
•	Have overall responsibility for ensuring that the test/QA plan and ETV QMP are
followed
•	Ensure that Battelle management is informed if persistent quality problems are not
corrected
•	Interface with EPA's ETV Quality Manager
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained.
3.5.2 Vendors
Vendors shall
•	Review the draft test/QA plan and provide comments and recommendations
•	Approve the revised test/QA plan
•	Work with Battelle to commit to a specific schedule for the verification test
•	Provide duplicate commercial-ready probes for testing
•	Provide an on-site operator(s) throughout the verification test period to install the
probes and maintain them during testing, unless written consent is given for Battelle
or the test facility staff to perform those responsibilities
•	Remove probes and other related equipment from the test facility upon completing
the verification test
•	Review and comment upon their respective draft verification reports and statements.
3.5.3 EPA
EPA's responsibilities in the AMS Center are based on the requirements stated in the
"Environmental Technology Verification Program Quality and Management Plan for the Pilot
Period (1995-2000)"(2) or the most current update of this document. The roles of the specific
EPA staff are as follows:
EPA's ETV Quality Manager shall
•	Review the draft test/QA plan
•	Perform, at his/her option, one external TSA during the verification test
•	Notify the Battelle AMS Center Manager to facilitate a stop work order if an external
audit indicates that data quality is being compromised
•	Prepare and distribute an assessment report summarizing the results of an external
audit, if performed
•	Review draft verification reports and statements
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained.
EPA's AMS Center Manager shall
•	Review the draft test/QA plan
•	Approve the final test/QA plan
•	Approve the final verification reports
•	Review the draft verification statements
•	Ensure that confidentiality of proprietary vendor technology and information is
maintained.
3.5.4 Test Facility
Test facility staff shall
•	Assist in developing the test/QA plan for the verification test
•	Allow facility access to the vendor, Battelle, and EPA representatives during the field
test periods
•	Provide safety instructions to Battelle, EPA, and vendor personnel for operations at
the test facility
•	Select a secure location for each of the three testing phases
•	Assist vendors in installing the probes at each location
•	Perform sample collections and analyses as detailed in the test procedures section of
the test/QA plan
•	Perform reference measurements
•	Provide all test data to Battelle electronically, in mutually agreed upon format
•	Provide EPA and Battelle staff access to and/or copies of appropriate QA
documentation of test equipment and procedures (e.g., SOPs, calibration data)
•	Provide information regarding education and experience of each researcher involved
in the verification
•	Assist in Battelle's reporting of the test facility's QA/quality control results
• Review portions of the draft verification reports to assure accurate descriptions of the
test facility operations and to provide technical insight on verification results.
4 Test Procedures
4.1 Site Selection
Below are the general procedures followed at each of the test sites. Three test sites are
used for this verification in an attempt to expose the probes to as wide a range of conditions as
possible while conducting an efficient test. The site selection process requires that several
important criteria be met. First, the three sites must include one controlled, one saltwater (or
brackish), and one freshwater location. The sites
must allow for collocation of numerous probes
because each vendor will provide duplicate probes
for the test. The sites must be accessible daily so
that timely water collections can be made; and the
sites must, to the extent possible, be free from
interference from the public. A secure facility is not
required, but is preferred. For this protocol, the
three locations described are the Mesocosm Facility
at the CCEHBR in Charleston, the Charleston
Harbor, and Lake Edmunds. Figure 3 (Major Bodies of Water Leading into the Test Area) shows
a map of South Carolina and a close-up of the testing sites. If another facility is used, it
must meet the requirements described above.
The sites at or near the CCEHBR are appropriate for several reasons. First, it was
beneficial to involve a major user (NOAA) of multi-parameter water probes to allow a broader
verification test than would be possible using only Battelle facilities. Second, CCEHBR has
secure, nearby sites available for all three phases
of the test (mesocosm, freshwater, and saltwater), which allows resources to be devoted to
testing rather than to building infrastructure for the test. Finally, these sites offer a useful
variation of water conditions for testing. Typical ranges for the target parameters to be monitored are
given in Table 2. The remainder of this protocol uses the CCEHBR as a specific test facility
example. A similar range of water conditions should be characteristic of alternate test facilities.
Table 2. Expected Ranges of Water Characteristics at the Example Test Sites

                 Mesocosm                Bay                     Lake Edmunds
Parameter        Low         High        Low         High        Low         High
pH               7.5         8.3         7.5         8.3         7.0         8.0
Turbidity        0.1 NTUa    10 NTU      -b          -           -           -
DO               2.0 mg/L    10.0 mg/L   2.5 mg/L    8.0 mg/L    -           -
Conductivity     0.0         36 mS/cm    -           -           -           -
Temperature      15°C        35°C        -           -           -           -
Nitrate          0.1 mg/L    1 mg/L      0.1 mg/L    1 mg/L      0.1 mg/L    1 mg/L
Chlorophyll A    5 µg/L      60 µg/L     5 µg/L      60 µg/L     5 µg/L      60 µg/L
Salinity         0 ppt       20 ppt      20 ppt      30 ppt      0 ppt       <1 ppt

a NTU = nephelometric turbidity unit.
b "-" = no information.
4.2 Multi-Parameter Water Probe Deployment
The saltwater test shall take place at a site similar to a portion of the Charleston Harbor
located on the CCEHBR campus. The probes are set up for a 30-day test. Each of the probes is
located within the same area, moored to the piling of the pier and accessible to CCEHBR staff
for daily observation, reference measurements, and water sample collection. The freshwater
phase of the verification test shall occur at a lake such as Lake Edmunds on James Island,
located approximately one mile from the CCEHBR. Probes shall then be set up in a
controlled environment, such as the 300-liter mesocosm tank at the Mesocosm Facility, and
prepared for a one-week test. Because of space considerations, more than one mesocosm tank
may be used; but, in all cases, each probe is provided with water from the same source, and each
individual vendor's probe is collocated within the same tank so that inter-unit reproducibility can
be evaluated.
Vendors are responsible for setting up their probes at each test location unless written
permission is given to the test facility or Battelle to set up the probe. Vendors may set up at the
first site while training the appropriate Battelle or test facility staff so that, during the next two
deployments, the probes may be redeployed without vendor staff members present.
4.3 Saltwater Testing
The saltwater test shall occur at a site similar to the Charleston Harbor site. This portion
of the verification test lasts for 30 days, during which time the probes monitor the naturally
occurring range of the target parameters, while samples for simultaneous reference
measurements are collected during each sampling event. Sample collection times are rotated
among the morning, afternoon, and evening throughout the test. In addition, two periods of
intense sampling occur at the beginning (Days 1 and 2) and the end (Days 29 and 30) of the
sampling period, during which time samples are collected for reference analysis at 30-minute
intervals for eight hours. For the first 15 days, the probes are deployed to a depth of one to two
feet. For the last 15 days, the probes are deployed to a depth of 15 feet. At the saltwater site,
samples for laboratory reference measurements are taken using a Niskin sampling device, which
allows a sample to be taken at depth. Three replicate samples are collected per sampling event,
and each replicate sample is analyzed by the laboratory reference methods. Temperature
measurements are taken at depth using a thermocouple on the end of a five-meter pole. For the
parameters shown in Table 3, the average value of the three replicates will be used as the
reference value. Table 4 shows the recommended sampling times and number of sampling
events throughout the test period.
The probes are deployed by tethering them to the side of a bulkhead already located in
the harbor. The probes from an individual vendor are attached to the bulkhead so that they are as
close to each other as possible and near the probes from the additional vendors participating in
the test. If possible, the probes from each of the vendors shall be hung at the corners of a
one-meter square frame.
Table 3. Sample Analysis Location

Parameter        Analysis Location
pH               on site
Turbidity        laboratory
DO               on site
Chlorophyll A    laboratory
Conductivity     on site
Temperature      on site
Nitrate          laboratory
4.4 Freshwater Testing
Freshwater testing shall be done at a site similar to Lake Edmunds. Because this site is
shallower than Charleston Harbor, only one depth is used; however, the same sample
collection schedule is followed. This portion of the verification test lasts for 30 days, during
which time the probes monitor the naturally occurring target parameters, while simultaneous
reference measurements are made and replicate samples are collected during each sampling
event, again rotating among collection times. Two periods of intense sampling also occur at the
beginning (Days 1 and 2) and the end (Days 29 and 30) of the sampling period, during which
time samples are collected for reference analysis at 30-minute intervals for eight hours. Three
replicate samples are collected per sampling event and each replicate sample is analyzed for the
parameters shown in Table 3. The average value of the three replicates is used as the reference
value. Table 5 shows the recommended sampling times and number of sampling events
throughout the test period.
Probes can be deployed in a shallow pond by driving large posts into the bottom of the
pond and tethering the instruments onto the posts with cable ties. While wearing appropriate
gear, the testers can wade into the pond and force the posts into the bottom with a
sledgehammer. Samples shall be collected at the freshwater site without entering the water to
limit errors induced by disturbing the water.
Table 4. Schedule of Reference Method Sample Events on Each Day of Testing
at the Saltwater Site

Sampling Day    Morning    Afternoon    Evening    Total Sampling Events

Shallow Deployment
 1              6          6            4          16
 2              6          6            4          16
 3              1a         1b           1           3
 4              -          -            -           0
 5              -          -            -           0
 6              -          -            -           0
 7              -          1            1c          2
 8              -          -            -           0
 9              -          -            -           0
10              -          -            -           0
11              1c         1a           1b          3
12              -          -            -           0
13              -          -            -           0
14              -          -            -           0
15              1          1a           1b          3

Deep Deployment
16              1          1a           -           2
17              -          -            -           0
18              -          -            -           0
19              1b         1            1           3
20              -          -            -           0
21              -          -            -           0
22              -          1a           1b          2
23              -          1a           -           1
24              -          -            -           0
25              1a         -            -           1
26              1a         -            -           1
27              -          -            -           0
28              -          1c           1b          2
29              6          6            4          16
30              6          6            4          16

a Sample to be split into a laboratory replicate.
b Field blank taken simultaneously.
c Field spike taken simultaneously.
Table 5. Schedule of Reference Method Sample Events on Each Day of Testing
at the Freshwater Site

Sampling Day    Morning    Afternoon    Evening    Total Sampling Events

Shallow Deployment
 1              6          6            4          16
 2              6          6            4          16
 3              1a         1b           1           3
 4              -          -            -           0
 5              -          -            -           0
 6              -          -            -           0
 7              -          1            1c          2
 8              -          -            -           0
 9              -          -            -           0
10              -          -            -           0
11              1          1a           1b          3
12              -          -            -           0
13              -          -            -           0
14              -          -            -           0
15              1c         1a           1b          3
16              1          1a           -           2
17              -          -            -           0
18              -          -            -           0
19              1b         1            1           3
20              -          -            -           0
21              -          -            -           0
22              -          1            1b          2
23              -          1a           -           1
24              -          -            -           0
25              1a         1            1c          3
26              -          -            -           0
27              -          -            -           0
28              -          1            1b          2
29              6          6            4          16
30              6          6            4          16

a Sample to be split into a laboratory replicate.
b Field blank taken simultaneously.
c Field spike taken simultaneously.
4.5 Mesocosm Testing
Mesocosm testing shall be performed according to the schedule shown in Table 6. The
mesocosms fill and drain with water twice daily, simulating tides. Water samples are collected
at four intervals during each test day, spaced evenly throughout the normal operating hours of
the facility (nominally 6 a.m. to 6 p.m.). During this period, the mesocosms are manipulated to
introduce variations in the measured parameters. The turbidity of the systems is varied by
operating a pump near the sediment trays to suspend additional solids in the water. Conductivity
is varied by adding fresh water to the salt water during one of the fill-and-drain cycles. Nitrate is
varied by spiking the mesocosms with an appropriate amount of chemical during the fill cycle.
Temperature, pH, and DO are allowed to vary naturally, with any variations driven by natural
forces and the changes in the other test parameters (for example, nutrient spiking is likely to vary
the corresponding chlorophyll A concentrations). The parameters are varied over the ranges
specified in Table 2 and monitored by the multi-parameter probes undergoing testing. During
this period, each of the collected samples is analyzed using a reference method for comparison.
Three replicate samples are collected from each tank per sampling event, and each replicate
sample is analyzed for the parameters shown in Table 3. The average value of the three
replicates is reported as the reference value, along with the standard deviation.
4.6 Multi-Parameter Water Probe Calibration
The multi-parameter water probes are calibrated for each measured parameter
according to that vendor's instructions. This calibration uses National Institute of Standards and
Technology (NIST)-traceable standards when applicable. Vendors may choose to supply the
necessary calibration solutions and devices specific to the probe being verified.
Table 6. Schedule for Mesocosm Sample Collection

Samples are collected for all monitored parameters (turbidity, conductivity, nitrate,
temperature, pH, chlorophyll, and DO) at 8 a.m., 10 a.m., 1 p.m., and 5 p.m. on each of the
five test days. The scheduled manipulations are:

Day 2, 8 a.m.     A - Nitrate spike
Day 2, 1 p.m.     B - Stir sediment
Day 4, 8 a.m.     C - Add freshwater
Day 5, 8 a.m.     D - Nitrate spike
Day 5, 10 a.m.    E - Stir sediment
4.7 Reference Methods
pH, turbidity, dissolved oxygen, nitrate, chlorophyll A, conductivity, and temperature
shall be measured during the verification test, using a variety of methods.
4.7.1	pH
A NIST-traceable handheld pH meter, operated according to the manufacturer's
instructions, is used to measure pH.
4.7.2	Turbidity
A benchtop ratio turbidity meter, operated according to the manufacturer's
instructions, is used to measure turbidity according to EPA Method 180.1.(3)
4.7.3	Dissolved Oxygen
DO is measured using a NIST-traceable commercially available probe, operated
according to the manufacturer's instructions.
4.7.4	Nitrate
Nitrate concentrations are determined colorimetrically using a Lachat Instruments
QuikChem autoanalyzer, operated according to the manufacturer's instructions.(4)
4.7.5	Chlorophyll A
Chlorophyll A or total chlorophyll concentrations are determined using a fluorescence
technique conducted according to the manufacturer's instructions. Both methods for this
determination are based on EPA Method 445.0.(5)
4.7.6	Conductivity
A NIST-traceable handheld conductivity meter, operated according to the
manufacturer's instructions, is used to measure conductivity.
4.7.7	Temperature
A NIST-traceable handheld thermocouple and readout, operated according to the
manufacturer's instructions, is used to monitor the water temperature (°C).
5 Materials and Equipment
5.1 Reagents
Reagents used include distilled deionized water (for field blanks), appropriate turbidity
standards from Hach or Advanced Polymer Systems, chlorophyll standards from Sigma
(C6144), a nitrate standard, and preservation reagents, as specified in the test methods.(3-5)
5.2 Sampling Equipment and Handling
Sampling equipment consists of 0.5- or 1-liter (L) sample containers (glass bottles) and
the Niskin sampling device, along with all sample storage equipment. The recommended
maximum sample holding time is given in Table 7.
Table 7. Maximum Holding Time

Parameter        Holding Time
pH               nonea
Turbidity        24 hours
DO               none
Chlorophyll A    1 week
Conductivity     none
Temperature      none
Nitrate          2 weeks

a Sample analysis performed immediately after sample collection.
5.3 Reference Equipment
Reference equipment includes a handheld pH meter, benchtop turbidity meter (Hach
Ratio XR or similar meter), autoanalyzer (Lachat Instruments QuikChem 8000), fluorometer
(Turner 10-AU or similar fluorometer), handheld conductivity meter, handheld thermocouple,
and a DO meter.
6 Quality Assurance/Quality Control
6.1 Calibration
Both the on-line and laboratory reference instrumentation used in the verification test
shall be calibrated by the test facility according to the SOPs and schedules in place at the test
facility. Documentation of these calibration results is provided to Battelle. The conductivity,
DO, and pH meters are calibrated before each sampling event. The autoanalyzer, turbidity meter,
and fluorometer used to measure nitrate, turbidity, and chlorophyll A, respectively, are
calibrated at each sample analysis period. The thermocouple is calibrated within the six months
prior to the test completion date.
6.2 Field Quality Control
To ensure that the sample collection and analysis procedures are properly controlled, a
field blank and a laboratory replicate sample shall be taken at the times shown in Tables 4 and 5.
The field blank is a container of deionized water taken to the field and then brought back to the
laboratory. It is analyzed in the same manner as the collected samples. The laboratory replicate
sample is collected once each week during a regular sampling period. These replicate samples
are the field samples split into two and analyzed by the same methods. The results from the
replicate analysis should be within the accuracy reported in Table 8. The expected maximum
values for the field blanks are given in Table 9. In addition, sample spikes are taken in distilled
water on the schedule shown in Tables 4 and 5. Sample spikes are taken only for nitrate, at
0.5 mg/L.
Table 8. Replicate Analysis Results

Parameter        Accuracy (±)
pH               0.1
Turbidity        5 NTU
DO               5%
Chlorophyll A    5%
Conductivity     5%
Temperature      1°C
Nitrate          10%
Table 9. Expected Values for Field Blanks

Parameter        Expected Maximum
Turbidity        1 NTU
Chlorophyll A    3 x average of three blank filters
Nitrate          5 µg-at N/La

a µg-at N/L = microgram-atoms of nitrogen per liter.
6.3 Sample Custody
Collected samples are transported to the laboratory in an ice-filled cooler. All samples
are accompanied by a sample collection sheet and chain-of-custody form prepared for the test.
6.4 Audits
Independent of test facility and EPA QA activities, Battelle is responsible for ensuring
that the following audits are conducted as part of this verification test.
6.4.1 Performance Evaluation Audits
A performance evaluation audit shall be conducted to assess the quality of the reference
measurements made in the verification test. Each type of reference measurement is compared
with an independent probe or a NIST-traceable standard that is independent of those used during
the testing. This audit is performed once during the verification test. The acceptance criteria for
the results of this audit are noted in Table 10, which is a summary of the audits to be performed.
Table 10. Summary of Performance Evaluation Audits

Audited Parameter    Audit Procedure                      Acceptable Tolerance
pH                   Independent monitor                  ±0.1 pH
Turbidity            Independent turbidity standard       ±10%
DO                   Independent monitor                  ±5%
Nitrate              Independent nitrate standard         ±10%
Chlorophyll A        Independent chlorophyll standard     ±10%
Conductivity         Independent monitor                  ±5%
Temperature          Independent monitor                  ±1°C
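
To make the acceptance criteria concrete, the following minimal Python sketch checks an audit
result against the Table 10 tolerances. The sketch is illustrative only and is not part of the
protocol; the function and dictionary names are hypothetical.

    # Tolerances from Table 10: absolute for pH and temperature,
    # fractional (relative) for the remaining parameters.
    ABSOLUTE_TOLERANCES = {"pH": 0.1, "temperature": 1.0}  # pH units, degrees C
    RELATIVE_TOLERANCES = {"turbidity": 0.10, "DO": 0.05, "nitrate": 0.10,
                           "chlorophyll A": 0.10, "conductivity": 0.05}

    def audit_passes(parameter, reference_value, audit_value):
        """Return True if a performance evaluation audit result is within
        the acceptable tolerance listed in Table 10."""
        difference = abs(audit_value - reference_value)
        if parameter in ABSOLUTE_TOLERANCES:
            return difference <= ABSOLUTE_TOLERANCES[parameter]
        return difference <= RELATIVE_TOLERANCES[parameter] * abs(reference_value)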
6.4.1.1 pH
The handheld pH meter shall be compared with another handheld pH meter made by a
different manufacturer and operated according to the manufacturer's instructions. A tolerance of
±0.1 pH unit is expected.
6.4.1.2 Turbidity
An independent turbidity standard shall be measured with the turbidity meter;
agreement within 10% (in NTU) is expected.
6.4.1.3 Dissolved Oxygen
The DO measurement shall be compared with a handheld DO monitor made by a
different manufacturer. Agreement within 5% is expected.
6.4.1.4	Nitrate
A nitrate audit shall be performed, using an independent nitrate standard, by delivering a
spiked sample to the autoanalyzer. Agreement between the results of this analysis and the spiked
concentration is expected to be within 10%.
6.4.1.5	Chlorophyll A
A chlorophyll A audit shall be performed, using an independent chlorophyll A standard,
by delivering a diluted standard to the fluorometer. Agreement between the results of this
analysis and the spiked concentration is expected to be within 10%.
6.4.1.6	Conductivity
An independent handheld conductivity meter made by a different manufacturer shall be
used to perform the conductivity audit. Agreement between the results of this meter and those of
the test reference meter is expected to be within 5%.
6.4.1.7	Temperature
A NIST-traceable mercury-in-glass thermometer shall be used for the temperature
performance audit. The comparison is done on a sample of collected water. An agreement within
±1°C is expected.
6.4.2	Technical Systems Audits
Battelle's Quality Manager shall perform a TSA at least once during this verification test.
The purpose of this audit is to ensure that the verification test is being performed in accordance
with the AMS Center QMP(1), this protocol, published reference methods, and any SOPs used by
the test facility. In this audit, the Battelle Quality Manager may review the reference methods
used, compare actual test procedures to those specified or referenced in this protocol, and review
data acquisition and handling procedures. A TSA report shall be prepared, including a statement
of findings and the actions taken to address any adverse findings. The EPA ETV Quality
Manager shall receive a copy of Battelle's TSA report.
At EPA's discretion, EPA QA staff also may conduct an independent on-site TSA during
the verification test. The TSA findings will be communicated to testing staff at the time of the
audit and documented in a TSA report.
6.4.3	Audit of Data Quality
Battelle's Quality Manager shall audit at least 10% of the verification data acquired in
the verification test. The Battelle Quality Manager traces the data from initial acquisition,
through reduction and statistical comparisons, to final reporting. All calculations performed on
the data undergoing audit are checked.
6.4.4 Assessment Reports
Each assessment and audit shall be documented in accordance with Section 2.9.7 of the
QMP for the AMS Center.(1) Assessment reports will include the following:
•	Identification of any adverse findings or potential problems
•	Response to adverse findings or potential problems
•	Possible recommendations for resolving problems
•	Citation of any noteworthy practices that may be of use to others
•	Confirmation that solutions have been implemented and are effective.
6.5 Corrective Action
The Battelle Quality Manager, during the course of any assessment or audit, shall
identify to the technical staff performing experimental activities any immediate corrective action
that should be taken. If serious quality problems exist, the Battelle Quality Manager is
authorized to stop work. Once the assessment report has been prepared, the Verification Test
Coordinator ensures that a response is provided for each adverse finding or potential problem
and implements any necessary follow-up corrective action. The Battelle Quality Manager shall
ensure that follow-up corrective action has been taken.
7 Data Handling and Reporting
7.1 Documentation and Records
A variety of data shall be acquired and recorded electronically and manually by either
Battelle or the test facility staff. Operational information, required maintenance, and results from
the reference methods are documented in a laboratory record book and on data sheet/chain-of-
custody forms. In general, the results from the multi-parameter water probes are recorded
electronically. The electronic data stored on the probe are collected by the field staff during each
sampling event. Once collected, these data reside at the test facility until the entire test is
finished. All of the electronic raw data are then transferred to Battelle, where they are
permanently stored with the study binder, along with the rest of the test data. Table 11
summarizes the types of data to be recorded and the process for recording data. At the
conclusion of the test, the test facility is provided with an electronic copy of the raw data
generated during the verification.
7.2 Data Review
Data generated by the test facility and vendors in the verification test shall be provided to
Battelle and reviewed by the Verification Test Coordinator before they are used to calculate,
evaluate, or report verification results. All data are recorded directly in the laboratory record
book as soon as they are available. Records are written legibly in ink, and any corrections are
initialed by the person performing the correction. The data include electronic data, entries in
laboratory record books, operating data from the test facility, and equipment calibration records.
The person performing the review adds his/her initials and the date to a hard copy of the record
being reviewed within two weeks of the measurement. This hard copy is placed in the files for
the verification test by the Verification Test Coordinator. In addition, data calculations
performed by Battelle are spot-checked by Battelle technical staff to ensure that calculations are
performed correctly.
Table 11. Summary of Data Recording Process

Dates, times of test events
  Responsible party: Test Facility
  Where recorded: Laboratory record books/data sheets
  How often recorded: Start/end of test; at each change of a test parameter; at sample collection
  Purpose of data: Used to organize/check test results; manually incorporate data into
  spreadsheets; stored in study binder

Test parameters
  Responsible party: Battelle/Test Facility
  Where recorded: Laboratory record books/data sheets
  How often recorded: Each sample collection
  Purpose of data: Used to organize/check test results; manually incorporate data into
  spreadsheets; stored in study binder

Probe data (digital display and electronic output)
  Responsible party: Test Facility
  Where recorded: Data sheets (digital display); probe data acquisition system, e.g., data
  logger, PC, or laptop (electronic output)
  How often recorded: Each sample collection; data downloaded at least once per day
  Purpose of data: Used to organize/check test results; incorporate data into electronic
  spreadsheets; stored in study binder

Reference monitor readings/reference analytical results
  Responsible party: Test Facility
  Where recorded: Laboratory record book/data sheets or data management system, as appropriate
  How often recorded: After each batch sample collection; data recorded after reference method
  performed
  Purpose of data: Used to organize/check test results; manually incorporate data into
  spreadsheets; stored in study binder

Reference calibration data
  Responsible party: Test Facility
  Where recorded: Laboratory record books/data sheets/data acquisition system
  How often recorded: Whenever zero and calibration checks are done
  Purpose of data: Document correct performance of reference methods

Performance evaluation audit results
  Responsible party: Battelle
  Where recorded: Laboratory record books/data sheets/data acquisition system
  How often recorded: At times of performance evaluation audits
  Purpose of data: Test reference methods with independent standards/measurements
7.3 Statistical Procedures
7.3.1 Pre- and Postcalibration Results
A tabulation of the pre- and postcalibration results shall be presented, where applicable,
for each of the measured parameters. The results are expressed as percent change for a given
time period (days). If not prohibited by the vendor's typical operating instructions, a weekly
check of the calibration is performed as well.
The results from the calibration checks are summarized, and accuracy is determined each
time the calibration check is conducted. This accuracy is reported as a percentage, calculated
using the following equation:
A = [1 - (Cs - Cp)/Cs] × 100    (1)

where Cs is the value of the standard and Cp is the value measured by the vendor's probe.
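
As an illustrative sketch (not part of the protocol), the accuracy calculation in Equation 1
can be coded directly; the function name is hypothetical.

    def calibration_check_accuracy(c_standard, c_probe):
        """Accuracy (A) of a calibration check, as a percentage (Equation 1)."""
        return (1.0 - (c_standard - c_probe) / c_standard) * 100.0

For example, a 10.0 NTU turbidity standard read by a probe as 9.6 NTU gives an accuracy of 96%.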
7.3.2 Relative Bias
Results from the multi-parameter water probes being verified are compared to the results
obtained from the reference analyses. Water samples are analyzed by both the reference method
and the probes being verified. The results for each sample are recorded, and the accuracy is
expressed in terms of the relative bias (B), as calculated from the following equation:
B = [(Cp - CR)/CR] × 100    (2)

where Cp is the reading from the probe being verified, and CR is the average of the replicate
reference measurements. This calculation is performed for each reference sample analysis for
each of the eight target water parameters (Table 2). Readings of pH are converted to H+
concentration, and temperature readings are converted to absolute units prior to making this
calculation. Relative bias is assessed independently for each analyzer provided by a single
vendor to determine inter-unit reproducibility.
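
A minimal Python sketch of the relative bias calculation, including the conversions noted
above, follows; the function names are illustrative and not part of the protocol.

    def relative_bias(probe_reading, reference_replicates):
        """Relative bias B in percent (Equation 2), using the mean of the
        replicate reference measurements as the reference value."""
        c_r = sum(reference_replicates) / len(reference_replicates)
        return (probe_reading - c_r) / c_r * 100.0

    def ph_to_h_concentration(ph):
        """Convert a pH reading to H+ concentration (mol/L) before computing bias."""
        return 10.0 ** (-ph)

    def celsius_to_kelvin(temp_c):
        """Convert a temperature reading to absolute units before computing bias."""
        return temp_c + 273.15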
7.3.3 Precision
The standard deviation (S) of the results for replicate measurements made during stable
operation at the mesocosm is calculated and used as a measure of probe precision at each
sampling period:

S = [ Σ (Ck - C̄)² / (n - 1) ]^(1/2)    (3)

where n is the number of replicate samples, Ck is the concentration reported for the kth
measurement, and C̄ is the average concentration of the replicate samples. Probe precision is
reported in terms of the percent relative standard deviation of the series of measurements:

%RSD = (S / C̄) × 100    (4)

Precision is calculated for each of the eight target water parameters.
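
Equations 3 and 4 are the sample standard deviation and the percent relative standard
deviation; a brief Python sketch using the standard library follows (illustrative only).

    import statistics

    def percent_rsd(replicates):
        """Percent relative standard deviation of replicate probe readings
        (Equations 3 and 4)."""
        s = statistics.stdev(replicates)     # sample standard deviation (n - 1 denominator)
        c_bar = statistics.mean(replicates)  # average concentration of the replicates
        return s / c_bar * 100.0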
7.3.4 Linearity
For target water parameters with a sufficiently wide range of variation, linearity is
assessed by linear regression, with the analyte concentration measured by the reference method
as the independent variable and the reading from the probe being verified as the dependent variable.
Linearity is expressed in terms of the slope, intercept, and coefficient of determination (r2).
Linearity for pH is assessed by converting pH results to H+ concentration before comparison.
Linearity is assessed separately for each unit of each water probe being tested and for each of the
mesocosm, saltwater, and freshwater test sites.
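
For illustration, the regression statistics can be computed with the Python standard library
(version 3.10 or later); this sketch is not part of the protocol.

    import statistics

    def linearity(reference_values, probe_readings):
        """Slope, intercept, and coefficient of determination (r^2) for probe
        readings regressed on the reference measurements."""
        fit = statistics.linear_regression(reference_values, probe_readings)
        r = statistics.correlation(reference_values, probe_readings)
        return fit.slope, fit.intercept, r ** 2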

7.3.5 Inter-Unit Reproducibility
The results obtained from identical units of each probe are compiled independently for
each analyzer and compared to assess inter-unit reproducibility. The results are interpreted using
a t-test, or other appropriate comparison, to assess whether significant differences exist between
the units tested.
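
As one reasonable realization of this comparison (the protocol allows a t-test or another
appropriate method), a paired t-test could be run on simultaneous readings from the duplicate
units; the SciPy dependency and the 0.05 significance level are assumptions of this sketch.

    from scipy import stats  # assumes SciPy is available

    def units_differ(unit1_readings, unit2_readings, alpha=0.05):
        """Paired t-test on simultaneous readings from two identical units;
        returns True if a statistically significant difference exists."""
        t_stat, p_value = stats.ttest_rel(unit1_readings, unit2_readings)
        return p_value < alpha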
7.4 Reporting
The statistical comparisons that result from each of the tests described above shall be
conducted separately for each of the probes being tested, and information on the additional
performance parameters is compiled and reported. Separate verification reports are prepared,
each addressing a technology provided by one commercial vendor. Each report shows separate
verification results from the duplicate probes undergoing testing, along with calculations of the
inter-unit reproducibility of the technology. For each test, the verification report presents the test
procedures and test data, as well as the results of the statistical evaluation of those data.
All interaction with the probes (such as during maintenance, cleaning, and calibration) is
noted at the time of the test and reported. In addition, descriptions of the data-recording
procedures, consumables used, and required reagents are presented in the report.
The verification report shall briefly describe the ETV program, the AMS Center, and the
procedures used in verification testing. These sections shall be common to each verification
report resulting from this verification test. The results of the verification test shall then be stated
quantitatively, without comparison to any other technology tested or comment on the
acceptability of the technology's performance. The preparation of draft verification reports,
review of reports by vendors and others, revision of the reports, final approval, and distribution
of the reports shall be conducted as stated in the "Generic Verification Protocol for the
Advanced Monitoring Systems Pilot."(6) Preparation, approval, and use of verification statements
summarizing the results of this test also are subject to the requirements of that same protocol.
8 Health and Safety
The test facility shall provide appropriate safety instructions regarding potential hazards
during the verification testing to Battelle, EPA, and vendor staff, both at the test facility site and
upon arrival at the test sites.
9 References
1.	"Quality Management Plan for the ETV Advanced Monitoring Systems Center," U.S. EPA,
Environmental Technology Verification Program, Battelle, Columbus, Ohio, December
2001.
2.	"Environmental Technology Verification Program Quality and Management Plan for the
Pilot Period (1995-2000)," U.S. Environmental Protection Agency, EPA-600/R-98/064,
Cincinnati, Ohio, May 1998.
3.	"Methods for the Determination of Inorganic Substances in Environmental Samples," U.S.
Environmental Protection Agency, Method 180.1, EPA/600/R-93/100, 1993.
4.	QuikChem® Method 31-107-04-1-D, "Determination of Nitrate and/or Nitrite in Brackish
Waters by Flow Injection Analysis," November 20, 2000.
5.	"In-vitro Determination of Chlorophyll A and Pheophytin A in Marine and Freshwater
Phytoplankton by Fluorescence," U.S. Environmental Protection Agency, Method 445.0,
September 1997.
6.	"Generic Verification Protocol for the Advanced Monitoring Systems Pilot," Environmental
Technology Verification Program, prepared by Battelle, Columbus, Ohio, October 1998.